Exploring clinical associations using -omics based enrichment analyses.
Published: 03-15-2009
The vast amounts of clinical data collected in electronic health records (EHR) are analogous to the data explosion from the "-omics" revolution. In the EHR, clinicians often maintain patient-specific problem summary lists which are used to provide a concise overview of significant medical diagnoses. We hypothesized that by tapping into the collective wisdom generated by hundreds of physicians entering problems into the EHR we could detect significant associations among diagnoses that are not described in the literature.
Authors: Pawan Kumar, Allison E. Bartoszek, Thomas M. Moran, Jack Gorski, Sanjib Bhattacharyya, Jose F. Navidad, Monica S. Thakar, Subramaniam Malarkannan.
Published: 02-04-2012
Influenza virus is a respiratory pathogen that causes a high degree of morbidity and mortality every year in multiple parts of the world. Therefore, precise diagnosis of the infecting strain and rapid high-throughput screening of vast numbers of clinical samples are paramount to control the spread of pandemic infections. Current clinical diagnoses of influenza infections are based on serologic testing, polymerase chain reaction, direct specimen immunofluorescence and cell culture 1,2. Here, we report the development of a novel diagnostic technique used to detect live influenza viruses. We used the mouse-adapted human A/PR/8/34 (PR8, H1N1) virus 3 to test the efficacy of this technique using MDCK cells 4. MDCK cells (10⁴ or 5 × 10³ per well) were cultured in 96- or 384-well plates, infected with PR8, and viral proteins were detected using anti-M2 followed by an IR dye-conjugated secondary antibody. M2 5 and hemagglutinin 1 are two major marker proteins used in many different diagnostic assays. Employing IR-dye-conjugated secondary antibodies minimized the autofluorescence associated with other fluorescent dyes. The use of anti-M2 antibody allowed us to use the antigen-specific fluorescence intensity as a direct metric of viral quantity. To enumerate the fluorescence intensity, we used the LI-COR Odyssey-based IR scanner. This system uses two-channel laser-based IR detection to identify fluorophores and differentiate them from background noise. The first channel excites at 680 nm and emits at 700 nm to help quantify the background. The second channel detects fluorophores that excite at 780 nm and emit at 800 nm. Scanning of PR8-infected MDCK cells in the IR scanner indicated a viral titer-dependent bright fluorescence. A positive correlation of fluorescence intensity to virus titer from 10² to 10⁵ PFU could be consistently observed.
Minimal but detectable positivity consistently seen with 10²-10³ PFU PR8 viral titers demonstrated the high sensitivity of the near-IR dyes. The signal-to-noise ratio was determined by comparison with mock-infected or isotype antibody-treated MDCK cells. Using the fluorescence intensities from 96- or 384-well plate formats, we constructed standard titration curves. In these calculations, the first variable is the viral titer while the second variable is the fluorescence intensity. Therefore, we used an exponential curve fit to determine the polynomial relationship between viral titers and fluorescence intensities. Collectively, we conclude that this IR dye-based protein detection system can help diagnose infecting viral strains and precisely enumerate the titer of the infecting pathogens.
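The standard-curve step described above can be sketched in a few lines. The following is a minimal example, using hypothetical fluorescence readings and NumPy's polynomial fitting (the abstract does not specify the actual curve-fitting software), of relating log10 viral titer to signal intensity:

```python
import numpy as np

def fit_titration_curve(titers_pfu, intensities, degree=2):
    """Fit a polynomial relating log10 viral titer to fluorescence
    intensity, mirroring the standard titration curve in the abstract."""
    log_titer = np.log10(np.asarray(titers_pfu, dtype=float))
    coeffs = np.polyfit(log_titer, np.asarray(intensities, dtype=float), degree)

    def predict(titer_pfu):
        # Predicted fluorescence intensity for a given titer (PFU)
        return np.polyval(coeffs, np.log10(titer_pfu))

    return coeffs, predict

# Hypothetical standard-curve readings spanning 10^2-10^5 PFU
coeffs, predict = fit_titration_curve([1e2, 1e3, 1e4, 1e5],
                                      [1.2, 3.9, 12.5, 40.1])
```

Inverting the fitted curve (intensity to titer) would then let unknown samples be quantified against the standard curve.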
Clinical Assessment of Spatiotemporal Gait Parameters in Patients and Older Adults
Authors: Julia F. Item-Glatthorn, Nicola A. Maffiuletti.
Institutions: Schulthess Clinic.
Spatial and temporal characteristics of human walking are frequently evaluated to identify possible gait impairments, mainly in orthopedic and neurological patients1-4, but also in healthy older adults5,6. The quantitative gait analysis described in this protocol is performed with a recently introduced photoelectric system (see Materials table) which has the potential to be used in the clinic because it is portable, easy to set up (no subject preparation is required before a test), and does not require maintenance or sensor calibration. The photoelectric system consists of a series of high-density floor-based photoelectric cells with light-emitting and light-receiving diodes that are placed parallel to each other to create a corridor, and are oriented perpendicular to the line of progression7. The system simply detects interruptions in the light signal, for instance due to the presence of feet within the recording area. Temporal gait parameters and 1D spatial coordinates of consecutive steps are subsequently calculated to provide common gait parameters such as step length, single limb support and walking velocity8, whose validity against a criterion instrument has recently been demonstrated7,9. The measurement procedures are very straightforward; a single patient can be tested in less than 5 min and a comprehensive report can be generated in less than 1 min.
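To illustrate how such parameters fall out of detected footfalls, here is a minimal sketch. The input format (time, 1D position pairs for successive heel strikes) is a hypothetical simplification; the commercial system's actual data layout is not described in the abstract:

```python
def spatiotemporal_params(footfalls):
    """Compute basic gait parameters from consecutive footfall events.

    footfalls: list of (time_s, position_m) tuples for successive heel
    strikes along the walkway (alternating feet), as a floor-based
    photoelectric system might report them.
    """
    step_lengths = [b[1] - a[1] for a, b in zip(footfalls, footfalls[1:])]
    step_times = [b[0] - a[0] for a, b in zip(footfalls, footfalls[1:])]
    total_time = footfalls[-1][0] - footfalls[0][0]
    total_dist = footfalls[-1][1] - footfalls[0][1]
    return {
        "mean_step_length_m": sum(step_lengths) / len(step_lengths),
        "mean_step_time_s": sum(step_times) / len(step_times),
        "velocity_m_s": total_dist / total_time,
    }

# Hypothetical walk: a heel strike every 0.5 s, 0.6 m apart
params = spatiotemporal_params([(0.0, 0.0), (0.5, 0.6), (1.0, 1.2), (1.5, 1.8)])
```

Single limb support and other phase-based parameters would additionally require foot-contact durations, which the system derives from the on/off times of the interrupted cells.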
Medicine, Issue 93, gait analysis, walking, floor-based photocells, spatiotemporal, elderly, orthopedic patients, neurological patients
Training Synesthetic Letter-color Associations by Reading in Color
Authors: Olympia Colizoli, Jaap M. J. Murre, Romke Rouw.
Institutions: University of Amsterdam.
Synesthesia is a rare condition in which a stimulus from one modality automatically and consistently triggers unusual sensations in the same and/or other modalities. A relatively common and well-studied type is grapheme-color synesthesia, defined as the consistent experience of color when viewing, hearing and thinking about letters, words and numbers. We describe our method for investigating to what extent synesthetic associations between letters and colors can be learned by reading in color in nonsynesthetes. Reading in color is a special method for training associations in the sense that the associations are learned implicitly while the reader reads text as he or she normally would and it does not require explicit computer-directed training methods. In this protocol, participants are given specially prepared books to read in which four high-frequency letters are paired with four high-frequency colors. Participants receive unique sets of letter-color pairs based on their pre-existing preferences for colored letters. A modified Stroop task is administered before and after reading in order to test for learned letter-color associations and changes in brain activation. In addition to objective testing, a reading experience questionnaire is administered that is designed to probe for differences in subjective experience. A subset of questions may predict how well an individual learned the associations from reading in color. Importantly, we are not claiming that this method will cause each individual to develop grapheme-color synesthesia, only that it is possible for certain individuals to form letter-color associations by reading in color and these associations are similar in some aspects to those seen in developmental grapheme-color synesthetes. The method is quite flexible and can be used to investigate different aspects and outcomes of training synesthetic associations, including learning-induced changes in brain function and structure.
Behavior, Issue 84, synesthesia, training, learning, reading, vision, memory, cognition
Mapping Bacterial Functional Networks and Pathways in Escherichia coli using Synthetic Genetic Arrays
Authors: Alla Gagarinova, Mohan Babu, Jack Greenblatt, Andrew Emili.
Institutions: University of Toronto, University of Regina.
Phenotypes are determined by a complex series of physical (e.g. protein-protein) and functional (e.g. gene-gene or genetic) interactions (GI)1. While physical interactions can indicate which bacterial proteins are associated as complexes, they do not necessarily reveal pathway-level functional relationships1. GI screens, in which the growth of double mutants bearing two deleted or inactivated genes is measured and compared to the corresponding single mutants, can illuminate epistatic dependencies between loci and hence provide a means to query and discover novel functional relationships2. Large-scale GI maps have been reported for eukaryotic organisms like yeast3-7, but GI information remains sparse for prokaryotes8, which hinders the functional annotation of bacterial genomes. To this end, we and others have developed high-throughput quantitative bacterial GI screening methods9, 10. Here, we present the key steps required to perform the quantitative E. coli Synthetic Genetic Array (eSGA) screening procedure on a genome scale9, using natural bacterial conjugation and homologous recombination to systematically generate and measure the fitness of large numbers of double mutants in a colony array format. Briefly, a robot is used to transfer, through conjugation, chloramphenicol (Cm)-marked mutant alleles from engineered Hfr (High frequency of recombination) 'donor strains' into an ordered array of kanamycin (Kan)-marked F- recipient strains. Typically, we use loss-of-function single mutants bearing non-essential gene deletions (e.g. the 'Keio' collection11) and essential gene hypomorphic mutations (i.e. alleles conferring reduced protein expression, stability, or activity9, 12, 13) to query the functional associations of non-essential and essential genes, respectively. After conjugation and ensuing genetic exchange mediated by homologous recombination, the resulting double mutants are selected on solid medium containing both antibiotics.
After outgrowth, the plates are digitally imaged and colony sizes are quantitatively scored using an in-house automated image processing system14. GIs are revealed when the growth rate of a double mutant is either significantly better or worse than expected9. Aggravating (or negative) GIs often result between loss-of-function mutations in pairs of genes from compensatory pathways that impinge on the same essential process2. Here, the loss of a single gene is buffered, such that either single mutant is viable. However, the loss of both pathways is deleterious and results in synthetic lethality or sickness (i.e. slow growth). Conversely, alleviating (or positive) interactions can occur between genes in the same pathway or protein complex2 as the deletion of either gene alone is often sufficient to perturb the normal function of the pathway or complex such that additional perturbations do not reduce activity, and hence growth, further. Overall, systematically identifying and analyzing GI networks can provide unbiased, global maps of the functional relationships between large numbers of genes, from which pathway-level information missed by other approaches can be inferred9.
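The scoring logic above (a GI is called when double-mutant growth deviates from the expectation given the single mutants) can be sketched under the commonly used multiplicative fitness model. The threshold below is illustrative only, not the statistical cutoff used in the eSGA pipeline:

```python
def gi_score(w_ab, w_a, w_b):
    """Genetic-interaction score under the multiplicative model:
    epsilon = observed double-mutant fitness minus the expected
    fitness (product of the single-mutant fitnesses).
    """
    return w_ab - w_a * w_b

def classify_interaction(w_ab, w_a, w_b, threshold=0.08):
    """Negative epsilon -> aggravating (synthetic sick/lethal);
    positive epsilon -> alleviating; small |epsilon| -> neutral.
    The threshold here is an arbitrary illustrative value."""
    eps = gi_score(w_ab, w_a, w_b)
    if eps < -threshold:
        return "aggravating"
    if eps > threshold:
        return "alleviating"
    return "neutral"
```

For example, two viable single mutants (fitness 0.9 and 0.8) whose double mutant barely grows (fitness 0.2) would score strongly negative, flagging a synthetic-sick pair from compensatory pathways.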
Genetics, Issue 69, Molecular Biology, Medicine, Biochemistry, Microbiology, Aggravating, alleviating, conjugation, double mutant, Escherichia coli, genetic interaction, Gram-negative bacteria, homologous recombination, network, synthetic lethality or sickness, suppression
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3,4,5,6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
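A typical analysis step for CP along a morph continuum is locating the category boundary from identification responses. Here is a minimal sketch with hypothetical data: the boundary is taken as the morph level where the proportion of "human" responses crosses 0.5, found by linear interpolation (studies often fit a sigmoid instead):

```python
def category_boundary(morph_levels, p_human):
    """Estimate the category boundary along a morph continuum:
    the point where the proportion of 'human' identifications
    crosses 0.5, interpolated between adjacent morph levels.

    morph_levels: increasing stimulus positions (e.g. % human, 0-100)
    p_human: proportion of 'human' responses at each level
    """
    pairs = list(zip(morph_levels, p_human))
    for (x0, p0), (x1, p1) in zip(pairs, pairs[1:]):
        if p0 < 0.5 <= p1:
            # Linear interpolation to the 0.5 crossing
            return x0 + (0.5 - p0) * (x1 - x0) / (p1 - p0)
    return None  # no crossing found in the sampled range

# Hypothetical identification data along an avatar-to-human continuum
boundary = category_boundary([0, 20, 40, 60, 80, 100],
                             [0.02, 0.05, 0.20, 0.80, 0.95, 1.00])
```

Discrimination performance peaking near this boundary, relative to equally spaced pairs elsewhere on the continuum, would be the classic behavioral signature of CP.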
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Colorimetric Paper-based Detection of Escherichia coli, Salmonella spp., and Listeria monocytogenes from Large Volumes of Agricultural Water
Authors: Bledar Bisha, Jaclyn A. Adkins, Jana C. Jokerst, Jeffrey C. Chandler, Alma Pérez-Méndez, Shannon M. Coleman, Adrian O. Sbodio, Trevor V. Suslow, Michelle D. Danyluk, Charles S. Henry, Lawrence D. Goodridge.
Institutions: University of Wyoming, Colorado State University, University of California, Davis, University of Florida, McGill University.
This protocol describes rapid colorimetric detection of Escherichia coli, Salmonella spp., and Listeria monocytogenes from large volumes (10 L) of agricultural waters. Here, water is filtered through sterile Modified Moore Swabs (MMS), which consist of a simple gauze filter enclosed in a plastic cartridge, to concentrate bacteria. Following filtration, non-selective or selective enrichments for the target bacteria are performed in the MMS. For colorimetric detection of the target bacteria, the enrichments are then assayed using paper-based analytical devices (µPADs) embedded with bacteria-indicative substrates. Each substrate reacts with target-indicative bacterial enzymes, generating colored products that can be detected visually (qualitative detection) on the µPAD. Alternatively, digital images of the reacted µPADs can be generated with common scanning or photographic devices and analyzed using ImageJ software, allowing for more objective and standardized interpretation of results. Although the biochemical screening procedures are designed to identify the aforementioned bacterial pathogens, in some cases enzymes produced by background microbiota or the degradation of the colorimetric substrates may produce a false positive. Therefore, confirmation using a more discriminatory diagnostic is needed. Nonetheless, this bacterial concentration and detection platform is inexpensive, sensitive (0.1 CFU/ml detection limit), easy to perform, and rapid (concentration, enrichment, and detection are performed within approximately 24 hr), justifying its use as an initial screening method for the microbiological quality of agricultural water.
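The digital-image route described above (scan the reacted µPAD, quantify in ImageJ) amounts to comparing the mean gray value of a detection zone against a control zone. A minimal sketch of that comparison, with a purely illustrative threshold and pixel values, not the protocol's calibrated cutoffs:

```python
def upad_positive(zone_pixels, blank_pixels, threshold=15.0):
    """Score a µPAD detection zone from grayscale pixel values,
    ImageJ-style: compare the mean gray value of the reacted zone
    against a blank control zone. A difference at or above
    `threshold` (illustrative value) counts as positive.
    Returns (is_positive, mean_gray_difference).
    """
    mean = lambda px: sum(px) / len(px)
    delta = abs(mean(zone_pixels) - mean(blank_pixels))
    return delta >= threshold, delta

# Hypothetical 8-bit pixel values: colored product darkens the zone
result, delta = upad_positive([80] * 20, [120] * 20)
```

Working from mean gray values rather than visual inspection is what gives the standardized, objective readout the abstract refers to.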
Environmental Sciences, Issue 88, Paper-based analytical device (µPAD), Colorimetric enzymatic detection, Salmonella spp., Listeria monocytogenes, Escherichia coli, Modified Moore Swab (MMS), agricultural water, food safety, environmental microbiology
gDNA Enrichment by a Transposase-based Technology for NGS Analysis of the Whole Sequence of BRCA1, BRCA2, and 9 Genes Involved in DNA Damage Repair
Authors: Sandy Chevrier, Romain Boidot.
Institutions: Centre Georges-François Leclerc.
The widespread use of Next Generation Sequencing (NGS) has opened up new avenues for cancer research and diagnosis. NGS will bring huge amounts of new data on cancer, and especially cancer genetics. Current knowledge and future discoveries will make it necessary to study a huge number of genes that could be involved in a genetic predisposition to cancer. In this regard, we developed a Nextera design to study 11 complete genes involved in DNA damage repair. This protocol was developed to reliably study 11 genes (ATM, BARD1, BRCA1, BRCA2, BRIP1, CHEK2, PALB2, RAD50, RAD51C, RAD80, and TP53) from promoter to 3'-UTR in 24 patients simultaneously. This protocol, based on transposase technology and gDNA enrichment, offers a substantial time advantage for genetic diagnosis thanks to sample multiplexing. It can be used reliably with blood gDNA.
Genetics, Issue 92, gDNA enrichment, Nextera, NGS, DNA damage, BRCA1, BRCA2
Isolation of Myeloid Dendritic Cells and Epithelial Cells from Human Thymus
Authors: Christina Stoeckle, Ioanna A. Rota, Eva Tolosa, Christoph Haller, Arthur Melms, Eleni Adamopoulou.
Institutions: Hertie Institute for Clinical Brain Research, University of Bern, University Medical Center Hamburg-Eppendorf, University Clinic Tuebingen, University Hospital Erlangen.
In this protocol we provide a method to isolate dendritic cells (DC) and epithelial cells (TEC) from the human thymus. DC and TEC are the major antigen presenting cell (APC) types found in a normal thymus and it is well established that they play distinct roles during thymic selection. These cells are localized in distinct microenvironments in the thymus and each APC type makes up only a minor population of cells. To further understand the biology of these cell types, characterization of these cell populations is highly desirable but due to their low frequency, isolation of any of these cell types requires an efficient and reproducible procedure. This protocol details a method to obtain cells suitable for characterization of diverse cellular properties. Thymic tissue is mechanically disrupted and after different steps of enzymatic digestion, the resulting cell suspension is enriched using a Percoll density centrifugation step. For isolation of myeloid DC (CD11c+), cells from the low-density fraction (LDF) are immunoselected by magnetic cell sorting. Enrichment of TEC populations (mTEC, cTEC) is achieved by depletion of hematopoietic (CD45hi) cells from the low-density Percoll cell fraction allowing their subsequent isolation via fluorescence activated cell sorting (FACS) using specific cell markers. The isolated cells can be used for different downstream applications.
Immunology, Issue 79, Immune System Processes, Biological Processes, Immune System Diseases, Immune System Phenomena, Life Sciences (General), immunology, human thymus, isolation, dendritic cells, mTEC, cTEC
High-throughput, Automated Extraction of DNA and RNA from Clinical Samples using TruTip Technology on Common Liquid Handling Robots
Authors: Rebecca C. Holmberg, Alissa Gindlesperger, Tinsley Stokes, Dane Brady, Nitu Thakore, Philip Belgrader, Christopher G. Cooney, Darrell P. Chandler.
Institutions: Akonni Biosystems, Inc.
TruTip is a simple nucleic acid extraction technology whereby a porous, monolithic binding matrix is inserted into a pipette tip. The geometry of the monolith can be adapted for specific pipette tips ranging in volume from 1.0 to 5.0 ml. The large porosity of the monolith enables viscous or complex samples to readily pass through it with minimal fluidic backpressure. Bi-directional flow maximizes residence time between the monolith and sample, and enables large sample volumes to be processed within a single TruTip. The fundamental steps, irrespective of sample volume or TruTip geometry, include cell lysis, nucleic acid binding to the inner pores of the TruTip monolith, washing away unbound sample components and lysis buffers, and eluting purified and concentrated nucleic acids into an appropriate buffer. The attributes and adaptability of TruTip are demonstrated in three automated clinical sample processing protocols on Eppendorf epMotion 5070, Hamilton STAR, and Hamilton STARplus liquid handling robots: RNA isolation from nasopharyngeal aspirate, genomic DNA isolation from whole blood, and fetal DNA extraction and enrichment from large volumes of maternal plasma, respectively.
Genetics, Issue 76, Bioengineering, Biomedical Engineering, Molecular Biology, Automation, Laboratory, Clinical Laboratory Techniques, Molecular Diagnostic Techniques, Analytic Sample Preparation Methods, Genetic Techniques, Chemistry, Clinical, DNA/RNA extraction, automation, nucleic acid isolation, sample preparation, nasopharyngeal aspirate, blood, plasma, high-throughput, sequencing
Multi-step Preparation Technique to Recover Multiple Metabolite Compound Classes for In-depth and Informative Metabolomic Analysis
Authors: Charmion Cruickshank-Quinn, Kevin D. Quinn, Roger Powell, Yanhui Yang, Michael Armstrong, Spencer Mahaffey, Richard Reisdorph, Nichole Reisdorph.
Institutions: National Jewish Health, University of Colorado Denver.
Metabolomics is an emerging field which enables profiling of samples from living organisms in order to obtain insight into biological processes. A vital aspect of metabolomics is sample preparation; inconsistent techniques generate unreliable results. The technique described here encompasses protein precipitation, liquid-liquid extraction, and solid-phase extraction as a means of fractionating metabolites into four distinct classes. Improved enrichment of low abundance molecules, with a resulting increase in sensitivity, is obtained, and ultimately results in more confident identification of molecules. This technique has been applied to plasma, bronchoalveolar lavage fluid, and cerebrospinal fluid samples with volumes as low as 50 µl. Samples can be used for multiple downstream applications; for example, the pellet resulting from protein precipitation can be stored for later analysis. The supernatant from that step undergoes liquid-liquid extraction using water and a strong organic solvent to separate the hydrophilic and hydrophobic compounds. Once fractionated, the hydrophilic layer can be processed for later analysis or discarded if not needed. The hydrophobic fraction is further treated with a series of solvents during three solid-phase extraction steps to separate it into fatty acids, neutral lipids, and phospholipids. This allows the technician the flexibility to choose which class of compounds is preferred for analysis. It also aids in more reliable metabolite identification since some knowledge of chemical class exists.
Bioengineering, Issue 89, plasma, chemistry techniques, analytical, solid phase extraction, mass spectrometry, metabolomics, fluids and secretions, profiling, small molecules, lipids, liquid chromatography, liquid-liquid extraction, cerebrospinal fluid, bronchoalveolar lavage fluid
The ChroP Approach Combines ChIP and Mass Spectrometry to Dissect Locus-specific Proteomic Landscapes of Chromatin
Authors: Monica Soldi, Tiziana Bonaldi.
Institutions: European Institute of Oncology.
Chromatin is a highly dynamic nucleoprotein complex made of DNA and proteins that controls various DNA-dependent processes. Chromatin structure and function at specific regions are regulated by the local enrichment of histone post-translational modifications (hPTMs) and variants, chromatin-binding proteins, including transcription factors, and DNA methylation. The proteomic characterization of chromatin composition at distinct functional regions has so far been hampered by the lack of efficient protocols to enrich such domains at the appropriate purity and amount for the subsequent in-depth analysis by Mass Spectrometry (MS). We describe here a newly designed chromatin proteomics strategy, named ChroP (Chromatin Proteomics), whereby a preparative chromatin immunoprecipitation is used to isolate distinct chromatin regions whose features, in terms of hPTMs, variants and co-associated non-histone proteins, are analyzed by MS. We illustrate here the setting up of ChroP for the enrichment and analysis of transcriptionally silent heterochromatic regions, marked by the presence of tri-methylation of lysine 9 on histone H3. The results achieved demonstrate the potential of ChroP in thoroughly characterizing the heterochromatin proteome and prove it as a powerful analytical strategy for understanding how the distinct protein determinants of chromatin interact and synergize to establish locus-specific structural and functional configurations.
Biochemistry, Issue 86, chromatin, histone post-translational modifications (hPTMs), epigenetics, mass spectrometry, proteomics, SILAC, chromatin immunoprecipitation, histone variants, chromatome, hPTMs cross-talks
Measuring Frailty in HIV-infected Individuals. Identification of Frail Patients is the First Step to Amelioration and Reversal of Frailty
Authors: Hilary C. Rees, Voichita Ianas, Patricia McCracken, Shannon Smith, Anca Georgescu, Tirdad Zangeneh, Jane Mohler, Stephen A. Klotz.
Institutions: University of Arizona.
A simple, validated protocol consisting of a battery of tests is available to identify elderly patients with frailty syndrome. This syndrome of decreased reserve and resistance to stressors increases in incidence with increasing age. In the elderly, frailty may follow a step-wise loss of function from non-frail to pre-frail to frail. We studied frailty in HIV-infected patients and found that ~20% are frail according to the Fried phenotype, using stringent criteria developed for the elderly1,2. In HIV infection the syndrome occurs at a younger age. HIV patients were checked for 1) unintentional weight loss; 2) slowness as determined by walking speed; 3) weakness as measured by a grip dynamometer; 4) exhaustion by responses to a depression scale; and 5) low physical activity as determined by assessing kilocalories expended in a week's time. Pre-frailty was present with any two of the five criteria and frailty was present if any three of the five criteria were abnormal. The tests take approximately 10-15 min to complete and can be performed by medical assistants during routine clinic visits. Test results are scored by referring to standard tables. Understanding which of the five components contribute to frailty in an individual patient can allow the clinician to address relevant underlying problems, many of which are not evident in routine HIV clinic visits.
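The classification rule stated above (two criteria for pre-frailty, three or more for frailty) reduces to a simple count. A minimal sketch, with the component names chosen here for illustration:

```python
def classify_frailty(criteria):
    """Classify a patient per the abstract's Fried-phenotype cutoffs.

    criteria: dict mapping the five components (here named
    weight_loss, slowness, weakness, exhaustion, low_activity)
    to True when that component is abnormal.
    Three or more abnormal -> frail; two -> pre-frail;
    otherwise -> non-frail.
    """
    n_abnormal = sum(bool(v) for v in criteria.values())
    if n_abnormal >= 3:
        return "frail"
    if n_abnormal == 2:
        return "pre-frail"
    return "non-frail"

# Example: slowness and exhaustion abnormal -> pre-frail
status = classify_frailty({
    "weight_loss": False, "slowness": True, "weakness": False,
    "exhaustion": True, "low_activity": False,
})
```

Each component's abnormal/normal call comes from the standard tables the abstract mentions; only the final tally is shown here.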
Medicine, Issue 77, Infection, Virology, Infectious Diseases, Anatomy, Physiology, Molecular Biology, Biomedical Engineering, Retroviridae Infections, Body Weight Changes, Diagnostic Techniques and Procedures, Physical Examination, Muscle Strength, Behavior, Virus Diseases, Pathological Conditions, Signs and Symptoms, Diagnosis, Musculoskeletal and Neural Physiological Phenomena, HIV, HIV-1, AIDS, Frailty, Depression, Weight Loss, Weakness, Slowness, Exhaustion, Aging, clinical techniques
Microvascular Decompression: Salient Surgical Principles and Technical Nuances
Authors: Jonathan Forbes, Calvin Cooper, Walter Jermakowicz, Joseph Neimat, Peter Konrad.
Institutions: Vanderbilt University Medical Center.
Trigeminal neuralgia is a disorder associated with severe episodes of lancinating pain in the distribution of the trigeminal nerve. Previous reports indicate that 80-90% of cases are related to compression of the trigeminal nerve by an adjacent vessel. The majority of patients with trigeminal neuralgia eventually require surgical management in order to achieve remission of symptoms. Surgical options for management include ablative procedures (e.g., radiosurgery, percutaneous radiofrequency lesioning, balloon compression, glycerol rhizolysis, etc.) and microvascular decompression. Ablative procedures fail to address the root cause of the disorder and are less effective at preventing recurrence of symptoms over the long term than microvascular decompression. However, microvascular decompression is inherently more invasive than ablative procedures and is associated with increased surgical risks. Previous studies have demonstrated a correlation between surgeon experience and patient outcome in microvascular decompression. In this series of 59 patients operated on by two neurosurgeons (JSN and PEK) since 2006, 93% of patients demonstrated substantial improvement in their trigeminal neuralgia following the procedure—with follow-up ranging from 6 weeks to 2 years. Moreover, 41 of 66 patients (approximately 64%) have been entirely pain-free following the operation. In this publication, video format is utilized to review the microsurgical pathology of this disorder. Steps of the operative procedure are reviewed and salient principles and technical nuances useful in minimizing complications and maximizing efficacy are discussed.
Medicine, Issue 53, microvascular, decompression, trigeminal, neuralgia, operation, video
Quantitative Analysis of Chromatin Proteomes in Disease
Authors: Emma Monte, Haodong Chen, Maria Kolmakova, Michelle Parvatiyar, Thomas M. Vondriska, Sarah Franklin.
Institutions: David Geffen School of Medicine at UCLA, Nora Eccles Harrison Cardiovascular Research and Training Institute, University of Utah.
In the nucleus reside the proteomes whose functions are most intimately linked with gene regulation. Adult mammalian cardiomyocyte nuclei are unique due to the high percentage of binucleated cells,1 the predominantly heterochromatic state of the DNA, and the non-dividing nature of the cardiomyocyte, which renders adult nuclei in a permanent state of interphase.2 Transcriptional regulation during development and disease has been well studied in this organ,3-5 but what remains relatively unexplored is the role played by the nuclear proteins responsible for DNA packaging and expression, and how these proteins control changes in transcriptional programs that occur during disease.6 In the developed world, heart disease is the number one cause of mortality for both men and women.7 Insight on how nuclear proteins cooperate to regulate the progression of this disease is critical for advancing current treatment options. Mass spectrometry is the ideal tool for addressing these questions as it allows for an unbiased annotation of the nuclear proteome and relative quantification of how the abundance of these proteins changes with disease. While there have been several proteomic studies of mammalian nuclear protein complexes,8-13 until recently14 there had been only one study examining the cardiac nuclear proteome, and it considered the entire nucleus rather than exploring the proteome at the level of nuclear subcompartments.15 In large part, this shortage of work is due to the difficulty of isolating cardiac nuclei. Cardiac nuclei occur within a rigid and dense actin-myosin apparatus to which they are connected via multiple extensions from the endoplasmic reticulum, to the extent that myocyte contraction alters their overall shape.16 Additionally, cardiomyocytes are 40% mitochondria by volume,17 which necessitates enrichment of the nucleus apart from the other organelles.
Here we describe a protocol for cardiac nuclear enrichment and further fractionation into biologically relevant compartments. Furthermore, we detail methods for label-free quantitative mass spectrometric dissection of these fractions: techniques amenable to in vivo experimentation in various animal models and organ systems where metabolic labeling is not feasible.
Medicine, Issue 70, Molecular Biology, Immunology, Genetics, Genomics, Physiology, Protein, DNA, Chromatin, cardiovascular disease, proteomics, mass spectrometry
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
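The sequence-selection idea can be illustrated with a toy search: repeatedly mutate a sequence and keep changes that lower an energy score. The energy function below is an invented stand-in for illustration only; it is not the Protein WISDOM potential, which performs a rigorous global optimization rather than this greedy hill climb.

```python
import random

# Toy sketch of sequence selection by energy minimization.
# The "energy" here is invented: it rewards hydrophobic residues at
# even positions, standing in for a real potential-energy model.
AMINO = "ACDEFGHIKLMNPQRSTVWY"

def energy(seq):
    hydrophobic = set("AVLIMFWC")
    return -sum(1 for i, aa in enumerate(seq)
                if i % 2 == 0 and aa in hydrophobic)

def greedy_design(seq, iters=200, seed=0):
    """Accept single-residue mutations only when they lower the energy."""
    rng = random.Random(seed)       # seeded for a reproducible example
    best, best_e = seq, energy(seq)
    for _ in range(iters):
        i = rng.randrange(len(best))
        cand = best[:i] + rng.choice(AMINO) + best[i + 1:]
        if energy(cand) < best_e:
            best, best_e = cand, energy(cand)
    return best, best_e

designed, e = greedy_design("GGGGGGGG")
```

A greedy walk like this can trap in local minima, which is exactly why deterministic global-optimization formulations are attractive for the real problem.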
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource-limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open-access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free, open-source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
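The core of the interpretation step, comparing a patient-derived sequence against known resistance positions in a reference, can be sketched as follows. The reference fragment, the mutation table, and the patient sequence are illustrative inventions, not a clinical interpretation algorithm.

```python
# Hypothetical sketch of genotypic resistance interpretation: flag
# positions where the patient sequence carries a known resistant
# residue. Positions and residues below are invented examples.
REFERENCE_RT = "PISPIETVPVKLKPGMDGPKVKQ"   # illustrative RT fragment
RESISTANCE = {3: "R", 10: "N"}             # position -> resistant residue (invented)

def flag_mutations(patient_seq, reference, resistance):
    found = []
    for pos, resistant_aa in resistance.items():
        if patient_seq[pos] != reference[pos] and patient_seq[pos] == resistant_aa:
            # record (position, wild-type residue, resistant residue)
            found.append((pos, reference[pos], resistant_aa))
    return found

patient = "PISRIETVPVNLKPGMDGPKVKQ"
hits = flag_mutations(patient, REFERENCE_RT, RESISTANCE)
```

Real interpretation systems score combinations of mutations per drug rather than flagging single positions, but the alignment-and-lookup structure is the same.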
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
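The basic unit of analysis in such a system is a time-stamped event record. The sketch below shows the kind of harvesting pass described above, extracting head-entry times and inter-entry intervals; the event names and timestamps are invented, and the real system uses a MATLAB-based language rather than Python.

```python
# Sketch of analyzing a time-stamped behavioral event record.
# Each tuple is (time in seconds, event name); values are invented.
events = [
    (1.0, "head_entry"), (1.8, "pellet"), (5.2, "head_entry"),
    (5.9, "pellet"), (20.4, "head_entry"),
]

# harvest head-entry times and compute inter-entry intervals
entries = [t for t, e in events if e == "head_entry"]
intervals = [b - a for a, b in zip(entries, entries[1:])]
mean_interval = sum(intervals) / len(intervals)
```

Keeping every derived quantity traceable to the raw tuples is the point of the "full data trail" design the authors describe.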
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Authors: Richard A. Rudick, Deborah Miller, Francois Bethoux, Stephen M. Rao, Jar-Chi Lee, Darlene Stough, Christine Reece, David Schindler, Bernadett Mamone, Jay Alberts.
Institutions: Cleveland Clinic Foundation, Cleveland Clinic Foundation, Cleveland Clinic Foundation, Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending, the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient-reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out-of-clinic settings, like the patient’s home, thereby providing more meaningful real world data. The MSPT represents a new paradigm for neuroperformance testing. 
This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
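One way a reproducibility claim like the one above is quantified is test-retest correlation between two administrations of the same measure. The sketch below computes a Pearson correlation on invented 25-foot-walk times; the actual MSPT statistical analysis may use other reliability measures as well.

```python
from statistics import mean

# Sketch of a test-retest reproducibility check: Pearson correlation
# between two sessions of a timed walk. All values are invented.
def pearson(x, y):
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

session1 = [5.1, 6.3, 7.8, 9.2, 11.0]   # seconds, hypothetical walk times
session2 = [5.3, 6.1, 8.0, 9.5, 10.7]
r = pearson(session1, session2)
```

An r close to 1 indicates that patients keep their relative ranking across sessions, which is what "highly reproducible" means operationally.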
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary , University of Calgary .
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
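The notion of angular dispersion around a node can be illustrated with standard circular statistics: orientations clustered around one axis give low dispersion, while radiating patterns give high dispersion. This is a generic stand-in, not the specific dispersion measure the authors designed, and the angle values are invented.

```python
import math

# Sketch of an angular-dispersion feature for local tissue orientations.
# Orientations are axial (theta is equivalent to theta + pi), so angles
# are doubled before averaging, a standard circular-statistics trick.
def angular_dispersion(angles_rad):
    n = len(angles_rad)
    c = sum(math.cos(2 * a) for a in angles_rad) / n
    s = sum(math.sin(2 * a) for a in angles_rad) / n
    return 1.0 - math.hypot(c, s)   # 0 = perfectly aligned, 1 = fully dispersed

aligned = angular_dispersion([0.10, 0.12, 0.09, 0.11])
radiating = angular_dispersion([0.0, math.pi / 4, math.pi / 2, 3 * math.pi / 4])
```

A node surrounded by spiculating (radiating) patterns would score near 1, which is the signature the classifier looks for.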
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
A Research Method For Detecting Transient Myocardial Ischemia In Patients With Suspected Acute Coronary Syndrome Using Continuous ST-segment Analysis
Authors: Michele M. Pelter, Teri M. Kozik, Denise L. Loranger, Mary G. Carey.
Institutions: University of Nevada, Reno, St. Joseph's Medical Center, University of Rochester Medical Center .
Each year, an estimated 785,000 Americans will have a new coronary attack, or acute coronary syndrome (ACS). The pathophysiology of ACS involves rupture of an atherosclerotic plaque; hence, treatment is aimed at plaque stabilization in order to prevent cellular death. However, there is considerable debate among clinicians about which treatment pathway is best: an early invasive approach using percutaneous coronary intervention (PCI/stent) when indicated, or a conservative approach (i.e., medication only with PCI/stent if recurrent symptoms occur). There are three types of ACS: ST elevation myocardial infarction (STEMI), non-ST elevation MI (NSTEMI), and unstable angina (UA). Among the three types, NSTEMI/UA is nearly four times as common as STEMI. Treatment decisions for NSTEMI/UA are based largely on symptoms and resting or exercise electrocardiograms (ECG). However, because of the dynamic and unpredictable nature of the atherosclerotic plaque, these methods often underdetect myocardial ischemia because symptoms are unreliable, and/or continuous ECG monitoring was not utilized. Continuous 12-lead ECG monitoring, which is both inexpensive and non-invasive, can identify transient episodes of myocardial ischemia, a precursor to MI, even when asymptomatic. However, continuous 12-lead ECG monitoring is not usual hospital practice; rather, only two leads are typically monitored. Information obtained with 12-lead ECG monitoring might provide useful information for deciding the best ACS treatment. Purpose. Therefore, using 12-lead ECG monitoring, the COMPARE Study (electroCardiographic evaluatiOn of ischeMia comParing invAsive to phaRmacological trEatment) was designed to assess the frequency and clinical consequences of transient myocardial ischemia, in patients with NSTEMI/UA treated with either early invasive PCI/stent or those managed conservatively (medications or PCI/stent following recurrent symptoms). 
The purpose of this manuscript is to describe the methodology used in the COMPARE Study. Method. Permission to proceed with this study was obtained from the Institutional Review Board of the hospital and the university. Research nurses identify hospitalized patients from the emergency department and telemetry unit with suspected ACS. Once consent is obtained, a 12-lead ECG Holter monitor is applied and remains in place during the patient's entire hospital stay. Patients are also maintained on the routine bedside ECG monitoring system per hospital protocol. Off-line ECG analysis is done using sophisticated software and careful human oversight.
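A common research criterion for transient myocardial ischemia on continuous ECG is an ST-segment deviation of at least 100 microvolts (1 mm) sustained for at least 60 seconds. The sketch below counts such episodes in a sampled ST trend; the threshold, durations, and sample values are illustrative, and the COMPARE Study's exact offline-analysis criteria may differ.

```python
# Sketch of offline detection of transient ischemic episodes from a
# sampled ST-deviation trend (one invented sample per 10 seconds).
def ischemic_episodes(st_uv, sample_period_s=10, threshold_uv=100, min_s=60):
    episodes, run = 0, 0
    for v in st_uv + [0]:          # trailing 0 acts as a sentinel to close a run
        if abs(v) >= threshold_uv:
            run += sample_period_s
        else:
            if run >= min_s:       # a run long enough counts as one episode
                episodes += 1
            run = 0
    return episodes

trace = [20, 30, 120, 130, 125, 140, 110, 105, 40, 20]  # microvolts, invented
n = ischemic_episodes(trace)
```

Because `abs(v)` is used, both ST elevation and ST depression qualify, matching the idea that either direction of deviation can reflect ischemia.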
Medicine, Issue 70, Anatomy, Physiology, Cardiology, Myocardial Ischemia, Cardiovascular Diseases, Health Occupations, Health Care, transient myocardial ischemia, Acute Coronary Syndrome, electrocardiogram, ST-segment monitoring, Holter monitoring, research methodology
Adaptation of Semiautomated Circulating Tumor Cell (CTC) Assays for Clinical and Preclinical Research Applications
Authors: Lori E. Lowes, Benjamin D. Hedley, Michael Keeney, Alison L. Allan.
Institutions: London Health Sciences Centre, Western University, London Health Sciences Centre, Lawson Health Research Institute, Western University.
The majority of cancer-related deaths occur subsequent to the development of metastatic disease. This highly lethal disease stage is associated with the presence of circulating tumor cells (CTCs). These rare cells have been demonstrated to be of clinical significance in metastatic breast, prostate, and colorectal cancers. The current gold standard in clinical CTC detection and enumeration is the FDA-cleared CellSearch system (CSS). This manuscript outlines the standard protocol utilized by this platform as well as two additional adapted protocols that describe the detailed process of user-defined marker optimization for protein characterization of patient CTCs and a comparable protocol for CTC capture in very low volumes of blood, using standard CSS reagents, for studying in vivo preclinical mouse models of metastasis. In addition, differences in CTC quality between healthy donor blood spiked with cells from tissue culture versus patient blood samples are highlighted. Finally, several commonly discrepant items that can lead to CTC misclassification errors are outlined. Taken together, these protocols will provide a useful resource for users of this platform interested in preclinical and clinical research pertaining to metastasis and CTCs.
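The classification rule behind CTC enumeration on this platform can be sketched as a simple filter: an event counts as a CTC if it is cytokeratin-positive and DAPI-positive (nucleated) but CD45-negative, excluding leukocytes. The event records below are invented, and real review also weighs morphology, which this sketch omits.

```python
# Sketch of CTC classification from reviewed image-gallery events.
# Marker combinations are the standard CK+/DAPI+/CD45- rule; the
# individual event records are invented examples.
events = [
    {"CK": True,  "DAPI": True,  "CD45": False},   # candidate CTC
    {"CK": True,  "DAPI": True,  "CD45": True},    # CD45+ -> leukocyte, excluded
    {"CK": False, "DAPI": True,  "CD45": True},    # leukocyte
    {"CK": True,  "DAPI": False, "CD45": False},   # no nucleus, excluded
]

ctcs = [e for e in events if e["CK"] and e["DAPI"] and not e["CD45"]]
n_ctc = len(ctcs)
```

Most of the "commonly discrepant items" the authors mention are events that satisfy some but not all of these criteria, which is why each condition is checked explicitly.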
Medicine, Issue 84, Metastasis, circulating tumor cells (CTCs), CellSearch system, user defined marker characterization, in vivo, preclinical mouse model, clinical research
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin-embedded stained electron tomography, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. 
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
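The semi-automated style of approach (3) often starts with intensity thresholding followed by connected-component labeling to separate objects. The sketch below runs that on a tiny invented 2D "slice"; real EM data are large 3D volumes and need far more preprocessing than a single global threshold.

```python
# Sketch of threshold-plus-connected-components segmentation on a
# tiny invented 2D image (nested lists of intensities).
def segment(image, threshold):
    rows, cols = len(image), len(image[0])
    mask = [[v >= threshold for v in row] for row in image]
    seen, components = set(), 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and (r, c) not in seen:
                components += 1
                stack = [(r, c)]            # flood fill this object
                while stack:
                    y, x = stack.pop()
                    if ((y, x) in seen or not (0 <= y < rows)
                            or not (0 <= x < cols) or not mask[y][x]):
                        continue
                    seen.add((y, x))
                    stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return components

slice2d = [
    [10, 200, 210,  10],
    [10, 220,  10,  10],
    [10,  10,  10, 250],
]
n_objects = segment(slice2d, threshold=100)
```

Whether such a simple pipeline works is exactly what the triage criteria above are meant to decide: it depends on signal-to-noise ratio and how well intensity alone separates the feature from its surroundings.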
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
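The ~10-30 nm precision quoted above follows from a simple scaling: localization precision is roughly the point-spread-function width divided by the square root of the number of detected photons. The numbers below are illustrative, and this back-of-envelope form ignores background and pixelation corrections used in practice.

```python
import math

# Back-of-envelope localization precision for single-molecule imaging.
# A ~250 nm FWHM diffraction-limited spot (sigma = FWHM / 2.355) localized
# with ~100 detected photons gives roughly 10 nm precision. Values invented.
def localization_precision(psf_sigma_nm, photons):
    return psf_sigma_nm / math.sqrt(photons)

precision = localization_precision(psf_sigma_nm=250 / 2.355, photons=100)
```

This is why labeling quality matters so much: dimmer probes yield fewer photons per molecule and directly degrade the achievable resolution.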
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Generation of Comprehensive Thoracic Oncology Database - Tool for Translational Research
Authors: Mosmi Surati, Matthew Robinson, Suvobroto Nandi, Leonardo Faoro, Carley Demchuk, Rajani Kanteti, Benjamin Ferguson, Tara Gangadhar, Thomas Hensing, Rifat Hasina, Aliya Husain, Mark Ferguson, Theodore Karrison, Ravi Salgia.
Institutions: University of Chicago, University of Chicago, Northshore University Health Systems, University of Chicago, University of Chicago, University of Chicago.
The Thoracic Oncology Program Database Project was created to serve as a comprehensive, verified, and accessible repository for well-annotated cancer specimens and clinical data to be available to researchers within the Thoracic Oncology Research Program. This database also captures a large volume of genomic and proteomic data obtained from various tumor tissue studies. A team of clinical and basic science researchers, a biostatistician, and a bioinformatics expert was convened to design the database. Variables of interest were clearly defined and their descriptions were written within a standard operating manual to ensure consistency of data annotation. Using a protocol for prospective tissue banking and another protocol for retrospective banking, tumor and normal tissue samples from patients who consented to these protocols were collected. Clinical information such as demographics, cancer characterization, and treatment plans for these patients was abstracted and entered into an Access database. Proteomic and genomic data have been included in the database and have been linked to clinical information for patients described within the database. The data from each table were linked using the relationships function in Microsoft Access to allow the database manager to connect clinical and laboratory information during a query. The queried data can then be exported for statistical analysis and hypothesis generation.
Medicine, Issue 47, Database, Thoracic oncology, Bioinformatics, Biorepository, Microsoft Access, Proteomics, Genomics
Deep Neuromuscular Blockade Leads to a Larger Intraabdominal Volume During Laparoscopy
Authors: Astrid Listov Lindekaer, Henrik Halvor Springborg, Olav Istre.
Institutions: Aleris-Hamlet Hospitals, Soeborg, Denmark, Aleris-Hamlet Hospitals, Soeborg, Denmark.
Shoulder pain is a commonly reported symptom following laparoscopic procedures such as myomectomy or hysterectomy, and recent studies have shown that lowering the insufflation pressure during surgery may reduce the risk of post-operative pain. In this pilot study, a method is presented for measuring the intra-abdominal space available to the surgeon during laparoscopy, in order to examine whether the relaxation produced by deep neuromuscular blockade can increase the working surgical space sufficiently to permit a reduction in the CO2 insufflation pressure. Using the laparoscopic grasper, the distance from the promontory to the skin is measured at two different insufflation pressures: 8 mm Hg and 12 mm Hg. After the initial measurements, a neuromuscular blocking agent (rocuronium) is administered to the patient and the intra-abdominal volume is measured again. Pilot data collected from 15 patients shows that the intra-abdominal space at 8 mm Hg with blockade is comparable to the intra-abdominal space measured at 12 mm Hg without blockade. The impact of neuromuscular blockade was not correlated with patient height, weight, BMI, or age. Thus, using neuromuscular blockade to maintain a steady volume while reducing insufflation pressure may produce improved patient outcomes.
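The comparison at the heart of the pilot analysis is a paired one: each patient's promontory-to-skin distance at 12 mm Hg without blockade versus 8 mm Hg with deep blockade. The per-patient distances below are invented, not the study data; the sketch just shows the paired-difference computation.

```python
from statistics import mean

# Sketch of a paired comparison of working-space measurements (cm).
# Each index is one hypothetical patient; values are invented.
no_block_12 = [9.0, 10.2, 8.5, 9.8, 10.0]   # 12 mm Hg, no blockade
block_8     = [9.1, 10.0, 8.6, 9.9, 10.1]   # 8 mm Hg, deep blockade

paired_diff = [b - a for a, b in zip(no_block_12, block_8)]
mean_diff = mean(paired_diff)   # near zero means the spaces are comparable
```

A mean paired difference near zero is what "comparable" means here: the lower pressure plus blockade recovers roughly the space that the higher pressure alone provided.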
Medicine, Issue 76, Anatomy, Physiology, Neurobiology, Surgery, gynecology, laparoscopy, deep neuromuscular blockade, reversal, rocuronium, sugammadex, laparoscopic surgery, clinical techniques, surgical techniques
Improving IV Insulin Administration in a Community Hospital
Authors: Michael C. Magee.
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes.1-4 The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5 It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance. The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6 Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia. Despite multiple revisions of an IV insulin paper protocol, analysis of data from use of the paper protocol at WMC showed that results were suboptimal in terms of achieving normoglycemia while minimizing hypoglycemia. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from using a paper IV insulin protocol to a computerized glucose management system. 
By comparing blood glucose levels using the paper protocol to those of the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was witnessed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computer glucose management system was well under 1%.
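The before/after comparison reduces to a few simple metrics over all readings: percent in the target range and prevalence of hypo- and hyperglycemia. The thresholds (40, 70, and 180 mg/dL) come from the text; the readings below are invented for illustration.

```python
# Sketch of the glucose-control metrics described in the text.
# Thresholds are from the abstract; the readings are invented.
def glucose_metrics(readings, low=70, high=180, severe=40):
    n = len(readings)
    return {
        "in_target_pct":    100 * sum(low <= g <= high for g in readings) / n,
        "hypo_pct":         100 * sum(g < low for g in readings) / n,
        "severe_hypo_pct":  100 * sum(g < severe for g in readings) / n,
        "hyper_pct":        100 * sum(g > high for g in readings) / n,
    }

metrics = glucose_metrics([95, 110, 150, 172, 65, 210, 130, 140, 120, 105])
```

Reporting all four numbers together matters because, as NICE-SUGAR showed, pushing the in-target percentage up can silently push the hypoglycemia percentage up with it.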
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
Using Visual and Narrative Methods to Achieve Fair Process in Clinical Care
Authors: Laura S. Lorenz, Jon A. Chilingerian.
Institutions: Brandeis University, Brandeis University.
The Institute of Medicine has targeted patient-centeredness as an important area of quality improvement. A major dimension of patient-centeredness is respect for patients' values, preferences, and expressed needs. Yet specific approaches to gaining this understanding and translating it to quality care in the clinical setting are lacking. From a patient perspective quality is not a simple concept but is best understood in terms of five dimensions: technical outcomes; decision-making efficiency; amenities and convenience; information and emotional support; and overall patient satisfaction. Failure to consider quality from this five-pronged perspective results in a focus on medical outcomes, without considering the processes central to quality from the patient's perspective and vital to achieving good outcomes. In this paper, we argue for applying the concept of fair process in clinical settings. Fair process involves using a collaborative approach to exploring diagnostic issues and treatments with patients, explaining the rationale for decisions, setting expectations about roles and responsibilities, and implementing a core plan and ongoing evaluation. Fair process opens the door to bringing patient expertise into the clinical setting and the work of developing health care goals and strategies. This paper provides a step-by-step illustration of an innovative visual approach, called photovoice or photo-elicitation, to achieve fair process in clinical work with acquired brain injury survivors and others living with chronic health conditions. Applying this visual tool and methodology in the clinical setting will enhance patient-provider communication; engage patients as partners in identifying challenges, strengths, goals, and strategies; and support evaluation of progress over time. 
Asking patients to bring visuals of their lives into the clinical interaction can help to illuminate gaps in clinical knowledge, forge better therapeutic relationships with patients living with chronic conditions such as brain injury, and identify patient-centered goals and possibilities for healing. The process illustrated here can be used by clinicians (primary care physicians, rehabilitation therapists, neurologists, neuropsychologists, psychologists, and others) working with people living with chronic conditions such as acquired brain injury, mental illness, physical disabilities, HIV/AIDS, substance abuse, or post-traumatic stress, and by leaders of support groups for the types of patients described above and their family members or caregivers.
Medicine, Issue 48, person-centered care, participatory visual methods, photovoice, photo-elicitation, narrative medicine, acquired brain injury, disability, rehabilitation, palliative care
Expired CO2 Measurement in Intubated or Spontaneously Breathing Patients from the Emergency Department
Authors: Franck Verschuren, Maidei Gugu Kabayadondo, Frédéric Thys.
Institutions: Université Catholique de Louvain, Cliniques Universitaires Saint-Luc.
Carbon dioxide (CO2) and oxygen (O2) are the most important gases in the human body. The measurement of expired CO2 at the mouth has attracted growing clinical interest among physicians in the emergency department for various indications: (1) surveillance and monitoring of the intubated patient; (2) verification of the correct positioning of an endotracheal tube; (3) monitoring of a patient in cardiac arrest; (4) achieving normocapnia in intubated head trauma patients; (5) monitoring ventilation during procedural sedation. The video allows physicians to familiarize themselves with the use of capnography, and the text offers a review of the theory and principles involved. In particular, the importance of CO2 for the organism, the relevance of measuring expired CO2, the differences between arterial and expired CO2, and the equipment used in capnography, with its artifacts and pitfalls, will be reviewed. Since the main reluctance to use expired CO2 measurement stems from physicians' lack of accurate knowledge of the pathophysiology of CO2, we hope that this explanation and the accompanying video sequences will help resolve this limitation.
Medicine, Issue 47, capnography, CO2, emergency medicine, end-tidal CO2
Ole Isacson: Development of New Therapies for Parkinson's Disease
Authors: Ole Isacson.
Institutions: Harvard Medical School.
Medicine, Issue 3, Parkinson's disease, Neuroscience, dopamine, neuron, L-DOPA, stem cell, transplantation
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matches that are only loosely related.