Pubmed Article
Description and prediction of the development of metabolic syndrome: a longitudinal analysis using a Markov model approach.
PLoS ONE
PUBLISHED: 01-01-2013
Delineating the natural history of metabolic syndrome (MetS) is a prerequisite to prevention. This study aimed to build Markov models to simulate the progression of each component and to test the effect of different initial states on the development of MetS.
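As a minimal sketch of the approach, a discrete-time Markov chain propagates a cohort's state distribution through a transition matrix. The three states and all probabilities below are purely illustrative, not the study's estimates:

```python
import numpy as np

# Hypothetical 3-state model of MetS development; transition
# probabilities are illustrative only, not from the study.
P = np.array([
    [0.85, 0.12, 0.03],   # from: no MetS components
    [0.10, 0.70, 0.20],   # from: some components present
    [0.02, 0.08, 0.90],   # from: full MetS
])

state = np.array([1.0, 0.0, 0.0])  # initial state: entire cohort healthy
for _ in range(10):                # ten annual transitions
    state = state @ P              # propagate the state distribution

print(state)  # predicted distribution over the three states after 10 years
```

With transition probabilities estimated from longitudinal data, the same propagation predicts how different initial states shape later MetS prevalence.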
Authors: Caroline J. Ketcham, Eric Hall, Walter R. Bixby, Srikant Vallabhajosula, Stephen E. Folger, Matthew C. Kostek, Paul C. Miller, Kenneth P. Barnes, Kirtida Patel.
Published: 12-08-2014
ABSTRACT
Concussions are occurring at alarming rates in the United States and have become a serious public health concern. The CDC estimates that 1.6 to 3.8 million concussions occur in sports and recreational activities annually. Concussion, as defined by the 2013 Concussion Consensus Statement, “may be caused either by a direct blow to the head, face, neck or elsewhere on the body with an ‘impulsive’ force transmitted to the head.” Concussions leave the individual with both short- and long-term effects. The short-term effects of sport-related concussions may include changes in playing ability, confusion, memory disturbance, loss of consciousness, slowing of reaction time, loss of coordination, headaches, dizziness, vomiting, changes in sleep patterns, and mood changes. These symptoms typically resolve in a matter of days. However, while some individuals recover from a single concussion rather quickly, many experience lingering effects that can last for weeks or months. The factors related to concussion susceptibility and subsequent recovery times are not well known or understood at this time. Several factors have been suggested, including the individual’s concussion history, the severity of the initial injury, history of migraines, history of learning disabilities, history of psychiatric comorbidities, and possibly genetic factors. Many studies have individually investigated these factors, the short- and long-term effects of concussions, recovery time course, and susceptibility. What has not been clearly established is an effective multifaceted approach to concussion evaluation that would yield valuable information related to etiology, functional changes, and recovery. The purpose of this manuscript is to present one such multifaceted approach, which examines concussions using computerized neurocognitive testing, event-related potentials, somatosensory perceptual responses, balance assessment, gait assessment, and genetic testing.
A Novel Bayesian Change-point Algorithm for Genome-wide Analysis of Diverse ChIPseq Data Types
Authors: Haipeng Xing, Willey Liao, Yifan Mo, Michael Q. Zhang.
Institutions: Stony Brook University, Cold Spring Harbor Laboratory, University of Texas at Dallas.
ChIPseq is a widely used technique for investigating protein-DNA interactions. Read density profiles are generated by next-generation sequencing of protein-bound DNA and aligning the short reads to a reference genome. Enriched regions are revealed as peaks, which often differ dramatically in shape, depending on the target protein1. For example, transcription factors often bind in a site- and sequence-specific manner and tend to produce punctate peaks, while histone modifications are more pervasive and are characterized by broad, diffuse islands of enrichment2. Reliably identifying these regions was the focus of our work. Algorithms for analyzing ChIPseq data have employed various methodologies, from heuristics3-5 to more rigorous statistical models, e.g. Hidden Markov Models (HMMs)6-8. We sought a solution that minimized the necessity for difficult-to-define, ad hoc parameters that often compromise resolution and lessen the intuitive usability of the tool. With respect to HMM-based methods, we aimed to curtail the parameter estimation procedures and simple, finite-state classifications that are often utilized. Additionally, conventional ChIPseq data analysis involves categorization of the expected read density profiles as either punctate or diffuse, followed by application of the appropriate tool. We further aimed to replace the need for these two distinct models with a single, more versatile model that can capably address the entire spectrum of data types. To meet these objectives, we first constructed a statistical framework that naturally modeled ChIPseq data structures using a cutting-edge advance in HMMs9, which utilizes only explicit formulas, an innovation crucial to its performance advantages. More sophisticated than heuristic models, our HMM accommodates infinite hidden states through a Bayesian model. We applied it to identify reasonable change points in read density, which in turn define segments of enrichment.
Our analysis revealed that our Bayesian Change Point (BCP) algorithm had reduced computational complexity, evidenced by an abridged run time and memory footprint. The BCP algorithm was successfully applied to both punctate peak and diffuse island identification with robust accuracy and limited user-defined parameters, illustrating both its versatility and ease of use. Consequently, we believe it can be implemented readily across broad ranges of data types and end users in a manner that is easily compared and contrasted, making it a valuable tool for ChIPseq data analysis that can aid in collaboration and corroboration between research groups. Here, we demonstrate the application of BCP to existing transcription factor10,11 and epigenetic data12 to illustrate its usefulness.
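The core idea of segmenting a read-density track at change points can be illustrated with a much simpler exhaustive search. This is not the BCP algorithm itself (which places a Bayesian model over an unbounded number of change points); the simulated counts and segment lengths are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated read-density track: low background followed by an
# enriched region (counts and lengths are illustrative).
density = np.concatenate([
    rng.poisson(2, 300),   # background
    rng.poisson(12, 200),  # enrichment
]).astype(float)

def best_change_point(x):
    """Exhaustive single change point minimizing within-segment
    squared error; applied recursively, this segments a track."""
    n = len(x)
    best_k, best_cost = 1, np.inf
    for k in range(1, n):
        left, right = x[:k], x[k:]
        cost = ((left - left.mean()) ** 2).sum() + ((right - right.mean()) ** 2).sum()
        if cost < best_cost:
            best_k, best_cost = k, cost
    return best_k

boundary = best_change_point(density)
print(boundary)  # close to the true change at index 300
```

A Bayesian treatment additionally yields posterior uncertainty over the boundary locations and avoids fixing the number of segments in advance.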
Genetics, Issue 70, Bioinformatics, Genomics, Molecular Biology, Cellular Biology, Immunology, Chromatin immunoprecipitation, ChIP-Seq, histone modifications, segmentation, Bayesian, Hidden Markov Models, epigenetics
Fat Preference: A Novel Model of Eating Behavior in Rats
Authors: James M. Kasper, Sarah B. Johnson, Jonathan D. Hommel.
Institutions: University of Texas Medical Branch.
Obesity is a growing problem in the United States of America, with more than a third of the population classified as obese. One factor contributing to this multifactorial disorder is the consumption of a high-fat diet, a behavior that has been shown to increase both caloric intake and body fat content. However, the elements regulating preference for high-fat food over other foods remain understudied. To overcome this deficit, a model to quickly and easily test changes in the preference for dietary fat was developed. The Fat Preference model presents rats with a series of choices between foods with differing fat content. Like humans, rats have a natural bias toward consuming high-fat food, making the rat model ideal for translational studies. Changes in preference can be ascribed to the effect of either genetic differences or pharmacological interventions. This model allows for the exploration of the determinants of fat preference and for the screening of pharmacotherapeutic agents that influence the acquisition of obesity.
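One plausible way to score the pairwise choice sessions is the percent of total intake consumed from the higher-fat option. This metric and the gram values below are assumptions for illustration, not the authors' exact formula:

```python
# Preference expressed as percent of total intake from the high-fat
# option; values of 50 indicate indifference, higher values a fat bias.
def fat_preference(high_fat_intake_g, alternative_intake_g):
    total = high_fat_intake_g + alternative_intake_g
    return 100.0 * high_fat_intake_g / total

print(fat_preference(12.0, 4.0))  # 75.0 -> bias toward the high-fat food
```

A genetic or pharmacological manipulation would then be read out as a shift in this score relative to control sessions.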
Behavior, Issue 88, obesity, fat, preference, choice, diet, macronutrient, animal model
Quantitative Visualization and Detection of Skin Cancer Using Dynamic Thermal Imaging
Authors: Cila Herman, Muge Pirtini Cetingul.
Institutions: The Johns Hopkins University.
In 2010, approximately 68,720 melanomas will be diagnosed in the US alone, with around 8,650 resulting in death 1. To date, the only effective treatment for melanoma remains surgical excision; therefore, the key to extended survival is early detection 2,3. Considering the large numbers of patients diagnosed every year and the limitations in accessing specialized care quickly, the development of objective in vivo diagnostic instruments to aid diagnosis is essential. New techniques to detect skin cancer, especially non-invasive diagnostic tools, are being explored in numerous laboratories. Along with the surgical methods, techniques such as digital photography, dermoscopy, multispectral imaging systems (MelaFind), laser-based systems (confocal scanning laser microscopy, laser Doppler perfusion imaging, optical coherence tomography), ultrasound, and magnetic resonance imaging are being tested. Each technique offers unique advantages and disadvantages, many of which pose a compromise between effectiveness and accuracy versus ease of use and cost considerations. Details about these techniques and comparisons are available in the literature 4. Infrared (IR) imaging has been shown to be a useful method for diagnosing the signs of certain diseases by measuring the local skin temperature. There is a large body of evidence showing that disease or deviation from normal functioning is accompanied by changes in body temperature, which in turn affect the temperature of the skin 5,6. Accurate data about the temperature of the human body and skin can provide a wealth of information on the processes responsible for heat generation and thermoregulation, in particular deviations from normal conditions, often caused by disease.
However, IR imaging has not been widely recognized in medicine due to the premature use of the technology 7,8 several decades ago, when temperature measurement accuracy and spatial resolution were inadequate and sophisticated image processing tools were unavailable. This situation changed dramatically in the late 1990s and 2000s. Advances in IR instrumentation, the implementation of digital image processing algorithms, and dynamic IR imaging, which enables scientists to analyze not only the spatial but also the temporal thermal behavior of the skin 9, allowed breakthroughs in the field. In our research, we explore the feasibility of IR imaging, combined with theoretical and experimental studies, as a cost-effective, non-invasive, in vivo optical measurement technique for tumor detection, with emphasis on the screening and early detection of melanoma 10-13. In this study, we show data obtained in a patient study in which patients who possessed a pigmented lesion with a clinical indication for biopsy were selected for imaging. We compared the difference in thermal responses between healthy and malignant tissue and compared our data with biopsy results. We concluded that the increased metabolic activity of the melanoma lesion can be detected by dynamic infrared imaging.
Medicine, Issue 51, Infrared imaging, quantitative thermal analysis, image processing, skin cancer, melanoma, transient thermal response, skin thermal models, skin phantom experiment, patient study
Tissue Engineering of a Human 3D in vitro Tumor Test System
Authors: Corinna Moll, Jenny Reboredo, Thomas Schwarz, Antje Appelt, Sebastian Schürlein, Heike Walles, Sarah Nietzer.
Institutions: University Hospital Würzburg.
Cancer is one of the leading causes of death worldwide. Current therapeutic strategies are predominantly developed in 2D culture systems, which inadequately reflect physiological conditions in vivo. Biological 3D matrices provide cells with an environment in which they can self-organize, allowing the study of tissue organization and cell differentiation. Such scaffolds can be seeded with a mixture of different cell types to study direct 3D cell-cell interactions. To mimic the 3D complexity of cancer tumors, our group has developed a 3D in vitro tumor test system. Our 3D tissue test system models the in vivo situation of malignant peripheral nerve sheath tumors (MPNSTs), which we established with our decellularized biological vascularized scaffold (BioVaSc) derived from a porcine jejunal segment. In our model, we reseeded a modified BioVaSc matrix with primary fibroblasts, microvascular endothelial cells (mvECs) and the S462 tumor cell line. For static culture, the vascular structure of the BioVaSc is removed and the remaining scaffold is cut open on one side (Small Intestinal Submucosa, SIS-Muc). The resulting matrix is then fixed between two metal rings (cell crowns). Another option is to culture the cell-seeded SIS-Muc in a flow bioreactor system that exposes the cells to shear stress. Here, the bioreactor is connected to a peristaltic pump in a self-constructed incubator. A computer regulates the arterial oxygen and nutrient supply via parameters such as pressure, temperature, and flow rate. This setup allows for a dynamic culture with either pressure-regulated pulsatile or constant flow. In this study, we successfully established both a static and a dynamic 3D culture system for MPNSTs. The ability to model cancer tumors in a more natural 3D environment will enable the discovery, testing, and validation of future pharmaceuticals in a human-like model.
Cancer Biology, Issue 78, Biomedical Engineering, Bioengineering, Medicine, Anatomy, Physiology, Molecular Biology, Cellular Biology, Tissue Engineering, Tumor Cells, Cultured, Biotechnology, Culture Techniques, Cell Engineering, Cellular Microenvironment, Equipment and Supplies, Decellularization, BioVaSc, primary cell isolation, tumor test system, dynamic culture conditions, bioreactor, 3D in vitro models, cell culture
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
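The baseline that DoE software reduces or augments is the full-factorial design, which enumerates every combination of factor levels. The factors and levels below are hypothetical, not the study's actual design space:

```python
from itertools import product

# Hypothetical factors and levels for a transient-expression experiment;
# names and values are illustrative only.
factors = {
    "promoter":      ["35S", "nos"],
    "incubation_C":  [22, 25, 28],
    "leaf_position": ["lower", "upper"],
}

# Full-factorial design: every combination of factor levels.
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs))  # 2 * 3 * 2 = 12 experimental runs
```

Software-guided DoE would typically select an optimal subset of such runs (e.g. a D-optimal design) and augment it stepwise, rather than executing the full grid.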
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
A Protocol for Computer-Based Protein Structure and Function Prediction
Authors: Ambrish Roy, Dong Xu, Jonathan Poisson, Yang Zhang.
Institutions: University of Michigan , University of Kansas.
Genome sequencing projects have deciphered millions of protein sequences, which require knowledge of their structure and function to improve the understanding of their biological roles. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules that are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms and protein-ligand binding sites. All the predictions are tagged with a confidence score, which indicates how accurate the predictions are expected to be in the absence of experimental data. To facilitate the special requests of end users, the server provides channels to accept user-specified inter-residue distance and contact maps to interactively change the I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. The structural information can be collected by users based on experimental evidence or biological insight with the purpose of improving the quality of I-TASSER predictions. The server was ranked as the best program for protein structure and function prediction in the recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries who are using the on-line I-TASSER server.
Biochemistry, Issue 57, On-line server, I-TASSER, protein structure prediction, function prediction
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Authors: Hans-Peter Müller, Jan Kassubek.
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls. DTI data analysis is performed in a variety of ways, i.e. voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metric information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA maps after stereotaxic alignment, in a longitudinal analysis at an individual subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by application of a controlled elimination of gradient directions with high noise levels. In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
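The voxelwise FA metric mentioned above has a standard closed-form definition in terms of the three eigenvalues of the diffusion tensor, sketched here:

```python
import numpy as np

def fractional_anisotropy(evals):
    """FA from the three diffusion-tensor eigenvalues (standard definition):
    sqrt((l1-l2)^2 + (l2-l3)^2 + (l3-l1)^2) / sqrt(2*(l1^2 + l2^2 + l3^2))."""
    l1, l2, l3 = evals
    num = np.sqrt((l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2)
    den = np.sqrt(2.0 * (l1 ** 2 + l2 ** 2 + l3 ** 2))
    return num / den

print(fractional_anisotropy([1.0, 1.0, 1.0]))  # 0.0: isotropic diffusion (e.g. CSF)
print(fractional_anisotropy([1.7, 0.2, 0.2]))  # ~0.87: coherent, directional WM tract
```

FA ranges from 0 (isotropic) to 1 (diffusion along a single direction), which is what makes it suitable for voxelwise and tractwise group comparisons.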
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Annotation of Plant Gene Function via Combined Genomics, Metabolomics and Informatics
Authors: Takayuki Tohge, Alisdair R. Fernie.
Institutions: Max-Planck-Institut.
Given the ever-expanding number of model plant species for which complete genome sequences are available, and the abundance of bio-resources such as knockout mutants, wild accessions and advanced breeding populations, there is a growing need for gene functional annotation. In this protocol, annotation of plant gene function using combined gene co-expression analysis, metabolomics and informatics is provided (Figure 1). This approach is based on the theory of using target genes of known function to allow the identification of non-annotated genes likely to be involved in a certain metabolic process, with the identification of target compounds via metabolomics. Strategies are put forward for applying this information to populations generated by both forward and reverse genetics approaches, although none of these is effortless. As a corollary, this approach can also be used to characterize unknown peaks representing new or species-specific secondary metabolites in particular tissues, plant species or stress treatments, which is currently an important challenge in the understanding of plant metabolism.
Plant Biology, Issue 64, Genetics, Bioinformatics, Metabolomics, Plant metabolism, Transcriptome analysis, Functional annotation, Computational biology, Plant biology, Theoretical biology, Spectroscopy and structural analysis
Ultrasound Assessment of Endothelial-Dependent Flow-Mediated Vasodilation of the Brachial Artery in Clinical Research
Authors: Hugh Alley, Christopher D. Owens, Warren J. Gasper, S. Marlene Grenon.
Institutions: University of California, San Francisco; Veterans Affairs Medical Center, San Francisco.
The vascular endothelium is a monolayer of cells that covers the interior of blood vessels and provides both structural and functional roles. The endothelium acts as a barrier, preventing leukocyte adhesion and aggregation, as well as controlling permeability to plasma components. Functionally, the endothelium affects vessel tone. Endothelial dysfunction is an imbalance between the chemical species which regulate vessel tone, thromboresistance, cellular proliferation and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia. The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase of intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed, while subjects with endothelial damage experienced paradoxical vasoconstriction. There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia. This technique, known as endothelium-dependent, flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator dependent and presents a steep learning curve.
This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
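The quantity at the heart of the technique can be stated compactly: FMD is the percent diameter change from baseline to peak during reactive hyperemia. A small sketch with hypothetical diameters:

```python
# FMD as defined in the text: percent diameter change in the brachial
# artery from baseline to peak during reactive hyperemia. The example
# diameters below are hypothetical.
def fmd_percent(baseline_diameter_mm, peak_diameter_mm):
    return round(100.0 * (peak_diameter_mm - baseline_diameter_mm) / baseline_diameter_mm, 2)

print(fmd_percent(4.0, 4.3))  # 7.5 (% dilation)
```

In practice, both diameters come from edge-detected B-mode ultrasound frames, which is where the operator dependence enters.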
Medicine, Issue 92, endothelial function, endothelial dysfunction, brachial artery, peripheral artery disease, ultrasound, vascular, endothelium, cardiovascular disease.
Quantification of Global Diastolic Function by Kinematic Modeling-based Analysis of Transmitral Flow via the Parametrized Diastolic Filling Formalism
Authors: Sina Mossahebi, Simeng Zhu, Howard Chen, Leonid Shmuylovich, Erina Ghosh, Sándor J. Kovács.
Institutions: Washington University in St. Louis.
Quantitative cardiac function assessment remains a challenge for physiologists and clinicians. Although historically invasive methods have comprised the only means available, the development of noninvasive imaging modalities (echocardiography, MRI, CT) with high temporal and spatial resolution provides a new window for quantitative diastolic function assessment. Echocardiography is the agreed-upon standard for diastolic function assessment, but indexes in current clinical use merely utilize selected features of chamber dimension (M-mode) or blood/tissue motion (Doppler) waveforms without incorporating the physiologic causal determinants of the motion itself. The recognition that all left ventricles (LV) initiate filling by serving as mechanical suction pumps allows global diastolic function to be assessed based on laws of motion that apply to all chambers. What differentiates one heart from another are the parameters of the equation of motion that governs filling. Accordingly, development of the Parametrized Diastolic Filling (PDF) formalism has shown that the entire range of clinically observed early transmitral flow (Doppler E-wave) patterns is extremely well fit by the laws of damped oscillatory motion. This permits analysis of individual E-waves in accordance with a causal mechanism (recoil-initiated suction) that yields three (numerically) unique lumped parameters whose physiologic analogues are chamber stiffness (k), viscoelasticity/relaxation (c), and load (xo). The recording of transmitral flow (Doppler E-waves) is standard practice in clinical cardiology and, therefore, the echocardiographic recording method is only briefly reviewed. Our focus is on determination of the PDF parameters from routinely recorded E-wave data.
As the highlighted results indicate, once the PDF parameters have been obtained from a suitable number of load-varying E-waves, the investigator is free to use the parameters or construct indexes from them (such as the stored energy 1/2·k·xo², the maximum A-V pressure gradient k·xo, the load-independent index of diastolic function, etc.) and select the aspect of physiology or pathophysiology to be quantified.
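The kinematics behind the formalism can be sketched directly: the E-wave contour is modeled as the velocity of a damped oscillator released from rest at displacement xo. Unit inertance and the underdamped regime are assumed in this sketch, and the parameter values are illustrative only:

```python
import numpy as np

def e_wave_velocity(t, k, c, xo):
    """Velocity of the underdamped solution of x'' + c*x' + k*x = 0
    released from rest at displacement xo; models the E-wave contour."""
    w = np.sqrt(k - c ** 2 / 4.0)          # damped angular frequency
    return (k * xo / w) * np.exp(-c * t / 2.0) * np.sin(w * t)

k, c, xo = 200.0, 14.0, 10.0               # stiffness, relaxation, load (illustrative)
t = np.linspace(0.0, 0.25, 500)            # time axis in seconds
v = e_wave_velocity(t, k, c, xo)

stored_energy = 0.5 * k * xo ** 2          # 1/2·k·xo², as in the text
peak_gradient_index = k * xo               # k·xo, maximum A-V pressure gradient index
```

Fitting this contour to a measured Doppler E-wave (e.g. by nonlinear least squares) is what recovers the subject-specific k, c, and xo.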
Bioengineering, Issue 91, cardiovascular physiology, ventricular mechanics, diastolic function, mathematical modeling, Doppler echocardiography, hemodynamics, biomechanics
'Bioluminescent' Reporter Phage for the Detection of Category A Bacterial Pathogens
Authors: David A. Schofield, Ian J. Molineux, Caroline Westwater.
Institutions: Guild Associates, Inc., University of Texas at Austin, Medical University of South Carolina.
Yersinia pestis and Bacillus anthracis are Category A bacterial pathogens that are the causative agents of plague and anthrax, respectively 1. Although the natural occurrence of both diseases is now relatively rare, the possibility of terrorist groups using these pathogens as a bioweapon is real. Because of the diseases' inherent communicability, rapid clinical course, and high mortality rate, it is critical that an outbreak be detected quickly. Therefore, methodologies that provide rapid detection and diagnosis are essential to ensure immediate implementation of public health measures and activation of crisis management. Recombinant reporter phage may provide a rapid and specific approach for the detection of Y. pestis and B. anthracis. The Centers for Disease Control and Prevention currently use classical phage lysis assays for the confirmed identification of these bacterial pathogens 2-4. These assays take advantage of naturally occurring phage which are specific and lytic for their bacterial hosts. After overnight growth of the cultivated bacterium in the presence of the specific phage, the formation of plaques (bacterial lysis) provides a positive identification of the bacterial target. Although these assays are robust, they suffer from three shortcomings: 1) they are laboratory based; 2) they require bacterial isolation and cultivation from the suspected sample; and 3) they take 24-36 h to complete. To address these issues, recombinant "light-tagged" reporter phage were genetically engineered by integrating the Vibrio harveyi luxAB genes into the genome of Y. pestis- and B. anthracis-specific phage 5-8. The resulting luxAB reporter phage were able to detect their specific target rapidly (within minutes) and sensitively by conferring a bioluminescent phenotype to recipient cells. Importantly, detection was obtained either with cultivated recipient cells or with mock-infected clinical specimens 7.
For demonstration purposes, here we describe the method for the phage-mediated detection of a known Y. pestis isolate using a luxAB reporter phage constructed from the CDC plague diagnostic phage ΦA1122 6,7 (Figure 1). A similar method, with minor modifications (e.g. change in growth temperature and media), may be used for the detection of B. anthracis isolates using the B. anthracis reporter phage Wβ::luxAB 8. The method describes the phage-mediated transduction of a bioluminescent phenotype to cultivated Y. pestis cells, which is subsequently measured using a microplate luminometer. The major advantages of this method over the traditional phage lysis assays are the ease of use, the rapid results, and the ability to test multiple samples simultaneously in a 96-well microtiter plate format. Figure 1. Detection schematic. The phage are mixed with the sample, the phage infect the cells, luxAB is expressed, and the cells bioluminesce. Sample processing is not necessary; the phage and cells are mixed and subsequently measured for light.
Immunology, Issue 53, Reporter phage, bioluminescence, detection, plague, anthrax
Preterm EEG: A Multimodal Neurophysiological Protocol
Authors: Susanna Stjerna, Juha Voipio, Marjo Metsäranta, Kai Kaila, Sampsa Vanhatalo.
Institutions: University of Helsinki.
Since its introduction in the early 1950s, electroencephalography (EEG) has been widely used in neonatal intensive care units (NICU) for assessment and monitoring of brain function in preterm and term babies. The most common indications are the diagnosis of epileptic seizures, assessment of brain maturity, and recovery from hypoxic-ischemic events. EEG recording techniques and the understanding of neonatal EEG signals have dramatically improved, but these advances have been slow to penetrate clinical traditions. The aim of this presentation is to make the theory and practice of advanced EEG recording available to neonatal units. In the theoretical part, we present animations to illustrate how a preterm brain gives rise to spontaneous and evoked EEG activities, both of which are unique to this developmental phase as well as crucial for proper brain maturation. Recent animal work has shown that structural brain development is clearly reflected in early EEG activity. The most important structures in this regard are the growing long-range connections and the transient cortical structure, the subplate. Sensory stimuli in a preterm baby will generate responses that are seen at the single-trial level, and they have underpinnings in the subplate-cortex interaction. This makes neonatal EEG readily a multimodal study, where EEG not only records cortical function but also tests subplate function via different sensory modalities. Finally, the introduction of clinically suitable dense array EEG caps, as well as amplifiers capable of recording low frequencies, has disclosed a multitude of brain activities that have until now been overlooked. In the practical part of this video, we show how a multimodal, dense array EEG study is performed in the neonatal intensive care unit on a preterm baby in the incubator. The video demonstrates preparation of the baby and incubator, application of the EEG cap, and performance of the sensory stimulations.
Neuroscience, Issue 60, neurophysiology, preterm baby, neonatal, EEG, evoked response, high density EEG, FbEEG, sensory evoked response, neonatal intensive care unit
3774
Using Continuous Data Tracking Technology to Study Exercise Adherence in Pulmonary Rehabilitation
Authors: Amanda K. Rizk, Rima Wardini, Emilie Chan-Thim, Barbara Trutschnigg, Amélie Forget, Véronique Pepin.
Institutions: Concordia University, Hôpital du Sacré-Coeur de Montréal.
Pulmonary rehabilitation (PR) is an important component in the management of respiratory diseases. The effectiveness of PR is dependent upon adherence to exercise training recommendations. The study of exercise adherence is thus a key step towards the optimization of PR programs. To date, mostly indirect measures, such as rates of participation, completion, and attendance, have been used to determine adherence to PR. The purpose of the present protocol is to describe how continuous data tracking technology can be used to measure adherence to a prescribed aerobic training intensity on a second-by-second basis. In our investigations, adherence has been defined as the percent time spent within a specified target heart rate range. As such, using a combination of hardware and software, heart rate is measured, tracked, and recorded second by second during cycling for each participant, for each exercise session. The data are subsequently extracted and analyzed with statistical software. The same protocol can be applied to determine adherence to other measures of exercise intensity, such as time spent at a specified wattage, level, or speed on the cycle ergometer. Furthermore, hardware and software are also available to measure adherence to other modes of training, such as the treadmill, elliptical, stepper, and arm ergometer. The present protocol therefore has broad applicability for directly measuring adherence to aerobic exercise.
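As a concrete illustration of this adherence definition, a minimal sketch might compute percent time in a target heart rate range from second-by-second samples as follows (the function name and sample values are hypothetical; the actual protocol relies on commercial tracking hardware and statistical software):

```python
def percent_time_in_zone(hr_samples, hr_low, hr_high):
    """Percent of 1 Hz (second-by-second) heart rate samples
    falling within the inclusive target range [hr_low, hr_high]."""
    if not hr_samples:
        return 0.0
    in_zone = sum(hr_low <= hr <= hr_high for hr in hr_samples)
    return 100.0 * in_zone / len(hr_samples)

# Example: a 10-second recording with a target range of 120-140 bpm
session = [118, 121, 125, 133, 139, 142, 138, 130, 127, 119]
adherence = percent_time_in_zone(session, 120, 140)  # → 70.0
```

The same computation applies unchanged to other intensity measures (wattage, level, or speed) by substituting the sample stream and target range.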
Medicine, Issue 81, Data tracking, exercise, rehabilitation, adherence, patient compliance, health behavior, user-computer interface.
50643
Construction of Vapor Chambers Used to Expose Mice to Alcohol During the Equivalent of all Three Trimesters of Human Development
Authors: Russell A. Morton, Marvin R. Diaz, Lauren A. Topper, C. Fernando Valenzuela.
Institutions: University of New Mexico Health Sciences Center.
Exposure to alcohol during development can result in a constellation of morphological and behavioral abnormalities that are collectively known as Fetal Alcohol Spectrum Disorders (FASDs). At the most severe end of the spectrum is Fetal Alcohol Syndrome (FAS), characterized by growth retardation, craniofacial dysmorphology, and neurobehavioral deficits. Studies with animal models, including rodents, have elucidated many molecular and cellular mechanisms involved in the pathophysiology of FASDs. Ethanol administration to pregnant rodents has been used to model human exposure during the first and second trimesters of pregnancy. Third trimester ethanol consumption in humans has been modeled using neonatal rodents. However, few rodent studies have characterized the effect of ethanol exposure during the equivalent of all three trimesters of human pregnancy, a pattern of exposure that is common in pregnant women. Here, we show how to build vapor chambers from readily obtainable materials that can each accommodate up to six standard mouse cages. We describe a vapor chamber paradigm that can be used to model exposure to ethanol, with minimal handling, during all three trimesters. Our studies demonstrate that pregnant dams developed significant metabolic tolerance to ethanol. However, neonatal mice did not develop metabolic tolerance, and the number of fetuses, fetal weight, placenta weight, number of pups/litter, number of dead pups/litter, and pup weight were not significantly affected by ethanol exposure. An important advantage of this paradigm is its applicability to studies with genetically-modified mice. Additionally, this paradigm minimizes handling of animals, a major confound in fetal alcohol research.
Medicine, Issue 89, fetal, ethanol, exposure, paradigm, vapor, development, alcoholism, teratogenic, animal, mouse, model
51839
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
50476
In Vivo Modeling of the Morbid Human Genome using Danio rerio
Authors: Adrienne R. Niederriter, Erica E. Davis, Christelle Golzio, Edwin C. Oh, I-Chun Tsai, Nicholas Katsanis.
Institutions: Duke University Medical Center, Duke University.
Here, we present methods for the development of assays to query potentially clinically significant nonsynonymous changes using in vivo complementation in zebrafish. Zebrafish (Danio rerio) are a useful animal system due to their experimental tractability; embryos are transparent to enable facile viewing, undergo rapid development ex vivo, and can be genetically manipulated.1 These aspects have allowed for significant advances in the analysis of embryogenesis, molecular processes, and morphogenetic signaling. Taken together, the advantages of this vertebrate model make zebrafish highly amenable to modeling the developmental defects in pediatric disease, and in some cases, adult-onset disorders. Because the zebrafish genome is highly conserved with that of humans (~70% orthologous), it is possible to recapitulate human disease states in zebrafish. This is accomplished either through the injection of mutant human mRNA to induce dominant negative or gain of function alleles, or utilization of morpholino (MO) antisense oligonucleotides to suppress genes to mimic loss of function variants. Through complementation of MO-induced phenotypes with capped human mRNA, our approach enables the interpretation of the deleterious effect of mutations on human protein sequence based on the ability of mutant mRNA to rescue a measurable, physiologically relevant phenotype. Modeling of the human disease alleles occurs through microinjection of zebrafish embryos with MO and/or human mRNA at the 1-4 cell stage, and phenotyping up to seven days post fertilization (dpf). This general strategy can be extended to a wide range of disease phenotypes, as demonstrated in the following protocol. We present our established models for morphogenetic signaling, craniofacial, cardiac, vascular integrity, renal function, and skeletal muscle disorder phenotypes, as well as others.
Molecular Biology, Issue 78, Genetics, Biomedical Engineering, Medicine, Developmental Biology, Biochemistry, Anatomy, Physiology, Bioengineering, Genomics, Medical, zebrafish, in vivo, morpholino, human disease modeling, transcription, PCR, mRNA, DNA, Danio rerio, animal model
50338
Community-based Adapted Tango Dancing for Individuals with Parkinson's Disease and Older Adults
Authors: Madeleine E. Hackney, Kathleen McKee.
Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.
Adapted tango dancing improves mobility and balance in older adults and in other populations with balance impairments. Composed of very simple step elements, adapted tango involves movement initiation and cessation, multi-directional perturbations, and varied speeds and rhythms. Its focus on foot placement, whole-body coordination, and attention to partner, path of movement, and aesthetics likely underlies adapted tango's demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology used to disseminate the adapted tango teaching methods to dance instructor trainees and to implement adapted tango, through the trainees, in the community for older adults and individuals with Parkinson's Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, tandem stance, Berg Balance Scale, gait speed, and 30 sec chair stand), safety, and fidelity of the program are maximized through targeted instructor and volunteer training and a structured, detailed syllabus outlining class practices and progression.
Behavior, Issue 94, Dance, tango, balance, pedagogy, dissemination, exercise, older adults, Parkinson's Disease, mobility impairments, falls
52066
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Authors: Sara Tremblay, Vincent Beaulé, Sébastien Proulx, Louis-Philippe Lafleur, Julien Doyon, Małgorzata Marjańska, Hugo Théoret.
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood 33. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner 41. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration 34. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunctions after stroke, which consists of bilateral stimulation of the primary motor cortices 27,30,31. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
51631
Design and Operation of a Continuous 13C and 15N Labeling Chamber for Uniform or Differential, Metabolic and Structural, Plant Isotope Labeling
Authors: Jennifer L Soong, Dan Reuss, Colin Pinney, Ty Boyack, Michelle L Haddix, Catherine E Stewart, M. Francesca Cotrufo.
Institutions: Colorado State University, USDA-ARS.
Tracing rare stable isotopes from plant material through the ecosystem provides the most sensitive information about ecosystem processes, from CO2 fluxes and soil organic matter formation to small-scale stable-isotope biomarker probing. Coupling multiple stable isotopes such as 13C with 15N, 18O or 2H has the potential to reveal even more information about complex stoichiometric relationships during biogeochemical transformations. Isotope labeled plant material has been used in various studies of litter decomposition and soil organic matter formation1-4. From these and other studies, however, it has become apparent that structural components of plant material behave differently than metabolic components (i.e. leachable low molecular weight compounds) in terms of microbial utilization and long-term carbon storage5-7. The ability to study structural and metabolic components separately provides a powerful new tool for advancing the forefront of ecosystem biogeochemical studies. Here we describe a method for producing 13C and 15N labeled plant material that is either uniformly labeled throughout the plant or differentially labeled in structural and metabolic plant components. We then present the construction and operation of a continuous 13C and 15N labeling chamber that can be modified to meet various research needs. Uniformly labeled plant material is produced by continuous labeling from seedling to harvest, while differential labeling is achieved by removing the growing plants from the chamber weeks prior to harvest. Representative results from growing Andropogon gerardii Kaw demonstrate the system's ability to efficiently label plant material at the targeted levels.
Through this method we have produced plant material with a 4.4 atom% 13C and 6.7 atom% 15N uniform plant label, or material that is differentially labeled by up to 1.29 atom% 13C and 0.56 atom% 15N in its metabolic and structural components (hot-water-extractable and hot-water-residual components, respectively). Challenges lie in maintaining proper temperature, humidity, CO2 concentration, and light levels in an airtight 13C-CO2 atmosphere for successful plant production. This chamber represents a useful research tool for effectively producing uniformly or differentially multi-isotope-labeled plant material for use in experiments on ecosystem biogeochemical cycling.
Environmental Sciences, Issue 83, 13C, 15N, plant, stable isotope labeling, Andropogon gerardii, metabolic compounds, structural compounds, hot water extraction
51117
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Authors: Richard A. Rudick, Deborah Miller, Francois Bethoux, Stephen M. Rao, Jar-Chi Lee, Darlene Stough, Christine Reece, David Schindler, Bernadett Mamone, Jay Alberts.
Institutions: Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending, the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out-of-clinic settings, like the patient's home, thereby providing more meaningful real-world data. The MSPT represents a new paradigm for neuroperformance testing.
This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
51318
A Zebrafish Model of Diabetes Mellitus and Metabolic Memory
Authors: Robert V. Intine, Ansgar S. Olsen, Michael P. Sarras Jr..
Institutions: Rosalind Franklin University of Medicine and Science.
Diabetes mellitus currently affects 346 million individuals, and this is projected to increase to 400 million by 2030. Evidence from both the laboratory and large scale clinical trials has revealed that diabetic complications progress unimpeded via the phenomenon of metabolic memory even when glycemic control is pharmaceutically achieved. Gene expression can be stably altered through epigenetic changes, which not only allow cells and organisms to quickly respond to changing environmental stimuli but also confer the ability of the cell to "memorize" these encounters once the stimulus is removed. As such, the roles that these mechanisms play in the metabolic memory phenomenon are currently being examined. We have recently reported the development of a zebrafish model of type I diabetes mellitus and characterized this model to show that diabetic zebrafish not only display the known secondary complications, including the changes associated with diabetic retinopathy, diabetic nephropathy and impaired wound healing, but also exhibit impaired caudal fin regeneration. This model is unique in that the zebrafish is capable of regenerating its damaged pancreas and restoring a euglycemic state, similar to what would be expected in post-transplant human patients. Moreover, multiple rounds of caudal fin amputation allow for the separation and study of pure epigenetic effects in an in vivo system without potential complicating factors from the previous diabetic state. Although euglycemia is achieved following pancreatic regeneration, the diabetic secondary complications of impaired fin regeneration and skin wound healing persist indefinitely. In the case of impaired fin regeneration, this pathology is retained even after multiple rounds of fin regeneration in the daughter fin tissues. These observations point to an underlying epigenetic process existing in the metabolic memory state.
Here we present the methods needed to successfully generate the diabetic and metabolic memory groups of fish and discuss the advantages of this model.
Medicine, Issue 72, Genetics, Genomics, Physiology, Anatomy, Biomedical Engineering, Metabolomics, Zebrafish, diabetes, metabolic memory, tissue regeneration, streptozocin, epigenetics, Danio rerio, animal model, diabetes mellitus, diabetes, drug discovery, hyperglycemia
50232
TMS: Using the Theta-Burst Protocol to Explore Mechanism of Plasticity in Individuals with Fragile X Syndrome and Autism
Authors: Lindsay M. Oberman, Jared C. Horvath, Alvaro Pascual-Leone.
Institutions: Beth Israel Deaconess Medical Center.
Fragile X Syndrome (FXS), also known as Martin-Bell Syndrome, is a genetic abnormality found on the X chromosome.1,2 Individuals suffering from FXS display abnormalities in the expression of FMRP, the protein product of the FMR1 gene, which is required for typical, healthy neural development.3 Recent data have suggested that the loss of this protein can cause the cortex to be hyperexcitable, thereby affecting overall patterns of neural plasticity.4,5 In addition, Fragile X shows a strong comorbidity with autism: in fact, 30% of children with FXS are diagnosed with autism, and 2 - 5% of autistic children suffer from FXS.6 Transcranial Magnetic Stimulation (a non-invasive neurostimulatory and neuromodulatory technique that can transiently or lastingly modulate cortical excitability via the application of localized magnetic field pulses 7,8) represents a unique method of exploring plasticity and the manifestations of FXS within affected individuals. More specifically, Theta-Burst Stimulation (TBS), a specific stimulatory protocol shown to modulate cortical plasticity for up to 30 minutes after stimulation cessation in healthy populations, has already proven to be an efficacious tool in the exploration of abnormal plasticity.9,10 Recent studies have shown the effects of TBS last considerably longer in individuals on the autistic spectrum - up to 90 minutes.11 This extended effect-duration suggests an underlying abnormality in the brain's natural plasticity state in autistic individuals - similar to the hyperexcitability induced by Fragile X Syndrome. In this experiment, utilizing single-pulse motor-evoked potentials (MEPs) as our benchmark, we will explore the effects of both intermittent and continuous TBS on cortical plasticity in individuals suffering from FXS and individuals on the Autistic Spectrum.
Neuroscience, Issue 46, Transcranial Magnetic Stimulation, Theta-Burst Stimulation, Neural Plasticity, Fragile X, Autism
2272
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. 
In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
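The quantification principle behind the skeletonized analysis can be illustrated with a minimal sketch. This is a pure-Python stand-in: the 0/1 grid is a hypothetical toy skeleton rather than real confocal data, and `skeleton_stats` is an illustrative helper, not part of the published workflow. Once a network is reduced to a one-pixel-wide binary skeleton, total length (pixel count) and branching density follow from local neighbor counts:

```python
def skeleton_stats(skel):
    """Given a binary skeleton as a 2D list of 0/1, return
    (pixel_count, branch_points). A skeleton pixel with 3 or more
    skeleton neighbors (8-connectivity) is counted as a branch point."""
    rows, cols = len(skel), len(skel[0])
    pixels = branches = 0
    for r in range(rows):
        for c in range(cols):
            if not skel[r][c]:
                continue
            pixels += 1
            # Count skeleton pixels in the 8-neighborhood, clipped at borders
            nbrs = sum(skel[rr][cc]
                       for rr in range(max(r - 1, 0), min(r + 2, rows))
                       for cc in range(max(c - 1, 0), min(c + 2, cols))
                       if (rr, cc) != (r, c))
            if nbrs >= 3:
                branches += 1
    return pixels, branches

# A tiny T-shaped skeleton: a horizontal run with one vertical branch
skel = [
    [0, 0, 0, 0, 0],
    [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
]
print(skeleton_stats(skel))  # → (7, 4)
```

Open-source imaging libraries provide the binarization and skeletonization steps themselves (e.g., Otsu thresholding followed by morphological skeletonization); the sketch above only shows how simple measures fall out of the resulting skeleton.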
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
51823
Interview: Glycolipid Antigen Presentation by CD1d and the Therapeutic Potential of NKT cell Activation
Authors: Mitchell Kronenberg.
Institutions: La Jolla Institute for Allergy and Immunology.
Natural Killer T cells (NKT) are critical determinants of the immune response to cancer, regulation of autoimmune disease, clearance of infectious agents, and the development of atherosclerotic plaques. In this interview, Mitch Kronenberg discusses his laboratory's efforts to understand the mechanism through which NKT cells are activated by glycolipid antigens. Central to these studies is CD1d, the antigen presenting molecule that presents glycolipids to NKT cells. The advent of CD1d tetramer technology, a technique developed by the Kronenberg lab, is critical for the sorting and identification of subsets of specific glycolipid-reactive T cells. Mitch explains how glycolipid agonists are being used as therapeutic agents to activate NKT cells in cancer patients and how CD1d tetramers can be used to assess the state of the NKT cell population in vivo following glycolipid agonist therapy. The current status of ongoing clinical trials using these agonists is discussed, as well as Mitch's predictions for areas in the field of immunology that will have emerging importance in the near future.
Immunology, Issue 10, Natural Killer T cells, NKT cells, CD1 Tetramers, antigen presentation, glycolipid antigens, CD1d, Mucosal Immunity, Translational Research
635
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.