PubMed Article
Selection of genetic and phenotypic features associated with inflammatory status of patients on dialysis using relaxed linear separability method.
PLoS ONE
PUBLISHED: 01-01-2014
Risk factors in patients with a particular disease can be identified in clinical data sets using feature selection procedures from pattern recognition and data mining. The applicability of the relaxed linear separability (RLS) method of feature subset selection was checked for high-dimensional, mixed-type (genetic and phenotypic) clinical data of patients with end-stage renal disease. The RLS method allowed substantial reduction of dimensionality by omitting redundant features while maintaining the linear separability of data sets of patients with high and low levels of an inflammatory biomarker. The synergy between genetic and phenotypic features in differentiating between these two subgroups was demonstrated.
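The RLS procedure itself is more involved, but its core idea, discarding redundant features while the two patient groups remain linearly separable, can be conveyed with a toy backward-elimination sketch. The perceptron-based separability check, the elimination order, and the data below are illustrative assumptions, not the authors' algorithm:

```python
import numpy as np

def linearly_separable(X, y, epochs=200, lr=0.1):
    """Crude separability check: does a perceptron reach zero
    training errors on (X, y), with labels y in {-1, +1}?"""
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(Xb, y):
            if yi * np.dot(w, xi) <= 0:            # misclassified point
                w += lr * yi * xi
                errors += 1
        if errors == 0:
            return True
    return False

def greedy_feature_elimination(X, y):
    """Drop features one at a time, keeping a drop only if the two
    classes remain linearly separable on the surviving features."""
    keep = list(range(X.shape[1]))
    for j in sorted(keep, reverse=True):
        trial = [k for k in keep if k != j]
        if trial and linearly_separable(X[:, trial], y):
            keep = trial
    return keep

# Toy data: the class label depends only on feature 0; feature 1 is noise.
X = np.array([[1.0, 0.3], [2.0, -0.5], [-1.0, 0.2], [-2.0, -0.1]])
y = np.array([1, 1, -1, -1])
print(greedy_feature_elimination(X, y))  # [0]
```

Note that a perceptron proves separability only when it converges within the epoch budget, and the RLS method relaxes the separability requirement in a controlled way rather than demanding perfect separation, so this sketch conveys only the flavor of backward feature elimination.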
ABSTRACT
Saccharomyces cerevisiae has been an excellent model system for examining mechanisms and consequences of genome instability. Information gained from this yeast model is relevant to many organisms, including humans, since DNA repair and DNA damage response factors are well conserved across diverse species. However, technical constraints have so far prevented S. cerevisiae from being used to fully address whether the rate of accumulating mutations changes with increasing replicative (mitotic) age. For instance, measurements of yeast replicative lifespan through micromanipulation involve very small populations of cells, which prohibit detection of rare mutations. Genetic methods to enrich for mother cells in populations by inducing death of daughter cells have been developed, but population sizes are still limited by the frequency with which random mutations that compromise the selection systems occur. The current protocol takes advantage of magnetic sorting of surface-labeled yeast mother cells to obtain large enough populations of aging mother cells to quantify rare mutations through phenotypic selections. Mutation rates, measured through fluctuation tests, and mutation frequencies are first established for young cells and used to predict the frequency of mutations in mother cells of various replicative ages. Mutation frequencies are then determined for sorted mother cells, and the age of the mother cells is determined using flow cytometry by staining with a fluorescent reagent that detects the bud scars formed on their cell surfaces during cell division. Comparing the mutation frequencies predicted from the number of cell divisions with the frequencies experimentally observed for mother cells of a given replicative age can then reveal whether there are age-related changes in the rate of accumulating mutations.
Variations of this basic protocol provide the means to investigate the influence of alterations in specific gene functions or specific environmental conditions on mutation accumulation to address mechanisms underlying genome instability during replicative aging.
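The comparison at the heart of the protocol, predicted versus observed mutant frequency at a given replicative age, reduces to simple arithmetic. A minimal sketch, assuming mutations accumulate linearly at the per-division rate measured in young cells (all numerical values are illustrative placeholders, not data from the protocol):

```python
def predicted_mutant_frequency(f_young, rate_per_division, n_divisions):
    """Mutant frequency expected in mother cells after n_divisions,
    assuming mutations accumulate linearly at the per-division rate
    measured in young cells by fluctuation test."""
    return f_young + rate_per_division * n_divisions

# Illustrative placeholder values (not data from the protocol):
f_young = 2e-7   # mutant frequency measured in young cells
mu = 5e-8        # mutation rate per cell division (fluctuation test)
age = 10         # replicative age, e.g. bud scars counted by flow cytometry

expected = predicted_mutant_frequency(f_young, mu, age)  # ~7e-07
# An observed frequency in sorted mother cells well above `expected`
# would point to an age-related increase in the mutation rate.
```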
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Authors: Hans-Peter Müller, Jan Kassubek.
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences in WM involvement patterns across brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls. DTI data analysis is performed in several complementary ways, i.e. voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for preserving quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify the metrics defined by FT. Additionally, application of DTI methods, i.e. comparison of FA maps after stereotaxic alignment, in a longitudinal analysis on an individual-subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by controlled elimination of gradient directions with high noise levels. In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Reconstitution of a Kv Channel into Lipid Membranes for Structural and Functional Studies
Authors: Sungsoo Lee, Hui Zheng, Liang Shi, Qiu-Xing Jiang.
Institutions: University of Texas Southwestern Medical Center at Dallas.
To study lipid-protein interactions in a reductionistic fashion, it is necessary to incorporate membrane proteins into membranes of well-defined lipid composition. We are studying lipid-dependent gating effects in a prototype voltage-gated potassium (Kv) channel, and have worked out detailed procedures to reconstitute the channels into different membrane systems. Our reconstitution procedures take into account both detergent-induced fusion of vesicles and the fusion of protein/detergent micelles with lipid/detergent mixed micelles, as well as the importance of reaching an equilibrium distribution of lipids among the protein/detergent/lipid and detergent/lipid mixed micelles. Our data suggest that the channels insert into lipid vesicles in relatively random orientations, and that the reconstitution efficiency is so high that no detectable protein aggregates were seen in fractionation experiments. We have utilized the reconstituted channels to determine the conformational states of the channels in different lipids, record electrical activities of a small number of channels incorporated in planar lipid bilayers, screen for conformation-specific ligands from a phage-displayed peptide library, and support the growth of 2D crystals of the channels in membranes. The reconstitution procedures described here may be adapted for studying other membrane proteins in lipid bilayers, especially for investigating lipid effects on eukaryotic voltage-gated ion channels.
Molecular Biology, Issue 77, Biochemistry, Genetics, Cellular Biology, Structural Biology, Biophysics, Membrane Lipids, Phospholipids, Carrier Proteins, Membrane Proteins, Micelles, Molecular Motor Proteins, life sciences, biochemistry, Amino Acids, Peptides, and Proteins, lipid-protein interaction, channel reconstitution, lipid-dependent gating, voltage-gated ion channel, conformation-specific ligands, lipids
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and of protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence-selection optimization stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold-specificity stage and a binding-affinity stage. A rank-ordered list of the sequences for each step of the process, along with the relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of these methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Identifying DNA Mutations in Purified Hematopoietic Stem/Progenitor Cells
Authors: Ziming Cheng, Ting Zhou, Azhar Merchant, Thomas J. Prihoda, Brian L. Wickes, Guogang Xu, Christi A. Walter, Vivienne I. Rebel.
Institutions: UT Health Science Center at San Antonio.
In recent years, it has become apparent that genomic instability is tightly related to many developmental disorders, cancers, and aging. Given that stem cells are responsible for ensuring tissue homeostasis and repair throughout life, it is reasonable to hypothesize that the stem cell population is critical for preserving the genomic integrity of tissues. Therefore, significant interest has arisen in assessing the impact of endogenous and environmental factors on genomic integrity in stem cells and their progeny, with the aim of understanding the etiology of stem cell-based diseases. LacI transgenic mice carry a recoverable λ phage vector encoding the LacI reporter system, in which the LacI gene serves as the mutation reporter. A mutated LacI gene results in the production of β-galactosidase, which cleaves a chromogenic substrate, turning it blue. The LacI reporter system is carried in all cells, including stem/progenitor cells, and can easily be recovered and used to subsequently infect E. coli. After incubating the infected E. coli on agarose containing the correct substrate, plaques can be scored: blue plaques indicate a mutant LacI gene, while clear plaques harbor the wild-type gene. The frequency of blue plaques among total plaques indicates the mutant frequency in the original cell population from which the DNA was extracted. Sequencing the mutant LacI gene reveals the location of the mutations in the gene and the type of mutation. The LacI transgenic mouse model is well established as an in vivo mutagenesis assay, and both the mice and the reagents for the assay are commercially available. Here we describe in detail how this model can be adapted to measure the frequency of spontaneously occurring DNA mutants in stem cell-enriched Lin-IL7R-Sca-1+cKit++ (LSK) cells and other subpopulations of the hematopoietic system.
Infection, Issue 84, In vivo mutagenesis, hematopoietic stem/progenitor cells, LacI mouse model, DNA mutations, E. coli
Utility of Dissociated Intrinsic Hand Muscle Atrophy in the Diagnosis of Amyotrophic Lateral Sclerosis
Authors: Parvathi Menon, Steve Vucic.
Institutions: Westmead Hospital, University of Sydney, Australia.
The split hand phenomenon refers to predominant wasting of thenar muscles and is an early and specific feature of amyotrophic lateral sclerosis (ALS). A novel split hand index (SI) was developed to quantify the split hand phenomenon, and its diagnostic utility was assessed in ALS patients. The split hand index was derived by dividing the product of the compound muscle action potential (CMAP) amplitude recorded over the abductor pollicis brevis and first dorsal interosseous muscles by the CMAP amplitude recorded over the abductor digiti minimi muscle. In order to assess the diagnostic utility of the split hand index, ALS patients were prospectively assessed and their results were compared to neuromuscular disorder patients. The split hand index was significantly reduced in ALS when compared to neuromuscular disorder patients (P<0.0001). Limb-onset ALS patients exhibited the greatest reduction in the split hand index, and a value of 5.2 or less reliably differentiated ALS from other neuromuscular disorders. Consequently, the split hand index appears to be a novel diagnostic biomarker for ALS, perhaps facilitating an earlier diagnosis.
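The index described above is a simple ratio of CMAP amplitudes, which can be sketched directly (the amplitude values below are illustrative, not patient data):

```python
def split_hand_index(cmap_apb, cmap_fdi, cmap_adm):
    """Split hand index (SI): the product of the CMAP amplitudes
    recorded over the abductor pollicis brevis (APB) and first
    dorsal interosseous (FDI), divided by the CMAP amplitude
    recorded over the abductor digiti minimi (ADM)."""
    return (cmap_apb * cmap_fdi) / cmap_adm

# Illustrative amplitudes in mV:
si = split_hand_index(cmap_apb=2.0, cmap_fdi=3.0, cmap_adm=8.0)
print(si)  # 0.75
# Per the study, an SI of 5.2 or less reliably differentiated ALS
# from other neuromuscular disorders.
if si <= 5.2:
    print("SI in the range reported for ALS")
```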
Medicine, Issue 85, Amyotrophic Lateral Sclerosis (ALS), dissociated muscle atrophy, hypothenar muscles, motor neuron disease, split-hand index, thenar muscles
Dynamic Visual Tests to Identify and Quantify Visual Damage and Repair Following Demyelination in Optic Neuritis Patients
Authors: Noa Raz, Michal Hallak, Tamir Ben-Hur, Netta Levin.
Institutions: Hadassah Hebrew-University Medical Center.
In order to follow optic neuritis patients and evaluate the effectiveness of their treatment, a handy, accurate and quantifiable tool is required to assess changes in myelination within the central nervous system (CNS). However, standard measurements, including routine visual tests and MRI scans, are not sensitive enough for this purpose. We present two visual tests addressing dynamic monocular and binocular functions which may closely associate with the extent of myelination along visual pathways. These are the Object From Motion (OFM) extraction test and the Time-constrained Stereo protocol. In the OFM test, an array of dots composes an object: the dots within the object region move rightward while the dots outside it move leftward, or vice versa. The dot pattern generates a camouflaged object that cannot be detected when the dots are stationary or moving as a whole; object recognition is therefore critically dependent on motion perception. In the Time-constrained Stereo protocol, spatially disparate images are presented for a limited length of time, challenging binocular 3-dimensional integration in time. Both tests are appropriate for clinical use and provide a simple, yet powerful, way to identify and quantify processes of demyelination and remyelination along visual pathways. These protocols may be efficient tools for diagnosing and following optic neuritis and multiple sclerosis patients. In the diagnostic process, they may reveal visual deficits that cannot be identified by current standard visual measurements. Moreover, they sensitively identify the basis of the currently unexplained continued visual complaints of patients following recovery of visual acuity. In longitudinal follow-up, the protocols can be used as a sensitive marker of demyelinating and remyelinating processes over time. They may therefore be used to evaluate the efficacy of current and evolving therapeutic strategies targeting myelination of the CNS.
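As a concrete illustration of the OFM stimulus logic, the sketch below advances one animation frame: dots flagged as inside the camouflaged object move one way, background dots the other. This is an illustrative reconstruction of the stimulus principle, not the authors' stimulus code; the displacement size, display width, and wrapping behavior are assumptions:

```python
import numpy as np

def ofm_frame_step(positions, in_object, dx=1.0, width=200.0):
    """Advance one frame of an Object-From-Motion stimulus: dots
    inside the camouflaged object move rightward, background dots
    move leftward, and x-coordinates wrap at the display edge."""
    shift = np.where(in_object, dx, -dx)          # per-dot displacement
    moved = positions.copy()
    moved[:, 0] = (moved[:, 0] + shift) % width   # wrap horizontally
    return moved

# Two illustrative dots: one inside the object region, one outside.
dots = np.array([[10.0, 5.0],     # object dot -> moves right
                 [199.5, 7.0]])   # background dot -> moves left
frame1 = ofm_frame_step(dots, np.array([True, False]))
# In a static frame the object is invisible; only the opposing
# motion of the two dot populations reveals its shape.
```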
Medicine, Issue 86, Optic neuritis, visual impairment, dynamic visual functions, motion perception, stereopsis, demyelination, remyelination
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
Analysis of Nephron Composition and Function in the Adult Zebrafish Kidney
Authors: Kristen K. McCampbell, Kristin N. Springer, Rebecca A. Wingert.
Institutions: University of Notre Dame.
The zebrafish model has emerged as a relevant system to study kidney development, regeneration and disease. Both the embryonic and adult zebrafish kidneys are composed of functional units known as nephrons, which are highly conserved with other vertebrates, including mammals. Research in zebrafish has recently demonstrated that two distinctive phenomena transpire after adult nephrons incur damage: first, there is robust regeneration within existing nephrons that replaces the destroyed tubule epithelial cells; second, entirely new nephrons are produced from renal progenitors in a process known as neonephrogenesis. In contrast, humans and other mammals seem to have only a limited ability for nephron epithelial regeneration. To date, the mechanisms responsible for these kidney regeneration phenomena remain poorly understood. Since adult zebrafish kidneys undergo both nephron epithelial regeneration and neonephrogenesis, they provide an outstanding experimental paradigm to study these events. Further, there is a wide range of genetic and pharmacological tools available in the zebrafish model that can be used to delineate the cellular and molecular mechanisms that regulate renal regeneration. One essential aspect of such research is the evaluation of nephron structure and function. This protocol describes a set of labeling techniques that can be used to gauge renal composition and test nephron functionality in the adult zebrafish kidney. These methods are thus widely applicable to the future phenotypic characterization of adult zebrafish kidney injury paradigms, including, but not limited to, nephrotoxicant exposure regimens and genetic methods of targeted cell death such as the nitroreductase-mediated cell ablation technique. Further, these methods could be used to study genetic perturbations in adult kidney formation and could also be applied to assess renal status during chronic disease modeling.
Cellular Biology, Issue 90, zebrafish; kidney; nephron; nephrology; renal; regeneration; proximal tubule; distal tubule; segment; mesonephros; physiology; acute kidney injury (AKI)
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. 
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Modeling Astrocytoma Pathogenesis In Vitro and In Vivo Using Cortical Astrocytes or Neural Stem Cells from Conditional, Genetically Engineered Mice
Authors: Robert S. McNeill, Ralf S. Schmid, Ryan E. Bash, Mark Vitucci, Kristen K. White, Andrea M. Werneke, Brian H. Constance, Byron Huff, C. Ryan Miller.
Institutions: University of North Carolina School of Medicine, Emory University School of Medicine.
Current astrocytoma models are limited in their ability to define the roles of oncogenic mutations in specific brain cell types during disease pathogenesis and in their utility for preclinical drug development. In order to design a better model system for these applications, phenotypically wild-type cortical astrocytes and neural stem cells (NSC) from conditional, genetically engineered mice (GEM) that harbor various combinations of floxed oncogenic alleles were harvested and grown in culture. Genetic recombination was induced in vitro using adenoviral Cre-mediated recombination, resulting in expression of mutated oncogenes and deletion of tumor suppressor genes. The phenotypic consequences of these mutations were defined by measuring proliferation, transformation, and drug response in vitro. Orthotopic allograft models, whereby transformed cells are stereotactically injected into the brains of immune-competent, syngeneic littermates, were developed to define the roles of oncogenic mutations and cell type in tumorigenesis in vivo. Unlike xenografts of most established human glioblastoma cell lines, injection of transformed GEM-derived cortical astrocytes into the brains of immune-competent littermates produced astrocytomas, including the most aggressive subtype, glioblastoma, that recapitulated the histopathological hallmarks of human astrocytomas, including diffuse invasion of normal brain parenchyma. Bioluminescence imaging of orthotopic allografts from transformed astrocytes engineered to express luciferase was utilized to monitor in vivo tumor growth over time. Thus, astrocytoma models using astrocytes and NSC harvested from GEM with conditional oncogenic alleles provide an integrated system to study the genetics and cell biology of astrocytoma pathogenesis in vitro and in vivo, and may be useful in preclinical drug development for these devastating diseases.
Neuroscience, Issue 90, astrocytoma, cortical astrocytes, genetically engineered mice, glioblastoma, neural stem cells, orthotopic allograft
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent the spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases, using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
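The first analysis step, probing local tissue orientation with Gabor filters, can be sketched as follows. This is a generic, minimal reconstruction (the kernel size, wavelength, and bank of eight angles are assumptions), not the authors' implementation, which additionally uses phase portraits, node maps, and fractal measures:

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, sigma):
    """Real-valued Gabor kernel whose sinusoid oscillates along
    the direction theta (radians), under a Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_theta = x * np.cos(theta) + y * np.sin(theta)
    envelope = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    return envelope * np.cos(2.0 * np.pi * x_theta / wavelength)

def dominant_orientation(patch, n_angles=8, wavelength=8.0, sigma=4.0):
    """Return the angle (radians) of the Gabor filter in a small
    bank that responds most strongly to the square patch."""
    angles = np.linspace(0.0, np.pi, n_angles, endpoint=False)
    responses = [abs(np.sum(patch * gabor_kernel(patch.shape[0],
                                                 wavelength, theta, sigma)))
                 for theta in angles]
    return angles[int(np.argmax(responses))]

# Synthetic patch: a sinusoidal grating varying along x, so the
# strongest response should come from the theta = 0 filter.
y, x = np.mgrid[-16:17, -16:17]
patch = np.cos(2.0 * np.pi * x / 8.0)
theta_hat = dominant_orientation(patch)  # expected: 0.0 (radians)
```

In the full method, maps of such orientation estimates feed the phase-portrait analysis that flags node-like radiating patterns.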
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
In Vivo Modeling of the Morbid Human Genome using Danio rerio
Authors: Adrienne R. Niederriter, Erica E. Davis, Christelle Golzio, Edwin C. Oh, I-Chun Tsai, Nicholas Katsanis.
Institutions: Duke University Medical Center, Duke University.
Here, we present methods for the development of assays to query potentially clinically significant nonsynonymous changes using in vivo complementation in zebrafish. Zebrafish (Danio rerio) are a useful animal system due to their experimental tractability; embryos are transparent to enable facile viewing, undergo rapid development ex vivo, and can be genetically manipulated.1 These aspects have allowed for significant advances in the analysis of embryogenesis, molecular processes, and morphogenetic signaling. Taken together, the advantages of this vertebrate model make zebrafish highly amenable to modeling the developmental defects in pediatric disease, and in some cases, adult-onset disorders. Because the zebrafish genome is highly conserved with that of humans (~70% orthologous), it is possible to recapitulate human disease states in zebrafish. This is accomplished either through the injection of mutant human mRNA to induce dominant negative or gain of function alleles, or utilization of morpholino (MO) antisense oligonucleotides to suppress genes to mimic loss of function variants. Through complementation of MO-induced phenotypes with capped human mRNA, our approach enables the interpretation of the deleterious effect of mutations on human protein sequence based on the ability of mutant mRNA to rescue a measurable, physiologically relevant phenotype. Modeling of the human disease alleles occurs through microinjection of zebrafish embryos with MO and/or human mRNA at the 1-4 cell stage, and phenotyping up to seven days post fertilization (dpf). This general strategy can be extended to a wide range of disease phenotypes, as demonstrated in the following protocol. We present our established models for morphogenetic signaling, craniofacial, cardiac, vascular integrity, renal function, and skeletal muscle disorder phenotypes, as well as others.
Molecular Biology, Issue 78, Genetics, Biomedical Engineering, Medicine, Developmental Biology, Biochemistry, Anatomy, Physiology, Bioengineering, Genomics, Medical, zebrafish, in vivo, morpholino, human disease modeling, transcription, PCR, mRNA, DNA, Danio rerio, animal model
Vascular Occlusion Training for Inclusion Body Myositis: A Novel Therapeutic Approach
Authors: Bruno Gualano, Carlos Ugrinowitsch, Manoel Neves Jr., Fernanda R. Lima, Ana Lúcia S. Pinto, Gilberto Laurentino, Valmor A.A. Tricoli, Antonio H. Lancha Jr., Hamilton Roschel.
Institutions: University of São Paulo.
Inclusion body myositis (IBM) is a rare idiopathic inflammatory myopathy. It is known to produce remarkable muscle weakness and to greatly compromise function and quality of life. Moreover, clinical practice suggests that, unlike other inflammatory myopathies, the majority of IBM patients do not respond to treatment with immunosuppressive or immunomodulatory drugs to counteract disease progression1. Additionally, conventional resistance training programs have proven ineffective in restoring muscle function and muscle mass in these patients2,3. Nevertheless, we have recently observed that restricting muscle blood flow using tourniquet cuffs, in association with moderate-intensity resistance training, produced a significant gain in muscle mass and function in an IBM patient, along with substantial benefits in quality of life4. Thus, a new non-pharmacological approach for IBM patients has been proposed. Herein, we describe the details of a proposed protocol for vascular occlusion associated with a resistance training program for this population.
Medicine, Issue 40, exercise training, therapeutical, myositis, vascular occlusion
1894
Examining the Characteristics of Episodic Memory using Event-related Potentials in Patients with Alzheimer's Disease
Authors: Erin Hussey, Brandon Ally.
Institutions: Vanderbilt University.
Our laboratory uses event-related EEG potentials (ERPs) to understand and support behavioral investigations of episodic memory in patients with amnestic mild cognitive impairment (aMCI) and Alzheimer's disease (AD). Whereas behavioral data inform us about the patients' performance, ERPs allow us to record discrete changes in brain activity. Further, ERPs can give us insight into the onset, duration, and interaction of independent cognitive processes associated with memory retrieval. In patient populations, these types of studies are used to examine which aspects of memory are impaired and which remain relatively intact compared to a control population. The methodology for collecting ERP data from a vulnerable patient population while these participants perform a recognition memory task is reviewed. This protocol includes participant preparation, quality assurance, data acquisition, and data analysis. In addition to basic setup and acquisition, we will also demonstrate the use of high-density (128 channel) electrode arrays to obtain greater spatial resolution and perform source localization.
Medicine, Issue 54, recognition memory, episodic memory, event-related potentials, dual process, Alzheimer's disease, amnestic mild cognitive impairment
2715
Nerve Excitability Assessment in Chemotherapy-induced Neurotoxicity
Authors: Susanna B. Park, Cindy S-Y. Lin, Matthew C. Kiernan.
Institutions: University of New South Wales.
Chemotherapy-induced neurotoxicity is a serious consequence of cancer treatment, which occurs with some of the most commonly used chemotherapies1,2. Chemotherapy-induced peripheral neuropathy produces symptoms of numbness and paraesthesia in the limbs and may progress to difficulties with fine motor skills and walking, leading to functional impairment. In addition to producing troubling symptoms, chemotherapy-induced neuropathy may limit treatment success, leading to dose reduction or early cessation of treatment. Neuropathic symptoms may persist long-term, leaving permanent nerve damage in patients with an otherwise good prognosis3. As chemotherapy is utilised more often as a preventative measure and survival rates increase, long-lasting and significant neurotoxicity will become an increasingly important problem. There are no established neuroprotective or treatment options, and sensitive assessment methods are lacking. Appropriate assessment of neurotoxicity will be critical as a prognostic factor and as a suitable endpoint for future trials of neuroprotective agents. Current methods to assess the severity of chemotherapy-induced neuropathy utilise clinician-based grading scales, which have been demonstrated to lack sensitivity to change and inter-observer objectivity4. Conventional nerve conduction studies provide information about compound action potential amplitude and conduction velocity, which are relatively non-specific measures and do not provide insight into ion channel function or resting membrane potential. Accordingly, prior studies have demonstrated that conventional nerve conduction studies are not sensitive to early change in chemotherapy-induced neurotoxicity4-6. In comparison, nerve excitability studies utilize threshold tracking techniques, which have been developed to enable assessment of ion channels, pumps and exchangers in vivo in large myelinated human axons7-9.
Nerve excitability techniques have been established as a tool to examine the development and severity of chemotherapy-induced neurotoxicity10-13. Comprising a number of excitability parameters, nerve excitability studies can be used to assess acute neurotoxicity arising immediately following infusion and the development of chronic, cumulative neurotoxicity. Nerve excitability techniques are feasible in the clinical setting, with each test requiring only 5-10 minutes to complete. Nerve excitability equipment is readily available commercially, and a portable system has been devised so that patients can be tested in situ in the infusion centre setting. In addition, these techniques can be adapted for use in multiple chemotherapies. In patients treated with the chemotherapy oxaliplatin, primarily utilised for colorectal cancer, nerve excitability techniques provide a method to identify patients at risk for neurotoxicity prior to the onset of chronic neuropathy. Nerve excitability studies have revealed the development of an acute Na+ channelopathy in motor and sensory axons10-13. Importantly, patients who demonstrated changes in excitability in early treatment were subsequently more likely to develop moderate to severe neurotoxicity11. Across treatment, however, striking longitudinal changes were identified only in sensory axons; these changes were able to predict clinical neurological outcome in 80% of patients10. They demonstrated a different pattern to those seen acutely following oxaliplatin infusion, and most likely reflect the development of significant axonal damage and membrane potential change in sensory nerves, which develops longitudinally during oxaliplatin treatment10. Significant abnormalities developed during early treatment, prior to any reduction in conventional measures of nerve function, suggesting that excitability parameters may provide a sensitive biomarker.
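As a rough illustration of the threshold tracking principle underlying these excitability studies, the sketch below adjusts a stimulus current until a simulated response reaches a fixed fraction (here 40%) of its maximum. The sigmoid stimulus-response curve and all parameter values are hypothetical, not taken from the protocol or any commercial system.

```python
import numpy as np

def response(current_ma, threshold=4.0, slope=1.5):
    """Hypothetical sigmoid stimulus-response curve: fraction of the maximal
    compound action potential evoked by a stimulus current (mA)."""
    return 1.0 / (1.0 + np.exp(-(current_ma - threshold) / slope))

def track_threshold(target=0.4, current=1.0, gain=2.0, tol=1e-3, max_steps=200):
    """Proportional threshold tracking: nudge the stimulus current up or down
    in proportion to the error between the evoked response and the target
    (here 40% of maximum), until the error falls below tolerance."""
    for _ in range(max_steps):
        err = target - response(current)
        if abs(err) < tol:
            break
        current += gain * err  # increase current when the response is too small
    return current

threshold_current = track_threshold()
```

Changes in the tracked current over a treatment course, rather than its absolute value, are what such studies follow.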
Neuroscience, Issue 62, Chemotherapy, Neurotoxicity, Neuropathy, Nerve excitability, Ion channel function, Oxaliplatin, oncology, medicine
3439
Measurement of Factor V Activity in Human Plasma Using a Microplate Coagulation Assay
Authors: Derek Tilley, Irina Levit, John A. Samis.
Institutions: University of Ontario Institute of Technology.
In response to injury, blood coagulation is activated and results in generation of the clotting protease, thrombin. Thrombin cleaves fibrinogen to fibrin, which forms an insoluble clot that stops hemorrhage. Factor V (FV) in its activated form, FVa, is a critical cofactor for the protease FXa and an accelerator of thrombin generation during fibrin clot formation as part of prothrombinase1,2. Manual FV assays have been described3,4, but they are time consuming and subjective. Automated FV assays have been reported5-7, but the analyzer and reagents are expensive and generally provide only the clot time, not the rate and extent of fibrin formation. The microplate platform is preferred for measuring enzyme-catalyzed events because of convenience, time, cost, small volume, continuous monitoring, and high throughput8,9. Microplate assays have been reported for clot lysis10, platelet aggregation11, and coagulation factors12, but not for FV activity in human plasma. The goal of the method was to develop a microplate assay that measures FV activity during fibrin formation in human plasma. This novel microplate method outlines a simple, inexpensive, and rapid assay of FV activity in human plasma. The assay utilizes a kinetic microplate reader to monitor the absorbance change at 405 nm during fibrin formation in human plasma (Figure 1)13. The assay accurately measures the time, initial rate, and extent of fibrin clot formation. It requires only μl quantities of plasma, is complete in 6 min, has high throughput, is sensitive to 24-80 pM FV, and measures the amount of unintentionally activated (1-stage activity) and thrombin-activated FV (2-stage activity) to obtain a complete assessment of its total functional activity (2-stage activity - 1-stage activity). Disseminated intravascular coagulation (DIC) is an acquired coagulopathy that most often develops from pre-existing infections14.
DIC is associated with a poor prognosis and increases mortality above that of the pre-existing pathology15. The assay was used to show that in 9 patients with DIC, the FV 1-stage, 2-stage, and total activities were decreased, on average, by 54%, 44%, and 42%, respectively, compared with normal pooled human reference plasma (NHP). The FV microplate assay is easily adaptable to measure the activity of any coagulation factor. This assay will increase our understanding of FV biochemistry through a more accurate and complete measurement of its activity in research and clinical settings. This information will positively impact healthcare through earlier diagnosis and development of more effective treatments for coagulation disorders such as DIC.
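The activity arithmetic described above (total functional activity = 2-stage activity - 1-stage activity, with decreases expressed relative to NHP) can be sketched as follows. The numeric values are hypothetical, chosen only so that the computed decreases match the averages reported for the DIC patients (54%, 44%, and 42%).

```python
def percent_decrease(patient, reference):
    """Percent decrease of a patient activity relative to reference plasma (NHP)."""
    return 100.0 * (reference - patient) / reference

# Hypothetical activities in arbitrary units (not measured values).
nhp = {"one_stage": 20.0, "two_stage": 120.0}
nhp["total"] = nhp["two_stage"] - nhp["one_stage"]              # 100.0

patient = {"one_stage": 9.2, "two_stage": 67.2}
patient["total"] = patient["two_stage"] - patient["one_stage"]  # ~58.0

# Decreases relative to NHP: 54%, 44%, and 42%.
drops = {k: percent_decrease(patient[k], nhp[k]) for k in nhp}
```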
Immunology, Issue 67, Factor V, Microplate, Coagulation assay, Human plasma, Disseminated intravascular coagulation (DIC), blood clotting
3822
Rapid Point-of-Care Assay of Enoxaparin Anticoagulant Efficacy in Whole Blood
Authors: Mario A. Inchiosa Jr., Suryanarayana Pothula, Keshar Kubal, Vajubhai T. Sanchala, Iris Navarro.
Institutions: New York Medical College.
There is a need for a clinical assay to determine the extent to which a patient's blood is effectively anticoagulated by the low-molecular-weight heparin (LMWH) enoxaparin. There are also urgent clinical situations where it would be important to determine this rapidly. The present assay is designed to accomplish this. We only assayed human blood samples that were spiked with known concentrations of enoxaparin. The essential feature of the present assay is the quantification of the efficacy of enoxaparin in a patient's blood sample by degrading the LMWH to complete inactivity with heparinase. Two blood samples were drawn into Vacutainer tubes (Becton Dickinson; Franklin Lakes, NJ) that were spiked with enoxaparin; one sample was digested with heparinase for 5 min at 37 °C, while the other represented the patient's baseline anticoagulated status. The percent shortening of clotting time in the heparinase-treated sample, as compared to the baseline state, yielded the anticoagulant contribution of enoxaparin. We used the portable, battery-operated Hemochron 801 apparatus (International Technidyne Corp., Edison, NJ) for measurements of clotting times. The apparatus has two thermostatically controlled (37 °C) assay tube wells. We conducted the assays in two types of assay cartridges that are available from the manufacturer of the instrument. One cartridge was modified to increase its sensitivity: we removed the kaolin from the FTK-ACT cartridge by extensive rinsing with distilled water, leaving only the glass surface of the tube, and perhaps the detection magnet, as activators. We called this our minimally activated assay (MAA). The use of a minimally activated assay has been studied by us and others2-4. The second cartridge studied was an activated partial thromboplastin time (aPTT) assay (A104), used as supplied by the manufacturer. The thermostated wells of the instrument were used for both the heparinase digestion and coagulation assays.
The assay can be completed within 10 min. The MAA assay showed robust changes in clotting time after heparinase digestion of enoxaparin over a typical clinical concentration range. At 0.2 anti-Xa I.U. of enoxaparin per ml of blood sample, heparinase digestion caused an average decrease of 9.8% (20.4 sec) in clotting time; at 1.0 I.U. per ml of enoxaparin there was a 41.4% decrease (148.8 sec). This report only presents the experimental application of the assay; its value in a clinical setting must still be established.
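The percent-shortening calculation at the heart of the assay is straightforward; a minimal sketch follows. The clotting times are hypothetical values chosen to be consistent with the averages quoted above (a 20.4 sec / 9.8% change at 0.2 I.U./ml and a 148.8 sec / 41.4% change at 1.0 I.U./ml).

```python
def percent_shortening(baseline_s, heparinase_s):
    """Anticoagulant contribution of enoxaparin, expressed as the percent
    shortening of clotting time once heparinase has degraded the LMWH."""
    return 100.0 * (baseline_s - heparinase_s) / baseline_s

# Hypothetical clotting times (seconds) consistent with the averages above.
low_dose = percent_shortening(208.2, 187.8)   # ~9.8% at 0.2 I.U./ml
high_dose = percent_shortening(359.4, 210.6)  # ~41.4% at 1.0 I.U./ml
```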
Medicine, Issue 68, Immunology, Physiology, Pharmacology, low-molecular-weight-heparin, low-molecular-weight-heparin assay, LMWH point-of-care assay, anti-Factor-Xa activity, enoxaparin, heparinase, whole blood, assay
3852
Trans-vivo Delayed Type Hypersensitivity Assay for Antigen Specific Regulation
Authors: Ewa Jankowska-Gan, Subramanya Hegde, William J. Burlingham.
Institutions: University of Wisconsin-Madison, School of Medicine and Public Health.
Delayed-type hypersensitivity (DTH) is a rapid in vivo manifestation of a T cell-dependent immune response to a foreign antigen (Ag) that the host immune system has experienced in the recent past. DTH reactions are often divided into a sensitization phase, referring to the initial antigen experience, and a challenge phase, which usually follows several days after sensitization. The lack of a delayed-type hypersensitivity response to a recall Ag on skin testing is often regarded as evidence of anergy. The traditional DTH assay has been used effectively in diagnosing many microbial infections. Despite sharing similar immune features, such as lymphocyte infiltration, edema, and tissue necrosis, the direct DTH is not a feasible diagnostic technique in transplant patients because direct injection could result in sensitization to donor antigens and graft loss. To avoid this problem, the human-to-mouse "trans-vivo" DTH assay was developed1,2. This test is essentially a transfer DTH assay, in which human peripheral blood mononuclear cells (PBMCs) and specific antigens are injected subcutaneously into the pinnae or footpad of a naïve mouse and DTH-like swelling is measured after 18-24 hr3. Antigen presentation by human antigen presenting cells, such as macrophages or dendritic cells (DCs), to T cells in highly vascular mouse tissue triggers the inflammatory cascade and attracts mouse immune cells, resulting in a swelling response. The response is antigen-specific and requires prior antigen sensitization. A positive donor-reactive DTH response in the Tv-DTH assay indicates that the transplant patient has developed a pro-inflammatory immune disposition toward graft alloantigens. The most important feature of this assay is that it can also be used to detect regulatory T cells, which cause bystander suppression.
Bystander suppression of a DTH recall response in the presence of donor antigen is characteristic of transplant recipients with accepted allografts2,4-14. The monitoring of transplant recipients for alloreactivity and regulation by Tv-DTH may identify a subset of patients who could benefit from reduction of immunosuppression without elevated risk of rejection or deteriorating renal function. A promising area is the application of the Tv-DTH assay to the monitoring of autoimmunity15,16 and tumor immunology17.
Immunology, Issue 75, Medicine, Molecular Biology, Cellular Biology, Biomedical Engineering, Anatomy, Physiology, Cancer Biology, Surgery, Trans-vivo delayed type hypersensitivity, Tv-DTH, Donor antigen, Antigen-specific regulation, peripheral blood mononuclear cells, PBMC, T regulatory cells, severe combined immunodeficient mice, SCID, T cells, lymphocytes, inflammation, injection, mouse, animal model
4454
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Authors: Phoebe Spetsieris, Yilong Ma, Shichun Peng, Ji Hyun Ko, Vijay Dhawan, Chris C. Tang, David Eidelberg.
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to the voxel-by-voxel covariance data of steady-state multimodality images, the algorithm can reduce an entire group image set to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree, represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16.
We present an example of the application of this methodology to FDG PET data of Parkinson's Disease patients and normal controls using our in-house software to derive a characteristic covariance pattern biomarker of disease.
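A minimal sketch of the SSM workflow described above (log transform, mean centering, PCA, then logistic regression to combine components into a composite pattern) is given below on synthetic data. This is not the authors' in-house software; the group sizes, voxel count, and effect size are arbitrary choices for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic "images": 20 controls and 20 patients, 500 voxels each; patients
# additionally express a fixed spatial pattern (all values are hypothetical).
n_vox = 500
pattern = rng.normal(size=n_vox)
controls = np.exp(rng.normal(5.0, 0.1, (20, n_vox)))
patients = np.exp(rng.normal(5.0, 0.1, (20, n_vox)) + 0.2 * pattern)
data = np.vstack([controls, patients])
labels = np.array([0] * 20 + [1] * 20)

# SSM-style preprocessing: log transform, then subject and group mean centering,
# so large global scalar effects cannot obscure network-specific contributions.
logged = np.log(data)
logged -= logged.mean(axis=1, keepdims=True)   # remove each subject's global mean
logged -= logged.mean(axis=0, keepdims=True)   # remove the group mean image

# Principal components and the corresponding subject scores.
pca = PCA(n_components=5)
scores = pca.fit_transform(logged)

# Logistic regression of subject scores yields linear coefficients that combine
# the components into a single disease-related spatial covariance pattern.
clf = LogisticRegression().fit(scores, labels)
composite_pattern = clf.coef_ @ pca.components_   # shape (1, n_vox)
```

Scoring a prospective subject then reduces to projecting the preprocessed image onto the fixed composite pattern.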
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
50319
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Authors: Wenan Chen, Ashwin Belle, Charles Cockrell, Kevin R. Ward, Kayvan Najarian.
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center.
In this paper we present an automated system, based mainly on computed tomography (CT) images, that consists of two main components: midline shift estimation and intracranial pressure (ICP) pre-screening. To estimate the midline shift, an estimation of the ideal midline is first performed based on the symmetry of the skull and anatomical features in the brain CT scan. The ventricles are then segmented from the CT scan and used as a guide for identification of the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in evaluation. In the second component, additional features related to ICP are extracted, such as texture information and blood amount from the CT scans; other recorded features, such as age and injury severity score, are also incorporated into the ICP estimation. Machine learning techniques, including feature selection and classification methods such as Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. Evaluation of the prediction shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step to help physicians decide whether to recommend invasive ICP monitoring.
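A simplified sketch of the second component (feature selection plus SVM classification) is shown below, using scikit-learn rather than RapidMiner and synthetic data. The feature definitions and labeling rule are invented for illustration and do not reproduce the paper's model.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Hypothetical per-patient features: midline shift (mm), a texture score,
# blood amount (ml), age (yr), injury severity score, plus noise features.
n = 200
shift = rng.exponential(2.0, n)
blood = rng.exponential(5.0, n)
X = np.column_stack([
    shift,                        # midline shift
    rng.normal(size=n),           # texture score (uninformative here)
    blood,                        # blood amount
    rng.uniform(18, 80, n),       # age
    rng.uniform(1, 75, n),        # injury severity score
    rng.normal(size=(n, 3)),      # extra noise features
])
# Invented labeling rule: "elevated ICP" when shift and blood are jointly large.
y = (shift + 0.5 * blood > 5.0).astype(int)

# Feature selection followed by an SVM classifier, mirroring the pre-screening
# pipeline described above.
model = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=4), SVC(kernel="rbf"))
model.fit(X, y)
```

In practice the model would be evaluated with held-out data rather than training accuracy before any clinical use.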
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
3871
Multifocal Electroretinograms
Authors: Donnell J. Creel.
Institutions: University of Utah.
A limitation of traditional full-field electroretinograms (ERGs) for the diagnosis of retinopathy is a lack of sensitivity. Generally, ERG results are normal unless more than approximately 20% of the retina is affected. In practical terms, a patient might be legally blind as a result of macular degeneration or other scotomas and still appear normal according to traditional full-field ERG. An important development in ERGs is the multifocal ERG (mfERG). Erich Sutter adapted mathematical sequences called binary m-sequences, enabling the isolation, from a single electrical signal, of an electroretinogram for each area of retina smaller than a square millimeter in response to a visual stimulus1. Results generated by mfERG appear similar to those generated by flash ERG, which best generates data appropriate for whole-eye disorders; in contrast, mfERG programs measure electrical activity from more than a hundred retinal areas per eye in a few minutes. The basic mfERG result is based on the calculated mathematical average of an approximation of the positive deflection component of the traditional ERG response, known as the b-wave1. The enhanced spatial resolution enables scotomas and retinal dysfunction to be mapped and quantified. In the protocol below, we describe the recording of mfERGs using a bipolar speculum contact lens. Components of mfERG systems vary between manufacturers. For the presentation of the visible stimulus, some suitable CRT monitors are available, but most systems have adopted flat-panel liquid crystal displays (LCDs). The visual stimuli depicted here were produced by an LCD microdisplay subtending 35-40 degrees horizontally and 30-35 degrees vertically of visual field, calibrated to produce multifocal flash intensities of 2.7 cd s m-2. Amplification was 50K. Lower and upper bandpass limits were 10 and 300 Hz. The software packages used were VERIS versions 5 and 6.
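The m-sequence idea can be illustrated in a few lines: each retinal area is driven by a distinct cyclic shift of one binary m-sequence, and because an m-sequence is nearly uncorrelated with its own shifts, cross-correlation pulls each area's contribution back out of the single summed signal. The sketch below uses a toy 31-element sequence and hypothetical scalar response amplitudes; real systems such as VERIS use far longer sequences and recover full response waveforms.

```python
import numpy as np

def m_sequence():
    """One period (31 steps) of the binary m-sequence generated by the
    primitive polynomial x^5 + x^2 + 1, mapped to +1/-1 stimulus states."""
    bits = [1, 1, 1, 1, 1]
    for n in range(5, 31):
        bits.append(bits[n - 3] ^ bits[n - 5])
    return np.array([1 if b else -1 for b in bits])

m = m_sequence()
N = len(m)

# Each retinal area is stimulated with its own cyclic shift of the sequence;
# the corneal electrode records only the sum of all local responses.
true_amp = np.array([1.0, 0.5, 0.25])   # hypothetical local response strengths
lags = [0, 7, 19]
summed = sum(a * np.roll(m, lag) for a, lag in zip(true_amp, lags))

# The m-sequence is nearly orthogonal to its own cyclic shifts, so
# cross-correlating the summed signal with each area's shift recovers that
# area's contribution (its first-order kernel) to within about 1/N.
recovered = np.array([np.dot(summed, np.roll(m, lag)) / N for lag in lags])
```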
Medicine, Issue 58, Multifocal electroretinogram, mfERG, electroretinogram, ERG
3176
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise techniques1,5-9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, but with potentially greater statistical power and better reproducibility checks. Set against these advantages is the high barrier to entry for multivariate approaches, which has prevented their more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice.
A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
1988
Ole Isacson: Development of New Therapies for Parkinson's Disease
Authors: Ole Isacson.
Institutions: Harvard Medical School.
Medicine, Issue 3, Parkinson's disease, Neuroscience, dopamine, neuron, L-DOPA, stem cell, transplantation
189
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, our video library contains no content relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.