Pubmed Article
Clinical relevance of VKORC1 (G-1639A and C1173T) and CYP2C9*3 among patients on warfarin.
J Clin Pharm Ther
PUBLISHED: 04-20-2011
Testing for cytochrome P450-2C9 (CYP2C9) and vitamin K epoxide reductase complex subunit 1 (VKORC1) variant alleles is recommended by the FDA for dosing of warfarin. However, dose prediction models derived from data obtained in one population may not be applicable to another. We therefore studied the impact of genetic polymorphisms of CYP2C9 and VKORC1 on warfarin dose requirement in Malaysia.
Related JoVE Video
Authors: Adrienne R. Niederriter, Erica E. Davis, Christelle Golzio, Edwin C. Oh, I-Chun Tsai, Nicholas Katsanis.
Published: 08-24-2013
Here, we present methods for the development of assays to query potentially clinically significant nonsynonymous changes using in vivo complementation in zebrafish. Zebrafish (Danio rerio) are a useful animal system due to their experimental tractability; embryos are transparent, enabling facile viewing, undergo rapid development ex vivo, and can be genetically manipulated.1 These aspects have allowed for significant advances in the analysis of embryogenesis, molecular processes, and morphogenetic signaling. Taken together, the advantages of this vertebrate model make zebrafish highly amenable to modeling the developmental defects in pediatric disease and, in some cases, adult-onset disorders. Because the zebrafish genome is highly conserved with that of humans (~70% orthologous), it is possible to recapitulate human disease states in zebrafish. This is accomplished either through the injection of mutant human mRNA to induce dominant negative or gain of function alleles, or through the use of morpholino (MO) antisense oligonucleotides to suppress genes, mimicking loss of function variants. Through complementation of MO-induced phenotypes with capped human mRNA, our approach enables the interpretation of the deleterious effect of mutations on human protein sequence based on the ability of mutant mRNA to rescue a measurable, physiologically relevant phenotype. Modeling of the human disease alleles occurs through microinjection of zebrafish embryos with MO and/or human mRNA at the 1-4 cell stage, and phenotyping up to seven days post fertilization (dpf). This general strategy can be extended to a wide range of disease phenotypes, as demonstrated in the following protocol. We present our established models for morphogenetic signaling, craniofacial, cardiac, vascular integrity, renal function, and skeletal muscle disorder phenotypes, as well as others.
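As a purely illustrative aside (the abstract does not prescribe a particular statistic, and the test, group labels, and embryo counts below are assumptions rather than part of the protocol), rescue in complementation assays of this kind is often scored by comparing the proportion of embryos showing the phenotype between injection groups:

```python
# Illustrative sketch only: comparing the fraction of affected embryos between a
# morphant batch and a batch co-injected with human mRNA. Counts are hypothetical.
from scipy.stats import fisher_exact

def rescue_test(affected_a, total_a, affected_b, total_b):
    """2x2 Fisher's exact test on affected vs unaffected embryo counts for two groups."""
    table = [[affected_a, total_a - affected_a],
             [affected_b, total_b - affected_b]]
    return fisher_exact(table)

# Hypothetical counts: MO alone vs MO + wild-type human mRNA
odds, p = rescue_test(affected_a=62, total_a=80, affected_b=18, total_b=75)
print(f"odds ratio = {odds:.1f}, p = {p:.2e}")  # fewer affected embryos with a low p suggests rescue
```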
29 Related JoVE Articles!
Autologous Blood Injection to Model Spontaneous Intracerebral Hemorrhage in Mice
Authors: Lauren H. Sansing, Scott E. Kasner, Louise McCullough, Puneet Agarwal, Frank A. Welsh, Katalin Kariko.
Institutions: University of Connecticut Health Center, School of Medicine, University of Pennsylvania, Hartford Hospital.
Investigation of the pathophysiology of injury after intracerebral hemorrhage (ICH) requires a reproducible animal model. While ICH accounts for 10-15% of all strokes, there remains no specific effective therapy. The autologous blood injection model in mice involves the stereotaxic injection of arterial blood into the basal ganglia, mimicking a spontaneous hypertensive hemorrhage in man. The response to hemorrhage can then be studied in vivo and the neurobehavioral deficits quantified, allowing for description of the ensuing pathology and the testing of potential therapeutic agents. The procedure described in this protocol uses the double injection technique to minimize the risk of blood reflux up the needle track, avoids anticoagulants in the pumping system, and eliminates all dead space and expandable tubing in the system.
Neuroscience, Issue 54, stroke, intracerebral hemorrhage, mice, animal model
An Allele-specific Gene Expression Assay to Test the Functional Basis of Genetic Associations
Authors: Silvia Paracchini, Anthony P. Monaco, Julian C. Knight.
Institutions: University of Oxford.
The number of significant genetic associations with common complex traits is constantly increasing. However, most of these associations have not been understood at the molecular level. One of the mechanisms mediating the effect of DNA variants on phenotypes is gene expression, which has been shown to be particularly relevant for complex traits1. This method tests in a cellular context the effect of specific DNA sequences on gene expression. The principle is to measure the relative abundance of transcripts arising from the two alleles of a gene, analysing cells which carry one copy of the DNA sequences associated with disease (the risk variants)2,3. Therefore, the cells used for this method should meet two fundamental genotypic requirements: they have to be heterozygous both for DNA risk variants and for DNA markers, typically coding polymorphisms, which can distinguish transcripts based on their chromosomal origin (Figure 1). DNA risk variants and DNA markers do not need to have the same allele frequency but the phase (haplotypic) relationship of the genetic markers needs to be understood. It is also important to choose cell types which express the gene of interest. This protocol refers specifically to the procedure adopted to extract nucleic acids from fibroblasts but the method is equally applicable to other cell types including primary cells. DNA and RNA are extracted from the selected cell lines and cDNA is generated. DNA and cDNA are analysed with a primer extension assay, designed to target the coding DNA markers4. The primer extension assay is carried out using the MassARRAY (Sequenom)5 platform according to the manufacturer's specifications. Primer extension products are then analysed by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF/MS). Because the selected markers are heterozygous they will generate two peaks on the MS profiles. The area of each peak is proportional to the transcript abundance and can be measured with a function of the MassARRAY Typer software to calculate an allelic ratio (allele 1:allele 2). The allelic ratio obtained for cDNA is normalized using that measured from genomic DNA, where the allelic ratio is expected to be 1:1, in order to correct for technical artifacts. Markers with a normalized allelic ratio significantly different from 1 indicate that the amount of transcript generated from the two chromosomes in the same cell is different, suggesting that the DNA variants associated with the phenotype have an effect on gene expression. Experimental controls should be used to confirm the results.
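The normalization step described above reduces to a short calculation. The sketch below is illustrative only, with hypothetical peak areas rather than values produced by the MassARRAY Typer software:

```python
# Minimal sketch, assuming hypothetical peak areas for one heterozygous coding marker:
# the cDNA allelic ratio is corrected by the genomic DNA ratio, expected to be ~1:1.

def allelic_ratio(peak_allele1, peak_allele2):
    """Ratio of allele 1 to allele 2 signal; peak areas are proportional to transcript abundance."""
    return peak_allele1 / peak_allele2

def normalized_ratio(cdna_a1, cdna_a2, gdna_a1, gdna_a2):
    """cDNA ratio divided by the genomic DNA ratio to correct for technical artifacts."""
    return allelic_ratio(cdna_a1, cdna_a2) / allelic_ratio(gdna_a1, gdna_a2)

ratio = normalized_ratio(cdna_a1=1800, cdna_a2=1200, gdna_a1=1500, gdna_a2=1450)
print(f"normalized allelic ratio = {ratio:.2f}")  # values far from 1 suggest allele-specific expression
```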
Cellular Biology, Issue 45, Gene expression, regulatory variant, haplotype, association study, primer extension, MALDI-TOF mass spectrometry, single nucleotide polymorphism, allele-specific
A Simple Chelex Protocol for DNA Extraction from Anopheles spp.
Authors: Mulenga Musapa, Taida Kumwenda, Mtawa Mkulama, Sandra Chishimba, Douglas E. Norris, Philip E. Thuma, Sungano Mharakurwa.
Institutions: Malaria Institute at Macha, Johns Hopkins Bloomberg School of Public Health.
Endemic countries are increasingly adopting molecular tools for efficient typing, identification and surveillance against malaria parasites and vector mosquitoes, as an integral part of their control programs1,2,3,4,5. For sustainable establishment of these accurate approaches in operations research to strengthen malaria control and elimination efforts, simple and affordable methods with parsimonious reagent and equipment requirements are essential6,7,8. Here we present a simple Chelex-based technique for extracting malaria parasite and vector DNA from field collected mosquito specimens. We morphologically identified 72 Anopheles gambiae s.l. from 156 mosquitoes captured by pyrethrum spray catches in sleeping rooms of households within a 2,000 km2 vicinity of the Malaria Institute at Macha. After dissection to separate the head and thorax from the abdomen for all 72 Anopheles gambiae s.l. mosquitoes, the two sections were individually placed in 1.5 ml microcentrifuge tubes and submerged in 20 μl of deionized water. Using a sterile pipette tip, each mosquito section was separately homogenized to a uniform suspension in the deionized water. Of the ensuing homogenate from each mosquito section, 10 μl was retained while the other 10 μl was transferred to a separate autoclaved 1.5 ml tube. The separate aliquots were subjected to DNA extraction by either the simplified Chelex or the standard salting out extraction protocol9,10. The salting out protocol is so named, and widely used, because it employs high salt concentrations in lieu of hazardous organic solvents (such as phenol and chloroform) for the protein precipitation step during DNA extraction9. Extracts were used as templates for PCR amplification using primers targeting the arthropod mitochondrial nicotinamide adenine dinucleotide dehydrogenase (NADH) subunit 4 gene (ND4) to check DNA quality11, a PCR for identification of Anopheles gambiae sibling species10 and a nested PCR for typing of Plasmodium falciparum infection12. Comparison using the DNA quality (ND4) PCR showed 93% sensitivity and 82% specificity for the Chelex approach relative to the established salting out protocol. Corresponding values of sensitivity and specificity were 100% and 78%, respectively, using the sibling species identification PCR and 92% and 80%, respectively, for the P. falciparum detection PCR. There were no significant differences in the proportion of samples giving an amplicon signal with the Chelex or the regular salting out protocol across all three PCR applications. The Chelex approach required three simple reagents and 37 min to complete, while the salting out protocol entailed 10 different reagents and 2 hr and 47 min of processing time, including an overnight step. Our results show that the Chelex method is comparable to the existing salting out extraction and can be substituted as a simple and sustainable approach in resource-limited settings where a constant reagent supply chain is often difficult to maintain.
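For reference, the sensitivity and specificity figures quoted above follow from a standard 2x2 concordance table with the salting out protocol treated as the reference method. The counts below are hypothetical (chosen only to roughly reproduce the reported 93%/82% for the ND4 PCR), not the study's raw data:

```python
# Illustrative sketch only: computing sensitivity and specificity of the Chelex
# extraction against the salting out reference from hypothetical 2x2 counts.

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sensitivity_specificity(tp=54, fn=4, tn=9, fp=2)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")
```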
Infection, Issue 71, Immunology, Infectious Diseases, Genetics, Molecular Biology, Microbiology, Parasitology, Entomology, Malaria, Plasmodium falciparum, vector, Anopheles, Diptera, mosquitoes, Chelex, DNA, extraction, PCR, dissection, insect, vector, pathogen
A Noninvasive Hair Sampling Technique to Obtain High Quality DNA from Elusive Small Mammals
Authors: Philippe Henry, Alison Henry, Michael A. Russello.
Institutions: University of British Columbia, Okanagan Campus.
Noninvasive genetic sampling approaches are becoming increasingly important to study wildlife populations. A number of studies have reported using noninvasive sampling techniques to investigate population genetics and demography of wild populations1. This approach has proven to be especially useful when dealing with rare or elusive species2. While a number of these methods have been developed to sample hair, feces and other biological material from carnivores and medium-sized mammals, they have largely remained untested in elusive small mammals. In this video, we present a novel, inexpensive and noninvasive hair snare targeted at an elusive small mammal, the American pika (Ochotona princeps). We describe the general set-up of the hair snare, which consists of strips of packing tape arranged in a web-like fashion and placed along travelling routes in the pikas’ habitat. We illustrate the efficiency of the snare at collecting a large quantity of hair that can then be collected and brought back to the lab. We then demonstrate the use of the DNA IQ system (Promega) to isolate DNA and showcase the utility of this method to amplify commonly used molecular markers including nuclear microsatellites, amplified fragment length polymorphisms (AFLPs), mitochondrial sequences (800bp) as well as a molecular sexing marker. Overall, we demonstrate the utility of this novel noninvasive hair snare as a sampling technique for wildlife population biologists. We anticipate that this approach will be applicable to a variety of small mammals, opening up areas of investigation within natural populations, while minimizing impact to study organisms.
Genetics, Issue 49, Conservation genetics, noninvasive genetic sampling, Hair snares, Microsatellites, AFLPs, American pika, Ochotona princeps
Direct Pressure Monitoring Accurately Predicts Pulmonary Vein Occlusion During Cryoballoon Ablation
Authors: Ioanna Kosmidou, Shannnon Wooden, Brian Jones, Thomas Deering, Andrew Wickliffe, Dan Dan.
Institutions: Piedmont Heart Institute, Medtronic Inc.
Cryoballoon ablation (CBA) is an established therapy for atrial fibrillation (AF). Pulmonary vein (PV) occlusion is essential for achieving antral contact and PV isolation and is typically assessed by contrast injection. We present a novel method of direct pressure monitoring for assessment of PV occlusion. Transcatheter pressure is monitored during balloon advancement to the PV antrum. Pressure is recorded via a single pressure transducer connected to the inner lumen of the cryoballoon. Pressure curve characteristics are used to assess occlusion in conjunction with fluoroscopic or intracardiac echocardiography (ICE) guidance. PV occlusion is confirmed when loss of typical left atrial (LA) pressure waveform is observed with recordings of PA pressure characteristics (no A wave and rapid V wave upstroke). Complete pulmonary vein occlusion as assessed with this technique has been confirmed with concurrent contrast utilization during the initial testing of the technique and has been shown to be highly accurate and readily reproducible. We evaluated the efficacy of this novel technique in 35 patients. A total of 128 veins were assessed for occlusion with the cryoballoon utilizing the pressure monitoring technique; occlusive pressure was demonstrated in 113 veins with resultant successful pulmonary vein isolation in 111 veins (98.2%). Occlusion was confirmed with subsequent contrast injection during the initial ten procedures, after which contrast utilization was rapidly reduced or eliminated given the highly accurate identification of occlusive pressure waveform with limited initial training. Verification of PV occlusive pressure during CBA is a novel approach to assessing effective PV occlusion and it accurately predicts electrical isolation. Utilization of this method results in significant decrease in fluoroscopy time and volume of contrast.
Medicine, Issue 72, Anatomy, Physiology, Cardiology, Biomedical Engineering, Surgery, Cardiovascular System, Cardiovascular Diseases, Surgical Procedures, Operative, Investigative Techniques, Atrial fibrillation, Cryoballoon Ablation, Pulmonary Vein Occlusion, Pulmonary Vein Isolation, electrophysiology, catheterization, heart, vein, clinical, surgical device, surgical techniques
Utility of Dissociated Intrinsic Hand Muscle Atrophy in the Diagnosis of Amyotrophic Lateral Sclerosis
Authors: Parvathi Menon, Steve Vucic.
Institutions: Westmead Hospital, University of Sydney, Australia.
The split hand phenomenon refers to predominant wasting of thenar muscles and is an early and specific feature of amyotrophic lateral sclerosis (ALS). A novel split hand index (SI) was developed to quantify the split hand phenomenon, and its diagnostic utility was assessed in ALS patients. The split hand index was derived by dividing the product of the compound muscle action potential (CMAP) amplitude recorded over the abductor pollicis brevis and first dorsal interosseous muscles by the CMAP amplitude recorded over the abductor digiti minimi muscle. In order to assess the diagnostic utility of the split hand index, ALS patients were prospectively assessed and their results were compared to neuromuscular disorder patients. The split hand index was significantly reduced in ALS when compared to neuromuscular disorder patients (P<0.0001). Limb-onset ALS patients exhibited the greatest reduction in the split hand index, and a value of 5.2 or less reliably differentiated ALS from other neuromuscular disorders. Consequently, the split hand index appears to be a novel diagnostic biomarker for ALS, perhaps facilitating an earlier diagnosis.
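The split hand index calculation described above reduces to a single expression. The sketch below uses hypothetical CMAP amplitudes; only the formula and the 5.2 cut-off are taken from the text:

```python
# Sketch of the split hand index (SI); amplitudes in mV are hypothetical examples.

def split_hand_index(cmap_apb, cmap_fdi, cmap_adm):
    """SI = (APB CMAP x FDI CMAP) / ADM CMAP."""
    return (cmap_apb * cmap_fdi) / cmap_adm

si = split_hand_index(cmap_apb=2.1, cmap_fdi=4.0, cmap_adm=7.5)
print(f"SI = {si:.1f} -> {'at or below the 5.2 cut-off' if si <= 5.2 else 'above the 5.2 cut-off'}")
```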
Medicine, Issue 85, Amyotrophic Lateral Sclerosis (ALS), dissociated muscle atrophy, hypothenar muscles, motor neuron disease, split-hand index, thenar muscles
Implantation of the Syncardia Total Artificial Heart
Authors: Daniel G. Tang, Keyur B. Shah, Micheal L. Hess, Vigneshwar Kasirajan.
Institutions: Virginia Commonwealth University.
With advances in technology, the use of mechanical circulatory support devices for end stage heart failure has rapidly increased. The vast majority of such patients are generally well served by left ventricular assist devices (LVADs). However, a subset of patients with late stage biventricular failure or other significant anatomic lesions are not adequately treated by isolated left ventricular mechanical support. Examples of concomitant cardiac pathology that may be better treated by resection and TAH replacement include: post infarction ventricular septal defect, aortic root aneurysm / dissection, cardiac allograft failure, massive ventricular thrombus, refractory malignant arrhythmias (independent of filling pressures), hypertrophic / restrictive cardiomyopathy, and complex congenital heart disease. Patients often present with cardiogenic shock and multisystem organ dysfunction. Excision of both ventricles and orthotopic replacement with a total artificial heart (TAH) is an effective, albeit extreme, therapy for rapid restoration of blood flow and resuscitation. Perioperative management is focused on end organ resuscitation and physical rehabilitation. In addition to the usual concerns of infection, bleeding, and thromboembolism common to all mechanically supported patients, TAH patients face unique risks with regard to renal failure and anemia. Supplementation of the abrupt decrease in brain natriuretic peptide following ventriculectomy appears to have protective renal effects. Anemia following TAH implantation can be profound and persistent. Nonetheless, the anemia is generally well tolerated and transfusions are limited to avoid HLA sensitization. Until recently, TAH patients were confined as inpatients tethered to a 500 lb pneumatic console driver. The recent introduction of a backpack sized portable driver (currently under clinical trial) has enabled patients to be discharged home and even return to work. Despite the profound presentation of these sick patients, the success rate for bridge to transplantation is 79-87%.
Medicine, Issue 89, mechanical circulatory support, total artificial heart, biventricular failure, operative techniques
Implantation of Total Artificial Heart in Congenital Heart Disease
Authors: Iki Adachi, David S. L. Morales.
Institutions: Texas Children's Hospital, Baylor College of Medicine, The University of Cincinnati College of Medicine.
In patients with end-stage heart failure (HF), a total artificial heart (TAH) may be implanted as a bridge to cardiac transplant. However, in congenital heart disease (CHD), the malformed heart presents a challenge to TAH implantation. In the case presented here, a 17-year-old patient with congenitally corrected transposition of the great arteries (CCTGA) experienced progressively worsening HF due to his congenital condition. He was hospitalized multiple times and received an implantable cardioverter defibrillator (ICD). However, his condition soon deteriorated to end-stage HF with multisystem organ failure. Due to the patient's grave clinical condition and the presence of complex cardiac lesions, the decision was made to proceed with a TAH. The abnormal arrangement of the patient's ventricles and great arteries required modifications to the TAH during implantation. With the TAH in place, the patient was able to return home and regain strength and physical well-being while awaiting a donor heart. He was successfully bridged to heart transplantation 5 months after receiving the device. This report highlights that TAH implantation is feasible, with technical modifications, even in patients with structurally abnormal hearts.
Medicine, Issue 89, total artificial heart, transposition of the great arteries, congenital heart disease, aortic insufficiency, ventricular outflow tract obstruction, conduit obstruction, heart failure
Use of Artificial Sputum Medium to Test Antibiotic Efficacy Against Pseudomonas aeruginosa in Conditions More Relevant to the Cystic Fibrosis Lung
Authors: Sebastian Kirchner, Joanne L Fothergill, Elli A. Wright, Chloe E. James, Eilidh Mowat, Craig Winstanley.
Institutions: University of Liverpool.
There is growing concern about the relevance of in vitro antimicrobial susceptibility tests when applied to isolates of P. aeruginosa from cystic fibrosis (CF) patients. Existing methods rely on single or a few isolates grown aerobically and planktonically. Predetermined cut-offs are used to define whether the bacteria are sensitive or resistant to any given antibiotic1. However, during chronic lung infections in CF, P. aeruginosa populations exist in biofilms and there is evidence that the environment is largely microaerophilic2. The stark difference in conditions between bacteria in the lung and those during diagnostic testing has called into question the reliability and even relevance of these tests3. Artificial sputum medium (ASM) is a culture medium containing the components of CF patient sputum, including amino acids, mucin and free DNA. P. aeruginosa growth in ASM mimics growth during CF infections, with the formation of self-aggregating biofilm structures and population divergence4,5,6. The aim of this study was to develop a microtitre-plate assay to study antimicrobial susceptibility of P. aeruginosa based on growth in ASM, which is applicable to both microaerophilic and aerobic conditions. An ASM assay was developed in a microtitre plate format. P. aeruginosa biofilms were allowed to develop for 3 days prior to incubation with antimicrobial agents at different concentrations for 24 hours. After biofilm disruption, cell viability was measured by staining with resazurin. This assay was used to ascertain the sessile cell minimum inhibitory concentration (SMIC) of tobramycin for 15 different P. aeruginosa isolates under aerobic and microaerophilic conditions, and SMIC values were compared to those obtained with standard broth growth. Whilst there was some evidence for increased MIC values for isolates grown in ASM when compared to their planktonic counterparts, the biggest differences were found with bacteria tested in microaerophilic conditions, which showed greatly increased resistance (up to >128-fold) to tobramycin in the ASM system when compared to assays carried out in aerobic conditions. The lack of association between current susceptibility testing methods and clinical outcome has called the validity of current methods into question3. Several in vitro models have been used previously to study P. aeruginosa biofilms7,8. However, these methods rely on surface attached biofilms, whereas the ASM biofilms resemble those observed in the CF lung9. In addition, reduced oxygen concentration in the mucus has been shown to alter the behavior of P. aeruginosa2 and affect antibiotic susceptibility10. Therefore using ASM under microaerophilic conditions may provide a more realistic environment in which to study antimicrobial susceptibility.
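The SMIC read-out can be summarized as follows; this is a minimal sketch assuming a hypothetical inhibition threshold relative to the untreated biofilm control, with hypothetical dilution and fluorescence values, none of which are taken from the article:

```python
# Minimal sketch, assuming a hypothetical viability threshold: the SMIC is taken as
# the lowest antibiotic concentration whose resazurin signal falls below the chosen
# fraction of the untreated biofilm control.

def smic(concentrations, signals, untreated_signal, threshold=0.2):
    """Return the lowest concentration with viability signal <= threshold * untreated control."""
    for conc, sig in sorted(zip(concentrations, signals)):
        if sig <= threshold * untreated_signal:
            return conc
    return None  # no tested concentration inhibited growth sufficiently

# Hypothetical tobramycin doubling dilutions (ug/ml) and resazurin fluorescence values:
concs = [1, 2, 4, 8, 16, 32, 64, 128]
fluor = [9500, 9100, 8600, 7200, 4100, 1500, 900, 800]
print(f"SMIC = {smic(concs, fluor, untreated_signal=9800)} ug/ml")
```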
Immunology, Issue 64, Microbiology, Pseudomonas aeruginosa, antimicrobial susceptibility, artificial sputum media, lung infection, cystic fibrosis, diagnostics, plankton
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Authors: Richard A. Rudick, Deborah Miller, Francois Bethoux, Stephen M. Rao, Jar-Chi Lee, Darlene Stough, Christine Reece, David Schindler, Bernadett Mamone, Jay Alberts.
Institutions: Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out of clinic settings, like the patient’s home, thereby providing more meaningful real world data. The MSPT represents a new paradigm for neuroperformance testing. This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Authors: Hans-Peter Müller, Jan Kassubek.
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls. DTI data analysis is performed in several complementary ways, i.e. voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA-maps after stereotaxic alignment, in a longitudinal analysis on an individual subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI based results can be obtained during preprocessing by application of a controlled elimination of gradient directions with high noise levels. In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
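FA, the main metric compared above, is computed per voxel from the three diffusion tensor eigenvalues using its standard definition. The sketch below is illustrative only and is not the authors' processing pipeline:

```python
# Minimal sketch: fractional anisotropy (FA) for one voxel from tensor eigenvalues.
import numpy as np

def fractional_anisotropy(l1, l2, l3):
    """FA = sqrt(1/2) * sqrt((l1-l2)^2 + (l2-l3)^2 + (l3-l1)^2) / sqrt(l1^2 + l2^2 + l3^2)."""
    num = np.sqrt((l1 - l2) ** 2 + (l2 - l3) ** 2 + (l3 - l1) ** 2)
    den = np.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return float(np.sqrt(0.5) * num / den)

# Hypothetical eigenvalues (10^-3 mm^2/s) for a white matter voxel:
print(f"FA = {fractional_anisotropy(1.6, 0.4, 0.3):.2f}")
```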
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Adaptation of Semiautomated Circulating Tumor Cell (CTC) Assays for Clinical and Preclinical Research Applications
Authors: Lori E. Lowes, Benjamin D. Hedley, Michael Keeney, Alison L. Allan.
Institutions: London Health Sciences Centre, Western University, London Health Sciences Centre, Lawson Health Research Institute, Western University.
The majority of cancer-related deaths occur subsequent to the development of metastatic disease. This highly lethal disease stage is associated with the presence of circulating tumor cells (CTCs). These rare cells have been demonstrated to be of clinical significance in metastatic breast, prostate, and colorectal cancers. The current gold standard in clinical CTC detection and enumeration is the FDA-cleared CellSearch system (CSS). This manuscript outlines the standard protocol utilized by this platform as well as two additional adapted protocols that describe the detailed process of user-defined marker optimization for protein characterization of patient CTCs and a comparable protocol for CTC capture in very low volumes of blood, using standard CSS reagents, for studying in vivo preclinical mouse models of metastasis. In addition, differences in CTC quality between healthy donor blood spiked with cells from tissue culture versus patient blood samples are highlighted. Finally, several commonly discrepant items that can lead to CTC misclassification errors are outlined. Taken together, these protocols will provide a useful resource for users of this platform interested in preclinical and clinical research pertaining to metastasis and CTCs.
Medicine, Issue 84, Metastasis, circulating tumor cells (CTCs), CellSearch system, user defined marker characterization, in vivo, preclinical mouse model, clinical research
Investigating Protein-protein Interactions in Live Cells Using Bioluminescence Resonance Energy Transfer
Authors: Pelagia Deriziotis, Sarah A. Graham, Sara B. Estruch, Simon E. Fisher.
Institutions: Max Planck Institute for Psycholinguistics, Donders Institute for Brain, Cognition and Behaviour.
Assays based on Bioluminescence Resonance Energy Transfer (BRET) provide a sensitive and reliable means to monitor protein-protein interactions in live cells. BRET is the non-radiative transfer of energy from a 'donor' luciferase enzyme to an 'acceptor' fluorescent protein. In the most common configuration of this assay, the donor is Renilla reniformis luciferase and the acceptor is Yellow Fluorescent Protein (YFP). Because the efficiency of energy transfer is strongly distance-dependent, observation of the BRET phenomenon requires that the donor and acceptor be in close proximity. To test for an interaction between two proteins of interest in cultured mammalian cells, one protein is expressed as a fusion with luciferase and the second as a fusion with YFP. An interaction between the two proteins of interest may bring the donor and acceptor sufficiently close for energy transfer to occur. Compared to other techniques for investigating protein-protein interactions, the BRET assay is sensitive, requires little hands-on time and few reagents, and is able to detect interactions which are weak, transient, or dependent on the biochemical environment found within a live cell. It is therefore an ideal approach for confirming putative interactions suggested by yeast two-hybrid or mass spectrometry proteomics studies, and in addition it is well-suited for mapping interacting regions, assessing the effect of post-translational modifications on protein-protein interactions, and evaluating the impact of mutations identified in patient DNA.
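A BRET signal is typically reported as an acceptor/donor emission ratio corrected against a donor-only control. The sketch below illustrates that common calculation with hypothetical luminometer counts; it is not necessarily the exact correction prescribed in this protocol:

```python
# Sketch under stated assumptions: corrected BRET ratio = (YFP/luciferase emission of
# the sample) minus the same ratio for a luciferase-only control. Counts are hypothetical.

def bret_ratio(acceptor_em, donor_em):
    """Raw ratio of YFP-channel to luciferase-channel emission."""
    return acceptor_em / donor_em

def corrected_bret(sample_acceptor, sample_donor, donor_only_acceptor, donor_only_donor):
    """Subtract the bleed-through measured with the donor-only control."""
    return bret_ratio(sample_acceptor, sample_donor) - bret_ratio(donor_only_acceptor, donor_only_donor)

print(f"corrected BRET = {corrected_bret(5200, 21000, 1800, 20000):.3f}")
```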
Cellular Biology, Issue 87, Protein-protein interactions, Bioluminescence Resonance Energy Transfer, Live cell, Transfection, Luciferase, Yellow Fluorescent Protein, Mutations
Dynamic Visual Tests to Identify and Quantify Visual Damage and Repair Following Demyelination in Optic Neuritis Patients
Authors: Noa Raz, Michal Hallak, Tamir Ben-Hur, Netta Levin.
Institutions: Hadassah Hebrew-University Medical Center.
In order to follow optic neuritis patients and evaluate the effectiveness of their treatment, a handy, accurate and quantifiable tool is required to assess changes in myelination at the central nervous system (CNS). However, standard measurements, including routine visual tests and MRI scans, are not sensitive enough for this purpose. We present two visual tests addressing dynamic monocular and binocular functions which may closely associate with the extent of myelination along visual pathways. These include the Object From Motion (OFM) extraction and Time-constrained Stereo protocols. In the OFM test, an array of dots composes an object: the dots within the object region move rightward while the dots outside it move leftward, or vice versa. The dot pattern generates a camouflaged object that cannot be detected when the dots are stationary or moving as a whole. Importantly, object recognition is critically dependent on motion perception. In the Time-constrained Stereo protocol, spatially disparate images are presented for a limited length of time, challenging binocular 3-dimensional integration in time. Both tests are appropriate for clinical usage and provide a simple, yet powerful, way to identify and quantify processes of demyelination and remyelination along visual pathways. These protocols may be efficient tools for diagnosing and following optic neuritis and multiple sclerosis patients. In the diagnostic process, these protocols may reveal visual deficits that cannot be identified via current standard visual measurements. Moreover, these protocols sensitively identify the basis of the currently unexplained continued visual complaints of patients following recovery of visual acuity. During longitudinal follow-up, the protocols can be used as sensitive markers of demyelinating and remyelinating processes over time. These protocols may therefore be used to evaluate the efficacy of current and evolving therapeutic strategies targeting myelination of the CNS.
Medicine, Issue 86, Optic neuritis, visual impairment, dynamic visual functions, motion perception, stereopsis, demyelination, remyelination
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Modeling Astrocytoma Pathogenesis In Vitro and In Vivo Using Cortical Astrocytes or Neural Stem Cells from Conditional, Genetically Engineered Mice
Authors: Robert S. McNeill, Ralf S. Schmid, Ryan E. Bash, Mark Vitucci, Kristen K. White, Andrea M. Werneke, Brian H. Constance, Byron Huff, C. Ryan Miller.
Institutions: University of North Carolina School of Medicine, Emory University School of Medicine.
Current astrocytoma models are limited in their ability to define the roles of oncogenic mutations in specific brain cell types during disease pathogenesis and their utility for preclinical drug development. In order to design a better model system for these applications, phenotypically wild-type cortical astrocytes and neural stem cells (NSC) from conditional, genetically engineered mice (GEM) that harbor various combinations of floxed oncogenic alleles were harvested and grown in culture. Genetic recombination was induced in vitro using adenoviral Cre-mediated recombination, resulting in expression of mutated oncogenes and deletion of tumor suppressor genes. The phenotypic consequences of these mutations were defined by measuring proliferation, transformation, and drug response in vitro. Orthotopic allograft models, whereby transformed cells are stereotactically injected into the brains of immune-competent, syngeneic littermates, were developed to define the role of oncogenic mutations and cell type on tumorigenesis in vivo. Unlike most established human glioblastoma cell line xenografts, injection of transformed GEM-derived cortical astrocytes into the brains of immune-competent littermates produced astrocytomas, including the most aggressive subtype, glioblastoma, that recapitulated the histopathological hallmarks of human astrocytomas, including diffuse invasion of normal brain parenchyma. Bioluminescence imaging of orthotopic allografts from transformed astrocytes engineered to express luciferase was utilized to monitor in vivo tumor growth over time. Thus, astrocytoma models using astrocytes and NSC harvested from GEM with conditional oncogenic alleles provide an integrated system to study the genetics and cell biology of astrocytoma pathogenesis in vitro and in vivo and may be useful in preclinical drug development for these devastating diseases.
Neuroscience, Issue 90, astrocytoma, cortical astrocytes, genetically engineered mice, glioblastoma, neural stem cells, orthotopic allograft
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Authors: Daniel T. Claiborne, Jessica L. Prince, Eric Hunter.
Institutions: Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated from acute time points from subtype C infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate to the study of subtype C sequences than previous recombination based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
Isolation and Quantification of Botulinum Neurotoxin From Complex Matrices Using the BoTest Matrix Assays
Authors: F. Mark Dunning, Timothy M. Piazza, Füsûn N. Zeytin, Ward C. Tucker.
Institutions: BioSentinel Inc., Madison, WI.
Accurate detection and quantification of botulinum neurotoxin (BoNT) in complex matrices is required for pharmaceutical, environmental, and food sample testing. Rapid BoNT testing of foodstuffs is needed during outbreak forensics, patient diagnosis, and food safety testing while accurate potency testing is required for BoNT-based drug product manufacturing and patient safety. The widely used mouse bioassay for BoNT testing is highly sensitive but lacks the precision and throughput needed for rapid and routine BoNT testing. Furthermore, the bioassay's use of animals has resulted in calls by drug product regulatory authorities and animal-rights proponents in the US and abroad to replace the mouse bioassay for BoNT testing. Several in vitro replacement assays have been developed that work well with purified BoNT in simple buffers, but most have not been shown to be applicable to testing in highly complex matrices. Here, a protocol for the detection of BoNT in complex matrices using the BoTest Matrix assays is presented. The assay consists of three parts: The first part involves preparation of the samples for testing, the second part is an immunoprecipitation step using anti-BoNT antibody-coated paramagnetic beads to purify BoNT from the matrix, and the third part quantifies the isolated BoNT's proteolytic activity using a fluorogenic reporter. The protocol is written for high throughput testing in 96-well plates using both liquid and solid matrices and requires about 2 hr of manual preparation with total assay times of 4-26 hr depending on the sample type, toxin load, and desired sensitivity. Data are presented for BoNT/A testing with phosphate-buffered saline, a drug product, culture supernatant, 2% milk, and fresh tomatoes and includes discussion of critical parameters for assay success.
Neuroscience, Issue 85, Botulinum, food testing, detection, quantification, complex matrices, BoTest Matrix, Clostridium, potency testing
Pharmacologic Induction of Epidermal Melanin and Protection Against Sunburn in a Humanized Mouse Model
Authors: Alexandra Amaro-Ortiz, Jillian C. Vanover, Timothy L. Scott, John A. D'Orazio.
Institutions: University of Kentucky College of Medicine.
Fairness of skin, UV sensitivity and skin cancer risk all correlate with the physiologic function of the melanocortin 1 receptor, a Gs-coupled signaling protein found on the surface of melanocytes. Mc1r stimulates adenylyl cyclase and cAMP production which, in turn, up-regulates melanocytic production of melanin in the skin. In order to study the mechanisms by which Mc1r signaling protects the skin against UV injury, this study relies on a mouse model with "humanized skin" based on epidermal expression of stem cell factor (Scf). K14-Scf transgenic mice retain melanocytes in the epidermis and therefore have the ability to deposit melanin in the epidermis. In this animal model, wild type Mc1r status results in robust deposition of black eumelanin pigment and a UV-protected phenotype. In contrast, K14-Scf animals with defective Mc1r signaling ability exhibit a red/blonde pigmentation, very little eumelanin in the skin and a UV-sensitive phenotype. Reasoning that eumelanin deposition might be enhanced by topical agents that mimic Mc1r signaling, we found that direct application of forskolin extract to the skin of Mc1r-defective fair-skinned mice resulted in robust eumelanin induction and UV protection 1. Here we describe the method for preparing and applying a forskolin-containing natural root extract to K14-Scf fair-skinned mice and report a method for measuring UV sensitivity by determining minimal erythematous dose (MED). Using this animal model, it is possible to study how epidermal cAMP induction and melanization of the skin affect physiologic responses to UV exposure.
Medicine, Issue 79, Skin, Inflammation, Photometry, Ultraviolet Rays, Skin Pigmentation, melanocortin 1 receptor, Mc1r, forskolin, cAMP, mean erythematous dose, skin pigmentation, melanocyte, melanin, sunburn, UV, inflammation
An Allelotyping PCR for Identifying Salmonella enterica serovars Enteritidis, Hadar, Heidelberg, and Typhimurium
Authors: John J. Maurer, Margie D. Lee, Ying Cheng, Adriana Pedroso.
Institutions: University of Georgia.
Current commercial PCR tests for identifying Salmonella target genes unique to this genus. However, there are two species, six subspecies, and over 2,500 different Salmonella serovars, and not all are equal in their significance to public health. For example, finding S. enterica subspecies IIIa Arizona on a table egg layer farm is insignificant compared to the isolation of S. enterica subspecies I serovar Enteritidis, the leading cause of salmonellosis linked to the consumption of table eggs. Serovars are identified based on antigenic differences in lipopolysaccharide (LPS) (O antigen) and flagellin (H1 and H2 antigens). These antigenic differences are the outward appearance of the diversity of genes and gene alleles associated with this phenotype. We have developed an allelotyping multiplex PCR that keys on genetic differences between four major S. enterica subspecies I serovars found in poultry and associated with significant human disease in the US. The PCR primer pairs were targeted to key genes or sequences unique to a specific Salmonella serovar and designed to produce an amplicon with a size specific for that gene or allele. Salmonella serovar is assigned to an isolate based on the combination of PCR test results for specific LPS and flagellin gene alleles. The multiplex PCRs described in this article are specific for the detection of S. enterica subspecies I serovars Enteritidis, Hadar, Heidelberg, and Typhimurium. Here we demonstrate how to use the multiplex PCRs to identify the serovar of a Salmonella isolate.
Immunology, Issue 53, PCR, Salmonella, multiplex, Serovar
A New Approach for the Comparative Analysis of Multiprotein Complexes Based on 15N Metabolic Labeling and Quantitative Mass Spectrometry
Authors: Kerstin Trompelt, Janina Steinbeck, Mia Terashima, Michael Hippler.
Institutions: University of Münster, Carnegie Institution for Science.
The protocol introduced here provides a tool for the analysis of multiprotein complexes in the thylakoid membrane, revealing insights into complex composition under different conditions. In this protocol the approach is demonstrated by comparing the composition of the protein complex responsible for cyclic electron flow (CEF) in Chlamydomonas reinhardtii, isolated from genetically different strains. The procedure comprises the isolation of thylakoid membranes, followed by their separation into multiprotein complexes by sucrose density gradient centrifugation, SDS-PAGE, immunodetection and comparative, quantitative mass spectrometry (MS) based on differential metabolic labeling (14N/15N) of the analyzed strains. Detergent-solubilized thylakoid membranes are loaded on sucrose density gradients at equal chlorophyll concentration. After ultracentrifugation, the gradients are separated into fractions, which are analyzed by mass spectrometry on an equal-volume basis. This approach allows the investigation of the composition of the gradient fractions and, moreover, of the migration behavior of different proteins, especially focusing on ANR1, CAS, and PGRL1. Furthermore, the method is validated by confirming the results with immunoblotting and by reproducing findings from previous studies (the identification and PSI-dependent migration of proteins previously described as part of the CEF supercomplex, such as PGRL1, FNR, and cyt f). Notably, this approach can be adapted to address a broad range of questions, e.g. comparative analyses of multiprotein complexes isolated under distinct environmental conditions.
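Relative quantification between the 14N- and 15N-labeled strains boils down to intensity ratios of labeled peptide pairs. The sketch below is a simplified illustration with hypothetical intensities, not the authors' MS software workflow:

```python
# Simplified sketch: summarize the relative abundance of a protein between the two
# differentially labeled strains as the median 14N/15N peptide intensity ratio.
from statistics import median

def protein_ratio(peptide_pairs):
    """peptide_pairs: list of (intensity_14N, intensity_15N) tuples for one protein."""
    return median(i14 / i15 for i14, i15 in peptide_pairs)

# Hypothetical peptide intensities for PGRL1 in one sucrose gradient fraction:
pgrl1 = [(2.1e6, 1.0e6), (1.8e6, 0.9e6), (2.4e6, 1.3e6)]
print(f"PGRL1 14N/15N ratio = {protein_ratio(pgrl1):.2f}")
```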
Microbiology, Issue 85, Sucrose density gradients, Chlamydomonas, multiprotein complexes, 15N metabolic labeling, thylakoids
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
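A DoE workflow starts from an enumeration of factor-level combinations from which the software then selects an optimal or fractional subset. The sketch below illustrates this enumeration with hypothetical factors and levels, not the actual design space used in the article:

```python
# Minimal sketch, assuming illustrative factors and levels: enumerate a full factorial
# design; a DoE tool would normally select a fractional or optimal subset of these runs.
from itertools import product

factors = {
    "incubation_temp_C": [22, 25],      # hypothetical levels
    "incubation_time_d": [3, 5],
    "plant_age_weeks": [6, 7, 8],
}

runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(f"{len(runs)} runs in the full factorial design")
for run in runs[:3]:
    print(run)  # each run is one combination of factor settings
```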
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Transgenic Rodent Assay for Quantifying Male Germ Cell Mutant Frequency
Authors: Jason M. O'Brien, Marc A. Beal, John D. Gingerich, Lynda Soper, George R. Douglas, Carole L. Yauk, Francesco Marchetti.
Institutions: Environmental Health Centre.
De novo mutations arise mostly in the male germline and may contribute to adverse health outcomes in subsequent generations. Traditional methods for assessing the induction of germ cell mutations require the use of large numbers of animals, making them impractical. As such, germ cell mutagenicity is rarely assessed during chemical testing and risk assessment. Herein, we describe an in vivo male germ cell mutation assay using a transgenic rodent model that is based on a recently approved Organisation for Economic Co-operation and Development (OECD) test guideline. This method uses an in vitro positive selection assay to measure in vivo mutations induced in a transgenic λgt10 vector bearing a reporter gene directly in the germ cells of exposed males. We further describe how the detection of mutations in the transgene recovered from germ cells can be used to characterize the stage-specific sensitivity of the various spermatogenic cell types to mutagen exposure by controlling three experimental parameters: the duration of exposure (administration time), the time between exposure and sample collection (sampling time), and the cell population collected for analysis. Because a large number of germ cells can be assayed from a single male, this method has superior sensitivity compared with traditional methods, requires fewer animals and therefore much less time and resources.
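In transgene rescue assays of this type, the mutant frequency is the number of mutant plaques recovered under positive selection divided by the total number of plaques (rescued transgene copies) screened. The sketch below uses hypothetical counts to illustrate the calculation and a fold-induction comparison:

```python
# Brief sketch with hypothetical plaque counts (not data from the protocol).

def mutant_frequency(mutant_pfu, total_pfu):
    """Mutant frequency = mutant plaques under positive selection / total plaques screened."""
    return mutant_pfu / total_pfu

mf_control = mutant_frequency(mutant_pfu=18, total_pfu=600_000)
mf_treated = mutant_frequency(mutant_pfu=95, total_pfu=550_000)
print(f"control MF = {mf_control:.1e}, treated MF = {mf_treated:.1e}, "
      f"fold induction = {mf_treated / mf_control:.1f}")
```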
Genetics, Issue 90, sperm, spermatogonia, male germ cells, spermatogenesis, de novo mutation, OECD TG 488, transgenic rodent mutation assay, N-ethyl-N-nitrosourea, genetic toxicology
A Strategy to Identify de Novo Mutations in Common Disorders such as Autism and Schizophrenia
Authors: Gauthier Julie, Fadi F. Hamdan, Guy A. Rouleau.
Institutions: Universite de Montreal.
There are several lines of evidence supporting the role of de novo mutations as a mechanism for common disorders, such as autism and schizophrenia. First, the de novo mutation rate in humans is relatively high, so new mutations are generated at a high frequency in the population. However, de novo mutations have not been reported in most common diseases. Mutations in genes leading to severe diseases where there is a strong negative selection against the phenotype, such as lethality in embryonic stages or reduced reproductive fitness, will not be transmitted to multiple family members, and therefore will not be detected by linkage gene mapping or association studies. The observation of very high concordance in monozygotic twins and very low concordance in dizygotic twins also strongly supports the hypothesis that a significant fraction of cases may result from new mutations. Such is the case for diseases such as autism and schizophrenia. Second, despite reduced reproductive fitness1 and extremely variable environmental factors, the incidence of some diseases is maintained worldwide at a relatively high and constant rate. This is the case for autism and schizophrenia, with an incidence of approximately 1% worldwide. Mutational load can be thought of as a balance between selection for or against a deleterious mutation and its production by de novo mutation. Lower rates of reproduction constitute a negative selection factor that should reduce the number of mutant alleles in the population, ultimately leading to decreased disease prevalence. These selective pressures tend to be of different intensity in different environments. Nonetheless, these severe mental disorders have been maintained at a constant relatively high prevalence in the worldwide population across a wide range of cultures and countries despite a strong negative selection against them2. This is not what one would predict in diseases with reduced reproductive fitness, unless there were a high new mutation rate. Finally, the effects of paternal age: there is a significantly increased risk of the disease with increasing paternal age, which could result from the age related increase in paternal de novo mutations. This is the case for autism and schizophrenia3. The male-to-female ratio of mutation rate is estimated at about 4–6:1, presumably due to a higher number of germ-cell divisions with age in males. Therefore, one would predict that de novo mutations would more frequently come from males, particularly older males4. A high rate of new mutations may in part explain why genetic studies have so far failed to identify many genes predisposing to complex diseases, such as autism and schizophrenia, and why diseases have been identified for a mere 3% of genes in the human genome. Identification of de novo mutations as a cause of a disease requires a targeted molecular approach, which includes studying parents and affected subjects. The process for determining if the genetic basis of a disease may result in part from de novo mutations and the molecular approach to establish this link will be illustrated, using autism and schizophrenia as examples.
Medicine, Issue 52, de novo mutation, complex diseases, schizophrenia, autism, rare variations, DNA sequencing
Play Button
Pyrosequencing: A Simple Method for Accurate Genotyping
Authors: Cristi King, Tiffany Scott-Horton.
Institutions: Washington University in St. Louis.
Pharmacogenetic research benefits first-hand from the abundance of information provided by the completion of the Human Genome Project. With such a tremendous amount of data available comes an explosion of genotyping methods. Pyrosequencing® is one of the most thorough yet simple methods to date for analyzing polymorphisms. It can also identify tri-allelic polymorphisms, indels, and short-repeat polymorphisms, and can determine allele percentages for methylation or pooled-sample assessment. In addition, a standardized control sequence provides internal quality control. This method has enabled rapid and efficient evaluation of single-nucleotide polymorphisms, including many clinically relevant polymorphisms. The technique and methodology of Pyrosequencing are explained.
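To illustrate the allele-percentage readout mentioned above, here is a minimal sketch (with hypothetical peak heights; commercial Pyrosequencing software applies assay-specific corrections) of how an allele fraction could be estimated from the two pyrogram peak heights at a SNP position:

# Hypothetical example: estimate the fraction of allele A at a SNP from the
# relative pyrogram peak heights (e.g. for methylation or pooled samples).
def allele_percentage(peak_height_a, peak_height_b):
    """Percentage of allele A, assuming peak height is proportional to the
    amount of nucleotide incorporated (an idealization)."""
    return 100.0 * peak_height_a / (peak_height_a + peak_height_b)

print(f"{allele_percentage(72.0, 28.0):.1f}% allele A")  # 72.0% in this example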
Cellular Biology, Issue 11, Springer Protocols, Pyrosequencing, genotype, polymorphism, SNP, pharmacogenetics, pharmacogenomics, PCR
Play Button
The WATCHMAN Left Atrial Appendage Closure Device for Atrial Fibrillation
Authors: Sven Möbius-Winkler, Marcus Sandri, Norman Mangner, Phillip Lurz, Ingo Dähnert, Gerhard Schuler.
Institutions: University of Leipzig Heart Center.
Atrial fibrillation (AF) is the most common cardiac arrhythmia, affecting an estimated 6 million people in the United States 1. Since AF affects primarily elderly people, its prevalence increases in parallel with age. As such, it is expected that 15.9 million Americans will be affected by the year 2050 2. Ischemic stroke occurs in 5% of non-anticoagulated AF patients each year. Current treatments for AF include rate control, rhythm control and prevention of stroke 3. The American College of Cardiology, American Heart Association, and European Society of Cardiology currently recommend rate control as the first course of therapy for AF 3. Rate control is achieved by administration of pharmacological agents, such as β-blockers, that lower the heart rate until it reaches a less symptomatic state 3. Rhythm control aims to return the heart to its normal sinus rhythm and is typically achieved through antiarrhythmic drugs such as amiodarone, electrical cardioversion, or ablation therapy. Rhythm-control methods, however, have not been demonstrated to be superior to rate-control methods 4-6. In fact, certain antiarrhythmic drugs have been shown to be associated with higher hospitalization rates, serious adverse effects 3, or even increases in mortality in patients with structural heart defects 7. Thus, treatment with antiarrhythmics is more often used when rate-control drugs are ineffective or contraindicated. Rate-control and antiarrhythmic agents relieve the symptoms of AF, including palpitations, shortness of breath, and fatigue 8, but do not reliably prevent thromboembolic events 6. Treatment with the anticoagulant drug warfarin significantly reduces the rate of stroke or embolism 9,10. However, because of problems associated with its use, fewer than 50% of patients are treated with it. The therapeutic dose is affected by drug, dietary, and metabolic interactions, and thus requires detailed monitoring. In addition, warfarin has the potential to cause severe, sometimes lethal, bleeding 2. As an alternative, aspirin is commonly prescribed. While aspirin is typically well tolerated, it is far less effective at preventing stroke 10. Other alternatives to warfarin, such as dabigatran 11 or rivaroxaban 12, demonstrate non-inferiority to warfarin with respect to thromboembolic events (in fact, dabigatran given at a high dose of 150 mg twice a day has shown superiority). While these drugs have the advantage of eliminating dietary concerns and the need for regular blood monitoring, major bleeding and associated complications, although somewhat less frequent than with warfarin, remain an issue 13-15. Since 90% of AF-associated strokes result from emboli that arise from the left atrial appendage (LAA) 2, one alternative approach to warfarin therapy has been to exclude the LAA using an implanted device that traps blood clots before they exit. Here, we demonstrate a procedure for implanting the WATCHMAN Left Atrial Appendage Closure Device. A transseptal cannula is inserted through the femoral vein, and under fluoroscopic guidance, the interatrial septum is crossed. Once access to the left atrium has been achieved, a guidewire is placed in the upper pulmonary vein, and the WATCHMAN Access Sheath and dilator are advanced over the wire into the left atrium. The guidewire is removed, and the access sheath is carefully advanced into the distal portion of the LAA over a pigtail catheter. The WATCHMAN Delivery System is prepped, inserted into the access sheath, and slowly advanced. The WATCHMAN device is then deployed into the LAA. The device release criteria are confirmed via fluoroscopy and transesophageal echocardiography (TEE), and the device is released.
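The 5% annual stroke risk cited above can be translated into absolute terms with simple arithmetic; the sketch below uses a hypothetical 60% relative risk reduction purely for illustration (it is not a trial result from this article):

# Illustrative arithmetic: the 5% annual stroke risk is from the abstract;
# the 60% relative risk reduction is a hypothetical value for demonstration.
baseline_annual_risk = 0.05            # non-anticoagulated AF patients
relative_risk_reduction = 0.60         # hypothetical therapy effect
treated_risk = baseline_annual_risk * (1 - relative_risk_reduction)
strokes_averted_per_1000 = 1000 * (baseline_annual_risk - treated_risk)
print(f"{strokes_averted_per_1000:.0f} strokes averted per 1,000 patient-years")  # 30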
Medicine, Issue 60, atrial fibrillation, cardiology, cardiac, interventional cardiology, medical procedures, medicine, WATCHMAN, medical device, left atrial appendage
Play Button
Catheter Ablation in Combination With Left Atrial Appendage Closure for Atrial Fibrillation
Authors: Martin J. Swaans, Arash Alipour, Benno J.W.M. Rensing, Martijn C. Post, Lucas V.A. Boersma.
Institutions: St. Antonius Hospital, The Netherlands.
Atrial fibrillation (AF) is the most common sustained cardiac arrhythmia, affecting millions of individuals worldwide 1-3. The rapid, irregular, and disordered electrical activity in the atria gives rise to palpitations, fatigue, dyspnea, chest pain, and dizziness with or without syncope 4, 5. Patients with AF have a five-fold higher risk of stroke 6. Oral anticoagulation (OAC) with warfarin is commonly used for stroke prevention in patients with AF and has been shown to reduce the risk of stroke by 64% 7. Warfarin therapy has several major disadvantages, however, including bleeding, non-tolerance, interactions with other medications and foods, non-compliance, and a narrow therapeutic range 8-11. These issues, together with poor appreciation of the risk-benefit ratio, unawareness of guidelines, or absence of an OAC monitoring outpatient clinic, may explain why only 30-60% of patients with AF are prescribed this drug 8. The problems associated with warfarin, combined with the limited efficacy and/or serious side effects associated with other medications used for AF 12,13, highlight the need for effective non-pharmacological approaches to treatment. One such approach is catheter ablation (CA), a procedure in which a radiofrequency electrical current is applied to regions of the heart to create small ablation lesions that electrically isolate potential AF triggers 4. CA is a well-established treatment for AF symptoms 14, 15 that may also decrease the risk of stroke. Recent data showed a significant decrease in the relative risk of stroke and transient ischemic attack events among patients who underwent ablation compared with those undergoing antiarrhythmic drug therapy 16. Since the left atrial appendage (LAA) is the source of thrombi in more than 90% of patients with non-valvular atrial fibrillation 17, another approach to stroke prevention is to physically block clots from exiting the LAA. One method for occluding the LAA is percutaneous placement of the WATCHMAN LAA closure device. The WATCHMAN device resembles a small parachute. It consists of a nitinol frame covered by a polyethylene terephthalate fabric that prevents emboli, but not blood, from exiting during the healing process. Fixation anchors around the perimeter secure the device in the LAA (Figure 1). To date, the WATCHMAN is the only implanted percutaneous device for which a randomized clinical trial has been reported. In this study, implantation of the WATCHMAN was found to be at least as effective as warfarin in preventing stroke (all causes) and death (all causes) 18. The device received the Conformité Européenne (CE) mark for use in the European Union for warfarin-eligible patients and for those who have a contraindication to anticoagulation therapy 19. Given the proven effectiveness of CA in alleviating AF symptoms and the promising data with regard to reduction of thromboembolic events with both CA and WATCHMAN implantation, it is hoped that combining the two procedures will further reduce the incidence of stroke in high-risk patients while simultaneously relieving symptoms. The combined procedure may eventually enable patients to undergo implantation of the WATCHMAN device without subsequent warfarin treatment, since the CA procedure itself reduces thromboembolic events. This would present an avenue of treatment previously unavailable to patients ineligible for warfarin treatment because of recurrent bleeding 20 or other warfarin-associated problems.
The combined procedure is performed under general anesthesia with biplane fluoroscopy and TEE guidance. Catheter ablation is followed by implantation of the WATCHMAN LAA closure device. Data from a non-randomized trial with 10 patients demonstrate that this procedure can be safely performed in patients with a CHADS2 score greater than 1 21. Further studies to examine the effectiveness of the combined procedure in reducing symptoms of AF and associated stroke are therefore warranted.
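For reference, the CHADS2 score mentioned above is a standard sum of clinical risk factors. The sketch below implements the conventional scoring; it is included only to illustrate the "score greater than 1" criterion and is not part of the procedure described here:

# Conventional CHADS2 scoring: 1 point each for Congestive heart failure,
# Hypertension, Age >= 75, and Diabetes; 2 points for prior Stroke or TIA.
def chads2(chf, hypertension, age_75_or_over, diabetes, prior_stroke_or_tia):
    return (int(chf) + int(hypertension) + int(age_75_or_over)
            + int(diabetes) + 2 * int(prior_stroke_or_tia))

# Example: hypertension plus age >= 75 gives a score of 2, i.e. greater than 1.
print(chads2(chf=False, hypertension=True, age_75_or_over=True,
             diabetes=False, prior_stroke_or_tia=False))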
Medicine, Issue 72, Anatomy, Physiology, Biomedical Engineering, Immunology, Cardiology, Surgery, catheter ablation, WATCHMAN, LAA occlusion, atrial fibrillation, left atrial appendage, warfarin, oral anticoagulation alternatives, catheterization, ischemia, stroke, heart, vein, clinical, surgical device, surgical techniques, Vitamin K antagonist
Play Button
Use of Arabidopsis eceriferum Mutants to Explore Plant Cuticle Biosynthesis
Authors: Lacey Samuels, Allan DeBono, Patricia Lam, Miao Wen, Reinhard Jetter, Ljerka Kunst.
Institutions: University of British Columbia - UBC.
The plant cuticle is a waxy outer covering on plants that has a primary role in water conservation, but is also an important barrier against the entry of pathogenic microorganisms. The cuticle is made up of a tough crosslinked polymer called "cutin" and a protective wax layer that seals the plant surface. The waxy layer of the cuticle is obvious on many plants, appearing as a shiny film on the ivy leaf or as a dusty outer covering on the surface of a grape or a cabbage leaf, thanks to light-scattering crystals present in the wax. Because the cuticle is an essential adaptation of plants to a terrestrial environment, understanding the genes involved in plant cuticle formation has applications in both agriculture and forestry. Here, we show the analysis of plant cuticle mutants identified by forward and reverse genetic approaches.
Plant Biology, Issue 16, Annual Review, Cuticle, Arabidopsis, Eceriferum Mutants, Cryo-SEM, Gas Chromatography
Play Button
Minimal Erythema Dose (MED) Testing
Authors: Carolyn J. Heckman, Rachel Chandler, Jacqueline D. Kloss, Amy Benson, Deborah Rooney, Teja Munshi, Susan D. Darlow, Clifford Perlis, Sharon L. Manne, David W. Oslin.
Institutions: Fox Chase Cancer Center, University of Pennsylvania, Drexel University, The Cancer Institute of New Jersey.
Ultraviolet (UV) radiation therapy is sometimes used as a treatment for various common skin conditions, including psoriasis, acne, and eczema. The dosage of UV light is prescribed according to an individual's skin sensitivity. Thus, to establish the proper dosage of UV light to administer to a patient, the patient is sometimes screened to determine a minimal erythema dose (MED), which is the amount of UV radiation that will produce minimal erythema (sunburn or redness caused by engorgement of capillaries) of an individual's skin within a few hours following exposure. This article describes how to conduct MED testing. There is currently no easy way to determine an appropriate UV dose for clinical or research purposes without conducting formal MED testing, which requires observation for hours after exposure, or informal trial-and-error testing, which carries the risk of under- or over-dosing. However, some alternative methods are discussed.
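As a sketch of how a graded exposure series for MED testing might be laid out, the following generates a geometric dose ladder. The starting dose, 25% increments, and six test sites are assumptions chosen for illustration, not values from this article; actual protocols are selected according to skin phototype and light source:

# Illustrative dose ladder for MED patch testing; all numbers are assumptions.
def med_dose_series(start_dose_mj_per_cm2=20.0, increment=1.25, n_sites=6):
    """Geometric series of UV doses (mJ/cm^2), one per test site."""
    return [round(start_dose_mj_per_cm2 * increment**i, 1) for i in range(n_sites)]

print(med_dose_series())  # e.g. [20.0, 25.0, 31.2, 39.1, 48.8, 61.0]

The MED would then be read hours later as the lowest dose producing just-perceptible erythema, as described above.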
Medicine, Issue 75, Anatomy, Physiology, Dermatology, Analytical, Diagnostic, Therapeutic Techniques, Equipment, Health Care, Minimal erythema dose (MED) testing, skin sensitivity, ultraviolet radiation, spectrophotometry, UV exposure, psoriasis, acne, eczema, clinical techniques

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, our video library contains no content relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matches that are only loosely related.
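JoVE does not describe its matching algorithm, so the following is purely an illustrative sketch of one common way abstract-to-video matching could be done, using TF-IDF cosine similarity with scikit-learn; the function and variable names are hypothetical:

# Illustrative only: one plausible way to rank video descriptions against a
# PubMed abstract. This is not JoVE's actual algorithm.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def rank_videos(abstract, video_descriptions, top_n=30):
    """Return indices of the top_n video descriptions most similar to the abstract."""
    texts = [abstract] + list(video_descriptions)
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(texts)
    similarities = cosine_similarity(tfidf[0:1], tfidf[1:]).ravel()
    return similarities.argsort()[::-1][:top_n]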