Psoriasis is a chronic, immune-mediated inflammatory skin disease affecting approximately 2-3% of the population. The Goeckerman regimen consists of exposure to ultraviolet B (UVB) light and application of crude coal tar (CCT). Goeckerman therapy is extremely effective and relatively safe for the treatment of psoriasis and for improving a patient's quality of life. In the following article, we present our protocol for Goeckerman therapy as utilized specifically at the University of California, San Francisco. This protocol details the preparation of supplies, administration of phototherapy, and application of topical tar. It also describes how to assess the patient daily, monitor for adverse effects (including pruritus and burning), and adjust the treatment based on the patient's response. Though this is one of the oldest therapies available for psoriasis, no published videos demonstrate the process in detail. The video is beneficial for healthcare providers who want to administer the therapy, for trainees who want to learn more about the process, and for prospective patients who want to undergo treatment for their cutaneous disease.
Generation of High Quality Chromatin Immunoprecipitation DNA Template for High-throughput Sequencing (ChIP-seq)
Institutions: Children's Hospital of Philadelphia Research Institute, University of Pennsylvania.
ChIP-sequencing (ChIP-seq) combines chromatin immunoprecipitation (ChIP) with massively parallel sequencing to identify the repertoire of mammalian DNA sequences bound by transcription factors in vivo. "Next-generation" genome sequencing technologies provide a 1-2 order-of-magnitude increase in the amount of sequence that can be cost-effectively generated over older technologies, allowing ChIP-seq methods to directly provide whole-genome coverage for effective profiling of mammalian protein-DNA interactions.
For successful ChIP-seq approaches, one must generate high quality ChIP DNA template to obtain the best sequencing outcomes. The description here is based on experience with the protein product of the gene most strongly implicated in the pathogenesis of type 2 diabetes, namely transcription factor 7-like 2 (TCF7L2). This factor has also been implicated in various cancers.
Outlined is how to generate high quality ChIP DNA template derived from the colorectal carcinoma cell line HCT116, in order to build a high-resolution map through sequencing and determine the genes bound by TCF7L2, giving further insight into its key role in the pathogenesis of complex traits.
Molecular Biology, Issue 74, Genetics, Biochemistry, Microbiology, Medicine, Proteins, DNA-Binding Proteins, Transcription Factors, Chromatin Immunoprecipitation, Genes, chromatin, immunoprecipitation, ChIP, DNA, PCR, sequencing, antibody, cross-link, cell culture, assay
Handwriting Analysis Indicates Spontaneous Dyskinesias in Neuroleptic Naïve Adolescents at High Risk for Psychosis
Institutions: University of Colorado Boulder, NeuroScript LLC, University of California, San Diego.
Growing evidence suggests that movement abnormalities are a core feature of psychosis. One marker of movement abnormality, dyskinesia, results from impaired neuromodulation of dopamine in fronto-striatal pathways. Traditional methods for identifying movement abnormalities include observer-based reports and force stability gauges. The drawbacks of these methods are long training times for raters, experimenter bias, large site differences in instrumental apparatus, and suboptimal reliability. Taking these drawbacks into account has guided the development of better standardized and more efficient procedures to examine movement abnormalities using handwriting analysis software and a digitizing tablet. Individuals at risk for psychosis showed significantly more dysfluent pen movements (a proximal measure of dyskinesia) in a handwriting task. Handwriting kinematics offers a substantial advance over previous methods of assessing dyskinesia, which could be beneficial for understanding the etiology of psychosis.
Behavior, Issue 81, Schizophrenia, Disorders with Psychotic Features, Psychology, Clinical, Psychopathology, behavioral sciences, Movement abnormalities, Ultra High Risk, psychosis, handwriting, computer tablet, dyskinesia
An Allele-specific Gene Expression Assay to Test the Functional Basis of Genetic Associations
Institutions: University of Oxford.
The number of significant genetic associations with common complex traits is constantly increasing. However, most of these associations have not been understood at the molecular level. One of the mechanisms mediating the effect of DNA variants on phenotypes is gene expression, which has been shown to be particularly relevant for complex traits1. This method tests, in a cellular context, the effect of specific DNA sequences on gene expression. The principle is to measure the relative abundance of transcripts arising from the two alleles of a gene, analysing cells which carry one copy of the DNA sequences associated with disease (the risk variants)2,3. Therefore, the cells used for this method should meet two fundamental genotypic requirements: they have to be heterozygous both for the DNA risk variants and for DNA markers, typically coding polymorphisms, which can distinguish transcripts based on their chromosomal origin (Figure 1). DNA risk variants and DNA markers do not need to have the same allele frequency, but the phase (haplotypic) relationship between them needs to be understood. It is also important to choose cell types which express the gene of interest. This protocol refers specifically to the procedure adopted to extract nucleic acids from fibroblasts, but the method is equally applicable to other cell types, including primary cells.
DNA and RNA are extracted from the selected cell lines and cDNA is generated. DNA and cDNA are analysed with a primer extension assay designed to target the coding DNA markers4. The primer extension assay is carried out on the MassARRAY (Sequenom)5 platform according to the manufacturer's specifications. Primer extension products are then analysed by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). Because the selected markers are heterozygous, they generate two peaks on the MS profiles. The area of each peak is proportional to the transcript abundance and can be measured with a function of the MassARRAY Typer software to generate an allelic ratio (allele 1:allele 2). The allelic ratio obtained for cDNA is normalised using that measured from genomic DNA, where the allelic ratio is expected to be 1:1, to correct for technical artifacts. Markers with a normalised allelic ratio significantly different from 1 indicate that the amount of transcript generated from the two chromosomes in the same cell differs, suggesting that the DNA variants associated with the phenotype have an effect on gene expression. Experimental controls should be used to confirm the results.
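The normalization step described above reduces to dividing the cDNA peak-area ratio by the matching gDNA ratio. A minimal sketch follows; the function name and peak areas are illustrative only, not part of the MassARRAY Typer software:

```python
def normalized_allelic_ratio(cdna_peak1, cdna_peak2, gdna_peak1, gdna_peak2):
    """Normalise the cDNA allelic ratio by the genomic DNA ratio.

    Peak areas come from the MALDI-TOF spectra of the primer extension
    products. gDNA from a heterozygote is expected to be ~1:1, so dividing
    by the gDNA ratio corrects for allele-specific technical bias.
    """
    cdna_ratio = cdna_peak1 / cdna_peak2
    gdna_ratio = gdna_peak1 / gdna_peak2
    return cdna_ratio / gdna_ratio

# Hypothetical example: allele 1 transcript is over-represented even after
# correcting for a slight technical skew seen in the gDNA assay.
ratio = normalized_allelic_ratio(60.0, 38.0, 52.0, 50.0)
```

A normalised ratio significantly different from 1 (established against experimental controls) is the signal of allele-specific expression described in the text.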
Cellular Biology, Issue 45, Gene expression, regulatory variant, haplotype, association study, primer extension, MALDI-TOF mass spectrometry, single nucleotide polymorphism, allele-specific
Detection of Rare Genomic Variants from Pooled Sequencing Using SPLINTER
Institutions: Washington University School of Medicine.
As DNA sequencing technology has markedly advanced in recent years2, it has become increasingly evident that the amount of genetic variation between any two individuals is greater than previously thought3. In contrast, array-based genotyping has failed to identify a significant contribution of common sequence variants to the phenotypic variability of common disease4,5. Taken together, these observations have led to the evolution of the Common Disease / Rare Variant hypothesis, which suggests that the majority of the "missing heritability" in common and complex phenotypes is instead due to an individual's personal profile of rare or private DNA variants6-8. However, characterizing how rare variation impacts complex phenotypes requires the analysis of many affected individuals at many genomic loci, ideally compared to a similar survey in an unaffected cohort. Despite the sequencing power offered by today's platforms, a population-based survey of many genomic loci and the subsequent computational analysis required remain prohibitive for many investigators.
To address this need, we have developed a pooled sequencing approach1,9 and a novel software package1 for highly accurate rare variant detection from the resulting data. The ability to pool genomes from entire populations of affected individuals and survey the degree of genetic variation at multiple targeted regions in a single sequencing library provides excellent cost and time savings over traditional single-sample sequencing methodology. With a mean sequencing coverage per allele of 25-fold, our custom algorithm, SPLINTER, uses an internal variant calling control strategy to call insertions, deletions, and substitutions up to four base pairs in length with high sensitivity and specificity from pools of up to 1 mutant allele in 500 individuals. Here we describe the method for preparing the pooled sequencing library, followed by step-by-step instructions on how to use the SPLINTER package for pooled sequencing analysis (http://www.ibridgenetwork.org/wustl/splinter). We show a comparison between pooled sequencing of 947 individuals, all of whom also underwent genome-wide array genotyping, at over 20 kb of sequence per person. Concordance between array genotypes and both tagged and novel variants called in the pooled sample was excellent. This method can be easily scaled up to any number of genomic loci and any number of individuals. By incorporating internal positive and negative amplicon controls at ratios that mimic the population under study, the algorithm can be calibrated for optimal performance. This strategy can also be modified for use with hybridization capture or individual-specific barcodes and can be applied to the sequencing of naturally heterogeneous samples, such as tumor DNA.
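The detection limit quoted above (1 mutant allele among 500 pooled individuals at 25-fold coverage per allele) can be checked with a back-of-the-envelope calculation. The helper below is an illustrative sketch of that arithmetic, not part of the SPLINTER package:

```python
def expected_minor_allele_fraction(carriers, pool_size, heterozygous=True):
    """Expected fraction of reads carrying a rare variant in a pooled library.

    A pool of `pool_size` diploid individuals contributes 2 * pool_size
    alleles; a heterozygous carrier contributes one mutant allele.
    """
    total_alleles = 2 * pool_size
    mutant_alleles = carriers * (1 if heterozygous else 2)
    return mutant_alleles / total_alleles

# One heterozygous carrier in a pool of 500 people: 1 mutant allele
# among 1,000, i.e. roughly 0.1% of reads at that position.
frac = expected_minor_allele_fraction(1, 500)

# At 25x coverage per allele (25 * 1,000 reads over the whole pool),
# roughly 25 reads are expected to carry the variant.
expected_reads = frac * 25 * 2 * 500
```

Distinguishing such a small excess of variant reads from sequencing error is exactly why the internal amplicon controls described above are needed to calibrate the caller.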
Genetics, Issue 64, Genomics, Cancer Biology, Bioinformatics, Pooled DNA sequencing, SPLINTER, rare genetic variants, genetic screening, phenotype, high throughput, computational analysis, DNA, PCR, primers
A Protocol for Computer-Based Protein Structure and Function Prediction
Institutions: University of Michigan, University of Kansas.
Genome sequencing projects have deciphered millions of protein sequences, whose structures and functions must be determined to improve the understanding of their biological roles. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules that are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms, and protein-ligand binding sites. All the predictions are tagged with a confidence score, which indicates how accurate the predictions are expected to be in the absence of experimental data. To accommodate special requests from end users, the server provides channels to accept user-specified inter-residue distance and contact maps to interactively guide I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. Such structural information can be collected by users from experimental evidence or biological insight with the purpose of improving the quality of I-TASSER predictions. The server has been ranked among the best programs for protein structure and function prediction in recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries using the on-line I-TASSER server.
Biochemistry, Issue 57, On-line server, I-TASSER, protein structure prediction, function prediction
Getting to Compliance in Forced Exercise in Rodents: A Critical Standard to Evaluate Exercise Impact in Aging-related Disorders and Disease
Institutions: Louisiana State University Health Sciences Center.
There is a major increase in awareness of the positive impact of exercise on several disease states with a neurobiological basis, including improvements in cognitive function and physical performance. As a result, the number of animal studies employing exercise is increasing. One intrinsic value of forced exercise is that the investigator controls the factors that can influence the impact of exercise on behavioral outcomes, notably the frequency, duration, and intensity of the exercise regimen. However, compliance in forced exercise regimens may be an issue, particularly if the potential confounds of foot-shock are to be avoided. It is also important to consider that, since most cognitive and locomotor impairments strike the aged individual, studies of the impact of exercise on these impairments should consider using aged rodents with the highest possible level of compliance, to minimize the number of test subjects required. Here, the pertinent steps and considerations necessary to achieve nearly 100% compliance to treadmill exercise in an aged rodent model are presented and discussed. Notwithstanding the particular exercise regimen employed by the investigator, our protocol should be of use to investigators particularly interested in the potential impact of forced exercise on aging-related impairments, including aging-related Parkinsonism and Parkinson's disease.
Behavior, Issue 90, Exercise, locomotor, Parkinson’s disease, aging, treadmill, bradykinesia, Parkinsonism
Dried Blood Spot Collection of Health Biomarkers to Maximize Participation in Population Studies
Institutions: Harvard School of Public Health, Brigham and Women's Hospital, Harvard Medical School, Pennsylvania State University.
Biomarkers are directly measured biological indicators of disease, health, exposures, or other biological information. In the population and social sciences, biomarkers need to be easy to obtain, transport, and analyze. Dried blood spots (DBS) meet this need and can be collected in the field with high response rates. These elements are particularly important in longitudinal study designs, including interventions, where attrition is critical to avoid and high response rates improve the interpretation of results. DBS sample collection is simple, quick, relatively painless, less invasive than venipuncture, and has minimal field storage requirements (i.e., samples do not need to be immediately frozen and can be stored for a long period in a stable freezer environment before assay). The samples can be analyzed for a variety of analytes, including cholesterol, C-reactive protein, glycosylated hemoglobin, and numerous cytokines, and can also provide genetic material. DBS collection is depicted as employed in several recent studies.
Medicine, Issue 83, dried blood spots (DBS), Biomarkers, cardiometabolic risk, Inflammation, standard precautions, blood collection
High-throughput Functional Screening using a Homemade Dual-glow Luciferase Assay
Institutions: Massachusetts General Hospital.
We present a rapid and inexpensive high-throughput screening protocol to identify transcriptional regulators of alpha-synuclein, a gene associated with Parkinson's disease. 293T cells are transiently transfected with plasmids from an arrayed ORF expression library, together with luciferase reporter plasmids, in a one-gene-per-well microplate format. Firefly luciferase activity is assayed after 48 hr to determine the effect of each library gene on alpha-synuclein transcription, normalized to expression from an internal control construct (an hCMV promoter directing Renilla luciferase). This protocol is facilitated by a bench-top robot enclosed in a biosafety cabinet, which performs aseptic liquid handling in 96-well format. Our automated transfection protocol is readily adaptable to high-throughput lentiviral library production or other functional screening protocols requiring triple transfections of large numbers of unique library plasmids in conjunction with a common set of helper plasmids. We also present an inexpensive and validated alternative to commercially available dual-luciferase reagents, which employs PTC124, EDTA, and pyrophosphate to suppress firefly luciferase activity prior to measurement of Renilla luciferase. Using these methods, we screened 7,670 human genes and identified 68 regulators of alpha-synuclein. This protocol is easily modifiable to target other genes of interest.
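The normalization described above (firefly signal divided by the Renilla internal control, then compared against control wells) can be sketched in a few lines. Function names and the well readings are hypothetical, for illustration only:

```python
def normalized_reporter_activity(firefly_rlu, renilla_rlu, background_rlu=0.0):
    """Firefly/Renilla ratio for one well, after background subtraction.

    Firefly luciferase reports promoter activity of the gene of interest;
    the hCMV-driven Renilla signal is the internal transfection control.
    """
    return (firefly_rlu - background_rlu) / (renilla_rlu - background_rlu)

def fold_change(test_wells, control_wells):
    """Mean normalized activity of library-gene wells over control wells.

    Each well is an (firefly_rlu, renilla_rlu) pair of luminometer readings.
    """
    test = sum(normalized_reporter_activity(f, r) for f, r in test_wells) / len(test_wells)
    ctrl = sum(normalized_reporter_activity(f, r) for f, r in control_wells) / len(control_wells)
    return test / ctrl

# Hypothetical example: a library gene roughly doubling reporter output
# relative to empty-vector control wells.
fc = fold_change([(200.0, 100.0), (220.0, 110.0)], [(100.0, 100.0), (90.0, 90.0)])
```

In a real screen the fold change per library gene would feed a hit-calling threshold across the 96-well plates.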
Cellular Biology, Issue 88, Luciferases, Gene Transfer Techniques, Transfection, High-Throughput Screening Assays, Transfections, Robotics
Assessment and Evaluation of the High Risk Neonate: The NICU Network Neurobehavioral Scale
Institutions: Brown University, Women & Infants Hospital of Rhode Island, University of Massachusetts, Boston.
There has been a long-standing interest in the assessment of the neurobehavioral integrity of the newborn infant. The NICU Network Neurobehavioral Scale (NNNS) was developed as an assessment for the at-risk infant: infants at increased risk for poor developmental outcome because of insults during prenatal development, such as substance exposure or prematurity, or factors such as poverty, poor nutrition, or lack of prenatal care that can have adverse effects on the intrauterine environment and affect the developing fetus. The NNNS assesses the full range of infant neurobehavioral performance, including neurological integrity, behavioral functioning, and signs of stress/abstinence. The NNNS is a noninvasive neonatal assessment tool with demonstrated validity as a predictor not only of medical outcomes such as cerebral palsy diagnosis, neurological abnormalities, and diseases with risks to the brain, but also of developmental outcomes such as mental and motor functioning, behavior problems, school readiness, and IQ. The NNNS can identify infants at high risk for abnormal developmental outcome and is an important clinical tool that enables medical researchers and health practitioners to identify these infants and develop intervention programs to optimize their development as early as possible. The video shows the NNNS procedures, gives examples of normal and abnormal performance, and presents the various clinical populations in which the exam can be used.
Behavior, Issue 90, NICU Network Neurobehavioral Scale, NNNS, High risk infant, Assessment, Evaluation, Prediction, Long term outcome
gDNA Enrichment by a Transposase-based Technology for NGS Analysis of the Whole Sequence of BRCA1, BRCA2, and 9 Genes Involved in DNA Damage Repair
Institutions: Centre Georges-François Leclerc.
The widespread use of Next Generation Sequencing (NGS) has opened up new avenues for cancer research and diagnosis. NGS will bring huge amounts of new data on cancer, and especially on cancer genetics. Current knowledge and future discoveries will make it necessary to study a huge number of genes that could be involved in genetic predisposition to cancer. In this regard, we developed a Nextera design to study 11 complete genes involved in DNA damage repair. This protocol was developed to safely study 11 genes (ATM, ..., and TP53) from promoter to 3'-UTR in 24 patients simultaneously. This protocol, based on transposase technology and gDNA enrichment, greatly shortens the time to genetic diagnosis thanks to sample multiplexing. This protocol can be safely used with blood gDNA.
Genetics, Issue 92, gDNA enrichment, Nextera, NGS, DNA damage, BRCA1, BRCA2
Quantification of Atherosclerotic Plaque Activity and Vascular Inflammation using [18-F] Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography (FDG-PET/CT)
Institutions: University of Pennsylvania, Perelman School of Medicine.
Conventional non-invasive imaging modalities of atherosclerosis, such as coronary artery calcium (CAC)1 and carotid intimal medial thickness (C-IMT)2, provide information about the burden of disease. However, despite multiple validation studies of CAC3-5 and C-IMT2,6, these modalities do not accurately assess plaque characteristics7,8, and it is the composition and inflammatory state of the plaque that determine its stability and, therefore, the risk of clinical events9-13. [18F]-2-fluoro-2-deoxy-D-glucose (FDG) imaging using positron emission tomography (PET)/computed tomography (CT) has been extensively studied in oncologic metabolism14,15. Studies using animal models and immunohistochemistry in humans show that FDG-PET/CT is exquisitely sensitive for detecting macrophage activity16, an important source of cellular inflammation in vessel walls. More recently, we17,18 and others have shown that FDG-PET/CT enables highly precise, novel measurements of the inflammatory activity of atherosclerotic plaques in large and medium-sized arteries9,16,19,20.
FDG-PET/CT studies have many advantages over other imaging modalities: 1) high contrast resolution; 2) quantification of plaque volume and metabolic activity, allowing for multi-modal atherosclerotic plaque quantification; 3) dynamic, real-time, in vivo imaging; 4) minimal operator dependence. Finally, vascular inflammation detected by FDG-PET/CT has been shown to predict cardiovascular (CV) events independent of traditional risk factors21,22 and is also highly associated with the overall burden of atherosclerosis23. Plaque activity by FDG-PET/CT is modulated by known beneficial CV interventions such as short-term (12 week) statin therapy24 as well as longer-term therapeutic lifestyle changes (16 months)25. The current methodology for quantification of FDG uptake in atherosclerotic plaque involves measurement of the standardized uptake value (SUV) of an artery of interest and of the venous blood pool, in order to calculate a target-to-background ratio (TBR), computed by dividing the arterial SUV by the venous blood pool SUV. This method has been shown to represent a stable, reproducible phenotype over time, has high sensitivity for detection of vascular inflammation, and has high inter- and intra-reader reliability26. Here we present our methodology for patient preparation, image acquisition, and quantification of atherosclerotic plaque activity and vascular inflammation using SUV, TBR, and a global parameter called the metabolic volumetric product (MVP). These approaches may be applied to assess vascular inflammation in various study samples of interest in a consistent fashion, as we have shown in several prior publications9,20,27,28.
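The TBR calculation described above is a simple ratio per arterial measurement site. A minimal sketch follows; the function name and the example SUVs are illustrative, not taken from the article's data:

```python
def target_to_background_ratio(arterial_suvs, venous_suvs):
    """Per-slice TBR: arterial SUV divided by the mean venous blood-pool SUV.

    `arterial_suvs` holds one SUV measurement per axial slice of the artery
    of interest; `venous_suvs` holds blood-pool SUVs averaged as background.
    """
    blood_pool = sum(venous_suvs) / len(venous_suvs)
    return [suv / blood_pool for suv in arterial_suvs]

# Hypothetical aortic measurements over three slices, with a venous
# background close to 1.0, giving a mean TBR around 2.4.
slice_tbrs = target_to_background_ratio([2.4, 2.6, 2.2], [1.1, 1.0, 0.9])
mean_tbr = sum(slice_tbrs) / len(slice_tbrs)
```

Averaging the per-slice TBRs over the artery yields the stable, reproducible phenotype referred to in the text.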
Medicine, Issue 63, FDG-PET/CT, atherosclerosis, vascular inflammation, quantitative radiology, imaging
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Institutions: Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection and, if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set-point viral loads. We hypothesized that this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated at acute time points from subtype C-infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate for the study of subtype C sequences than previous recombination-based methods that assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis, such as set-point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
The ChroP Approach Combines ChIP and Mass Spectrometry to Dissect Locus-specific Proteomic Landscapes of Chromatin
Institutions: European Institute of Oncology.
Chromatin is a highly dynamic nucleoprotein complex made of DNA and proteins that controls various DNA-dependent processes. Chromatin structure and function at specific regions are regulated by the local enrichment of histone post-translational modifications (hPTMs) and variants, chromatin-binding proteins, including transcription factors, and DNA methylation. The proteomic characterization of chromatin composition at distinct functional regions has so far been hampered by the lack of efficient protocols to enrich such domains at the purity and amount required for in-depth analysis by mass spectrometry (MS). We describe here a newly designed chromatin proteomics strategy, named ChroP (Chromatin Proteomics), whereby a preparative chromatin immunoprecipitation is used to isolate distinct chromatin regions whose features, in terms of hPTMs, variants, and co-associated non-histone proteins, are analyzed by MS. We illustrate here the setup of ChroP for the enrichment and analysis of transcriptionally silent heterochromatic regions, marked by the presence of tri-methylation of lysine 9 on histone H3. The results achieved demonstrate the potential of ChroP for thoroughly characterizing the heterochromatin proteome and prove it to be a powerful analytical strategy for understanding how the distinct protein determinants of chromatin interact and synergize to establish locus-specific structural and functional configurations.
Biochemistry, Issue 86, chromatin, histone post-translational modifications (hPTMs), epigenetics, mass spectrometry, proteomics, SILAC, chromatin immunoprecipitation, histone variants, chromatome, hPTMs cross-talks
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree, represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e., pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e., composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16. We present an example of the application of this methodology to FDG PET data of Parkinson's disease patients and normal controls, using our in-house software to derive a characteristic covariance pattern biomarker of disease.
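The core SSM steps described above (logarithmic conversion, removal of global subject means and the group mean voxel profile, then PCA) can be sketched with NumPy. This is a minimal illustration of the idea under simplifying assumptions, not the authors' in-house software:

```python
import numpy as np

def ssm_patterns(images, n_patterns=2):
    """Minimal scaled-subprofile-model sketch.

    `images` is a (subjects x voxels) array of strictly positive values.
    Returns (patterns, scores): each row of `patterns` is a GIS-like
    covariance pattern over voxels, and `scores[s, k]` is subject s's
    expression of pattern k.
    """
    logged = np.log(images)
    # Row (subject) centering removes large global scalar effects;
    # column (voxel) centering removes the group mean profile.
    centered = logged - logged.mean(axis=1, keepdims=True)
    centered -= centered.mean(axis=0, keepdims=True)
    # PCA via SVD: right singular vectors are orthonormal voxel patterns,
    # and scaled left singular vectors give per-subject scores.
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    patterns = vt[:n_patterns]
    scores = u[:, :n_patterns] * s[:n_patterns]
    return patterns, scores

# Tiny synthetic example: 6 subjects x 20 voxels of positive "uptake" values.
rng = np.random.default_rng(0)
images = rng.uniform(1.0, 2.0, size=(6, 20))
patterns, scores = ssm_patterns(images, n_patterns=2)
```

The subject scores produced this way are the scalar pattern expression values that the text feeds into logistic regression and prospective validation.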
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
Ultrasound Assessment of Endothelial-Dependent Flow-Mediated Vasodilation of the Brachial Artery in Clinical Research
Institutions: University of California, San Francisco, Veterans Affairs Medical Center, San Francisco.
The vascular endothelium is a monolayer of cells that covers the interior of blood vessels and plays both structural and functional roles. The endothelium acts as a barrier, preventing leukocyte adhesion and aggregation, as well as controlling permeability to plasma components. Functionally, the endothelium affects vessel tone.
Endothelial dysfunction is an imbalance between the chemical species that regulate vessel tone, thromboresistance, cellular proliferation, and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia.
The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase of intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed while subjects with endothelial damage experienced paradoxical vasoconstriction.
There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia.
This technique, known as endothelium-dependent flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator dependent and presents a steep learning curve. This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
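The quantity the ultrasound technique reports is the percent diameter change from baseline during reactive hyperemia. A minimal sketch of that calculation (the function name and example diameters are illustrative, not values from the article):

```python
def fmd_percent(baseline_mm, peak_mm):
    """Flow-mediated dilation: percent change in brachial artery
    diameter from baseline to peak during reactive hyperemia."""
    return (peak_mm - baseline_mm) / baseline_mm * 100.0

# Example: baseline diameter 4.0 mm, peak diameter 4.3 mm
print(round(fmd_percent(4.0, 4.3), 1))  # 7.5
```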
Medicine, Issue 92, endothelial function, endothelial dysfunction, brachial artery, peripheral artery disease, ultrasound, vascular, endothelium, cardiovascular disease.
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
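The "software-guided setup of optimal experiment combinations" the abstract describes starts from the space of all factor-level combinations. As a hedged sketch, a full-factorial design over a few hypothetical factors (the factor names and levels below are illustrative, not the study's actual design) can be enumerated like this:

```python
from itertools import product

# Hypothetical factor levels; the real factors and levels would come
# from the knowledge-based selection step described in the abstract.
factors = {
    "promoter": ["35S", "nos"],
    "temperature_C": [22, 25],
    "plant_age_d": [35, 42],
}

# Full-factorial design: every combination of levels (2 x 2 x 2 = 8 runs).
# DoE software then selects an optimal subset rather than running all of them.
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))  # 8
```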
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Transgenic Rodent Assay for Quantifying Male Germ Cell Mutant Frequency
Institutions: Environmental Health Centre.
De novo mutations arise mostly in the male germline and may contribute to adverse health outcomes in subsequent generations. Traditional methods for assessing the induction of germ cell mutations require the use of large numbers of animals, making them impractical. As such, germ cell mutagenicity is rarely assessed during chemical testing and risk assessment. Herein, we describe an in vivo male germ cell mutation assay using a transgenic rodent model that is based on a recently approved Organisation for Economic Co-operation and Development (OECD) test guideline. This method uses an in vitro positive selection assay to measure in vivo mutations induced in a transgenic λgt10 vector bearing a reporter gene directly in the germ cells of exposed males. We further describe how the detection of mutations in the transgene recovered from germ cells can be used to characterize the stage-specific sensitivity of the various spermatogenic cell types to mutagen exposure by controlling three experimental parameters: the duration of exposure (administration time), the time between exposure and sample collection (sampling time), and the cell population collected for analysis. Because a large number of germ cells can be assayed from a single male, this method has superior sensitivity compared with traditional methods, requires fewer animals, and therefore demands much less time and fewer resources.
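The assay's endpoint, mutant frequency, is conventionally the ratio of mutant plaques to total plaques recovered in the selection step. A minimal sketch of that calculation (the function name and example counts are illustrative):

```python
def mutant_frequency(mutant_pfu, total_pfu):
    """Mutant frequency: mutant plaque-forming units over total
    plaque-forming units screened in the positive selection assay."""
    if total_pfu <= 0:
        raise ValueError("total_pfu must be positive")
    return mutant_pfu / total_pfu

# Example: 15 mutant plaques among 500,000 plaques recovered
print(mutant_frequency(15, 500_000))  # 3e-05
```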
Genetics, Issue 90, sperm, spermatogonia, male germ cells, spermatogenesis, de novo mutation, OECD TG 488, transgenic rodent mutation assay, N-ethyl-N-nitrosourea, genetic toxicology
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Infinium Assay for Large-scale SNP Genotyping Applications
Institutions: Oklahoma Medical Research Foundation.
Genotyping variants in the human genome has proven to be an efficient method to identify genetic associations with phenotypes. The distribution of variants within families or populations can facilitate identification of the genetic factors of disease. Illumina's panel of genotyping BeadChips allows investigators to genotype thousands or millions of single nucleotide polymorphisms (SNPs) or to analyze other genomic variants, such as copy number, across a large number of DNA samples. These SNPs can be spread throughout the genome or targeted in specific regions in order to maximize potential discovery. The Infinium assay has been optimized to yield high-quality, accurate results quickly. With proper setup, a single technician can process from a few hundred to over a thousand DNA samples per week, depending on the type of array. This assay guides users through every step, starting with genomic DNA and ending with the scanning of the array. Using proprietary reagents, samples are amplified, fragmented, precipitated, resuspended, hybridized to the chip, extended by a single base, stained, and scanned on either an iScan or HiScan high-resolution optical imaging system. One overnight step is required to amplify the DNA. The DNA is denatured and isothermally amplified by whole-genome amplification; therefore, no PCR is required. Samples are hybridized to the arrays during a second overnight step. By the third day, the samples are ready to be scanned and analyzed. Amplified DNA may be stockpiled in large quantities, allowing bead arrays to be processed every day of the week, thereby maximizing throughput.
Basic Protocol, Issue 81, genomics, SNP, Genotyping, Infinium, iScan, HiScan, Illumina
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3-6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
A Strategy to Identify de Novo Mutations in Common Disorders such as Autism and Schizophrenia
Institutions: Université de Montréal.
There are several lines of evidence supporting the role of de novo mutations as a mechanism for common disorders, such as autism and schizophrenia. First, the de novo mutation rate in humans is relatively high, so new mutations are generated at a high frequency in the population. However, de novo mutations have not been reported in most common diseases. Mutations in genes leading to severe diseases where there is strong negative selection against the phenotype, such as lethality in embryonic stages or reduced reproductive fitness, will not be transmitted to multiple family members, and therefore will not be detected by linkage gene mapping or association studies. The observation of very high concordance in monozygotic twins and very low concordance in dizygotic twins also strongly supports the hypothesis that a significant fraction of cases may result from new mutations. Such is the case for diseases such as autism and schizophrenia. Second, despite reduced reproductive fitness1 and extremely variable environmental factors, the incidence of some diseases is maintained worldwide at a relatively high and constant rate. This is the case for autism and schizophrenia, with an incidence of approximately 1% worldwide. Mutational load can be thought of as a balance between selection for or against a deleterious mutation and its production by de novo mutation. Lower rates of reproduction constitute a negative selection factor that should reduce the number of mutant alleles in the population, ultimately leading to decreased disease prevalence. These selective pressures tend to be of different intensity in different environments. Nonetheless, these severe mental disorders have been maintained at a constant, relatively high prevalence in the worldwide population across a wide range of cultures and countries despite strong negative selection against them2. This is not what one would predict in diseases with reduced reproductive fitness, unless there was a high new-mutation rate. Finally, the effects of paternal age: there is a significantly increased risk of disease with increasing paternal age, which could result from the age-related increase in paternal de novo mutations. This is the case for autism and schizophrenia3. The male-to-female ratio of mutation rate is estimated at about 4-6:1, presumably due to a higher number of germ-cell divisions with age in males. Therefore, one would predict that de novo mutations would more frequently come from males, particularly older males4. A high rate of new mutations may in part explain why genetic studies have so far failed to identify many genes predisposing to complex diseases, such as autism and schizophrenia, and why diseases have been identified for a mere 3% of genes in the human genome. Identification of de novo mutations as a cause of a disease requires a targeted molecular approach, which includes studying parents and affected subjects. The process for determining whether the genetic basis of a disease may result in part from de novo mutations and the molecular approach to establish this link will be illustrated, using autism and schizophrenia as examples.
Medicine, Issue 52, de novo mutation, complex diseases, schizophrenia, autism, rare variations, DNA sequencing
A New Single Chamber Implantable Defibrillator with Atrial Sensing: A Practical Demonstration of Sensing and Ease of Implantation
Institutions: University Hospital of Rostock, Germany.
Implantable cardioverter-defibrillators (ICDs) terminate ventricular tachycardia (VT) and ventricular fibrillation (VF) with high efficacy and can protect patients from sudden cardiac death (SCD). However, inappropriate shocks may occur if tachycardias are misdiagnosed. Inappropriate shocks are harmful and impair patient quality of life. The risk of inappropriate therapy increases with lower detection rates programmed in the ICD. Single-chamber detection poses greater risks for misdiagnosis when compared with dual-chamber devices that have the benefit of additional atrial information. However, using a dual-chamber device merely for the sake of detection is generally not accepted, since the risks associated with the second electrode may outweigh the benefits of detection. Therefore, BIOTRONIK developed a ventricular lead called the LinoxSMART S DX, which allows for the detection of atrial signals from two electrodes positioned at the atrial part of the ventricular electrode. This device contains two ring electrodes: one that contacts the atrial wall at the junction of the superior vena cava (SVC) and one positioned at the free-floating part of the electrode in the atrium. The excellent signal quality can only be achieved by a special filter setting in the ICD (Lumax 540 and 740 VR-T DX, BIOTRONIK). Here, the ease of implantation of the system will be demonstrated.
Medicine, Issue 60, Implantable defibrillator, dual chamber, single chamber, tachycardia detection
Minimal Erythema Dose (MED) Testing
Institutions: Fox Chase Cancer Center, University of Pennsylvania, Drexel University, The Cancer Institute of New Jersey.
Ultraviolet radiation (UV) therapy is sometimes used as a treatment for various common skin conditions, including psoriasis, acne, and eczema. The dosage of UV light is prescribed according to an individual's skin sensitivity. Thus, to establish the proper dosage of UV light to administer to a patient, the patient is sometimes screened to determine a minimal erythema dose (MED), which is the amount of UV radiation that will produce minimal erythema (sunburn or redness caused by engorgement of capillaries) of an individual's skin within a few hours following exposure. This article describes how to conduct minimal erythema dose (MED) testing. There is currently no easy way to determine an appropriate UV dose for clinical or research purposes without conducting formal MED testing, requiring observation hours after testing, or informal trial and error testing with the risks of under- or over-dosing. However, some alternative methods are discussed.
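MED testing typically exposes a series of small skin sites to stepwise-increasing UV doses and reads which site first shows minimal erythema. As a hedged sketch of such a dose ladder (the starting dose, six-site count, and 25% increment are illustrative assumptions, not the article's prescribed values):

```python
# Hypothetical six-site MED dose ladder: each successive test site
# receives 25% more UVB than the previous one.
start_mj_cm2 = 20.0   # illustrative starting dose, mJ/cm^2
n_sites = 6
doses = [round(start_mj_cm2 * 1.25 ** i, 1) for i in range(n_sites)]
print(doses)

# The lowest dose producing minimal erythema at follow-up is read
# off this ladder as the patient's MED.
```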
Medicine, Issue 75, Anatomy, Physiology, Dermatology, Analytical, Diagnostic, Therapeutic Techniques, Equipment, Health Care, Minimal erythema dose (MED) testing, skin sensitivity, ultraviolet radiation, spectrophotometry, UV exposure, psoriasis, acne, eczema, clinical techniques
A Rapid Technique for the Visualization of Live Immobilized Yeast Cells
Institutions: Princeton University.
We present here a simple, rapid, and extremely flexible technique for the immobilization and visualization of growing yeast cells by epifluorescence microscopy. The technique is equally suited for visualization of static yeast populations or time-course experiments up to ten hours in length. My microscopy investigates epigenetic inheritance at the silent mating loci in S. cerevisiae. There are two silent mating loci, HML and HMR, which are normally not expressed as they are packaged in heterochromatin. In the sir1 mutant background, silencing is weakened such that each locus can be in either the expressed or the silenced epigenetic state, so in the population as a whole there is a mix of cells of different epigenetic states for both HML and HMR. My microscopy demonstrated that there is no relationship between the epigenetic state of HML and HMR in an individual cell. sir1 cells stochastically switch epigenetic states, establishing silencing at a previously expressed locus or expressing a previously silenced locus. My time-course microscopy tracked individual sir1 cells and their offspring to score the frequency of each of the four possible epigenetic switches, and thus the stability of each of the epigenetic states in sir1 cells. See also Xu et al., Mol. Cell 2006.
Microbiology, Issue 1, yeast, HML, HMR, epigenetic, loci, silencing, cerevisiae