The vascular endothelium is a monolayer of cells that lines the interior of blood vessels and serves both structural and functional roles. Structurally, the endothelium acts as a barrier, preventing leukocyte adhesion and platelet aggregation and controlling permeability to plasma components. Functionally, the endothelium regulates vessel tone.
Endothelial dysfunction is an imbalance among the chemical species that regulate vessel tone, thromboresistance, cellular proliferation, and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia.
The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase of intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed while subjects with endothelial damage experienced paradoxical vasoconstriction.
There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia.
This technique, known as endothelium-dependent, flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator-dependent and presents a steep learning curve. This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
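The measurement itself reduces to a simple percent-change calculation on the two diameters. A minimal sketch (the function name and example diameters are ours, not from the article):

```python
def fmd_percent(baseline_mm, peak_mm):
    """Flow-mediated dilation: percent change from baseline to peak
    hyperemic brachial artery diameter."""
    if baseline_mm <= 0:
        raise ValueError("baseline diameter must be positive")
    return (peak_mm - baseline_mm) / baseline_mm * 100.0

# Hypothetical values: baseline 3.80 mm, peak 4.05 mm during reactive hyperemia
print(round(fmd_percent(3.80, 4.05), 1))  # -> 6.6 (% dilation)
```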
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
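The combinatorial setup behind a DoE study can be illustrated with a full-factorial enumeration. The factor names below echo the abstract (promoter, 5'UTR, incubation conditions) but the levels are hypothetical, and real DoE software would typically select an optimal (e.g. D-optimal) subset of runs rather than execute every combination:

```python
import itertools

# Hypothetical factors and levels for a transient-expression experiment
factors = {
    "promoter": ["35S", "nos"],
    "5'UTR": ["omega", "native"],
    "temperature_C": [22, 25],
    "incubation_days": [3, 5],
}

# Full factorial: every combination of one level per factor
names = list(factors)
full_factorial = [dict(zip(names, combo))
                  for combo in itertools.product(*factors.values())]
print(len(full_factorial))  # 2*2*2*2 = 16 runs
```

A DoE package would score candidate subsets of these 16 runs against an optimality criterion, which is what makes step-wise design augmentation tractable as factors multiply.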
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Measuring Oral Fatty Acid Thresholds, Fat Perception, Fatty Food Liking, and Papillae Density in Humans
Institutions: Deakin University.
Emerging evidence from a number of laboratories indicates that humans have the ability to identify fatty acids in the oral cavity, presumably via fatty acid receptors housed on taste cells. Previous research has shown that an individual's oral sensitivity to fatty acid, specifically oleic acid (C18:1), is associated with body mass index (BMI), dietary fat consumption, and the ability to identify fat in foods. We have developed a reliable and reproducible method to assess oral chemoreception of fatty acids, using a milk and C18:1 emulsion together with an ascending forced-choice triangle procedure. In parallel, a food matrix has been developed to assess an individual's ability to perceive fat, in addition to a simple method to assess fatty food liking. As an added measure, tongue photography is used to assess papillae density, with higher density often being associated with increased taste sensitivity.
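The logic of an ascending forced-choice procedure can be sketched as follows. This is a simplification, assuming for illustration that the detection threshold is the lowest concentration identified correctly on three consecutive triangle presentations; the concentration series and the simulated subject are hypothetical, not values from the article:

```python
def ascending_triangle_threshold(concentrations, identifies, runs_required=3):
    """Walk up a concentration series; at each step the subject must pick
    the odd (fatty-acid) sample out of three.  Return the lowest
    concentration identified correctly `runs_required` times in a row,
    or None if the series is exhausted."""
    for conc in concentrations:
        if all(identifies(conc) for _ in range(runs_required)):
            return conc
    return None

# Simulated subject who reliably detects C18:1 at >= 3.8 mM (hypothetical)
series = [0.02, 0.06, 1.0, 1.4, 2.0, 2.8, 3.8, 5.0, 6.4, 9.8]
subject = lambda c: c >= 3.8
print(ascending_triangle_threshold(series, subject))  # -> 3.8
```

Real protocols add reversal rules and chance-correction, but the ascending structure is the same.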
Neuroscience, Issue 88, taste, overweight and obesity, dietary fat, fatty acid, diet, fatty food liking, detection threshold
Assessment of Morphine-induced Hyperalgesia and Analgesic Tolerance in Mice Using Thermal and Mechanical Nociceptive Modalities
Institutions: Université de Strasbourg.
Opioid-induced hyperalgesia and tolerance severely impact the clinical efficacy of opiates as pain relievers in animals and humans. The molecular mechanisms underlying both phenomena are not well understood and their elucidation should benefit from the study of animal models and from the design of appropriate experimental protocols.
We describe here a methodological approach for inducing, recording and quantifying morphine-induced hyperalgesia as well as for evidencing analgesic tolerance, using the tail-immersion and tail pressure tests in wild-type mice. As shown in the video, the protocol is divided into five sequential steps. Handling and habituation phases allow a safe determination of the basal nociceptive response of the animals. Chronic morphine administration induces significant hyperalgesia as shown by an increase in both thermal and mechanical sensitivity, whereas the comparison of analgesia time-courses after acute or repeated morphine treatment clearly indicates the development of tolerance manifested by a decline in analgesic response amplitude. This protocol may be similarly adapted to genetically modified mice in order to evaluate the role of individual genes in the modulation of nociception and morphine analgesia. It also provides a model system to investigate the effectiveness of potential therapeutic agents to improve opiate analgesic efficacy.
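Tail-withdrawal latencies of this kind are commonly normalized to the percent of the maximum possible effect (%MPE), where a cutoff latency prevents tissue damage. A minimal sketch with hypothetical latencies (not data from the article); tolerance appears as a shrinking %MPE across days of chronic morphine:

```python
def percent_mpe(test_latency, baseline_latency, cutoff_latency):
    """Analgesia as percent of maximum possible effect (%MPE), a common
    readout for tail-immersion data."""
    return 100.0 * (test_latency - baseline_latency) / (cutoff_latency - baseline_latency)

baseline, cutoff = 3.0, 10.0  # seconds (hypothetical)
for day, latency in [(1, 9.5), (4, 7.0), (8, 4.5)]:
    print(day, round(percent_mpe(latency, baseline, cutoff), 1))
# day 1 -> 92.9, day 4 -> 57.1, day 8 -> 21.4
```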
Neuroscience, Issue 89, mice, nociception, tail immersion test, tail pressure test, morphine, analgesia, opioid-induced hyperalgesia, tolerance
RNAscope for In situ Detection of Transcriptionally Active Human Papillomavirus in Head and Neck Squamous Cell Carcinoma
Institutions: Advanced Cell Diagnostics, Inc.
The 'gold standard' for oncogenic HPV detection is the demonstration of transcriptionally active high-risk HPV in tumor tissue. However, detection of E6/E7 mRNA by quantitative reverse transcription polymerase chain reaction (qRT-PCR) requires RNA extraction, which destroys the tumor tissue context critical for morphological correlation and has proven difficult to adopt in routine clinical practice. Our recently developed RNA in situ hybridization technology, RNAscope, permits direct visualization of RNA in formalin-fixed, paraffin-embedded (FFPE) tissue with single-molecule sensitivity and single-cell resolution, enabling highly sensitive and specific in situ analysis of any RNA biomarker in routine clinical specimens. The RNAscope HPV assay was designed to detect the E6/E7 mRNA of seven high-risk HPV genotypes (HPV16, 18, 31, 33, 35, 52, and 58) using a pool of genotype-specific probes. It has demonstrated excellent sensitivity and specificity against the current 'gold standard' method of detecting E6/E7 mRNA by qRT-PCR. HPV status determined by RNAscope is strongly prognostic of clinical outcome in oropharyngeal cancer patients.
Medicine, Issue 85, RNAscope, Head and Neck Squamous Cell Carcinoma (HNSCC), Oropharyngeal Squamous Cell Carcinoma (OPSCC), Human Papillomavirus (HPV), E6/ E7 mRNA, in situ hybridization, tumor
Laboratory Estimation of Net Trophic Transfer Efficiencies of PCB Congeners to Lake Trout (Salvelinus namaycush) from Its Prey
Institutions: U. S. Geological Survey, Grand Valley State University, Shedd Aquarium.
A technique for laboratory estimation of net trophic transfer efficiency (γ) of polychlorinated biphenyl (PCB) congeners to piscivorous fish from their prey is described herein. During a 135-day laboratory experiment, we fed bloater (Coregonus hoyi) that had been caught in Lake Michigan to lake trout (Salvelinus namaycush) kept in eight laboratory tanks. Bloater is a natural prey for lake trout. In four of the tanks, a relatively high flow rate was used to ensure relatively high activity by the lake trout, whereas a low flow rate was used in the other four tanks, allowing for low lake trout activity. On a tank-by-tank basis, the amount of food eaten by the lake trout on each day of the experiment was recorded. Each lake trout was weighed at the start and end of the experiment. Four to nine lake trout from each of the eight tanks were sacrificed at the start of the experiment, and all 10 lake trout remaining in each of the tanks were euthanized at the end of the experiment. We determined concentrations of 75 PCB congeners in the lake trout at the start of the experiment, in the lake trout at the end of the experiment, and in bloaters fed to the lake trout during the experiment. Based on these measurements, γ was calculated for each of 75 PCB congeners in each of the eight tanks. Mean γ was calculated for each of the 75 PCB congeners for both active and inactive lake trout. Because the experiment was replicated in eight tanks, the standard error about mean γ could be estimated. Results from this type of experiment are useful in risk assessment models to predict future risk to humans and wildlife eating contaminated fish under various scenarios of environmental contamination.
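Under a simple mass-balance reading of these measurements, γ for a congener in a tank is the congener mass gained by the fish divided by the congener mass consumed in food. The sketch below uses hypothetical numbers and a deliberately simplified bookkeeping; the published method may handle growth and losses in more detail:

```python
def net_trophic_transfer_efficiency(c_start, w_start, c_end, w_end,
                                    c_prey, food_eaten):
    """gamma = congener mass gained by the fish / congener mass consumed.
    Concentrations in ng/g wet weight, masses in g (hypothetical units)."""
    gained = c_end * w_end - c_start * w_start
    consumed = c_prey * food_eaten
    return gained / consumed

# Hypothetical tank-level values for one PCB congener
gamma = net_trophic_transfer_efficiency(
    c_start=50.0, w_start=500.0,    # fish at start
    c_end=90.0,  w_end=800.0,       # fish at end
    c_prey=120.0, food_eaten=500.0) # bloater concentration, total ration
print(round(gamma, 2))  # -> 0.78
```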
Environmental Sciences, Issue 90, trophic transfer efficiency, polychlorinated biphenyl congeners, lake trout, activity, contaminants, accumulation, risk assessment, toxic equivalents
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Institutions: Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set-point viral loads. We hypothesized that this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated from acute time points from subtype C-infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate for the study of subtype C sequences than previous recombination-based methods that assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set-point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
Voluntary Breath-hold Technique for Reducing Heart Dose in Left Breast Radiotherapy
Institutions: Royal Marsden NHS Foundation Trust, University of Surrey, Institute of Cancer Research, Sutton, UK.
Breath-holding techniques reduce the amount of radiation received by cardiac structures during tangential-field left breast radiotherapy. With these techniques, patients hold their breath while radiotherapy is delivered, pushing the heart down and away from the radiotherapy field. Despite clear dosimetric benefits, these techniques are not yet in widespread use. One reason for this is that commercially available solutions require specialist equipment, necessitating not only significant capital investment, but often also incurring ongoing costs such as a need for daily disposable mouthpieces. The voluntary breath-hold technique described here does not require any additional specialist equipment. All breath-holding techniques require a surrogate to monitor breath-hold consistency and whether breath-hold is maintained. Voluntary breath-hold uses the distance moved by the anterior and lateral reference marks (tattoos) away from the treatment room lasers in breath-hold to monitor consistency at CT-planning and treatment setup. Light fields are then used to monitor breath-hold consistency prior to and during radiotherapy delivery.
Medicine, Issue 89, breast, radiotherapy, heart, cardiac dose, breath-hold
Combined DNA-RNA Fluorescent In situ Hybridization (FISH) to Study X Chromosome Inactivation in Differentiated Female Mouse Embryonic Stem Cells
Institutions: Erasmus MC - University Medical Center.
Fluorescent in situ hybridization (FISH) is a molecular technique which enables the detection of nucleic acids in cells. DNA FISH is often used in cytogenetics and cancer diagnostics, and can detect aberrations of the genome, which often have important clinical implications. RNA FISH can be used to detect RNA molecules in cells and has provided important insights into the regulation of gene expression. Combining DNA and RNA FISH within the same cell is technically challenging, as conditions suitable for DNA FISH might be too harsh for fragile, single-stranded RNA molecules. We here present an easily applicable protocol which enables the combined, simultaneous detection of Xist RNA and DNA encoded by the X chromosomes. This combined DNA-RNA FISH protocol can likely be applied to other systems where both RNA and DNA need to be detected.
Biochemistry, Issue 88, Fluorescent in situ hybridization (FISH), combined DNA-RNA FISH, ES cell, cytogenetics, single cell analysis, X chromosome inactivation (XCI), Xist, Bacterial artificial chromosome (BAC), DNA-probe, Rnf12
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood [33]. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner [41]. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration [34]. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR-compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunction after stroke, which consists of bilateral stimulation of the primary motor cortices [27,30,31]. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
gDNA Enrichment by a Transposase-based Technology for NGS Analysis of the Whole Sequence of BRCA1, BRCA2, and 9 Genes Involved in DNA Damage Repair
Institutions: Centre Georges-François Leclerc.
The widespread use of Next Generation Sequencing (NGS) has opened up new avenues for cancer research and diagnosis. NGS will bring huge amounts of new data on cancer, and especially cancer genetics. Current knowledge and future discoveries will make it necessary to study a huge number of genes that could be involved in a genetic predisposition to cancer. In this regard, we developed a Nextera design to study 11 complete genes involved in DNA damage repair. This protocol was developed to safely study 11 genes (ATM, …, and TP53) from promoter to 3'-UTR in 24 patients simultaneously. This protocol, based on transposase technology and gDNA enrichment, gives a great advantage in terms of time for genetic diagnosis thanks to sample multiplexing. This protocol can be safely used with blood gDNA.
Genetics, Issue 92, gDNA enrichment, Nextera, NGS, DNA damage, BRCA1, BRCA2
Fundus Photography as a Convenient Tool to Study Microvascular Responses to Cardiovascular Disease Risk Factors in Epidemiological Studies
Institutions: Flemish Institute for Technological Research (VITO), Hasselt University, Leuven University.
The microcirculation consists of blood vessels with diameters less than 150 µm. It makes up a large part of the circulatory system and plays an important role in maintaining cardiovascular health. The retina is a tissue that lines the interior of the eye and it is the only tissue that allows for a non-invasive analysis of the microvasculature. Nowadays, high-quality fundus images can be acquired using digital cameras. Retinal images can be collected in 5 min or less, even without dilatation of the pupils. This unobtrusive and fast procedure for visualizing the microcirculation is attractive to apply in epidemiological studies and to monitor cardiovascular health from early age up to old age.
Systemic diseases that affect the circulation can result in progressive morphological changes in the retinal vasculature. For example, changes in the vessel calibers of retinal arteries and veins have been associated with hypertension, atherosclerosis, and increased risk of stroke and myocardial infarction. The vessel widths are derived using image analysis software, and the widths of the six largest arteries and veins are summarized in the Central Retinal Arteriolar Equivalent (CRAE) and the Central Retinal Venular Equivalent (CRVE). These features have been shown to be useful for studying the impact of modifiable lifestyle and environmental cardiovascular disease risk factors.
The procedures to acquire fundus images and the analysis steps to obtain CRAE and CRVE are described. Coefficients of variation of repeated measures of CRAE and CRVE are less than 2% and within-rater reliability is very high. Using a panel study, the rapid response of the retinal vessel calibers to short-term changes in particulate air pollution, a known risk factor for cardiovascular mortality and morbidity, is reported. In conclusion, retinal imaging is proposed as a convenient and instrumental tool for epidemiological studies to study microvascular responses to cardiovascular disease risk factors.
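CRAE and CRVE are conventionally computed by iteratively pairing the largest and smallest of the six measured vessel widths. The sketch below assumes the revised Parr-Hubbard (Knudtson) pairing formulas (constants ≈0.88 for arterioles yielding CRAE, ≈0.95 for venules yielding CRVE) with hypothetical widths; the software used in the article may differ in detail:

```python
import math

def summarize_calibers(widths, k):
    """Iteratively combine the largest and smallest vessel widths into
    k * sqrt(w1^2 + w2^2) until a single summary value remains."""
    w = sorted(widths)
    while len(w) > 1:
        a, b = w.pop(), w.pop(0)          # largest and smallest
        w.append(k * math.sqrt(a * a + b * b))
        w.sort()
    return w[0]

# Hypothetical widths (um) of the six largest arterioles
print(round(summarize_calibers([110, 105, 98, 92, 88, 80], 0.88), 1))
```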
Medicine, Issue 92, retina, microvasculature, image analysis, Central Retinal Arteriolar Equivalent, Central Retinal Venular Equivalent, air pollution, particulate matter, black carbon
Training Synesthetic Letter-color Associations by Reading in Color
Institutions: University of Amsterdam.
Synesthesia is a rare condition in which a stimulus from one modality automatically and consistently triggers unusual sensations in the same and/or other modalities. A relatively common and well-studied type is grapheme-color synesthesia, defined as the consistent experience of color when viewing, hearing and thinking about letters, words and numbers. We describe our method for investigating to what extent synesthetic associations between letters and colors can be learned by reading in color in nonsynesthetes. Reading in color is a special method for training associations in the sense that the associations are learned implicitly while the reader reads text as he or she normally would and it does not require explicit computer-directed training methods. In this protocol, participants are given specially prepared books to read in which four high-frequency letters are paired with four high-frequency colors. Participants receive unique sets of letter-color pairs based on their pre-existing preferences for colored letters. A modified Stroop task is administered before and after reading in order to test for learned letter-color associations and changes in brain activation. In addition to objective testing, a reading experience questionnaire is administered that is designed to probe for differences in subjective experience. A subset of questions may predict how well an individual learned the associations from reading in color. Importantly, we are not claiming that this method will cause each individual to develop grapheme-color synesthesia, only that it is possible for certain individuals to form letter-color associations by reading in color and these associations are similar in some aspects to those seen in developmental grapheme-color synesthetes. The method is quite flexible and can be used to investigate different aspects and outcomes of training synesthetic associations, including learning-induced changes in brain function and structure.
Behavior, Issue 84, synesthesia, training, learning, reading, vision, memory, cognition
Handwriting Analysis Indicates Spontaneous Dyskinesias in Neuroleptic Naïve Adolescents at High Risk for Psychosis
Institutions: University of Colorado Boulder, NeuroScript LLC, University of California, San Diego.
Growing evidence suggests that movement abnormalities are a core feature of psychosis. One marker of movement abnormality, dyskinesia, is a result of impaired neuromodulation of dopamine in fronto-striatal pathways. Traditional methods for identifying movement abnormalities include observer-based reports and force stability gauges. The drawbacks of these methods are long training times for raters, experimenter bias, large site differences in instrumental apparatus, and suboptimal reliability. Taking these drawbacks into account has guided the development of better-standardized and more efficient procedures to examine movement abnormalities using handwriting analysis software and a digitizing tablet. Individuals at risk for psychosis showed significantly more dysfluent pen movements (a proximal measure for dyskinesia) in a handwriting task. Handwriting kinematics offers a great advance over previous methods of assessing dyskinesia, which could clearly be beneficial for understanding the etiology of psychosis.
Behavior, Issue 81, Schizophrenia, Disorders with Psychotic Features, Psychology, Clinical, Psychopathology, behavioral sciences, Movement abnormalities, Ultra High Risk, psychosis, handwriting, computer tablet, dyskinesia
Changes in Mammary Gland Morphology and Breast Cancer Risk in Rats
Institutions: Georgetown University, University of Turku Medical Faculty.
Studies in rodent models of breast cancer show that exposures to dietary/hormonal factors during the in utero and pubertal periods, when the mammary gland undergoes extensive modeling and re-modeling, alter susceptibility to carcinogen-induced mammary tumors. Similar findings have been described in humans: for example, high birthweight increases the later risk of developing breast cancer, and dietary intake of soy during childhood decreases breast cancer risk. It is thought that these prenatal and postnatal dietary modifications induce persistent morphological changes in the mammary gland that in turn modify breast cancer risk later in life. These morphological changes likely reflect epigenetic modifications, such as changes in DNA methylation, histone modifications, and miRNA expression, that then affect gene transcription. In this article we describe how changes in mammary gland morphology can predict mammary cancer risk in rats. Our protocol specifically describes how to dissect and remove the rat abdominal mammary gland and how to prepare mammary gland whole mounts. It also describes how to analyze mammary gland morphology according to three end-points (number of terminal end buds, epithelial elongation, and differentiation) and how to use the data to predict the risk of developing mammary cancer.
Medicine, Issue 44, mammary gland morphology, terminal end buds, mammary cancer, maternal dietary exposures, pregnancy, prepubertal dietary exposures
An Allele-specific Gene Expression Assay to Test the Functional Basis of Genetic Associations
Institutions: University of Oxford.
The number of significant genetic associations with common complex traits is constantly increasing. However, most of these associations have not been understood at the molecular level. One of the mechanisms mediating the effect of DNA variants on phenotypes is gene expression, which has been shown to be particularly relevant for complex traits [1].
This method tests, in a cellular context, the effect of specific DNA sequences on gene expression. The principle is to measure the relative abundance of transcripts arising from the two alleles of a gene, analysing cells which carry one copy of the DNA sequences associated with disease (the risk variants) [2,3]. Therefore, the cells used for this method should meet two fundamental genotypic requirements: they have to be heterozygous both for the DNA risk variants and for DNA markers, typically coding polymorphisms, which can distinguish transcripts based on their chromosomal origin (Figure 1). DNA risk variants and DNA markers do not need to have the same allele frequency, but the phase (haplotypic) relationship of the genetic markers needs to be understood. It is also important to choose cell types which express the gene of interest. This protocol refers specifically to the procedure adopted to extract nucleic acids from fibroblasts, but the method is equally applicable to other cell types including primary cells.
DNA and RNA are extracted from the selected cell lines and cDNA is generated. DNA and cDNA are analysed with a primer extension assay, designed to target the coding DNA markers [4]. The primer extension assay is carried out using the MassARRAY (Sequenom) platform [5] according to the manufacturer's specifications. Primer extension products are then analysed by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF/MS). Because the selected markers are heterozygous, they will generate two peaks on the MS profiles. The area of each peak is proportional to the transcript abundance and can be measured with a function of the MassARRAY Typer software to generate an allelic ratio (allele 1:allele 2) calculation. The allelic ratio obtained for cDNA is normalised using that measured from genomic DNA, where the allelic ratio is expected to be 1:1, to correct for technical artifacts. Markers with a normalised allelic ratio significantly different from 1 indicate that the amount of transcript generated from the two chromosomes in the same cell is different, suggesting that the DNA variants associated with the phenotype have an effect on gene expression. Experimental controls should be used to confirm the results.
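The normalisation step amounts to dividing the cDNA allelic ratio by the gDNA allelic ratio. A minimal sketch with hypothetical MALDI-TOF peak areas (not values from the article):

```python
def normalised_allelic_ratio(cdna_peak1, cdna_peak2, gdna_peak1, gdna_peak2):
    """Divide the cDNA allele-1:allele-2 peak-area ratio by the same ratio
    measured in genomic DNA (expected 1:1) to correct for assay bias."""
    return (cdna_peak1 / cdna_peak2) / (gdna_peak1 / gdna_peak2)

# Hypothetical peak areas: cDNA skewed toward allele 1, with a slight
# technical bias visible in the gDNA ratio
ratio = normalised_allelic_ratio(6000, 3800, 5100, 4845)
print(round(ratio, 2))  # -> 1.5, i.e. allele 1 transcript ~1.5x allele 2
```

A ratio significantly different from 1 after normalisation is the signal of allele-specific expression described above.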
Cellular Biology, Issue 45, Gene expression, regulatory variant, haplotype, association study, primer extension, MALDI-TOF mass spectrometry, single nucleotide polymorphism, allele-specific
C. elegans Positive Butanone Learning, Short-term, and Long-term Associative Memory Assays
Institutions: Princeton University.
The memory of experiences and learned information is critical for organisms to make choices that aid their survival. C. elegans navigates its environment through neuron-specific detection of food and chemical odors [1,2], and can associate nutritive states with chemical odors [3] and with the pathogenicity of a food source [5]. Here, we describe assays of C. elegans associative learning and short- and long-term associative memory. We modified an aversive olfactory learning paradigm [6] to instead produce a positive response; the assay involves starving ~400 worms, then feeding the worms in the presence of the AWC neuron-sensed volatile chemoattractant butanone at a concentration that elicits a low chemotactic index (similar to Toroyama et al. [7]). A standard population chemotaxis assay [1] tests the worms' attraction to the odorant immediately or minutes to hours after conditioning.
After conditioning, wild-type animals' chemotaxis to butanone increases by ~0.6 Chemotaxis Index units, the "Learning Index". Associative learning is dependent on the presence of both food and butanone during training. Pairing food and butanone for a single conditioning period ("massed training") produces short-term associative memory that lasts ~2 hours. Multiple conditioning periods with rest periods between them ("spaced training") yield long-term associative memory (<40 hours), which is dependent on the cAMP Response Element Binding protein (CREB) [6], a transcription factor required for long-term memory across species [8].
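The index arithmetic can be sketched as follows. The plate counts are hypothetical, and the Chemotaxis Index convention shown (worms at odorant minus worms at control, over total scored) is a simplification of the cited assay, which may also exclude worms remaining at the origin:

```python
def chemotaxis_index(n_odorant, n_control, n_total):
    """CI = (worms at odorant spot - worms at control spot) / total scored."""
    return (n_odorant - n_control) / n_total

def learning_index(ci_trained, ci_naive):
    """Shift in chemotaxis produced by butanone + food conditioning."""
    return ci_trained - ci_naive

# Hypothetical plate counts before and after conditioning
naive = chemotaxis_index(90, 130, 400)    # -0.1
trained = chemotaxis_index(260, 60, 400)  # 0.5
print(round(learning_index(trained, naive), 2))  # -> 0.6
```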
Our protocol also includes image analysis methods for quick and accurate determination of chemotaxis indices. High-contrast images of animals on chemotaxis assay plates are captured and analyzed by worm-counting software in MATLAB. The software corrects for uneven background using a morphological top-hat transformation [9]. Otsu's method is then used to determine a threshold separating worms from the background [10]. Very small particles are removed automatically, and larger non-worm regions (plate edges or agar punches) are removed by manual selection. The software then estimates the size of a single worm by ignoring regions that are above a specified maximum size and taking the median size of the remaining regions. The number of worms is then estimated by dividing the total area identified as occupied by worms by the estimated size of a single worm.
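The final size-normalization step can be sketched in a few lines. The region areas and size cutoff below are hypothetical, and the actual software operates on segmented images in MATLAB; this only reproduces the counting logic:

```python
import statistics

def estimate_worm_count(region_areas, max_single_area):
    """Estimate worms on a plate from thresholded region areas (pixels).
    Regions above max_single_area are treated as clumps: excluded from the
    single-worm size estimate, but their area still counts toward the total."""
    singles = [a for a in region_areas if a <= max_single_area]
    if not singles:
        return 0
    worm_size = statistics.median(singles)
    return round(sum(region_areas) / worm_size)

# Hypothetical segmented regions: six single worms (~100 px each) plus one
# clump of overlapping worms (310 px)
areas = [95, 100, 102, 98, 105, 100, 310]
print(estimate_worm_count(areas, max_single_area=200))  # -> 9
```

Using the median rather than the mean keeps one clump or debris fragment from skewing the single-worm size estimate.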
We have found that learning and short- and long-term memory can be distinguished, and that these processes share similar key molecules with higher organisms [6,8]. Our assays can quickly test novel candidate genes or molecules that affect learning and short- or long-term memory in C. elegans that are relevant across species.
Neuroscience, Issue 49, memory, associative learning, C. elegans, chemotaxis, spaced training, behavior
Quantification of Atherosclerotic Plaque Activity and Vascular Inflammation using [18F]-Fluorodeoxyglucose Positron Emission Tomography/Computed Tomography (FDG-PET/CT)
Institutions: University of Pennsylvania, Perelman School of Medicine.
Conventional non-invasive imaging modalities of atherosclerosis, such as coronary artery calcium (CAC)1 and carotid intimal medial thickness (C-IMT),2 provide information about the burden of disease. However, despite multiple validation studies of CAC3-5 and C-IMT,2,6 these modalities do not accurately assess plaque characteristics,7,8 and it is the composition and inflammatory state of the plaque that determine its stability and, therefore, the risk of clinical events.9-13
[18F]-2-fluoro-2-deoxy-D-glucose (FDG) imaging using positron emission tomography (PET)/computed tomography (CT) has been extensively studied in oncologic metabolism.14,15 Studies using animal models and immunohistochemistry in humans show that FDG-PET/CT is exquisitely sensitive for detecting macrophage activity,16 an important source of cellular inflammation in vessel walls. More recently, we17,18 and others have shown that FDG-PET/CT enables highly precise, novel measurements of the inflammatory activity of atherosclerotic plaques in large and medium-sized arteries.9,16,19,20 FDG-PET/CT studies have many advantages over other imaging modalities: 1) high contrast resolution; 2) quantification of plaque volume and metabolic activity, allowing for multi-modal atherosclerotic plaque quantification; 3) dynamic, real-time, in vivo imaging; 4) minimal operator dependence. Finally, vascular inflammation detected by FDG-PET/CT has been shown to predict cardiovascular (CV) events independent of traditional risk factors21,22 and is also highly associated with the overall burden of atherosclerosis.23 Plaque activity by FDG-PET/CT is modulated by known beneficial CV interventions such as short-term (12 week) statin therapy24 as well as longer-term (16 month) therapeutic lifestyle changes.25
The current methodology for quantifying FDG uptake in atherosclerotic plaque involves measuring the standardized uptake value (SUV) of an artery of interest and of the venous blood pool in order to calculate a target-to-background ratio (TBR), obtained by dividing the arterial SUV by the venous blood-pool SUV. This method has been shown to represent a stable, reproducible phenotype over time, has high sensitivity for detecting vascular inflammation, and has high inter- and intra-reader reliability.26 Here we present our methodology for patient preparation, image acquisition, and quantification of atherosclerotic plaque activity and vascular inflammation using SUV, TBR, and a global parameter called the metabolic volumetric product (MVP). These approaches may be applied to assess vascular inflammation in various study samples of interest in a consistent fashion, as we have shown in several prior publications.9,20,27,28
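The TBR calculation itself is a simple ratio; a minimal sketch with illustrative names (per-slice averaging conventions vary between studies):

```python
from statistics import mean

def target_to_background_ratio(arterial_suvs, venous_suvs):
    """TBR = mean arterial SUV / mean venous blood-pool SUV.

    arterial_suvs: per-slice SUV measurements along the artery of interest
    venous_suvs: SUV samples from the venous blood pool
    """
    return mean(arterial_suvs) / mean(venous_suvs)
```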
Medicine, Issue 63, FDG-PET/CT, atherosclerosis, vascular inflammation, quantitative radiology, imaging
Stereotactic Radiosurgery for Gynecologic Cancer
Institutions: University Hospitals Case Medical Center and Case Western Reserve University School of Medicine.
Stereotactic body radiotherapy (SBRT) distinguishes itself by necessitating more rigid patient immobilization, accounting for respiratory motion, intricate treatment planning, on-board imaging, and a reduced number of ablative radiation doses delivered to cancer targets usually refractory to chemotherapy and conventional radiation. The steep SBRT radiation dose drop-off permits narrow 'pencil beam' treatment fields to be used for ablative radiation treatment condensed into 1 to 3 treatments.
Treating physicians must appreciate that SBRT carries a greater risk of normal tissue injury and a greater chance of geographic tumor miss. Both must be addressed by immobilization of cancer targets and by high-precision treatment delivery. Cancer target immobilization has been achieved through the use of indexed customized Styrofoam casts, evacuated bean bags, or body-fix molds with patient-independent abdominal compression.1-3 Intrafraction motion of cancer targets due to breathing can now be reduced by patient-responsive breath-hold techniques,4 patient-mouthpiece active breathing coordination,5 respiration-correlated computed tomography,6 or image-guided tracking of fiducials implanted within and around a moving tumor.7-9 The Cyberknife system (Accuray, Sunnyvale, CA) utilizes a radiation linear accelerator mounted on an industrial robotic arm that accurately follows patient respiratory motion via a camera-tracked set of light-emitting diodes (LEDs) on a vest fitted to the patient.10 Substantial reductions in radiation therapy margins can be achieved by motion tracking, ultimately rendering smaller planning target volumes that are irradiated with submillimeter accuracy.11-13
Cancer targets treated by SBRT are irradiated by converging, tightly collimated beams. The resultant dose-volume histograms for the cancer target have a more pronounced radiation "shoulder," indicating high-percentage target coverage, and a small high-dose radiation "tail." Thus, increased target conformality comes at the expense of decreased dose uniformity in the SBRT cancer target. This may have implications both for subsequent tumor control in the SBRT target and for normal tissue tolerance of organs at risk. Due to the sharp dose falloff in SBRT, occult disease may escape the ablative radiation dose when cancer targets are not fully recognized and inadequate SBRT dose margins are applied. Clinical target volume (CTV) expansion by 0.5 cm, resulting in a larger planning target volume (PTV), is associated with increased target control without undue normal tissue injury.7,8
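The 0.5 cm CTV-to-PTV expansion is an isotropic margin around the target. A toy 2D sketch of the idea on a binary mask (illustrative only; clinical planning systems expand in 3D on the CT voxel grid):

```python
def expand_margin(mask, margin_px):
    """Expand a binary CTV mask by margin_px pixels (Chebyshev distance)
    to form a PTV; a 2D stand-in for the 0.5 cm CTV-to-PTV expansion."""
    h, w = len(mask), len(mask[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                # mark every pixel within the margin of this target pixel
                for ny in range(max(0, y - margin_px), min(h, y + margin_px + 1)):
                    for nx in range(max(0, x - margin_px), min(w, x + margin_px + 1)):
                        out[ny][nx] = 1
    return out
```

At a typical 1 mm planning grid, a 0.5 cm margin would correspond to `margin_px=5`.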
Further reduction in the probability of geographic miss may be achieved by incorporation of 2-[18F]-fluorodeoxyglucose (18F-FDG) positron emission tomography (PET).8 The use of 18F-FDG PET/CT in SBRT treatment planning is only the beginning of attempts to discover new imaging target molecular signatures for gynecologic cancers.
Medicine, Issue 62, radiosurgery, Cyberknife stereotactic radiosurgery, radiation, ovarian cancer, cervix cancer
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). However, studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings.3-6
One of several reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP).7 Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology of the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL, in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing, is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
High-throughput Functional Screening using a Homemade Dual-glow Luciferase Assay
Institutions: Massachusetts General Hospital.
We present a rapid and inexpensive high-throughput screening protocol to identify transcriptional regulators of alpha-synuclein, a gene associated with Parkinson's disease. 293T cells are transiently transfected with plasmids from an arrayed ORF expression library, together with luciferase reporter plasmids, in a one-gene-per-well microplate format. Firefly luciferase activity is assayed after 48 hr to determine the effects of each library gene upon alpha-synuclein transcription, normalized to expression from an internal control construct (a hCMV promoter directing Renilla luciferase). This protocol is facilitated by a bench-top robot enclosed in a biosafety cabinet, which performs aseptic liquid handling in 96-well format. Our automated transfection protocol is readily adaptable to high-throughput lentiviral library production or other functional screening protocols requiring triple transfections of large numbers of unique library plasmids in conjunction with a common set of helper plasmids. We also present an inexpensive and validated alternative to commercially available dual-luciferase reagents, which employs PTC124, EDTA, and pyrophosphate to suppress firefly luciferase activity prior to measurement of Renilla luciferase. Using these methods, we screened 7,670 human genes and identified 68 regulators of alpha-synuclein. This protocol is easily modifiable to target other genes of interest.
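The normalization step reduces to a per-well ratio; a minimal sketch with illustrative names (not the screen's actual analysis code):

```python
from statistics import median

def normalized_activity(firefly, renilla):
    """Per-well firefly luciferase counts divided by the internal
    Renilla control, removing well-to-well transfection variability."""
    return [f / r for f, r in zip(firefly, renilla)]

def fold_change(sample_ratio, control_ratios):
    """Library-well ratio relative to the median of control wells."""
    return sample_ratio / median(control_ratios)
```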
Cellular Biology, Issue 88, Luciferases, Gene Transfer Techniques, Transfection, High-Throughput Screening Assays, Transfections, Robotics
Genetic Manipulation in Δku80 Strains for Functional Genomic Analysis of Toxoplasma gondii
Institutions: The Geisel School of Medicine at Dartmouth.
Targeted genetic manipulation using homologous recombination is the method of choice for functional genomic analysis to obtain a detailed view of gene function and phenotype(s). The development of mutant strains with targeted gene deletions, targeted mutations, complemented gene function, and/or tagged genes provides powerful strategies to address gene function, particularly if these genetic manipulations can be efficiently targeted to the gene locus of interest using integration mediated by double-crossover homologous recombination.
Due to very high rates of nonhomologous recombination, functional genomic analysis of Toxoplasma gondii has previously been limited by the absence of efficient methods for targeting gene deletions and gene replacements to specific genetic loci. Recently, we abolished the major pathway of nonhomologous recombination in type I and type II strains of T. gondii by deleting the gene encoding the KU80 protein.1,2 The Δku80 strains behave normally during the tachyzoite (acute) and bradyzoite (chronic) stages in vitro and in vivo and exhibit essentially a 100% frequency of homologous recombination. The Δku80 strains make functional genomic studies feasible on the single-gene as well as the genome scale.1-4
Here, we report methods for using type I and type II Δku80Δhxgprt strains to advance gene targeting approaches in T. gondii. We outline efficient methods for generating gene deletions, gene replacements, and tagged genes by targeted insertion or deletion of the hypoxanthine-xanthine-guanine phosphoribosyltransferase (HXGPRT) selectable marker. The described gene targeting protocol can be used in a variety of ways in Δku80 strains to advance functional analysis of the parasite genome and to develop single strains that carry multiple targeted genetic manipulations. The application of this genetic method and subsequent phenotypic assays will reveal fundamental and unique aspects of the biology of T. gondii and of related significant human pathogens that cause malaria (Plasmodium sp.) and cryptosporidiosis (Cryptosporidium sp.).
Infectious Diseases, Issue 77, Genetics, Microbiology, Infection, Medicine, Immunology, Molecular Biology, Cellular Biology, Biomedical Engineering, Bioengineering, Genomics, Parasitology, Pathology, Apicomplexa, Coccidia, Toxoplasma, Genetic Techniques, Gene Targeting, Eukaryota, Toxoplasma gondii, genetic manipulation, gene targeting, gene deletion, gene replacement, gene tagging, homologous recombination, DNA, sequencing
Pharmacologic Induction of Epidermal Melanin and Protection Against Sunburn in a Humanized Mouse Model
Institutions: University of Kentucky College of Medicine.
Fairness of skin, UV sensitivity, and skin cancer risk all correlate with the physiologic function of the melanocortin 1 receptor (Mc1r), a Gs-coupled signaling protein found on the surface of melanocytes. Mc1r stimulates adenylyl cyclase and cAMP production which, in turn, up-regulates melanocytic production of melanin in the skin. In order to study the mechanisms by which Mc1r signaling protects the skin against UV injury, this study relies on a mouse model with "humanized skin" based on epidermal expression of stem cell factor (Scf). K14-Scf transgenic mice retain melanocytes in the epidermis and therefore have the ability to deposit melanin in the epidermis. In this animal model, wild-type Mc1r status results in robust deposition of black eumelanin pigment and a UV-protected phenotype. In contrast, K14-Scf animals with defective Mc1r signaling exhibit red/blonde pigmentation, very little eumelanin in the skin, and a UV-sensitive phenotype. Reasoning that eumelanin deposition might be enhanced by topical agents that mimic Mc1r signaling, we found that direct application of forskolin extract to the skin of Mc1r-defective fair-skinned mice resulted in robust eumelanin induction and UV protection.1 Here we describe the method for preparing and applying a forskolin-containing natural root extract to K14-Scf fair-skinned mice and report a method for measuring UV sensitivity by determining the minimal erythematous dose (MED). Using this animal model, it is possible to study how epidermal cAMP induction and melanization of the skin affect physiologic responses to UV exposure.
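Reading an MED off a dose series is straightforward; a minimal sketch (illustrative names, not the authors' scoring procedure, which uses graded visual assessment):

```python
def minimal_erythematous_dose(doses, erythema):
    """Smallest tested UV dose that produced visible erythema.

    doses: ascending list of tested UV doses (e.g. mJ/cm^2)
    erythema: parallel list of booleans (True = erythema observed)
    Returns None if no tested dose produced erythema.
    """
    for dose, observed in zip(doses, erythema):
        if observed:
            return dose
    return None
```

A UV-protected (eumelanized) animal would be expected to show a higher MED than a UV-sensitive one.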
Medicine, Issue 79, Skin, Inflammation, Photometry, Ultraviolet Rays, Skin Pigmentation, melanocortin 1 receptor, Mc1r, forskolin, cAMP, mean erythematous dose, skin pigmentation, melanocyte, melanin, sunburn, UV, inflammation
Detecting Somatic Genetic Alterations in Tumor Specimens by Exon Capture and Massively Parallel Sequencing
Institutions: Memorial Sloan-Kettering Cancer Center.
Efforts to detect and investigate key oncogenic mutations have proven valuable in facilitating the appropriate treatment for cancer patients. The establishment of high-throughput, massively parallel "next-generation" sequencing has aided the discovery of many such mutations. To enhance the clinical and translational utility of this technology, platforms must be high-throughput, cost-effective, and compatible with formalin-fixed, paraffin-embedded (FFPE) tissue samples that may yield small amounts of degraded or damaged DNA. Here, we describe the preparation of barcoded and multiplexed DNA libraries followed by hybridization-based capture of targeted exons for the detection of cancer-associated mutations in fresh frozen and FFPE tumors by massively parallel sequencing. This method enables the identification of sequence mutations, copy number alterations, and select structural rearrangements involving all targeted genes. Targeted exon sequencing offers the benefits of high throughput, low cost, and deep sequence coverage, thus conferring high sensitivity for detecting low-frequency mutations.
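Barcoding lets pooled libraries be computationally separated after sequencing. A hypothetical illustration of the demultiplexing idea (not the actual pipeline; names and the mismatch tolerance are assumptions):

```python
def demultiplex(reads, barcodes, max_mismatch=1):
    """Assign reads to samples by their leading barcode.

    reads: iterable of (read_id, sequence) tuples
    barcodes: dict mapping sample name -> barcode sequence
    Reads whose leading bases match no barcode within max_mismatch
    mismatches go to the 'undetermined' bin.
    """
    bins = {name: [] for name in barcodes}
    bins["undetermined"] = []
    for read_id, seq in reads:
        best, best_mm = "undetermined", max_mismatch + 1
        for name, bc in barcodes.items():
            # count mismatches against the read's leading bases
            mm = sum(a != b for a, b in zip(seq[:len(bc)], bc))
            if mm < best_mm:
                best, best_mm = name, mm
        bins[best].append(read_id)
    return bins
```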
Molecular Biology, Issue 80, Molecular Diagnostic Techniques, High-Throughput Nucleotide Sequencing, Genetics, Neoplasms, Diagnosis, Massively parallel sequencing, targeted exon sequencing, hybridization capture, cancer, FFPE, DNA mutations
Isolation of Human Umbilical Arterial Smooth Muscle Cells (HUASMC)
Institutions: Universidade da Beira Interior.
The human umbilical cord (UC) is a biological sample that can be easily obtained just after birth. It is usually discarded, and its collection does not pose any added risk to the newborn's or mother's health; moreover, no ethical concerns are raised. The UC is composed of one vein and two arteries, from which both endothelial cells (ECs)1 and smooth muscle cells (SMCs),2 two of the main cellular components of blood vessels, can be isolated. In this project, the SMCs were obtained after enzymatic treatment of the UC arteries according to the experimental procedure previously described by Jaffe et al.3 After isolation, the cells were kept in T-flasks with DMEM-F12 supplemented with 5% fetal bovine serum and were cultured for several passages. Cells maintained their morphological and other phenotypic characteristics over the different passages. The aim of this study was to isolate smooth muscle cells in order to use them as models for future assays with constrictor drugs, to isolate and structurally characterize L-type calcium channels, to study cellular and molecular aspects of vascular function,4 and to use them in tissue engineering.
Cellular Biology, Issue 41, Human Cells, Umbilical Cord, Tissue Engineering, Cell Culture
Pyrosequencing: A Simple Method for Accurate Genotyping
Institutions: Washington University in St. Louis.
Pharmacogenetic research benefits first-hand from the abundance of information provided by the completion of the Human Genome Project. With such a tremendous amount of data available comes an explosion of genotyping methods. Pyrosequencing® is one of the most thorough yet simple methods to date for analyzing polymorphisms. It can also identify tri-allelic polymorphisms, indels, and short-repeat polymorphisms, and can determine allele percentages for methylation or pooled-sample assessment. In addition, a standardized control sequence provides internal quality control. This method has enabled rapid and efficient single-nucleotide polymorphism evaluation, including many clinically relevant polymorphisms. The technique and methodology of Pyrosequencing are explained.
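Allele percentages for pooled or methylation samples derive from relative pyrogram peak heights. A deliberately simplified sketch (illustrative names; real quantification also accounts for dispensation order and background correction):

```python
def allele_percentages(peak_heights):
    """Percent contribution of each allele from pyrogram peak heights.

    peak_heights: dict mapping allele (e.g. base) -> background-corrected
    peak height. Returns dict mapping allele -> percentage of total signal.
    """
    total = sum(peak_heights.values())
    return {allele: 100.0 * h / total for allele, h in peak_heights.items()}
```

For instance, peak heights of 30 (A) and 10 (G) imply a 75%/25% allele mix in the pool.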
Cellular Biology, Issue 11, Springer Protocols, Pyrosequencing, genotype, polymorphism, SNP, pharmacogenetics, pharmacogenomics, PCR