HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance, to track the transmission of viral strains already resistant to ART, is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource-limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. Here, an open-access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free, open-source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml; the sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the ViroSeq genotyping method. Limitations of the method described here include that it is not automated and that it failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Diagnosing Pulmonary Tuberculosis with the Xpert MTB/RIF Test
Institutions: University of Bern, MCL Laboratories Inc.
Tuberculosis (TB) due to Mycobacterium tuberculosis (MTB) remains a major public health issue: the infection affects up to one third of the world population1, and almost two million people are killed by TB each year.2 Universal access to high-quality, patient-centered treatment for all TB patients is emphasized by WHO's Stop TB Strategy.3 The rapid detection of MTB in respiratory specimens and drug therapy based on reliable drug resistance testing results are prerequisites for the successful implementation of this strategy. However, in many areas of the world, TB diagnosis still relies on insensitive, poorly standardized sputum microscopy methods. Ineffective TB detection and the emergence and transmission of drug-resistant MTB strains increasingly jeopardize global TB control activities.2
Effective diagnosis of pulmonary TB requires the availability, on a global scale, of standardized, easy-to-use, and robust diagnostic tools that allow the direct detection of both the MTB complex and resistance to key antibiotics, such as rifampicin (RIF). The latter result can serve as a marker for multidrug-resistant MTB (MDR-TB) and has been reported in >95% of MDR-TB isolates.4,5
The rapid availability of reliable test results is likely to directly translate into sound patient management decisions that, ultimately, will cure the individual patient and break the chain of TB transmission in the community.2
Cepheid's (Sunnyvale, CA, U.S.A.) Xpert MTB/RIF assay6,7 meets the demands outlined above in a remarkable manner. It is a nucleic-acid amplification test for 1) the detection of MTB complex DNA in sputum or concentrated sputum sediments; and 2) the detection of RIF resistance-associated mutations of the rpoB gene. It is designed for use with Cepheid's GeneXpert Dx System, which integrates and automates sample processing, nucleic acid amplification, and detection of the target sequences using real-time PCR and reverse transcriptase PCR. The system consists of an instrument, personal computer, barcode scanner, and preloaded software for running tests and viewing the results.9 It employs single-use disposable Xpert MTB/RIF cartridges that hold the PCR reagents and host the PCR process. Because the cartridges are self-contained, cross-contamination between samples is eliminated.6 Current nucleic acid amplification methods used to detect MTB are complex, labor-intensive, and technically demanding. The Xpert MTB/RIF assay has the potential to bring standardized, sensitive, and very specific diagnostic testing for both TB and drug resistance to universal-access point-of-care settings3, provided that such settings can afford it. In order to facilitate access, the Foundation for Innovative New Diagnostics (FIND) has negotiated significant price reductions. Current FIND-negotiated prices, along with the list of countries eligible for the discounts, are available on the web.10
Immunology, Issue 62, tuberculosis, drug resistance, rifampicin, rapid diagnosis, Xpert MTB/RIF test
Blood Collection for Biochemical Analysis in Adult Zebrafish
Institutions: Centro de Pesquisa Experimental, Laboratório de Hepatologia e Gastroenterologia Experimental, Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS, Brazil.
The zebrafish has been used as an animal model for studies of several human diseases. It can serve as a powerful preclinical platform for studies of molecular events and therapeutic strategies, as well as for evaluating the physiological mechanisms of some pathologies1. There are relatively few publications related to adult zebrafish physiology of organs and systems2, which may lead researchers to infer that the basic techniques needed to allow the exploration of zebrafish systems are lacking3. Hematologic biochemical values of zebrafish were first reported in 2003 by Murtha and colleagues4, who employed a blood collection technique first described by Jagadeeswaran and colleagues in 1999. Briefly, blood was collected via a micropipette tip through a lateral incision, approximately 0.3 cm in length, in the region of the dorsal aorta5. Because of the minute dimensions involved, this is a high-precision technique requiring a highly skilled practitioner. The same technique was used by the same group in another publication that same year6. In 2010, Eames and colleagues assessed whole blood glucose levels in zebrafish7. They gained access to the blood by performing decapitations with scissors and then inserting a heparinized microcapillary collection tube into the pectoral articulation. They mention difficulties with hemolysis that were solved with an appropriate storage temperature based on the work of Kilpatrick et al.8. When attempting to use Jagadeeswaran's technique in our laboratory, we found that it was difficult to make the incision in precisely the right place so as not to lose a significant amount of blood before collection could be started.
Recently, Gupta et al.9 described how to dissect adult zebrafish organs, Kinkel et al.10 described how to perform intraperitoneal injections, and Pugach et al.11 described how to perform retro-orbital injections. However, more work is needed to more fully explore basic techniques for research in zebrafish.
The small size of zebrafish presents challenges for researchers using it as an experimental model. Given this small scale, it is important that simple techniques are developed to enable researchers to exploit the advantages of the zebrafish model.
Biochemistry, Issue 63, Developmental Biology, Zebrafish, Zebrafish blood, Hematologic, Biochemical analysis
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin-embedded stained electron tomography, and focused ion beam and serial block face scanning electron microscopy (FIB-SEM and SBF-SEM) of mildly and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
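Among the approaches above, semi-automated segmentation commonly starts from a global intensity threshold (thresholding appears among this article's keywords). As a minimal sketch of how such a threshold can be chosen automatically, the classic Otsu criterion, which maximizes between-class variance, is shown below; the actual pipelines used for these data sets are not specified at this level of detail, so this is illustrative only.

```python
def otsu_threshold(pixels, levels=256):
    """Pick the intensity threshold that maximizes between-class variance
    (Otsu's method) for a flat list of integer pixel values in [0, levels)."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    sum_all = sum(i * h for i, h in enumerate(hist))  # weighted intensity sum
    sum_bg, w_bg = 0.0, 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w_bg += hist[t]          # pixels at or below candidate threshold
        if w_bg == 0:
            continue
        w_fg = total - w_bg      # pixels above candidate threshold
        if w_fg == 0:
            break
        sum_bg += t * hist[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Toy bimodal "image" flattened to 8-bit values (hypothetical data)
pixels = [10] * 50 + [200] * 50
t = otsu_threshold(pixels)  # foreground = pixels with value > t
```

For real EM volumes, library implementations (e.g. in scikit-image) would be used instead of hand-rolled code, but the selection criterion is the same.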
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant, contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open-access software tools. In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
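The binarization and component-analysis steps described above can be illustrated with a deliberately simplified sketch: a toy intensity patch is thresholded to a binary mask, and distinct tubule fragments are counted as 4-connected components. Real TATS pipelines would additionally skeletonize the mask (e.g. with scikit-image); the patch and threshold below are hypothetical.

```python
def binarize(image, threshold):
    """Threshold a 2D intensity image (list of lists) to a 0/1 mask."""
    return [[1 if px >= threshold else 0 for px in row] for row in image]

def count_components(binary):
    """Count 4-connected foreground components via iterative flood fill."""
    rows, cols = len(binary), len(binary[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                count += 1
                stack = [(r, c)]
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols and binary[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return count

# Hypothetical 5x5 fluorescence patch with two separate tubule fragments
patch = [
    [0, 9, 0, 0, 0],
    [0, 9, 0, 0, 0],
    [0, 9, 0, 8, 8],
    [0, 0, 0, 0, 8],
    [0, 0, 0, 0, 0],
]
tubules = count_components(binarize(patch, threshold=5))  # -> 2
```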
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
V3 Stain-free Workflow for a Practical, Convenient, and Reliable Total Protein Loading Control in Western Blotting
Institutions: Bio-Rad Laboratories.
The western blot is a very useful and widely adopted lab technique, but its execution is challenging. The workflow is often characterized as a "black box" because an experimentalist does not know whether it has been performed successfully until the last of several steps. Moreover, the quality of western blot data is sometimes challenged due to a lack of effective quality control tools throughout the western blotting process. Here we describe the V3 western workflow, which applies stain-free technology to address the major concerns associated with the traditional western blot protocol. This workflow allows researchers: 1) to run a gel in about 20-30 min; 2) to visualize sample separation quality within 5 min after the gel run; 3) to transfer proteins in 3-10 min; 4) to verify transfer efficiency quantitatively; and, most importantly, 5) to validate changes in the level of the protein of interest using a total protein loading control. This novel approach eliminates the need for stripping and reprobing the blot for housekeeping proteins such as β-actin, β-tubulin, and GAPDH.
The V3 stain-free workflow makes the western blot process faster, more transparent, more quantitative, and more reliable.
Basic Protocol, Issue 82, Biotechnology, Pharmaceutical, Protein electrophoresis, Western blot, Stain-Free, loading control, total protein normalization, stain-free technology
Measurement of Greenhouse Gas Flux from Agricultural Soils Using Static Chambers
Institutions: University of Wisconsin-Madison, University of Wisconsin-Madison, University of Wisconsin-Madison, University of Wisconsin-Madison, USDA-ARS Dairy Forage Research Center, USDA-ARS Pasture Systems Watershed Management Research Unit.
Measurement of greenhouse gas (GHG) fluxes between the soil and the atmosphere, in both managed and unmanaged ecosystems, is critical to understanding the biogeochemical drivers of climate change and to the development and evaluation of GHG mitigation strategies based on modulation of landscape management practices. The static chamber-based method described here is based on trapping gases emitted from the soil surface within a chamber and collecting samples from the chamber headspace at regular intervals for analysis by gas chromatography. Change in gas concentration over time is used to calculate flux. This method can be utilized to measure landscape-based flux of carbon dioxide, nitrous oxide, and methane, and to estimate differences between treatments or explore system dynamics over seasons or years. Infrastructure requirements are modest, but a comprehensive experimental design is essential. This method is easily deployed in the field, conforms to established guidelines, and produces data suitable to large-scale GHG emissions studies.
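The core calculation described above ("change in gas concentration over time is used to calculate flux") can be sketched as a linear regression of headspace concentration against sampling time, with the slope converted to a mass flux using the ideal gas law and the chamber geometry. The chamber height, temperature, and N2O time series below are hypothetical, and the units and conversion path are one common convention, not necessarily the exact one used in this protocol.

```python
def linear_slope(times, concs):
    """Least-squares slope of concentration (ppm) vs time (min)."""
    n = len(times)
    mean_t = sum(times) / n
    mean_c = sum(concs) / n
    num = sum((t - mean_t) * (c - mean_c) for t, c in zip(times, concs))
    den = sum((t - mean_t) ** 2 for t in times)
    return num / den

def chamber_flux(times_min, concs_ppm, height_m, temp_c, molar_mass_g,
                 pressure_pa=101325.0):
    """Flux in g gas m^-2 day^-1 from a static chamber time series."""
    R = 8.314  # J mol^-1 K^-1
    slope_ppm_min = linear_slope(times_min, concs_ppm)
    # mol of air per m^2 of soil = (mol air per m^3 headspace) * chamber height
    mol_air_per_m2 = pressure_pa / (R * (temp_c + 273.15)) * height_m
    # ppm/min -> mol gas m^-2 min^-1 -> g m^-2 day^-1
    return slope_ppm_min * 1e-6 * mol_air_per_m2 * molar_mass_g * 60 * 24

# Hypothetical N2O samples drawn at 0, 10, 20, and 30 min after chamber closure
flux = chamber_flux([0, 10, 20, 30], [0.33, 0.39, 0.45, 0.51],
                    height_m=0.15, temp_c=25.0, molar_mass_g=44.0)
```

In practice the linearity of the fit is also checked, since a flattening curve indicates the chamber headspace is saturating and the early-time slope should be used instead.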
Environmental Sciences, Issue 90, greenhouse gas, trace gas, gas flux, static chamber, soil, field, agriculture, climate
Rapid Analysis and Exploration of Fluorescence Microscopy Images
Institutions: UT Southwestern Medical Center, UT Southwestern Medical Center, Princeton University.
Despite rapid advances in high-throughput microscopy, quantitative image-based assays still pose significant challenges. While a variety of specialized image analysis tools are available, most traditional image-analysis-based workflows have steep learning curves (for fine tuning of analysis parameters) and result in long turnaround times between imaging and analysis. In particular, cell segmentation, the process of identifying individual cells in an image, is a major bottleneck in this regard.
Here we present an alternate, cell-segmentation-free workflow based on PhenoRipper, an open-source software platform designed for the rapid analysis and exploration of microscopy images. The pipeline presented here is optimized for immunofluorescence microscopy images of cell cultures and requires minimal user intervention. Within half an hour, PhenoRipper can analyze data from a typical 96-well experiment and generate image profiles. Users can then visually explore their data, perform quality control on their experiment, verify responses to perturbations, and check the reproducibility of replicates. This facilitates a rapid feedback cycle between analysis and experiment, which is crucial during assay optimization. This protocol is useful not just as a first-pass analysis for quality control, but may also be used as an end-to-end solution, especially for screening. The workflow described here scales to large data sets, such as those generated by high-throughput screens, and has been shown to group experimental conditions by phenotype accurately over a wide range of biological systems. The PhenoBrowser interface provides an intuitive framework to explore the phenotypic space and relate image properties to biological annotations. Taken together, the protocol described here will lower the barriers to adopting quantitative analysis of image-based screens.
Basic Protocol, Issue 85, PhenoRipper, fluorescence microscopy, image analysis, High-content analysis, high-throughput screening, Open-source, Phenotype
Protease- and Acid-catalyzed Labeling Workflows Employing 18O-enriched Water
Institutions: Boston Biomedical Research Institute.
Stable isotopes are essential tools in biological mass spectrometry. Historically, 18O-stable isotopes have been extensively used to study the catalytic mechanisms of proteolytic enzymes1-3. With the advent of mass spectrometry-based proteomics, the enzymatically catalyzed incorporation of 18O-atoms from stable isotopically enriched water has become a popular method to quantitatively compare protein expression levels (reviewed by Fenselau and Yao4, Miyagi and Rao5, and Ye et al.6). 18O-labeling constitutes a simple and low-cost alternative to chemical (e.g. iTRAQ, ICAT) and metabolic (e.g. SILAC) labeling techniques7. Depending on the protease utilized, 18O-labeling can result in the incorporation of up to two 18O-atoms into the C-terminal carboxyl group of the cleavage product3. The labeling reaction can be subdivided into two independent processes: the peptide bond cleavage and the carboxyl oxygen exchange reaction8. In our PALeO (protease-assisted labeling employing 18O-enriched water) adaptation of enzymatic 18O-labeling, we utilized 50% 18O-enriched water to yield distinctive isotope signatures. In combination with high-resolution matrix-assisted laser desorption ionization time-of-flight tandem mass spectrometry (MALDI-TOF/TOF MS/MS), the characteristic isotope envelopes can be used to identify cleavage products with a high level of specificity. We previously used the PALeO methodology to detect and characterize endogenous proteases9 and to monitor proteolytic reactions10-11. Since PALeO encodes the very essence of the proteolytic cleavage reaction, the experimental setup is simple, and biochemical enrichment steps for cleavage products can be circumvented. The PALeO method can easily be extended to (i) time-course experiments that monitor the dynamics of proteolytic cleavage reactions and (ii) the analysis of proteolysis in complex biological samples that represent physiological conditions. PALeO time-course experiments help identify rate-limiting processing steps and reaction intermediates in complex proteolytic pathway reactions. Furthermore, the PALeO reaction allows us to identify proteolytic enzymes, such as the serine protease trypsin, that are capable of rebinding their cleavage products and catalyzing the incorporation of a second 18O-atom. Such "double-labeling" enzymes can be used for postdigestion 18O-labeling, in which peptides are exclusively labeled by the carboxyl oxygen exchange reaction. Our third strategy extends labeling employing 18O-enriched water beyond enzymes and uses acidic pH conditions to introduce 18O-stable isotope signatures into peptides.
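The distinctive isotope signature produced by 50% 18O-enriched water follows simple binomial statistics: each exchangeable C-terminal carboxyl oxygen independently has a 50% chance of being 18O, so a fully exchanged peptide with two labeling sites appears as +0, +2, and +4 Da species at roughly 25%, 50%, and 25% relative abundance. A sketch of that reasoning (ignoring natural isotope abundances and incomplete exchange, which real envelope fitting must account for):

```python
from math import comb

def label_distribution(n_oxygens, frac_18o):
    """Binomial probability that k of n exchangeable oxygens are 18O;
    each incorporated 18O shifts the peptide mass by ~2 Da."""
    return {2 * k: comb(n_oxygens, k) * frac_18o ** k * (1 - frac_18o) ** (n_oxygens - k)
            for k in range(n_oxygens + 1)}

# Two exchangeable carboxyl oxygens, 50% 18O water
dist = label_distribution(n_oxygens=2, frac_18o=0.5)
# dist == {0: 0.25, 2: 0.5, 4: 0.25}
```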
Biochemistry, Issue 72, Molecular Biology, Proteins, Proteomics, Chemistry, Physics, MALDI-TOF mass spectrometry, proteomics, proteolysis, quantification, stable isotope labeling, labeling, catalyst, peptides, 18-O enriched water
High-throughput Image Analysis of Tumor Spheroids: A User-friendly Software Application to Measure the Size of Spheroids Automatically and Accurately
Institutions: Raymond and Beverly Sackler Foundation, New Jersey, Rutgers University, Rutgers University, Institute for Advanced Study, New Jersey.
The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application, SpheroidSizer, which measures the major and minor axial lengths of imaged 3D tumor spheroids automatically and accurately, calculates the volume of each individual 3D tumor spheroid, and then outputs the results in two different spreadsheet forms for easy manipulation in subsequent data analysis. The main advantage of this software is its powerful image analysis application adapted for large numbers of images. It provides a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and noisy backgrounds that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and images of diverse quality. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model for drug screens in industry and academia.
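Once the major and minor axial lengths are measured, the per-spheroid volume calculation mentioned above is straightforward. The exact formula SpheroidSizer uses is not stated here; a common approximation treats the spheroid as a prolate ellipsoid of revolution, V = (pi/6) × major × minor², with both axes as full lengths. The example values are hypothetical.

```python
import math

def spheroid_volume(major_axis, minor_axis):
    """Approximate spheroid volume as a prolate ellipsoid of revolution,
    taking the measured major and minor axial lengths (same length unit)."""
    return math.pi / 6.0 * major_axis * minor_axis ** 2

# Hypothetical spheroid measuring 500 um by 400 um
v = spheroid_volume(500.0, 400.0)  # ~4.19e7 um^3
```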
Cancer Biology, Issue 89, computer programming, high-throughput, image analysis, tumor spheroids, 3D, software application, cancer therapy, drug screen, neuroendocrine tumor cell line, BON-1, cancer research
Accuracy in Dental Medicine, A New Way to Measure Trueness and Precision
Institutions: University of Zürich.
Reference scanners are used in dental medicine to verify many procedures. The main interest is verifying impression methods, as they serve as a base for dental restorations. The current limitation of many reference scanners is a lack of accuracy when scanning large objects, such as full dental arches, or a limited ability to assess detailed tooth surfaces. A new reference scanner, based on the focus variation scanning technique, was evaluated with regard to highest local and general accuracy. A specific scanning protocol was tested to scan original tooth surfaces from dental impressions, and different model materials were verified. The results showed a high scanning accuracy of the reference scanner, with a mean deviation of 5.3 ± 1.1 µm for trueness and 1.6 ± 0.6 µm for precision for full-arch scans. Current dental impression methods showed much higher deviations (trueness: 20.4 ± 2.2 µm, precision: 12.5 ± 2.5 µm) than the internal scanning accuracy of the reference scanner. Smaller objects like single tooth surfaces can be scanned with an even higher accuracy, enabling the system to assess erosive and abrasive tooth surface loss. The reference scanner can be used to measure differences in many dental research fields. The different magnification levels, combined with a high local and general accuracy, can be used to assess changes from single teeth or restorations up to the full arch.
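The trueness and precision figures above follow the standard accuracy definitions: trueness compares test measurements against a reference (here, the reference scanner), while precision compares repeated test measurements with each other. A minimal numerical sketch with hypothetical per-scan distance measurements, using mean absolute deviation for both quantities (the study's actual surface-comparison software computes deviations over whole 3D meshes, not single distances):

```python
def mean(xs):
    return sum(xs) / len(xs)

def trueness(test_values, reference_value):
    """Mean absolute deviation of test measurements from the reference value."""
    return mean([abs(v - reference_value) for v in test_values])

def precision(test_values):
    """Mean absolute pairwise deviation among repeated measurements."""
    pairs = [(a, b) for i, a in enumerate(test_values)
             for b in test_values[i + 1:]]
    return mean([abs(a - b) for a, b in pairs])

# Hypothetical full-arch distance measurements (um) from three repeated scans
scans = [10010.0, 10020.0, 10030.0]
t = trueness(scans, reference_value=10000.0)  # -> 20.0
p = precision(scans)                          # -> 13.33...
```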
Medicine, Issue 86, Laboratories, Dental, Calibration, Technology, Dental impression, Accuracy, Trueness, Precision, Full arch scan, Abrasion
Techniques for Processing Eyes Implanted With a Retinal Prosthesis for Localized Histopathological Analysis
Institutions: Bionics Institute, St Vincent's Hospital Melbourne, University of Melbourne, University of Melbourne.
With the recent development of retinal prostheses, it is important to develop reliable techniques for assessing the safety of these devices in preclinical studies. However, the standard fixation, preparation, and automated histology procedures are not ideal. Here we describe new procedures for evaluating the health of the retina directly adjacent to an implant. Retinal prostheses feature electrode arrays in contact with eye tissue. Previous methods have not been able to spatially localize the ocular tissue adjacent to individual electrodes within the array. In addition, standard histological processing often results in gross artifactual detachment of the retinal layers when assessing implanted eyes. Consequently, it has been difficult to assess localized damage, if present, caused by implantation and stimulation of an implanted electrode array. Therefore, we developed a method for identifying and localizing the ocular tissue adjacent to implanted electrodes using a color-coded dye marking scheme, and we modified an eye fixation technique to minimize artifactual retinal detachment. This method also rendered the sclera translucent, enabling localization of individual electrodes and specific parts of an implant. Finally, we used a matched control to increase the power of the histopathological assessments. In summary, this method enables reliable and efficient discrimination and assessment of the retinal cytoarchitecture in an implanted eye.
Medicine, Issue 78, Anatomy, Physiology, Biomedical Engineering, Bioengineering, Surgery, Ophthalmology, Pathology, Tissue Engineering, Prosthesis Implantation, Implantable Neurostimulators, Implants, Experimental, Histology, bionics, Retina, Prosthesis, Bionic Eye, Retinal, Implant, Suprachoroidal, Fixation, Localization, Safety, Preclinical, dissection, embedding, staining, tissue, surgical techniques, clinical techniques
Clinical Examination Protocol to Detect Atypical and Classical Scrapie in Sheep
Institutions: Animal Health and Veterinary Laboratories Agency Weybridge.
The diagnosis of scrapie, a transmissible spongiform encephalopathy (TSE) of sheep and goats, is currently based on the detection of disease-associated prion protein by post mortem tests. Unless a random sample of the sheep or goat population is actively monitored for scrapie, identification of scrapie cases relies on the reporting of clinical suspects, which is dependent on the individual's familiarity with the disease and ability to recognize clinical signs associated with scrapie. Scrapie may not be considered in the differential diagnosis of neurological diseases in small ruminants, particularly in countries with low scrapie prevalence, or may not be recognized if it presents as a nonpruritic form, like atypical scrapie. To aid in the identification of clinical suspects, a short examination protocol is presented to assess the display of specific clinical signs associated with pruritic and nonpruritic forms of TSEs in sheep, which could also be applied to goats. This includes assessment of behavior, vision (by testing the menace response), pruritus (by testing the response to scratching), and movement (with and without blindfolding). This may lead to a more detailed neurologic examination of animals reported as scrapie suspects. It could also be used in experimental TSE studies of sheep or goats to evaluate disease progression or to identify clinical end-points.
Infectious Diseases, Issue 83, transmissible spongiform encephalopathy, sheep, atypical scrapie, classical scrapie, neurologic examination, scratch test, menace response, blindfolding
Vascular Occlusion Training for Inclusion Body Myositis: A Novel Therapeutic Approach
Institutions: University of São Paulo, University of São Paulo.
Inclusion body myositis (IBM) is a rare idiopathic inflammatory myopathy. It is known to produce remarkable muscle weakness and to greatly compromise function and quality of life. Moreover, clinical practice suggests that, unlike in other inflammatory myopathies, the majority of IBM patients are not responsive to treatment with immunosuppressive or immunomodulatory drugs to counteract disease progression1. Additionally, conventional resistance training programs have proven ineffective in restoring muscle function and muscle mass in these patients2,3. Nevertheless, we have recently observed that restricting muscle blood flow using tourniquet cuffs, in association with moderate-intensity resistance training, produced a significant gain in muscle mass and function in an IBM patient, along with substantial benefits in quality of life4. Thus, a new non-pharmacological approach for IBM patients has been proposed. Herein, we describe the details of a proposed protocol for vascular occlusion associated with a resistance training program for this population.
Medicine, Issue 40, exercise training, therapeutical, myositis, vascular occlusion
Electrochemotherapy of Tumours
Institutions: Institute of Oncology Ljubljana, University of Ljubljana.
Electrochemotherapy is the combined use of certain chemotherapeutic drugs and electric pulses applied to the treated tumour nodule. Local application of electric pulses to the tumour increases drug delivery into cells, specifically at the site of electric pulse application. Drug uptake by delivery of electric pulses is increased only for those chemotherapeutic drugs whose transport through the plasma membrane is otherwise impeded. Among the many drugs that have been tested so far, bleomycin and cisplatin have found their way from preclinical testing into clinical use. Clinical data collected within a number of clinical studies indicate that approximately 80% of treated cutaneous and subcutaneous tumour nodules of different malignancies show an objective response; of these, approximately 70% show a complete response after a single application of electrochemotherapy. Usually only one treatment is needed; however, electrochemotherapy can be repeated several times every few weeks with equal effectiveness each time. The treatment results in effective eradication of the treated nodules, with a good cosmetic effect and without tissue scarring.
Medicine, Issue 22, electrochemotherapy, electroporation, cisplatin, bleomycin, malignant tumours, cutaneous lesions
DNA Fingerprinting of Mycobacterium leprae Strains Using Variable Number Tandem Repeat (VNTR) - Fragment Length Analysis (FLA)
Institutions: Colorado State University.
The study of the transmission of leprosy is particularly difficult since the causative agent, Mycobacterium leprae, cannot be cultured in the laboratory. The only sources of the bacteria are leprosy patients, and experimentally infected armadillos and nude mice. Thus, many of the methods used in modern epidemiology are not available for the study of leprosy. Despite an extensive global drug treatment program for leprosy implemented by the WHO1, leprosy remains endemic in many countries, with approximately 250,000 new cases each year.2
The entire M. leprae genome has been mapped3,4 and many loci have been identified that have repeated segments of 2 or more base pairs (called micro- and minisatellites).5 Clinical strains of M. leprae may vary in the number of tandem repeated segments (short tandem repeats, STR) at many of these loci.5,6,7 Variable number tandem repeat (VNTR)5 analysis has been used to distinguish different strains of the leprosy bacilli. Some of the loci appear to be more stable than others, showing less variation in repeat numbers, while others seem to change more rapidly, sometimes within the same patient. While the variability of certain VNTRs has raised questions about their suitability for strain typing7,8,9, the emerging data suggest that analyzing multiple loci, which differ in their stability, can serve as a valuable epidemiological tool. Multiple locus VNTR analysis (MLVA)10 has been used to study leprosy evolution and transmission in several countries, including China11,12, the Philippines10,13, and Brazil14.
MLVA involves multiple steps. First, bacterial DNA is extracted along with host tissue DNA from clinical biopsies or slit skin smears (SSS).10 The desired loci are then amplified from the extracted DNA via polymerase chain reaction (PCR). Fluorescently labeled primers for 4-5 different loci are used per reaction, with 18 loci being amplified in a total of four reactions.10 The PCR products may be subjected to agarose gel electrophoresis to verify the presence of the desired DNA segments, and then submitted for fluorescent fragment length analysis (FLA) using capillary electrophoresis. DNA from armadillo-passaged bacteria with a known number of repeat copies for each locus is used as a positive control. The FLA chromatograms are then examined using Peak Scanner software, and fragment length is converted to the number of VNTR copies (allele). Finally, the VNTR haplotypes are analyzed for patterns and, when combined with patient clinical data, can be used to track the distribution of strain types.
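The fragment-to-allele conversion in the final step can be sketched as follows. This is a minimal illustration only: the locus names, flanking sizes, and repeat-unit lengths below are hypothetical placeholders, not the published M. leprae locus parameters.

```python
# Hypothetical illustration: convert FLA fragment lengths (bp) to VNTR copy
# numbers. Locus names, flanking sizes, and repeat-unit sizes are made-up
# placeholders, not the published M. leprae panel.

LOCI = {
    # locus: (flanking_bp, repeat_unit_bp)
    "locus_A": (80, 2),    # a hypothetical dinucleotide microsatellite
    "locus_B": (120, 21),  # a hypothetical minisatellite
}

def fragment_to_copies(locus, fragment_bp):
    """Estimate the repeat copy number from an observed fragment length."""
    flanking, unit = LOCI[locus]
    return round((fragment_bp - flanking) / unit)

def calibrate(locus, control_fragment_bp, known_copies):
    """Adjust the effective flanking size so the positive control (e.g.,
    armadillo-passaged bacteria with known copy numbers) maps to its
    known allele, absorbing dye- and instrument-specific size offsets."""
    flanking, unit = LOCI[locus]
    offset = control_fragment_bp - (flanking + known_copies * unit)
    LOCI[locus] = (flanking + offset, unit)

# Calibrate locus_A so a 92 bp control fragment corresponds to 5 copies,
# then type a clinical sample fragment of 96 bp (two repeat units longer).
calibrate("locus_A", 92, 5)
print(fragment_to_copies("locus_A", 96))
```

In practice the calibration step matters because capillary-electrophoresis size calls carry small systematic offsets, which is why a control with known copy numbers is run alongside the clinical samples.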
Immunology, Issue 53, Mycobacterium leprae, leprosy, biopsy, STR, VNTR, PCR, fragment length analysis
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3,4,5,6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations are needed of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience. This protocol therefore focuses on the DHL and allows examination of CP. Using the protocol presented in the video as an example, the article that accompanies the video discusses methodological issues surrounding the protocol and the use, in "uncanny" research, of stimuli drawn from morph continua to represent the DHL. The use of neuroimaging and morph stimuli to represent the DHL, in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing, is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Mass Production of Genetically Modified Aedes aegypti for Field Releases in Brazil
Institutions: Oxitec Ltd, Universidade de São Paulo, Universidade de São Paulo, Moscamed Brasil, University of Oxford, Instituto Nacional de Ciência e Tecnologia em Entomologia Molecular (INCT-EM).
New techniques and methods are being sought to try to win the battle against mosquitoes. Recent advances in molecular techniques have led to the development of new and innovative methods of mosquito control based around the Sterile Insect Technique (SIT)1-3. A control method known as RIDL (Release of Insects carrying a Dominant Lethal)4 is based around SIT, but uses genetic methods to remove the need for radiation-sterilization5-8. A RIDL strain of Ae. aegypti was successfully tested in the field in Grand Cayman9,10; further field use is planned or in progress in other countries around the world.
Mass rearing of insects has been established for several insect species, at levels of billions a week. In mosquitoes, however, rearing has generally been performed on a much smaller scale, with most large-scale rearing being performed in the 1970s and 80s. For a RIDL program it is desirable to release as few females as possible, as they bite and transmit disease. A mass rearing program comprises several stages to produce the males to be released: egg production, rearing of eggs until pupation, and sorting of males from females before release. These males are then used for a RIDL control program, released as either pupae or adults11,12.
To suppress a mosquito population using RIDL, a large number of high-quality adult males need to be reared13,14. The following describes the methods for the mass rearing of OX513A, a RIDL strain of Ae. aegypti8, for release, and covers the techniques required for the production of eggs and the mass rearing of RIDL males for a control program.
Basic Protocol, Issue 83, Aedes aegypti, mass rearing, population suppression, transgenic, insect, mosquito, dengue
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Institutions: Cleveland Clinic Foundation, Cleveland Clinic Foundation, Cleveland Clinic Foundation, Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS-related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending, the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient-reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be entered directly into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out-of-clinic settings, like the patient's home, thereby providing more meaningful real-world data. The MSPT represents a new paradigm for neuroperformance testing.
This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
Using the Threat Probability Task to Assess Anxiety and Fear During Uncertain and Certain Threat
Institutions: University of Wisconsin-Madison.
Fear of certain threat and anxiety about uncertain threat are distinct emotions with unique behavioral, cognitive-attentional, and neuroanatomical components. Both anxiety and fear can be studied in the laboratory by measuring the potentiation of the startle reflex. The startle reflex is a defensive reflex that is potentiated when an organism is threatened and the need for defense is high. The startle reflex is assessed via electromyography (EMG) in the orbicularis oculi muscle, elicited by brief, intense bursts of acoustic white noise (i.e., "startle probes"). Startle potentiation is calculated as the increase in startle response magnitude during presentation of sets of visual threat cues that signal delivery of mild electric shock, relative to sets of matched cues that signal the absence of shock (no-threat cues). In the Threat Probability Task, fear is measured via startle potentiation to high-probability (100% cue-contingent shock; certain) threat cues, whereas anxiety is measured via startle potentiation to low-probability (20% cue-contingent shock; uncertain) threat cues. Measurement of startle potentiation during the Threat Probability Task provides an objective and easily implemented alternative to assessment of negative affect via self-report or other methods (e.g., neuroimaging) that may be inappropriate or impractical for some researchers. Startle potentiation has been studied rigorously in both animals (e.g., rodents, non-human primates) and humans, which facilitates animal-to-human translational research. Startle potentiation during certain and uncertain threat provides an objective measure of negative affect and distinct emotional states (fear, anxiety) for use in research on psychopathology, substance use/abuse, and affective science broadly. As such, it has been used extensively by clinical scientists interested in the etiology of psychopathology and by affective scientists interested in individual differences in emotion.
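The potentiation score described above amounts to a difference in mean startle magnitude between matched cue types; a minimal sketch follows, with hypothetical EMG magnitudes in arbitrary units.

```python
# Hypothetical sketch of the startle-potentiation calculation: the increase
# in mean startle (EMG) response magnitude during threat cues relative to
# matched no-threat cues. All values are illustrative only.

def startle_potentiation(threat_magnitudes, no_threat_magnitudes):
    """Mean startle magnitude during threat cues minus the mean during
    matched no-threat cues."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(threat_magnitudes) - mean(no_threat_magnitudes)

# Fear: potentiation to certain (100% shock-contingent) threat cues.
fear = startle_potentiation([52.0, 48.0, 55.0], [30.0, 28.0, 32.0])

# Anxiety: potentiation to uncertain (20% shock-contingent) threat cues.
anxiety = startle_potentiation([40.0, 38.0, 42.0], [30.0, 28.0, 32.0])

print(fear, anxiety)
```

The same no-threat baseline is used for both scores here only for brevity; in the task itself each threat-cue set has its own matched no-threat cues.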
Behavior, Issue 91, startle, electromyography, shock, addiction, uncertainty, fear, anxiety, humans, psychophysiology, translational
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues change dramatically over development3.
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Measuring Attentional Biases for Threat in Children and Adults
Institutions: Rutgers University.
Investigators have long been interested in the human propensity for the rapid detection of threatening stimuli. However, until recently, research in this domain has focused almost exclusively on adult participants, completely ignoring the topic of threat detection over the course of development. One of the biggest reasons for the lack of developmental work in this area is likely the absence of a reliable paradigm that can measure perceptual biases for threat in children. To address this issue, we recently designed a modified visual search paradigm similar to the standard adult paradigm that is appropriate for studying threat detection in preschool-aged participants. Here we describe this new procedure. In the general paradigm, we present participants with matrices of color photographs, and ask them to find and touch a target on the screen. Latency to touch the target is recorded. Using a touch-screen monitor makes the procedure simple and easy, allowing us to collect data in participants ranging from 3 years of age to adults. Thus far, the paradigm has consistently shown that both adults and children detect threatening stimuli (e.g., snakes, spiders, angry/fearful faces) more quickly than neutral stimuli (e.g., flowers, mushrooms, happy/neutral faces). Altogether, this procedure provides an important new tool for researchers interested in studying the development of attentional biases for threat.
Behavior, Issue 92, Detection, threat, attention, attentional bias, anxiety, visual search
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting individual subjects to advance automatically from protocol to protocol. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple.
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Community-based Adapted Tango Dancing for Individuals with Parkinson's Disease and Older Adults
Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.
Adapted tango dancing improves mobility and balance in older adults and other populations with balance impairments. It is composed of very simple step elements. Adapted tango involves movement initiation and cessation, multi-directional perturbations, and varied speeds and rhythms. Focus on foot placement, whole-body coordination, and attention to partner, path of movement, and aesthetics likely underlie adapted tango's demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology for disseminating the adapted tango teaching methods to dance instructor trainees and for implementation of adapted tango by the trainees in the community for older adults and individuals with Parkinson's Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, tandem stance, Berg Balance Scale, gait speed, and 30 sec chair stand), safety, and fidelity of the program are maximized through targeted instructor and volunteer training and a structured, detailed syllabus outlining class practices and progression.
Behavior, Issue 94, Dance, tango, balance, pedagogy, dissemination, exercise, older adults, Parkinson's Disease, mobility impairments, falls
Determining the Contribution of the Energy Systems During Exercise
Institutions: University of Sao Paulo, University of Sao Paulo, University of Sao Paulo, University of Sao Paulo.
One of the most important aspects of the metabolic demand is the relative contribution of the energy systems to the total energy required for a given physical activity. Although some sports are relatively easy to reproduce in a laboratory (e.g., running and cycling), a number of sports are much more difficult to reproduce and study under controlled conditions. This method presents how to assess the differential contribution of the energy systems in sports that are difficult to mimic under controlled laboratory conditions. The concepts shown here can be adapted to virtually any sport.
The following physiological variables are needed: resting oxygen consumption, exercise oxygen consumption, post-exercise oxygen consumption, resting plasma lactate concentration, and post-exercise peak plasma lactate. To calculate the contribution of aerobic metabolism, you will need the oxygen consumption at rest and during exercise. Using the trapezoidal method, calculate the area under the oxygen consumption curve during exercise, subtracting the area corresponding to resting oxygen consumption. To calculate the contribution of the alactic anaerobic metabolism, fit the post-exercise oxygen consumption curve to a mono- or bi-exponential model (whichever fits best). Then use the terms of the fitted equation to calculate the anaerobic alactic metabolism as follows: ATP-CP metabolism = A1 (mL·s-1) × t1 (s). Finally, to calculate the contribution of the lactic anaerobic system, multiply the peak plasma lactate accumulation by 3 and by the athlete's body mass (the result in mL of O2 is then converted to L and into kJ).
The method can be used for both continuous and intermittent exercise. This is a very interesting approach, as it can be adapted to exercises and sports that are difficult to mimic in controlled environments. Also, this is the only available method capable of distinguishing the contributions of the three energy systems. Thus, the method allows the study of sports with great similarity to real situations, providing desirable ecological validity to the study.
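The calculation sequence above can be sketched in code. This is an illustrative worked example under stated assumptions: the VO2 samples, fit parameters, and lactate values are invented, the 3 mL O2 per kg per mM lactate factor follows the abstract, and the ~20.9 kJ per L O2 energy equivalent is a common approximation.

```python
# Hypothetical worked example of the energy-system partition described above.
# All numbers are illustrative; the O2-to-energy constant (~20.9 kJ/L O2)
# and the 3 mL O2/kg/mM lactate factor are assumptions stated in the lead-in.

def trapezoid_area(values, dt):
    """Area under a uniformly sampled curve via the trapezoidal rule."""
    return sum((a + b) / 2 * dt for a, b in zip(values, values[1:]))

def aerobic_mL(vo2_exercise, vo2_rest, dt):
    """Exercise VO2 area minus the resting baseline over the same duration."""
    duration = dt * (len(vo2_exercise) - 1)
    return trapezoid_area(vo2_exercise, dt) - vo2_rest * duration

def alactic_mL(A1, t1):
    """ATP-CP contribution from the fitted fast EPOC component: A1 x t1."""
    return A1 * t1

def lactic_mL(delta_lactate_mM, body_mass_kg):
    """Lactic contribution: 3 mL O2 per kg per mM of net lactate accumulation."""
    return 3.0 * delta_lactate_mM * body_mass_kg

KJ_PER_L_O2 = 20.9  # common approximation for the energy equivalent of O2

vo2 = [20.0, 45.0, 50.0, 52.0, 51.0]  # mL/s, sampled every 30 s during exercise
aer = aerobic_mL(vo2, vo2_rest=5.0, dt=30.0)
ala = alactic_mL(A1=35.0, t1=25.0)    # from the mono-exponential EPOC fit
lac = lactic_mL(delta_lactate_mM=8.0, body_mass_kg=70.0)

for name, mL in [("aerobic", aer), ("alactic", ala), ("lactic", lac)]:
    print(f"{name}: {mL / 1000:.2f} L O2 = {mL / 1000 * KJ_PER_L_O2:.1f} kJ")
```

The exponential fitting of the post-exercise curve itself would normally be done with a nonlinear least-squares routine; here only the use of the fitted A1 and t1 terms is shown.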
Physiology, Issue 61, aerobic metabolism, anaerobic alactic metabolism, anaerobic lactic metabolism, exercise, athletes, mathematical model
Large Scale Non-targeted Metabolomic Profiling of Serum by Ultra Performance Liquid Chromatography-Mass Spectrometry (UPLC-MS)
Institutions: Colorado State University.
Non-targeted metabolite profiling by ultra performance liquid chromatography coupled with mass spectrometry (UPLC-MS) is a powerful technique to investigate metabolism. The approach offers an unbiased and in-depth analysis that can enable the development of diagnostic tests, novel therapies, and further our understanding of disease processes. The inherent chemical diversity of the metabolome creates significant analytical challenges and there is no single experimental approach that can detect all metabolites. Additionally, the biological variation in individual metabolism and the dependence of metabolism on environmental factors necessitates large sample numbers to achieve the appropriate statistical power required for meaningful biological interpretation. To address these challenges, this tutorial outlines an analytical workflow for large scale non-targeted metabolite profiling of serum by UPLC-MS. The procedure includes guidelines for sample organization and preparation, data acquisition, quality control, and metabolite identification and will enable reliable acquisition of data for large experiments and provide a starting point for laboratories new to non-targeted metabolite profiling by UPLC-MS.
Chemistry, Issue 73, Biochemistry, Genetics, Molecular Biology, Physiology, Genomics, Proteins, Proteomics, Metabolomics, Metabolite Profiling, Non-targeted metabolite profiling, mass spectrometry, Ultra Performance Liquid Chromatography, UPLC-MS, serum, spectrometry
Comprehensive & Cost Effective Laboratory Monitoring of HIV/AIDS: an African Role Model
Institutions: National Health Laboratory Services (NHLS-SA), University of Witwatersrand, Lightcurve Films.
We present a video about supporting anti-retroviral therapy (ART) with an apt laboratory service, representing a South African role model for economical large-scale diagnostic testing. In low-income countries, inexpensive ART has transformed the prospects for the survival of HIV-seropositive patients, but there are doubts whether laboratory monitoring of ART is needed, and at what cost, in situations where the overall quality of pathology services can still be very low. The appropriate answer is to establish economically sound services with better coordination and stricter internal quality assessment than seen in western countries. This video, filmed on location in the National Health Laboratory Services (NHLS-SA) at the University of the Witwatersrand, Johannesburg, South Africa, presents such a coordinated scheme expanding the original 2-color CD4-CD45 PanLeucoGating strategy (PLG). The six modules of the video presentation reveal the simplicity of a 4-color flow cytometric assay that combines haematological, immunological and virology-related tests in a single tube. These video modules are: (i) the set-up of instruments; (ii) sample preparations; (iii) testing absolute counts and monitoring quality for each sample by bead count rate; (iv) the haematological CD45 test for white cell counts and differentials; (v) the CD4 counts; and (vi) the activation of CD8+ T cells measured by CD38 display, a viral-load-related parameter. The potential cost savings are remarkable. This arrangement is a prime example of the feasibility of performing >800-1,000 tests per day with stricter quality control than that applied in western laboratories, and also with a transfer of technology to other laboratories within the NHLS-SA network.
Expert advisors, laboratory managers and policy makers who carry the duty of making decisions about introducing modern medical technology are frequently not in a position to see the latest technical details as carried out in large regional laboratories with huge workloads. Hence this video shows the details of these new developments.
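The absolute-count principle behind the bead-based quality monitoring in module (iii) can be sketched as follows. This is a minimal illustration of single-platform bead counting; the event numbers and bead concentration are hypothetical example values, not figures from the NHLS-SA protocol.

```python
# Illustrative sketch of single-platform bead-based absolute counting: the
# ratio of cell events to bead events, scaled by the known bead concentration
# in the stained sample, gives cells per microliter of blood. Example values
# below are made up.

def absolute_count(cell_events, bead_events, beads_per_uL):
    """Cells per microliter: cell-to-bead event ratio times the known
    bead concentration added to the sample."""
    return cell_events / bead_events * beads_per_uL

# e.g., 12,000 CD4+ events and 10,000 bead events with 500 beads/uL:
cd4 = absolute_count(12000, 10000, 500)
print(cd4, "cells/uL")
```

Monitoring the bead count rate per sample, as described above, flags pipetting or flow-rate errors, because a stable bead concentration should yield a stable bead event rate from tube to tube.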
Immunology, Issue 44, Human Immunodeficiency virus (HIV); CD4 lymphocyte count, white cell count, CD45, panleucogating, lymphocyte activation, CD38, HIV viral load, antiretroviral therapy (ART), internal quality control
Functional Mapping with Simultaneous MEG and EEG
Institutions: MGH - Massachusetts General Hospital.
We use magnetoencephalography (MEG) and electroencephalography (EEG) to locate brain areas involved in the processing of simple sensory stimuli and to determine the temporal evolution of their activity. We use somatosensory stimuli to locate the hand somatosensory areas, auditory stimuli to locate the auditory cortices, and visual stimuli in the four quadrants of the visual field to locate the early visual areas. These types of experiments are used for functional mapping in epileptic and brain tumor patients to locate eloquent cortices. In basic neuroscience, similar experimental protocols are used to study the orchestration of cortical activity. The acquisition protocol includes quality assurance procedures, subject preparation for the combined MEG/EEG study, and acquisition of evoked-response data with somatosensory, auditory, and visual stimuli. We also demonstrate analysis of the data using the equivalent current dipole model and cortically constrained minimum-norm estimates. Anatomical MRI data are employed in the analysis for visualization, for deriving tissue boundaries for forward modeling, and for the cortical location and orientation constraints of the minimum-norm estimates.
JoVE neuroscience, Issue 40, neuroscience, brain, MEG, EEG, functional imaging