JoVE Visualize
Pubmed Article
Development of a Drug-Response Modeling Framework to Identify Cell Line Derived Translational Biomarkers That Can Predict Treatment Outcome to Erlotinib or Sorafenib.
Published: 06-25-2015
Development of drug-responsive biomarkers from pre-clinical data is a critical step in drug discovery, as it enables patient stratification in clinical trial design. Such translational biomarkers can be validated in early clinical trial phases and utilized as a patient inclusion parameter in later-stage trials. Here we present a study on building accurate and selective drug sensitivity models for Erlotinib or Sorafenib from pre-clinical in vitro data, followed by validation of the individual models on the corresponding treatment arms of patient data generated in the BATTLE clinical trial. A Partial Least Squares Regression (PLSR) based modeling framework was designed and implemented, using a special splitting strategy and canonical pathways to capture robust information for model building. The Erlotinib and Sorafenib predictive models could be used to identify a sub-group of patients that respond better to the corresponding treatment, and these models are specific to their corresponding drugs. The model-derived signature genes reflect each drug's known mechanism of action. The models also predict each drug's potential cancer indications, consistent with clinical trial results, in a selection of globally normalized GEO expression datasets.
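As a rough illustration of the kind of PLSR workflow described in this abstract, the sketch below fits a partial least squares model with scikit-learn; the expression matrix, response values, and train/test split are synthetic placeholders, not the study's data or its special splitting strategy.

```python
# Minimal PLSR drug-sensitivity sketch (illustrative only; the expression
# matrix, response vector, and split below are hypothetical stand-ins for
# the cell line data and splitting strategy described in the abstract).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_cell_lines, n_genes = 60, 500
X = rng.normal(size=(n_cell_lines, n_genes))          # expression features (e.g., pathway-summarized)
y = X[:, :5].sum(axis=1) + rng.normal(scale=0.5, size=n_cell_lines)  # drug sensitivity readout

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

pls = PLSRegression(n_components=3)
pls.fit(X_train, y_train)
print("R^2 on held-out cell lines:", pls.score(X_test, y_test))

# Genes with the largest absolute loadings on the first component act as a crude "signature"
signature = np.argsort(np.abs(pls.x_weights_[:, 0]))[::-1][:10]
print("Top signature gene indices (component 1):", signature)
```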
Authors: Ariosto Silva, Timothy Jacobson, Mark Meads, Allison Distler, Kenneth Shain.
Published: 07-15-2015
In this work we describe a novel approach that combines ex vivo drug sensitivity assays and digital image analysis to estimate the chemosensitivity and heterogeneity of patient-derived multiple myeloma (MM) cells. The approach consists of seeding primary MM cells freshly extracted from bone marrow aspirates into microfluidic chambers implemented in multi-well plates, each a reconstruction of the bone marrow microenvironment that includes extracellular matrix (collagen or basement membrane matrix) and stroma (patient-derived mesenchymal stem cells) or human-derived endothelial cells (HUVECs). The chambers are drugged with different agents and concentrations and are imaged sequentially for 96 hr by bright-field microscopy, using a motorized microscope equipped with a digital camera. Digital image analysis software detects live and dead cells from the presence or absence of membrane motion, and generates curves of change in viability as a function of drug concentration and exposure time. We use a computational model to determine the parameters of chemosensitivity of the tumor population to each drug, as well as the number of sub-populations present, as a measure of tumor heterogeneity. These patient-tailored models can then be used to simulate therapeutic regimens and estimate clinical response.
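To illustrate how viability-versus-concentration data of the kind generated by this assay can be reduced to chemosensitivity parameters, the sketch below fits a Hill-type dose-response curve with SciPy; the concentrations and viability values are made up, and the fit is not the authors' patient-tailored computational model.

```python
# Hill-type dose-response fit (illustrative; synthetic data, not the
# authors' patient-tailored model).
import numpy as np
from scipy.optimize import curve_fit

def hill(conc, ec50, hill_slope, top, bottom):
    """Fraction of viable cells as a function of drug concentration."""
    return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill_slope)

conc = np.array([0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30])            # hypothetical uM
viability = np.array([0.98, 0.97, 0.93, 0.80, 0.55, 0.30, 0.15, 0.10])

params, _ = curve_fit(hill, conc, viability, p0=[1.0, 1.0, 1.0, 0.1])
ec50, slope, top, bottom = params
print(f"EC50 ~ {ec50:.2f} uM, Hill slope ~ {slope:.2f}")
```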
24 Related JoVE Articles!
Modeling Astrocytoma Pathogenesis In Vitro and In Vivo Using Cortical Astrocytes or Neural Stem Cells from Conditional, Genetically Engineered Mice
Authors: Robert S. McNeill, Ralf S. Schmid, Ryan E. Bash, Mark Vitucci, Kristen K. White, Andrea M. Werneke, Brian H. Constance, Byron Huff, C. Ryan Miller.
Institutions: University of North Carolina School of Medicine, Emory University School of Medicine.
Current astrocytoma models are limited in their ability to define the roles of oncogenic mutations in specific brain cell types during disease pathogenesis and their utility for preclinical drug development. In order to design a better model system for these applications, phenotypically wild-type cortical astrocytes and neural stem cells (NSC) from conditional, genetically engineered mice (GEM) that harbor various combinations of floxed oncogenic alleles were harvested and grown in culture. Genetic recombination was induced in vitro using adenoviral Cre-mediated recombination, resulting in expression of mutated oncogenes and deletion of tumor suppressor genes. The phenotypic consequences of these mutations were defined by measuring proliferation, transformation, and drug response in vitro. Orthotopic allograft models, whereby transformed cells are stereotactically injected into the brains of immune-competent, syngeneic littermates, were developed to define the role of oncogenic mutations and cell type on tumorigenesis in vivo. Unlike most established human glioblastoma cell line xenografts, injection of transformed GEM-derived cortical astrocytes into the brains of immune-competent littermates produced astrocytomas, including the most aggressive subtype, glioblastoma, that recapitulated the histopathological hallmarks of human astrocytomas, including diffuse invasion of normal brain parenchyma. Bioluminescence imaging of orthotopic allografts from transformed astrocytes engineered to express luciferase was utilized to monitor in vivo tumor growth over time. Thus, astrocytoma models using astrocytes and NSC harvested from GEM with conditional oncogenic alleles provide an integrated system to study the genetics and cell biology of astrocytoma pathogenesis in vitro and in vivo and may be useful in preclinical drug development for these devastating diseases.
Neuroscience, Issue 90, astrocytoma, cortical astrocytes, genetically engineered mice, glioblastoma, neural stem cells, orthotopic allograft
Workflow for High-content, Individual Cell Quantification of Fluorescent Markers from Universal Microscope Data, Supported by Open Source Software
Authors: Simon R. Stockwell, Sibylle Mittnacht.
Institutions: UCL Cancer Institute.
Advances in understanding the control mechanisms governing the behavior of cells in adherent mammalian tissue culture models are becoming increasingly dependent on modes of single-cell analysis. Methods which deliver composite data reflecting the mean values of biomarkers from cell populations risk losing subpopulation dynamics that reflect the heterogeneity of the studied biological system. In keeping with this, traditional approaches are being replaced by, or supported with, more sophisticated forms of cellular assay developed to allow assessment by high-content microscopy. These assays potentially generate large numbers of images of fluorescent biomarkers which, enabled by accompanying proprietary software packages, allow for multi-parametric measurements per cell. However, the relatively high capital costs and overspecialization of many of these devices have prevented their accessibility to many investigators. Described here is a universally applicable workflow for the quantification of multiple fluorescent marker intensities from specific subcellular regions of individual cells, suitable for use with images from most fluorescence microscopes. Key to this workflow is the implementation of the freely available CellProfiler software1 to distinguish individual cells in these images, segment them into defined subcellular regions and deliver fluorescence marker intensity values specific to these regions. The extraction of individual cell intensity values from image data is the central purpose of this workflow and will be illustrated with the analysis of control data from a siRNA screen for G1 checkpoint regulators in adherent human cells. However, the workflow presented here can be applied to the analysis of data from other means of cell perturbation (e.g., compound screens) and other forms of fluorescence-based cellular markers and thus should be useful for a wide range of laboratories.
Cellular Biology, Issue 94, Image analysis, High-content analysis, Screening, Microscopy, Individual cell analysis, Multiplexed assays
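For readers without CellProfiler, the core idea of the workflow, segmenting cells and reporting a marker intensity per cell, can be approximated with scikit-image as in the sketch below; the file names, Otsu thresholding choice, and nucleus-based segmentation are illustrative assumptions rather than the published pipeline.

```python
# Simplified per-cell intensity measurement (a stand-in for the CellProfiler
# pipeline described above, using scikit-image; filenames are hypothetical).
import numpy as np
from skimage import io, filters, measure, morphology

dapi = io.imread("dapi_channel.tif")        # nuclear stain used for segmentation
marker = io.imread("marker_channel.tif")    # fluorescent biomarker to quantify

# Segment nuclei by Otsu thresholding and label connected objects
mask = dapi > filters.threshold_otsu(dapi)
mask = morphology.remove_small_objects(mask, min_size=50)
labels = measure.label(mask)

# Mean marker intensity inside each nuclear region (one value per cell)
props = measure.regionprops(labels, intensity_image=marker)
per_cell = [(p.label, p.mean_intensity) for p in props]
print(per_cell[:5])
```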
A Manual Small Molecule Screen Approaching High-throughput Using Zebrafish Embryos
Authors: Shahram Jevin Poureetezadi, Eric K. Donahue, Rebecca A. Wingert.
Institutions: University of Notre Dame.
Zebrafish have become a widely used model organism to investigate the mechanisms that underlie developmental biology and to study human disease pathology due to their considerable degree of genetic conservation with humans. Chemical genetics entails testing the effect that small molecules have on a biological process and is becoming a popular translational research method to identify therapeutic compounds. Zebrafish are particularly appealing for chemical genetics because of their ability to produce large clutches of transparent embryos, which are externally fertilized. Furthermore, zebrafish embryos can be easily drug treated by the simple addition of a compound to the embryo media. Using whole-mount in situ hybridization (WISH), mRNA expression can be clearly visualized within zebrafish embryos. Together, using chemical genetics and WISH, the zebrafish becomes a potent whole-organism context in which to determine the cellular and physiological effects of small molecules. Innovative advances have been made in technologies that utilize machine-based screening procedures; however, for many labs such options are inaccessible or cost-prohibitive. The protocol described here explains how to execute a manual high-throughput chemical genetic screen that requires basic resources and can be accomplished by a single individual or small team in an efficient period of time. Thus, this protocol provides a feasible strategy that can be implemented by research groups to perform chemical genetics in zebrafish, which can be useful for gaining fundamental insights into developmental processes and disease mechanisms, and for identifying novel compounds and signaling pathways that have medically relevant applications.
Developmental Biology, Issue 93, zebrafish, chemical genetics, chemical screen, in vivo small molecule screen, drug discovery, whole mount in situ hybridization (WISH), high-throughput screening (HTS), high-content screening (HCS)
Ex Vivo Treatment Response of Primary Tumors and/or Associated Metastases for Preclinical and Clinical Development of Therapeutics
Authors: Adriana D. Corben, Mohammad M. Uddin, Brooke Crawford, Mohammad Farooq, Shanu Modi, John Gerecitano, Gabriela Chiosis, Mary L. Alpaugh.
Institutions: Memorial Sloan Kettering Cancer Center, Weill Cornell Medical College.
The molecular analysis of established cancer cell lines has been the mainstay of cancer research for the past several decades. Cell culture provides both direct and rapid analysis of therapeutic sensitivity and resistance. However, recent evidence suggests that therapeutic response is not exclusive to the inherent molecular composition of cancer cells but rather is greatly influenced by the tumor cell microenvironment, a feature that cannot be recapitulated by traditional culturing methods. Even implementation of tumor xenografts, though providing a wealth of information on drug delivery/efficacy, cannot capture the tumor cell/microenvironment crosstalk (i.e., soluble factors) that occurs within human tumors and greatly impacts tumor response. To this end, we have developed an ex vivo (fresh tissue sectioning) technique which allows for the direct assessment of treatment response for preclinical and clinical therapeutics development. This technique maintains tissue integrity and cellular architecture within the tumor cell/microenvironment context throughout the treatment response, providing a more precise means to assess drug efficacy.
Cancer Biology, Issue 92, Ex vivo sectioning, Treatment response, Sensitivity/Resistance, Drug development, Patient tumors, Preclinical and Clinical
Human Pluripotent Stem Cell Based Developmental Toxicity Assays for Chemical Safety Screening and Systems Biology Data Generation
Authors: Vaibhav Shinde, Stefanie Klima, Perumal Srinivasan Sureshkumar, Kesavan Meganathan, Smita Jagtap, Eugen Rempel, Jörg Rahnenführer, Jan Georg Hengstler, Tanja Waldmann, Jürgen Hescheler, Marcel Leist, Agapios Sachinidis.
Institutions: University of Cologne, University of Konstanz, Technical University of Dortmund.
Efficient protocols to differentiate human pluripotent stem cells into various tissues, in combination with -omics technologies, have opened up new horizons for in vitro toxicity testing of potential drugs. To provide a solid scientific basis for such assays, it will be important to gain quantitative information on the time course of development and on the underlying regulatory mechanisms by systems biology approaches. Two assays have therefore been tuned here for these requirements. In the UKK test system, human embryonic stem cells (hESC) (or other pluripotent cells) are left to spontaneously differentiate for 14 days in embryoid bodies, to allow generation of cells of all three germ layers. This system recapitulates key steps of early human embryonic development, and it can predict human-specific early embryonic toxicity/teratogenicity, if cells are exposed to chemicals during differentiation. The UKN1 test system is based on hESC differentiating to a population of neuroectodermal progenitor (NEP) cells for 6 days. This system recapitulates early neural development and predicts early developmental neurotoxicity and epigenetic changes triggered by chemicals. Both systems, in combination with transcriptome microarray studies, are suitable for identifying toxicity biomarkers. Moreover, they may be used in combination to generate input data for systems biology analysis. These test systems have advantages over traditional toxicological studies, which require large numbers of animals. The test systems may contribute to a reduction of the costs for drug development and chemical safety evaluation. Their combination sheds light especially on compounds that may specifically influence neurodevelopment.
Developmental Biology, Issue 100, Human embryonic stem cells, developmental toxicity, neurotoxicity, neuroectodermal progenitor cells, immunoprecipitation, differentiation, cytotoxicity, embryopathy, embryoid body
Operant Procedures for Assessing Behavioral Flexibility in Rats
Authors: Anne Marie Brady, Stan B. Floresco.
Institutions: St. Mary's College of Maryland, University of British Columbia.
Executive functions consist of multiple high-level cognitive processes that drive rule generation and behavioral selection. An emergent property of these processes is the ability to adjust behavior in response to changes in one’s environment (i.e., behavioral flexibility). These processes are essential to normal human behavior, and may be disrupted in diverse neuropsychiatric conditions, including schizophrenia, alcoholism, depression, stroke, and Alzheimer’s disease. Understanding of the neurobiology of executive functions has been greatly advanced by the availability of animal tasks for assessing discrete components of behavioral flexibility, particularly strategy shifting and reversal learning. While several types of tasks have been developed, most are non-automated, labor intensive, and allow testing of only one animal at a time. The recent development of automated, operant-based tasks for assessing behavioral flexibility streamlines testing, standardizes stimulus presentation and data recording, and dramatically improves throughput. Here, we describe automated strategy shifting and reversal tasks, using operant chambers controlled by custom written software programs. Using these tasks, we have shown that the medial prefrontal cortex governs strategy shifting but not reversal learning in the rat, similar to the dissociation observed in humans. Moreover, animals with a neonatal hippocampal lesion, a neurodevelopmental model of schizophrenia, are selectively impaired on the strategy shifting task but not the reversal task. The strategy shifting task also allows the identification of separate types of performance errors, each of which is attributable to distinct neural substrates. The availability of these automated tasks, and the evidence supporting the dissociable contributions of separate prefrontal areas, makes them particularly well-suited assays for the investigation of basic neurobiological processes as well as drug discovery and screening in disease models.
Behavior, Issue 96, executive function, behavioral flexibility, prefrontal cortex, strategy shifting, reversal learning, behavioral neuroscience, schizophrenia, operant
Evaluating the Effectiveness of Cancer Drug Sensitization In Vitro and In Vivo
Authors: Mateusz Rytelewski, Adrian Buensuceso, Hon S. Leong, Bonnie J. Deroo, Ann F. Chambers, James Koropatnick.
Institutions: Western University.
Due to the high level of heterogeneity and mutations inherent in human cancers, single agent therapies, or combination regimens which target the same pathway, are likely to fail. Emphasis must be placed upon the inhibition of pathways that are responsible for intrinsic and/or adaptive resistance to therapy. An active field of investigation is the development and testing of DNA repair inhibitors that promote the action of, and prevent resistance to, commonly used chemotherapy and radiotherapy. We used a novel protocol to evaluate the effectiveness of BRCA2 inhibition as a means to sensitize tumor cells to the DNA damaging drug cisplatin. Tumor cell metabolism (acidification and respiration) was monitored in real-time for a period of 72 hr to delineate treatment effectiveness on a minute by minute basis. In combination, we performed an assessment of metastatic frequency using a chicken embryo chorioallantoic membrane (CAM) model of extravasation and invasion. This protocol addresses some of the weaknesses of commonly used in vitro and in vivo methods to evaluate novel cancer therapy regimens. It can be used in addition to common methods such as cell proliferation assays, cell death assays, and in vivo murine xenograft studies, to more closely discriminate amongst candidate targets and agents, and select only the most promising candidates for further development.
Medicine, Issue 96, chicken embryo chorio-allantoic membrane model, real-time metabolic monitoring, anti-cancer drug testing, pre-clinical development, DNA repair
The Double-H Maze: A Robust Behavioral Test for Learning and Memory in Rodents
Authors: Robert D. Kirch, Richard C. Pinnell, Ulrich G. Hofmann, Jean-Christophe Cassel.
Institutions: University Hospital Freiburg, UMR 7364 Université de Strasbourg, CNRS, Neuropôle de Strasbourg.
Spatial cognition research in rodents typically employs the use of maze tasks, whose attributes vary from one maze to the next. These tasks vary by their behavioral flexibility and required memory duration, the number of goals and pathways, and also the overall task complexity. A confounding feature in many of these tasks is the lack of control over the strategy employed by the rodents to reach the goal, e.g., allocentric (declarative-like) or egocentric (procedural) based strategies. The double-H maze is a novel water-escape memory task that addresses this issue, by allowing the experimenter to direct the type of strategy learned during the training period. The double-H maze is a transparent device, which consists of a central alleyway with three arms protruding on both sides, along with an escape platform submerged at the extremity of one of these arms. Rats can be trained using an allocentric strategy by alternating the start position in the maze in an unpredictable manner (see protocol 1; §4.7), thus requiring them to learn the location of the platform based on the available allothetic cues. Alternatively, an egocentric learning strategy (protocol 2; §4.8) can be employed by releasing the rats from the same position during each trial, until they learn the procedural pattern required to reach the goal. This task has been proven to allow for the formation of stable memory traces. Memory can be probed following the training period in a misleading probe trial, in which the starting position for the rats alternates. Following an egocentric learning paradigm, rats typically resort to an allocentric-based strategy, but only when their initial view on the extra-maze cues differs markedly from their original position. This task is ideally suited to explore the effects of drugs/perturbations on allocentric/egocentric memory performance, as well as the interactions between these two memory systems.
Behavior, Issue 101, Double-H maze, spatial memory, procedural memory, consolidation, allocentric, egocentric, habits, rodents, video tracking system
Scalable 96-well Plate Based iPSC Culture and Production Using a Robotic Liquid Handling System
Authors: Michael K. Conway, Michael J. Gerger, Erin E. Balay, Rachel O'Connell, Seth Hanson, Neil J. Daily, Tetsuro Wakatsuki.
Institutions: InvivoSciences, Inc., Gilson, Inc.
Continued advancement in pluripotent stem cell culture is closing the gap between bench and bedside for using these cells in regenerative medicine, drug discovery and safety testing. In order to produce stem cell-derived biopharmaceuticals and cells for tissue engineering and transplantation, a cost-effective cell-manufacturing technology is essential. Maintenance of pluripotency and stable performance of cells in downstream applications (e.g., cell differentiation) over time is paramount to large-scale cell production. Yet this can be difficult to achieve, especially if cells are cultured manually, where the operator can introduce significant variability and scale-up can be prohibitively expensive. To enable high-throughput, large-scale stem cell production and remove operator influence, novel stem cell culture protocols using a bench-top multi-channel liquid handling robot were developed that require minimal technician involvement or experience. With these protocols, human induced pluripotent stem cells (iPSCs) were cultured in feeder-free conditions directly from a frozen stock and maintained in 96-well plates. Depending on cell line and desired scale-up rate, the operator can easily determine when to passage based on a series of images showing the optimal colony densities for splitting. Then the necessary reagents are prepared to perform a colony split to new plates without a centrifugation step. After 20 passages (~3 months), two iPSC lines maintained stable karyotypes, expressed stem cell markers, and differentiated into cardiomyocytes with high efficiency. The system can perform subsequent high-throughput screening of new differentiation protocols or genetic manipulation designed for 96-well plates. This technology will reduce the labor and technical burden to produce large numbers of identical stem cells for a myriad of applications.
Developmental Biology, Issue 99, iPSC, high-throughput, robotic, liquid-handling, scalable, stem cell, automated stem cell culture, 96-well
Studying Pancreatic Cancer Stem Cell Characteristics for Developing New Treatment Strategies
Authors: Enza Lonardo, Michele Cioffi, Patricia Sancho, Shanthini Crusz, Christopher Heeschen.
Institutions: Spanish National Cancer Research Center, Institute for Research in Biomedicine (IRB Barcelona), Queen Mary University of London.
Pancreatic ductal adenocarcinoma (PDAC) contains a subset of exclusively tumorigenic cancer stem cells (CSCs) which have been shown to drive tumor initiation, metastasis and resistance to radio- and chemotherapy. Here we describe a specific methodology for culturing primary human pancreatic CSCs as tumor spheres in anchorage-independent conditions. Cells are grown in serum-free, non-adherent conditions in order to enrich for CSCs while their more differentiated progenies do not survive and proliferate during the initial phase following seeding of single cells. This assay can be used to estimate the percentage of CSCs present in a population of tumor cells. Both the size (which can range from 35 to 250 micrometers) and the number of tumor spheres formed represent the CSC activity harbored in either bulk populations of cultured cancer cells or freshly harvested and digested tumors 1,2. Using this assay, we recently found that metformin selectively ablates pancreatic CSCs; a finding that was subsequently further corroborated by demonstrating diminished expression of pluripotency-associated genes/surface markers and reduced in vivo tumorigenicity of metformin-treated cells. As the final step for preclinical development we treated mice bearing established tumors with metformin and found significantly prolonged survival. Clinical studies testing the use of metformin in patients with PDAC are currently underway (e.g., NCT01210911, NCT01167738, and NCT01488552). Mechanistically, we found that metformin induces a fatal energy crisis in CSCs by enhancing reactive oxygen species (ROS) production and reducing mitochondrial transmembrane potential. In contrast, non-CSCs were not eliminated by metformin treatment, but rather underwent reversible cell cycle arrest. Therefore, our study serves as a successful example of the potential of in vitro sphere formation as a screening tool to identify compounds that potentially target CSCs, but this technique will require further in vitro and in vivo validation to eliminate false discoveries.
Medicine, Issue 100, Pancreatic ductal adenocarcinoma, cancer stem cells, spheres, metformin (met), metabolism
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Authors: Richard A. Rudick, Deborah Miller, Francois Bethoux, Stephen M. Rao, Jar-Chi Lee, Darlene Stough, Christine Reece, David Schindler, Bernadett Mamone, Jay Alberts.
Institutions: Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out of clinic settings, like the patient’s home, thereby providing more meaningful real world data. The MSPT represents a new paradigm for neuroperformance testing. This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
Setting-up an In Vitro Model of Rat Blood-brain Barrier (BBB): A Focus on BBB Impermeability and Receptor-mediated Transport
Authors: Yves Molino, Françoise Jabès, Emmanuelle Lacassagne, Nicolas Gaudin, Michel Khrestchatisky.
Institutions: VECT-HORUS SAS, CNRS, NICN UMR 7259.
The blood-brain barrier (BBB) specifically regulates molecular and cellular flux between the blood and the nervous tissue. Our aim was to develop and characterize a highly reproducible rat syngeneic in vitro model of the BBB using co-cultures of primary rat brain endothelial cells (RBEC) and astrocytes to study receptors involved in transcytosis across the endothelial cell monolayer. Astrocytes were isolated by mechanical dissection following trypsin digestion and were frozen for later co-culture. RBEC were isolated from 5-week-old rat cortices. The brains were cleaned of meninges and white matter, and mechanically dissociated following enzymatic digestion. Thereafter, the tissue homogenate was centrifuged in bovine serum albumin to separate vessel fragments from nervous tissue. The vessel fragments underwent a second enzymatic digestion to free endothelial cells from their extracellular matrix. The remaining contaminating cells such as pericytes were further eliminated by plating the microvessel fragments in puromycin-containing medium. They were then passaged onto filters for co-culture with astrocytes grown on the bottom of the wells. RBEC expressed high levels of tight junction (TJ) proteins such as occludin, claudin-5 and ZO-1 with a typical localization at the cell borders. The transendothelial electrical resistance (TEER) of the brain endothelial monolayers, indicating the tightness of TJs, reached 300 ohm·cm^2 on average. The endothelial permeability coefficient (Pe) for lucifer yellow (LY) was highly reproducible, with an average of 0.26 ± 0.11 × 10^-3 cm/min. Brain endothelial cells organized in monolayers expressed the efflux transporter P-glycoprotein (P-gp), showed a polarized transport of rhodamine 123, a ligand for P-gp, and showed specific transport of transferrin-Cy3 and DiILDL across the endothelial cell monolayer. In conclusion, we provide a protocol for setting up an in vitro BBB model that is highly reproducible due to the quality assurance methods, and that is suitable for research on BBB transporters and receptors.
Medicine, Issue 88, rat brain endothelial cells (RBEC), mouse, spinal cord, tight junction (TJ), receptor-mediated transport (RMT), low density lipoprotein (LDL), LDLR, transferrin, TfR, P-glycoprotein (P-gp), transendothelial electrical resistance (TEER),
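The permeability coefficient (Pe) quoted above is conventionally obtained from the slope of cleared volume versus time, corrected for the contribution of the empty filter; the sketch below illustrates that standard calculation with made-up clearance values and an assumed filter area, not the authors' measurements.

```python
# Standard Pe calculation from clearance slopes (illustrative numbers only,
# not data from the abstract above).
import numpy as np

time_min = np.array([15, 30, 45, 60])
cleared_coculture_ul = np.array([3.8, 7.5, 11.3, 15.0])   # filter + endothelial monolayer
cleared_filter_ul = np.array([22.5, 45.0, 67.5, 90.0])    # empty, coated filter

ps_total = np.polyfit(time_min, cleared_coculture_ul, 1)[0]   # ul/min
ps_filter = np.polyfit(time_min, cleared_filter_ul, 1)[0]     # ul/min

ps_endothelium = 1.0 / (1.0 / ps_total - 1.0 / ps_filter)     # ul/min, monolayer alone
filter_area_cm2 = 1.12                                        # assumed 12-well insert area
pe_cm_per_min = (ps_endothelium / filter_area_cm2) * 1e-3     # ul -> cm3 conversion
print(f"Pe ~ {pe_cm_per_min:.2e} cm/min")
```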
Adaptation of Semiautomated Circulating Tumor Cell (CTC) Assays for Clinical and Preclinical Research Applications
Authors: Lori E. Lowes, Benjamin D. Hedley, Michael Keeney, Alison L. Allan.
Institutions: London Health Sciences Centre, Western University, Lawson Health Research Institute.
The majority of cancer-related deaths occur subsequent to the development of metastatic disease. This highly lethal disease stage is associated with the presence of circulating tumor cells (CTCs). These rare cells have been demonstrated to be of clinical significance in metastatic breast, prostate, and colorectal cancers. The current gold standard in clinical CTC detection and enumeration is the FDA-cleared CellSearch system (CSS). This manuscript outlines the standard protocol utilized by this platform as well as two additional adapted protocols that describe the detailed process of user-defined marker optimization for protein characterization of patient CTCs and a comparable protocol for CTC capture in very low volumes of blood, using standard CSS reagents, for studying in vivo preclinical mouse models of metastasis. In addition, differences in CTC quality between healthy donor blood spiked with cells from tissue culture versus patient blood samples are highlighted. Finally, several commonly discrepant items that can lead to CTC misclassification errors are outlined. Taken together, these protocols will provide a useful resource for users of this platform interested in preclinical and clinical research pertaining to metastasis and CTCs.
Medicine, Issue 84, Metastasis, circulating tumor cells (CTCs), CellSearch system, user defined marker characterization, in vivo, preclinical mouse model, clinical research
The NeuroStar TMS Device: Conducting the FDA Approved Protocol for Treatment of Depression
Authors: Jared C. Horvath, John Mathews, Mark A. Demitrack, Alvaro Pascual-Leone.
Institutions: Beth Israel Deaconess Medical Center, Inc.
The Neuronetics NeuroStar Transcranial Magnetic Stimulation (TMS) System is a class II medical device that produces brief duration, pulsed magnetic fields. These rapidly alternating fields induce electrical currents within localized, targeted regions of the cortex which are associated with various physiological and functional brain changes.1,2,3 In 2007, O'Reardon et al., utilizing the NeuroStar device, published the results of an industry-sponsored, multisite, randomized, sham-stimulation controlled clinical trial in which 301 patients with major depression, who had previously failed to respond to at least one adequate antidepressant treatment trial, underwent either active or sham TMS over the left dorsolateral prefrontal cortex (DLPFC). The patients, who were medication-free at the time of the study, received TMS five times per week over 4-6 weeks.4 The results demonstrated that a sub-population of patients (those who were relatively less resistant to medication, having failed no more than two good pharmacologic trials) showed a statistically significant improvement on the Montgomery-Asberg Depression Rating Scale (MADRS), the Hamilton Depression Rating Scale (HAMD), and various other outcome measures. In October 2008, supported by these and other similar results5,6,7, Neuronetics obtained the first and only Food and Drug Administration (FDA) approval for the clinical treatment of a specific form of medication-refractory depression using a TMS Therapy device (FDA approval K061053). In this paper, we will explore the specified FDA-approved NeuroStar depression treatment protocol (to be administered only under prescription and by a licensed medical professional in either an in- or outpatient setting).
Neuroscience, Issue 45, Transcranial Magnetic Stimulation, Depression, Neuronetics, NeuroStar, FDA Approved
Ex Vivo Culture of Patient Tissue & Examination of Gene Delivery
Authors: Simon Rajendran, Slawomir Salwa, Xuefeng Gao, Sabin Tabirca, Deirdre O'Hanlon, Gerald C. O'Sullivan, Mark Tangney.
Institutions: University College Cork.
This video describes the use of patient tissue as an ex vivo model for the study of gene delivery. Fresh patient tissue obtained at the time of surgery is sliced and maintained in culture. The ex vivo model system allows for the physical delivery of genes into intact patient tissue, and gene expression is analysed by bioluminescence imaging using the IVIS detection system. The bioluminescent detection system allows rapid and accurate quantification of gene expression within individual slices without the need for tissue sacrifice. This slice tissue culture system may be used in a variety of tissue types, including normal and malignant tissue, and allows us to study the effects of the heterogeneous nature of intact tissue and the high degree of variability between individual patients. This model system could be used in certain situations as an alternative to animal models and as a complementary preclinical model prior to entering clinical trials.
Medicine, Issue 46, Bioluminescent imaging, Ex vivo tissue model, Preclinical research, Gene delivery
Generation of Comprehensive Thoracic Oncology Database - Tool for Translational Research
Authors: Mosmi Surati, Matthew Robinson, Suvobroto Nandi, Leonardo Faoro, Carley Demchuk, Rajani Kanteti, Benjamin Ferguson, Tara Gangadhar, Thomas Hensing, Rifat Hasina, Aliya Husain, Mark Ferguson, Theodore Karrison, Ravi Salgia.
Institutions: University of Chicago, Northshore University Health Systems.
The Thoracic Oncology Program Database Project was created to serve as a comprehensive, verified, and accessible repository for well-annotated cancer specimens and clinical data to be available to researchers within the Thoracic Oncology Research Program. This database also captures a large volume of genomic and proteomic data obtained from various tumor tissue studies. A team of clinical and basic science researchers, a biostatistician, and a bioinformatics expert was convened to design the database. Variables of interest were clearly defined and their descriptions were written within a standard operating manual to ensure consistency of data annotation. Using one protocol for prospective tissue banking and another for retrospective banking, tumor and normal tissue samples from patients who consented to these protocols were collected. Clinical information such as demographics, cancer characterization, and treatment plans for these patients was abstracted and entered into an Access database. Proteomic and genomic data have been included in the database and have been linked to clinical information for patients described within the database. The data from each table were linked using the relationships function in Microsoft Access to allow the database manager to connect clinical and laboratory information during a query. The queried data can then be exported for statistical analysis and hypothesis generation.
Medicine, Issue 47, Database, Thoracic oncology, Bioinformatics, Biorepository, Microsoft Access, Proteomics, Genomics
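The relational linking described above, clinical, proteomic, and genomic tables joined on a patient identifier so that one query returns combined records, can be sketched outside Microsoft Access as well; the SQLite example below uses hypothetical table and column names purely for illustration.

```python
# Illustrative relational query joining clinical and laboratory tables
# (SQLite stand-in for the Access database; schema and names are hypothetical).
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE patient   (patient_id INTEGER PRIMARY KEY, age INTEGER, histology TEXT, stage TEXT);
CREATE TABLE proteomic (sample_id INTEGER PRIMARY KEY, patient_id INTEGER REFERENCES patient,
                        protein TEXT, expression REAL);
""")
con.execute("INSERT INTO patient VALUES (1, 64, 'adenocarcinoma', 'IIIA')")
con.execute("INSERT INTO proteomic VALUES (10, 1, 'MET', 2.7)")

# Equivalent of an Access query built on the table relationships:
rows = con.execute("""
    SELECT p.patient_id, p.histology, p.stage, x.protein, x.expression
    FROM patient AS p JOIN proteomic AS x ON x.patient_id = p.patient_id
    WHERE x.protein = 'MET'
""").fetchall()
print(rows)   # joined clinical + proteomic records, ready for export and statistics
```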
Quantitative Visualization and Detection of Skin Cancer Using Dynamic Thermal Imaging
Authors: Cila Herman, Muge Pirtini Cetingul.
Institutions: The Johns Hopkins University.
In 2010, approximately 68,720 melanomas were expected to be diagnosed in the US alone, with around 8,650 resulting in death 1. To date, the only effective treatment for melanoma remains surgical excision; therefore, the key to extended survival is early detection 2,3. Considering the large numbers of patients diagnosed every year and the limitations in accessing specialized care quickly, the development of objective in vivo diagnostic instruments to aid the diagnosis is essential. New techniques to detect skin cancer, especially non-invasive diagnostic tools, are being explored in numerous laboratories. Along with the surgical methods, techniques such as digital photography, dermoscopy, multispectral imaging systems (MelaFind), laser-based systems (confocal scanning laser microscopy, laser Doppler perfusion imaging, optical coherence tomography), ultrasound, and magnetic resonance imaging are being tested. Each technique offers unique advantages and disadvantages, many of which pose a compromise between effectiveness and accuracy versus ease of use and cost considerations. Details about these techniques and comparisons are available in the literature 4. Infrared (IR) imaging has been shown to be a useful method to diagnose the signs of certain diseases by measuring the local skin temperature. There is a large body of evidence showing that disease or deviation from normal functioning is accompanied by changes in the temperature of the body, which in turn affect the temperature of the skin 5,6. Accurate data about the temperature of the human body and skin can provide a wealth of information on the processes responsible for heat generation and thermoregulation, in particular the deviation from normal conditions, often caused by disease. However, IR imaging has not been widely recognized in medicine due to the premature use of the technology 7,8 several decades ago, when temperature measurement accuracy and spatial resolution were inadequate and sophisticated image processing tools were unavailable. This situation changed dramatically in the late 1990s-2000s. Advances in IR instrumentation, the implementation of digital image processing algorithms, and dynamic IR imaging, which enables scientists to analyze not only the spatial but also the temporal thermal behavior of the skin 9, allowed breakthroughs in the field. In our research, we explore the feasibility of IR imaging, combined with theoretical and experimental studies, as a cost-effective, non-invasive, in vivo optical measurement technique for tumor detection, with emphasis on the screening and early detection of melanoma 10-13. Here we show data from a patient study in which patients with a pigmented lesion and a clinical indication for biopsy were selected for imaging. We compared the difference in thermal responses between healthy and malignant tissue and compared our data with biopsy results. We concluded that the increased metabolic activity of the melanoma lesion can be detected by dynamic infrared imaging.
Medicine, Issue 51, Infrared imaging, quantitative thermal analysis, image processing, skin cancer, melanoma, transient thermal response, skin thermal models, skin phantom experiment, patient study
Nerve Excitability Assessment in Chemotherapy-induced Neurotoxicity
Authors: Susanna B. Park, Cindy S-Y. Lin, Matthew C. Kiernan.
Institutions: University of New South Wales.
Chemotherapy-induced neurotoxicity is a serious consequence of cancer treatment, which occurs with some of the most commonly used chemotherapies1,2. Chemotherapy-induced peripheral neuropathy produces symptoms of numbness and paraesthesia in the limbs and may progress to difficulties with fine motor skills and walking, leading to functional impairment. In addition to producing troubling symptoms, chemotherapy-induced neuropathy may limit treatment success, leading to dose reduction or early cessation of treatment. Neuropathic symptoms may persist long-term, leaving permanent nerve damage in patients with an otherwise good prognosis3. As chemotherapy is utilised more often as a preventative measure, and survival rates increase, the importance of long-lasting and significant neurotoxicity will increase. There are no established neuroprotective or treatment options, and sensitive assessment methods are lacking. Appropriate assessment of neurotoxicity will be critical as a prognostic factor and as a suitable endpoint for future trials of neuroprotective agents. Current methods to assess the severity of chemotherapy-induced neuropathy utilise clinician-based grading scales which have been demonstrated to lack sensitivity to change and inter-observer objectivity4. Conventional nerve conduction studies provide information about compound action potential amplitude and conduction velocity, which are relatively non-specific measures and do not provide insight into ion channel function or resting membrane potential. Accordingly, prior studies have demonstrated that conventional nerve conduction studies are not sensitive to early change in chemotherapy-induced neurotoxicity4-6. In comparison, nerve excitability studies utilize threshold tracking techniques which have been developed to enable assessment of ion channels, pumps and exchangers in vivo in large myelinated human axons7-9. Nerve excitability techniques have been established as a tool to examine the development and severity of chemotherapy-induced neurotoxicity10-13. Comprising a number of excitability parameters, nerve excitability studies can be used to assess acute neurotoxicity arising immediately following infusion and the development of chronic, cumulative neurotoxicity. Nerve excitability techniques are feasible in the clinical setting, with each test requiring only 5-10 minutes to complete. Nerve excitability equipment is readily commercially available, and a portable system has been devised so that patients can be tested in situ in the infusion centre setting. In addition, these techniques can be adapted for use in multiple chemotherapies. In patients treated with the chemotherapy oxaliplatin, primarily utilised for colorectal cancer, nerve excitability techniques provide a method to identify patients at risk of neurotoxicity prior to the onset of chronic neuropathy. Nerve excitability studies have revealed the development of an acute Na+ channelopathy in motor and sensory axons10-13. Importantly, patients who demonstrated changes in excitability in early treatment were subsequently more likely to develop moderate to severe neurotoxicity11. However, across treatment, striking longitudinal changes were identified only in sensory axons, which were able to predict clinical neurological outcome in 80% of patients10.
These changes demonstrated a different pattern to those seen acutely following oxaliplatin infusion, and most likely reflect the development of significant axonal damage and membrane potential change in sensory nerves which develops longitudinally during oxaliplatin treatment10. Significant abnormalities developed during early treatment, prior to any reduction in conventional measures of nerve function, suggesting that excitability parameters may provide a sensitive biomarker.
Neuroscience, Issue 62, Chemotherapy, Neurotoxicity, Neuropathy, Nerve excitability, Ion channel function, Oxaliplatin, oncology, medicine
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Movement Retraining using Real-time Feedback of Performance
Authors: Michael Anthony Hunt.
Institutions: University of British Columbia .
Any modification of movement - especially movement patterns that have been honed over a number of years - requires re-organization of the neuromuscular patterns responsible for governing the movement performance. This motor learning can be enhanced through a number of methods that are utilized in research and clinical settings alike. In general, verbal feedback of performance in real-time or knowledge of results following movement is commonly used clinically as a preliminary means of instilling motor learning. Depending on patient preference and learning style, visual feedback (e.g. through use of a mirror or different types of video) or proprioceptive guidance utilizing therapist touch, are used to supplement verbal instructions from the therapist. Indeed, a combination of these forms of feedback is commonplace in the clinical setting to facilitate motor learning and optimize outcomes. Laboratory-based, quantitative motion analysis has been a mainstay in research settings to provide accurate and objective analysis of a variety of movements in healthy and injured populations. While the actual mechanisms of capturing the movements may differ, all current motion analysis systems rely on the ability to track the movement of body segments and joints and to use established equations of motion to quantify key movement patterns. Due to limitations in acquisition and processing speed, analysis and description of the movements has traditionally occurred offline after completion of a given testing session. This paper will highlight a new supplement to standard motion analysis techniques that relies on the near instantaneous assessment and quantification of movement patterns and the display of specific movement characteristics to the patient during a movement analysis session. As a result, this novel technique can provide a new method of feedback delivery that has advantages over currently used feedback methods.
Medicine, Issue 71, Biophysics, Anatomy, Physiology, Physics, Biomedical Engineering, Behavior, Psychology, Kinesiology, Physical Therapy, Musculoskeletal System, Biofeedback, biomechanics, gait, movement, walking, rehabilitation, clinical, training
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Authors: Phoebe Spetsieris, Yilong Ma, Shichun Peng, Ji Hyun Ko, Vijay Dhawan, Chris C. Tang, David Eidelberg.
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16. We present an example of the application of this methodology to FDG PET data of Parkinson's Disease patients and normal controls using our in-house software to derive a characteristic covariance pattern biomarker of disease.
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
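The core SSM steps outlined in the abstract (logarithmic conversion, removal of subject and group means, PCA of the residual covariance, and logistic-regression weighting of component scores into a composite pattern) can be sketched with NumPy and scikit-learn; the data, array shapes, and component count below are synthetic placeholders, not the authors' in-house software.

```python
# Minimal SSM/PCA sketch: log-transform, double mean-centering, PCA,
# and logistic combination of subject scores (synthetic data only).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_subjects, n_voxels = 40, 2000
images = rng.lognormal(mean=2.0, sigma=0.3, size=(n_subjects, n_voxels))
is_patient = np.array([0] * 20 + [1] * 20)           # controls followed by patients

logged = np.log(images)
srp = logged - logged.mean(axis=1, keepdims=True)     # remove each subject's global mean
srp = srp - srp.mean(axis=0, keepdims=True)           # remove the group mean image

pca = PCA(n_components=5)
scores = pca.fit_transform(srp)                       # subject scores on each GIS-like component
patterns = pca.components_                            # spatial covariance patterns (voxel loadings)

# Combine components into a single disease-related pattern via logistic regression
clf = LogisticRegression().fit(scores, is_patient)
composite_pattern = clf.coef_ @ patterns              # weighted sum of component images
composite_scores = scores @ clf.coef_.ravel()         # one expression value per subject
print(composite_scores[:5])
```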
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
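A minimal flavor of the DoE workflow, building coded factor combinations, collecting a response, and estimating factor effects, is sketched below; the factor names, levels, and simulated response are hypothetical and much simpler than the multi-stage, software-guided designs used in the study.

```python
# Toy two-level full-factorial DoE: build the design, simulate a response,
# and estimate main effects by least squares (factor names are hypothetical).
import itertools
import numpy as np

factors = ["incubation_temp", "plant_age", "promoter_strength"]
levels = [-1, +1]                                      # coded low/high levels
design = np.array(list(itertools.product(levels, repeat=len(factors))))

rng = np.random.default_rng(2)
true_effects = np.array([1.5, -0.8, 2.0])              # pretend factor sensitivities
response = 10 + design @ true_effects + rng.normal(scale=0.3, size=len(design))

X = np.column_stack([np.ones(len(design)), design])    # intercept + coded factors
coefs, *_ = np.linalg.lstsq(X, response, rcond=None)
for name, effect in zip(["intercept"] + factors, coefs):
    print(f"{name}: {effect:+.2f}")
```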
Electrochemotherapy of Tumours
Authors: Gregor Sersa, Damijan Miklavcic.
Institutions: Institute of Oncology Ljubljana, University of Ljubljana.
Electrochemotherapy is the combined use of certain chemotherapeutic drugs and electric pulses applied to the treated tumour nodule. Local application of electric pulses to the tumour increases drug delivery into cells, specifically at the site of electric pulse application. Drug uptake by delivery of electric pulses is increased only for those chemotherapeutic drugs whose transport through the plasma membrane is impeded. Among the many drugs that have been tested so far, bleomycin and cisplatin found their way from preclinical testing to clinical use. Clinical data collected within a number of clinical studies indicate that approximately 80% of the treated cutaneous and subcutaneous tumour nodules of different malignancies show an objective response; of these, approximately 70% show a complete response after a single application of electrochemotherapy. Usually only one treatment is needed; however, electrochemotherapy can be repeated several times every few weeks with equal effectiveness each time. The treatment results in an effective eradication of the treated nodules, with a good cosmetic effect and without tissue scarring.
Medicine, Issue 22, electrochemotherapy, electroporation, cisplatin, bleomycin, malignant tumours, cutaneous lesions

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.