JoVE Visualize
Related JoVE Video
Pubmed Article
Interprofessional health education in Australia: three research projects informing curriculum renewal and development.
Appl Nurs Res
PUBLISHED: 03-08-2014
This paper reports on three interrelated Australian studies that provide a nationally coherent and evidence-informed approach to interprofessional education (IPE). Based on findings from previous studies that IPE tends to be marginalized in mainstream health curriculum, the three studies aspired to produce a range of resources that would guide the sustainable implementation of IPE across the Australian higher education sector.
Authors: Rital B. Bhavsar, Kenta Nakamura, Panagiotis A. Tsonis.
Published: 06-22-2011
Salamanders such as the newt and the axolotl can regenerate many lost body parts, including limbs, the tail with spinal cord, the eye, the brain, the heart, and the jaw1. Newts are unique in their capacity for lens regeneration. Upon lens removal, iris pigment epithelial (IPE) cells of the dorsal iris transdifferentiate into lens cells and eventually form a new lens in about a month2,3. This regenerative property is never exhibited by the ventral iris cells. The regeneration potential of the iris cells can be studied by making transplants of in vitro cultured IPE cells. For the culture, the dorsal and ventral iris cells are first isolated from the eye and cultured separately for two weeks (Figure 1). These cultured cells are then reaggregated and implanted back into the newt eye. Past studies have shown that the dorsal reaggregate maintains its lens-forming capacity whereas the ventral reaggregate does not form a lens, thus recapitulating the in vivo process (Figure 2)4,5. This system for determining the regeneration potential of dorsal and ventral iris cells is very useful for studying the role of genes and proteins involved in lens regeneration.
23 Related JoVE Articles!
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
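The keyword list above mentions thresholding as one of the semi-automated/automated segmentation routes. As a minimal, hedged illustration of that general category (not the authors' custom algorithms), the sketch below uses scikit-image, an assumed tool choice, to apply a global Otsu threshold to a 3D volume and label the surviving connected components; the synthetic volume and the size cutoff are assumptions for demonstration only.

# Minimal sketch of threshold-based segmentation of a 3D volume (illustrative only;
# the article's custom algorithms are not reproduced here). Requires numpy and scikit-image.
import numpy as np
from skimage import filters, measure

# Synthetic stand-in for a 3D EM volume (z, y, x); replace with real data, e.g. loaded via tifffile.
rng = np.random.default_rng(0)
volume = rng.normal(loc=100, scale=10, size=(32, 128, 128))
volume[10:20, 40:80, 40:80] += 60  # a bright "feature" to segment

# Global Otsu threshold separates feature voxels from background.
threshold = filters.threshold_otsu(volume)
mask = volume > threshold

# Label connected components and discard tiny objects (size cutoff is an arbitrary choice).
labels = measure.label(mask)
regions = [r for r in measure.regionprops(labels) if r.area >= 100]

print(f"Otsu threshold: {threshold:.1f}")
print(f"Objects retained after size filtering: {len(regions)}")
for r in regions:
    print(f"  label {r.label}: {r.area} voxels, centroid {tuple(round(c, 1) for c in r.centroid)}")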
Quantification of γH2AX Foci in Response to Ionising Radiation
Authors: Li-Jeen Mah, Raja S. Vasireddy, Michelle M. Tang, George T. Georgiadis, Assam El-Osta, Tom C. Karagiannis.
Institutions: The Alfred Medical Research and Education Precinct, The University of Melbourne, The Alfred Medical Research and Education Precinct.
DNA double-strand breaks (DSBs), which are induced either by endogenous metabolic processes or by exogenous sources, are among the most critical DNA lesions with respect to survival and preservation of genomic integrity. An early response to the induction of DSBs is phosphorylation of the H2A histone variant, H2AX, at the serine-139 residue, in the highly conserved C-terminal SQEY motif, forming γH2AX1. Following induction of DSBs, H2AX is rapidly phosphorylated by the phosphatidylinositol 3-kinase-like kinase (PIKK) family of proteins: ataxia telangiectasia mutated (ATM), the DNA-dependent protein kinase catalytic subunit (DNA-PKcs), and ATM and RAD3-related (ATR)2. Typically, only a few base pairs (bp) are implicated in a DSB; however, there is significant signal amplification, given the importance of chromatin modifications in DNA damage signalling and repair. Phosphorylation of H2AX, mediated predominantly by ATM, spreads to adjacent areas of chromatin, affecting approximately 0.03% of total cellular H2AX per DSB2,3. This corresponds to phosphorylation of approximately 2000 H2AX molecules spanning ~2 Mbp regions of chromatin surrounding the site of the DSB and results in the formation of discrete γH2AX foci which can be easily visualized and quantitated by immunofluorescence microscopy2. The loss of γH2AX at DSBs reflects repair; however, there is some controversy as to what defines complete repair of DSBs: it has been proposed that rejoining of both DNA strands is adequate, but it has also been suggested that re-instatement of the original state of chromatin compaction is necessary4-8. The disappearance of γH2AX involves, at least in part, dephosphorylation by phosphatases, phosphatase 2A and phosphatase 4C5,6. Further, removal of γH2AX by redistribution involving histone exchange with H2A.Z has been implicated7,8. Importantly, the quantitative analysis of γH2AX foci has led to a wide range of applications in medical and nuclear research. Here, we demonstrate the most commonly used immunofluorescence method for evaluation of initial DNA damage by detection and quantitation of γH2AX foci in γ-irradiated adherent human keratinocytes9.
Medicine, Issue 38, H2AX, DNA double-strand break, DNA damage, chromatin modification, repair, ionising radiation
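Foci counts are extracted from the microscope images either manually or with automated spot detection. As a hedged illustration of the automated route (not necessarily the workflow used in this article), the sketch below counts punctate foci in a single-nucleus image using Laplacian-of-Gaussian blob detection from scikit-image; the synthetic image and the detection parameters are assumptions.

# Minimal sketch of automated gammaH2AX focus counting in one nucleus (illustrative only;
# parameters are assumptions, not the authors' settings). Requires numpy and scikit-image.
import numpy as np
from skimage import feature

def count_foci(nucleus_img, min_sigma=1.0, max_sigma=5.0, threshold=0.1):
    """Count bright punctate foci in a 2D grayscale image of one nucleus."""
    img = nucleus_img.astype(float)
    img = (img - img.min()) / (np.ptp(img) + 1e-9)          # normalize to [0, 1]
    blobs = feature.blob_log(img, min_sigma=min_sigma,      # Laplacian-of-Gaussian blob detection
                             max_sigma=max_sigma, threshold=threshold)
    return len(blobs)

# Demo on a synthetic nucleus containing three Gaussian "foci".
yy, xx = np.mgrid[0:128, 0:128]
img = np.zeros((128, 128))
for cy, cx in [(40, 40), (70, 90), (100, 30)]:
    img += np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 2.0 ** 2))

print("Foci detected:", count_foci(img))  # expected: 3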
Handwriting Analysis Indicates Spontaneous Dyskinesias in Neuroleptic Naïve Adolescents at High Risk for Psychosis
Authors: Derek J. Dean, Hans-Leo Teulings, Michael Caligiuri, Vijay A. Mittal.
Institutions: University of Colorado Boulder, NeuroScript LLC, University of California, San Diego.
Growing evidence suggests that movement abnormalities are a core feature of psychosis. One marker of movement abnormality, dyskinesia, is a result of impaired neuromodulation of dopamine in fronto-striatal pathways. Traditional methods for identifying movement abnormalities include observer-based reports and force stability gauges. The drawbacks of these methods are long training times for raters, experimenter bias, large site differences in instrumental apparatus, and suboptimal reliability. Taking these drawbacks into account has guided the development of better standardized and more efficient procedures for examining movement abnormalities using handwriting analysis software and a digitizing tablet. Individuals at risk for psychosis showed significantly more dysfluent pen movements (a proximal measure for dyskinesia) in a handwriting task. Handwriting kinematics offers a great advance over previous methods of assessing dyskinesia, which could clearly be beneficial for understanding the etiology of psychosis.
Behavior, Issue 81, Schizophrenia, Disorders with Psychotic Features, Psychology, Clinical, Psychopathology, behavioral sciences, Movement abnormalities, Ultra High Risk, psychosis, handwriting, computer tablet, dyskinesia
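Dysfluency can be operationalized in several ways; one generic proxy is the number of velocity peaks per pen stroke, since a smooth stroke has a single, bell-shaped velocity profile. The sketch below illustrates that idea on synthetic pen trajectories; it is not the metric computed by the commercial software used in the study, and the trajectories are invented for demonstration.

# Generic sketch of one handwriting-dysfluency index: the number of tangential-velocity peaks
# per pen stroke (illustrative stand-in, not the study's actual metric). Requires numpy.
import numpy as np

def velocity_peaks(x, y, t):
    """Count local maxima in tangential pen velocity for one stroke."""
    x, y, t = map(np.asarray, (x, y, t))
    vx = np.gradient(x, t)
    vy = np.gradient(y, t)
    speed = np.hypot(vx, vy)
    # A sample is a peak if it exceeds both of its neighbours.
    return int(np.sum((speed[1:-1] > speed[:-2]) & (speed[1:-1] > speed[2:])))

# Demo: a smooth stroke vs. one with a superimposed tremor-like oscillation.
t = np.linspace(0, 1, 200)
smooth_x = 0.5 * (1 - np.cos(np.pi * t))            # bell-shaped velocity profile
shaky_x = smooth_x + 0.02 * np.sin(40 * np.pi * t)  # added dysfluent oscillation
y = np.zeros_like(t)

print("Smooth stroke velocity peaks:", velocity_peaks(smooth_x, y, t))
print("Dysfluent stroke velocity peaks:", velocity_peaks(shaky_x, y, t))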
Measuring Sensitivity to Viewpoint Change with and without Stereoscopic Cues
Authors: Jason Bell, Edwin Dickinson, David R. Badcock, Frederick A. A. Kingdom.
Institutions: Australian National University, University of Western Australia, McGill University.
The speed and accuracy of object recognition are compromised by a change in viewpoint, demonstrating that human observers are sensitive to this transformation. Here we discuss a novel method for simulating the appearance of an object that has undergone a rotation-in-depth, and include an exposition of the differences between perspective and orthographic projections. Next we describe a method by which human sensitivity to rotation-in-depth can be measured. Finally, we discuss an apparatus for creating a vivid percept of a 3-dimensional rotation-in-depth: the Wheatstone Eight Mirror Stereoscope. In doing so, we reveal a means by which to evaluate the role of stereoscopic cues in the discrimination of viewpoint-rotated shapes and objects.
Behavior, Issue 82, stereo, curvature, shape, viewpoint, 3D, object recognition, rotation-in-depth (RID)
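The distinction between perspective and orthographic projection mentioned above can be made concrete with a small worked example: rotate points in depth about the vertical axis, then project them either by dropping the depth coordinate (orthographic) or by scaling with distance from the viewer (perspective). The viewing distance and sample points below are arbitrary choices for illustration, not stimulus parameters from the article.

# Sketch of rotation-in-depth about the vertical (y) axis followed by orthographic and
# perspective projection onto the image plane. Requires numpy.
import numpy as np

def rotate_y(points, theta):
    """Rotate Nx3 points (x, y, z) by theta radians about the y axis."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, 0.0, s],
                  [0.0, 1.0, 0.0],
                  [-s, 0.0, c]])
    return points @ R.T

def orthographic(points):
    """Drop the depth coordinate: (x, y, z) -> (x, y)."""
    return points[:, :2]

def perspective(points, viewing_distance=5.0):
    """Scale by proximity to a viewpoint on the z axis at the given distance."""
    scale = viewing_distance / (viewing_distance - points[:, 2])
    return points[:, :2] * scale[:, None]

# A square contour lying in the x-y plane, rotated 30 degrees in depth.
square = np.array([[-1, -1, 0], [1, -1, 0], [1, 1, 0], [-1, 1, 0]], dtype=float)
rotated = rotate_y(square, np.deg2rad(30))

print("Orthographic projection:\n", np.round(orthographic(rotated), 3))
print("Perspective projection:\n", np.round(perspective(rotated), 3))  # nearer edge appears wider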
A Procedure to Observe Context-induced Renewal of Pavlovian-conditioned Alcohol-seeking Behavior in Rats
Authors: Jean-Marie Maddux, Franca Lacroix, Nadia Chaudhri.
Institutions: Concordia University.
Environmental contexts in which drugs of abuse are consumed can trigger craving, a subjective Pavlovian-conditioned response that can facilitate drug-seeking behavior and prompt relapse in abstinent drug users. We have developed a procedure to study the behavioral and neural processes that mediate the impact of context on alcohol-seeking behavior in rats. Following acclimation to the taste and pharmacological effects of 15% ethanol in the home cage, male Long-Evans rats receive Pavlovian discrimination training (PDT) in conditioning chambers. In each daily (Mon-Fri) PDT session, 16 trials each of two different 10 sec auditory conditioned stimuli occur. During one stimulus, the CS+, 0.2 ml of 15% ethanol is delivered into a fluid port for oral consumption. The second stimulus, the CS-, is not paired with ethanol. Across sessions, entries into the fluid port during the CS+ increase, whereas entries during the CS- stabilize at a lower level, indicating that a predictive association between the CS+ and ethanol is acquired. During PDT each chamber is equipped with a specific configuration of visual, olfactory and tactile contextual stimuli. Following PDT, extinction training is conducted in the same chamber that is now equipped with a different configuration of contextual stimuli. The CS+ and CS- are presented as before, but ethanol is withheld, which causes a gradual decline in port entries during the CS+. At test, rats are placed back into the PDT context and presented with the CS+ and CS- as before, but without ethanol. This manipulation triggers a robust and selective increase in the number of port entries made during the alcohol predictive CS+, with no change in responding during the CS-. This effect, referred to as context-induced renewal, illustrates the powerful capacity of contexts associated with alcohol consumption to stimulate alcohol-seeking behavior in response to Pavlovian alcohol cues.
Behavior, Issue 91, Behavioral neuroscience, alcoholism, relapse, addiction, Pavlovian conditioning, ethanol, reinstatement, discrimination, conditioned approach
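The key dependent measure in this procedure is the number of fluid-port entries during the CS+ versus the CS- across phases. A minimal sketch of how such counts can be summarized as discrimination scores is shown below; the trial-by-trial numbers are hypothetical and are not taken from the article.

# Generic sketch of summarizing Pavlovian discrimination from port-entry counts:
# mean entries during the CS+ and CS-, plus a CS+ minus CS- difference score per phase.
# Example numbers are hypothetical.
from statistics import mean

# Port entries on each of 16 CS+ and 16 CS- trials for one rat in one session per phase.
sessions = {
    "PDT (alcohol context)":    {"CS+": [4, 5, 6, 5, 7, 6, 5, 6, 7, 6, 5, 6, 7, 6, 6, 5],
                                 "CS-": [2, 1, 2, 1, 1, 2, 1, 1, 2, 1, 1, 1, 2, 1, 1, 1]},
    "Extinction (new context)": {"CS+": [2, 2, 1, 2, 1, 1, 1, 1, 1, 0, 1, 1, 0, 1, 0, 0],
                                 "CS-": [1, 1, 1, 0, 1, 1, 0, 1, 0, 1, 0, 0, 1, 0, 0, 0]},
    "Test (alcohol context)":   {"CS+": [5, 6, 5, 6, 5, 4, 5, 5, 6, 5, 4, 5, 5, 4, 5, 5],
                                 "CS-": [1, 1, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 0, 1, 1, 0]},
}

for phase, trials in sessions.items():
    cs_plus, cs_minus = mean(trials["CS+"]), mean(trials["CS-"])
    print(f"{phase}: CS+ = {cs_plus:.2f}, CS- = {cs_minus:.2f}, "
          f"discrimination (CS+ minus CS-) = {cs_plus - cs_minus:.2f}")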
Surface Renewal: An Advanced Micrometeorological Method for Measuring and Processing Field-Scale Energy Flux Density Data
Authors: Andrew J. McElrone, Thomas M. Shapland, Arturo Calderon, Li Fitzmaurice, Kyaw Tha Paw U, Richard L. Snyder.
Institutions: United States Department of Agriculture-Agricultural Research Service, University of California, Davis, University of Chile, University of California, Davis, URS Corporation Australia Pty. Ltd.
Advanced micrometeorological methods have become increasingly important in soil, crop, and environmental sciences. For many scientists without formal training in atmospheric science, these techniques are relatively inaccessible. Surface renewal and other flux measurement methods require an understanding of boundary layer meteorology and extensive training in instrumentation and multiple data management programs. To improve accessibility of these techniques, we describe the underlying theory of surface renewal measurements, demonstrate how to set up a field station for surface renewal with eddy covariance calibration, and utilize our open-source turnkey data logger program to perform flux data acquisition and processing. The new turnkey program returns to the user a simple data table with the corrected fluxes and quality control parameters, and eliminates the need for researchers to shuttle between multiple processing programs to obtain the final flux data. An example of data generated from these measurements demonstrates how crop water use is measured with this technique. The output information is useful to growers for making irrigation decisions in a variety of agricultural ecosystems. These stations are currently deployed in numerous field experiments by researchers in our group and the California Department of Water Resources in the following crops: rice, wine and raisin grape vineyards, alfalfa, almond, walnut, peach, lemon, avocado, and corn.
Environmental Sciences, Issue 82, Conservation of Natural Resources, Engineering, Agriculture, plants, energy balance, irrigated agriculture, flux data, evapotranspiration, agrometeorology
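The downstream arithmetic that turns the station's corrected fluxes into crop water use is straightforward: latent heat flux is taken as the energy-balance residual (LE = Rn - G - H) and converted to an evapotranspiration rate using the latent heat of vaporization (about 2.45 MJ per kg of water near 20 degC). The sketch below illustrates that conversion; the half-hourly flux values are hypothetical, not output from the turnkey program.

# Minimal sketch of converting energy-balance components (W/m^2) to evapotranspiration.
LATENT_HEAT_VAPORIZATION = 2.45e6   # J per kg of water, approximate value near 20 degC

def evapotranspiration_mm_per_hr(net_radiation, soil_heat_flux, sensible_heat_flux):
    """Return ET (mm/hr) from energy-balance components given in W/m^2."""
    latent_heat_flux = net_radiation - soil_heat_flux - sensible_heat_flux  # LE = Rn - G - H
    evaporation_rate = latent_heat_flux / LATENT_HEAT_VAPORIZATION          # kg m^-2 s^-1 == mm/s
    return evaporation_rate * 3600.0                                        # mm per hour

# Example half-hourly record (W/m^2): Rn = 520, G = 60, surface-renewal H = 140.
print(f"ET = {evapotranspiration_mm_per_hr(520.0, 60.0, 140.0):.2f} mm/hr")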
Identification of Key Factors Regulating Self-renewal and Differentiation in EML Hematopoietic Precursor Cells by RNA-sequencing Analysis
Authors: Shan Zong, Shuyun Deng, Kenian Chen, Jia Qian Wu.
Institutions: The University of Texas Graduate School of Biomedical Sciences at Houston.
Hematopoietic stem cells (HSCs) are used clinically in transplantation to rebuild a patient's hematopoietic system in many diseases such as leukemia and lymphoma. Elucidating the mechanisms controlling HSC self-renewal and differentiation is important for the application of HSCs in research and clinical use. However, it is not possible to obtain large quantities of HSCs due to their inability to proliferate in vitro. To overcome this hurdle, we used a mouse bone marrow-derived cell line, the EML (Erythroid, Myeloid, and Lymphocytic) cell line, as a model system for this study. RNA-sequencing (RNA-Seq) has been increasingly used to replace microarrays for gene expression studies. We report here a detailed method of using RNA-Seq technology to investigate the potential key factors in the regulation of EML cell self-renewal and differentiation. The protocol provided in this paper is divided into three parts. The first part explains how to culture EML cells and separate Lin-CD34+ and Lin-CD34- cells. The second part of the protocol offers detailed procedures for total RNA preparation and the subsequent library construction for high-throughput sequencing. The last part describes the method for RNA-Seq data analysis and explains how to use the data to identify differentially expressed transcription factors between Lin-CD34+ and Lin-CD34- cells. The most significantly differentially expressed transcription factors were identified as potential key regulators controlling EML cell self-renewal and differentiation. In the discussion section of this paper, we highlight the key steps for successful performance of this experiment. In summary, this paper offers a method of using RNA-Seq technology to identify potential regulators of self-renewal and differentiation in EML cells. The key factors identified can then be subjected to downstream functional analysis in vitro and in vivo.
Genetics, Issue 93, EML Cells, Self-renewal, Differentiation, Hematopoietic precursor cell, RNA-Sequencing, Data analysis
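The final analysis step, identifying differentially expressed transcription factors between the two populations, is normally done with dedicated statistical packages such as DESeq2 or edgeR. As a deliberately simplified sketch of the idea only, the code below computes log2 fold changes from normalized expression values and filters to an annotated transcription-factor list; gene names, values, and thresholds are hypothetical.

# Simplified sketch of screening for differentially expressed transcription factors
# between Lin-CD34+ and Lin-CD34- cells (illustrative; not a substitute for DESeq2/edgeR).
import numpy as np
import pandas as pd

# Normalized expression per gene in the two populations (hypothetical numbers).
expression = pd.DataFrame(
    {"gene": ["Gata2", "Myb", "Cebpa", "Actb", "Tcf3"],
     "cd34_pos": [85.0, 120.0, 4.0, 950.0, 30.0],
     "cd34_neg": [10.0, 15.0, 60.0, 900.0, 28.0]})

# Known transcription factors (in practice taken from an annotation database).
transcription_factors = {"Gata2", "Myb", "Cebpa", "Tcf3"}

pseudocount = 1.0  # avoid division by zero for unexpressed genes
expression["log2_fc"] = np.log2((expression["cd34_pos"] + pseudocount) /
                                (expression["cd34_neg"] + pseudocount))

candidates = expression[(expression["gene"].isin(transcription_factors)) &
                        (expression["log2_fc"].abs() >= 1.0)]
print(candidates.sort_values("log2_fc", ascending=False).to_string(index=False))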
Modeling Astrocytoma Pathogenesis In Vitro and In Vivo Using Cortical Astrocytes or Neural Stem Cells from Conditional, Genetically Engineered Mice
Authors: Robert S. McNeill, Ralf S. Schmid, Ryan E. Bash, Mark Vitucci, Kristen K. White, Andrea M. Werneke, Brian H. Constance, Byron Huff, C. Ryan Miller.
Institutions: University of North Carolina School of Medicine, University of North Carolina School of Medicine, University of North Carolina School of Medicine, University of North Carolina School of Medicine, University of North Carolina School of Medicine, Emory University School of Medicine, University of North Carolina School of Medicine.
Current astrocytoma models are limited in their ability to define the roles of oncogenic mutations in specific brain cell types during disease pathogenesis and their utility for preclinical drug development. In order to design a better model system for these applications, phenotypically wild-type cortical astrocytes and neural stem cells (NSC) from conditional, genetically engineered mice (GEM) that harbor various combinations of floxed oncogenic alleles were harvested and grown in culture. Genetic recombination was induced in vitro using adenoviral Cre-mediated recombination, resulting in expression of mutated oncogenes and deletion of tumor suppressor genes. The phenotypic consequences of these mutations were defined by measuring proliferation, transformation, and drug response in vitro. Orthotopic allograft models, whereby transformed cells are stereotactically injected into the brains of immune-competent, syngeneic littermates, were developed to define the role of oncogenic mutations and cell type on tumorigenesis in vivo. Unlike most established human glioblastoma cell line xenografts, injection of transformed GEM-derived cortical astrocytes into the brains of immune-competent littermates produced astrocytomas, including the most aggressive subtype, glioblastoma, that recapitulated the histopathological hallmarks of human astrocytomas, including diffuse invasion of normal brain parenchyma. Bioluminescence imaging of orthotopic allografts from transformed astrocytes engineered to express luciferase was utilized to monitor in vivo tumor growth over time. Thus, astrocytoma models using astrocytes and NSC harvested from GEM with conditional oncogenic alleles provide an integrated system to study the genetics and cell biology of astrocytoma pathogenesis in vitro and in vivo and may be useful in preclinical drug development for these devastating diseases.
Neuroscience, Issue 90, astrocytoma, cortical astrocytes, genetically engineered mice, glioblastoma, neural stem cells, orthotopic allograft
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
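One standard way to examine category structure along a morph continuum like the DHL is to fit a sigmoid to the proportion of "human" categorizations at each morph level and read off the boundary and slope. The sketch below shows that analysis in generic form; the response data are hypothetical, and this is offered as an illustration rather than the article's own analysis.

# Sketch of locating a category boundary along an avatar-to-human morph continuum by fitting
# a logistic function to categorization responses. Data points are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, boundary, slope):
    """Proportion of 'human' responses as a function of morph level."""
    return 1.0 / (1.0 + np.exp(-slope * (x - boundary)))

morph_level = np.linspace(0, 100, 11)                      # 0 = avatar, 100 = human
p_human = np.array([0.02, 0.03, 0.05, 0.10, 0.25, 0.55,    # hypothetical group means
                    0.80, 0.92, 0.97, 0.98, 0.99])

(boundary, slope), _ = curve_fit(logistic, morph_level, p_human, p0=[50.0, 0.1])
print(f"Estimated category boundary: {boundary:.1f}% morph, slope {slope:.3f}")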
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
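The source-analysis step described above can be sketched in code. The example below uses the open-source MNE-Python package to compute a minimum-norm estimate from a precomputed forward model; this package choice is an assumption for illustration (the article does not state which software the London Baby Lab pipeline uses), and the file names, event codes, and filter settings are placeholders.

# Hedged sketch of a minimum-norm source analysis pipeline with MNE-Python (assumed tooling;
# file names are placeholders). Requires the mne package and suitable input files.
import mne

raw = mne.io.read_raw_fif("child_eeg_raw.fif", preload=True)   # high-density EEG recording
raw.filter(l_freq=1.0, h_freq=40.0)                            # band-pass for ERP analysis

events = mne.find_events(raw)                                  # stimulus triggers
epochs = mne.Epochs(raw, events, event_id={"stimulus": 1},
                    tmin=-0.2, tmax=0.8, baseline=(None, 0.0))
evoked = epochs.average()                                      # event-related potential

noise_cov = mne.compute_covariance(epochs, tmax=0.0)           # noise estimate from the baseline
# Forward model built beforehand from the child's structural MRI or an age-matched template.
fwd = mne.read_forward_solution("child_fwd.fif")

inv = mne.minimum_norm.make_inverse_operator(evoked.info, fwd, noise_cov)
stc = mne.minimum_norm.apply_inverse(evoked, inv, lambda2=1.0 / 9.0, method="MNE")
print(stc)                                                     # cortical source estimate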
Flexible Colonoscopy in Mice to Evaluate the Severity of Colitis and Colorectal Tumors Using a Validated Endoscopic Scoring System
Authors: Tomohiro Kodani, Alex Rodriguez-Palacios, Daniele Corridoni, Loris Lopetuso, Luca Di Martino, Brian Marks, James Pizarro, Theresa Pizarro, Amitabh Chak, Fabio Cominelli.
Institutions: Case Western Reserve University School of Medicine, Cleveland, Case Western Reserve University School of Medicine, Cleveland, Case Western Reserve University School of Medicine, Cleveland.
The use of modern endoscopy for research purposes has greatly facilitated our understanding of gastrointestinal pathologies. In particular, experimental endoscopy has been highly useful for studies that require repeated assessments in a single laboratory animal, such as those evaluating mechanisms of chronic inflammatory bowel disease and the progression of colorectal cancer. However, the methods used across studies are highly variable. At least three endoscopic scoring systems have been published for murine colitis and published protocols for the assessment of colorectal tumors fail to address the presence of concomitant colonic inflammation. This study develops and validates a reproducible endoscopic scoring system that integrates evaluation of both inflammation and tumors simultaneously. This novel scoring system has three major components: 1) assessment of the extent and severity of colorectal inflammation (based on perianal findings, transparency of the wall, mucosal bleeding, and focal lesions), 2) quantitative recording of tumor lesions (grid map and bar graph), and 3) numerical sorting of clinical cases by their pathological and research relevance based on decimal units with assigned categories of observed lesions and endoscopic complications (decimal identifiers). The video and manuscript presented herein were prepared, following IACUC-approved protocols, to allow investigators to score their own experimental mice using a well-validated and highly reproducible endoscopic methodology, with the system option to differentiate distal from proximal endoscopic colitis (D-PECS).
Medicine, Issue 80, Crohn's disease, ulcerative colitis, colon cancer, Clostridium difficile, SAMP mice, DSS/AOM-colitis, decimal scoring identifier
Community-based Adapted Tango Dancing for Individuals with Parkinson's Disease and Older Adults
Authors: Madeleine E. Hackney, Kathleen McKee.
Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.
Adapted tango dancing improves mobility and balance in older adults and in other populations with balance impairments. It is composed of very simple step elements and involves movement initiation and cessation, multi-directional perturbations, and varied speeds and rhythms. Focus on foot placement, whole body coordination, and attention to partner, path of movement, and aesthetics likely underlie adapted tango's demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology used to disseminate the adapted tango teaching methods to dance instructor trainees and to implement adapted tango in the community for older adults and individuals with Parkinson's Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, Tandem stance, Berg Balance Scale, Gait Speed and 30 sec chair stand), safety, and fidelity of the program are maximized through targeted instructor and volunteer training and a structured, detailed syllabus outlining class practices and progression.
Behavior, Issue 94, Dance, tango, balance, pedagogy, dissemination, exercise, older adults, Parkinson's Disease, mobility impairments, falls
Aseptic Laboratory Techniques: Plating Methods
Authors: Erin R. Sanders.
Institutions: University of California, Los Angeles .
Microorganisms are present on all inanimate surfaces creating ubiquitous sources of possible contamination in the laboratory. Experimental success relies on the ability of a scientist to sterilize work surfaces and equipment as well as prevent contact of sterile instruments and solutions with non-sterile surfaces. Here we present the steps for several plating methods routinely used in the laboratory to isolate, propagate, or enumerate microorganisms such as bacteria and phage. All five methods incorporate aseptic technique, or procedures that maintain the sterility of experimental materials. Procedures described include (1) streak-plating bacterial cultures to isolate single colonies, (2) pour-plating and (3) spread-plating to enumerate viable bacterial colonies, (4) soft agar overlays to isolate phage and enumerate plaques, and (5) replica-plating to transfer cells from one plate to another in an identical spatial pattern. These procedures can be performed at the laboratory bench, provided they involve non-pathogenic strains of microorganisms (Biosafety Level 1, BSL-1). If working with BSL-2 organisms, then these manipulations must take place in a biosafety cabinet. Consult the most current edition of the Biosafety in Microbiological and Biomedical Laboratories (BMBL) as well as Material Safety Data Sheets (MSDS) for Infectious Substances to determine the biohazard classification as well as the safety precautions and containment facilities required for the microorganism in question. Bacterial strains and phage stocks can be obtained from research investigators, companies, and collections maintained by particular organizations such as the American Type Culture Collection (ATCC). It is recommended that non-pathogenic strains be used when learning the various plating methods. By following the procedures described in this protocol, students should be able to:
● Perform plating procedures without contaminating media.
● Isolate single bacterial colonies by the streak-plating method.
● Use pour-plating and spread-plating methods to determine the concentration of bacteria.
● Perform soft agar overlays when working with phage.
● Transfer bacterial cells from one plate to another using the replica-plating procedure.
● Given an experimental task, select the appropriate plating method.
Basic Protocols, Issue 63, Streak plates, pour plates, soft agar overlays, spread plates, replica plates, bacteria, colonies, phage, plaques, dilutions
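The pour-plate and spread-plate counts are converted to a titer with the standard dilution arithmetic, CFU/mL = colonies counted / (volume plated in mL x dilution plated). The numbers in the small sketch below are hypothetical.

# Standard plate-count arithmetic for spread or pour plates (example numbers are hypothetical).
def cfu_per_ml(colonies, volume_plated_ml, dilution):
    """Colony-forming units per mL of the original, undiluted culture."""
    return colonies / (volume_plated_ml * dilution)

# Example: 127 colonies counted on a plate spread with 0.1 mL of a 10^-6 dilution.
original_titer = cfu_per_ml(colonies=127, volume_plated_ml=0.1, dilution=1e-6)
print(f"Original culture: {original_titer:.2e} CFU/mL")   # 1.27e+09 CFU/mL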
Clonogenic Assay: Adherent Cells
Authors: Haloom Rafehi, Christian Orlowski, George T. Georgiadis, Katherine Ververis, Assam El-Osta, Tom C. Karagiannis.
Institutions: The Alfred Medical Research and Education Precinct, The University of Melbourne, The Alfred Medical Research and Education Precinct, The University of Melbourne.
The clonogenic (or colony forming) assay has been established for more than 50 years; the original paper describing the technique was published in 19561. Apart from documenting the method, the initial landmark study generated the first radiation-dose response curve for X-ray irradiated mammalian (HeLa) cells in culture1. Basically, the clonogenic assay enables an assessment of the differences in reproductive viability (the capacity of cells to produce progeny; i.e. of a single cell to form a colony of 50 or more cells) between control untreated cells and cells that have undergone various treatments, such as exposure to ionising radiation, various chemical compounds (e.g. cytotoxic agents) or, in other cases, genetic manipulation. The assay has become the most widely accepted technique in radiation biology and has been widely used for evaluating the radiation sensitivity of different cell lines. Further, the clonogenic assay is commonly used for monitoring the efficacy of radiation modifying compounds and for determining the effects of cytotoxic agents and other anti-cancer therapeutics on colony forming ability in different cell lines. A typical clonogenic survival experiment using adherent cell lines involves three distinct components: 1) treatment of the cell monolayer in tissue culture flasks, 2) preparation of single-cell suspensions and plating an appropriate number of cells in petri dishes, and 3) fixing and staining colonies following a relevant incubation period, which could range from 1-3 weeks, depending on the cell line. Here we demonstrate the general procedure for performing the clonogenic assay with adherent cell lines using an immortalized human keratinocyte cell line (FEP-1811)2. Also, our aims are to describe common features of clonogenic assays, including calculation of the plating efficiency and survival fractions after exposure of cells to radiation, and to exemplify modification of the radiation response with the use of a natural antioxidant formulation.
Cellular Biology, Issue 49, clonogenic assay, clonogenic survival, colony staining, colony counting, radiation sensitivity, radiation modification
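The two quantities named above follow standard formulas: plating efficiency (PE) is the fraction of untreated cells seeded that form colonies, and the surviving fraction (SF) of a treated dish is its colony yield per cell seeded, normalized by the PE. The example numbers below are hypothetical.

# Standard clonogenic-assay arithmetic (example numbers are hypothetical):
#   plating efficiency  PE = colonies counted / cells seeded   (untreated control)
#   surviving fraction  SF = (colonies counted / cells seeded) / PE   (treated dishes)
def plating_efficiency(colonies, cells_seeded):
    return colonies / cells_seeded

def surviving_fraction(colonies, cells_seeded, pe):
    return (colonies / cells_seeded) / pe

pe = plating_efficiency(colonies=180, cells_seeded=200)             # untreated control
sf_2gy = surviving_fraction(colonies=90, cells_seeded=400, pe=pe)   # e.g. dishes irradiated with 2 Gy
print(f"Plating efficiency: {pe:.2f}")              # 0.90
print(f"Surviving fraction at 2 Gy: {sf_2gy:.2f}")  # 0.25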
Quantitation of γH2AX Foci in Tissue Samples
Authors: Michelle M. Tang, Li-Jeen Mah, Raja S. Vasireddy, George T. Georgiadis, Assam El-Osta, Simon G. Royce, Tom C. Karagiannis.
Institutions: The Alfred Medical Research and Education Precinct, The Alfred Medical Research and Education Precinct, The University of Melbourne, Royal Children's Hospital, The University of Melbourne.
DNA double-strand breaks (DSBs) are particularly lethal and genotoxic lesions that can arise either from endogenous (physiological or pathological) processes or from exogenous factors, particularly ionizing radiation and radiomimetic compounds. Phosphorylation of the H2A histone variant, H2AX, at the serine-139 residue, in the highly conserved C-terminal SQEY motif, forming γH2AX, is an early response to DNA double-strand breaks1. This phosphorylation event is mediated by the phosphatidylinositol 3-kinase (PI3K) family of proteins: ataxia telangiectasia mutated (ATM), the DNA-dependent protein kinase catalytic subunit, and ATM and RAD3-related (ATR)2. Overall, DSB induction results in the formation of discrete nuclear γH2AX foci which can be easily detected and quantitated by immunofluorescence microscopy2. Given the unique specificity and sensitivity of this marker, analysis of γH2AX foci has led to a wide range of applications in biomedical research, particularly in radiation biology and nuclear medicine. The quantitation of γH2AX foci has been most widely investigated in cell culture systems in the context of ionizing radiation-induced DSBs. Apart from cellular radiosensitivity, immunofluorescence-based assays have also been used to evaluate the efficacy of radiation-modifying compounds. In addition, γH2AX has been used as a molecular marker to examine the efficacy of various DSB-inducing compounds and has recently been heralded as an important marker of ageing and disease, particularly cancer3. Further, immunofluorescence-based methods have been adapted to suit detection and quantitation of γH2AX foci ex vivo and in vivo4,5. Here, we demonstrate a typical immunofluorescence method for detection and quantitation of γH2AX foci in mouse tissues.
Cellular Biology, Issue 40, immunofluorescence, DNA double-strand breaks, histone variant, H2AX, DNA damage, ionising radiation, reactive oxygen species
Evaluation of the Spatial Distribution of γH2AX following Ionizing Radiation
Authors: Raja S. Vasireddy, Michelle M. Tang, Li-Jeen Mah, George T. Georgiadis, Assam El-Osta, Tom C. Karagiannis.
Institutions: The Alfred Medical Research and Education Precinct, The Alfred Medical Research and Education Precinct, University of Melbourne.
An early molecular response to DNA double-strand breaks (DSBs) is phosphorylation of the Ser-139 residue within the terminal SQEY motif of the histone H2AX1,2. This phosphorylation of H2AX is mediated by the phosphatidylinositol 3-kinase (PI3K) family of proteins: ataxia telangiectasia mutated (ATM), the DNA-dependent protein kinase catalytic subunit, and ATM and RAD3-related (ATR)3. The phosphorylated form of H2AX, referred to as γH2AX, spreads to adjacent regions of chromatin from the site of the DSB, forming discrete foci, which are easily visualized by immunofluorescence microscopy3. Analysis and quantitation of γH2AX foci have been widely used to evaluate DSB formation and repair, particularly in response to ionizing radiation and for evaluating the efficacy of various radiation-modifying compounds and cytotoxic compounds4. Given the exquisite specificity and sensitivity of this de novo marker of DSBs, it has provided new insights into the processes of DNA damage and repair in the context of chromatin. For example, in radiation biology the central paradigm is that the nuclear DNA is the critical target with respect to radiation sensitivity. Indeed, the general consensus in the field has largely been to view chromatin as a homogeneous template for DNA damage and repair. However, with the use of γH2AX as a molecular marker of DSBs, a disparity in γ-irradiation-induced γH2AX foci formation in euchromatin and heterochromatin has been observed5-7. Recently, we used a panel of antibodies to mono-, di-, or tri-methylated histone H3 at lysine 9 (H3K9me1, H3K9me2, H3K9me3), which are epigenetic imprints of constitutive heterochromatin and transcriptional silencing, and at lysine 4 (H3K4me1, H3K4me2, H3K4me3), which are tightly correlated with actively transcribing euchromatic regions, to investigate the spatial distribution of γH2AX following ionizing radiation8. In accordance with the prevailing ideas regarding chromatin biology, our findings indicated a close correlation between γH2AX formation and active transcription9. Here we demonstrate our immunofluorescence method for detection and quantitation of γH2AX foci in non-adherent cells, with a particular focus on co-localization with other epigenetic markers, image analysis, and 3D-modeling.
Cellular Biology, Issue 42, H2AX, radiation, euchromatin, heterochromatin, immunofluorescence, 3D-modeling
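A common first-pass co-localization statistic for two fluorescence channels, such as γH2AX and a histone-methylation mark, is Pearson's correlation computed over nuclear pixels. The sketch below is a generic illustration with synthetic images; it is not the image-analysis pipeline used in the article.

# Sketch of a first-pass co-localization measure: Pearson's correlation between the gammaH2AX
# channel and a histone-mark channel within the nuclear mask (synthetic demo data).
import numpy as np

def pearson_colocalization(channel_a, channel_b, nucleus_mask):
    """Pearson's r between two fluorescence channels within the nuclear mask."""
    a = channel_a[nucleus_mask].astype(float)
    b = channel_b[nucleus_mask].astype(float)
    return np.corrcoef(a, b)[0, 1]

# Synthetic demo: a circular nucleus where channel B partially tracks channel A.
rng = np.random.default_rng(1)
yy, xx = np.mgrid[0:128, 0:128]
mask = (yy - 64) ** 2 + (xx - 64) ** 2 < 50 ** 2
gamma_h2ax = rng.random((128, 128))
h3k4me3 = 0.7 * gamma_h2ax + 0.3 * rng.random((128, 128))   # partially correlated signal

print(f"Pearson r inside the nucleus: {pearson_colocalization(gamma_h2ax, h3k4me3, mask):.2f}")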
Measuring Oral Fatty Acid Thresholds, Fat Perception, Fatty Food Liking, and Papillae Density in Humans
Authors: Rivkeh Y. Haryono, Madeline A. Sprajcer, Russell S. J. Keast.
Institutions: Deakin University.
Emerging evidence from a number of laboratories indicates that humans have the ability to identify fatty acids in the oral cavity, presumably via fatty acid receptors housed on taste cells. Previous research has shown that an individual's oral sensitivity to fatty acid, specifically oleic acid (C18:1), is associated with body mass index (BMI), dietary fat consumption, and the ability to identify fat in foods. We have developed a reliable and reproducible method to assess oral chemoreception of fatty acids, using a milk and C18:1 emulsion, together with an ascending forced choice triangle procedure. In parallel, a food matrix has been developed to assess an individual's ability to perceive fat, in addition to a simple method to assess fatty food liking. As an added measure, tongue photography is used to assess papillae density, with higher density often being associated with increased taste sensitivity.
Neuroscience, Issue 88, taste, overweight and obesity, dietary fat, fatty acid, diet, fatty food liking, detection threshold
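Scoring an ascending forced-choice triangle series reduces to finding the lowest concentration at which the participant reliably picks the odd sample. The sketch below uses one common convention, taking the threshold as the first concentration identified correctly on three consecutive presentations; this rule, the concentrations, and the trial data are assumptions for illustration and not necessarily the article's exact criterion.

# Generic sketch of scoring an ascending forced-choice triangle series for oleic acid (C18:1).
def detection_threshold(trials, consecutive_required=3):
    """trials: list of (concentration_mM, correct_bool) in ascending presentation order."""
    streak, streak_conc = 0, None
    for conc, correct in trials:
        if not correct:
            streak, streak_conc = 0, None      # any miss resets the run
            continue
        if conc == streak_conc:
            streak += 1
        else:
            streak, streak_conc = 1, conc
        if streak >= consecutive_required:
            return streak_conc
    return None  # participant never reached the criterion

# Hypothetical participant: misses the weakest emulsions, then passes 2.8 mM three times in a row.
trials = [(0.02, False), (0.06, False), (1.0, True), (1.4, False),
          (2.8, True), (2.8, True), (2.8, True)]
print("Estimated C18:1 detection threshold (mM):", detection_threshold(trials))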
Developing Neuroimaging Phenotypes of the Default Mode Network in PTSD: Integrating the Resting State, Working Memory, and Structural Connectivity
Authors: Noah S. Philip, S. Louisa Carpenter, Lawrence H. Sweet.
Institutions: Alpert Medical School, Brown University, University of Georgia.
Complementary structural and functional neuroimaging techniques used to examine the Default Mode Network (DMN) could potentially improve assessments of psychiatric illness severity and provide added validity to the clinical diagnostic process. Recent neuroimaging research suggests that DMN processes may be disrupted in a number of stress-related psychiatric illnesses, such as posttraumatic stress disorder (PTSD). Although specific DMN functions remain under investigation, it is generally thought to be involved in introspection and self-processing. In healthy individuals it exhibits greatest activity during periods of rest, with less activity, observed as deactivation, during cognitive tasks, e.g., working memory. This network consists of the medial prefrontal cortex, posterior cingulate cortex/precuneus, lateral parietal cortices and medial temporal regions. Multiple functional and structural imaging approaches have been developed to study the DMN. These have unprecedented potential to further the understanding of the function and dysfunction of this network. Functional approaches, such as the evaluation of resting state connectivity and task-induced deactivation, have excellent potential to identify targeted neurocognitive and neuroaffective (functional) diagnostic markers and may indicate illness severity and prognosis with increased accuracy or specificity. Structural approaches, such as evaluation of morphometry and connectivity, may provide unique markers of etiology and long-term outcomes. Combined, functional and structural methods provide strong multimodal, complementary and synergistic approaches to develop valid DMN-based imaging phenotypes in stress-related psychiatric conditions. This protocol aims to integrate these methods to investigate DMN structure and function in PTSD, relating findings to illness severity and relevant clinical factors.
Medicine, Issue 89, default mode network, neuroimaging, functional magnetic resonance imaging, diffusion tensor imaging, structural connectivity, functional connectivity, posttraumatic stress disorder
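Resting-state connectivity of the DMN is often evaluated with a seed-based approach: the time series of a posterior cingulate/precuneus seed is correlated with other regions' time series, and the correlations are Fisher z-transformed for group statistics. The sketch below illustrates that computation on synthetic signals standing in for preprocessed BOLD data; it is a generic illustration, not the article's protocol.

# Sketch of seed-based resting-state functional connectivity for DMN regions (synthetic data).
import numpy as np

rng = np.random.default_rng(42)
n_timepoints = 240                                  # e.g. 8 min of data at TR = 2 s

# A shared slow fluctuation drives the DMN nodes; a control region is independent noise.
dmn_fluctuation = np.cumsum(rng.normal(size=n_timepoints))
regions = {
    "PCC_seed":         dmn_fluctuation + 0.5 * rng.normal(size=n_timepoints),
    "medial_PFC":       dmn_fluctuation + 0.8 * rng.normal(size=n_timepoints),
    "lateral_parietal": dmn_fluctuation + 0.8 * rng.normal(size=n_timepoints),
    "control_region":   np.cumsum(rng.normal(size=n_timepoints)),
}

seed = regions["PCC_seed"]
for name, ts in regions.items():
    if name == "PCC_seed":
        continue
    r = np.corrcoef(seed, ts)[0, 1]
    z = np.arctanh(r)                               # Fisher z-transform for group statistics
    print(f"{name:>16}: r = {r:.2f}, Fisher z = {z:.2f}")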
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Authors: Richard A. Rudick, Deborah Miller, Francois Bethoux, Stephen M. Rao, Jar-Chi Lee, Darlene Stough, Christine Reece, David Schindler, Bernadett Mamone, Jay Alberts.
Institutions: Cleveland Clinic Foundation, Cleveland Clinic Foundation, Cleveland Clinic Foundation, Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out of clinic settings, like the patient’s home, thereby providing more meaningful real world data. The MSPT represents a new paradigm for neuroperformance testing. This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
Using Whole Mount in situ Hybridization to Link Molecular and Organismal Biology
Authors: Nicole L. Jacobs, R. Craig Albertson, Jason R. Wiles.
Institutions: Syracuse University, Syracuse University.
Whole mount in situ hybridization (WISH) is a common technique in molecular biology laboratories used to study gene expression through the localization of specific mRNA transcripts within whole mount specimen. This technique (adapted from Albertson and Yelick, 2005) was used in an upper level undergraduate Comparative Vertebrate Biology laboratory classroom at Syracuse University. The first two thirds of the Comparative Vertebrate Biology lab course gave students the opportunity to study the embryology and gross anatomy of several organisms representing various chordate taxa primarily via traditional dissections and the use of models. The final portion of the course involved an innovative approach to teaching anatomy through observation of vertebrate development employing molecular techniques in which WISH was performed on zebrafish embryos. A heterozygous fibroblast growth factor 8 a (fgf8a) mutant line, ace, was used. Due to Mendelian inheritance, ace intercrosses produced wild type, heterozygous, and homozygous ace/fgf8a mutants in a 1:2:1 ratio. RNA probes with known expression patterns in the midline and in developing anatomical structures such as the heart, somites, tailbud, myotome, and brain were used. WISH was performed using zebrafish at the 13 somite and prim-6 stages, with students performing the staining reaction in class. The study of zebrafish embryos at different stages of development gave students the ability to observe how these anatomical structures changed over ontogeny. In addition, some ace/fgf8a mutants displayed improper heart looping, and defects in somite and brain development. The students in this lab observed the normal development of various organ systems using both external anatomy as well as gene expression patterns. They also identified and described embryos displaying improper anatomical development and gene expression (i.e., putative mutants). For instructors at institutions that do not already own the necessary equipment or where funds for lab and curricular innovation are limited, the financial cost of the reagents and apparatus may be a factor to consider, as will the time and effort required on the part of the instructor regardless of the setting. Nevertheless, we contend that the use of WISH in this type of classroom laboratory setting can provide an important link between developmental genetics and anatomy. As technology advances and the ability to study organismal development at the molecular level becomes easier, cheaper, and increasingly popular, many evolutionary biologists, ecologists, and physiologists are turning to research strategies in the field of molecular biology. Using WISH in a Comparative Vertebrate Biology laboratory classroom is one example of how molecules and anatomy can converge within a single course. This gives upper level college students the opportunity to practice modern biological research techniques, leading to a more diversified education and the promotion of future interdisciplinary scientific research.
Developmental Biology, Issue 49, in situ hybridization, genetics, development, anatomy, vertebrate, undergraduate, education, interdisciplinary
Computer-Generated Animal Model Stimuli
Authors: Kevin L. Woo.
Institutions: Macquarie University.
Communication between animals is diverse and complex. Animals may communicate using auditory, seismic, chemosensory, electrical, or visual signals. In particular, understanding the constraints on visual signal design for communication has been of great interest. Traditional methods for investigating animal interactions have used basic observational techniques, staged encounters, or physical manipulation of morphology. Less intrusive methods have tried to simulate conspecifics using crude playback tools, such as mirrors, still images, or models. As technology has become more advanced, video playback has emerged as another tool with which to examine visual communication (Rosenthal, 2000). However, to move one step further, the application of computer animation now allows researchers to specifically isolate critical components necessary to elicit social responses from conspecifics, and to manipulate these features to control interactions. Here, I provide detail on how to create an animation using the Jacky dragon as a model, but this process may be adaptable for other species. In building the animation, I elected to use Lightwave 3D to alter object morphology, add texture, install bones, and provide comparable weight shading that prevents exaggerated movement. The animation is then matched to select motor patterns to replicate critical movement features. Finally, the sequence must be rendered into an individual clip for presentation. Although there are other adaptable techniques, this particular method has been demonstrated to be effective in eliciting both conspicuous and social responses in staged interactions.
Neuroscience, Issue 6, behavior, lizard, simulation, animation
Improving IV Insulin Administration in a Community Hospital
Authors: Michael C. Magee.
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes.1-4 The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5 It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance. The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6 Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia. Despite multiple revisions of a paper-based IV insulin protocol, analysis of usage data at WMC showed that results were suboptimal in terms of achieving normoglycemia while minimizing hypoglycemia. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from the paper IV insulin protocol to a computerized glucose management system. By comparing blood glucose levels under the paper protocol with those under the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was witnessed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computerized glucose management system was well under 1%.
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
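The glycemic metrics reported above can be computed directly from point-of-care blood glucose readings. The sketch below uses the cutoffs stated in the text (severe hypoglycemia < 40 mg/dL, clinical hypoglycemia < 70 mg/dL, hyperglycemia > 180 mg/dL); the target range used here (70-180 mg/dL) and the readings themselves are assumptions for illustration, since the article does not specify its target range in this abstract.

# Sketch of glycemic summary metrics from a list of blood glucose readings (mg/dL).
def glycemic_summary(readings, target_low=70.0, target_high=180.0):
    n = len(readings)
    def pct(count):
        return 100.0 * count / n
    return {
        "in_target_%":     pct(sum(target_low <= bg <= target_high for bg in readings)),
        "severe_hypo_%":   pct(sum(bg < 40 for bg in readings)),
        "clinical_hypo_%": pct(sum(bg < 70 for bg in readings)),
        "hyperglycemia_%": pct(sum(bg > 180 for bg in readings)),
    }

# Hypothetical hourly readings for one patient on the IV insulin protocol.
readings = [152, 138, 127, 119, 111, 104, 98, 122, 131, 145, 160, 182, 129, 117, 108, 65]
for metric, value in glycemic_summary(readings).items():
    print(f"{metric:>16}: {value:.1f}%")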
Using Learning Outcome Measures to assess Doctoral Nursing Education
Authors: Glenn H. Raup, Jeff King, Romana J. Hughes, Natasha Faidley.
Institutions: Harris College of Nursing and Health Sciences, Texas Christian University.
Education programs at all levels must be able to demonstrate successful program outcomes. Grades alone do not represent a comprehensive measurement methodology for assessing student learning outcomes at either the course or program level. The development and application of assessment rubrics provides an unequivocal measurement methodology to ensure a quality learning experience by providing a foundation for improvement based on qualitatively and quantitatively measurable, aggregate course and program outcomes. Learning outcomes are the embodiment of the total learning experience and should incorporate assessment of both qualitative and quantitative program outcomes. The assessment of qualitative measures represents a challenge for educators at any level of a learning program. Nursing provides a unique challenge and opportunity, as it is the application of science through the art of caring. Quantification of desired student learning outcomes may be enhanced through the development of assessment rubrics designed to measure quantitative and qualitative aspects of the nursing education and learning process. They provide a mechanism for uniform assessment by nursing faculty of concepts and constructs that are otherwise difficult to describe and measure. A protocol is presented and applied to a doctoral nursing education program with recommendations for application and transformation of the assessment rubric to other education programs. Through application of these specially designed rubrics, all aspects of an education program can be adequately assessed to provide information for program assessment that facilitates the closure of the gap between desired and actual student learning outcomes for any desired educational competency.
Medicine, Issue 40, learning, outcomes, measurement, program, assessment, rubric

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In these cases, our algorithms still display the most relevant videos available, which can sometimes result in matches that are only loosely related.
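For readers curious what this kind of abstract-to-video matching can look like in practice, the sketch below shows one standard text-similarity approach: TF-IDF vectors compared by cosine similarity, using scikit-learn. This is an illustrative stand-in, not JoVE's actual algorithm (which is not described in detail here), and the toy titles, descriptions, and abstract are placeholders.

# Illustrative sketch of matching an abstract to related method videos with TF-IDF + cosine
# similarity (not JoVE's actual algorithm). Requires scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

video_descriptions = {
    "Quantification of gammaH2AX Foci": "immunofluorescence detection and counting of DNA "
                                        "double-strand break foci in irradiated keratinocytes",
    "Clonogenic Assay: Adherent Cells": "colony forming assay measuring radiation sensitivity "
                                        "and surviving fraction of adherent cell lines",
    "Adapted Tango Dancing":            "community dance program improving balance and mobility "
                                        "in older adults and people with Parkinson's disease",
}

abstract = ("We measured DNA damage after ionising radiation by counting gammaH2AX foci "
            "with immunofluorescence microscopy in human cell lines.")

vectorizer = TfidfVectorizer(stop_words="english")
library_matrix = vectorizer.fit_transform(list(video_descriptions.values()))
query_vector = vectorizer.transform([abstract])

scores = cosine_similarity(query_vector, library_matrix).ravel()
for title, score in sorted(zip(video_descriptions, scores), key=lambda x: -x[1]):
    print(f"{score:.2f}  {title}")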