Anxiety testing in zebrafish is often studied in combination with the application of pharmacological substances. In these studies, fish are routinely netted and transported between home aquaria and dosing tanks. To enhance the ease of compound administration, a novel method for transferring fish between tanks for drug administration was developed. Inserts designed for spawning were used to transfer groups of fish into the drug solution, allowing accurate dosing of all fish in the group. This increases the precision and efficiency of dosing, which becomes very important in long schedules of repeated drug administration. We implemented this procedure in a study examining the behavior of zebrafish in the light/dark test after administering ethanol on different 21-day schedules. In fish exposed to daily moderate amounts of alcohol, there was a significant difference in location preference after 2 days of withdrawal compared to the control group. However, no significant difference in location preference was observed in the group exposed to weekly binge administration.
This protocol can be generalized for use with all water-soluble compounds and may be used whenever the behavior of fish during or after long schedules of drug administration is being examined. The light/dark test is also a valuable method of assessing withdrawal-induced changes in anxiety.
20 Related JoVE Articles
The Goeckerman Regimen for the Treatment of Moderate to Severe Psoriasis
Institutions: University of Southern California, University of California, San Francisco, University of California Irvine School of Medicine, University of Arizona College of Medicine, Chicago College of Osteopathic Medicine.
Psoriasis is a chronic, immune-mediated inflammatory skin disease affecting approximately 2-3% of the population. The Goeckerman regimen consists of exposure to ultraviolet B (UVB) light and application of crude coal tar (CCT). Goeckerman therapy is extremely effective and relatively safe for the treatment of psoriasis and for improving a patient's quality of life. In the following article, we present our protocol for the Goeckerman therapy that is utilized specifically at the University of California, San Francisco. This protocol details the preparation of supplies, administration of phototherapy and application of topical tar. This protocol also describes how to assess the patient daily, monitor for adverse effects (including pruritus and burning), and adjust the treatment based on the patient's response. Though it is one of the oldest therapies available for psoriasis, there is an absence of any published videos demonstrating the process in detail. The video is beneficial for healthcare providers who want to administer the therapy, for trainees who want to learn more about the process, and for prospective patients who want to undergo treatment for their cutaneous disease.
Medicine, Issue 77, Infection, Biomedical Engineering, Anatomy, Physiology, Immunology, Dermatology, Skin, Dermis, Epidermis, Skin Diseases, Skin Diseases, Eczematous, Goeckerman, Crude Coal Tar, phototherapy, psoriasis, Eczema, Goeckerman regimen, clinical techniques
High-Sensitivity Nuclear Magnetic Resonance at Giga-Pascal Pressures: A New Tool for Probing Electronic and Chemical Properties of Condensed Matter under Extreme Conditions
Institutions: University of Leipzig.
Nuclear Magnetic Resonance (NMR) is one of the most important techniques for the study of condensed matter systems, their chemical structure, and their electronic properties. The application of high pressure enables one to synthesize new materials, but the response of known materials to high pressure is a very useful tool for studying their electronic structure and developing theories. For example, high-pressure synthesis might be at the origin of life; and understanding the behavior of small molecules under extreme pressure will tell us more about fundamental processes in our universe. It is no wonder that there has always been great interest in having NMR available at high pressures. Unfortunately, the desired pressures are often well into the Giga-Pascal (GPa) range and require special anvil cell devices where only very small, secluded volumes are available. This has restricted the use of NMR almost entirely in the past, and only recently, a new approach to high-sensitivity GPa NMR, which has a resonating micro-coil inside the sample chamber, was put forward. This approach enables us to achieve high sensitivity with experiments that bring the power of NMR to Giga-Pascal pressure condensed matter research. First applications, the detection of a topological electronic transition in ordinary aluminum metal and the closing of the pseudo-gap in high-temperature superconductivity, show the power of such an approach. Meanwhile, the range of achievable pressures was increased tremendously with a new generation of anvil cells (up to 10.1 GPa), that fit standard-bore NMR magnets. This approach might become a new, important tool for the investigation of many condensed matter systems, in chemistry, geochemistry, and in physics, since we can now watch structural changes with the eyes of a very versatile probe.
Physics, Issue 92, NMR, micro-coil, anvil cell, high pressures, condensed matter, radio-frequency
Quantification of the Respiratory Burst Response as an Indicator of Innate Immune Health in Zebrafish
Institutions: University of Maine.
The phagocyte respiratory burst is part of the innate immune response to pathogen infection and involves the production of reactive oxygen species (ROS). ROS are toxic and function to kill phagocytized microorganisms. In vivo quantification of phagocyte-derived ROS provides information regarding an organism's ability to mount a robust innate immune response. Here we describe a protocol to quantify and compare ROS in whole zebrafish embryos upon chemical induction of the phagocyte respiratory burst. This method makes use of a non-fluorescent compound that becomes fluorescent upon oxidation by ROS. Individual zebrafish embryos are pipetted into the wells of a microplate and incubated in this fluorogenic substrate with or without a chemical inducer of the respiratory burst. Fluorescence in each well is quantified at desired time points using a microplate reader. Fluorescence readings are adjusted to eliminate background fluorescence and then compared using an unpaired t-test. This method allows for comparison of the respiratory burst potential of zebrafish embryos at different developmental stages and in response to experimental manipulations such as protein knockdown, overexpression, or treatment with pharmacological agents. This method can also be used to monitor the respiratory burst response in whole dissected kidneys or cell preparations from kidneys of adult zebrafish and some other fish species. We believe that the relative simplicity and adaptability of this protocol will complement existing protocols and will be of interest to researchers who seek to better understand the innate immune response.
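The background adjustment and unpaired comparison described above can be sketched in Python. This is a hypothetical illustration (the fluorescence values and well layout are invented); the actual protocol uses a microplate reader and standard statistics software:

```python
from statistics import mean, variance

def background_adjust(readings, blank_wells):
    """Subtract the mean fluorescence of substrate-only blank wells
    from each embryo reading (illustrative, hypothetical values)."""
    blank = mean(blank_wells)
    return [r - blank for r in readings]

def welch_t(a, b):
    """Unpaired (Welch) t statistic for two independent groups of embryos."""
    na, nb = len(a), len(b)
    return (mean(a) - mean(b)) / (variance(a) / na + variance(b) / nb) ** 0.5

# Hypothetical fluorescence units: induced vs. uninduced embryos
induced = background_adjust([110, 112, 111, 113], blank_wells=[100, 100])
control = background_adjust([105, 106, 105, 106], blank_wells=[100, 100])
t = welch_t(induced, control)
```

In practice one would obtain the p-value from a statistics package; the sketch only shows where the background correction enters before the group comparison.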
Immunology, Issue 79, Phagocytes, Immune System, Zebrafish, Reactive Oxygen Species, Immune System Processes, Host-Pathogen Interactions, Respiratory Burst, Immune System Phenomena, innate immunity, bacteria, virus, infection
Heterotopic Mucosal Engrafting Procedure for Direct Drug Delivery to the Brain in Mice
Institutions: Boston University, Harvard Medical School.
Delivery of therapeutics into the brain is impeded by the presence of the blood-brain barrier (BBB), which restricts the passage of polar and high molecular weight compounds from the bloodstream into brain tissue. Some direct delivery success in humans has been achieved via implantation of transcranial catheters; however, this method is highly invasive and associated with numerous complications. A less invasive alternative would be to dose the brain through a surgically implanted, semipermeable membrane such as the nasal mucosa that is used to repair skull base defects following endoscopic transnasal tumor removal surgery in humans. Drug transfer through this membrane would effectively bypass the BBB and diffuse directly into the brain and cerebrospinal fluid. Inspired by this approach, a surgical approach in mice was developed that uses a donor septal mucosal membrane engrafted over an extracranial surgical BBB defect. This model has been shown to effectively allow the passage of high molecular weight compounds into the brain. Since numerous drug candidates are incapable of crossing the BBB, this model is valuable for performing preclinical testing of novel therapies for neurological and psychiatric diseases.
Medicine, Issue 89, drug delivery, mucosa membrane, blood-brain barrier, neurosurgery, transnasal, mouse model
Preparation and Pathogen Inactivation of Double Dose Buffy Coat Platelet Products using the INTERCEPT Blood System
Institutions: Örebro University Hospital.
Blood centers are faced with many challenges including maximizing production yield from the blood product donations they receive as well as ensuring the highest possible level of safety for transfusion patients, including protection from transfusion transmitted diseases. This must be accomplished in a fiscally responsible manner which minimizes operating expenses including consumables, equipment, waste, and personnel costs, among others.
Several methods are available to produce platelet concentrates for transfusion. One of the most common is the buffy coat method, in which a single therapeutic platelet unit (≥ 2.0 x 10^11 platelets per unit, or per local regulations) is prepared by pooling the buffy coat layer from up to six whole blood donations. A procedure for producing "double dose" whole blood derived platelets has only recently been developed.
Presented here is a novel method for preparing double dose whole blood derived platelet concentrates from pools of 7 buffy coats and subsequently treating the double dose units with the INTERCEPT Blood System for pathogen inactivation. INTERCEPT was developed to inactivate viruses, bacteria, parasites, and contaminating donor white cells which may be present in donated blood. Pairing INTERCEPT with the double dose buffy coat method by utilizing the INTERCEPT Processing Set with Dual Storage Containers (the "DS set") allows blood centers to treat each of their double dose units in a single pathogen inactivation processing set, thereby maximizing patient safety while minimizing costs. The double dose buffy coat method requires fewer buffy coats and reduces the use of consumables by up to 50% (e.g., pooling sets, filter sets, platelet additive solution, and sterile connection wafers) compared to preparation and treatment of single dose buffy coat platelet units. Other cost savings include less waste, less equipment maintenance, lower power requirements, reduced personnel time, and lower collection cost compared to the apheresis technique.
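The consumable saving follows from simple arithmetic: one DS processing set treats both units of a double-dose pool, whereas single-dose production needs one set per unit. A minimal sketch (the production volume is an illustrative assumption, not a figure from the abstract):

```python
def intercept_sets_needed(platelet_units, units_per_set):
    """Number of pathogen-inactivation processing sets consumed
    for a given number of therapeutic platelet units."""
    return platelet_units / units_per_set

units = 100                                  # hypothetical production run
single = intercept_sets_needed(units, 1)     # single dose: one set per unit
double = intercept_sets_needed(units, 2)     # DS set treats both units of a pool
saving = 1 - double / single                 # the "up to 50%" reduction
```

The same halving applies to the other per-pool consumables listed above, since each double-dose pool replaces two single-dose pools.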
Medicine, Issue 70, Immunology, Hematology, Infectious Disease, Pathology, pathogen inactivation, pathogen reduction, double-dose platelets, INTERCEPT Blood System, amotosalen, UVA, platelet, blood processing, buffy coat, IBS, transfusion
Stereotactic Radiosurgery for Gynecologic Cancer
Institutions: University Hospitals Case Medical Center and Case Western Reserve University School of Medicine.
Stereotactic body radiotherapy (SBRT) distinguishes itself by necessitating more rigid patient immobilization, accounting for respiratory motion, intricate treatment planning, on-board imaging, and a reduced number of ablative radiation doses to cancer targets usually refractory to chemotherapy and conventional radiation. Steep SBRT radiation dose drop-off permits narrow 'pencil beam' treatment fields to be used for ablative radiation condensed into 1 to 3 treatments.
Treating physicians must appreciate that SBRT carries a greater risk of normal tissue injury and of geographic tumor miss. Both must be addressed by immobilization of cancer targets and by high-precision treatment delivery. Cancer target immobilization has been achieved through use of indexed customized Styrofoam casts, evacuated bean bags, or body-fix molds with patient-independent abdominal compression.1-3
Intrafraction motion of cancer targets due to breathing can now be reduced by patient-responsive breath-hold techniques,4 patient mouthpiece active breathing coordination,5 respiration-correlated computed tomography,6 or image-guided tracking of fiducials implanted within and around a moving tumor.7-9
The Cyberknife system (Accuray [Sunnyvale, CA]) utilizes a radiation linear accelerator mounted on an industrial robotic arm that accurately follows patient respiratory motion by a camera-tracked set of light-emitting diodes (LEDs) impregnated on a vest fitted to the patient.10
Substantial reductions in radiation therapy margins can be achieved by motion tracking, ultimately rendering smaller planning target volumes that are irradiated with submillimeter accuracy.11-13
Cancer targets treated by SBRT are irradiated by converging, tightly collimated beams. The resulting radiation dose-volume histograms for the cancer target have a more pronounced radiation "shoulder," indicating high percentage target coverage, and a small high-dose radiation "tail." Thus, increased target conformality comes at the expense of decreased dose uniformity in the SBRT cancer target. This may have implications both for subsequent tumor control in the SBRT target and for normal tissue tolerance of organs at risk. Due to the sharp dose falloff in SBRT, occult disease may escape the ablative radiation dose when cancer targets are not fully recognized and inadequate SBRT dose margins are applied. Clinical target volume (CTV) expansion by 0.5 cm, resulting in a larger planning target volume (PTV), is associated with increased target control without undue normal tissue injury.7,8
Further reduction in the probability of geographic miss may be achieved by incorporation of 2-[18F]-fluoro-2-deoxy-D-glucose (18F-FDG) positron emission tomography (PET).8 Use of 18F-FDG PET/CT in SBRT treatment planning is only the beginning of attempts to discover new imaging target molecular signatures for gynecologic cancers.
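The "shoulder" and "tail" described above are features of the cumulative dose-volume histogram (DVH). A minimal sketch of how such a curve is computed from per-voxel target doses (the voxel doses are illustrative values, not planning-system output):

```python
def cumulative_dvh(voxel_doses_gy, dose_levels_gy):
    """Fraction of the target volume receiving at least each dose level.
    A high fraction at prescription dose is the 'shoulder'; the small
    fraction at higher doses is the 'tail'."""
    n = len(voxel_doses_gy)
    return [sum(d >= level for d in voxel_doses_gy) / n
            for level in dose_levels_gy]

# Four hypothetical target voxels; steep falloff shows as a rapid drop
fractions = cumulative_dvh([10.0, 20.0, 30.0, 40.0], [0.0, 25.0, 45.0])
```

Real treatment-planning systems compute this over millions of voxels, but the definition of the curve is the same.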
Medicine, Issue 62, radiosurgery, Cyberknife stereotactic radiosurgery, radiation, ovarian cancer, cervix cancer
High-throughput Screening for Broad-spectrum Chemical Inhibitors of RNA Viruses
Institutions: Institut Pasteur, CNRS UMR3569, Institut Pasteur, CNRS UMR3523, Institut Pasteur.
RNA viruses are responsible for major human diseases such as flu, bronchitis, dengue, hepatitis C, and measles. They also represent an emerging threat because of increased worldwide exchanges and human populations penetrating more and more natural ecosystems. A good example of such an emerging situation is the chikungunya virus epidemic of 2005-2006 in the Indian Ocean. Recent progress in our understanding of cellular pathways controlling viral replication suggests that compounds targeting host cell functions, rather than the virus itself, could inhibit a large panel of RNA viruses. Some broad-spectrum antiviral compounds have been identified with host target-oriented assays. However, measuring the inhibition of viral replication in cell cultures using reduction of cytopathic effects as a readout still represents a paramount screening strategy. Such functional screens have been greatly improved by the development of recombinant viruses expressing reporter enzymes capable of bioluminescence such as luciferase. In the present report, we detail a high-throughput screening pipeline, which combines recombinant measles and chikungunya viruses with cellular viability assays, to identify compounds with a broad-spectrum antiviral profile.
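In a luciferase-based functional screen like this one, raw signal is typically normalized against virus-only and uninfected controls, and plate quality is checked with a screening-window statistic such as the Z'-factor. The sketch below uses that standard HTS convention; the numbers and the specific metric are illustrative, not taken from this protocol:

```python
from statistics import mean, stdev

def percent_inhibition(signal, virus_ctrl, cell_ctrl):
    """Scale a luciferase reading so virus-only wells = 0% inhibition
    and uninfected wells = 100% (lower signal = less viral replication)."""
    return 100.0 * (virus_ctrl - signal) / (virus_ctrl - cell_ctrl)

def z_prime(pos_ctrl, neg_ctrl):
    """Z'-factor: a common HTS window metric; > 0.5 is usually
    taken to indicate an assay suitable for screening."""
    return 1 - 3 * (stdev(pos_ctrl) + stdev(neg_ctrl)) / abs(mean(pos_ctrl) - mean(neg_ctrl))

inhib = percent_inhibition(signal=40.0, virus_ctrl=100.0, cell_ctrl=0.0)
z = z_prime([100, 101, 99, 100], [10, 11, 9, 10])
```

Pairing the inhibition readout with a parallel cell-viability assay, as the pipeline does, distinguishes genuine antivirals from merely cytotoxic compounds.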
Immunology, Issue 87, Viral infections, high-throughput screening assays, broad-spectrum antivirals, chikungunya virus, measles virus, luciferase reporter, chemical libraries
Pharmacologic Induction of Epidermal Melanin and Protection Against Sunburn in a Humanized Mouse Model
Institutions: University of Kentucky College of Medicine.
Fairness of skin, UV sensitivity, and skin cancer risk all correlate with the physiologic function of the melanocortin 1 receptor (Mc1r), a Gs-coupled signaling protein found on the surface of melanocytes. Mc1r stimulates adenylyl cyclase and cAMP production which, in turn, up-regulates melanocytic production of melanin in the skin. In order to study the mechanisms by which Mc1r signaling protects the skin against UV injury, this study relies on a mouse model with "humanized skin" based on epidermal expression of stem cell factor (Scf). K14-Scf transgenic mice retain melanocytes in the epidermis and therefore have the ability to deposit melanin in the epidermis. In this animal model, wild type Mc1r status results in robust deposition of black eumelanin pigment and a UV-protected phenotype. In contrast, K14-Scf animals with defective Mc1r signaling ability exhibit red/blonde pigmentation, very little eumelanin in the skin, and a UV-sensitive phenotype. Reasoning that eumelanin deposition might be enhanced by topical agents that mimic Mc1r signaling, we found that direct application of forskolin extract to the skin of Mc1r-defective fair-skinned mice resulted in robust eumelanin induction and UV protection.1 Here we describe the method for preparing and applying a forskolin-containing natural root extract to K14-Scf fair-skinned mice and report a method for measuring UV sensitivity by determining the minimal erythematous dose (MED). Using this animal model, it is possible to study how epidermal cAMP induction and melanization of the skin affect physiologic responses to UV exposure.
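MED determination reduces to finding the lowest dose in a graded UV series that produces visible erythema at the exposed site. A minimal sketch (the dose values and scoring below are hypothetical, not from this protocol):

```python
def minimal_erythematous_dose(doses_mj_cm2, erythema_seen):
    """Lowest tested UV dose (mJ/cm^2) at which erythema was scored
    as present; None if no test site responded."""
    responders = [d for d, seen in sorted(zip(doses_mj_cm2, erythema_seen)) if seen]
    return responders[0] if responders else None

# Hypothetical graded series on one animal
med = minimal_erythematous_dose([50, 100, 150, 200],
                                [False, False, True, True])
```

Comparing MED before and after topical treatment then quantifies any gain in UV protection.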
Medicine, Issue 79, Skin, Inflammation, Photometry, Ultraviolet Rays, Skin Pigmentation, melanocortin 1 receptor, Mc1r, forskolin, cAMP, mean erythematous dose, skin pigmentation, melanocyte, melanin, sunburn, UV, inflammation
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
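The rank-ordered output of the sequence-selection stage can be pictured as sorting candidate sequences by their computed potential energy, lowest (most stabilizing) first. This toy sketch uses invented sequences and scores, not the actual Protein WISDOM energy function:

```python
def rank_sequences(energy_by_sequence):
    """Candidate sequences ordered by computed potential energy,
    lowest first (hypothetical scores, for illustration only)."""
    return sorted(energy_by_sequence, key=energy_by_sequence.get)

ranked = rank_sequences({"ACDEF": -12.4, "ACDQF": -15.1, "GCDEF": -9.8})
```

In the real workbench, the top-ranked sequences from this stage are the ones passed on to the fold-specificity and binding-affinity stages.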
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Voluntary Breath-hold Technique for Reducing Heart Dose in Left Breast Radiotherapy
Institutions: Royal Marsden NHS Foundation Trust, University of Surrey, Institute of Cancer Research, Sutton, UK, Institute of Cancer Research, Sutton, UK.
Breath-holding techniques reduce the amount of radiation received by cardiac structures during tangential-field left breast radiotherapy. With these techniques, patients hold their breath while radiotherapy is delivered, pushing the heart down and away from the radiotherapy field. Despite clear dosimetric benefits, these techniques are not yet in widespread use. One reason for this is that commercially available solutions require specialist equipment, necessitating not only significant capital investment, but often also incurring ongoing costs such as a need for daily disposable mouthpieces. The voluntary breath-hold technique described here does not require any additional specialist equipment. All breath-holding techniques require a surrogate to monitor breath-hold consistency and whether breath-hold is maintained. Voluntary breath-hold uses the distance moved by the anterior and lateral reference marks (tattoos) away from the treatment room lasers in breath-hold to monitor consistency at CT-planning and treatment setup. Light fields are then used to monitor breath-hold consistency prior to and during radiotherapy delivery.
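Monitoring breath-hold consistency with skin marks amounts to checking that the distance each tattoo moves away from the room lasers in breath-hold matches the value recorded at CT-planning, within some tolerance. A sketch of that check (the 2 mm tolerance is an assumption for illustration, not a value from this protocol):

```python
def breath_hold_consistent(planning_shift_mm, setup_shift_mm, tolerance_mm=2.0):
    """True if the tattoo-to-laser displacement at treatment setup
    reproduces the CT-planning displacement within tolerance."""
    return abs(planning_shift_mm - setup_shift_mm) <= tolerance_mm

ok = breath_hold_consistent(planning_shift_mm=15.0, setup_shift_mm=16.2)
```

During delivery itself, the light fields serve the same role: the therapist verifies that the marks sit where the planning breath-hold placed them before beam-on.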
Medicine, Issue 89, breast, radiotherapy, heart, cardiac dose, breath-hold
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
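The automated advance from protocol to protocol driven by the data-analysis code can be sketched as a criterion check on each mouse's recent trial history. The window size and criterion below are hypothetical, and the sketch is in Python rather than the MATLAB-based language the system actually uses:

```python
def should_advance(trial_outcomes, window=100, criterion=0.8):
    """Advance a mouse to the next protocol once accuracy over the
    last `window` trials reaches `criterion` (True = correct trial).
    Values here are illustrative, not the system's actual settings."""
    recent = trial_outcomes[-window:]
    return len(recent) == window and sum(recent) / window >= criterion

history = [True] * 90 + [False] * 5 + [True] * 5   # 95/100 correct recently
advance = should_advance(history)
```

Because the data harvest runs several times a day, such a check lets individual subjects progress at their own pace without any handling.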
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
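The simplest fully automated step such a pipeline can apply is global thresholding, which the triage scheme would favor only when the feature of interest is well separated in intensity from its surroundings. A minimal sketch on a toy 2D slice (real volumes are 3D arrays processed with dedicated image-analysis libraries):

```python
def threshold_segment(slice_2d, threshold):
    """Binary mask for one image slice: voxels at or above `threshold`
    become foreground (1), everything else background (0)."""
    return [[1 if v >= threshold else 0 for v in row] for row in slice_2d]

# Toy 2x3 slice of intensity values; bright voxels belong to the feature
mask = threshold_segment([[12, 200, 180],
                          [30, 210,  25]], threshold=100)
```

When features overlap in intensity with their surroundings, the triage scheme instead points toward semi-automated or manual tracing approaches.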
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings.3-6 One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP).7 Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
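Categorical perception along a morph continuum is typically assessed from identification data: the proportion of "human" responses at each morph step, with the category boundary at the 50% crossing point. A sketch using linear interpolation (the morph steps and response proportions below are hypothetical):

```python
def category_boundary(morph_steps, prop_human):
    """Morph level (e.g. % human) where 'human' identification crosses
    50%, estimated by linear interpolation between adjacent steps."""
    pairs = sorted(zip(morph_steps, prop_human))
    for (x0, p0), (x1, p1) in zip(pairs, pairs[1:]):
        if p0 < 0.5 <= p1:
            return x0 + (0.5 - p0) * (x1 - x0) / (p1 - p0)
    return None

# Hypothetical identification data across a 0-100% human-likeness continuum
boundary = category_boundary([0, 25, 50, 75, 100],
                             [0.05, 0.20, 0.45, 0.80, 0.95])
```

A steep identification function around the boundary, together with better cross-category than within-category discrimination, is the behavioral signature of CP.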
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Recording Single Neurons' Action Potentials from Freely Moving Pigeons Across Three Stages of Learning
Institutions: Ruhr-University Bochum.
While the subject of learning has attracted immense interest from both behavioral and neural scientists, only relatively few investigators have observed single-neuron activity while animals are acquiring an operantly conditioned response, or when that response is extinguished. But even in these cases, observation periods usually encompass only a single stage of learning, i.e., acquisition or extinction, but not both (exceptions include protocols employing reversal learning; see Bingman et al.1 for an example). However, acquisition and extinction entail different learning mechanisms and are therefore expected to be accompanied by different types and/or loci of neural plasticity.
Accordingly, we developed a behavioral paradigm which institutes three stages of learning in a single behavioral session and which is well suited for the simultaneous recording of single neurons' action potentials. Animals are trained on a single-interval forced choice task which requires mapping each of two possible choice responses to the presentation of different novel visual stimuli (acquisition). After having reached a predefined performance criterion, one of the two choice responses is no longer reinforced (extinction). Following a certain decrement in performance level, correct responses are reinforced again (reacquisition). By using a new set of stimuli in every session, animals can undergo the acquisition-extinction-reacquisition process repeatedly. Because all three stages of learning occur in a single behavioral session, the paradigm is ideal for the simultaneous observation of the spiking output of multiple single neurons. We use pigeons as model systems, but the task can easily be adapted to any other species capable of conditioned discrimination learning.
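The three within-session stages can be pictured as a small state machine driven by running performance. The criterion values below are hypothetical placeholders, not the thresholds used in the actual paradigm:

```python
def next_stage(stage, accuracy, acq_criterion=0.85, ext_criterion=0.40):
    """Advance acquisition -> extinction once accuracy reaches the
    predefined criterion, then extinction -> reacquisition once
    performance on the non-reinforced response has decayed enough.
    Criterion values are illustrative assumptions."""
    if stage == "acquisition" and accuracy >= acq_criterion:
        return "extinction"
    if stage == "extinction" and accuracy <= ext_criterion:
        return "reacquisition"
    return stage

stage = next_stage("acquisition", accuracy=0.9)
```

Because a fresh stimulus set is used each session, the same three-stage machine can be rerun repeatedly while recording from the same neurons.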
Neuroscience, Issue 88, pigeon, single unit recording, learning, memory, extinction, spike sorting, operant conditioning, reward, electrophysiology, animal cognition, model species
Intranasal Administration of CNS Therapeutics to Awake Mice
Institutions: HealthPartners Institute for Education and Research.
Intranasal administration is a method of delivering therapeutic agents to the central nervous system (CNS). It is non-invasive and allows large molecules that do not cross the blood-brain barrier access to the CNS. Drugs are directly targeted to the CNS with intranasal delivery, reducing systemic exposure and thus unwanted systemic side effects1. Delivery from the nose to the CNS occurs within minutes along both the olfactory and trigeminal neural pathways via an extracellular route and does not require drug to bind to any receptor or axonal transport2. Intranasal delivery is a widely publicized method and is currently being used in human clinical trials3.
Intranasal delivery of drugs in animal models allows for initial evaluation of pharmacokinetic distribution and efficacy. With mice, it is possible to administer drugs to awake (non-anesthetized) animals on a regular basis using a specialized intranasal grip. Awake delivery is beneficial because it allows for long-term chronic dosing without anesthesia, it takes less time than with anesthesia, and can be learned and done by many people so that teams of technicians can dose large numbers of mice in short periods. Efficacy of therapeutics administered intranasally in this way to mice has been demonstrated in a number of studies, including insulin in diabetic mouse models4-6 and deferoxamine in Alzheimer's mouse models7,8.
The intranasal grip for mice can be learned, but it is not easy: delivering drug effectively to the brain while avoiding drainage to the lungs and stomach requires practice, skill, and precise hand placement. Mice are restrained by hand using a modified scruff in the non-dominant hand, with the neck held parallel to the floor, while drug is delivered with a pipettor using the dominant hand. It usually takes 3-4 weeks of acclimation to handling before mice can be held in this grip without a stress response. We have prepared this JoVE video to make this intranasal delivery technique more accessible.
Medicine, Issue 74, Biomedical Engineering, Neuroscience, Anatomy, Physiology, Bioengineering, Neurobiology, Pharmacology, Intranasal, nasal, awake, mice, drug delivery, brain targeting, CNS, mouse acclimation, animal model, therapeutics, clinical techniques
In Vivo Modeling of the Morbid Human Genome using Danio rerio
Institutions: Duke University Medical Center, Duke University, Duke University Medical Center.
Here, we present methods for the development of assays to query potentially clinically significant nonsynonymous changes using in vivo complementation in zebrafish. Zebrafish (Danio rerio) are a useful animal system due to their experimental tractability; embryos are transparent to enable facile viewing, undergo rapid development ex vivo, and can be genetically manipulated.1
These aspects have allowed for significant advances in the analysis of embryogenesis, molecular processes, and morphogenetic signaling. Taken together, the advantages of this vertebrate model make zebrafish highly amenable to modeling the developmental defects in pediatric disease, and in some cases, adult-onset disorders. Because the zebrafish genome is highly conserved with that of humans (~70% orthologous), it is possible to recapitulate human disease states in zebrafish. This is accomplished either through the injection of mutant human mRNA to induce dominant negative or gain of function alleles, or utilization of morpholino (MO) antisense oligonucleotides to suppress genes to mimic loss of function variants. Through complementation of MO-induced phenotypes with capped human mRNA, our approach enables the interpretation of the deleterious effect of mutations on human protein sequence based on the ability of mutant mRNA to rescue a measurable, physiologically relevant phenotype. Modeling of the human disease alleles occurs through microinjection of zebrafish embryos with MO and/or human mRNA at the 1-4 cell stage, and phenotyping up to seven days post fertilization (dpf). This general strategy can be extended to a wide range of disease phenotypes, as demonstrated in the following protocol. We present our established models for morphogenetic signaling, craniofacial, cardiac, vascular integrity, renal function, and skeletal muscle disorder phenotypes, as well as others.
Molecular Biology, Issue 78, Genetics, Biomedical Engineering, Medicine, Developmental Biology, Biochemistry, Anatomy, Physiology, Bioengineering, Genomics, Medical, zebrafish, in vivo, morpholino, human disease modeling, transcription, PCR, mRNA, DNA, Danio rerio, animal model
Minimal Erythema Dose (MED) Testing
Institutions: Fox Chase Cancer Center , University of Pennsylvania , Drexel University , Fox Chase Cancer Center , The Cancer Institute of New Jersey.
Ultraviolet (UV) radiation therapy is sometimes used as a treatment for various common skin conditions, including psoriasis, acne, and eczema. The dosage of UV light is prescribed according to an individual's skin sensitivity. Thus, to establish the proper dosage of UV light to administer to a patient, the patient is sometimes screened to determine a minimal erythema dose (MED), which is the amount of UV radiation that will produce minimal erythema (sunburn or redness caused by engorgement of capillaries) of an individual's skin within a few hours following exposure. This article describes how to conduct MED testing. There is currently no easy way to determine an appropriate UV dose for clinical or research purposes without conducting formal MED testing, requiring observation hours after testing, or informal trial-and-error testing with the risks of under- or over-dosing. However, some alternative methods are discussed.
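Conceptually, MED testing exposes small patches of skin to a graded series of UV doses and reads out the lowest dose that produced minimal erythema. A minimal sketch of that selection step is shown below; the dose values and function name are hypothetical illustrations, not values from the protocol.

```python
# Illustrative sketch: the MED is the lowest dose in a graded exposure series
# that produced minimal erythema at readout. Dose values here are hypothetical.
def find_med(doses_mj_cm2, erythema_observed):
    """doses_mj_cm2: tested UV doses; erythema_observed: parallel booleans
    recording whether minimal erythema appeared at each test site.
    The series is sorted by dose before scanning for the lowest positive site."""
    for dose, erythema in sorted(zip(doses_mj_cm2, erythema_observed)):
        if erythema:
            return dose
    return None  # no erythema at any tested dose; a higher dose series is needed

print(find_med([20, 30, 40, 50, 70, 100],
               [False, False, True, True, True, True]))  # 40
```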
Medicine, Issue 75, Anatomy, Physiology, Dermatology, Analytical, Diagnostic, Therapeutic Techniques, Equipment, Health Care, Minimal erythema dose (MED) testing, skin sensitivity, ultraviolet radiation, spectrophotometry, UV exposure, psoriasis, acne, eczema, clinical techniques
A Simple Way to Measure Ethanol Sensitivity in Flies
Institutions: University of Texas Southwestern Medical Center.
Low doses of ethanol cause flies to become hyperactive, while high doses are sedating. The sensitivity of a given fly strain to ethanol-induced sedation is correlated with that strain's ethanol preference, and sedation is therefore a highly relevant measure for studying the genetics of alcohol responses and drinking. We demonstrate a simple way to expose flies to ethanol and measure its intoxicating effects. The assay we describe can determine acute sensitivity as well as ethanol tolerance induced by repeated exposure. It does not require a technically involved setup and can therefore be applied in any laboratory with basic fly culture tools.
Neuroscience, Issue 48, Drosophila, behavior, alcohol, addiction
Improving IV Insulin Administration in a Community Hospital
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predicts poor outcomes.1-4
The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5
It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance.
The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6
Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8
Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia.
Despite multiple revisions of an IV insulin paper protocol, analysis of data from usage of the paper protocol at WMC showed that, in terms of achieving normoglycemia while minimizing hypoglycemia, results were suboptimal. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from a paper IV insulin protocol to a computerized glucose management system. By comparing blood glucose levels under the paper protocol to those under the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was observed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the computerized glucose management system was well under 1%.
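The glycemic thresholds reported above (severe hypoglycemia below 40 mg/dL, clinical hypoglycemia below 70 mg/dL, hyperglycemia above 180 mg/dL) lend themselves to a minimal sketch of how readings might be categorized and time-in-target summarized. The function names and sample readings are our own illustration and are not part of the computerized glucose management system itself.

```python
# Classify blood glucose (BG) readings in mg/dL using the thresholds above,
# and summarize the fraction of readings falling inside the target band.
def classify_bg(bg):
    if bg < 40:
        return "severe hypoglycemia"
    if bg < 70:
        return "clinical hypoglycemia"
    if bg > 180:
        return "hyperglycemia"
    return "target"

def time_in_target(readings):
    # Fraction of readings inside the 70-180 mg/dL target band.
    in_range = sum(1 for bg in readings if classify_bg(bg) == "target")
    return in_range / len(readings)

readings = [65, 110, 150, 190, 38, 120, 135, 95]  # hypothetical sample
print(classify_bg(38))                      # severe hypoglycemia
print(round(time_in_target(readings), 3))   # 0.625
```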
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
X-ray Dose Reduction through Adaptive Exposure in Fluoroscopic Imaging
Institutions: Triple Ring Technologies.
X-ray fluoroscopy is widely used for image guidance during cardiac intervention. However, radiation dose in these procedures can be high, and this is a significant concern, particularly in pediatric applications. Pediatric procedures are in general much more complex than those performed on adults and thus are on average four to eight times longer1. Furthermore, children can undergo up to 10 fluoroscopic procedures by the age of 10, and have been shown to have a three-fold higher risk of developing fatal cancer throughout their life than the general population2,3.
We have shown that radiation dose can be significantly reduced in adult cardiac procedures by using our scanning beam digital x-ray (SBDX) system4 -- a fluoroscopic imaging system that employs an inverse imaging geometry5,6 (Figure 1, Movie 1 and Figure 2). Instead of a single focal spot and an extended detector as used in conventional systems, our approach utilizes an extended X-ray source with multiple focal spots focused on a small detector. Our X-ray source consists of a scanning electron beam sequentially illuminating up to 9,000 focal spot positions. Each focal spot projects a small portion of the imaging volume onto the detector. In contrast to a conventional system, where the final image is directly projected onto the detector, the SBDX uses a dedicated algorithm to reconstruct the final image from the 9,000 detector images.
For pediatric applications, dose savings with the SBDX system are expected to be smaller than in adult procedures. However, the SBDX system allows for additional dose savings by implementing an electronic adaptive exposure technique. Key to this method is the multi-beam scanning technique of the SBDX system: rather than exposing every part of the image with the same radiation dose, we can dynamically vary the exposure depending on the opacity of the region exposed. Therefore, we can significantly reduce exposure in radiolucent areas and maintain exposure in more opaque regions. In our current implementation, the adaptive exposure requires user interaction (Figure 3). However, in the future, the adaptive exposure will be real time and fully automatic.
We have performed experiments with an anthropomorphic phantom and compared measured radiation dose with and without adaptive exposure using a dose area product (DAP) meter. In the experiment presented here, we find a dose reduction of 30%.
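The adaptive exposure idea, reducing beam intensity over radiolucent regions while maintaining it over opaque ones, can be illustrated with a toy calculation. The linear opacity-to-exposure rule, the minimum-exposure floor, and the numbers below are a simplified sketch of the concept only, not the actual SBDX implementation.

```python
# Toy sketch of adaptive exposure: each scan region receives an exposure factor
# proportional to its opacity (0 = fully radiolucent, 1 = fully opaque), with a
# floor so that no region is left entirely unexposed.
def adaptive_exposure(opacities, full_exposure=1.0, min_fraction=0.2):
    exposures = []
    for opacity in opacities:
        factor = max(min_fraction, opacity)  # keep at least min_fraction of full dose
        exposures.append(full_exposure * factor)
    return exposures

# Compare total exposure with and without adaptation for a mixed field
# (mostly radiolucent, with two opaque regions).
opacities = [0.05, 0.1, 0.9, 0.8, 0.1, 0.05]
adaptive = adaptive_exposure(opacities)
uniform = [1.0] * len(opacities)
savings = 1 - sum(adaptive) / sum(uniform)
print(f"dose reduction: {savings:.0%}")  # dose reduction: 58%
```

The savings in this toy field are larger than the 30% measured with the phantom; the point is only that exposure spared in radiolucent areas translates directly into dose reduction.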
Bioengineering, Issue 55, Scanning digital X-ray, fluoroscopy, pediatrics, interventional cardiology, adaptive exposure, dose savings