JoVE Visualize
Pubmed Article
The relationship between inflation and inflation uncertainty. Empirical evidence for the newest EU countries.
Published: 01-01-2014
The objective of this paper is to verify the hypotheses presented in the literature on the causal relationship between inflation and its uncertainty for the newest EU countries. To ensure the robustness of the results, four models of inflation uncertainty are estimated in parallel: ARCH(1), GARCH(1,1), EGARCH(1,1,1) and PARCH(1,1,1). The Granger method is used to test the causality between the two variables. The working hypothesis is that groups of countries with a similar political and economic background in 1990 are likely to be characterized by the same causal relationship between inflation and inflation uncertainty. The empirical results partially confirm this hypothesis.
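As a rough illustration of the two-step analysis described above (not the authors' own code or data), the following Python sketch fits a GARCH(1,1) model to a placeholder inflation series to obtain a conditional-volatility proxy for inflation uncertainty, then runs Granger-causality tests in both directions; the series, the AR(1) mean specification, and the lag length are all assumptions.

# Sketch only: synthetic data stands in for a country's monthly inflation series.
import numpy as np
import pandas as pd
from arch import arch_model                      # GARCH-family models
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(0)
inflation = pd.Series(rng.normal(2.5, 1.0, 300))  # placeholder inflation series (%)

# Step 1: GARCH(1,1) with an AR(1) mean; the conditional volatility serves as the
# inflation-uncertainty proxy (ARCH/EGARCH/PARCH variants would be fit the same way).
res = arch_model(inflation, mean="AR", lags=1, vol="GARCH", p=1, q=1).fit(disp="off")
uncertainty = res.conditional_volatility

# Step 2: Granger causality in both directions (lag length chosen arbitrarily here).
both = pd.DataFrame({"inflation": inflation, "uncertainty": uncertainty}).dropna()
grangercausalitytests(both[["uncertainty", "inflation"]], maxlag=4)  # does inflation Granger-cause uncertainty?
grangercausalitytests(both[["inflation", "uncertainty"]], maxlag=4)  # does uncertainty Granger-cause inflation?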
Related JoVE Video
Authors: Toby K. McGovern, Annette Robichaud, Liah Fereydoonzad, Thomas F. Schuessler, James G. Martin.
Published: 05-15-2013
The forced oscillation technique (FOT) is a powerful, integrative and translational tool permitting the experimental assessment of lung function in mice in a comprehensive, detailed, precise and reproducible manner. It provides measurements of respiratory system mechanics through the analysis of pressure and volume signals acquired in reaction to predefined, small amplitude, oscillatory airflow waveforms, which are typically applied at the subject's airway opening. The present protocol details the steps required to adequately execute forced oscillation measurements in mice using a computer-controlled piston ventilator (flexiVent; SCIREQ Inc, Montreal, Qc, Canada). The description is divided into four parts: preparatory steps, mechanical ventilation, lung function measurements, and data analysis. It also includes details of how to assess airway responsiveness to inhaled methacholine in anesthetized mice, a common application of this technique which also extends to other outcomes and various lung pathologies. Measurements obtained in naïve mice as well as from an oxidative-stress driven model of airway damage are presented to illustrate how this tool can contribute to a better characterization and understanding of studied physiological changes or disease models as well as to applications in new research areas.
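At the heart of the measurement described above is the respiratory input impedance: the frequency-domain ratio of airway-opening pressure to flow (the derivative of the volume signal) at the frequencies contained in the oscillatory perturbation. The short Python sketch below illustrates that calculation on synthetic signals; it is not the flexiVent's own processing, and the sampling rate, perturbation frequencies, and signal shapes are invented for illustration.

# Illustrative only: compute complex impedance Z(f) = P(f)/Q(f) on synthetic data.
import numpy as np

fs = 1024.0                                        # sampling rate, Hz (assumed)
t = np.arange(0, 4, 1 / fs)                        # 4 s window so each tone falls on an exact FFT bin
f_pert = np.array([1.0, 2.5, 5.0, 9.5, 19.75])     # example perturbation frequencies, Hz

flow = sum(np.sin(2 * np.pi * f * t) for f in f_pert)                  # mL/s (synthetic)
pressure = sum(0.5 * np.sin(2 * np.pi * f * t + 0.3) for f in f_pert)  # cmH2O (synthetic)

P, Q = np.fft.rfft(pressure), np.fft.rfft(flow)
freqs = np.fft.rfftfreq(t.size, 1 / fs)

for f in f_pert:
    k = np.argmin(np.abs(freqs - f))
    Z = P[k] / Q[k]                                # complex impedance at this frequency
    print(f"{f:6.2f} Hz: resistance = {Z.real:.3f}, reactance = {Z.imag:.3f}")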
18 Related JoVE Articles!
The Utilization of Oropharyngeal Intratracheal PAMP Administration and Bronchoalveolar Lavage to Evaluate the Host Immune Response in Mice
Authors: Irving C. Allen.
Institutions: Virginia Polytechnic Institute and State University.
The host immune response to pathogens is a complex biological process. The majority of in vivo studies classically employed to characterize host-pathogen interactions take advantage of intraperitoneal injections of select bacteria or pathogen associated molecular patterns (PAMPs) in mice. While these techniques have yielded tremendous data associated with infectious disease pathobiology, intraperitoneal injection models are not always appropriate for host-pathogen interaction studies in the lung. Utilizing an acute lung inflammation model in mice, it is possible to conduct a high resolution analysis of the host innate immune response utilizing lipopolysaccharide (LPS). Here, we describe the methods to administer LPS using nonsurgical oropharyngeal intratracheal administration, monitor clinical parameters associated with disease pathogenesis, and utilize bronchoalveolar lavage fluid to evaluate the host immune response. The techniques that are described are widely applicable for studying the host innate immune response to a diverse range of PAMPs and pathogens. Likewise, with minor modifications, these techniques can also be applied in studies evaluating allergic airway inflammation and in pharmacological applications.
Infection, Issue 86, LPS, Lipopolysaccharide, mouse, pneumonia, gram negative bacteria, inflammation, acute lung inflammation, innate immunity, host pathogen interaction, lung, respiratory disease
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Diagnostic Necropsy and Selected Tissue and Sample Collection in Rats and Mice
Authors: Christina M. Parkinson, Alexandra O'Brien, Theresa M. Albers, Meredith A. Simon, Charles B. Clifford, Kathleen R. Pritchett-Corning.
Institutions: Charles River, Charles River, University of Washington.
There are multiple sample types that may be collected from a euthanized animal in order to help diagnose or discover infectious agents in an animal colony. Proper collection of tissues for further histological processing can impact the quality of testing results. This article describes the conduct of a basic gross examination, including identification of heart, liver, lungs, kidneys, and spleen, as well as how to collect those organs. Additionally, four of the more difficult tissue/sample collection techniques are demonstrated. Lung collection and perfusion can be particularly challenging, as the tissue needs to be properly inflated with a fixative in order for the inside of the tissue to fix properly and to enable thorough histologic evaluation. This article demonstrates the step-by-step technique to remove the lung and inflate it with fixative in order to achieve optimal fixation of the tissue within 24 hours. Brain collection can be similarly challenging, as the tissue is soft and easily damaged. This article demonstrates the step-by-step technique to expose and remove the brain from the skull with minimal damage to the tissue. The mesenteric lymph node is a good sample type in which to detect many common infectious agents, as enteric viruses persist longer in the lymph node than they are shed in feces. This article demonstrates the step-by-step procedure for locating and aseptically removing the mesenteric lymph node. Finally, identification of infectious agents of the respiratory tract may be performed by bacterial culture or PCR testing of nasal and/or bronchial fluid aspirates taken at necropsy. This procedure describes obtaining and preparing the respiratory aspirate sample for bacterial culture and PCR testing.
Anatomy, Issue 54, rodent, necropsy, diagnostic assay, bacteriology, PCR, organ collection, tissue sampling
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
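As an illustration of the localization step that gives FPALM-type imaging its ~10-30 nm precision, the sketch below fits a 2D Gaussian to one simulated single-molecule spot and reports its sub-pixel center; it is not the authors' analysis code, and the pixel size, photon counts, and PSF width are assumed values.

import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, amp, x0, y0, sigma, offset):
    """Symmetric 2D Gaussian, returned flattened for curve_fit."""
    x, y = coords
    g = amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2)) + offset
    return g.ravel()

rng = np.random.default_rng(0)
pixel_nm = 100.0                                   # assumed camera pixel size
yy, xx = np.mgrid[0:11, 0:11]                      # 11 x 11 pixel region of interest

# Simulate one molecule at (5.3, 4.7) pixels with Poisson photon noise.
expected = gauss2d((xx, yy), 200.0, 5.3, 4.7, 1.3, 10.0).reshape(11, 11)
spot = rng.poisson(expected).astype(float)

p0 = [spot.max(), 5.0, 5.0, 1.5, spot.min()]       # rough initial guess
popt, _ = curve_fit(gauss2d, (xx, yy), spot.ravel(), p0=p0)
print(f"localized center: x = {popt[1] * pixel_nm:.1f} nm, y = {popt[2] * pixel_nm:.1f} nm")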
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
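The pipeline above is built on the authors' MATLAB-based tooling; purely as a language-neutral illustration of the kind of time-stamped event record it processes, the hypothetical Python sketch below summarizes head entries per hopper per hour of a live-in day, the sort of daily summary that would be graphed for each mouse. The event codes and data are invented.

import numpy as np
import pandas as pd

# Hypothetical event codes for IR-beam head entries at the three hoppers.
HEAD_ENTRY = {1: "hopper_1", 2: "hopper_2", 3: "hopper_3"}

rng = np.random.default_rng(1)
events = pd.DataFrame({
    "t_s": np.sort(rng.uniform(0, 24 * 3600, 500)),   # time stamps over one day (s)
    "code": rng.choice(list(HEAD_ENTRY), 500),
})

events["hopper"] = events["code"].map(HEAD_ENTRY)
events["hour"] = (events["t_s"] // 3600).astype(int)

# Head entries per hopper per hour -- one simple daily progress summary.
summary = events.pivot_table(index="hour", columns="hopper",
                             values="t_s", aggfunc="count").fillna(0).astype(int)
print(summary.head(6))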
A Proboscis Extension Response Protocol for Investigating Behavioral Plasticity in Insects: Application to Basic, Biomedical, and Agricultural Research
Authors: Brian H. Smith, Christina M. Burden.
Institutions: Arizona State University.
Insects modify their responses to stimuli through experience of associating those stimuli with events important for survival (e.g., food, mates, threats). There are several behavioral mechanisms through which an insect learns salient associations and relates them to these events. It is important to understand this behavioral plasticity for programs aimed toward assisting insects that are beneficial for agriculture. This understanding can also be used for discovering solutions to biomedical and agricultural problems created by insects that act as disease vectors and pests. The Proboscis Extension Response (PER) conditioning protocol was developed for honey bees (Apis mellifera) over 50 years ago to study how they perceive and learn about floral odors, which signal the nectar and pollen resources a colony needs for survival. The PER procedure provides a robust and easy-to-employ framework for studying several different ecologically relevant mechanisms of behavioral plasticity. It is easily adaptable for use with several other insect species and other behavioral reflexes. These protocols can be readily employed in conjunction with various means for monitoring neural activity in the CNS via electrophysiology or bioimaging, or for manipulating targeted neuromodulatory pathways. It is a robust assay for rapidly detecting sub-lethal effects on behavior caused by environmental stressors, toxins or pesticides. We show how the PER protocol is straightforward to implement using two procedures. One is suitable as a laboratory exercise for students or for quick assays of the effect of an experimental treatment. The other provides more thorough control of variables, which is important for studies of behavioral conditioning. We show how several measures of the behavioral response, ranging from binary yes/no scores to more continuous variables like the latency and duration of proboscis extension, can be used to test hypotheses. Finally, we discuss some pitfalls that researchers commonly encounter when using the procedure for the first time.
Neuroscience, Issue 91, PER, conditioning, honey bee, olfaction, olfactory processing, learning, memory, toxin assay
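To make the two kinds of response measures mentioned above concrete, the sketch below (with fabricated data, not results from the protocol) summarizes a simulated conditioning session in two ways: the binary percentage of bees extending the proboscis on each trial, and the median latency of the extension among responders.

import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
n_bees, n_trials = 30, 6

# Binary PER score: probability of responding grows across trials (simulated learning).
responded = rng.random((n_bees, n_trials)) < np.linspace(0.2, 0.8, n_trials)
# Continuous measure: latency (s), defined only on trials with a response.
latency = np.where(responded, rng.gamma(2.0, 1.0, (n_bees, n_trials)), np.nan)

summary = pd.DataFrame({
    "trial": np.arange(1, n_trials + 1),
    "percent_PER": responded.mean(axis=0) * 100,        # binary yes/no measure
    "median_latency_s": np.nanmedian(latency, axis=0),  # continuous measure
})
print(summary.round(2))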
Laboratory Estimation of Net Trophic Transfer Efficiencies of PCB Congeners to Lake Trout (Salvelinus namaycush) from Its Prey
Authors: Charles P. Madenjian, Richard R. Rediske, James P. O'Keefe, Solomon R. David.
Institutions: U. S. Geological Survey, Grand Valley State University, Shedd Aquarium.
A technique for laboratory estimation of net trophic transfer efficiency (γ) of polychlorinated biphenyl (PCB) congeners to piscivorous fish from their prey is described herein. During a 135-day laboratory experiment, we fed bloater (Coregonus hoyi) that had been caught in Lake Michigan to lake trout (Salvelinus namaycush) kept in eight laboratory tanks. Bloater is a natural prey for lake trout. In four of the tanks, a relatively high flow rate was used to ensure relatively high activity by the lake trout, whereas a low flow rate was used in the other four tanks, allowing for low lake trout activity. On a tank-by-tank basis, the amount of food eaten by the lake trout on each day of the experiment was recorded. Each lake trout was weighed at the start and end of the experiment. Four to nine lake trout from each of the eight tanks were sacrificed at the start of the experiment, and all 10 lake trout remaining in each of the tanks were euthanized at the end of the experiment. We determined concentrations of 75 PCB congeners in the lake trout at the start of the experiment, in the lake trout at the end of the experiment, and in bloaters fed to the lake trout during the experiment. Based on these measurements, γ was calculated for each of 75 PCB congeners in each of the eight tanks. Mean γ was calculated for each of the 75 PCB congeners for both active and inactive lake trout. Because the experiment was replicated in eight tanks, the standard error about mean γ could be estimated. Results from this type of experiment are useful in risk assessment models to predict future risk to humans and wildlife eating contaminated fish under various scenarios of environmental contamination.
Environmental Sciences, Issue 90, trophic transfer efficiency, polychlorinated biphenyl congeners, lake trout, activity, contaminants, accumulation, risk assessment, toxic equivalents
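One way to express the tank-level bookkeeping described above, consistent with the abstract but with invented numbers, is sketched below: for a single congener, γ in each tank is the net PCB mass retained by the lake trout divided by the mass of that congener consumed with the bloater ration, and the four active-tank and four inactive-tank values yield means and standard errors.

import numpy as np

# Per-tank totals for one PCB congener (ng); values are illustrative only.
pcb_fish_start = np.array([120., 115., 130., 125., 118., 122., 127., 121.])
pcb_fish_end   = np.array([480., 455., 510., 470., 350., 365., 340., 358.])
pcb_consumed   = np.array([620., 600., 640., 615., 430., 445., 420., 440.])

gamma = (pcb_fish_end - pcb_fish_start) / pcb_consumed    # net trophic transfer efficiency
active, inactive = gamma[:4], gamma[4:]                   # high-flow vs. low-flow tanks

for label, g in (("active", active), ("inactive", inactive)):
    se = g.std(ddof=1) / np.sqrt(g.size)
    print(f"{label}: mean gamma = {g.mean():.3f} +/- {se:.3f} (SE, n = {g.size})")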
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
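For readers unfamiliar with the source-analysis step named in the keywords, the toy sketch below shows the linear algebra behind a regularized L2 minimum-norm estimate: given a lead field from the (individual or age-appropriate) head model and one sensor-space sample, source amplitudes are recovered with a Tikhonov-regularized inverse. The dimensions, noise covariance, and regularization value are made up; this is not the London Baby Lab pipeline.

import numpy as np

rng = np.random.default_rng(3)
n_sensors, n_sources = 128, 5000      # e.g., high-density EEG and a cortical source mesh

L = rng.normal(size=(n_sensors, n_sources))   # lead field from the head model (assumed)
C = np.eye(n_sensors)                         # noise covariance (identity for this toy case)
y = rng.normal(size=n_sensors)                # one time sample of preprocessed channel data
lam = 3.0                                     # regularization, related to the assumed SNR

# Minimum-norm inverse operator: K = L^T (L L^T + lam^2 C)^-1, source estimate j = K y.
K = L.T @ np.linalg.solve(L @ L.T + lam ** 2 * C, np.eye(n_sensors))
j_hat = K @ y
print(j_hat.shape)                            # one amplitude per source location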
A Method for Mouse Pancreatic Islet Isolation and Intracellular cAMP Determination
Authors: Joshua C. Neuman, Nathan A. Truchan, Jamie W. Joseph, Michelle E. Kimple.
Institutions: University of Wisconsin-Madison, University of Wisconsin-Madison, University of Waterloo.
Uncontrolled glycemia is a hallmark of diabetes mellitus and promotes morbidities like neuropathy, nephropathy, and retinopathy. With the increasing prevalence of diabetes, both immune-mediated type 1 and obesity-linked type 2, studies aimed at delineating diabetes pathophysiology and therapeutic mechanisms are of critical importance. The β-cells of the pancreatic islets of Langerhans are responsible for appropriately secreting insulin in response to elevated blood glucose concentrations. In addition to glucose and other nutrients, the β-cells are also stimulated by specific hormones, termed incretins, which are secreted from the gut in response to a meal and act on β-cell receptors that increase the production of intracellular cyclic adenosine monophosphate (cAMP). Decreased β-cell function, mass, and incretin responsiveness are well-understood to contribute to the pathophysiology of type 2 diabetes, and are also being increasingly linked with type 1 diabetes. The present mouse islet isolation and cAMP determination protocol can be a tool to help delineate mechanisms promoting disease progression and therapeutic interventions, particularly those that are mediated by the incretin receptors or related receptors that act through modulation of intracellular cAMP production. While only cAMP measurements will be described, the described islet isolation protocol creates a clean preparation that also allows for many other downstream applications, including glucose stimulated insulin secretion, [3H]-thymidine incorporation, protein abundance, and mRNA expression.
Physiology, Issue 88, islet, isolation, insulin secretion, β-cell, diabetes, cAMP production, mouse
Characterization of Surface Modifications by White Light Interferometry: Applications in Ion Sputtering, Laser Ablation, and Tribology Experiments
Authors: Sergey V. Baryshev, Robert A. Erck, Jerry F. Moore, Alexander V. Zinovev, C. Emil Tripa, Igor V. Veryovkin.
Institutions: Argonne National Laboratory, Argonne National Laboratory, MassThink LLC.
In materials science and engineering it is often necessary to obtain quantitative measurements of surface topography with micrometer lateral resolution. From the measured surface, 3D topographic maps can be subsequently analyzed using a variety of software packages to extract the information that is needed. In this article we describe how white light interferometry, and optical profilometry (OP) in general, combined with generic surface analysis software, can be used for materials science and engineering tasks. A number of applications of white light interferometry for the investigation of surface modifications in mass spectrometry, and of wear phenomena in tribology and lubrication, are demonstrated. We characterize the products of the interaction of semiconductors and metals with energetic ions (sputtering) and laser irradiation (ablation), as well as present ex situ measurements of wear of tribological test specimens. Specifically, we discuss: (1) aspects of traditional ion sputtering-based mass spectrometry, such as sputtering rate/yield measurements on Si and Cu and the subsequent time-to-depth conversion; (2) results of the quantitative characterization of the interaction of femtosecond laser irradiation with a semiconductor surface, which are important for applications such as ablation mass spectrometry, where the quantity of evaporated material can be studied and controlled via the pulse duration and energy per pulse, so that by determining the crater geometry one can define depth and lateral resolution versus experimental setup conditions; and (3) measurements of surface roughness parameters in two dimensions and quantitative measurements of the surface wear that occurs as a result of friction and wear tests. Some inherent drawbacks, possible artifacts, and uncertainty assessments of the white light interferometry approach are also discussed and explained.
Materials Science, Issue 72, Physics, Ion Beams (nuclear interactions), Light Reflection, Optical Properties, Semiconductor Materials, White Light Interferometry, Ion Sputtering, Laser Ablation, Femtosecond Lasers, Depth Profiling, Time-of-flight Mass Spectrometry, Tribology, Wear Analysis, Optical Profilometry, wear, friction, atomic force microscopy, AFM, scanning electron microscopy, SEM, imaging, visualization
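A worked example of the time-to-depth conversion mentioned above, with invented numbers: the sputter-crater depth measured by white light interferometry divided by the total sputtering time gives an average erosion rate, which converts the time axis of a depth profile into depth (assuming the rate is constant).

# Illustration only; the depth, time, and constant-rate assumption are hypothetical.
crater_depth_nm = 850.0          # crater depth measured by white light interferometry
sputter_time_s = 1200.0          # total ion-beam sputtering time
rate_nm_per_s = crater_depth_nm / sputter_time_s

def time_to_depth(t_s: float) -> float:
    """Convert elapsed sputtering time to eroded depth, assuming a constant rate."""
    return rate_nm_per_s * t_s

print(f"erosion rate = {rate_nm_per_s:.3f} nm/s; depth after 300 s = {time_to_depth(300):.0f} nm")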
Direct Pressure Monitoring Accurately Predicts Pulmonary Vein Occlusion During Cryoballoon Ablation
Authors: Ioanna Kosmidou, Shannnon Wooden, Brian Jones, Thomas Deering, Andrew Wickliffe, Dan Dan.
Institutions: Piedmont Heart Institute, Medtronic Inc..
Cryoballoon ablation (CBA) is an established therapy for atrial fibrillation (AF). Pulmonary vein (PV) occlusion is essential for achieving antral contact and PV isolation and is typically assessed by contrast injection. We present a novel method of direct pressure monitoring for assessment of PV occlusion. Transcatheter pressure is monitored during balloon advancement to the PV antrum. Pressure is recorded via a single pressure transducer connected to the inner lumen of the cryoballoon. Pressure curve characteristics are used to assess occlusion in conjunction with fluoroscopic or intracardiac echocardiography (ICE) guidance. PV occlusion is confirmed when loss of typical left atrial (LA) pressure waveform is observed with recordings of PA pressure characteristics (no A wave and rapid V wave upstroke). Complete pulmonary vein occlusion as assessed with this technique has been confirmed with concurrent contrast utilization during the initial testing of the technique and has been shown to be highly accurate and readily reproducible. We evaluated the efficacy of this novel technique in 35 patients. A total of 128 veins were assessed for occlusion with the cryoballoon utilizing the pressure monitoring technique; occlusive pressure was demonstrated in 113 veins with resultant successful pulmonary vein isolation in 111 veins (98.2%). Occlusion was confirmed with subsequent contrast injection during the initial ten procedures, after which contrast utilization was rapidly reduced or eliminated given the highly accurate identification of occlusive pressure waveform with limited initial training. Verification of PV occlusive pressure during CBA is a novel approach to assessing effective PV occlusion and it accurately predicts electrical isolation. Utilization of this method results in significant decrease in fluoroscopy time and volume of contrast.
Medicine, Issue 72, Anatomy, Physiology, Cardiology, Biomedical Engineering, Surgery, Cardiovascular System, Cardiovascular Diseases, Surgical Procedures, Operative, Investigative Techniques, Atrial fibrillation, Cryoballoon Ablation, Pulmonary Vein Occlusion, Pulmonary Vein Isolation, electrophysiology, catheterization, heart, vein, clinical, surgical device, surgical techniques
Right Ventricular Systolic Pressure Measurements in Combination with Harvest of Lung and Immune Tissue Samples in Mice
Authors: Wen-Chi Chen, Sung-Hyun Park, Carol Hoffman, Cecil Philip, Linda Robinson, James West, Gabriele Grunig.
Institutions: New York University School of Medicine, Tuxedo, Vanderbilt University Medical Center, New York University School of Medicine.
The function of the right heart is to pump blood through the lungs, thus linking right heart physiology and pulmonary vascular physiology. Inflammation is a common modifier of heart and lung function, by elaborating cellular infiltration, production of cytokines and growth factors, and by initiating remodeling processes 1. Compared to the left ventricle, the right ventricle is a low-pressure pump that operates in a relatively narrow zone of pressure changes. Increased pulmonary artery pressures are associated with increased pressure in the lung vascular bed and pulmonary hypertension 2. Pulmonary hypertension is often associated with inflammatory lung diseases, for example chronic obstructive pulmonary disease, or autoimmune diseases 3. Because pulmonary hypertension confers a poor prognosis for quality of life and life expectancy, much research is directed towards understanding the mechanisms that might be targets for pharmaceutical intervention 4. The main challenge for the development of effective management tools for pulmonary hypertension remains the complexity of the simultaneous understanding of molecular and cellular changes in the right heart, the lungs and the immune system. Here, we present a procedural workflow for the rapid and precise measurement of pressure changes in the right heart of mice and the simultaneous harvest of samples from heart, lungs and immune tissues. The method is based on the direct catheterization of the right ventricle via the jugular vein in close-chested mice, first developed in the late 1990s as a surrogate measure of pressures in the pulmonary artery5-13. The organized team approach facilitates a very rapid right heart catheterization technique. This makes it possible to perform the measurements in mice that spontaneously breathe room air. The organization of the workflow in distinct work areas reduces time delay and opens the possibility to simultaneously perform physiology experiments and harvest immune, heart and lung tissues. The procedural workflow outlined here can be adapted for a wide variety of laboratory settings and study designs, from small, targeted experiments to large drug screening assays. The simultaneous acquisition of cardiac physiology data, which can be expanded to include echocardiography5,14-17, and harvest of heart, lung and immune tissues reduces the number of animals needed to obtain data that move the scientific knowledge base forward. The procedural workflow presented here also provides an ideal basis for gaining knowledge of the networks that link immune, lung and heart function. The same principles outlined here can be adapted to study other or additional organs as needed.
Immunology, Issue 71, Medicine, Anatomy, Physiology, Cardiology, Surgery, Cardiovascular Abnormalities, Inflammation, Respiration Disorders, Immune System Diseases, Cardiac physiology, mouse, pulmonary hypertension, right heart function, lung immune response, lung inflammation, lung remodeling, catheterization, mice, tissue, animal model
Mouse Islet of Langerhans Isolation using a Combination of Purified Collagenase and Neutral Protease
Authors: Natalie D. Stull, Andrew Breite, Robert McCarthy, Sarah A. Tersey, Raghavendra G. Mirmira.
Institutions: Indiana University School of Medicine, VITACYTE, LLC, Indiana University School of Medicine, Indiana University School of Medicine.
The interrogation of beta cell gene expression and function in vitro has squarely shifted over the years from the study of rodent tumorigenic cell lines to the study of isolated rodent islets. Primary islets offer the distinct advantage that they more faithfully reflect the biology of intracellular signaling pathways and secretory responses. Whereas the method of islet isolation using tissue dissociating enzyme (TDE) preparations has been well established in many laboratories1-4, variations in the consistency of islet yield and quality from any given rodent strain limit the extent and feasibility of primary islet studies. These variations often occur as a result of the crude, partially purified TDEs used in the islet isolation procedure; TDEs frequently exhibit lot-to-lot variations in activity and often require adjustments to the dose of enzyme used. A small number of reports have used purified TDEs for rodent cell isolations5, 6, but the practice is not widespread despite the routine use and advantages of purified TDEs for human islet isolations. In collaboration with VitaCyte, LLC (Indianapolis, IN), we developed a modified mouse islet isolation protocol based on that described by Gotoh7, 8, in which the TDEs are perfused directly into the pancreatic duct of mice, followed by crude tissue fractionation through a Histopaque gradient9, and isolation of purified islets. A significant difference in our protocol is the use of a purified collagenase (CIzyme MA) and neutral protease (CIzyme BP) combination. The collagenase was characterized by the use of a fluorescence collagen degrading activity (CDA) assay that utilized fluorescently labeled soluble calf skin fibrils as substrate6. This substrate is more predictive of the kinetics of collagen degradation in the tissue matrix because it relies on native collagen as the substrate. The protease was characterized with a sensitive fluorescent kinetic assay10. Utilizing these improved assays along with more traditional biochemical analysis enables the TDE to be manufactured more consistently, leading to improved performance consistency between lots. The protocol described here was optimized for maximal islet yield and optimal islet morphology using C57BL/6 mice. During the development of this protocol, several combinations of collagenase and neutral proteases were evaluated at different concentrations, and the final collagenase:neutral protease ratio of 35:10 represents enzyme performance comparable to Sigma Type XI. Because significant variability in average islet yields from different strains of rats and mice has been reported, additional modifications of the TDE composition should be made to improve the yield and quality of islets recovered from different species and strains.
Cellular Biology, Issue 67, Islet, collagenase, mouse, insulin, fluorescence
Monitoring the Wall Mechanics During Stent Deployment in a Vessel
Authors: Brian D. Steinert, Shijia Zhao, Linxia Gu.
Institutions: University of Nebraska-Lincoln.
Clinical trials have reported different restenosis rates for various stent designs1. It is speculated that stent-induced strain concentrations on the arterial wall lead to tissue injury, which initiates restenosis2-7. This hypothesis needs further investigation, including better quantification of the non-uniform strain distribution on the artery following stent implantation. A non-contact surface strain measurement method for the stented artery is presented in this work. The ARAMIS stereo optical surface strain measurement system uses two high-speed optical cameras to capture the motion of each reference point and resolve three-dimensional strains over the deforming surface8,9. As a mesh stent is deployed into a latex vessel with a random contrasting pattern sprayed or drawn on its outer surface, the surface strain is recorded at every instant of the deformation. The calculated strain distributions can then be used to understand the local lesion response, validate the computational models, and formulate hypotheses for further in vivo study.
Biomedical Engineering, Issue 63, Stent, vessel, interaction, strain distribution, stereo optical surface strain measurement system, bioengineering
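As a toy illustration of the surface-strain idea behind the stereo optical measurement (not the ARAMIS algorithm, and with invented coordinates), the sketch below tracks two neighboring surface points before and after stent deployment and computes the local engineering and Green-Lagrange strains from the change in their separation.

import numpy as np

# 3D coordinates (mm) of two neighboring surface points: reference vs. deformed state.
ref = np.array([[0.00, 0.00, 0.00],
                [0.50, 0.00, 0.00]])
deformed = np.array([[0.00, 0.00, 0.00],
                     [0.58, 0.03, 0.00]])

L0 = np.linalg.norm(ref[1] - ref[0])             # reference gauge length
L1 = np.linalg.norm(deformed[1] - deformed[0])   # deformed gauge length

engineering_strain = (L1 - L0) / L0
green_lagrange_strain = (L1 ** 2 - L0 ** 2) / (2 * L0 ** 2)
print(f"engineering strain = {engineering_strain:.3f}, "
      f"Green-Lagrange strain = {green_lagrange_strain:.3f}")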
Mechanical Testing of Mouse Carotid Arteries: from Newborn to Adult
Authors: Mazyar Amin, Victoria P. Le, Jessica E. Wagenseil.
Institutions: Saint Louis University.
The large conducting arteries in vertebrates are composed of a specialized extracellular matrix designed to provide pulse dampening and reduce the work performed by the heart. The mix of matrix proteins determines the passive mechanical properties of the arterial wall1. When the matrix proteins are altered in development, aging, disease or injury, the arterial wall remodels, changing the mechanical properties and leading to subsequent cardiac adaptation2. In normal development, the remodeling leads to a functional cardiac and cardiovascular system optimized for the needs of the adult organism. In disease, the remodeling often leads to a negative feedback cycle that can cause cardiac failure and death. By quantifying passive arterial mechanical properties in development and disease, we can begin to understand the normal remodeling process to recreate it in tissue engineering and the pathological remodeling process to test disease treatments. Mice are useful models for studying passive arterial mechanics in development and disease. They have a relatively short lifespan (mature adults by 3 months and aged adults by 2 years), so developmental3 and aging studies4 can be carried out over a limited time course. The advances in mouse genetics provide numerous genotypes and phenotypes to study changes in arterial mechanics with disease progression5 and disease treatment6. Mice can also be manipulated experimentally to study the effects of changes in hemodynamic parameters on the arterial remodeling process7. One drawback of the mouse model, especially for examining young ages, is the size of the arteries. We describe a method for passive mechanical testing of carotid arteries from mice aged 3 days to adult (approximately 90 days). We adapt a commercial myograph system to mount the arteries and perform multiple pressure or axial stretch protocols on each specimen. We discuss suitable protocols for each age, the necessary measurements and provide example data. We also include data analysis strategies for rigorous mechanical characterization of the arteries.
Bioengineering, Issue 60, blood vessel, artery, mechanics, pressure, diameter, postnatal development
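To show the kind of passive characterization this protocol feeds into, the sketch below converts a hypothetical pressure-diameter data set (invented values, fixed axial stretch) into circumferential stretch and a thin-wall Laplace estimate of mean wall stress; the wall thickness is assumed constant here, a simplification that a full analysis would refine.

import numpy as np

pressure_mmHg = np.array([0, 25, 50, 75, 100, 125])        # applied lumen pressure
outer_diam_um = np.array([280, 360, 420, 455, 475, 488])   # measured outer diameter
unloaded_outer_um = 280.0
wall_thickness_um = 40.0                                   # assumed constant for simplicity

stretch = outer_diam_um / unloaded_outer_um                # circumferential stretch (outer diameter)

P_kPa = pressure_mmHg * 0.1333                             # 1 mmHg = 0.1333 kPa
inner_radius_um = outer_diam_um / 2 - wall_thickness_um
stress_kPa = P_kPa * inner_radius_um / wall_thickness_um   # sigma = P * r_i / h (thin-wall estimate)

for p, lam, s in zip(pressure_mmHg, stretch, stress_kPa):
    print(f"{int(p):3d} mmHg: stretch = {lam:.2f}, mean wall stress = {s:6.1f} kPa")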
Measuring the Subjective Value of Risky and Ambiguous Options using Experimental Economics and Functional MRI Methods
Authors: Ifat Levy, Lior Rosenberg Belmaker, Kirk Manson, Agnieszka Tymula, Paul W. Glimcher.
Institutions: Yale School of Medicine, Yale School of Medicine, New York University , New York University , New York University .
Most of the choices we make have uncertain consequences. In some cases the probabilities for different possible outcomes are precisely known, a condition termed "risky". In other cases, when probabilities cannot be estimated, the condition is described as "ambiguous". While most people are averse to both risk and ambiguity1,2, the degree of those aversions varies substantially across individuals, such that the subjective value of the same risky or ambiguous option can be very different for different individuals. We combine functional MRI (fMRI) with an experimental economics-based method3 to assess the neural representation of the subjective values of risky and ambiguous options4. This technique can now be used to study these neural representations in different populations, such as different age groups and different patient populations. In our experiment, subjects make consequential choices between two alternatives while their neural activation is tracked using fMRI. On each trial subjects choose between lotteries that vary in their monetary amount and in either the probability of winning that amount or the ambiguity level associated with winning. Our parametric design allows us to use each individual's choice behavior to estimate their attitudes towards risk and ambiguity, and thus to estimate the subjective values that each option held for them. Another important feature of the design is that the outcome of the chosen lottery is not revealed during the experiment, so that no learning can take place, and thus the ambiguous options remain ambiguous and risk attitudes are stable. Instead, at the end of the scanning session one or a few trials are randomly selected and played for real money. Since subjects do not know beforehand which trials will be selected, they must treat each and every trial as if it and it alone were the one trial on which they will be paid. This design ensures that we can estimate the true subjective value of each option to each subject. We then look for areas in the brain whose activation is correlated with the subjective value of risky options and for areas whose activation is correlated with the subjective value of ambiguous options.
Neuroscience, Issue 67, Medicine, Molecular Biology, fMRI, magnetic resonance imaging, decision-making, value, uncertainty, risk, ambiguity
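For concreteness, the sketch below implements one subjective-value parameterization that is common in this literature, SV = (p - beta*A/2) * x**alpha, with a logistic choice rule; this is a hedged illustration with made-up parameter values, not necessarily the exact model fit in this article.

import numpy as np

def subjective_value(x, p, A, alpha, beta):
    """x: payoff, p: known win probability, A: ambiguity level (0 = fully known odds)."""
    return (p - beta * A / 2.0) * x ** alpha

def p_choose_lottery(sv_lottery, sv_reference, temperature=1.0):
    """Logistic (softmax) probability of choosing the lottery over a reference option."""
    return 1.0 / (1.0 + np.exp(-(sv_lottery - sv_reference) / temperature))

# Example: a risk-averse (alpha < 1), ambiguity-averse (beta > 0) subject.
sv_risky = subjective_value(x=20.0, p=0.25, A=0.0, alpha=0.7, beta=0.6)
sv_ambiguous = subjective_value(x=20.0, p=0.50, A=0.5, alpha=0.7, beta=0.6)
print(f"SV(risky) = {sv_risky:.2f}, SV(ambiguous) = {sv_ambiguous:.2f}, "
      f"P(choose ambiguous) = {p_choose_lottery(sv_ambiguous, sv_risky):.2f}")

In an analysis like the one the abstract describes, parameters such as alpha and beta would be estimated per subject from their full set of choices, and the resulting trial-by-trial subjective values would serve as regressors in the fMRI analysis.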
Using the Threat Probability Task to Assess Anxiety and Fear During Uncertain and Certain Threat
Authors: Daniel E. Bradford, Katherine P. Magruder, Rachel A. Korhumel, John J. Curtin.
Institutions: University of Wisconsin-Madison.
Fear of certain threat and anxiety about uncertain threat are distinct emotions with unique behavioral, cognitive-attentional, and neuroanatomical components. Both anxiety and fear can be studied in the laboratory by measuring the potentiation of the startle reflex. The startle reflex is a defensive reflex that is potentiated when an organism is threatened and the need for defense is high. The startle reflex is assessed via electromyography (EMG) in the orbicularis oculi muscle elicited by brief, intense bursts of acoustic white noise (i.e., "startle probes"). Startle potentiation is calculated as the increase in startle response magnitude during presentation of sets of visual threat cues that signal delivery of mild electric shock relative to sets of matched cues that signal the absence of shock (no-threat cues). In the Threat Probability Task, fear is measured via startle potentiation to high probability (100% cue-contingent shock; certain) threat cues, whereas anxiety is measured via startle potentiation to low probability (20% cue-contingent shock; uncertain) threat cues. Measurement of startle potentiation during the Threat Probability Task provides an objective and easily implemented alternative to assessment of negative affect via self-report or other methods (e.g., neuroimaging) that may be inappropriate or impractical for some researchers. Startle potentiation has been studied rigorously in both animals (e.g., rodents, non-human primates) and humans, which facilitates animal-to-human translational research. Startle potentiation during certain and uncertain threat provides an objective measure of negative affect and of distinct emotional states (fear, anxiety) for use in research on psychopathology, substance use/abuse, and affective science more broadly. As such, it has been used extensively by clinical scientists interested in psychopathology etiology and by affective scientists interested in individual differences in emotion.
Behavior, Issue 91, Startle; electromyography; shock; addiction; uncertainty; fear; anxiety; humans; psychophysiology; translational
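The startle-potentiation arithmetic described above is simple enough to show directly; the sketch below (fabricated EMG magnitudes) computes fear potentiation from the certain-threat cues and anxiety potentiation from the uncertain-threat cues, each relative to the matched no-threat cues.

import numpy as np

# Mean orbicularis oculi startle magnitudes (arbitrary EMG units) for one participant.
no_threat = np.array([42.0, 39.0, 45.0, 41.0])
certain   = np.array([88.0, 92.0, 85.0, 90.0])   # 100% cue-contingent shock
uncertain = np.array([70.0, 74.0, 69.0, 72.0])   # 20% cue-contingent shock

fear_potentiation = certain.mean() - no_threat.mean()       # index of fear
anxiety_potentiation = uncertain.mean() - no_threat.mean()  # index of anxiety
print(f"fear potentiation = {fear_potentiation:.1f}, "
      f"anxiety potentiation = {anxiety_potentiation:.1f}")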
An Assay for Permeability of the Zebrafish Embryonic Neuroepithelium
Authors: Jessica T. Chang, Hazel Sive.
Institutions: Massachusetts Institute of Technology, Whitehead Institute for Biomedical Research.
The brain ventricular system is conserved among vertebrates and is composed of a series of interconnected cavities called brain ventricles, which form during the earliest stages of brain development and are maintained throughout the animal's life. The brain ventricular system is found in vertebrates, and the ventricles develop after neural tube formation, when the central lumen fills with cerebrospinal fluid (CSF) 1,2. CSF is a protein rich fluid that is essential for normal brain development and function3-6. In zebrafish, brain ventricle inflation begins at approximately 18 hr post fertilization (hpf), after the neural tube is closed. Multiple processes are associated with brain ventricle formation, including formation of a neuroepithelium, tight junction formation that regulates permeability and CSF production. We showed that the Na,K-ATPase is required for brain ventricle inflation, impacting all these processes 7,8, while claudin 5a is necessary for tight junction formation 9. Additionally, we showed that "relaxation" of the embryonic neuroepithelium, via inhibition of myosin, is associated with brain ventricle inflation. To investigate the regulation of permeability during zebrafish brain ventricle inflation, we developed a ventricular dye retention assay. This method uses brain ventricle injection in a living zebrafish embryo, a technique previously developed in our lab10, to fluorescently label the cerebrospinal fluid. Embryos are then imaged over time as the fluorescent dye moves through the brain ventricles and neuroepithelium. The distance the dye front moves away from the basal (non-luminal) side of the neuroepithelium over time is quantified and is a measure of neuroepithelial permeability (Figure 1). We observe that dyes 70 kDa and smaller will move through the neuroepithelium and can be detected outside the embryonic zebrafish brain at 24 hpf (Figure 2). This dye retention assay can be used to analyze neuroepithelial permeability in a variety of different genetic backgrounds, at different times during development, and after environmental perturbations. It may also be useful in examining pathological accumulation of CSF. Overall, this technique allows investigators to analyze the role and regulation of permeability during development and disease.
Neuroscience, Issue 68, Zebrafish, neuroepithelium, brain ventricle, permeability
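As a sketch of the quantification described above (with invented measurements), the dye-front distance from the basal side of the neuroepithelium can be recorded at several time points after ventricle injection and summarized by the slope of a linear fit, giving a single front-velocity index for comparing genotypes, developmental stages, or treatments.

import numpy as np

time_min = np.array([0, 10, 20, 30, 40, 60])                       # minutes after dye injection
front_distance_um = np.array([0.0, 8.0, 15.0, 24.0, 31.0, 46.0])   # hypothetical measurements

# Slope of a linear fit = dye-front velocity, a simple permeability readout.
slope, intercept = np.polyfit(time_min, front_distance_um, 1)
print(f"dye front velocity ~ {slope:.2f} um/min")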

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In both situations, the algorithm still displays the most relevant videos it can find, which can sometimes result in matches that are only loosely related.
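This page does not describe the matching algorithm itself. Purely as an illustration of one conventional way such abstract-to-video matching can be done, the sketch below ranks a few hypothetical video descriptions against a single abstract using TF-IDF vectors and cosine similarity; the texts, names, and scoring choices are all assumptions.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

abstract = "Forced oscillation measurements of lung function in anesthetized mice."
video_texts = {                                   # hypothetical video descriptions
    "video_A": "Assessment of murine lung mechanics with a computer-controlled ventilator",
    "video_B": "Isolation of pancreatic islets and cAMP determination in mice",
    "video_C": "Bronchoalveolar lavage to evaluate the host immune response in mice",
}

vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform([abstract] + list(video_texts.values()))
scores = cosine_similarity(tfidf[0:1], tfidf[1:]).ravel()

# Rank candidate videos by similarity to the abstract, best first.
for name, score in sorted(zip(video_texts, scores), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {score:.2f}")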