Authors: Mauro W. Zappaterra, Anthony S. LaMantia, Christopher A. Walsh, Maria K. Lehtinen.
Published: 03-11-2013
The CSF is a complex fluid with a dynamically varying proteome throughout development and in adulthood. During embryonic development, the nascent CSF differentiates from the amniotic fluid upon closure of the anterior neural tube. CSF volume then increases over subsequent days as the neuroepithelial progenitor cells lining the ventricles and the choroid plexus generate CSF. The embryonic CSF contacts the apical, ventricular surface of the neural stem cells of the developing brain and spinal cord. CSF provides crucial fluid pressure for the expansion of the developing brain and distributes important growth promoting factors to neural progenitor cells in a temporally-specific manner. To investigate the function of the CSF, it is important to isolate pure samples of embryonic CSF without contamination from blood or the developing telencephalic tissue. Here, we describe a technique to isolate relatively pure samples of ventricular embryonic CSF that can be used for a wide range of experimental assays including mass spectrometry, protein electrophoresis, and cell and primary explant culture. We demonstrate how to dissect and culture cortical explants on porous polycarbonate membranes in order to grow developing cortical tissue with reduced volumes of media or CSF. With this method, experiments can be performed using CSF from varying ages or conditions to investigate the biological activity of the CSF proteome on target cells.
25 Related JoVE Articles!
Measurement Of Neuromagnetic Brain Function In Pre-school Children With Custom Sized MEG
Authors: Graciela Tesan, Blake W. Johnson, Melanie Reid, Rosalind Thornton, Stephen Crain.
Institutions: Macquarie University.
Magnetoencephalography (MEG) is a technique that detects magnetic fields associated with cortical activity [1]. The electrophysiological activity of the brain generates electric fields - which can be recorded using electroencephalography (EEG) - and their concomitant magnetic fields - detected by MEG. MEG signals are detected by specialized sensors known as superconducting quantum interference devices (SQUIDs). Superconducting sensors require cooling with liquid helium at -270 °C. They are contained inside a vacuum-insulated helmet called a dewar, which is filled with liquid helium. SQUIDs are placed in fixed positions inside the helmet dewar in the helium coolant, and a subject's head is placed inside the helmet dewar for MEG measurements. The helmet dewar must be sized to satisfy opposing constraints. Clearly, it must be large enough to fit most or all of the heads in the population that will be studied. However, the helmet must also be small enough to keep most of the SQUID sensors within range of the tiny cerebral fields that they are to measure. Conventional whole-head MEG systems are designed to accommodate more than 90% of adult heads. However, adult systems are not well suited for measuring brain function in pre-school children, whose heads have a radius several cm smaller than adults'. The KIT-Macquarie Brain Research Laboratory at Macquarie University uses a MEG system custom sized to fit the heads of pre-school children. This child system has 64 first-order axial gradiometers with a 50 mm baseline [2] and is contained inside a magnetically-shielded room (MSR) together with a conventional adult-sized MEG system [3,4]. There are three main advantages of the customized helmet dewar for studying children. First, the smaller radius of the sensor configuration brings the SQUID sensors into range of the neuromagnetic signals of children's heads. Second, the smaller helmet allows full insertion of a child's head into the dewar. 
In adult helmet dewars, full insertion is prevented by children's smaller crown-to-shoulder distance. These two factors are fundamental in recording brain activity using MEG because neuromagnetic signals attenuate rapidly with distance. Third, the customized child helmet aids in the symmetric positioning of the head and limits the freedom of movement of the child's head within the dewar. When used with a protocol that aligns the requirements of data collection with the motivational and behavioral capacities of children, these features significantly facilitate setup, positioning, and measurement of MEG signals.
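The payoff of a smaller helmet can be roughed out numerically. A minimal sketch assuming a simple ~1/r² falloff of a dipolar field with sensor-to-source distance (the full MEG forward model is more involved, and the distances below are illustrative, not from the article):

```python
def relative_signal(child_gap_mm, adult_gap_mm):
    """Approximate ratio of measured field strength when the
    scalp-to-sensor gap shrinks, assuming a ~1/r^2 dipolar falloff."""
    return (adult_gap_mm / child_gap_mm) ** 2

# A source 20 mm from the sensors instead of 40 mm roughly
# quadruples the measured field.
```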
Neuroscience, Issue 36, Magnetoencephalography, Pediatrics, Brain Mapping, Language, Brain Development, Cognitive Neuroscience, Language Acquisition, Linguistics
Fundus Photography as a Convenient Tool to Study Microvascular Responses to Cardiovascular Disease Risk Factors in Epidemiological Studies
Authors: Patrick De Boever, Tijs Louwies, Eline Provost, Luc Int Panis, Tim S. Nawrot.
Institutions: Flemish Institute for Technological Research (VITO), Hasselt University, Hasselt University, Leuven University.
The microcirculation consists of blood vessels with diameters less than 150 µm. It makes up a large part of the circulatory system and plays an important role in maintaining cardiovascular health. The retina lines the interior of the eye and is the only tissue that allows non-invasive analysis of the microvasculature. Nowadays, high-quality fundus images can be acquired using digital cameras. Retinal images can be collected in 5 min or less, even without dilatation of the pupils. This unobtrusive and fast procedure for visualizing the microcirculation is attractive for epidemiological studies and for monitoring cardiovascular health from early age up to old age. Systemic diseases that affect the circulation can result in progressive morphological changes in the retinal vasculature. For example, changes in the vessel calibers of retinal arteries and veins have been associated with hypertension, atherosclerosis, and increased risk of stroke and myocardial infarction. Vessel widths are derived using image analysis software, and the widths of the six largest arteries and veins are summarized in the Central Retinal Arteriolar Equivalent (CRAE) and the Central Retinal Venular Equivalent (CRVE). These summary measures have proven useful for studying the impact of modifiable lifestyle and environmental cardiovascular disease risk factors. The procedures to acquire fundus images and the analysis steps to obtain CRAE and CRVE are described. Coefficients of variation of repeated measures of CRAE and CRVE are less than 2%, and within-rater reliability is very high. Using a panel study, the rapid response of the retinal vessel calibers to short-term changes in particulate air pollution, a known risk factor for cardiovascular mortality and morbidity, is reported. In conclusion, retinal imaging is proposed as a convenient tool for studying microvascular responses to cardiovascular disease risk factors in epidemiological studies.
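The article does not spell out the CRAE/CRVE formulas, but a widely used convention (the revised Parr-Hubbard formulas of Knudtson et al.) combines the six largest vessel widths pairwise, largest with smallest, using branching coefficients of about 0.88 for arterioles and 0.95 for venules. A sketch under that assumption:

```python
import math

def combine(w1, w2, k):
    """Combine a pair of vessel widths into one equivalent width."""
    return k * math.sqrt(w1 ** 2 + w2 ** 2)

def retinal_equivalent(widths, k):
    """Iteratively pair the largest with the smallest width until a
    single summary caliber remains; an odd middle width is carried
    forward unchanged to the next round."""
    w = sorted(widths)
    while len(w) > 1:
        paired = []
        while len(w) > 1:
            paired.append(combine(w.pop(0), w.pop(-1), k))
        if w:  # odd count: carry the middle width forward
            paired.append(w.pop())
        w = sorted(paired)
    return w[0]

def crae(six_widths):  # arteriolar branching coefficient ~0.88
    return retinal_equivalent(six_widths, 0.88)

def crve(six_widths):  # venular branching coefficient ~0.95
    return retinal_equivalent(six_widths, 0.95)
```

Because the venular coefficient is larger, CRVE exceeds CRAE for identical input widths, matching the usual observation that venular summary calibers are the larger of the two.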
Medicine, Issue 92, retina, microvasculature, image analysis, Central Retinal Arteriolar Equivalent, Central Retinal Venular Equivalent, air pollution, particulate matter, black carbon
Quantitative Optical Microscopy: Measurement of Cellular Biophysical Features with a Standard Optical Microscope
Authors: Kevin G. Phillips, Sandra M. Baker-Groberg, Owen J.T. McCarty.
Institutions: Oregon Health & Science University, School of Medicine, Oregon Health & Science University, School of Medicine, Oregon Health & Science University, School of Medicine.
We describe the use of a standard optical microscope to perform quantitative measurements of mass, volume, and density on cellular specimens through a combination of bright field and differential interference contrast imagery. Two primary approaches are presented: noninterferometric quantitative phase microscopy (NIQPM), to perform measurements of total cell mass and subcellular density distribution, and Hilbert transform differential interference contrast microscopy (HTDIC), to determine volume. NIQPM is based on a simplified model of wave propagation, termed the paraxial approximation, with three underlying assumptions: low numerical aperture (NA) illumination, weak scattering, and weak absorption of light by the specimen. Fortunately, unstained cellular specimens satisfy these assumptions, and low NA illumination is easily achieved on commercial microscopes. HTDIC is used to obtain volumetric information from through-focus DIC imagery under high NA illumination conditions. High NA illumination enables enhanced sectioning of the specimen along the optical axis. Hilbert transform processing on the DIC image stacks greatly enhances edge detection algorithms for localization of the specimen borders in three dimensions by separating the gray values of the specimen intensity from those of the background. The primary advantages of NIQPM and HTDIC lie in their technological accessibility using “off-the-shelf” microscopes. There are two basic limitations of these methods: slow z-stack acquisition on commercial scopes currently precludes the investigation of phenomena faster than 1 frame/min, and diffraction effects restrict the utility of NIQPM and HTDIC to objects 0.2 - 10 μm and 0.2 - 20 μm in diameter, respectively. Hence, the specimen and the time dynamics of interest must meet certain size and temporal constraints for these methods to be applicable. 
Excitingly, most fixed cellular specimens are readily investigated with these methods.
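The Hilbert-transform step can be illustrated on a single DIC scan line. A minimal FFT-based sketch of the one-dimensional analytic-signal envelope (the article's processing of full through-focus stacks is more involved):

```python
import numpy as np

def analytic_envelope(signal):
    """FFT-based analytic-signal magnitude along one DIC scan line.
    Suppressing negative frequencies separates the specimen's gray
    values from those of the background, sharpening border detection."""
    n = signal.size
    spec = np.fft.fft(signal - signal.mean())  # remove background offset
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0                    # double positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0                        # keep Nyquist bin as-is
    return np.abs(np.fft.ifft(spec * h))       # envelope ~ specimen mask
```

On a synthetic DIC-like edge (a derivative-of-Gaussian profile, which crosses zero at the object center), the envelope peaks at the center while the background stays near zero, which is exactly what makes thresholding the borders easier.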
Bioengineering, Issue 86, Label-free optics, quantitative microscopy, cellular biophysics, cell mass, cell volume, cell density
Ultrasound Assessment of Endothelial-Dependent Flow-Mediated Vasodilation of the Brachial Artery in Clinical Research
Authors: Hugh Alley, Christopher D. Owens, Warren J. Gasper, S. Marlene Grenon.
Institutions: University of California, San Francisco, Veterans Affairs Medical Center, San Francisco, Veterans Affairs Medical Center, San Francisco.
The vascular endothelium is a monolayer of cells that lines the interior of blood vessels and serves both structural and functional roles. The endothelium acts as a barrier, preventing leukocyte adhesion and aggregation, as well as controlling permeability to plasma components. Functionally, the endothelium affects vessel tone. Endothelial dysfunction is an imbalance among the chemical species that regulate vessel tone, thromboresistance, cellular proliferation, and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia. The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase of intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed, while subjects with endothelial damage experienced paradoxical vasoconstriction. There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia. This technique, known as endothelium-dependent flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator dependent and presents a steep learning curve. 
This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
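The core FMD quantity is simply the percent diameter change from the resting baseline. A minimal sketch with illustrative diameters (not values from the article):

```python
def fmd_percent(baseline_mm, peak_mm):
    """Flow-mediated dilation: percent change of the brachial artery
    diameter during reactive hyperemia relative to baseline."""
    if baseline_mm <= 0:
        raise ValueError("baseline diameter must be positive")
    return (peak_mm - baseline_mm) / baseline_mm * 100.0

# e.g. a 4.0 mm artery dilating to 4.3 mm gives 7.5% FMD
```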
Medicine, Issue 92, endothelial function, endothelial dysfunction, brachial artery, peripheral artery disease, ultrasound, vascular, endothelium, cardiovascular disease.
Procedure for the Development of Multi-depth Circular Cross-sectional Endothelialized Microchannels-on-a-chip
Authors: Xiang Li, Samantha Marie Mearns, Manuela Martins-Green, Yuxin Liu.
Institutions: West Virginia University, University of California at Riverside.
Efforts have been focused on developing in vitro assays for the study of microvessels because in vivo animal studies are time-consuming and expensive, and observation and quantification are very challenging. However, conventional in vitro microvessel assays have limitations in representing in vivo microvessels with respect to three-dimensional (3D) geometry and providing continuous fluid flow. Using a combination of a photolithographic reflowable photoresist technique, soft lithography, and microfluidics, we have developed multi-depth, circular cross-sectional, endothelialized microchannels-on-a-chip, which mimic the 3D geometry of in vivo microvessels and run under controlled continuous perfusion flow. A positive reflowable photoresist was used to fabricate a master mold with a semicircular cross-sectional microchannel network. By aligning and bonding the two polydimethylsiloxane (PDMS) microchannels replicated from the master mold, a cylindrical microchannel network was created. The diameters of the microchannels can be well controlled. In addition, primary human umbilical vein endothelial cells (HUVECs) seeded inside the chip lined the inner surface of the microchannels under controlled perfusion lasting from 4 days to 2 weeks.
Bioengineering, Issue 80, Bioengineering, Tissue Engineering, Miniaturization, Microtechnology, Microfluidics, Reflow photoresist, PDMS, Perfusion flow, Primary endothelial cells
A Novel Application of Musculoskeletal Ultrasound Imaging
Authors: Avinash Eranki, Nelson Cortes, Zrinka Gregurić Ferenček, Siddhartha Sikdar.
Institutions: George Mason University, George Mason University, George Mason University, George Mason University.
Ultrasound is an attractive modality for imaging muscle and tendon motion during dynamic tasks and can provide a complementary methodological approach for biomechanical studies in a clinical or laboratory setting. Towards this goal, methods for quantification of muscle kinematics from ultrasound imagery are being developed based on image processing. The temporal resolution of these methods is typically not sufficient for highly dynamic tasks, such as drop-landing. We propose a new approach that utilizes a Doppler method for quantifying muscle kinematics. We have developed a novel vector tissue Doppler imaging (vTDI) technique that can be used to measure musculoskeletal contraction velocity, strain and strain rate with sub-millisecond temporal resolution during dynamic activities using ultrasound. The goal of this preliminary study was to investigate the repeatability and potential applicability of the vTDI technique in measuring musculoskeletal velocities during a drop-landing task, in healthy subjects. The vTDI measurements can be performed concurrently with other biomechanical techniques, such as 3D motion capture for joint kinematics and kinetics, electromyography for timing of muscle activation and force plates for ground reaction force. Integration of these complementary techniques could lead to a better understanding of dynamic muscle function and dysfunction underlying the pathogenesis and pathophysiology of musculoskeletal disorders.
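The article does not give the vTDI equations, but the generic tissue-Doppler definitions can be sketched: strain rate is the velocity gradient along the muscle segment, and strain is its time integral. A minimal illustration under those standard definitions:

```python
def strain_rate(v_prox, v_dist, segment_len):
    """Instantaneous strain rate (1/s) from tissue velocities (m/s)
    measured at two points of a segment of length segment_len (m)."""
    return (v_dist - v_prox) / segment_len

def strain(rates, dt):
    """Cumulative strain by integrating strain rate over time steps
    of dt seconds (rectangular integration for simplicity)."""
    total, out = 0.0, []
    for r in rates:
        total += r * dt
        out.append(total)
    return out
```

The sub-millisecond temporal resolution claimed for vTDI corresponds to a very small dt in this integration, which is what lets the method follow impulsive events like drop-landing.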
Medicine, Issue 79, Anatomy, Physiology, Joint Diseases, Diagnostic Imaging, Muscle Contraction, ultrasonic applications, Doppler effect (acoustics), Musculoskeletal System, biomechanics, musculoskeletal kinematics, dynamic function, ultrasound imaging, vector Doppler, strain, strain rate
Microfluidic Fabrication of Polymeric and Biohybrid Fibers with Predesigned Size and Shape
Authors: Darryl A. Boyd, Andre A. Adams, Michael A. Daniele, Frances S. Ligler.
Institutions: US Naval Research Laboratory, North Carolina State University and University of North Carolina at Chapel Hill.
A “sheath” fluid passing through a microfluidic channel at low Reynolds number can be directed around another “core” stream and used to dictate the shape as well as the diameter of the core stream. Grooves in the top and bottom of a microfluidic channel were designed to direct the sheath fluid and shape the core fluid. By matching the viscosity and hydrophilicity of the sheath and core fluids, interfacial effects are minimized and complex fluid shapes can be formed. Controlling the relative flow rates of the sheath and core fluids determines the cross-sectional area of the core fluid. Fibers have been produced with sizes ranging from 300 nm to ~1 mm, and fiber cross-sections can be round, flat, square, or complex, as in the case of double anchor fibers. Polymerization of the core fluid downstream from the shaping region solidifies the fibers. Photoinitiated click chemistries are well suited for rapid polymerization of the core fluid by irradiation with ultraviolet light. Fibers with a wide variety of shapes have been produced from polymers including liquid crystals, poly(methyl methacrylate), thiol-ene and thiol-yne resins, polyethylene glycol, and hydrogel derivatives. Minimal shear during the shaping process and mild polymerization conditions also make the fabrication process well suited for encapsulation of cells and other biological components.
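Since the flow-rate ratio sets the core's cross-sectional area, the resulting fiber diameter can be roughed out with a back-of-the-envelope estimate. A sketch assuming matched viscosities and a uniform velocity profile (a simplification; the real profile is not flat, so this is an order-of-magnitude guide only):

```python
import math

def core_diameter(q_core, q_sheath, channel_area):
    """Estimate the diameter (m) of a round core stream from the
    core/sheath flow-rate ratio, assuming the core occupies a
    cross-sectional area proportional to its share of the total flow."""
    core_area = channel_area * q_core / (q_core + q_sheath)
    return 2.0 * math.sqrt(core_area / math.pi)

# e.g. a 1:3 core:sheath ratio in a 100 µm diameter round channel
# gives a core of roughly half the channel diameter.
```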
Bioengineering, Issue 83, hydrodynamic focusing, polymer fiber, biohybrid, microfabrication, sheath flow, click chemistry
Tissue Triage and Freezing for Models of Skeletal Muscle Disease
Authors: Hui Meng, Paul M.L. Janssen, Robert W. Grange, Lin Yang, Alan H. Beggs, Lindsay C. Swanson, Stacy A. Cossette, Alison Frase, Martin K. Childers, Henk Granzier, Emanuela Gussoni, Michael W. Lawlor.
Institutions: Medical College of Wisconsin, The Ohio State University, Virginia Tech, University of Kentucky, Boston Children's Hospital, Harvard Medical School, Cure Congenital Muscular Dystrophy, Joshua Frase Foundation, University of Washington, University of Arizona.
Skeletal muscle is a unique tissue because of its structure and function, which requires specific protocols for tissue collection to obtain optimal results from functional, cellular, molecular, and pathological evaluations. Due to the subtlety of some pathological abnormalities seen in congenital muscle disorders and the potential for fixation to interfere with the recognition of these features, pathological evaluation of frozen muscle is preferable to fixed muscle when evaluating skeletal muscle for congenital muscle disease. Additionally, the potential to produce severe freezing artifacts in muscle requires specific precautions when freezing skeletal muscle for histological examination that are not commonly used when freezing other tissues. This manuscript describes a protocol for rapid freezing of skeletal muscle using isopentane (2-methylbutane) cooled with liquid nitrogen to preserve optimal skeletal muscle morphology. This procedure is also effective for freezing tissue intended for genetic or protein expression studies. Furthermore, we have integrated our freezing protocol into a broader procedure that also describes preferred methods for the short term triage of tissue for (1) single fiber functional studies and (2) myoblast cell culture, with a focus on the minimum effort necessary to collect tissue and transport it to specialized research or reference labs to complete these studies. Overall, this manuscript provides an outline of how fresh tissue can be effectively distributed for a variety of phenotypic studies and thereby provides standard operating procedures (SOPs) for pathological studies related to congenital muscle disease.
Basic Protocol, Issue 89, Tissue, Freezing, Muscle, Isopentane, Pathology, Functional Testing, Cell Culture
Measuring Oral Fatty Acid Thresholds, Fat Perception, Fatty Food Liking, and Papillae Density in Humans
Authors: Rivkeh Y. Haryono, Madeline A. Sprajcer, Russell S. J. Keast.
Institutions: Deakin University.
Emerging evidence from a number of laboratories indicates that humans can identify fatty acids in the oral cavity, presumably via fatty acid receptors housed on taste cells. Previous research has shown that an individual's oral sensitivity to fatty acid, specifically oleic acid (C18:1), is associated with body mass index (BMI), dietary fat consumption, and the ability to identify fat in foods. We have developed a reliable and reproducible method to assess oral chemoreception of fatty acids, using a milk and C18:1 emulsion together with an ascending forced-choice triangle procedure. In parallel, a food matrix has been developed to assess an individual's ability to perceive fat, in addition to a simple method to assess fatty food liking. As an added measure, tongue photography is used to assess papillae density, with higher density often being associated with increased taste sensitivity.
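A common stopping rule for an ascending forced-choice series is to take the first concentration identified correctly several times in a row. A simplified sketch of that rule (the exact criterion used in this protocol may differ):

```python
def detection_threshold(concentrations, correct, runs_required=3):
    """Scan an ascending series of triangle-test presentations and
    return the first concentration at which the participant was
    correct on `runs_required` consecutive presentations; None if
    the criterion is never met."""
    streak = 0
    for conc, ok in zip(concentrations, correct):
        streak = streak + 1 if ok else 0
        if streak >= runs_required:
            return conc
    return None
```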
Neuroscience, Issue 88, taste, overweight and obesity, dietary fat, fatty acid, diet, fatty food liking, detection threshold
A Low Mortality Rat Model to Assess Delayed Cerebral Vasospasm After Experimental Subarachnoid Hemorrhage
Authors: Rahul V. Dudhani, Michele Kyle, Christina Dedeo, Margaret Riordan, Eric M. Deshaies.
Institutions: SUNY Upstate Medical University, SUNY Upstate Medical University.
Objective: To characterize and establish a reproducible model that demonstrates delayed cerebral vasospasm after aneurysmal subarachnoid hemorrhage (SAH) in rats, in order to identify the initiating events, pathophysiological changes, and potential targets for treatment. Methods: Twenty-eight male Sprague-Dawley rats (250 - 300 g) were arbitrarily assigned to one of two groups - SAH or saline control. Subarachnoid hemorrhage in the SAH group (n = 15) was induced by double injection of autologous blood, 48 hr apart, into the cisterna magna. Similarly, normal saline (n = 13) was injected into the cisterna magna of the saline control group. Rats were sacrificed on day five after the second injection and the brains were preserved for histological analysis. The degree of vasospasm was measured on sections of the basilar artery by measuring the internal luminal cross-sectional area using NIH ImageJ software. Significance was tested using the Tukey-Kramer test. Results: On histological sections, basilar artery luminal cross-sectional areas were smaller in the SAH group than in the saline group, consistent with cerebral vasospasm in the former. In the SAH group, the basilar artery internal area (0.056 μm ± 3) was significantly smaller from vasospasm five days after the second blood injection (seven days after the initial injection) than in the saline control group (0.069 ± 3; p = 0.004). There were no mortalities from cerebral vasospasm. Conclusion: The rat double SAH model induces a mild, survivable basilar artery vasospasm that can be used to study the pathophysiological mechanisms of cerebral vasospasm in a small animal model. A low and acceptable mortality rate is a significant criterion for an ideal SAH animal model so that the mechanisms of vasospasm can be elucidated [7, 8]. 
Further modifications of the model can be made to adjust for increased severity of vasospasm and neurological exams.
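Using the group means reported above (units as printed in the abstract), the degree of narrowing reduces to a simple percent reduction in luminal area:

```python
def percent_vasospasm(control_area, sah_area):
    """Percent reduction in basilar artery luminal cross-sectional
    area in the SAH group relative to the saline control mean."""
    return (control_area - sah_area) / control_area * 100.0

# With the reported means of 0.069 (control) and 0.056 (SAH),
# the luminal area is reduced by roughly 19%.
```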
Medicine, Issue 71, Anatomy, Physiology, Neurobiology, Neuroscience, Immunology, Surgery, Aneurysm, cerebral, hemorrhage, model, mortality, rat, rodent, subarachnoid, vasospasm, animal model
Modeling Neural Immune Signaling of Episodic and Chronic Migraine Using Spreading Depression In Vitro
Authors: Aya D. Pusic, Yelena Y. Grinberg, Heidi M. Mitchell, Richard P. Kraig.
Institutions: The University of Chicago Medical Center, The University of Chicago Medical Center.
Migraine and its transformation to chronic migraine are healthcare burdens in need of improved treatment options. We seek to define how neural immune signaling modulates the susceptibility to migraine, modeled in vitro using spreading depression (SD), as a means to develop novel therapeutic targets for episodic and chronic migraine. SD is the likely cause of migraine aura and migraine pain. It is a paroxysmal loss of neuronal function triggered by initially increased neuronal activity, which slowly propagates within susceptible brain regions. Normal brain function is exquisitely sensitive to, and relies on, coincident low-level immune signaling. Thus, neural immune signaling likely affects electrical activity of SD, and therefore migraine. Pain perception studies of SD in whole animals are fraught with difficulties, but whole animals are well suited to examine systems biology aspects of migraine since SD activates trigeminal nociceptive pathways. However, whole animal studies alone cannot be used to decipher the cellular and neural circuit mechanisms of SD. Instead, in vitro preparations where environmental conditions can be controlled are necessary. Here, it is important to recognize limitations of acute slices and distinct advantages of hippocampal slice cultures. Acute brain slices cannot reveal subtle changes in immune signaling since preparing the slices alone triggers: pro-inflammatory changes that last days, epileptiform behavior due to high levels of oxygen tension needed to vitalize the slices, and irreversible cell injury at anoxic slice centers. In contrast, we examine immune signaling in mature hippocampal slice cultures since the cultures closely parallel their in vivo counterpart with mature trisynaptic function; show quiescent astrocytes, microglia, and cytokine levels; and SD is easily induced in an unanesthetized preparation. 
Furthermore, the slices are long-lived and SD can be induced on consecutive days without injury, making this preparation the only means to date capable of modeling the neuroimmune consequences of chronic SD, and thus perhaps chronic migraine. We use electrophysiological techniques and non-invasive imaging to measure neuronal cell and circuit functions coincident with SD. Neural immune gene expression variables are measured with qPCR screening, qPCR arrays, and, importantly, cDNA preamplification for detection of ultra-low-level targets such as interferon-gamma, using whole, regional, or specific cell-enhanced (via laser dissection microscopy) sampling. Cytokine cascade signaling is further assessed with multiplexed phosphoprotein-related targets, with gene expression and phosphoprotein changes confirmed via cell-specific immunostaining. Pharmacological and siRNA strategies are used to mimic and modulate SD immune signaling.
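Relative expression from qPCR data of this kind is conventionally summarized by the 2^-ΔΔCt method; the article does not state its exact analysis, so the following is a generic sketch that assumes ~100% amplification efficiency:

```python
def fold_change(ct_target_treat, ct_ref_treat, ct_target_ctrl, ct_ref_ctrl):
    """Relative expression by the 2^-ΔΔCt method: normalize the
    target Ct to a reference gene within each condition, then
    compare treated vs. control."""
    ddct = (ct_target_treat - ct_ref_treat) - (ct_target_ctrl - ct_ref_ctrl)
    return 2.0 ** (-ddct)

# A target whose normalized Ct drops by 2 cycles after treatment
# corresponds to a 4-fold increase in expression.
```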
Neuroscience, Issue 52, innate immunity, hormesis, microglia, T-cells, hippocampus, slice culture, gene expression, laser dissection microscopy, real-time qPCR, interferon-gamma
Bronchoalveolar Lavage (BAL) for Research; Obtaining Adequate Sample Yield
Authors: Andrea M. Collins, Jamie Rylance, Daniel G. Wootton, Angela D. Wright, Adam K. A. Wright, Duncan G. Fullerton, Stephen B. Gordon.
Institutions: National Institute for Health Research, Royal Liverpool and Broadgreen University Hospital Trust, Liverpool School of Tropical Medicine, University of Liverpool, Royal Liverpool and Broadgreen University Hospital Trust, University Hospital Aintree.
We describe a research technique for fiberoptic bronchoscopy with bronchoalveolar lavage (BAL) using manual hand held suction in order to remove nonadherent cells and lung lining fluid from the mucosal surface. In research environments, BAL allows sampling of innate (lung macrophage), cellular (B- and T- cells), and humoral (immunoglobulin) responses within the lung. BAL is internationally accepted for research purposes and since 1999 the technique has been performed in > 1,000 subjects in the UK and Malawi by our group. Our technique uses gentle hand-held suction of instilled fluid; this is designed to maximize BAL volume returned and apply minimum shear force on ciliated epithelia in order to preserve the structure and function of cells within the BAL fluid and to preserve viability to facilitate the growth of cells in ex vivo culture. The research technique therefore uses a larger volume instillate (typically in the order of 200 ml) and employs manual suction to reduce cell damage. Patients are given local anesthetic, offered conscious sedation (midazolam), and tolerate the procedure well with minimal side effects. Verbal and written subject information improves tolerance and written informed consent is mandatory. Safety of the subject is paramount. Subjects are carefully selected using clear inclusion and exclusion criteria. This protocol includes a description of the potential risks, and the steps taken to mitigate them, a list of contraindications, pre- and post-procedure checks, as well as precise bronchoscopy and laboratory techniques.
Medicine, Issue 85, Research bronchoscopy, bronchoalveolar lavage (BAL), fiberoptic bronchoscopy, lymphocyte, macrophage
Irrelevant Stimuli and Action Control: Analyzing the Influence of Ignored Stimuli via the Distractor-Response Binding Paradigm
Authors: Birte Moeller, Hartmut Schächinger, Christian Frings.
Institutions: Trier University, Trier University.
Selection tasks in which simple stimuli (e.g. letters) are presented and a target stimulus has to be selected against one or more distractor stimuli are frequently used in research on human action control. One important question in these settings is how distractor stimuli, competing with the target stimulus for a response, influence actions. The distractor-response binding paradigm can be used to investigate this influence. It is particularly useful for separately analyzing response retrieval and distractor inhibition effects. Computer-based experiments are used to collect the data (reaction times and error rates). In a number of sequentially presented pairs of stimulus arrays (prime-probe design), participants respond to targets while ignoring distractor stimuli. Importantly, the factors response relation between the arrays of each pair (repetition vs. change) and distractor relation (repetition vs. change) are varied orthogonally. Repetition of the same distractor then has a different effect depending on the response relation (repetition vs. change) between arrays. This result pattern can be explained by response retrieval due to distractor repetition. In addition, distractor inhibition effects are indicated by a general advantage due to distractor repetition. The described paradigm has proven useful for determining relevant parameters of response retrieval effects on human action.
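The orthogonal 2 × 2 design lends itself to a simple interaction contrast: retrieval predicts that distractor repetition speeds responses when the response repeats and slows them when it changes. A sketch with hypothetical mean reaction times in ms (illustrative values, not data from the article):

```python
def binding_effect(rt):
    """Distractor-response binding score from the four cell means of
    the 2x2 design. `rt` maps (response_relation, distractor_relation)
    tuples, each 'rep' or 'chg', to mean RT. The interaction contrast
    quantifies response retrieval by repeated distractors."""
    benefit_resp_rep = rt[('rep', 'chg')] - rt[('rep', 'rep')]
    cost_resp_chg = rt[('chg', 'chg')] - rt[('chg', 'rep')]
    return benefit_resp_rep - cost_resp_chg
```

A positive score indicates that distractor repetition helped under response repetition and/or hurt under response change, the signature pattern of distractor-based response retrieval; the main effect of distractor repetition, averaged over both response conditions, would index inhibition instead.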
Behavior, Issue 87, stimulus-response binding, distractor-response binding, response retrieval, distractor inhibition, event file, action control, selection task
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
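The quasi-real-time analysis described above works over time-stamped behavioral event records. As a minimal illustration (not the authors' MATLAB code; the event names and record layout are invented here), such a record can be summarized by counting events and measuring session duration:

```python
# Illustrative sketch of summarizing a time-stamped behavioral event
# record, in the spirit of the system described above. Event names and
# the (timestamp, event) tuple layout are assumptions for this example.
from collections import Counter

def summarize_events(events):
    """events: list of (timestamp_sec, event_name) tuples, time-sorted."""
    counts = Counter(name for _, name in events)
    duration = events[-1][0] - events[0][0] if len(events) > 1 else 0.0
    return {"counts": dict(counts), "duration_sec": duration}

record = [(0.0, "trial_start"), (1.2, "left_poke"), (3.5, "feed"),
          (7.9, "right_poke"), (10.0, "trial_end")]
summary = summarize_events(record)
```

Keeping the raw event record intact, as here, preserves the full data trail from raw events through derived statistics.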
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Play Button
Viability Assays for Cells in Culture
Authors: Jessica M. Posimo, Ajay S. Unnithan, Amanda M. Gleixner, Hailey J. Choi, Yiran Jiang, Sree H. Pulugulla, Rehana K. Leak.
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16 bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
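The linearity claim above — that assay signal scales with the number of cells plated — can be checked with an ordinary least-squares fit. The sketch below uses fabricated numbers purely for illustration; the variable names and data are not from the study:

```python
# Illustrative linearity check: fit assay signal vs. cells plated with
# ordinary least squares and report the coefficient of determination.
# All data values here are invented for demonstration.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1 - ss_res / ss_tot
    return slope, intercept, r2

cells = [5000, 10000, 20000, 40000]    # cells plated per well (assumed)
signal = [102.0, 198.0, 405.0, 810.0]  # assay signal, arbitrary units
slope, intercept, r2 = fit_line(cells, signal)
```

An r² close to 1 supports treating signal strength as a proxy for cell number, subject to the caveats about cell size and ATP content noted above.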
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
Play Button
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
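The software-guided setup of experiment combinations can be pictured with a toy full-factorial design. The factors and levels below are invented for illustration; the authors used dedicated DoE software rather than an exhaustive factorial like this:

```python
# Minimal sketch of enumerating a full-factorial screening design.
# Factor names and levels are assumptions for this example only.
from itertools import product

factors = {
    "promoter": ["35S", "double-35S"],
    "incubation_temp_C": [22, 25],
    "leaf_age": ["young", "old"],
}
names = list(factors)
design = [dict(zip(names, combo)) for combo in product(*factors.values())]
# Full factorial: 2 x 2 x 2 = 8 runs covering every level combination.
```

DoE methods exist precisely to avoid running all such combinations; fractional designs and design augmentation, as used above, sample this space efficiently.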
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Play Button
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Play Button
Using Continuous Data Tracking Technology to Study Exercise Adherence in Pulmonary Rehabilitation
Authors: Amanda K. Rizk, Rima Wardini, Emilie Chan-Thim, Barbara Trutschnigg, Amélie Forget, Véronique Pepin.
Institutions: Concordia University, Hôpital du Sacré-Coeur de Montréal.
Pulmonary rehabilitation (PR) is an important component in the management of respiratory diseases. The effectiveness of PR is dependent upon adherence to exercise training recommendations. The study of exercise adherence is thus a key step towards the optimization of PR programs. To date, mostly indirect measures, such as rates of participation, completion, and attendance, have been used to determine adherence to PR. The purpose of the present protocol is to describe how continuous data tracking technology can be used to measure adherence to a prescribed aerobic training intensity on a second-by-second basis. In our investigations, adherence has been defined as the percent time spent within a specified target heart rate range. As such, using a combination of hardware and software, heart rate is measured, tracked, and recorded during cycling second-by-second for each participant, for each exercise session. Using statistical software, the data is subsequently extracted and analyzed. The same protocol can be applied to determine adherence to other measures of exercise intensity, such as time spent at a specified wattage, level, or speed on the cycle ergometer. Furthermore, the hardware and software is also available to measure adherence to other modes of training, such as the treadmill, elliptical, stepper, and arm ergometer. The present protocol, therefore, has a vast applicability to directly measure adherence to aerobic exercise.
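The adherence metric defined above — percent time spent within a target heart rate range, computed from second-by-second samples — reduces to a simple calculation. The heart-rate values and target range below are fabricated for illustration:

```python
# Sketch of the adherence metric described above: percent of exercise
# time within a target heart-rate range, from 1 Hz samples.
# Sample data and the 110-120 bpm target are illustrative assumptions.
def percent_in_range(hr_samples, low, high):
    """hr_samples: one heart-rate reading (bpm) per second."""
    in_range = sum(1 for hr in hr_samples if low <= hr <= high)
    return 100.0 * in_range / len(hr_samples)

session = [98, 102, 110, 115, 118, 121, 119, 117, 125, 116]  # bpm, 1 Hz
adherence = percent_in_range(session, 110, 120)
```

The same function applies unchanged to other intensity measures (wattage, speed) by substituting the sample stream and target bounds.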
Medicine, Issue 81, Data tracking, exercise, rehabilitation, adherence, patient compliance, health behavior, user-computer interface.
Play Button
Flexible Colonoscopy in Mice to Evaluate the Severity of Colitis and Colorectal Tumors Using a Validated Endoscopic Scoring System
Authors: Tomohiro Kodani, Alex Rodriguez-Palacios, Daniele Corridoni, Loris Lopetuso, Luca Di Martino, Brian Marks, James Pizarro, Theresa Pizarro, Amitabh Chak, Fabio Cominelli.
Institutions: Case Western Reserve University School of Medicine, Cleveland.
The use of modern endoscopy for research purposes has greatly facilitated our understanding of gastrointestinal pathologies. In particular, experimental endoscopy has been highly useful for studies that require repeated assessments in a single laboratory animal, such as those evaluating mechanisms of chronic inflammatory bowel disease and the progression of colorectal cancer. However, the methods used across studies are highly variable. At least three endoscopic scoring systems have been published for murine colitis and published protocols for the assessment of colorectal tumors fail to address the presence of concomitant colonic inflammation. This study develops and validates a reproducible endoscopic scoring system that integrates evaluation of both inflammation and tumors simultaneously. This novel scoring system has three major components: 1) assessment of the extent and severity of colorectal inflammation (based on perianal findings, transparency of the wall, mucosal bleeding, and focal lesions), 2) quantitative recording of tumor lesions (grid map and bar graph), and 3) numerical sorting of clinical cases by their pathological and research relevance based on decimal units with assigned categories of observed lesions and endoscopic complications (decimal identifiers). The video and manuscript presented herein were prepared, following IACUC-approved protocols, to allow investigators to score their own experimental mice using a well-validated and highly reproducible endoscopic methodology, with the system option to differentiate distal from proximal endoscopic colitis (D-PECS).
Medicine, Issue 80, Crohn's disease, ulcerative colitis, colon cancer, Clostridium difficile, SAMP mice, DSS/AOM-colitis, decimal scoring identifier
Play Button
RNA Isolation from Embryonic Zebrafish and cDNA Synthesis for Gene Expression Analysis
Authors: Samuel M. Peterson, Jennifer L. Freeman.
Institutions: Purdue University.
Many important and complex laboratory procedures require an input of high-quality, intact RNA. A degraded sample or the presence of impurities can lead to disastrous results in downstream experimental applications. It is therefore of utmost importance to use solid techniques with numerous safeguards and quality control checks to ensure a superior sample. Herein, we detail a protocol to isolate total RNA from whole zebrafish embryos using a commercially available chemical denaturant and subsequent cleanup to remove traces of DNA and impurities using a commercial RNA isolation kit. As RNA is relatively unstable and easily prone to cleavage by RNases, most protocols assay gene expression using a cDNA product that is directly synthesized from an RNA template. We detail a procedure to convert RNA into the more stable cDNA product using a commercially available kit. Throughout these procedures there are numerous quality control checks to ensure that the sample is not degraded or contaminated. The end product of these protocols is cDNA that is suitable for microarray analysis, RT-PCR, or long-term storage.
Developmental Biology, Issue 30, zebrafish, RNA, cDNA, expression, microarray, gene
Play Button
Automated Sholl Analysis of Digitized Neuronal Morphology at Multiple Scales
Authors: Melinda K. Kutzing, Christopher G. Langhammer, Vincent Luo, Hersh Lakdawala, Bonnie L. Firestein.
Institutions: Rutgers University.
Neuronal morphology plays a significant role in determining how neurons function and communicate1-3. Specifically, it affects the ability of neurons to receive inputs from other cells2 and contributes to the propagation of action potentials4,5. The morphology of the neurites also affects how information is processed. The diversity of dendrite morphologies facilitates local and long-range signaling and allows individual neurons or groups of neurons to carry out specialized functions within the neuronal network6,7. Alterations in dendrite morphology, including fragmentation of dendrites and changes in branching patterns, have been observed in a number of disease states, including Alzheimer's disease8, schizophrenia9,10, and mental retardation11. The ability both to understand the factors that shape dendrite morphologies and to identify changes in dendrite morphologies is essential to the understanding of nervous system function and dysfunction. Neurite morphology is often analyzed by Sholl analysis and by counting the number of neurites and the number of branch tips. This analysis is generally applied to dendrites, but it can also be applied to axons. Performing this analysis by hand is time-consuming and inevitably introduces variability due to experimenter bias and inconsistency. The Bonfire program is a semi-automated approach to the analysis of dendrite and axon morphology that builds upon available open-source morphological analysis tools. Our program enables the detection of local changes in dendrite and axon branching behaviors by performing Sholl analysis on subregions of the neuritic arbor. For example, Sholl analysis is performed on both the neuron as a whole as well as on each subset of processes (primary, secondary, terminal, root, etc.). Dendrite and axon patterning is influenced by a number of intracellular and extracellular factors, many acting locally.
Thus, the resulting arbor morphology is a result of specific processes acting on specific neurites, making it necessary to perform morphological analysis on a smaller scale in order to observe these local variations12. The Bonfire program requires the use of two open-source analysis tools, the NeuronJ plugin to ImageJ and NeuronStudio. Neurons are traced in ImageJ, and NeuronStudio is used to define the connectivity between neurites. Bonfire contains a number of custom scripts written in MATLAB (MathWorks) that are used to convert the data into the appropriate format for further analysis, check for user errors, and ultimately perform Sholl analysis. Finally, data are exported into Excel for statistical analysis. A flow chart of the Bonfire program is shown in Figure 1.
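At its core, Sholl analysis counts how many traced neurite segments cross concentric circles centered on the soma. The bare-bones sketch below illustrates that idea only; it is not the Bonfire code, and the segment coordinates are invented (real tracings come from NeuronJ/NeuronStudio):

```python
# Minimal illustration of Sholl analysis: count crossings of traced
# neurite segments with concentric circles centered on the soma.
# Segment endpoints here are invented 2D coordinates for demonstration.
from math import hypot

def sholl_counts(segments, center, radii):
    cx, cy = center
    counts = []
    for r in radii:
        n = 0
        for (x1, y1), (x2, y2) in segments:
            d1 = hypot(x1 - cx, y1 - cy)
            d2 = hypot(x2 - cx, y2 - cy)
            # A segment crosses the circle if its endpoints straddle r.
            if min(d1, d2) < r <= max(d1, d2):
                n += 1
        counts.append(n)
    return counts

segments = [((0, 0), (10, 0)), ((10, 0), (20, 5)), ((0, 0), (0, 15))]
counts = sholl_counts(segments, (0, 0), [5, 12, 25])
```

Running the same count on subsets of processes (primary, secondary, terminal) rather than the whole arbor is what enables the local analysis described above.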
Neuroscience, Issue 45, Sholl Analysis, Neurite, Morphology, Computer-assisted, Tracing
Play Button
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise, techniques1,5,6,7,8,9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice. 
A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
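The covariance-based reasoning behind multivariate techniques can be illustrated with a toy example: extracting the first principal component of a small dataset by power iteration on its covariance matrix. This is a generic sketch, not the specific methods covered in the article, and the data are invented:

```python
# Toy illustration of covariance-based multivariate analysis: first
# principal component via power iteration (pure Python, invented data).
def first_pc(data, iters=200):
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    centered = [[row[j] - means[j] for j in range(p)] for row in data]
    cov = [[sum(r[i] * r[j] for r in centered) / (n - 1) for j in range(p)]
           for i in range(p)]
    v = [1.0] * p
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(p)) for i in range(p)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Two strongly correlated "voxels": the leading component should load
# nearly equally on both, capturing their shared covariance.
data = [[1.0, 1.1], [2.0, 1.9], [3.0, 3.2], [4.0, 3.9]]
pc = first_pc(data)
```

The point of the example: the result is a pattern of weights across variables (a network signature), not a voxel-by-voxel statistic.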
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
Play Button
Characterizing Herbivore Resistance Mechanisms: Spittlebugs on Brachiaria spp. as an Example
Authors: Soroush Parsa, Guillermo Sotelo, Cesar Cardona.
Institutions: CIAT.
Plants can resist herbivore damage through three broad mechanisms: antixenosis, antibiosis and tolerance1. Antixenosis is the degree to which the plant is avoided when the herbivore is able to select other plants2. Antibiosis is the degree to which the plant affects the fitness of the herbivore feeding on it1. Tolerance is the degree to which the plant can withstand or repair damage caused by the herbivore, without compromising the herbivore's growth and reproduction1. The durability of herbivore resistance in an agricultural setting depends to a great extent on the resistance mechanism favored during crop breeding efforts3. We demonstrate a no-choice experiment designed to estimate the relative contributions of antibiosis and tolerance to spittlebug resistance in Brachiaria spp. Several species of African grasses of the genus Brachiaria are valuable forage and pasture plants in the Neotropics, but they can be severely challenged by several native species of spittlebugs (Hemiptera: Cercopidae)4. To assess their resistance to spittlebugs, plants are vegetatively propagated by stem cuttings and allowed to grow for approximately one month, allowing the growth of superficial roots on which spittlebugs can feed. At that point, each test plant is individually challenged with six spittlebug eggs near hatching. Infestations are allowed to progress for one month before evaluating plant damage and insect survival. Scoring plant damage provides an estimate of tolerance, while scoring insect survival provides an estimate of antibiosis. This protocol has facilitated our plant breeding objective to enhance spittlebug resistance in commercial brachiariagrasses5.
Plant Biology, Issue 52, host plant resistance, antibiosis, antixenosis, tolerance, Brachiaria, spittlebugs
Play Button
Improving IV Insulin Administration in a Community Hospital
Authors: Michael C. Magee.
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes.1-4 The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5 It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance. The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6 Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia. Despite multiple revisions of the paper-based IV insulin protocol, analysis of its usage data at WMC showed that results were suboptimal in terms of achieving normoglycemia while minimizing hypoglycemia. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from the paper IV insulin protocol to a computerized glucose management system.
By comparing blood glucose levels under the paper protocol with those under the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was observed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computerized glucose management system was well under 1%.
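The outcome metrics reported above (percent of readings in target, prevalence of severe hypoglycemia, clinical hypoglycemia, and hyperglycemia) follow directly from classifying each reading against the stated thresholds. The readings below are fabricated for illustration:

```python
# Sketch of the reported outcome metrics: classify blood-glucose
# readings against the thresholds given in the abstract (40, 70,
# 180 mg/dL). The sample readings are invented for this example.
def glucose_metrics(readings, target=(70, 180)):
    n = len(readings)
    low, high = target
    return {
        "pct_in_target": 100.0 * sum(low <= g <= high for g in readings) / n,
        "pct_severe_hypo": 100.0 * sum(g < 40 for g in readings) / n,
        "pct_hypo": 100.0 * sum(g < 70 for g in readings) / n,
        "pct_hyper": 100.0 * sum(g > 180 for g in readings) / n,
    }

readings = [95, 110, 142, 65, 188, 130, 150, 99, 175, 120]  # mg/dL
metrics = glucose_metrics(readings)
```

Computed over all readings in a period, these are the quantities compared between the paper and computerized protocols.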
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
Play Button
Using Visual and Narrative Methods to Achieve Fair Process in Clinical Care
Authors: Laura S. Lorenz, Jon A. Chilingerian.
Institutions: Brandeis University.
The Institute of Medicine has targeted patient-centeredness as an important area of quality improvement. A major dimension of patient-centeredness is respect for patients' values, preferences, and expressed needs. Yet specific approaches to gaining this understanding and translating it to quality care in the clinical setting are lacking. From a patient perspective, quality is not a simple concept but is best understood in terms of five dimensions: technical outcomes; decision-making efficiency; amenities and convenience; information and emotional support; and overall patient satisfaction. Failure to consider quality from this five-pronged perspective results in a focus on medical outcomes, without considering the processes central to quality from the patient's perspective and vital to achieving good outcomes. In this paper, we argue for applying the concept of fair process in clinical settings. Fair process involves using a collaborative approach to exploring diagnostic issues and treatments with patients, explaining the rationale for decisions, setting expectations about roles and responsibilities, and implementing a core plan and ongoing evaluation. Fair process opens the door to bringing patient expertise into the clinical setting and the work of developing health care goals and strategies. This paper provides a step-by-step illustration of an innovative visual approach, called photovoice or photo-elicitation, to achieve fair process in clinical work with acquired brain injury survivors and others living with chronic health conditions. Applying this visual tool and methodology in the clinical setting will enhance patient-provider communication; engage patients as partners in identifying challenges, strengths, goals, and strategies; and support evaluation of progress over time.
Asking patients to bring visuals of their lives into the clinical interaction can help to illuminate gaps in clinical knowledge, forge better therapeutic relationships with patients living with chronic conditions such as brain injury, and identify patient-centered goals and possibilities for healing. The process illustrated here can be used by clinicians, (primary care physicians, rehabilitation therapists, neurologists, neuropsychologists, psychologists, and others) working with people living with chronic conditions such as acquired brain injury, mental illness, physical disabilities, HIV/AIDS, substance abuse, or post-traumatic stress, and by leaders of support groups for the types of patients described above and their family members or caregivers.
Medicine, Issue 48, person-centered care, participatory visual methods, photovoice, photo-elicitation, narrative medicine, acquired brain injury, disability, rehabilitation, palliative care
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.