Pubmed Article
The effect of chance variability in blood pressure readings on the decision making of general practitioners: an internet-based case vignette study.
Guidelines for the management of blood pressure (BP) in primary care generally suggest that decisions be made on the basis of specific threshold values (e.g. BP 140/90 mmHg), but this fails to adequately accommodate a common cause of variation: the play of chance.
Authors: Peter Novak.
Published: 07-19-2011
Disorders associated with dysfunction of the autonomic nervous system are quite common yet frequently unrecognized. Quantitative autonomic testing can be an invaluable tool for the evaluation of these disorders, both in the clinic and in research. There are a number of autonomic tests; however, only a few have been clinically validated or are quantitative. Here, a fully quantitative and clinically validated protocol for testing of autonomic functions is presented. As a bare minimum, the clinical autonomic laboratory should have a tilt table, an ECG monitor, a continuous noninvasive blood pressure monitor, a respiratory monitor, and a means of evaluating the sudomotor domain. The software used for recording and evaluating autonomic tests is critical for correct interpretation of the data. The presented protocol evaluates 3 major autonomic domains: cardiovagal, adrenergic, and sudomotor. The tests include deep breathing, the Valsalva maneuver, head-up tilt, and the quantitative sudomotor axon reflex test (QSART). The severity and distribution of dysautonomia are quantified using the Composite Autonomic Severity Score (CASS). A detailed protocol is provided highlighting essential aspects of testing, with emphasis on proper data acquisition, derivation of the relevant parameters, and unbiased evaluation of autonomic signals. Normative data and the CASS algorithm for interpreting results are provided as well.
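None of the abstract's analysis software is shown here, but the cardiovagal deep-breathing test it mentions is conventionally scored as the heart-rate range during paced breathing. Purely as a toy illustration (the RR intervals below are invented, not normative data), that reduction looks like this:

```python
# Hypothetical illustration of scoring the deep-breathing test: convert
# RR intervals (seconds) to instantaneous heart rate and report the
# heart-rate range (max - min, beats/min) across the breathing maneuver.

def heart_rate_range(rr_intervals):
    """Return (min_hr, max_hr, hr_range) in beats/min from RR intervals in s."""
    rates = [60.0 / rr for rr in rr_intervals]
    return min(rates), max(rates), max(rates) - min(rates)

# Toy data: RR intervals lengthen on expiration and shorten on inspiration.
rr = [0.80, 0.75, 0.70, 0.75, 0.85, 0.95, 1.00, 0.95, 0.85]
lo, hi, rng = heart_rate_range(rr)
print(f"HR {lo:.0f}-{hi:.0f} beats/min, range {rng:.0f}")
```

In a real laboratory this range would be averaged over several breaths and compared against age-stratified normative data before entering the CASS.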
25 Related JoVE Articles!
Surgical Management of Meatal Stenosis with Meatoplasty
Authors: Ming-Hsien Wang.
Institutions: Johns Hopkins School of Medicine.
Meatal stenosis is a common urologic complication after circumcision. Children present to their primary care physicians with complaints of a deviated urinary stream, difficulty aiming, painful urination, and urinary frequency. Clinical exam reveals a pinpoint meatus, and if the child is asked to urinate, he will usually have an upward, thin, occasionally forceful urinary stream with incomplete bladder emptying. The mainstay of management is meatoplasty (reconstruction of the distal urethra/meatus). This educational video demonstrates how this is performed.
Medicine, Issue 45, Urinary obstruction, pediatric urology, deviated urinary stream, meatal stenosis, operative repair, meatotomy, meatoplasty
Measuring Left Ventricular Pressure in Late Embryonic and Neonatal Mice
Authors: Victoria P. Le, Attila Kovacs, Jessica E. Wagenseil.
Institutions: Saint Louis University, Washington University School of Medicine.
Blood pressure increases significantly during embryonic and postnatal development in vertebrate animals. In the mouse, blood flow is first detectable around embryonic day (E) 8.5 [1]. Systolic left ventricular (LV) pressure is 2 mmHg at E9.5 and 11 mmHg at E14.5 [2]. At these mid-embryonic stages, the LV is clearly visible through the chest wall for invasive pressure measurements because the ribs and skin are not fully developed. Between E14.5 and birth (approximately E21), imaging methods must be used to view the LV. After birth, mean arterial pressure increases from 30 to 70 mmHg between postnatal day (P) 2 and P35 [3]. Beyond P20, arterial pressure can be measured with solid-state catheters (e.g. Millar or Scisense). Before P20, these catheters are too big for developing mouse arteries, and arterial pressure must be measured with custom-pulled plastic catheters attached to fluid-filled pressure transducers [3] or glass micropipettes attached to servo-null pressure transducers [4]. Our recent work has shown that the greatest increase in blood pressure occurs during the late embryonic to early postnatal period in mice [5-7]. This large increase in blood pressure may influence smooth muscle cell (SMC) phenotype in developing arteries and trigger important mechanotransduction events. In human diseases in which the mechanical properties of developing arteries are compromised by defects in extracellular matrix proteins (e.g. Marfan syndrome [8] and supravalvular aortic stenosis [9]), the rapid changes in blood pressure during this period may contribute to disease phenotype and severity through alterations in mechanotransduction signals. Therefore, it is important to be able to measure blood pressure changes during late embryonic and neonatal periods in mouse models of human disease. We describe a method for measuring LV pressure in late embryonic (E18) and early postnatal (P1 - 20) mice. A needle attached to a fluid-filled pressure transducer is inserted into the LV under ultrasound guidance.
Care is taken to maintain normal cardiac function during the experimental protocol, especially for the embryonic mice. Representative data are presented and limitations of the protocol are discussed.
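Fluid-filled transducer systems of the kind described are conventionally calibrated against known reference pressures before recording. As a minimal, hypothetical sketch (the raw readings and reference pressures below are invented, not values from this protocol), a two-point linear calibration maps raw sensor output to mmHg:

```python
# Illustrative two-point linear calibration for a fluid-filled pressure
# transducer: map raw sensor readings (e.g. volts) to mmHg using two
# known reference pressures. All numbers are hypothetical.

def make_calibration(raw_lo, p_lo, raw_hi, p_hi):
    """Return a function converting raw readings to pressure (mmHg)."""
    slope = (p_hi - p_lo) / (raw_hi - raw_lo)
    return lambda raw: p_lo + slope * (raw - raw_lo)

# Calibrate against 0 and 100 mmHg references, then convert a reading.
to_mmhg = make_calibration(raw_lo=0.10, p_lo=0.0, raw_hi=1.10, p_hi=100.0)
print(to_mmhg(0.35))
```

The same linear map is then applied to every sample in the recorded LV pressure trace.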
Bioengineering, Issue 60, systolic, diastolic, pulse, heart, artery, postnatal development
Assessment of Cardiac Function and Energetics in Isolated Mouse Hearts Using 31P NMR Spectroscopy
Authors: Stephen C. Kolwicz Jr., Rong Tian.
Institutions: University of Washington School of Medicine.
Bioengineered mouse models have become powerful research tools in determining causal relationships between molecular alterations and models of cardiovascular disease. Although molecular biology is necessary for identifying key changes in a signaling pathway, it is not a surrogate for functional significance. While physiology can provide answers to the question of function, combining physiology with biochemical assessment of metabolites in the intact, beating heart allows for a complete picture of cardiac function and energetics. For years, our laboratory has utilized isolated heart perfusions combined with nuclear magnetic resonance (NMR) spectroscopy to accomplish this task. Left ventricular function is assessed by Langendorff-mode isolated heart perfusions, while cardiac energetics is measured by performing 31P magnetic resonance spectroscopy of the perfused hearts. With these techniques, indices of cardiac function can be measured simultaneously with levels of phosphocreatine and ATP in beating hearts. Furthermore, these parameters can be monitored while physiologic or pathologic stressors are applied. For example, ischemia/reperfusion or high-workload challenge protocols can be adopted. Aortic banding or other models of cardiac pathology are suitable as well. Regardless of the variants within the protocol, the functional and energetic significance of molecular modifications in transgenic mouse models can be adequately described, leading to new insights into the associated enzymatic and metabolic pathways. Therefore, 31P NMR spectroscopy in the isolated perfused heart is a valuable research technique in animal models of cardiovascular disease.
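The energetic readout from such 31P spectra is commonly reduced to a phosphocreatine-to-ATP ratio computed from fitted peak areas. The sketch below is schematic only; the peak areas are invented, and in practice they would come from fitting the PCr and beta-ATP resonances of the acquired spectrum:

```python
# Schematic reduction of a 31P NMR spectrum to a PCr/ATP ratio.
# Peak areas are hypothetical placeholders for fitted resonance integrals.

def pcr_atp_ratio(pcr_area, beta_atp_area):
    """PCr/ATP ratio from the two fitted peak areas."""
    return pcr_area / beta_atp_area

baseline = pcr_atp_ratio(pcr_area=1.8, beta_atp_area=1.0)
high_workload = pcr_atp_ratio(pcr_area=1.4, beta_atp_area=1.0)

# A fall in PCr/ATP under stress indicates energetic compromise.
print(baseline, high_workload)
```

Tracking this ratio across a workload challenge is what lets function and energetics be reported side by side for the same beating heart.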
Medicine, Issue 42, cardiac physiology, high energy phosphate, phosphocreatine, ATP
Mechanical Testing of Mouse Carotid Arteries: from Newborn to Adult
Authors: Mazyar Amin, Victoria P. Le, Jessica E. Wagenseil.
Institutions: Saint Louis University.
The large conducting arteries in vertebrates are composed of a specialized extracellular matrix designed to provide pulse dampening and reduce the work performed by the heart. The mix of matrix proteins determines the passive mechanical properties of the arterial wall [1]. When the matrix proteins are altered in development, aging, disease or injury, the arterial wall remodels, changing the mechanical properties and leading to subsequent cardiac adaptation [2]. In normal development, the remodeling leads to a functional cardiac and cardiovascular system optimized for the needs of the adult organism. In disease, the remodeling often leads to a negative feedback cycle that can cause cardiac failure and death. By quantifying passive arterial mechanical properties in development and disease, we can begin to understand the normal remodeling process to recreate it in tissue engineering and the pathological remodeling process to test disease treatments. Mice are useful models for studying passive arterial mechanics in development and disease. They have a relatively short lifespan (mature adults by 3 months and aged adults by 2 years), so developmental [3] and aging studies [4] can be carried out over a limited time course. The advances in mouse genetics provide numerous genotypes and phenotypes to study changes in arterial mechanics with disease progression [5] and disease treatment [6]. Mice can also be manipulated experimentally to study the effects of changes in hemodynamic parameters on the arterial remodeling process [7]. One drawback of the mouse model, especially for examining young ages, is the size of the arteries. We describe a method for passive mechanical testing of carotid arteries from mice aged 3 days to adult (approximately 90 days). We adapt a commercial myograph system to mount the arteries and perform multiple pressure or axial stretch protocols on each specimen. We discuss suitable protocols for each age, the necessary measurements and provide example data.
We also include data analysis strategies for rigorous mechanical characterization of the arteries.
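Pressure-diameter data from tests like these are typically reduced to circumferential stretch and a thin-wall (Laplace) estimate of circumferential stress. The sketch below shows that standard reduction with invented numbers; it is not the authors' analysis code, and real analyses must also account for wall incompressibility and axial stretch:

```python
# Sketch of a standard reduction of pressure-diameter data: thin-wall
# (Laplace) circumferential stress and mid-wall stretch ratio.
# All input values are illustrative, not measurements from the protocol.

def circumferential_stress(pressure_mmhg, inner_radius_um, thickness_um):
    """Laplace thin-wall stress sigma = P * r_i / h, returned in kPa."""
    p_kpa = pressure_mmhg * 0.1333  # 1 mmHg = 0.1333 kPa
    return p_kpa * inner_radius_um / thickness_um

def circumferential_stretch(loaded_diam_um, unloaded_diam_um):
    """Stretch ratio relative to the unloaded configuration."""
    return loaded_diam_um / unloaded_diam_um

sigma = circumferential_stress(pressure_mmhg=100, inner_radius_um=250, thickness_um=50)
lam = circumferential_stretch(loaded_diam_um=550, unloaded_diam_um=400)
print(f"stress = {sigma:.1f} kPa, stretch = {lam:.2f}")
```

Repeating this over the full pressure range at each age yields the stress-stretch curves used to compare wall stiffness across development.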
Bioengineering, Issue 60, blood vessel, artery, mechanics, pressure, diameter, postnatal development
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
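The system's own analysis code is MATLAB-based; purely to illustrate the kind of reduction it performs on time-stamped event records, here is a minimal Python sketch that turns a toy event log into daily proportion-correct scores (event labels and timestamps are invented):

```python
from collections import defaultdict

# Illustrative reduction of a time-stamped behavioral event record to a
# daily proportion-correct score, the sort of summary graphed daily by
# the live-in screening system. Data below are invented.

events = [
    (0.2, "correct"), (0.9, "error"), (1.1, "correct"),
    (1.3, "correct"), (2.4, "correct"), (2.7, "error"),
]  # (timestamp in days, trial outcome)

daily = defaultdict(lambda: [0, 0])  # day -> [n_correct, n_total]
for t_days, label in events:
    day = int(t_days)
    daily[day][1] += 1
    if label == "correct":
        daily[day][0] += 1

for day in sorted(daily):
    correct, total = daily[day]
    print(f"day {day}: {correct}/{total} = {correct/total:.2f}")
```

Preserving the raw event list alongside every derived summary, as the abstract describes, is what keeps the full data trail auditable from raw records to published statistics.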
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Psychophysiological Stress Assessment Using Biofeedback
Authors: Inna Khazan.
Institutions: Cambridge Health Alliance, Harvard Medical School.
In the last half century, research in biofeedback has shown the extent to which the human mind can influence the functioning of the autonomic nervous system, previously thought to be outside of conscious control. By letting people observe signals from their own bodies, biofeedback enables them to develop greater awareness of their physiological and psychological reactions, such as stress, and to learn to modify these reactions. Biofeedback practitioners can facilitate this process by assessing people's reactions to mildly stressful events and formulating a biofeedback-based treatment plan. During stress assessment, the practitioner first records a baseline of physiological readings and then presents the client with several mild stressors, such as cognitive, physical, and emotional stressors. A variety of stressors is presented in order to determine a person's stimulus-response specificity, that is, the differences in each person's reaction to qualitatively different stimuli. This video will demonstrate the process of psychophysiological stress assessment using biofeedback and present general guidelines for treatment planning.
Neuroscience, Issue 29, Stress, biofeedback, psychophysiological, assessment
Measuring the Subjective Value of Risky and Ambiguous Options using Experimental Economics and Functional MRI Methods
Authors: Ifat Levy, Lior Rosenberg Belmaker, Kirk Manson, Agnieszka Tymula, Paul W. Glimcher.
Institutions: Yale School of Medicine, New York University.
Most of the choices we make have uncertain consequences. In some cases the probabilities of the different possible outcomes are precisely known, a condition termed "risky". In other cases, when probabilities cannot be estimated, the condition is described as "ambiguous". While most people are averse to both risk and ambiguity [1,2], the degree of those aversions varies substantially across individuals, such that the subjective value of the same risky or ambiguous option can be very different for different individuals. We combine functional MRI (fMRI) with an experimental economics-based method [3] to assess the neural representation of the subjective values of risky and ambiguous options [4]. This technique can now be used to study these neural representations in different populations, such as different age groups and different patient populations. In our experiment, subjects make consequential choices between two alternatives while their neural activation is tracked using fMRI. On each trial subjects choose between lotteries that vary in their monetary amount and in either the probability of winning that amount or the ambiguity level associated with winning. Our parametric design allows us to use each individual's choice behavior to estimate their attitudes towards risk and ambiguity, and thus to estimate the subjective value that each option held for them. Another important feature of the design is that the outcome of the chosen lottery is not revealed during the experiment, so that no learning can take place; thus the ambiguous options remain ambiguous and risk attitudes are stable. Instead, at the end of the scanning session one or a few trials are randomly selected and played for real money. Since subjects do not know beforehand which trials will be selected, they must treat each and every trial as if it and it alone were the one trial on which they will be paid. This design ensures that we can estimate the true subjective value of each option to each subject.
We then look for areas in the brain whose activation is correlated with the subjective value of risky options and for areas whose activation is correlated with the subjective value of ambiguous options.
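A common parametric form in this literature models the subjective value of a lottery as SV = p·v^α, with α < 1 indicating risk aversion, and fits α to each subject's choices via a logistic choice rule. The toy sketch below (simulated choices and a simple grid search, not the authors' code or data) illustrates that estimation:

```python
import math

# Toy maximum-likelihood estimation of a risk-attitude parameter alpha
# from binary lottery-vs-sure choices, assuming SV = p * v**alpha and a
# logistic choice rule. Trials and the alpha grid are invented.

def subjective_value(p, v, alpha):
    return p * v ** alpha

def choice_prob(sv_lottery, sv_sure):
    """Probability of choosing the lottery under a logistic rule."""
    return 1.0 / (1.0 + math.exp(-(sv_lottery - sv_sure)))

# Each trial: (win probability, lottery amount, sure amount, chose_lottery)
trials = [
    (0.5, 20, 5, True), (0.25, 40, 8, False),
    (0.5, 10, 6, False), (0.75, 12, 7, True),
]

def log_likelihood(alpha):
    ll = 0.0
    for p, v, sure, chose in trials:
        pr = choice_prob(subjective_value(p, v, alpha),
                         subjective_value(1.0, sure, alpha))
        pr = min(max(pr, 1e-12), 1.0 - 1e-12)  # guard against log(0)
        ll += math.log(pr if chose else 1.0 - pr)
    return ll

alphas = [a / 100 for a in range(20, 151)]
best = max(alphas, key=log_likelihood)
print(f"estimated alpha = {best:.2f}")
```

With each subject's fitted attitude in hand, a subjective value can be assigned to every option and regressed against the fMRI signal.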
Neuroscience, Issue 67, Medicine, Molecular Biology, fMRI, magnetic resonance imaging, decision-making, value, uncertainty, risk, ambiguity
Setting Limits on Supersymmetry Using Simplified Models
Authors: Christian Gütschow, Zachary Marshall.
Institutions: University College London, CERN, Lawrence Berkeley National Laboratory.
Experimental limits on supersymmetry and similar theories are difficult to set because of the enormous available parameter space and difficult to generalize because of the complexity of single points. Therefore, more phenomenological, simplified models are becoming popular for setting experimental limits, as they have clearer physical interpretations. The use of these simplified model limits to set a real limit on a concrete theory has not, however, been demonstrated. This paper recasts simplified model limits into limits on a specific and complete supersymmetry model, minimal supergravity. Limits obtained under various physical assumptions are comparable to those produced by directed searches. A prescription is provided for calculating conservative and aggressive limits on additional theories. Using acceptance and efficiency tables along with the expected and observed numbers of events in various signal regions, LHC experimental results can be recast in this manner into almost any theoretical framework, including nonsupersymmetric theories with supersymmetry-like signatures.
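Per signal region, the conservative recasting described reduces to simple arithmetic: the expected signal yield is cross-section × integrated luminosity × acceptance × efficiency, and a theory point is excluded if that yield exceeds the published 95% CL upper limit on signal events. A schematic sketch with invented numbers (not taken from any ATLAS/CMS result):

```python
# Schematic recast of a simplified-model limit onto a theory point.
# For one signal region: expected events N = sigma * L * A * eps;
# the point is excluded if N exceeds the published 95% CL upper
# limit on signal events. All numbers below are invented.

def expected_events(xsec_pb, lumi_ifb, acceptance, efficiency):
    """Expected signal yield; 1 fb^-1 of luminosity = 1000 pb^-1."""
    return xsec_pb * lumi_ifb * 1000.0 * acceptance * efficiency

def excluded(xsec_pb, lumi_ifb, acceptance, efficiency, n95_upper_limit):
    return expected_events(xsec_pb, lumi_ifb, acceptance, efficiency) > n95_upper_limit

# Hypothetical theory point evaluated in one signal region:
n = expected_events(xsec_pb=0.05, lumi_ifb=4.7, acceptance=0.10, efficiency=0.60)
print(n, excluded(0.05, 4.7, 0.10, 0.60, n95_upper_limit=12.0))
```

In practice the acceptance and efficiency are read from the experiments' published tables per simplified-model topology, and the conservative versus aggressive prescriptions differ in how yields are combined across regions and topologies.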
Physics, Issue 81, high energy physics, particle physics, Supersymmetry, LHC, ATLAS, CMS, New Physics Limits, Simplified Models
The Dig Task: A Simple Scent Discrimination Reveals Deficits Following Frontal Brain Damage
Authors: Kris M. Martens, Cole Vonder Haar, Blake A. Hutsell, Michael R. Hoane.
Institutions: Southern Illinois University at Carbondale.
Cognitive impairment is the most frequent cause of disability in humans following brain damage, yet the behavioral tasks used to assess cognition in rodent models of brain injury are lacking. Borrowing from the operant literature, our laboratory utilized a basic scent discrimination paradigm [1-4] in order to assess deficits in frontally injured rats. Previously, we briefly described the Dig task and demonstrated that rats with frontal brain damage show severe deficits across multiple tests within the task [5]. Here we present a more detailed protocol for this task. Rats are placed into a chamber and allowed to discriminate between two scented sands, one of which contains a reinforcer. The trial ends after the rat either correctly discriminates (defined as digging in the correctly scented sand), incorrectly discriminates, or 30 sec elapses. Rats that discriminate correctly are allowed to recover and consume the reinforcer. Rats that discriminate incorrectly are immediately removed from the chamber. This can continue through a variety of reversals and novel scents. The primary analysis is the accuracy for each scent pairing (cumulative proportion correct for each scent). The general findings from the Dig task suggest that it is a simple experimental preparation that can assess deficits in rats with bilateral frontal cortical damage compared to rats with unilateral parietal damage. The Dig task can also be easily incorporated into an existing cognitive test battery. The use of more tasks such as this one can lead to more accurate testing of frontal function following injury, which may lead to therapeutic options for treatment. All animal use was conducted in accordance with protocols approved by the Institutional Animal Care and Use Committee.
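The stated primary measure, cumulative proportion correct per scent pairing, reduces to a running tally. As a small illustration (the trial outcomes are invented, not experimental data):

```python
# Illustrative computation of the Dig task's primary measure: the
# cumulative proportion of correct digs for one scent pairing.
# Trial outcomes below are invented.

def cumulative_accuracy(outcomes):
    """outcomes: booleans (True = dug in correct sand). Returns running proportions."""
    correct = 0
    props = []
    for i, ok in enumerate(outcomes, start=1):
        correct += ok
        props.append(correct / i)
    return props

trials = [True, False, True, True, True, False, True, True]
print(cumulative_accuracy(trials)[-1])  # overall proportion correct
```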
Neuroscience, Issue 71, Medicine, Neurobiology, Anatomy, Physiology, Psychology, Behavior, cognitive assessment, dig task, scent discrimination, olfactory, brain injury, traumatic brain injury, TBI, brain damage, rats, animal model
A Microplate Assay to Assess Chemical Effects on RBL-2H3 Mast Cell Degranulation: Effects of Triclosan without Use of an Organic Solvent
Authors: Lisa M. Weatherly, Rachel H. Kennedy, Juyoung Shim, Julie A. Gosse.
Institutions: University of Maine, Orono.
Mast cells play important roles in allergic disease and immune defense against parasites. Once activated (e.g. by an allergen), they degranulate, a process that results in the exocytosis of allergic mediators. Modulation of mast cell degranulation by drugs and toxicants may have positive or adverse effects on human health. Mast cell function has been dissected in detail using rat basophilic leukemia mast cells (RBL-2H3), a widely accepted model of human mucosal mast cells [3-5]. The mast cell granule component and allergic mediator β-hexosaminidase, which is released linearly in tandem with histamine from mast cells [6], can easily and reliably be measured through reaction with a fluorogenic substrate, yielding measurable fluorescence intensity in a microplate assay that is amenable to high-throughput studies [1]. We have adapted this degranulation assay, originally published by Naal et al. [1], for the screening of drugs and toxicants, and we demonstrate its use here. Triclosan is a broad-spectrum antibacterial agent that is present in many consumer products and has been found to be a therapeutic aid in human allergic skin disease [7-11], although the mechanism for this effect is unknown. Here we demonstrate an assay for the effect of triclosan on mast cell degranulation. We recently showed that triclosan strongly affects mast cell function [2]. In an effort to avoid the use of an organic solvent, triclosan is dissolved directly into aqueous buffer with heat and stirring, and the resultant concentration is confirmed using UV-Vis spectrophotometry (using ε280 = 4,200 L/M/cm) [12]. This protocol has the potential to be used with a variety of chemicals to determine their effects on mast cell degranulation and, more broadly, their allergic potential.
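Confirming the triclosan concentration from its UV absorbance is a direct application of the Beer-Lambert law, c = A / (ε·l), using the extinction coefficient cited above. A minimal worked example (the absorbance reading is a made-up value, not a measurement from the protocol):

```python
# Beer-Lambert check of triclosan concentration from UV absorbance:
# c = A / (epsilon * l), with epsilon_280 = 4,200 L/(mol*cm) as cited.
# The measured absorbance below is an invented example value.

EPSILON_280 = 4200.0   # L / (mol * cm), from the protocol
PATH_LENGTH_CM = 1.0   # standard cuvette path length

def triclosan_molarity(absorbance_280):
    """Molar concentration from background-corrected A280."""
    return absorbance_280 / (EPSILON_280 * PATH_LENGTH_CM)

a280 = 0.21  # hypothetical background-corrected reading
c = triclosan_molarity(a280)
print(f"{c * 1e6:.0f} micromolar")
```

The computed molarity is then used to prepare the dosing series for the degranulation assay.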
Immunology, Issue 81, mast cell, basophil, degranulation, RBL-2H3, triclosan, irgasan, antibacterial, β-hexosaminidase, allergy, Asthma, toxicants, ionophore, antigen, fluorescence, microplate, UV-Vis
Lesion Explorer: A Video-guided, Standardized Protocol for Accurate and Reliable MRI-derived Volumetrics in Alzheimer's Disease and Normal Elderly
Authors: Joel Ramirez, Christopher J.M. Scott, Alicia A. McNeely, Courtney Berezuk, Fuqiang Gao, Gregory M. Szilagyi, Sandra E. Black.
Institutions: Sunnybrook Health Sciences Centre, University of Toronto.
Obtaining in vivo human brain tissue volumetrics from MRI is often complicated by various technical and biological issues. These challenges are exacerbated when significant brain atrophy and age-related white matter changes (e.g. leukoaraiosis) are present. Lesion Explorer (LE) is an accurate and reliable neuroimaging pipeline specifically developed to address such issues commonly observed on MRI of Alzheimer's disease and normal elderly. The pipeline is a complex set of semi-automatic procedures which has been previously validated in a series of internal and external reliability tests [1,2]. However, LE's accuracy and reliability are highly dependent on properly trained manual operators to execute commands, identify distinct anatomical landmarks, and manually edit/verify various computer-generated segmentation outputs. LE can be divided into 3 main components, each requiring a set of commands and manual operations: 1) Brain-Sizer, 2) SABRE, and 3) Lesion-Seg. Brain-Sizer's manual operations involve editing of the automatic skull-stripped total intracranial vault (TIV) extraction mask, designation of ventricular cerebrospinal fluid (vCSF), and removal of subtentorial structures. The SABRE component requires checking of image alignment along the anterior and posterior commissure (ACPC) plane, and identification of several anatomical landmarks required for regional parcellation. Finally, the Lesion-Seg component involves manual checking of the automatic lesion segmentation of subcortical hyperintensities (SH) for false positive errors. While on-site training of the LE pipeline is preferable, readily available visual teaching tools with interactive training images are a viable alternative. Developed to ensure a high degree of accuracy and reliability, the following is a step-by-step, video-guided, standardized protocol for LE's manual procedures.
Medicine, Issue 86, Brain, Vascular Diseases, Magnetic Resonance Imaging (MRI), Neuroimaging, Alzheimer Disease, Aging, Neuroanatomy, brain extraction, ventricles, white matter hyperintensities, cerebrovascular disease, Alzheimer disease
Imaging Leukocyte Adhesion to the Vascular Endothelium at High Intraluminal Pressure
Authors: Danielle L. Michell, Karen L. Andrews, Kevin J. Woollard, Jaye P.F. Chin-Dusting.
Institutions: Monash University.
Worldwide, hypertension is reported in approximately a quarter of the population and is the leading biomedical risk factor for mortality. In the vasculature, hypertension is associated with endothelial dysfunction and increased inflammation, leading to atherosclerosis and various disease states such as chronic kidney disease [2], stroke [3], and heart failure [4]. An initial step in vascular inflammation leading to atherogenesis is the adhesion cascade, which involves the rolling, tethering, adherence, and subsequent transmigration of leukocytes through the endothelium. Recruitment and accumulation of leukocytes on the endothelium are mediated by an upregulation of adhesion molecules such as vascular cell adhesion molecule-1 (VCAM-1), intercellular adhesion molecule-1 (ICAM-1), and E-selectin, as well as by increases in cytokine and chemokine release and an upregulation of reactive oxygen species [5]. In vitro methods such as static adhesion assays help to determine the mechanisms involved in cell-to-cell adhesion and allow the analysis of cell adhesion molecules. Previous in vitro studies have demonstrated that acute increases in pressure on the endothelium can lead to monocyte adhesion and an upregulation of adhesion molecules and inflammatory markers [6]; however, similar to many in vitro assays, these findings were not obtained in real time under physiological flow conditions, nor with whole blood. Therefore, in vivo assays are increasingly utilized in animal models to demonstrate vascular inflammation and plaque development. Intravital microscopy is now widely used to assess leukocyte adhesion, rolling, migration, and transmigration [7-9]. When it comes to combining the effects of pressure with leukocyte-endothelial adhesion, the in vivo studies are less extensive. One such study examined the real-time effects of flow and shear on arterial growth and remodeling, but inflammatory markers were only assessed via immunohistochemistry [10].
Here we present a model for recording leukocyte adhesion in real time in intact pressurized blood vessels using whole-blood perfusion. The methodology is a modification of an ex vivo vessel-chamber perfusion model [9] that enables real-time analysis of leukocyte-endothelial adhesive interactions in intact vessels. Our modification enables manipulation of the intraluminal pressure up to 200 mmHg, allowing study not only under physiological flow conditions but also under controlled pressure conditions. While pressure myography systems have previously been used to observe vessel wall and lumen diameter [11] as well as vessel contraction, this is the first demonstration of leukocyte-endothelial interactions in real time. Here we demonstrate the technique using carotid arteries harvested from rats and cannulated in a custom-made flow chamber coupled to a fluorescence microscope. The vessel chamber is equipped with a large bottom coverglass, allowing a large-diameter objective lens with a short working distance to image the vessel. Furthermore, selected agonists and/or antagonists can be utilized to further investigate the mechanisms controlling cell adhesion. Advantages of this method over intravital microscopy include the absence of invasive surgery, so a higher throughput can be obtained. This method also enables localized inhibitor treatment of the desired vessel, whereas intravital microscopy permits only systemic inhibitor treatment.
Immunology, Issue 54, Leukocyte adhesion, intraluminal pressure, endothelial dysfunction, inflammation, hypertension
A Novel Rescue Technique for Difficult Intubation and Difficult Ventilation
Authors: Maria M. Zestos, Dima Daaboul, Zulfiqar Ahmed, Nasser Durgham, Roland Kaddoum.
Institutions: Children’s Hospital of Michigan, St. Jude Children’s Research Hospital.
We describe a novel nonsurgical technique to maintain oxygenation and ventilation in a case of difficult intubation and difficult ventilation, which works especially well with poor mask fit. "Cannot intubate, cannot ventilate" (CICV) is a potentially life-threatening situation. In this video we present a simulation of the technique we used in a case of CICV in which oxygenation and ventilation were maintained by inserting an endotracheal tube (ETT) nasally down to the level of the nasopharynx while sealing the mouth and nares for successful positive-pressure ventilation. A 13-year-old patient was taken to the operating room for incision and drainage of a neck abscess and direct laryngobronchoscopy. After preoxygenation, anesthesia was induced intravenously. Mask ventilation was found to be extremely difficult because of swelling of the soft tissue. The face mask also could not fit properly on the face due to significant facial swelling. A direct laryngoscopy was attempted with no visualization of the larynx. Oxygen saturation was difficult to maintain, with saturations falling to 80%. In order to oxygenate and ventilate the patient, an endotracheal tube was then inserted nasally after application of a nasal decongestant spray and lubricant. The tube was pushed gently and blindly into the hypopharynx. The mouth and nose of the patient were sealed by hand, and positive-pressure ventilation with 100% O2 was possible, with good oxygen saturation throughout that period. Once the patient was stable and well sedated, a rigid bronchoscope was introduced by the otolaryngologist, revealing extensive subglottic and epiglottic edema and a mass effect from the abscess contributing to the airway compromise. The airway was secured with an ETT by the otolaryngologist. This video shows a simulation of the technique on a patient undergoing general anesthesia for dental restorations.
Medicine, Issue 47, difficult ventilation, difficult intubation, nasal, saturation
Tilt Testing with Combined Lower Body Negative Pressure: a "Gold Standard" for Measuring Orthostatic Tolerance
Authors: Clare L. Protheroe, Henrike (Rianne) J.C. Ravensbergen, Jessica A. Inskip, Victoria E. Claydon.
Institutions: Simon Fraser University.
Orthostatic tolerance (OT) refers to the ability to maintain cardiovascular stability when upright, against the hydrostatic effects of gravity, and hence to maintain cerebral perfusion and prevent syncope (fainting). Various techniques are available to assess OT and the effects of gravitational stress upon the circulation, typically by reproducing a presyncopal event (near-fainting episode) in a controlled laboratory environment. The time and/or degree of stress required to provoke this response provides the measure of OT. Any technique used to determine OT should: enable distinction between patients with orthostatic intolerance (of various causes) and asymptomatic control subjects; be highly reproducible, enabling evaluation of therapeutic interventions; avoid invasive procedures, which are known to impair OT1. In the late 1980s head-upright tilt testing was first utilized for diagnosing syncope2. Since then it has been used to assess OT in patients with syncope of unknown cause, as well as in healthy subjects to study postural cardiovascular reflexes2-6. Tilting protocols comprise three categories: passive tilt; passive tilt accompanied by pharmacological provocation; and passive tilt with combined lower body negative pressure (LBNP). However, the effects of tilt testing (and other orthostatic stress testing modalities) are often poorly reproducible, with low sensitivity and specificity to diagnose orthostatic intolerance7. Typically, a passive tilt includes 20-60 min of orthostatic stress continued until the onset of presyncope in patients2-6. However, the main drawback of this procedure is its inability to invoke presyncope in all individuals undergoing the test, and corresponding low sensitivity8,9. Thus, different methods were explored to increase the orthostatic stress and improve sensitivity. Pharmacological provocation has been used to increase the orthostatic challenge, for example using isoprenaline4,7,10,11 or sublingual nitrate12,13. 
However, the main drawback of these approaches is an increase in sensitivity at the cost of an unacceptable decrease in specificity10,14, with a high positive response rate immediately after administration15. Furthermore, invasive procedures associated with some pharmacological provocations greatly increase the false positive rate1. Another approach is to combine passive tilt testing with LBNP, providing a stronger orthostatic stress without invasive procedures or drug side-effects, using the technique pioneered by Professor Roger Hainsworth in the 1990s16-18. This approach provokes presyncope in almost all subjects (allowing for symptom recognition in patients with syncope), while discriminating between patients with syncope and healthy controls, with a specificity of 92%, sensitivity of 85%, and repeatability of 1.1±0.6 min16,17. This allows not only diagnosis and pathophysiological assessment19-22, but also the evaluation of treatments for orthostatic intolerance, due to its high repeatability23-30. For these reasons, we argue that this should be the "gold standard" for orthostatic stress testing, and accordingly it is the method described in this paper.
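The diagnostic performance figures quoted above follow directly from confusion-matrix counts. As an illustration only, here is a minimal Python sketch using hypothetical patient/control counts chosen to reproduce the reported 85% sensitivity and 92% specificity (the actual counts are not given in this abstract):

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity and specificity from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)  # fraction of syncope patients in whom presyncope is provoked
    specificity = tn / (tn + fp)  # fraction of healthy controls with a negative test
    return sensitivity, specificity

# Hypothetical counts chosen to reproduce the reported figures
sens, spec = diagnostic_metrics(tp=17, fn=3, tn=23, fp=2)
print(f"sensitivity = {sens:.0%}, specificity = {spec:.0%}")  # sensitivity = 85%, specificity = 92%
```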
Medicine, Issue 73, Anatomy, Physiology, Biomedical Engineering, Neurobiology, Kinesiology, Cardiology, tilt test, lower body negative pressure, orthostatic stress, syncope, orthostatic tolerance, fainting, gravitational stress, head upright, stroke, clinical techniques
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3-6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin-embedded stained electron tomography, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM and SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful.
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
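As a toy illustration of the simpler end of this spectrum, thresholding followed by connected-component extraction (the core operation in many semi-automated segmentation workflows), the following self-contained Python sketch segments a small 2D array. Real EM volumes are 3D and far noisier, so this is a conceptual sketch only, not the authors' pipeline:

```python
from collections import deque

def segment(image, threshold, min_pixels=3):
    """Binarize a 2D grayscale image, then keep 4-connected components
    with at least min_pixels pixels (a toy stand-in for feature extraction)."""
    h, w = len(image), len(image[0])
    mask = [[image[y][x] > threshold for x in range(w)] for y in range(h)]
    labels = [[0] * w for _ in range(h)]
    components = []
    next_label = 1
    for y in range(h):
        for x in range(w):
            if mask[y][x] and labels[y][x] == 0:
                # BFS flood fill of one connected component
                queue, pixels = deque([(y, x)]), []
                labels[y][x] = next_label
                while queue:
                    cy, cx = queue.popleft()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and labels[ny][nx] == 0:
                            labels[ny][nx] = next_label
                            queue.append((ny, nx))
                if len(pixels) >= min_pixels:
                    components.append(pixels)
                next_label += 1
    return components

img = [
    [0.1, 0.9, 0.9, 0.1],
    [0.1, 0.9, 0.9, 0.1],
    [0.1, 0.1, 0.1, 0.8],
]
comps = segment(img, threshold=0.5)
print(len(comps), len(comps[0]))  # one 4-pixel component survives; the lone 0.8 pixel is discarded
```

The `min_pixels` filter plays the role of the "percentage of the volume a region of interest occupies" criterion mentioned above: tiny bright specks are treated as noise.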
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Authors: Daniel T. Claiborne, Jessica L. Prince, Eric Hunter.
Institutions: Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome from which the virus escapes only with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized that this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated from acute time points from subtype C infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate for the study of subtype C sequences than previous recombination-based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Authors: Todd C. Lorenz.
Institutions: University of California, Los Angeles.
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq DNA polymerase. PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that complicate the reaction, producing spurious results. When PCR fails it can lead to many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction.
By following this PCR guide, students should be able to:
● Set up reactions and thermal cycling conditions for a conventional PCR experiment
● Understand the function of various reaction components and their overall effect on a PCR experiment
● Design and optimize a PCR experiment for any DNA template
● Troubleshoot failed PCR experiments
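As an aside on primer design, the classic Wallace rule gives a quick first estimate of a primer's melting temperature (Tm). The sketch below illustrates that rule only; it is not part of the protocol, and nearest-neighbor thermodynamic models are preferred for primers longer than about 14 nucleotides:

```python
def wallace_tm(primer):
    """Wallace rule: Tm = 2*(A+T) + 4*(G+C), in degrees C.
    A rough guide for short primers (under ~14 nt)."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

def gc_content(primer):
    """Fraction of G and C bases, another quick primer-design check."""
    p = primer.upper()
    return (p.count("G") + p.count("C")) / len(p)

primer = "ACGTGCATGCAT"  # hypothetical 12-mer, for illustration only
print(wallace_tm(primer), gc_content(primer))  # 36 0.5
```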
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
Conducting Miller-Urey Experiments
Authors: Eric T. Parker, James H. Cleaves, Aaron S. Burton, Daniel P. Glavin, Jason P. Dworkin, Manshui Zhou, Jeffrey L. Bada, Facundo M. Fernández.
Institutions: Georgia Institute of Technology, Tokyo Institute of Technology, Institute for Advanced Study, NASA Johnson Space Center, NASA Goddard Space Flight Center, University of California at San Diego.
In 1953, Stanley Miller reported the production of biomolecules from simple gaseous starting materials, using an apparatus constructed to simulate the primordial Earth's atmosphere-ocean system. Miller introduced 200 ml of water, 100 mmHg of H2, 200 mmHg of CH4, and 200 mmHg of NH3 into the apparatus, then subjected this mixture, under reflux, to an electric discharge for a week, while the water was simultaneously heated. The purpose of this manuscript is to provide the reader with a general experimental protocol that can be used to conduct a Miller-Urey type spark discharge experiment, using a simplified 3 L reaction flask. Since the experiment involves exposing flammable gases to a high voltage electric discharge, it is worth highlighting important steps that reduce the risk of explosion. The general procedures described in this work can be extrapolated to design and conduct a wide variety of electric discharge experiments simulating primitive planetary environments.
Chemistry, Issue 83, Geosciences (General), Exobiology, Miller-Urey, Prebiotic chemistry, amino acids, spark discharge
Simultaneous Quantification of T-Cell Receptor Excision Circles (TRECs) and K-Deleting Recombination Excision Circles (KRECs) by Real-time PCR
Authors: Alessandra Sottini, Federico Serana, Diego Bertoli, Marco Chiarini, Monica Valotti, Marion Vaglio Tessitore, Luisa Imberti.
Institutions: Spedali Civili di Brescia.
T-cell receptor excision circles (TRECs) and K-deleting recombination excision circles (KRECs) are circularized DNA elements formed during the recombination processes that create T- and B-cell receptors. Because TRECs and KRECs are unable to replicate, they are diluted with each cell division, although they persist within the cell. Their quantity in peripheral blood can therefore be considered an estimate of thymic and bone marrow output. By combining the well established and commonly used TREC assay with a modified version of the KREC assay, we have developed a duplex quantitative real-time PCR that allows quantification of both newly produced T and B lymphocytes in a single assay. The numbers of TRECs and KRECs are obtained using a standard curve prepared by serially diluting TREC and KREC signal joints cloned in a bacterial plasmid, together with a fragment of the T-cell receptor alpha constant gene that serves as a reference gene. Results are reported as the number of TRECs and KRECs per 10⁶ cells or per ml of blood. The quantification of these DNA fragments has proven useful for monitoring immune reconstitution following bone marrow transplantation in both children and adults, for improved characterization of immune deficiencies, and for better understanding of the activity of certain immunomodulating drugs.
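The standard-curve arithmetic behind this kind of absolute quantification can be sketched as follows. The dilution series and Ct values below are hypothetical, chosen to give the ideal slope of about -3.32 Ct per 10-fold dilution (i.e., ~100% PCR efficiency); they are not the paper's data:

```python
import math

def fit_standard_curve(copies, cts):
    """Least-squares fit of Ct = slope * log10(copies) + intercept
    from a serially diluted plasmid standard."""
    xs = [math.log10(c) for c in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, cts)) / sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return slope, intercept

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to get copy number for an unknown sample."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical 10-fold dilution series of the plasmid standard
standards = [10**6, 10**5, 10**4, 10**3, 10**2]
cts = [16.6, 19.92, 23.24, 26.56, 29.88]
slope, intercept = fit_standard_curve(standards, cts)
print(round(slope, 2))  # -3.32, i.e. ~100% efficiency
print(copies_from_ct(22.0, slope, intercept))  # copy number for an unknown sample with Ct 22
```

The resulting copy numbers would then be normalized per 10⁶ cells (via the reference gene) or per ml of blood, as described above.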
Immunology, Issue 94, B lymphocytes, primary immunodeficiency, real-time PCR, immune recovery, T-cell homeostasis, T lymphocytes, thymic output, bone marrow output
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
Creating Dynamic Images of Short-lived Dopamine Fluctuations with lp-ntPET: Dopamine Movies of Cigarette Smoking
Authors: Evan D. Morris, Su Jin Kim, Jenna M. Sullivan, Shuo Wang, Marc D. Normandin, Cristian C. Constantinescu, Kelly P. Cosgrove.
Institutions: Yale University, Yale University, Yale University, Yale University, Massachusetts General Hospital, University of California, Irvine.
We describe experimental and statistical steps for creating dopamine movies of the brain from dynamic PET data. The movies represent minute-to-minute fluctuations of dopamine induced by smoking a cigarette. The smoker is imaged during a natural smoking experience while other possible confounding effects (such as head motion, expectation, novelty, or aversion to smoking repeatedly) are minimized. We present the details of our unique analysis. Conventional methods for PET analysis estimate time-invariant kinetic model parameters, which cannot capture short-term fluctuations in neurotransmitter release. Our analysis - yielding a dopamine movie - is based on our work with kinetic models and other decomposition techniques that allow for time-varying parameters1-7. This aspect of the analysis - temporal variation - is key to our work. Because our model is also linear in parameters, it is practical, computationally, to apply at the voxel level. The analysis technique comprises five main steps: pre-processing, modeling, statistical comparison, masking and visualization. Preprocessing is applied to the PET data with a unique 'HYPR' spatial filter8 that reduces spatial noise but preserves critical temporal information. Modeling identifies the time-varying function that best describes the dopamine effect on ¹¹C-raclopride uptake. The statistical step compares the fit of our (lp-ntPET) model7 to a conventional model9. Masking restricts treatment to those voxels best described by the new model. Visualization maps the dopamine function at each voxel to a color scale and produces a dopamine movie. Interim results and sample dopamine movies of cigarette smoking are presented.
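The statistical comparison step is, in essence, an extra-sum-of-squares F-test between nested models: does the time-varying model reduce the residuals enough to justify its extra parameters? A minimal sketch follows; the frame count, parameter counts, and residual sums of squares are hypothetical, not taken from the paper:

```python
def extra_sum_of_squares_f(rss_reduced, rss_full, p_reduced, p_full, n):
    """F statistic for comparing nested models fit to n time frames.
    rss_reduced/rss_full: residual sums of squares of the simpler and
    richer models; p_reduced/p_full: their parameter counts."""
    numerator = (rss_reduced - rss_full) / (p_full - p_reduced)
    denominator = rss_full / (n - p_full)
    return numerator / denominator

# Hypothetical voxel: 35 PET frames, a 4-parameter conventional model
# vs. a 7-parameter time-varying model
f = extra_sum_of_squares_f(rss_reduced=12.0, rss_full=8.0, p_reduced=4, p_full=7, n=35)
print(round(f, 2))  # 4.67
```

Voxels whose F statistic exceeds the chosen critical value would pass the masking step; the rest are treated with the conventional model.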
Behavior, Issue 78, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Medicine, Anatomy, Physiology, Image Processing, Computer-Assisted, Receptors, Dopamine, Dopamine, Functional Neuroimaging, Binding, Competitive, mathematical modeling (systems analysis), Neurotransmission, transient, dopamine release, PET, modeling, linear, time-invariant, smoking, F-test, ventral-striatum, clinical techniques
Optimized PCR-based Detection of Mycoplasma
Authors: Paige L. Dobrovolny, Dan Bess.
Institutions: Sigma-Aldrich.
The maintenance of contamination-free cell lines is essential to cell-based research. Among the biggest concerns is mycoplasma contamination. Although mycoplasma do not usually kill contaminated cells, they are difficult to detect and can cause a variety of effects on cultured cells, including altered metabolism, slowed proliferation and chromosomal aberrations. In short, mycoplasma contamination compromises the value of those cell lines in providing accurate data for life science research. The sources of mycoplasma contamination in the laboratory are very challenging to completely control. As certain mycoplasma species are found on human skin, they can be introduced through poor aseptic technique. Additionally, they can come from contaminated supplements such as fetal bovine serum, and most importantly from other contaminated cell cultures. Once mycoplasma contaminates a culture, it can quickly spread to other areas of the lab. Strict adherence to good laboratory practices, such as good aseptic technique, is key, and routine testing for mycoplasma is highly recommended for successful control of mycoplasma contamination. PCR-based detection of mycoplasma has become a very popular method for routine cell line maintenance. PCR-based detection methods are highly sensitive and provide rapid results, allowing researchers to respond quickly to isolate and eliminate contamination once it is detected, compared with the time required by microbiological techniques. The LookOut Mycoplasma PCR Detection Kit is highly sensitive, with a detection limit of only 2 genomes per μl. The highly specific JumpStart Taq DNA Polymerase and a proprietary primer design greatly reduce false positives. The convenient 8-tube format, with strips pre-coated with dNTPs and the associated primers, helps increase throughput to meet the needs of customers with larger collections of cell lines.
Given the extreme sensitivity of the kit, great care must be taken to prevent inadvertent contamination of samples and reagents. The step-by-step protocol we demonstrate highlights the precautions and practices required for reliable mycoplasma detection. We also show and discuss typical results and their interpretation. Our goal is to ensure the success of researchers using the LookOut Mycoplasma PCR Detection Kit.
Microbiology, Issue 52, Mycoplasma detection, mycoplasma contamination, cell culture, sigma mycoplasma detection, acholeplasma contamination, polymerase chain reaction, PCR
Radio Frequency Identification and Motion-sensitive Video Efficiently Automate Recording of Unrewarded Choice Behavior by Bumblebees
Authors: Levente L. Orbán, Catherine M.S. Plowright.
Institutions: University of Ottawa.
We present two methods for observing bumblebee choice behavior in an enclosed testing space. The first method consists of Radio Frequency Identification (RFID) readers built into artificial flowers that display various visual cues, and RFID tags (i.e., passive transponders) glued to the thorax of bumblebee workers. The novelty in our implementation is that RFID readers are built directly into artificial flowers that are capable of displaying several distinct visual properties such as color, pattern type, spatial frequency (i.e., “busyness” of the pattern), and symmetry (spatial frequency and symmetry were not manipulated in this experiment). Additionally, these visual displays in conjunction with the automated systems are capable of recording unrewarded and untrained choice behavior. The second method consists of recording choice behavior at artificial flowers using motion-sensitive high-definition camcorders. Bumblebees have number tags glued to their thoraces for unique identification. The advantage of this implementation over RFID is that in addition to landing behavior, alternative measures of preference such as hovering and antennation may also be observed. Both automation methods increase experimental control and internal validity by allowing larger-scale studies that take individual differences into account. External validity is also improved because bees can freely enter and exit the testing environment without constraints such as the availability of a research assistant on-site. Compared to human observation in real time, the automated methods are more cost-effective and possibly less error-prone.
Neuroscience, Issue 93, bumblebee, unlearned behaviors, floral choice, visual perception, Bombus spp, information processing, radio-frequency identification, motion-sensitive video
Measuring Blood Pressure in Mice using Volume Pressure Recording, a Tail-cuff Method
Authors: Alan Daugherty, Debra Rateri, Lu Hong, Anju Balakrishnan.
Institutions: University of Kentucky.
The CODA 8-Channel High Throughput Non-Invasive Blood Pressure system measures blood pressure in up to 8 mice or rats simultaneously. The CODA tail-cuff system uses Volume Pressure Recording (VPR) to measure blood pressure by determining the tail blood volume. A specially designed differential pressure transducer and an occlusion tail-cuff measure the total blood volume in the tail without the need to obtain an individual pulse signal. Special attention is paid to the length of the occlusion cuff in order to derive the most accurate blood pressure readings. VPR can easily obtain readings from dark-skinned rodents, such as C57BL6 mice, and is MRI compatible. The CODA system provides measurements of six blood pressure parameters: systolic and diastolic blood pressure, heart rate, mean blood pressure, tail blood flow, and tail blood volume. Measurements can be made on either awake or anesthetized mice or rats. The CODA system includes a controller, laptop computer, software, cuffs, animal holders, infrared warming pads, and an infrared thermometer. Seven holder sizes accommodate animals ranging from mice as small as 8 grams to rats as large as 900 grams.
Medicine, Issue 27, blood pressure, systolic, diastolic, tail-cuff, mouse, rat
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.