The vascular endothelium is a monolayer of cells that covers the interior of blood vessels and serves both structural and functional roles. Structurally, the endothelium acts as a barrier, preventing leukocyte adhesion and aggregation and controlling permeability to plasma components. Functionally, the endothelium regulates vessel tone.
Endothelial dysfunction is an imbalance among the chemical species that regulate vessel tone, thromboresistance, cellular proliferation, and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia.
The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase in intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed, while subjects with endothelial damage experienced paradoxical vasoconstriction.
There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia.
This technique, known as endothelium-dependent, flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator dependent and presents a steep learning curve. This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
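The percent diameter change that FMD reports can be computed directly from the baseline and peak hyperemic diameters. The sketch below is a minimal illustration of that arithmetic; the function name and units are ours, not part of any published toolchain.

```python
def fmd_percent(baseline_mm: float, peak_mm: float) -> float:
    """Flow-mediated dilation: percent change from baseline brachial diameter."""
    if baseline_mm <= 0:
        raise ValueError("baseline diameter must be positive")
    return (peak_mm - baseline_mm) / baseline_mm * 100.0
```

For example, a baseline diameter of 4.00 mm and a hyperemic peak of 4.28 mm give an FMD of 7.0%.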
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and of protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design, including the design of monomeric proteins for increased stability and of complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
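The rank-ordering idea behind the sequence selection stage can be sketched schematically: candidate sequences are scored with an energy function and sorted so that the lowest-energy (most stabilizing) sequences come first. The energy model below is a deliberately toy stand-in (a hydrophobicity sum), not the actual potential used by Protein WISDOM.

```python
# Toy per-residue hydrophobicity values (Kyte-Doolittle scale) for a few
# residues; any residue not listed contributes zero.
HYDROPHOBICITY = {"A": 1.8, "L": 3.8, "K": -3.9, "E": -3.5, "G": -0.4}

def toy_energy(seq: str) -> float:
    # Placeholder stability score: more hydrophobic -> more negative (better).
    return -sum(HYDROPHOBICITY.get(aa, 0.0) for aa in seq)

def rank_sequences(seqs):
    """Return candidate sequences ordered best (lowest energy) first."""
    return sorted(seqs, key=toy_energy)
```

A real design stage would replace `toy_energy` with a physics-based potential evaluated over the structural template, as described above.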
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Measuring Frailty in HIV-infected Individuals. Identification of Frail Patients is the First Step to Amelioration and Reversal of Frailty
Institutions: University of Arizona, University of Arizona.
A simple, validated protocol consisting of a battery of tests is available to identify elderly patients with frailty syndrome. This syndrome of decreased reserve and resistance to stressors increases in incidence with increasing age. In the elderly, frailty may follow a step-wise loss of function from non-frail to pre-frail to frail. We studied frailty in HIV-infected patients and found that ~20% are frail according to the Fried phenotype, using stringent criteria developed for the elderly1,2. In HIV infection the syndrome occurs at a younger age.
HIV patients were assessed for 1) unintentional weight loss; 2) slowness, as determined by walking speed; 3) weakness, as measured by a grip dynamometer; 4) exhaustion, by responses to a depression scale; and 5) low physical activity, as determined by kilocalories expended in a week's time. Pre-frailty was present with any two of the five criteria, and frailty was present if any three of the five criteria were abnormal.
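The counting rule just described maps directly to code. In this sketch each argument is a boolean flag for one abnormal criterion; grouping a single positive criterion with "non-frail" is our simplifying assumption, since the text above only specifies the two- and three-criterion cutoffs.

```python
def frailty_status(weight_loss: bool, slowness: bool, weakness: bool,
                   exhaustion: bool, low_activity: bool) -> str:
    """Classify frailty from the five Fried criteria described above."""
    n_abnormal = sum([weight_loss, slowness, weakness, exhaustion, low_activity])
    if n_abnormal >= 3:
        return "frail"
    if n_abnormal == 2:
        return "pre-frail"
    return "non-frail"  # assumption: 0 or 1 abnormal criterion -> non-frail
```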
The tests take approximately 10-15 min to complete and they can be performed by medical assistants during routine clinic visits. Test results are scored by referring to standard tables. Understanding which of the five components contribute to frailty in an individual patient can allow the clinician to address relevant underlying problems, many of which are not evident in routine HIV clinic visits.
Medicine, Issue 77, Infection, Virology, Infectious Diseases, Anatomy, Physiology, Molecular Biology, Biomedical Engineering, Retroviridae Infections, Body Weight Changes, Diagnostic Techniques and Procedures, Physical Examination, Muscle Strength, Behavior, Virus Diseases, Pathological Conditions, Signs and Symptoms, Diagnosis, Musculoskeletal and Neural Physiological Phenomena, HIV, HIV-1, AIDS, Frailty, Depression, Weight Loss, Weakness, Slowness, Exhaustion, Aging, clinical techniques
Models and Methods to Evaluate Transport of Drug Delivery Systems Across Cellular Barriers
Institutions: University of Maryland, University of Maryland.
Sub-micrometer carriers (nanocarriers; NCs) enhance the efficacy of drugs by improving solubility, stability, circulation time, targeting, and release. Additionally, traversing cellular barriers in the body is crucial both for oral delivery of therapeutic NCs into the circulation and for transport from the blood into tissues, where intervention is needed. NC transport across cellular barriers is achieved by: (i) the paracellular route, via transient disruption of the junctions that interlock adjacent cells, or (ii) the transcellular route, where materials are internalized by endocytosis, transported across the cell body, and secreted at the opposite cell surface (transcytosis). Delivery across cellular barriers can be facilitated by coupling therapeutics or their carriers with targeting agents that bind specifically to cell-surface markers involved in transport. Here, we provide methods to measure the extent and mechanism of NC transport across a model cell barrier, which consists of a monolayer of gastrointestinal (GI) epithelial cells grown on a porous membrane located in a transwell insert. Formation of a permeability barrier is confirmed by measuring transepithelial electrical resistance (TEER), transepithelial transport of a control substance, and immunostaining of tight junctions. As an example, ~200 nm polymer NCs are used, which carry a therapeutic cargo and are coated with an antibody that targets a cell-surface determinant. The antibody or therapeutic cargo is labeled with 125I for radioisotope tracing, and labeled NCs are added to the upper chamber over the cell monolayer for varying periods of time. NCs associated with the cells and/or transported to the underlying chamber can then be detected. Measurement of free 125I allows subtraction of the degraded fraction. The paracellular route is assessed by determining potential changes, caused by NC transport, in the barrier parameters described above. Transcellular transport is determined by addressing the effect of modulating endocytosis and transcytosis pathways.
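One way to express the tracing arithmetic described above: counts recovered in the lower chamber, corrected for the degraded (free 125I) fraction, as a percentage of the counts added apically. The variable names and this particular normalization are illustrative assumptions, not a prescribed formula.

```python
def percent_transported(cpm_lower: float, cpm_free_lower: float,
                        cpm_added: float) -> float:
    """Percent of added 125I-labeled NCs transported intact across the monolayer.

    cpm_lower:      total counts per minute in the lower (basolateral) chamber
    cpm_free_lower: counts attributable to free 125I (degraded fraction)
    cpm_added:      counts added to the upper (apical) chamber at t = 0
    """
    intact = cpm_lower - cpm_free_lower
    return 100.0 * intact / cpm_added
```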
Bioengineering, Issue 80, Antigens, Enzymes, Biological Therapy, bioengineering (general), Pharmaceutical Preparations, Macromolecular Substances, Therapeutics, Digestive System and Oral Physiological Phenomena, Biological Phenomena, Cell Physiological Phenomena, drug delivery systems, targeted nanocarriers, transcellular transport, epithelial cells, tight junctions, transepithelial electrical resistance, endocytosis, transcytosis, radioisotope tracing, immunostaining
Meal Duration as a Measure of Orofacial Nociceptive Responses in Rodents
Institutions: Texas A&M University Baylor College of Dentistry.
A lengthening in meal duration can be used to measure an increase in orofacial mechanical hyperalgesia, which has similarities to the guarding behavior of humans with orofacial pain. To measure meal duration, unrestrained rats are kept continuously in sound-attenuated, computerized feeding modules for days to weeks to record feeding behavior. These sound-attenuated chambers are equipped with chow pellet dispensers. The dispenser has a pellet trough with a photobeam placed at the bottom; when a rodent removes a pellet from the trough, this beam is no longer blocked, signaling the computer to drop another pellet. The computer records the date and time at which each pellet is taken from the trough, and from these data the experimenter can calculate the meal parameters. Based on previous work, the meal-ending criterion was set at 10 min (when the animal does not eat for 10 min, the meal is considered ended) and the minimum meal size was set at 3 pellets. The meal duration, meal number, food intake, meal size, and inter-meal interval can then be calculated by the software for any time period the operator desires. Of the feeding parameters that can be calculated, meal duration has been shown to be a continuous, noninvasive biological marker of orofacial nociception in male rats and mice and in female rats. Meal duration measurements are quantitative, require no training or animal manipulation, require cortical participation, and do not compete with other experimentally induced behaviors. These factors distinguish this assay from other operant or reflex methods for recording orofacial nociception.
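The meal definitions above (a 10 min pause ends a meal; meals under 3 pellets are discarded) can be sketched as a small grouping routine over pellet-retrieval timestamps. The data layout and function name are ours; the commercial feeding-module software computes these parameters itself.

```python
MEAL_GAP_S = 10 * 60   # a >= 10 min pause in feeding ends the current meal
MIN_PELLETS = 3        # meals with fewer pellets are discarded

def find_meals(pellet_times_s):
    """Group pellet-retrieval times (seconds) into meals.

    Returns a list of (start, end, n_pellets); meal duration is end - start
    and the inter-meal interval is the gap between successive meals.
    """
    meals, start, prev, n = [], None, None, 0
    for t in sorted(pellet_times_s):
        if prev is not None and t - prev >= MEAL_GAP_S:
            if n >= MIN_PELLETS:
                meals.append((start, prev, n))
            start, n = t, 0
        elif prev is None:
            start = t
        prev = t
        n += 1
    if n >= MIN_PELLETS:
        meals.append((start, prev, n))
    return meals
```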
Behavior, Issue 83, Pain, rat, nociception, myofacial, orofacial, tooth, temporomandibular joint (TMJ)
Computerized Dynamic Posturography for Postural Control Assessment in Patients with Intermittent Claudication
Institutions: University of Sydney, University of Hull, Hull and East Yorkshire Hospitals, Addenbrookes Hospital.
Computerized dynamic posturography with the EquiTest is an objective technique for measuring postural strategies under challenging static and dynamic conditions. As part of a diagnostic assessment, the early detection of postural deficits is important so that appropriate and targeted interventions can be prescribed. The Sensory Organization Test (SOT) on the EquiTest determines an individual's use of the sensory systems (somatosensory, visual, and vestibular) that are responsible for postural control. Somatosensory and visual input are altered by the calibrated sway-referenced support surface and visual surround, which move in the anterior-posterior direction in response to the individual's postural sway. This creates a conflicting sensory experience. The Motor Control Test (MCT) challenges postural control by creating unexpected postural disturbances in the form of backwards and forwards translations. The translations are graded in magnitude and the time to recover from the perturbation is computed.
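On the EquiTest, each SOT trial is commonly summarized as an equilibrium score that compares peak-to-peak anterior-posterior sway against a theoretical sway limit of 12.5°. This formula follows the standard NeuroCom formulation and is offered here as background, not taken from the text above.

```python
def equilibrium_score(theta_max_deg: float, theta_min_deg: float,
                      sway_limit_deg: float = 12.5) -> float:
    """Equilibrium score for one SOT trial (100 = no sway, 0 = fall).

    theta_max_deg / theta_min_deg: peak anterior and posterior center-of-gravity
    sway angles in degrees; 12.5 deg is the assumed theoretical AP sway limit.
    """
    score = 100.0 * (1.0 - (theta_max_deg - theta_min_deg) / sway_limit_deg)
    return max(score, 0.0)  # trials ending in a fall are conventionally scored 0
```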
Intermittent claudication, the most common symptom of peripheral arterial disease, is a cramping pain in the lower limbs caused by muscle ischemia secondary to reduced blood flow to working muscles during physical exertion. Claudicants often display poor balance, making them susceptible to falls and to activity avoidance. The Ankle Brachial Pressure Index (ABPI) is a noninvasive method for indicating the presence of peripheral arterial disease. ABPI is measured as the highest systolic pressure from either the dorsalis pedis or posterior tibial artery divided by the highest brachial artery systolic pressure from either arm. This paper focuses on the use of computerized dynamic posturography in the assessment of balance in claudicants.
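The ABPI calculation described above is a single ratio; this hypothetical helper makes the "highest of each" rule explicit (names and units are ours).

```python
def abpi(dorsalis_pedis: float, posterior_tibial: float,
         brachial_left: float, brachial_right: float) -> float:
    """Ankle Brachial Pressure Index for one leg (pressures in mmHg).

    Highest ankle systolic pressure (dorsalis pedis or posterior tibial)
    divided by the highest brachial systolic pressure from either arm.
    """
    return max(dorsalis_pedis, posterior_tibial) / max(brachial_left, brachial_right)
```

Values below roughly 0.9 are widely used as an indicator of peripheral arterial disease, though diagnostic cutoffs should be taken from clinical guidelines rather than this sketch.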
Medicine, Issue 82, Posture, Computerized dynamic posturography, Ankle brachial pressure index, Peripheral arterial disease, Intermittent claudication, Balance, Posture, EquiTest, Sensory Organization Test, Motor Control Test
Dynamic Visual Tests to Identify and Quantify Visual Damage and Repair Following Demyelination in Optic Neuritis Patients
Institutions: Hadassah Hebrew-University Medical Center.
In order to follow optic neuritis patients and evaluate the effectiveness of their treatment, a handy, accurate and quantifiable tool is required to assess changes in myelination in the central nervous system (CNS). However, standard measurements, including routine visual tests and MRI scans, are not sensitive enough for this purpose. We present two visual tests addressing dynamic monocular and binocular functions which may closely associate with the extent of myelination along the visual pathways: Object From Motion (OFM) extraction and time-constrained stereo protocols. In the OFM test, an array of dots defines a camouflaged object: dots within the object move in one direction (e.g. rightward) while dots outside it move in the opposite direction, or vice versa. The object cannot be detected when the dots are stationary or move as a whole; object recognition is therefore critically dependent on motion perception. In the time-constrained stereo protocol, spatially disparate images are presented for a limited length of time, challenging binocular 3-dimensional integration in time. Both tests are appropriate for clinical use and provide a simple yet powerful way to identify and quantify processes of demyelination and remyelination along the visual pathways. These protocols may be useful for diagnosing and following optic neuritis and multiple sclerosis patients.
In the diagnostic process, these protocols may reveal visual deficits that cannot be identified via current standard visual measurements. Moreover, they may sensitively identify the basis of the currently unexplained continued visual complaints of patients following recovery of visual acuity. In longitudinal follow-up, the protocols can be used as a sensitive marker of demyelinating and remyelinating processes over time. They may therefore be used to evaluate the efficacy of current and evolving therapeutic strategies targeting myelination of the CNS.
Medicine, Issue 86, Optic neuritis, visual impairment, dynamic visual functions, motion perception, stereopsis, demyelination, remyelination
Measuring Oral Fatty Acid Thresholds, Fat Perception, Fatty Food Liking, and Papillae Density in Humans
Institutions: Deakin University.
Emerging evidence from a number of laboratories indicates that humans have the ability to identify fatty acids in the oral cavity, presumably via fatty acid receptors housed on taste cells. Previous research has shown that an individual's oral sensitivity to fatty acid, specifically oleic acid (C18:1), is associated with body mass index (BMI), dietary fat consumption, and the ability to identify fat in foods. We have developed a reliable and reproducible method to assess oral chemoreception of fatty acids, using a milk and C18:1 emulsion together with an ascending forced-choice triangle procedure. In parallel, a food matrix has been developed to assess an individual's ability to perceive fat, in addition to a simple method to assess fatty food liking. As an added measure, tongue photography is used to assess papillae density, with higher density often being associated with increased taste sensitivity.
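The ascending forced-choice triangle procedure yields a detection threshold once a criterion is met at a given concentration. The scoring sketch below assumes a "three consecutive correct identifications at the same concentration" rule, which is an illustrative choice; the actual stopping criterion should follow the published protocol.

```python
def detection_threshold(trials, n_required: int = 3):
    """trials: list of (concentration, correct) tuples in ascending order.

    Returns the first concentration identified correctly n_required times
    in a row, or None if the series ends without meeting the criterion.
    """
    streak, current = 0, None
    for conc, correct in trials:
        if conc != current:
            current, streak = conc, 0
        streak = streak + 1 if correct else 0
        if streak >= n_required:
            return current
    return None
```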
Neuroscience, Issue 88, taste, overweight and obesity, dietary fat, fatty acid, diet, fatty food liking, detection threshold
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Institutions: Cleveland Clinic Foundation, Cleveland Clinic Foundation, Cleveland Clinic Foundation, Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS-related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science, yielding a computer-based platform for precise, valid measurement of MS severity. Based on, but extending, the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested in 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient-reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. It could also be adapted to out-of-clinic settings, such as the patient's home, thereby providing more meaningful real-world data. The MSPT represents a new paradigm for neuroperformance testing.
This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood33. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner41. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration34. This article describes the complete protocol for combining tDCS (NeuroConn MR compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We describe the impact of a protocol that has shown great promise for the treatment of motor dysfunction after stroke, which consists of bilateral stimulation of the primary motor cortices27,30,31. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
Community-based Adapted Tango Dancing for Individuals with Parkinson's Disease and Older Adults
Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.
Adapted tango dancing improves mobility and balance in older adults and in other populations with balance impairments. It is composed of very simple step elements. Adapted tango involves movement initiation and cessation, multi-directional perturbations, and varied speeds and rhythms. Focus on foot placement, whole-body coordination, and attention to partner, path of movement, and aesthetics likely underlies adapted tango's demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology for disseminating the adapted tango teaching methods to dance instructor trainees and for implementation of adapted tango by the trainees in the community for older adults and individuals with Parkinson's Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, tandem stance, Berg Balance Scale, gait speed, and 30 sec chair stand), as well as the safety and fidelity of the program, is maximized through targeted instructor and volunteer training and a structured, detailed syllabus outlining class practices and progression.
Behavior, Issue 94, Dance, tango, balance, pedagogy, dissemination, exercise, older adults, Parkinson's Disease, mobility impairments, falls
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences in WM involvement patterns in different brain diseases, especially neurodegenerative disorders, using different DTI analyses in comparison with matched controls.
DTI data analysis is performed in several complementary ways: voxelwise comparison of regional diffusion-direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level, in order to identify differences in FA along WM structures and to define regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. comparison of FA maps after stereotaxic alignment, in a longitudinal analysis on an individual-subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by controlled elimination of gradient directions with high noise levels.
In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole-brain-based and tract-based DTI analysis.
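The voxelwise FA metric compared in these analyses has a closed form in the three eigenvalues of the diffusion tensor. The formula below is the standard definition; only the function wrapper is ours.

```python
import math

def fractional_anisotropy(l1: float, l2: float, l3: float) -> float:
    """FA from diffusion-tensor eigenvalues: 0 = isotropic, -> 1 = anisotropic."""
    mean = (l1 + l2 + l3) / 3.0
    num = (l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    if den == 0.0:
        return 0.0  # degenerate voxel (all eigenvalues zero)
    return math.sqrt(1.5 * num / den)
```

Equal eigenvalues (isotropic diffusion, e.g. CSF) give FA = 0, while diffusion confined to a single direction approaches FA = 1, as in coherent white-matter tracts.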
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
A Method for Mouse Pancreatic Islet Isolation and Intracellular cAMP Determination
Institutions: University of Wisconsin-Madison, University of Wisconsin-Madison, University of Waterloo.
Uncontrolled glycemia is a hallmark of diabetes mellitus and promotes morbidities like neuropathy, nephropathy, and retinopathy. With the increasing prevalence of diabetes, both immune-mediated type 1 and obesity-linked type 2, studies aimed at delineating diabetes pathophysiology and therapeutic mechanisms are of critical importance. The β-cells of the pancreatic islets of Langerhans are responsible for appropriately secreting insulin in response to elevated blood glucose concentrations. In addition to glucose and other nutrients, the β-cells are also stimulated by specific hormones, termed incretins, which are secreted from the gut in response to a meal and act on β-cell receptors that increase the production of intracellular cyclic adenosine monophosphate (cAMP). Decreased β-cell function, mass, and incretin responsiveness are well understood to contribute to the pathophysiology of type 2 diabetes, and are also being increasingly linked with type 1 diabetes. The present mouse islet isolation and cAMP determination protocol can be a tool to help delineate mechanisms promoting disease progression and therapeutic interventions, particularly those that are mediated by the incretin receptors or related receptors that act through modulation of intracellular cAMP production. While only cAMP measurements will be described, the described islet isolation protocol creates a clean preparation that also allows for many other downstream applications, including glucose stimulated insulin secretion, [3H]-thymidine incorporation, protein abundance, and mRNA expression.
Physiology, Issue 88, islet, isolation, insulin secretion, β-cell, diabetes, cAMP production, mouse
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree, represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16. We present an example of the application of this methodology to FDG PET data of Parkinson's Disease patients and normal controls, using our in-house software to derive a characteristic covariance pattern biomarker of disease.
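The preprocessing chain described above (log transform, mean centering, PCA, scalar subject scores) can be sketched in a few lines of numpy. This is a bare-bones illustration of the linear algebra only; masking, modality scaling, significance testing, and the logistic combination of components are omitted, and the function name is ours.

```python
import numpy as np

def ssm_patterns(images: np.ndarray, n_components: int = 2):
    """images: (n_subjects, n_voxels) array of positive image intensities.

    Returns (patterns, scores): orthogonal voxel patterns (GIS candidates)
    and the scalar expression score of each subject on each pattern.
    """
    logged = np.log(images)                                     # logarithmic conversion
    centered = logged - logged.mean(axis=1, keepdims=True)      # remove subject global means
    residual = centered - centered.mean(axis=0, keepdims=True)  # remove group mean profile
    # PCA via SVD: rows of vt are orthogonal voxel-space principal components
    u, s, vt = np.linalg.svd(residual, full_matrices=False)
    patterns = vt[:n_components]
    scores = residual @ patterns.T   # one scalar score per subject per pattern
    return patterns, scores
```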
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
Corneal Confocal Microscopy: A Novel Non-invasive Technique to Quantify Small Fibre Pathology in Peripheral Neuropathies
Institutions: University of Manchester.
The accurate quantification of peripheral neuropathy is important to define at-risk patients, anticipate deterioration, and assess new therapies. Conventional methods assess neurological deficits, while electrophysiology and quantitative sensory testing quantify functional alterations to detect neuropathy. However, the earliest damage appears to be to the small fibres, yet these tests primarily assess large-fibre dysfunction and have a limited ability to demonstrate regeneration and repair. The only techniques which allow a direct examination of unmyelinated nerve fibre damage and repair are sural nerve biopsy with electron microscopy and skin-punch biopsy. However, both are invasive, require lengthy laboratory procedures, and demand considerable expertise. Corneal confocal microscopy is a non-invasive clinical technique which provides in vivo imaging of corneal nerve fibres. We have demonstrated early nerve damage, which precedes the loss of intraepidermal nerve fibres in skin biopsies, together with stratification of neuropathic severity and repair following pancreas transplantation in diabetic patients. We have also demonstrated nerve damage in idiopathic small-fibre neuropathy and Fabry's disease.
Medicine, Issue 47, Corneal Confocal Microscopy, Corneal nerves, Peripheral Neuropathy, Diabetic Neuropathy
Bronchial Thermoplasty: A Novel Therapeutic Approach to Severe Asthma
Institutions: Virginia Hospital Center, Virginia Hospital Center.
Bronchial thermoplasty is a non-drug procedure for severe persistent asthma that delivers thermal energy to the airway wall in a precisely controlled manner to reduce excessive airway smooth muscle. Reducing airway smooth muscle decreases the ability of the airways to constrict, thereby reducing the frequency of asthma attacks. Bronchial thermoplasty is delivered by the Alair System and is performed in three outpatient procedure visits, each scheduled approximately three weeks apart. The first procedure treats the airways of the right lower lobe, the second treats the airways of the left lower lobe and the third and final procedure treats the airways in both upper lobes. After all three procedures are performed the bronchial thermoplasty treatment is complete.
Bronchial thermoplasty is performed during bronchoscopy with the patient under moderate sedation. All accessible airways distal to the mainstem bronchi between 3 and 10 mm in diameter, with the exception of the right middle lobe, are treated under bronchoscopic visualization. Contiguous and non-overlapping activations of the device are used, moving from distal to proximal along the length of the airway, and systematically from airway to airway as described previously. Although conceptually straightforward, the actual execution of bronchial thermoplasty is quite intricate and procedural duration for the treatment of a single lobe is often substantially longer than encountered during routine bronchoscopy. As such, bronchial thermoplasty should be considered a complex interventional bronchoscopy and is intended for the experienced bronchoscopist. Optimal patient management is critical in any such complex and longer duration bronchoscopic procedure. This article discusses the importance of careful patient selection, patient preparation, patient management, procedure duration, postoperative care and follow-up to ensure that bronchial thermoplasty is performed safely.
Bronchial thermoplasty is expected to complement asthma maintenance medications by providing long-lasting asthma control and improving asthma-related quality of life in patients with severe asthma. In addition, bronchial thermoplasty has been demonstrated to reduce severe exacerbations (asthma attacks), emergency room visits for respiratory symptoms, and time lost from work, school and other daily activities due to asthma.
Medicine, Issue 45, bronchial thermoplasty, severe asthma, airway smooth muscle, bronchoscopy, radiofrequency energy, patient management, moderate sedation
Isolation of Human Islets from Partially Pancreatectomized Patients
Institutions: University Hospital Carl Gustav Carus, University of Technology Dresden, Paul Langerhans Institute Dresden, University Hospital Carl Gustav Carus, University of Technology Dresden.
Investigations into the pathogenesis of type 2 diabetes and islets of Langerhans malfunction1 have been hampered by the limited availability of type 2 diabetic islets from organ donors2. Here we share our protocol for isolating islets from human pancreatic tissue obtained from type 2 diabetic and non-diabetic patients who have undergone partial pancreatectomy due to different pancreatic diseases (benign or malignant pancreatic tumors, chronic pancreatitis, and common bile duct or duodenal tumors). All patients involved gave their consent to this study, which had also been approved by the local ethics committee. The surgical specimens were immediately delivered to the pathologist, who selected soft and healthy-appearing pancreatic tissue for islet isolation, retaining the damaged tissue for diagnostic purposes. We found that to isolate more than 1,000 islets, we had to begin with at least 2 g of pancreatic tissue. Also essential to our protocol was to visibly distend the tissue when injecting the enzyme-containing media and subsequently mince it to aid digestion by increasing the surface area.
To extend the applicability of our protocol to the occasional case in which a large amount (>15 g) of human pancreatic tissue is available, we used a Ricordi chamber (50 ml) to digest the tissue. During digestion, we manually shook the Ricordi chamber3 at an intensity that varied by specimen according to its level of tissue fibrosis. A discontinuous Ficoll gradient was then used to separate the islets from acinar tissue. We noted that the tissue pellet should be small enough to be homogeneously resuspended in Ficoll medium with a density of 1.125 g/ml. After isolation, we cultured the islets under stress-free conditions (no shaking or rotation) with 5% CO2 at 37 °C for at least 48 h in order to facilitate their functional recovery. Widespread application of our protocol and its future improvement could enable the timely harvesting of large quantities of human islets from diabetic and clinically matched non-diabetic subjects, greatly advancing type 2 diabetes research.
Medicine, Issue 53, human islets, Diabetes mellitus, partial pancreatectomy, human islet isolation
Doppler Optical Coherence Tomography of Retinal Circulation
Institutions: Oregon Health and Science University, University of Southern California.
Noncontact retinal blood flow measurements are performed with a Fourier-domain optical coherence tomography (OCT) system using a circumpapillary double circular scan (CDCS) that scans around the optic nerve head at 3.40 mm and 3.75 mm diameters. The double concentric circles are performed 6 times consecutively over 2 sec. The CDCS scan is saved with Doppler shift information from which flow can be calculated. The standard clinical protocol calls for 3 CDCS scans made with the OCT beam passing through the superonasal edge of the pupil and 3 CDCS scans through the inferonasal pupil. This double-angle protocol ensures that an acceptable Doppler angle is obtained on each retinal branch vessel in at least 1 scan. The CDCS scan data, a 3-dimensional volumetric OCT scan of the optic disc, and a color photograph of the optic disc are used together to obtain retinal blood flow measurements on an eye. We have developed blood flow measurement software called "Doppler optical coherence tomography of retinal circulation" (DOCTORC). This semi-automated software is used to measure total retinal blood flow, vessel cross-sectional area, and average blood velocity. The flow in each vessel is calculated from the Doppler shift in the vessel cross-sectional area and the Doppler angle between the vessel and the OCT beam. Total retinal blood flow is summed from the veins around the optic disc. The results obtained at our Doppler OCT reading center showed good reproducibility between graders and methods (<10%). Total retinal blood flow could be useful in the management of glaucoma and other retinal diseases. In glaucoma patients, OCT retinal blood flow measurement was highly correlated with visual field loss (R2 > 0.57 with visual field pattern deviation). Doppler OCT is a new method to perform rapid, noncontact, and repeatable measurement of total retinal blood flow using widely available Fourier-domain OCT instrumentation. This new technology may improve the practicality of making these measurements in clinical studies and routine clinical practice.
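The per-vessel calculation described above (axial velocity from the Doppler shift, corrected by the Doppler angle, multiplied by the vessel cross-sectional area, then summed over the veins) can be sketched as follows. This is a minimal illustration of the underlying formula, not the DOCTORC software itself; the wavelength, refractive index, and measurement values are illustrative assumptions.

```python
import math

def vessel_flow_ul_per_min(doppler_shift_hz, doppler_angle_deg, area_mm2,
                           wavelength_nm=841.0, refractive_index=1.38):
    """Estimate blood flow in one vessel from Doppler OCT measurements.

    Axial velocity from the Doppler shift: v_axial = f_d * lambda0 / (2 * n).
    Dividing by cos(theta), the angle between beam and vessel, gives the
    true velocity along the vessel; flow = mean velocity * cross-section.
    """
    lam_mm = wavelength_nm * 1e-6                                    # nm -> mm
    v_axial = doppler_shift_hz * lam_mm / (2.0 * refractive_index)   # mm/s
    v_true = v_axial / math.cos(math.radians(doppler_angle_deg))     # mm/s
    flow_mm3_per_s = v_true * area_mm2                               # mm^3/s == uL/s
    return flow_mm3_per_s * 60.0                                     # uL/min

# Total retinal blood flow: sum over the veins around the optic disc
# (shift in Hz, Doppler angle in degrees, area in mm^2 -- made-up values).
total = sum(vessel_flow_ul_per_min(f, a, s)
            for f, a, s in [(1200, 80, 0.006), (950, 78, 0.005)])
```

The near-perpendicular Doppler angles (around 80°) reflect why the double-angle pupil protocol matters: as the angle approaches 90°, cos(theta) approaches zero and the velocity estimate becomes unstable, so each vessel needs at least one scan with an acceptable angle.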
Medicine, Issue 67, Ophthalmology, Physics, Doppler optical coherence tomography, total retinal blood flow, dual circular scan pattern, image analysis, semi-automated grading software, optic disc
Rapid Determination of the Thermal Nociceptive Threshold in Diabetic Rats
Institutions: Wright State University, Universidade São Judas Tadeu.
Painful diabetic neuropathy (PDN) is characterized by hyperalgesia, i.e., increased sensitivity to noxious stimuli, and allodynia, i.e., hypersensitivity to normally innocuous stimuli1. Hyperalgesia and allodynia have been studied in many different rodent models of diabetes mellitus2. However, as stated by Bölcskei et al., determination of "pain" in animal models is challenging due to its subjective nature3. Moreover, the traditional methods used to determine behavioral responses to noxious thermal stimuli usually lack reproducibility and pharmacological sensitivity3. For instance, using the hot-plate method of Ankier4, flinch, withdrawal and/or licking of the hind- and/or fore-paws is quantified as reflex latency at a constant high thermal stimulus (52-55 °C). However, animals that are hyperalgesic to thermal stimuli do not reproducibly show differences in reflex latencies at those supra-threshold temperatures3,5. Like the recently described method of Bölcskei et al.6, the procedure described here allows for the rapid, sensitive and reproducible determination of thermal nociceptive thresholds (TNTs) in mice and rats. The method uses a slowly increasing thermal stimulus applied mostly to the skin of the mouse/rat plantar surface. The method is particularly sensitive for studying anti-nociception during hyperalgesic states such as PDN. The procedures described below are based on those published in detail by Almási et al.5 and Bölcskei et al.3. The procedures described here have been approved by the Laboratory Animal Care and Use Committee (LACUC), Wright State University.
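The advantage of the increasing-stimulus design is that the threshold is read off directly: the TNT is simply the stimulus temperature at the moment of the nocifensive response. A minimal sketch of that arithmetic, assuming a linear ramp (the start temperature, ramp rate, and response time below are illustrative, not the published parameters of Almási et al.):

```python
def thermal_nociceptive_threshold(start_temp_c, ramp_c_per_s, response_time_s):
    """TNT = plate temperature at the moment of the nocifensive response,
    given a linear ramp from a non-noxious start temperature."""
    return start_temp_c + ramp_c_per_s * response_time_s

# Illustrative: ramp from 30 degC at 1 degC/s, response after 15.4 s.
tnt = thermal_nociceptive_threshold(30.0, 1.0, 15.4)   # 45.4 degC
```

A hyperalgesic animal responds earlier in the ramp and therefore registers a lower threshold temperature, which is why this readout separates hyperalgesic from control animals more reproducibly than reflex latencies at a fixed supra-threshold temperature.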
Neuroscience, Issue 63, Diabetes, painful diabetic neuropathy, nociception, thermal nociceptive threshold, nocifensive behavior
Tilt Testing with Combined Lower Body Negative Pressure: a "Gold Standard" for Measuring Orthostatic Tolerance
Institutions: Simon Fraser University.
Orthostatic tolerance (OT) refers to the ability to maintain cardiovascular stability when upright, against the hydrostatic effects of gravity, and hence to maintain cerebral perfusion and prevent syncope (fainting). Various techniques are available to assess OT and the effects of gravitational stress upon the circulation, typically by reproducing a presyncopal event (near-fainting episode) in a controlled laboratory environment. The time and/or degree of stress required to provoke this response provides the measure of OT. Any technique used to determine OT should: enable distinction between patients with orthostatic intolerance (of various causes) and asymptomatic control subjects; be highly reproducible, enabling evaluation of therapeutic interventions; and avoid invasive procedures, which are known to impair OT1.
In the late 1980s, head-upright tilt testing was first utilized for diagnosing syncope2. Since then it has been used to assess OT in patients with syncope of unknown cause, as well as in healthy subjects to study postural cardiovascular reflexes2-6. Tilting protocols comprise three categories: passive tilt; passive tilt accompanied by pharmacological provocation; and passive tilt combined with lower body negative pressure (LBNP). However, the effects of tilt testing (and other orthostatic stress testing modalities) are often poorly reproducible, with low sensitivity and specificity to diagnose orthostatic intolerance7.
Typically, a passive tilt includes 20-60 min of orthostatic stress, continued until the onset of presyncope in patients2-6. However, the main drawback of this procedure is its inability to invoke presyncope in all individuals undergoing the test, and its correspondingly low sensitivity8,9. Thus, different methods were explored to increase the orthostatic stress and improve sensitivity.
Pharmacological provocation has been used to increase the orthostatic challenge, for example using isoprenaline4,7,10,11 or sublingual nitrate12,13. However, the main drawback of these approaches is an increase in sensitivity at the cost of an unacceptable decrease in specificity10,14, with a high positive response rate immediately after administration15. Furthermore, invasive procedures associated with some pharmacological provocations greatly increase the false positive rate1.
Another approach is to combine passive tilt testing with LBNP, providing a stronger orthostatic stress without invasive procedures or drug side effects, using the technique pioneered by Professor Roger Hainsworth in the 1990s16-18. This approach provokes presyncope in almost all subjects (allowing for symptom recognition in patients with syncope), while discriminating between patients with syncope and healthy controls, with a specificity of 92%, sensitivity of 85%, and repeatability of 1.1±0.6 min16,17. This allows not only diagnosis and pathophysiological assessment19-22, but also the evaluation of treatments for orthostatic intolerance, due to its high repeatability23-30. For these reasons, we argue that this should be the "gold standard" for orthostatic stress testing, and accordingly this is the method described in this paper.
Medicine, Issue 73, Anatomy, Physiology, Biomedical Engineering, Neurobiology, Kinesiology, Cardiology, tilt test, lower body negative pressure, orthostatic stress, syncope, orthostatic tolerance, fainting, gravitational stress, head upright, stroke, clinical techniques
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3-6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL, in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing, is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
A Zebrafish Model of Diabetes Mellitus and Metabolic Memory
Institutions: Rosalind Franklin University of Medicine and Science.
Diabetes mellitus currently affects 346 million individuals and this is projected to increase to 400 million by 2030. Evidence from both the laboratory and large scale clinical trials has revealed that diabetic complications progress unimpeded via the phenomenon of metabolic memory even when glycemic control is pharmaceutically achieved. Gene expression can be stably altered through epigenetic changes which not only allow cells and organisms to quickly respond to changing environmental stimuli but also confer the ability of the cell to "memorize" these encounters once the stimulus is removed. As such, the roles that these mechanisms play in the metabolic memory phenomenon are currently being examined.
We have recently reported the development of a zebrafish model of type I diabetes mellitus and characterized this model to show that diabetic zebrafish not only display the known secondary complications, including the changes associated with diabetic retinopathy, diabetic nephropathy and impaired wound healing, but also exhibit impaired caudal fin regeneration. This model is unique in that the zebrafish is capable of regenerating its damaged pancreas and restoring a euglycemic state, similar to what would be expected in post-transplant human patients. Moreover, multiple rounds of caudal fin amputation allow for the separation and study of pure epigenetic effects in an in vivo system without potential complicating factors from the previous diabetic state. Although euglycemia is achieved following pancreatic regeneration, the diabetic secondary complications of impaired fin regeneration and skin wound healing persist indefinitely. In the case of impaired fin regeneration, this pathology is retained even after multiple rounds of fin regeneration in the daughter fin tissues. These observations point to an underlying epigenetic process in the metabolic memory state. Here we present the methods needed to successfully generate the diabetic and metabolic memory groups of fish and discuss the advantages of this model.
Medicine, Issue 72, Genetics, Genomics, Physiology, Anatomy, Biomedical Engineering, Metabolomics, Zebrafish, diabetes, metabolic memory, tissue regeneration, streptozocin, epigenetics, Danio rerio, animal model, diabetes mellitus, diabetes, drug discovery, hyperglycemia
Improving IV Insulin Administration in a Community Hospital
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in hospitalized patients, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes.1-4
The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5
It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance.
The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6
Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia.
Despite multiple revisions of a paper-based IV insulin protocol, analysis of data from its usage at WMC showed that, in terms of achieving normoglycemia while minimizing hypoglycemia, results were suboptimal. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from the paper IV insulin protocol to a computerized glucose management system. By comparing blood glucose levels under the paper protocol to those under the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was witnessed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computerized glucose management system was well under 1%.
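The glycemic categories used above (severe hypoglycemia BG < 40 mg/dL, clinical hypoglycemia BG < 70 mg/dL, hyperglycemia BG > 180 mg/dL) and a time-in-target metric can be sketched as follows. The function names and sample readings are illustrative, not part of the computerized glucose management system:

```python
def classify_bg(bg_mg_dl, target=(70, 180)):
    """Classify one blood glucose reading (mg/dL) per the thresholds above."""
    if bg_mg_dl < 40:
        return "severe hypoglycemia"
    if bg_mg_dl < target[0]:
        return "clinical hypoglycemia"
    if bg_mg_dl > target[1]:
        return "hyperglycemia"
    return "in target range"

def time_in_range(readings, target=(70, 180)):
    """Fraction of readings falling inside the target range."""
    in_range = sum(target[0] <= bg <= target[1] for bg in readings)
    return in_range / len(readings)

# Illustrative sample: 5 of 8 readings fall in the 70-180 mg/dL target.
readings = [65, 110, 150, 185, 38, 120, 140, 95]
pct = time_in_range(readings)   # 0.625
```

Evaluating a protocol on this kind of per-reading summary is what the before/after comparison above rests on: the target fraction should rise while the hypoglycemia categories shrink.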
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
Protocols for Oral Infection of Lepidopteran Larvae with Baculovirus
Institutions: Iowa State University.
Baculoviruses are widely used both as protein expression vectors and as insect pest control agents. This video shows how lepidopteran larvae can be infected with polyhedra by droplet feeding and diet plug-based bioassays. This accompanying Springer Protocols section provides an overview of the baculovirus lifecycle and use of baculoviruses as insecticidal agents, including discussion of the pros and cons for use of baculoviruses as insecticides, and progress made in genetic enhancement of baculoviruses for improved insecticidal efficacy.
Plant Biology, Issue 19, Springer Protocols, Baculovirus insecticides, recombinant baculovirus, insect pest management
Loading Drosophila Nerve Terminals with Calcium Indicators
Institutions: University of Texas Health Science Center at San Antonio (UTHSCSA).
Calcium plays many roles in the nervous system, but none more impressive than as the trigger for neurotransmitter release, and none more profound than as the messenger essential for the synaptic plasticity that supports learning and memory. To further elucidate the molecular underpinnings of Ca2+-dependent synaptic mechanisms, a model system is required that is both genetically malleable and physiologically accessible. Drosophila melanogaster provides such a model. In this system, genetically-encoded fluorescent indicators are available to detect Ca2+ changes in nerve terminals. However, these indicators have limited sensitivity to Ca2+ and often show a non-linear response. Synthetic fluorescent indicators are better suited for measuring the rapid Ca2+ changes associated with nerve activity. Here we demonstrate a technique for loading dextran-conjugated synthetic Ca2+ indicators into live nerve terminals in Drosophila larvae. Particular emphasis is placed on those aspects of the protocol most critical to the technique's success, such as how to avoid static electricity discharges along the isolated nerves, maintaining the health of the preparation during extended loading periods, and ensuring axon survival by providing Ca2+ to promote sealing of severed axon endings. Low-affinity dextran-conjugated Ca2+ indicators, such as fluo-4 and rhod, are available which show a high signal-to-noise ratio while minimally disrupting presynaptic Ca2+ dynamics. Dextran conjugation helps prevent Ca2+ indicators from being sequestered into organelles such as mitochondria. The loading technique can be applied equally to larvae, embryos and adults.
Neuroscience, Issue 6, Drosophila, neuron, imaging