Muscle is a dynamic tissue that responds to changes in nutrition, exercise, and disease state. The loss of muscle mass and function with disease and age is a significant public health burden. We currently understand little about the genetic regulation of muscle health with disease or age. The nematode C. elegans is an established model for understanding the genomic regulation of biological processes of interest. This worm's body wall muscles display a large degree of homology with the muscles of higher metazoan species. Since C. elegans is a transparent organism, the localization of GFP to mitochondria and sarcomeres allows visualization of these structures in vivo. Similarly, feeding animals cationic dyes, which accumulate based on the existence of a mitochondrial membrane potential, allows the assessment of mitochondrial function in vivo. These methods, along with assessment of muscle protein homeostasis, are combined with whole-animal measures of muscle function, in the form of movement assays, to allow correlation of sub-cellular defects with functional measures of muscle performance. Thus, C. elegans provides a powerful platform with which to assess the impact of mutations, gene knockdown, and/or chemical compounds upon muscle structure and function. Lastly, as GFP, cationic dyes, and movement assays are assessed non-invasively, prospective studies of muscle structure and function can be conducted across the whole life course, something that at present cannot easily be investigated in vivo in any other organism.
Direct Pressure Monitoring Accurately Predicts Pulmonary Vein Occlusion During Cryoballoon Ablation
Institutions: Piedmont Heart Institute, Medtronic Inc.
Cryoballoon ablation (CBA) is an established therapy for atrial fibrillation (AF). Pulmonary vein (PV) occlusion is essential for achieving antral contact and PV isolation and is typically assessed by contrast injection. We present a novel method of direct pressure monitoring for assessment of PV occlusion.
Transcatheter pressure is monitored during balloon advancement to the PV antrum. Pressure is recorded via a single pressure transducer connected to the inner lumen of the cryoballoon. Pressure curve characteristics are used to assess occlusion in conjunction with fluoroscopic or intracardiac echocardiography (ICE) guidance. PV occlusion is confirmed when loss of the typical left atrial (LA) pressure waveform is observed, with recording of pulmonary artery (PA) pressure characteristics (no A wave and rapid V wave upstroke). Complete pulmonary vein occlusion as assessed with this technique was confirmed by concurrent contrast injection during initial testing and has been shown to be highly accurate and readily reproducible.
We evaluated the efficacy of this novel technique in 35 patients. A total of 128 veins were assessed for occlusion with the cryoballoon utilizing the pressure monitoring technique; occlusive pressure was demonstrated in 113 veins with resultant successful pulmonary vein isolation in 111 veins (98.2%). Occlusion was confirmed with subsequent contrast injection during the initial ten procedures, after which contrast utilization was rapidly reduced or eliminated given the highly accurate identification of occlusive pressure waveform with limited initial training.
Verification of PV occlusive pressure during CBA is a novel approach to assessing effective PV occlusion, and it accurately predicts electrical isolation. Utilization of this method results in a significant decrease in fluoroscopy time and volume of contrast.
Medicine, Issue 72, Anatomy, Physiology, Cardiology, Biomedical Engineering, Surgery, Cardiovascular System, Cardiovascular Diseases, Surgical Procedures, Operative, Investigative Techniques, Atrial fibrillation, Cryoballoon Ablation, Pulmonary Vein Occlusion, Pulmonary Vein Isolation, electrophysiology, catheterization, heart, vein, clinical, surgical device, surgical techniques
Implantation of the Syncardia Total Artificial Heart
Institutions: Virginia Commonwealth University.
With advances in technology, the use of mechanical circulatory support devices for end stage heart failure has rapidly increased. The vast majority of such patients are generally well served by left ventricular assist devices (LVADs). However, a subset of patients with late stage biventricular failure or other significant anatomic lesions are not adequately treated by isolated left ventricular mechanical support. Examples of concomitant cardiac pathology that may be better treated by resection and total artificial heart (TAH) replacement include: post infarction ventricular septal defect, aortic root aneurysm / dissection, cardiac allograft failure, massive ventricular thrombus, refractory malignant arrhythmias (independent of filling pressures), hypertrophic / restrictive cardiomyopathy, and complex congenital heart disease. Patients often present with cardiogenic shock and multisystem organ dysfunction. Excision of both ventricles and orthotopic replacement with a TAH is an effective, albeit extreme, therapy for rapid restoration of blood flow and resuscitation. Perioperative management is focused on end organ resuscitation and physical rehabilitation. In addition to the usual concerns of infection, bleeding, and thromboembolism common to all mechanically supported patients, TAH patients face unique risks with regard to renal failure and anemia. Supplementation of the abrupt decrease in brain natriuretic peptide following ventriculectomy appears to have protective renal effects. Anemia following TAH implantation can be profound and persistent. Nonetheless, the anemia is generally well tolerated, and transfusions are limited to avoid HLA sensitization. Until recently, TAH patients were confined as inpatients tethered to a 500 lb pneumatic console driver. The recent introduction of a backpack sized portable driver (currently under clinical trial) has enabled patients to be discharged home and even return to work.
Despite the profound presentation of these sick patients, there is a 79-87% success rate in bridging to transplantation.
Medicine, Issue 89, mechanical circulatory support, total artificial heart, biventricular failure, operative techniques
Prehospital Thrombolysis: A Manual from Berlin
Institutions: Charité - Universitätsmedizin Berlin, Universitätsklinikum Hamburg - Eppendorf, Berliner Feuerwehr, STEMO-Consortium.
In acute ischemic stroke, time from symptom onset to intervention is a decisive prognostic factor. In order to reduce this time, prehospital thrombolysis at the emergency site would be preferable. However, apart from neurological expertise and laboratory investigations, a computed tomography (CT) scan is necessary to exclude hemorrhagic stroke prior to thrombolysis. Therefore, a specialized ambulance equipped with a CT scanner and point-of-care laboratory was designed and constructed. Further, a new stroke-identifying interview algorithm was developed and implemented in the Berlin emergency medical services. Since February 2011, the identification of suspected stroke in the dispatch center of the Berlin Fire Brigade prompts the deployment of this ambulance, the stroke emergency mobile (STEMO). On arrival, a neurologist, experienced in stroke care and with additional training in emergency medicine, performs a neurological examination. If stroke is suspected, a CT scan excludes intracranial hemorrhage. The CT scans are telemetrically transmitted to the neuroradiologist on call. If the patient's coagulation status is normal and the medical history reveals no contraindication, prehospital thrombolysis is applied according to current guidelines (intravenous recombinant tissue plasminogen activator, iv rtPA, alteplase, Actilyse).
Thereafter, patients are transported to the nearest hospital with a certified stroke unit for further treatment and assessment of stroke aetiology. After a pilot phase, weeks were randomized into blocks either with or without STEMO care. The primary end-point of this study is time from alarm to the initiation of thrombolysis. We hypothesized that alarm-to-treatment time could be reduced by at least 20 min compared to regular care.
Medicine, Issue 81, Telemedicine, Emergency Medical Services, Stroke, Tomography, X-Ray Computed, Emergency Treatment, stroke, thrombolysis, prehospital, emergency medical services, ambulance
Using Continuous Data Tracking Technology to Study Exercise Adherence in Pulmonary Rehabilitation
Institutions: Concordia University, Hôpital du Sacré-Coeur de Montréal.
Pulmonary rehabilitation (PR) is an important component in the management of respiratory diseases. The effectiveness of PR is dependent upon adherence to exercise training recommendations. The study of exercise adherence is thus a key step towards the optimization of PR programs. To date, mostly indirect measures, such as rates of participation, completion, and attendance, have been used to determine adherence to PR. The purpose of the present protocol is to describe how continuous data tracking technology can be used to measure adherence to a prescribed aerobic training intensity on a second-by-second basis.
In our investigations, adherence has been defined as the percent time spent within a specified target heart rate range. As such, using a combination of hardware and software, heart rate is measured, tracked, and recorded during cycling second-by-second for each participant, for each exercise session. Using statistical software, the data is subsequently extracted and analyzed. The same protocol can be applied to determine adherence to other measures of exercise intensity, such as time spent at a specified wattage, level, or speed on the cycle ergometer. Furthermore, the hardware and software are also available to measure adherence to other modes of training, such as the treadmill, elliptical, stepper, and arm ergometer. The present protocol, therefore, has a vast applicability to directly measure adherence to aerobic exercise.
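As a minimal sketch of the adherence metric described above (percent time within a specified target heart rate range), the Python fragment below assumes heart rate is sampled once per second; the function and variable names are illustrative, not part of the tracking software used in the protocol.

```python
from typing import Sequence

def percent_time_in_zone(hr_samples: Sequence[float],
                         hr_low: float, hr_high: float) -> float:
    """Percent of 1 Hz heart-rate samples falling within [hr_low, hr_high]."""
    if not hr_samples:
        raise ValueError("no samples recorded")
    in_zone = sum(hr_low <= hr <= hr_high for hr in hr_samples)
    return 100.0 * in_zone / len(hr_samples)

# Example: a 10 s stretch of a session with a target zone of 110-130 bpm
session = [105, 112, 118, 125, 131, 128, 122, 119, 109, 115]
print(round(percent_time_in_zone(session, 110, 130), 1))  # 70.0
```

The same function applies unchanged to other intensity measures (wattage, level, speed) by substituting the sample stream and the target range.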
Medicine, Issue 81, Data tracking, exercise, rehabilitation, adherence, patient compliance, health behavior, user-computer interface.
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
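The protocol-control and data-analysis code described above is MATLAB-based and open source; as a language-agnostic illustration of the kind of harvesting step described (tallying time-stamped behavioral events per day), a sketch in Python might look like the following. The record layout and hopper codes are assumptions for illustration only.

```python
from collections import Counter

def daily_entry_counts(events, day_length_s=86_400):
    """Tally head entries per hopper per day from (timestamp_s, hopper_id) records."""
    counts = {}
    for t, hopper in events:
        day = int(t // day_length_s)  # day index since the start of recording
        counts.setdefault(day, Counter())[hopper] += 1
    return counts

# Illustrative event record: hopper IDs 1-3, timestamps in seconds
events = [(100, 1), (200, 2), (86_500, 1), (90_000, 1)]
print(daily_entry_counts(events))  # {0: Counter({1: 1, 2: 1}), 1: Counter({1: 2})}
```

Keeping the raw time-stamped records and deriving such tallies on demand mirrors the "full data trail" design goal stated above.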
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Dynamic Visual Tests to Identify and Quantify Visual Damage and Repair Following Demyelination in Optic Neuritis Patients
Institutions: Hadassah Hebrew-University Medical Center.
In order to follow optic neuritis patients and evaluate the effectiveness of their treatment, a handy, accurate and quantifiable tool is required to assess changes in myelination in the central nervous system (CNS). However, standard measurements, including routine visual tests and MRI scans, are not sensitive enough for this purpose. We present two visual tests addressing dynamic monocular and binocular functions which may closely associate with the extent of myelination along visual pathways. These are the Object From Motion (OFM) extraction and Time-constrained Stereo protocols. In the OFM test, an array of dots composes a camouflaged object: the dots within the object move rightward while the dots outside it move leftward, or vice versa. The object cannot be detected when the dots are stationary or moving as a whole; object recognition is thus critically dependent on motion perception. In the Time-constrained Stereo protocol, spatially disparate images are presented for a limited length of time, challenging binocular 3-dimensional integration in time. Both tests are appropriate for clinical usage and provide a simple, yet powerful, way to identify and quantify processes of demyelination and remyelination along visual pathways. These protocols may be efficient tools to diagnose and follow optic neuritis and multiple sclerosis patients.
In the diagnostic process, these protocols may reveal visual deficits that cannot be identified via current standard visual measurements. Moreover, they sensitively identify the basis of the currently unexplained continued visual complaints of patients following recovery of visual acuity. In longitudinal follow-up, the protocols can be used as sensitive markers of demyelinating and remyelinating processes over time. They may therefore be used to evaluate the efficacy of current and evolving therapeutic strategies targeting myelination of the CNS.
Medicine, Issue 86, Optic neuritis, visual impairment, dynamic visual functions, motion perception, stereopsis, demyelination, remyelination
Reduced Itraconazole Concentration and Durations Are Successful in Treating Batrachochytrium dendrobatidis Infection in Amphibians
Institutions: James Cook University.
Amphibians are experiencing the greatest decline of any vertebrate class, and a leading cause of these declines is a fungal pathogen, Batrachochytrium dendrobatidis (Bd), which causes the disease chytridiomycosis. Captive assurance colonies are important worldwide for threatened amphibian species and may be the only lifeline for those in critical threat of extinction. Maintaining disease-free colonies is a priority of captive managers, yet safe and effective treatments for all species and across life stages have not been identified. The most widely used chemotherapeutic treatment is itraconazole, although the dosage commonly used can be harmful to some individuals and species. We performed a clinical treatment trial to assess whether a lower and safer but still effective dose of itraconazole could be found to cure Bd infections. We found that by reducing the treatment concentration from 0.01% to 0.0025% and reducing the treatment duration from 11 to 6 days of 5 min baths, frogs could be cured of Bd infection with fewer side effects and less treatment-associated mortality.
Immunology, Issue 85, Batrachochytrium dendrobatidis, itraconazole, chytridiomycosis, captive assurance colonies, amphibian conservation
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Institutions: Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending, the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested in 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out-of-clinic settings, like the patient's home, thereby providing more meaningful real world data. The MSPT represents a new paradigm for neuroperformance testing.
This method could have the same transformative effect on clinical care and research in MS as standardized computerized adaptive testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
qPCR Is a Sensitive and Rapid Method for Detection of Cytomegaloviral DNA in Formalin-fixed, Paraffin-embedded Biopsy Tissue
Institutions: Indiana University School of Medicine, Indiana University Health.
It is crucial to identify cytomegalovirus (CMV) infection in the gastrointestinal (GI) tract of immunosuppressed patients, given their greater risk for developing severe infection. Many laboratory methods for the detection of CMV infection have been developed, including serology, viral culture, and molecular methods. Often, these methods reflect systemic involvement with CMV and do not specifically identify local tissue involvement. Therefore, detection of CMV infection in the GI tract is frequently done by traditional histology of biopsy tissue. Hematoxylin and eosin (H&E) staining in conjunction with immunohistochemistry (IHC) have remained the mainstays of examining these biopsies. H&E and IHC sometimes result in atypical (equivocal) staining patterns, making interpretation difficult. It has been shown that quantitative polymerase chain reaction (qPCR) for CMV can be successfully performed on formalin-fixed, paraffin-embedded (FFPE) biopsy tissue with very high sensitivity and specificity. The goal of this protocol is to demonstrate how to perform qPCR testing for the detection of CMV in FFPE biopsy tissue in a clinical laboratory setting. This method is likely to be of great benefit for patients in cases of equivocal staining for CMV in GI biopsies.
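As a rough illustration of how a quantitative read-out might be interpreted downstream, the sketch below applies a simple cycle-threshold (Ct) cut-off to classify a qPCR result. The numeric cut-offs and the function name are illustrative assumptions, not the validated thresholds of this protocol; a clinical laboratory would use cut-offs validated for its own FFPE assay.

```python
def call_cmv(ct, positive_max=37.0, equivocal_max=40.0):
    """Interpret a qPCR cycle-threshold (Ct) value for CMV detection.

    Thresholds are illustrative placeholders only, not validated cut-offs.
    ct is None when no amplification occurred within the run.
    """
    if ct is None:
        return "negative"
    if ct <= positive_max:
        return "positive"
    if ct <= equivocal_max:
        return "equivocal"  # late amplification: repeat or review histology
    return "negative"

print(call_cmv(31.2))  # positive
print(call_cmv(38.5))  # equivocal
print(call_cmv(None))  # negative
```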
Genetics, Issue 89, qPCR, cytomegalovirus, CMV, biopsy, real-time PCR, gastrointestinal, formalin-fixed, paraffin-embedded tissue
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin-embedded stained electron tomography, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
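As a minimal illustration of the semi-automated category (approach 3), the NumPy sketch below applies a global intensity threshold to a synthetic volume and reports the fraction of the volume occupied by the segmented region, one of the data set characteristics mentioned above. Real EM data would require noise filtering and interactive refinement; the threshold value here is an arbitrary assumption.

```python
import numpy as np

def threshold_segment(volume: np.ndarray, level: float) -> np.ndarray:
    """Binary voxel mask of intensities above a global threshold."""
    return volume > level

# Synthetic 3D volume: a bright 5x5x5 cube inside a dark 20x20x20 background
vol = np.zeros((20, 20, 20))
vol[5:10, 5:10, 5:10] = 1.0

mask = threshold_segment(vol, 0.5)
print(mask.sum(), mask.size)        # 125 8000
fraction = mask.sum() / mask.size   # region-of-interest occupancy of the volume
```

The resulting mask would then be passed to surface rendering and quantitative analysis, as in the pipeline described above.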
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Ischemic Tissue Injury in the Dorsal Skinfold Chamber of the Mouse: A Skin Flap Model to Investigate Acute Persistent Ischemia
Institutions: Technische Universität München, University Hospital of Basel, University of Saarland, University Hospital Zurich.
Despite profound expertise and advanced surgical techniques, ischemia-induced complications ranging from wound breakdown to extensive tissue necrosis are still occurring, particularly in reconstructive flap surgery. Multiple experimental flap models have been developed to analyze underlying causes and mechanisms and to investigate treatment strategies to prevent ischemic complications. The limiting factor of most models is the lack of a means to directly and repetitively visualize microvascular architecture and hemodynamics. The goal of this protocol is to present a well-established mouse model that provides these missing capabilities. Harder et al. have developed a model of a musculocutaneous flap with a random perfusion pattern that undergoes acute persistent ischemia and results in ~50% necrosis after 10 days if left untreated. With the aid of intravital epi-fluorescence microscopy, this chamber model allows repetitive visualization of morphology and hemodynamics in different regions of interest over time. Associated processes such as apoptosis, inflammation, microvascular leakage and angiogenesis can be investigated and correlated with immunohistochemical and molecular protein assays. To date, the model has proven feasible and reproducible in several published experimental studies investigating the effect of pre-, peri- and postconditioning of ischemically challenged tissue.
Medicine, Issue 93, flap, ischemia, microcirculation, angiogenesis, skin, necrosis, inflammation, apoptosis, preconditioning, persistent ischemia, in vivo model, muscle.
Performing Behavioral Tasks in Subjects with Intracranial Electrodes
Institutions: Cleveland Clinic Foundation, Johns Hopkins University.
Patients with stereo-electroencephalography (SEEG), subdural grid, or depth electrode implants have a multitude of electrodes implanted in different areas of their brain for the localization of their seizure focus and eloquent areas. After implantation, the patient must remain in the hospital until the pathological area of brain is found and possibly resected. During this time, these patients offer a unique opportunity to the research community because any number of behavioral paradigms can be performed to uncover the neural correlates that guide behavior. Here we present a method for recording brain activity from intracranial implants as subjects perform a behavioral task designed to assess decision-making and reward encoding. All electrophysiological data from the intracranial electrodes are recorded during the behavioral task, allowing for the examination of the many brain areas involved in a single function at time scales relevant to behavior. Moreover, and unlike animal studies, human patients can learn a wide variety of behavioral tasks quickly, allowing for the ability to perform more than one task in the same subject or to perform controls. Despite the many advantages of this technique for understanding human brain function, there are also methodological limitations that we discuss, including environmental factors, analgesic effects, time constraints and recordings from diseased tissue. This method may be easily implemented by any institution that performs intracranial assessments, providing the opportunity to directly examine human brain function during behavior.
Behavior, Issue 92, Cognitive neuroscience, Epilepsy, Stereo-electroencephalography, Subdural grids, Behavioral method, Electrophysiology
Preparation of a Blood Culture Pellet for Rapid Bacterial Identification and Antibiotic Susceptibility Testing
Institutions: University Hospital Center and University of Lausanne.
Bloodstream infections and sepsis are a major cause of morbidity and mortality. The successful outcome of patients suffering from bacteremia depends on a rapid identification of the infectious agent to guide optimal antibiotic treatment. The analysis of Gram stains from positive blood cultures can be rapidly conducted and already significantly impacts the antibiotic regimen. However, the accurate identification of the infectious agent is still required to establish the optimal targeted treatment. We present here a simple and fast bacterial pellet preparation from a positive blood culture that can be used as a sample for several essential downstream applications, such as identification by MALDI-TOF MS, antibiotic susceptibility testing (AST) by disc diffusion assay or automated AST systems, and automated PCR-based diagnostic testing. The performance of these different identification and AST systems applied directly to the blood culture bacterial pellets is very similar to the performance normally obtained from isolated colonies grown on agar plates. Compared to conventional approaches, the rapid acquisition of a bacterial pellet significantly reduces the time to report both identification and AST. Thus, following blood culture positivity, identification by MALDI-TOF can be reported within less than 1 hr, whereas results of AST by automated AST systems or disc diffusion assays can be reported within 8 or 18 hr, respectively. Similarly, the results of a rapid PCR-based assay can be communicated to the clinicians less than 2 hr following the report of a bacteremia. Together, these results demonstrate that the rapid preparation of a blood culture bacterial pellet has a significant impact on the identification and AST turnaround time and thus on the successful outcome of patients suffering from bloodstream infections.
Immunology, Issue 92, blood culture, bacteriology, identification, antibiotic susceptibility testing, MALDI-TOF MS.
A Research Method For Detecting Transient Myocardial Ischemia In Patients With Suspected Acute Coronary Syndrome Using Continuous ST-segment Analysis
Institutions: University of Nevada, Reno, St. Joseph's Medical Center, University of Rochester Medical Center .
Each year, an estimated 785,000 Americans will have a new coronary attack, or acute coronary syndrome (ACS). The pathophysiology of ACS involves rupture of an atherosclerotic plaque; hence, treatment is aimed at plaque stabilization in order to prevent cellular death. However, there is considerable debate among clinicians about which treatment pathway is best: early invasive using percutaneous coronary intervention (PCI/stent) when indicated, or a conservative approach (i.e., medication only, with PCI/stent if recurrent symptoms occur).
There are three types of ACS: ST elevation myocardial infarction (STEMI), non-ST elevation MI (NSTEMI), and unstable angina (UA). Among the three types, NSTEMI/UA is nearly four times as common as STEMI. Treatment decisions for NSTEMI/UA are based largely on symptoms and resting or exercise electrocardiograms (ECG). However, because of the dynamic and unpredictable nature of the atherosclerotic plaque, these methods often underdetect myocardial ischemia, because symptoms are unreliable and/or continuous ECG monitoring is not utilized.
Continuous 12-lead ECG monitoring, which is both inexpensive and non-invasive, can identify transient episodes of myocardial ischemia, a precursor to MI, even when asymptomatic. However, continuous 12-lead ECG monitoring is not usual hospital practice; rather, only two leads are typically monitored. Information obtained with 12-lead ECG monitoring might provide useful information for deciding the best ACS treatment.
Therefore, using 12-lead ECG monitoring, the COMPARE Study (electroCardiographic evaluatiOn of ischeMia comParing invAsive to phaRmacological trEatment) was designed to assess the frequency and clinical consequences of transient myocardial ischemia in patients with NSTEMI/UA treated with either early invasive PCI/stent or managed conservatively (medications or PCI/stent following recurrent symptoms). The purpose of this manuscript is to describe the methodology used in the COMPARE Study.
Permission to proceed with this study was obtained from the Institutional Review Board of the hospital and the university. Research nurses identify hospitalized patients from the emergency department and telemetry unit with suspected ACS. Once consent is obtained, a 12-lead ECG Holter monitor is applied and remains in place during the patient's entire hospital stay. Patients are also maintained on the routine bedside ECG monitoring system per hospital protocol. Off-line ECG analysis is done using sophisticated software and careful human oversight.
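The study's exact ischemia criteria are not restated in this excerpt; as a generic sketch of the kind of offline ST-segment analysis described, the fragment below flags episodes in which absolute ST deviation stays at or above a cut-off for at least a minimum duration. The 100 µV and 60 s values, the 1 Hz sampling rate, and the function name are illustrative assumptions, not the COMPARE Study definitions.

```python
def ischemic_episodes(st_uV, fs_hz=1.0, dev_uV=100.0, min_s=60.0):
    """Return (start_s, end_s) episodes where |ST deviation| >= dev_uV for >= min_s.

    st_uV: list of beat-averaged ST-segment deviations in microvolts.
    All criterion values are illustrative placeholders.
    """
    episodes, start = [], None
    for i, v in enumerate(st_uV + [0.0]):  # trailing sentinel closes an open run
        if abs(v) >= dev_uV and start is None:
            start = i
        elif abs(v) < dev_uV and start is not None:
            if (i - start) / fs_hz >= min_s:
                episodes.append((start / fs_hz, i / fs_hz))
            start = None
    return episodes

# 1 Hz samples: a 90 s run of -150 uV depression qualifies; a 30 s run does not
trace = [0.0] * 10 + [-150.0] * 90 + [0.0] * 20 + [120.0] * 30 + [0.0] * 5
print(ischemic_episodes(trace))  # [(10.0, 100.0)]
```

Such an automated pass would then be reviewed under the careful human oversight described above.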
Medicine, Issue 70, Anatomy, Physiology, Cardiology, Myocardial Ischemia, Cardiovascular Diseases, Health Occupations, Health Care, transient myocardial ischemia, Acute Coronary Syndrome, electrocardiogram, ST-segment monitoring, Holter monitoring, research methodology
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3,4,5,6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL, in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing, is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
The NeuroStar TMS Device: Conducting the FDA Approved Protocol for Treatment of Depression
Institutions: Beth Israel Deaconess Medical Center, Inc..
The Neuronetics NeuroStar Transcranial Magnetic Stimulation (TMS) System is a class II medical device that produces brief duration, pulsed magnetic fields. These rapidly alternating fields induce electrical currents within localized, targeted regions of the cortex which are associated with various physiological and functional brain changes.1,2,3
In 2007, O'Reardon et al., utilizing the NeuroStar device, published the results of an industry-sponsored, multisite, randomized, sham-stimulation controlled clinical trial in which 301 patients with major depression, who had previously failed to respond to at least one adequate antidepressant treatment trial, underwent either active or sham TMS over the left dorsolateral prefrontal cortex (DLPFC). The patients, who were medication-free at the time of the study, received TMS five times per week over 4-6 weeks.4
The results demonstrated that a sub-population of patients (those who were relatively less resistant to medication, having failed not more than two good pharmacologic trials) showed a statistically significant improvement on the Montgomery-Asberg Depression Scale (MADRS), the Hamilton Depression Rating Scale (HAMD), and various other outcome measures. In October 2008, supported by these and other similar results5,6,7, Neuronetics obtained the first and only Food and Drug Administration (FDA) approval for the clinical treatment of a specific form of medication-refractory depression using a TMS Therapy device (FDA approval K061053).
In this paper, we will explore the specified FDA approved NeuroStar depression treatment protocol (to be administered only under prescription and by a licensed medical professional in either an in- or outpatient setting).
Neuroscience, Issue 45, Transcranial Magnetic Stimulation, Depression, Neuronetics, NeuroStar, FDA Approved
Examining the Characteristics of Episodic Memory using Event-related Potentials in Patients with Alzheimer's Disease
Institutions: Vanderbilt University.
Our laboratory uses event-related EEG potentials (ERPs) to understand and support behavioral investigations of episodic memory in patients with amnestic mild cognitive impairment (aMCI) and Alzheimer's disease (AD). Whereas behavioral data inform us about the patients' performance, ERPs allow us to record discrete changes in brain activity. Further, ERPs can give us insight into the onset, duration, and interaction of independent cognitive processes associated with memory retrieval. In patient populations, these types of studies are used to examine which aspects of memory are impaired and which remain relatively intact compared to a control population. The methodology for collecting ERP data from a vulnerable patient population while these participants perform a recognition memory task is reviewed. This protocol includes participant preparation, quality assurance, data acquisition, and data analysis. In addition to basic setup and acquisition, we will also demonstrate localization techniques to obtain greater spatial resolution and source localization using high-density (128 channel) electrode arrays.
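The ERP extraction described above rests on signal averaging: epochs time-locked to the stimulus are averaged sample by sample so that stimulus-locked activity survives while uncorrelated noise tends to cancel. The sketch below illustrates only this averaging step with hypothetical single-trial data; the function name is illustrative, and real analyses use dedicated EEG packages with artifact rejection and baseline correction.

```python
def average_epochs(epochs):
    """Average time-locked EEG epochs sample by sample to extract the ERP.
    `epochs` is a list of equal-length single-trial voltage traces."""
    n_trials = len(epochs)
    n_samples = len(epochs[0])
    return [sum(trial[t] for trial in epochs) / n_trials
            for t in range(n_samples)]

# Three hypothetical single-trial traces (microvolts at 4 time points):
trials = [
    [0.0, 2.0, 4.0, 1.0],
    [1.0, 3.0, 5.0, 2.0],
    [-1.0, 1.0, 3.0, 0.0],
]
print(average_epochs(trials))  # [0.0, 2.0, 4.0, 1.0]
```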
Medicine, Issue 54, recognition memory, episodic memory, event-related potentials, dual process, Alzheimer's disease, amnestic mild cognitive impairment
Assessment and Evaluation of the High Risk Neonate: The NICU Network Neurobehavioral Scale
Institutions: Brown University, Women & Infants Hospital of Rhode Island, University of Massachusetts, Boston.
There has been a long-standing interest in the assessment of the neurobehavioral integrity of the newborn infant. The NICU Network Neurobehavioral Scale (NNNS) was developed as an assessment for the at-risk infant. These are infants who are at increased risk for poor developmental outcome because of insults during prenatal development, such as substance exposure or prematurity, or factors such as poverty, poor nutrition, or lack of prenatal care that can have adverse effects on the intrauterine environment and affect the developing fetus. The NNNS assesses the full range of infant neurobehavioral performance including neurological integrity, behavioral functioning, and signs of stress/abstinence. The NNNS is a noninvasive neonatal assessment tool with demonstrated validity as a predictor, not only of medical outcomes such as cerebral palsy diagnosis, neurological abnormalities, and diseases with risks to the brain, but also of developmental outcomes such as mental and motor functioning, behavior problems, school readiness, and IQ. The NNNS can identify infants at high risk for abnormal developmental outcome and is an important clinical tool that enables medical researchers and health practitioners to identify these infants and develop intervention programs to optimize their development as early as possible. The video shows the NNNS procedures, with examples of normal and abnormal performance, and the various clinical populations in which the exam can be used.
Behavior, Issue 90, NICU Network Neurobehavioral Scale, NNNS, High risk infant, Assessment, Evaluation, Prediction, Long term outcome
Evaluation of Biomaterials for Bladder Augmentation using Cystometric Analyses in Various Rodent Models
Institutions: Harvard Medical School, Tufts University.
Renal function and continence of urine are critically dependent on the proper function of the urinary bladder, which stores urine at low pressure and expels it with a precisely orchestrated contraction. A number of congenital and acquired urological anomalies including posterior urethral valves, benign prostatic hyperplasia, and neurogenic bladder secondary to spina bifida/spinal cord injury can result in pathologic tissue remodeling leading to impaired compliance and reduced capacity1. Functional or anatomical obstruction of the urinary tract is frequently associated with these conditions, and can lead to urinary incontinence and kidney damage from increased storage and voiding pressures2. Surgical implantation of gastrointestinal segments to expand organ capacity and reduce intravesical pressures represents the primary surgical treatment option for these disorders when medical management fails3. However, this approach is hampered by the limitation of available donor tissue, and is associated with significant complications including chronic urinary tract infection, metabolic perturbation, urinary stone formation, and secondary malignancy4,5.
Current research in bladder tissue engineering is heavily focused on identifying biomaterial configurations which can support regeneration of tissues at defect sites. Conventional 3-D scaffolds derived from natural and synthetic polymers such as small intestinal submucosa and poly-glycolic acid have shown some short-term success in supporting urothelial and smooth muscle regeneration as well as facilitating increased organ storage capacity in both animal models and in the clinic6,7. However, deficiencies in scaffold mechanical integrity and biocompatibility often result in deleterious fibrosis8, graft contracture9, and calcification10, thus increasing the risk of implant failure and need for secondary surgical procedures. In addition, restoration of normal voiding characteristics utilizing standard biomaterial constructs for augmentation cystoplasty has yet to be achieved, and therefore research and development of novel matrices which can fulfill this role is needed.
In order to successfully develop and evaluate optimal biomaterials for clinical bladder augmentation, efficacy research must first be performed in standardized animal models using detailed surgical methods and functional outcome assessments. We have previously reported the use of a bladder augmentation model in mice to determine the potential of silk fibroin-based scaffolds to mediate tissue regeneration and functional voiding characteristics.11,12 Cystometric analyses of this model have shown that variations in structural and mechanical implant properties can influence the resulting urodynamic features of the tissue engineered bladders11,12. Positive correlations between the degree of matrix-mediated tissue regeneration determined histologically and functional compliance and capacity evaluated by cystometry were demonstrated in this model11,12. These results therefore suggest that functional evaluations of biomaterial configurations in rodent bladder augmentation systems may be a useful format for assessing scaffold properties and establishing in vivo feasibility prior to large animal studies and clinical deployment. In the current study, we will present various surgical stages of bladder augmentation in both mice and rats using silk scaffolds and demonstrate techniques for awake and anesthetized cystometry.
Bioengineering, Issue 66, Medicine, Biomedical Engineering, Physiology, Silk, bladder tissue engineering, biomaterial, scaffold, matrix, augmentation, cystometry
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Institutions: University of California, Los Angeles .
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Keppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermal stable DNA polymerase from the bacterium Thermus aquaticus, consequently the name Taq PCR.
PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that complicate the reaction, producing spurious results. When PCR fails it can lead to many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction. By following this PCR guide, students should be able to:
● Set up reactions and thermal cycling conditions for a conventional PCR experiment
● Understand the function of various reaction components and their overall effect on a PCR experiment
● Design and optimize a PCR experiment for any DNA template
● Troubleshoot failed PCR experiments
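As a minimal illustration of one primer-design calculation touched on above (melting temperature, Tm), the sketch below applies the Wallace rule, Tm = 2(A+T) + 4(G+C) °C, a standard rough approximation for short oligonucleotides. The function name and example sequence are illustrative only; in practice, nearest-neighbor thermodynamic methods give more accurate estimates.

```python
def wallace_tm(primer: str) -> int:
    """Estimate the melting temperature (degrees C) of a short primer
    (roughly < 14 nt) using the Wallace rule: Tm = 2(A+T) + 4(G+C)."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

# Example: a hypothetical 12-mer primer with 50% GC content
print(wallace_tm("ATGCGCATTAGC"))  # 36
```

A quick sanity check when pairing primers is that their Wallace-rule Tm values lie within a few degrees of each other.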
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
Tilt Testing with Combined Lower Body Negative Pressure: a "Gold Standard" for Measuring Orthostatic Tolerance
Institutions: Simon Fraser University .
Orthostatic tolerance (OT) refers to the ability to maintain cardiovascular stability when upright, against the hydrostatic effects of gravity, and hence to maintain cerebral perfusion and prevent syncope (fainting). Various techniques are available to assess OT and the effects of gravitational stress upon the circulation, typically by reproducing a presyncopal event (near-fainting episode) in a controlled laboratory environment. The time and/or degree of stress required to provoke this response provides the measure of OT. Any technique used to determine OT should: enable distinction between patients with orthostatic intolerance (of various causes) and asymptomatic control subjects; be highly reproducible, enabling evaluation of therapeutic interventions; and avoid invasive procedures, which are known to impair OT1.
In the late 1980s head-upright tilt testing was first utilized for diagnosing syncope2. Since then it has been used to assess OT in patients with syncope of unknown cause, as well as in healthy subjects to study postural cardiovascular reflexes2-6. Tilting protocols comprise three categories: passive tilt; passive tilt accompanied by pharmacological provocation; and passive tilt with combined lower body negative pressure (LBNP). However, the effects of tilt testing (and other orthostatic stress testing modalities) are often poorly reproducible, with low sensitivity and specificity to diagnose orthostatic intolerance7.
Typically, a passive tilt includes 20-60 min of orthostatic stress continued until the onset of presyncope in patients2-6. However, the main drawback of this procedure is its inability to invoke presyncope in all individuals undergoing the test, and corresponding low sensitivity8,9. Thus, different methods were explored to increase the orthostatic stress and improve sensitivity.
Pharmacological provocation has been used to increase the orthostatic challenge, for example using isoprenaline4,7,10,11 or sublingual nitrate12,13. However, the main drawback of these approaches is an increase in sensitivity at the cost of an unacceptable decrease in specificity10,14, with a high positive response rate immediately after administration15. Furthermore, invasive procedures associated with some pharmacological provocations greatly increase the false positive rate1.
Another approach is to combine passive tilt testing with LBNP, providing a stronger orthostatic stress without invasive procedures or drug side-effects, using the technique pioneered by Professor Roger Hainsworth in the 1990s16-18. This approach provokes presyncope in almost all subjects (allowing for symptom recognition in patients with syncope), while discriminating between patients with syncope and healthy controls, with a specificity of 92%, sensitivity of 85%, and repeatability of 1.1±0.6 min16,17. This allows not only diagnosis and pathophysiological assessment19-22, but also the evaluation of treatments for orthostatic intolerance due to its high repeatability23-30. For these reasons, we argue this should be the "gold standard" for orthostatic stress testing, and accordingly this will be the method described in this paper.
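The sensitivity and specificity figures quoted above can be illustrated with a short sketch of how such diagnostic statistics are computed from test outcomes. The counts below are hypothetical, chosen only to reproduce the reported 85%/92% values; they are not data from the studies cited.

```python
def sensitivity(true_pos: int, false_neg: int) -> float:
    """Fraction of affected subjects correctly identified by the test."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg: int, false_pos: int) -> float:
    """Fraction of unaffected subjects correctly identified by the test."""
    return true_neg / (true_neg + false_pos)

# Hypothetical counts: syncope patients in whom presyncope was / was not
# provoked, and healthy controls without / with a false positive result.
tp, fn = 85, 15
tn, fp = 92, 8
print(f"sensitivity = {sensitivity(tp, fn):.0%}")  # sensitivity = 85%
print(f"specificity = {specificity(tn, fp):.0%}")  # specificity = 92%
```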
Medicine, Issue 73, Anatomy, Physiology, Biomedical Engineering, Neurobiology, Kinesiology, Cardiology, tilt test, lower body negative pressure, orthostatic stress, syncope, orthostatic tolerance, fainting, gravitational stress, head upright, stroke, clinical techniques
Bronchoalveolar Lavage (BAL) for Research; Obtaining Adequate Sample Yield
Institutions: National Institute for Health Research, Royal Liverpool and Broadgreen University Hospital Trust, Liverpool School of Tropical Medicine, University of Liverpool, Royal Liverpool and Broadgreen University Hospital Trust, University Hospital Aintree.
We describe a research technique for fiberoptic bronchoscopy with bronchoalveolar lavage (BAL) using manual hand held suction in order to remove nonadherent cells and lung lining fluid from the mucosal surface. In research environments, BAL allows sampling of innate (lung macrophage), cellular (B- and T- cells), and humoral (immunoglobulin) responses within the lung.
BAL is internationally accepted for research purposes and since 1999 the technique has been performed in > 1,000 subjects in the UK and Malawi by our group.
Our technique uses gentle hand-held suction of instilled fluid; this is designed to maximize BAL volume returned and apply minimum shear force on ciliated epithelia in order to preserve the structure and function of cells within the BAL fluid and to preserve viability to facilitate the growth of cells in ex vivo culture. The research technique therefore uses a larger volume instillate (typically in the order of 200 ml) and employs manual suction to reduce cell damage.
Patients are given local anesthetic, offered conscious sedation (midazolam), and tolerate the procedure well with minimal side effects. Verbal and written subject information improves tolerance and written informed consent is mandatory. Safety of the subject is paramount. Subjects are carefully selected using clear inclusion and exclusion criteria.
This protocol includes a description of the potential risks, and the steps taken to mitigate them, a list of contraindications, pre- and post-procedure checks, as well as precise bronchoscopy and laboratory techniques.
Medicine, Issue 85, Research bronchoscopy, bronchoalveolar lavage (BAL), fiberoptic bronchoscopy, lymphocyte, macrophage
Improving IV Insulin Administration in a Community Hospital
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predicts poor outcomes.1-4
The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5
It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance.
The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6
Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia.
Despite multiple revisions of an IV insulin paper protocol, analysis of data from usage of the paper protocol at WMC shows that, in terms of achieving normoglycemia while minimizing hypoglycemia, results were suboptimal. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from a paper IV insulin protocol to a computerized glucose management system. By comparing blood glucose levels under the paper protocol to those under the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was witnessed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computerized glucose management system was well under 1%.
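A minimal sketch of the glucose classification thresholds reported above (severe hypoglycemia BG < 40 mg/dL, clinical hypoglycemia BG < 70 mg/dL, hyperglycemia BG > 180 mg/dL), treating 70-180 mg/dL as the target range. The function names and sample readings are illustrative only and are not part of the commercial glucose management system described.

```python
def classify_bg(bg_mg_dl: float) -> str:
    """Classify a blood glucose reading (mg/dL) using the thresholds
    reported in the text."""
    if bg_mg_dl < 40:
        return "severe hypoglycemia"
    if bg_mg_dl < 70:
        return "hypoglycemia"
    if bg_mg_dl > 180:
        return "hyperglycemia"
    return "target"

def percent_in_target(readings) -> float:
    """Percentage of readings falling in the 70-180 mg/dL target range."""
    in_target = sum(1 for bg in readings if classify_bg(bg) == "target")
    return 100.0 * in_target / len(readings)

# Hypothetical series of readings from one monitoring period:
readings = [35, 65, 110, 140, 150, 200, 95, 120]
print(percent_in_target(readings))  # 62.5
```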
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
Micro-scale Engineering for Cell Biology
Institutions: MIT - Massachusetts Institute of Technology.
Cellular Biology, Issue 8, stem cells, tissue engineering, bioengineering
Functional Imaging with Reinforcement, Eyetracking, and Physiological Monitoring
Institutions: Columbia University, Columbia University, Columbia University.
We use functional brain imaging (fMRI) to study neural circuits that underlie decision-making. To understand how outcomes affect decision processes, simple perceptual tasks are combined with appetitive and aversive reinforcement. However, the use of reinforcers such as juice and airpuffs can create challenges for fMRI. Reinforcer delivery can cause head movement, which creates artifacts in the fMRI signal. Reinforcement can also lead to changes in heart rate and respiration that are mediated by autonomic pathways. Changes in heart rate and respiration can directly affect the fMRI (BOLD) signal in the brain and can be confounded with signal changes that are due to neural activity. In this presentation, we demonstrate methods for administering reinforcers in a controlled manner, for stabilizing the head, and for measuring pulse and respiration.
Medicine, Issue 21, Neuroscience, Psychiatry, fMRI, Decision Making, Reward, Punishment, Pulse, Respiration, Eye Tracking, Psychology
Functional Mapping with Simultaneous MEG and EEG
Institutions: MGH - Massachusetts General Hospital.
We use magnetoencephalography (MEG) and electroencephalography (EEG) to locate and determine the temporal evolution in brain areas involved in the processing of simple sensory stimuli. We will use somatosensory stimuli to locate the hand somatosensory areas, auditory stimuli to locate the auditory cortices, visual stimuli in four quadrants of the visual field to locate the early visual areas. These type of experiments are used for functional mapping in epileptic and brain tumor patients to locate eloquent cortices. In basic neuroscience similar experimental protocols are used to study the orchestration of cortical activity. The acquisition protocol includes quality assurance procedures, subject preparation for the combined MEG/EEG study, and acquisition of evoked-response data with somatosensory, auditory, and visual stimuli. We also demonstrate analysis of the data using the equivalent current dipole model and cortically-constrained minimum-norm estimates. Anatomical MRI data are employed in the analysis for visualization and for deriving boundaries of tissue boundaries for forward modeling and cortical location and orientation constraints for the minimum-norm estimates.
JoVE neuroscience, Issue 40, neuroscience, brain, MEG, EEG, functional imaging
Using Learning Outcome Measures to assess Doctoral Nursing Education
Institutions: Harris College of Nursing and Health Sciences, Texas Christian University.
Education programs at all levels must be able to demonstrate successful program outcomes. Grades alone do not represent a comprehensive measurement methodology for assessing student learning outcomes at either the course or program level. The development and application of assessment rubrics provide an unequivocal measurement methodology to ensure a quality learning experience by providing a foundation for improvement based on qualitatively and quantitatively measurable aggregate course and program outcomes. Learning outcomes are the embodiment of the total learning experience and should incorporate assessment of both qualitative and quantitative program outcomes. The assessment of qualitative measures represents a challenge for educators at any level of a learning program. Nursing provides a unique challenge and opportunity, as it is the application of science through the art of caring. Quantification of desired student learning outcomes may be enhanced through the development of assessment rubrics designed to measure quantitative and qualitative aspects of the nursing education and learning process. They provide a mechanism for uniform assessment by nursing faculty of concepts and constructs that are otherwise difficult to describe and measure. A protocol is presented and applied to a doctoral nursing education program, with recommendations for application and transformation of the assessment rubric to other education programs. Through application of these specially designed rubrics, all aspects of an education program can be adequately assessed to provide information for program assessment that facilitates the closure of the gap between desired and actual student learning outcomes for any desired educational competency.
Medicine, Issue 40, learning, outcomes, measurement, program, assessment, rubric
Manual Muscle Testing: A Method of Measuring Extremity Muscle Strength Applied to Critically Ill Patients
Institutions: Johns Hopkins University, Johns Hopkins Hospital , Johns Hopkins University, University of Maryland Medical System.
Survivors of acute respiratory distress syndrome (ARDS) and other causes of critical illness often have generalized weakness, reduced exercise tolerance, and persistent nerve and muscle impairments after hospital discharge.1-6
Using an explicit protocol with a structured approach to training and quality assurance of research staff, manual muscle testing (MMT) is a highly reliable method for assessing strength, using a standardized clinical examination, for patients following ARDS, and can be completed with mechanically ventilated patients who can tolerate sitting upright in bed and are able to follow two-step commands. 7, 8
This video demonstrates a protocol for MMT, which has been taught to ≥43 research staff who have performed >800 assessments on >280 ARDS survivors. Modifications for the bedridden patient are included. Each muscle is tested with specific techniques for positioning, stabilization, resistance, and palpation for each score of the 6-point ordinal Medical Research Council scale.7,9-11
Three upper and three lower extremity muscles are graded in this protocol: shoulder abduction, elbow flexion, wrist extension, hip flexion, knee extension, and ankle dorsiflexion. These muscles were chosen based on the standard approach for evaluating patients for ICU-acquired weakness used in prior publications. 1,2
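A composite strength score is commonly derived by summing the MRC grades (0-5) over the six muscle groups tested bilaterally, giving a maximum of 60. Note that this summation convention is an assumption here (the abstract does not state it), and the function and data below are illustrative only.

```python
MUSCLE_GROUPS = ["shoulder abduction", "elbow flexion", "wrist extension",
                 "hip flexion", "knee extension", "ankle dorsiflexion"]

def mmt_sum_score(scores: dict) -> int:
    """Sum MRC grades (0-5) over the six muscle groups tested on both
    sides; the maximum composite score is 60."""
    total = 0
    for muscle in MUSCLE_GROUPS:
        for side in ("left", "right"):
            grade = scores[(muscle, side)]
            if not 0 <= grade <= 5:
                raise ValueError(f"MRC grade out of range: {grade}")
            total += grade
    return total

# Hypothetical patient with full strength in every muscle group:
full_strength = {(m, s): 5 for m in MUSCLE_GROUPS for s in ("left", "right")}
print(mmt_sum_score(full_strength))  # 60
```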
Medicine, Issue 50, Muscle Strength, Critical illness, Intensive Care Units, Reproducibility of Results, Clinical Protocols.
Transplantation of Whole Kidney Marrow in Adult Zebrafish
Institutions: Harvard Medical School.
Hematopoietic stem cells (HSC) are a rare population of pluripotent cells that maintain all the differentiated blood lineages throughout the life of an organism. The functional definition of a HSC is a transplanted cell that has the ability to reconstitute all the blood lineages of an irradiated recipient long term. This designation was established by decades of seminal work in mammalian systems. Using hematopoietic cell transplantation (HCT) and reverse genetic manipulations in the mouse, the underlying regulatory factors of HSC biology are beginning to be unveiled, but are still largely under-explored. Recently, the zebrafish has emerged as a powerful genetic model to study vertebrate hematopoiesis. Establishing HCT in zebrafish will allow scientists to utilize the large-scale genetic and chemical screening methodologies available in zebrafish to reveal novel mechanisms underlying HSC regulation. In this article, we demonstrate a method to perform HCT in adult zebrafish. We show the dissection and preparation of zebrafish whole kidney marrow, the site of adult hematopoiesis in the zebrafish, and the introduction of these donor cells into the circulation of irradiated recipient fish via intracardiac injection. Additionally, we describe the post-transplant care of fish in an "ICU" to increase their long-term health. In general, gentle care of the fish before, during, and after the transplant is critical to increase the number of fish that will survive more than one month following the procedure, which is essential for assessment of long term (>3 month) engraftment. The experimental data used to establish this protocol will be published elsewhere. The establishment of this protocol will allow for the merger of large-scale zebrafish genetics and transplant biology.
Developmental Biology, Issue 2, zebrafish, HSC, stem cells, transplant