Multi-dimensional and transient flows play a key role in many areas of science, engineering, and health sciences but are often not well understood. The complex nature of these flows may be studied using particle image velocimetry (PIV), a laser-based imaging technique for optically accessible flows. Though many forms of PIV exist that extend the technique beyond the original planar two-component velocity measurement capabilities, the basic PIV system consists of a light source (laser), a camera, tracer particles, and analysis algorithms. The imaging and recording parameters, the light source, and the algorithms are adjusted to optimize the recording for the flow of interest and obtain valid velocity data.
Common PIV investigations measure two-component velocities in a plane at a few frames per second. However, recent developments in instrumentation have facilitated high-frame rate (> 1 kHz) measurements capable of resolving transient flows with high temporal resolution. Therefore, high-frame rate measurements have enabled investigations on the evolution of the structure and dynamics of highly transient flows. These investigations play a critical role in understanding the fundamental physics of complex flows.
A detailed description of performing high-resolution, high-speed planar PIV to study a transient flow near the surface of a flat plate is presented here. Details for adjusting the parameter constraints, such as image and recording properties, laser sheet properties, and processing algorithms, to adapt PIV to any flow of interest are included.
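At the heart of planar PIV analysis is the cross-correlation of small interrogation windows between consecutive frames: the correlation peak gives the mean particle displacement in pixels, which is converted to velocity using the image scale and the inter-frame time. The sketch below is illustrative of this core step only (window pairing, peak interpolation, and the calibration values are assumptions, not the processing chain of this article):

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Estimate the mean particle displacement (dy, dx) between two
    interrogation windows via FFT-based circular cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.irfft2(np.conj(np.fft.rfft2(a)) * np.fft.rfft2(b), s=a.shape)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices past the midpoint so displacements can be negative.
    shift = [p if p <= s // 2 else p - s for p, s in zip(peak, a.shape)]
    return tuple(shift)

def velocity(shift_px, scale_m_per_px, dt_s):
    """Convert a pixel displacement to velocity (m/s) using the image
    magnification and the laser pulse separation."""
    return tuple(s * scale_m_per_px / dt_s for s in shift_px)
```

A real implementation would add sub-pixel peak interpolation and outlier validation, but the displacement-to-velocity logic is the same.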
Ultrasound Assessment of Endothelial-Dependent Flow-Mediated Vasodilation of the Brachial Artery in Clinical Research
Institutions: University of California, San Francisco, Veterans Affairs Medical Center, San Francisco, Veterans Affairs Medical Center, San Francisco.
The vascular endothelium is a monolayer of cells that covers the interior of blood vessels and serves both structural and functional roles. The endothelium acts as a barrier, preventing leukocyte adhesion and aggregation, as well as controlling permeability to plasma components. Functionally, the endothelium affects vessel tone.
Endothelial dysfunction is an imbalance between the chemical species that regulate vessel tone, thromboresistance, cellular proliferation, and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia.
The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase of intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed while subjects with endothelial damage experienced paradoxical vasoconstriction.
There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia.
This technique, known as endothelium-dependent, flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator dependent and presents a steep learning curve. This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
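The core FMD quantity is simply the percent change in arterial diameter from the baseline measurement to the peak hyperemic diameter. A minimal illustration (the diameters in the test are hypothetical example values, not data from this article):

```python
def fmd_percent(baseline_mm, peak_mm):
    """Flow-mediated dilation expressed as percent diameter change
    from baseline: 100 * (peak - baseline) / baseline."""
    if baseline_mm <= 0:
        raise ValueError("baseline diameter must be positive")
    return (peak_mm - baseline_mm) / baseline_mm * 100.0
```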
Medicine, Issue 92, endothelial function, endothelial dysfunction, brachial artery, peripheral artery disease, ultrasound, vascular, endothelium, cardiovascular disease.
The FlyBar: Administering Alcohol to Flies
Institutions: Florida State University, University of Houston.
Fruit flies (Drosophila melanogaster) are an established model for both alcohol research and circadian biology. Recently, we showed that the circadian clock modulates alcohol sensitivity, but not the formation of tolerance. Here, we describe our protocol in detail. Alcohol is administered to the flies using the FlyBar. In this setup, saturated alcohol vapor is mixed with humidified air in set proportions and administered to the flies in four tubes simultaneously. Flies are reared under standardized conditions in order to minimize variation between replicates. Three-day-old flies of different genotypes or treatments are used for the experiments, preferably by matching flies of two different time points (e.g., CT 5 and CT 17) to make direct comparisons possible. During the experiment, flies are exposed for 1 hr to the pre-determined percentage of alcohol vapor, and the number of flies that exhibit the Loss of Righting Reflex (LoRR) or sedation is counted every 5 min. The data can be analyzed using three different statistical approaches. The first is to determine the time at which 50% of the flies have lost their righting reflex and use an analysis of variance (ANOVA) to determine whether significant differences exist between time points. The second is to determine the percentage of flies that show LoRR after a specified number of minutes, followed by an ANOVA. The last method is to analyze the whole time series using multivariate statistics. The protocol can also be used for non-circadian experiments or comparisons between genotypes.
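The first two statistical approaches can be sketched as follows: interpolate the time at which 50% of flies show LoRR from the 5-min counts, then compare groups with a one-way ANOVA. This is an illustrative sketch (the F statistic is computed directly; a real analysis would also report degrees of freedom and a p-value):

```python
import numpy as np

def t50(minutes, frac_lorr):
    """Interpolated time at which 50% of flies show LoRR, from counts
    taken every 5 min (fractions must be non-decreasing)."""
    return float(np.interp(0.5, frac_lorr, minutes))

def anova_f(*groups):
    """One-way ANOVA F statistic across groups of per-tube T50 values."""
    groups = [np.asarray(g, float) for g in groups]
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    df_b = len(groups) - 1
    df_w = sum(len(g) for g in groups) - len(groups)
    return (ss_between / df_b) / (ss_within / df_w)
```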
Neuroscience, Issue 87, neuroscience, alcohol sensitivity, Drosophila, Circadian, sedation, biological rhythms, undergraduate research
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with the relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
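As a caricature of what "sequence selection by energy minimization" means, the sketch below greedily substitutes residues to minimize a toy energy with an intrinsic per-residue term plus a neighbor penalty. The energy values, alphabet, and greedy sweep are invented for illustration and bear no relation to Protein WISDOM's actual potential or optimization model:

```python
# Hypothetical per-residue "energies": lower is more stabilizing.
TOY_ENERGY = {"A": -1.0, "L": -2.0, "G": 0.5, "P": 1.0}

def toy_energy(seq):
    """Sum of intrinsic toy energies over a sequence."""
    return sum(TOY_ENERGY[aa] for aa in seq)

def site_energy(seq, i, aa):
    """Toy energy of residue aa at site i: intrinsic term plus a
    penalty for matching the left neighbor (discourages homopolymers)."""
    e = TOY_ENERGY[aa]
    if i > 0 and seq[i - 1] == aa:
        e += 1.5
    return e

def greedy_select(seq, sweeps=3):
    """Sweep all positions; at each, substitute the amino acid that
    minimizes the toy site energy (a caricature of sequence selection)."""
    seq = list(seq)
    for _ in range(sweeps):
        for i in range(len(seq)):
            seq[i] = min(TOY_ENERGY, key=lambda aa: site_energy(seq, i, aa))
    return "".join(seq)
```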
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Characterization of Inflammatory Responses During Intranasal Colonization with Streptococcus pneumoniae
Institutions: McMaster University.
Nasopharyngeal colonization by Streptococcus pneumoniae is a prerequisite to invasion of the lungs or bloodstream [1]. This organism is capable of colonizing the mucosal surface of the nasopharynx, where it can reside, multiply, and eventually overcome host defences to invade other tissues of the host. Establishment of an infection in the normally sterile lower respiratory tract results in pneumonia. Alternatively, the bacteria can disseminate into the bloodstream, causing bacteraemia, which is associated with high mortality rates [2], or else lead directly to the development of pneumococcal meningitis. Understanding the kinetics of, and immune responses to, nasopharyngeal colonization is an important aspect of S. pneumoniae research.
Our mouse model of intranasal colonization is adapted from human models [3] and has been used by multiple research groups in the study of host-pathogen responses in the nasopharynx [4-7]. In the first part of the model, we use a clinical isolate of S. pneumoniae to establish a self-limiting bacterial colonization that is similar to carriage events in human adults. The procedure detailed herein involves preparation of a bacterial inoculum, followed by the establishment of a colonization event through delivery of the inoculum via an intranasal route of administration. Resident macrophages are the predominant cell type in the nasopharynx during the steady state. Typically, there are few lymphocytes present in uninfected mice [8]; however, mucosal colonization will lead to low- to high-grade inflammation (depending on the virulence of the bacterial species and strain) that will result in an immune response and the subsequent recruitment of host immune cells. These cells can be isolated by a lavage of the tracheal contents through the nares and correlated to the density of colonizing bacteria to better understand the kinetics of the infection.
Immunology, Issue 83, Streptococcus pneumoniae, Nasal lavage, nasopharynx, murine, flow cytometry, RNA, Quantitative PCR, recruited macrophages, neutrophils, T-cells, effector cells, intranasal colonization
A Novel Application of Musculoskeletal Ultrasound Imaging
Institutions: George Mason University, George Mason University, George Mason University, George Mason University.
Ultrasound is an attractive modality for imaging muscle and tendon motion during dynamic tasks and can provide a complementary methodological approach for biomechanical studies in a clinical or laboratory setting. Towards this goal, methods for quantification of muscle kinematics from ultrasound imagery are being developed based on image processing. The temporal resolution of these methods is typically not sufficient for highly dynamic tasks, such as drop-landing. We propose a new approach that utilizes a Doppler method for quantifying muscle kinematics. We have developed a novel vector tissue Doppler imaging (vTDI) technique that can be used to measure musculoskeletal contraction velocity, strain and strain rate with sub-millisecond temporal resolution during dynamic activities using ultrasound. The goal of this preliminary study was to investigate the repeatability and potential applicability of the vTDI technique in measuring musculoskeletal velocities during a drop-landing task, in healthy subjects. The vTDI measurements can be performed concurrently with other biomechanical techniques, such as 3D motion capture for joint kinematics and kinetics, electromyography for timing of muscle activation and force plates for ground reaction force. Integration of these complementary techniques could lead to a better understanding of dynamic muscle function and dysfunction underlying the pathogenesis and pathophysiology of musculoskeletal disorders.
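Doppler-based velocity estimation rests on the standard pulsed-Doppler relation v = c·fd / (2·f0·cosθ), where fd is the measured Doppler shift, f0 the transmit frequency, θ the beam-to-motion angle, and c the sound speed in tissue. The sketch below applies that textbook relation; it is not the authors' vTDI processing pipeline, and the parameter values in the test are illustrative:

```python
import math

def doppler_velocity(f_shift_hz, f0_hz, angle_deg, c=1540.0):
    """Tissue velocity (m/s) from a Doppler shift using the standard
    pulsed-Doppler relation v = c*fd / (2*f0*cos(theta)).
    c = 1540 m/s is the conventional soft-tissue sound speed."""
    cos_t = math.cos(math.radians(angle_deg))
    if abs(cos_t) < 1e-6:
        raise ValueError("beam nearly perpendicular to motion; angle-correct first")
    return c * f_shift_hz / (2.0 * f0_hz * cos_t)
```

A vector Doppler system resolves the angle ambiguity by combining estimates from multiple beam directions; the single-beam relation above is the building block.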
Medicine, Issue 79, Anatomy, Physiology, Joint Diseases, Diagnostic Imaging, Muscle Contraction, ultrasonic applications, Doppler effect (acoustics), Musculoskeletal System, biomechanics, musculoskeletal kinematics, dynamic function, ultrasound imaging, vector Doppler, strain, strain rate
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
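Localization microscopy gains its resolution by estimating each molecule's position far more precisely than the diffraction-limited spot width; to leading order the precision scales as the PSF width divided by the square root of the number of collected photons. A minimal sketch (centroid estimation stands in for the Gaussian fitting typically used, and the photon-noise-limit formula omits pixelation and background terms):

```python
import numpy as np

def localize_centroid(img):
    """Background-subtracted centroid estimate (row, col) of a single
    molecule's image; a simple stand-in for 2D Gaussian fitting."""
    img = img - img.min()
    total = img.sum()
    rows, cols = np.indices(img.shape)
    return (rows * img).sum() / total, (cols * img).sum() / total

def precision_nm(psf_sigma_nm, n_photons):
    """Leading-order localization precision, sigma / sqrt(N)
    (photon-noise limit only)."""
    return psf_sigma_nm / n_photons ** 0.5
```

With a ~125 nm PSF width and a few hundred photons per molecule, this leading-order estimate lands in the ~10-30 nm range quoted above.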
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
An Experimental Model to Study Tuberculosis-Malaria Coinfection upon Natural Transmission of Mycobacterium tuberculosis and Plasmodium berghei
Institutions: University Hospital Heidelberg, Research Center Borstel.
Coinfections naturally occur due to the geographic overlap of distinct types of pathogenic organisms. Concurrent infections most likely modulate the respective immune response to each single pathogen and may thereby affect pathogenesis and disease outcome. Coinfected patients may also respond differentially to anti-infective interventions. Coinfection between tuberculosis, caused by mycobacteria, and the malaria parasite Plasmodium, both of which are coendemic in many parts of sub-Saharan Africa, has not been studied in detail. In order to approach the challenging but scientifically and clinically highly relevant question of how malaria-tuberculosis coinfection modulates host immunity and the course of each disease, we established an experimental mouse model that allows us to dissect the elicited immune responses to both pathogens in the coinfected host. Of note, in order to most precisely mimic naturally acquired human infections, we perform experimental infections of mice with both pathogens by their natural routes of infection, i.e. aerosol and mosquito bite, respectively.
Infectious Diseases, Issue 84, coinfection, mouse, Tuberculosis, Malaria, Plasmodium berghei, Mycobacterium tuberculosis, natural transmission
A Proboscis Extension Response Protocol for Investigating Behavioral Plasticity in Insects: Application to Basic, Biomedical, and Agricultural Research
Institutions: Arizona State University.
Insects modify their responses to stimuli through experience of associating those stimuli with events important for survival (e.g., food, mates, threats). There are several behavioral mechanisms through which an insect learns salient associations and relates them to these events. It is important to understand this behavioral plasticity for programs aimed at assisting insects that are beneficial to agriculture. This understanding can also be used to discover solutions to biomedical and agricultural problems created by insects that act as disease vectors and pests. The Proboscis Extension Response (PER) conditioning protocol was developed for honey bees (Apis mellifera) over 50 years ago to study how they perceive and learn about floral odors, which signal the nectar and pollen resources a colony needs for survival. The PER procedure provides a robust and easy-to-employ framework for studying several different ecologically relevant mechanisms of behavioral plasticity. It is easily adaptable for use with several other insect species and other behavioral reflexes. These protocols can be readily employed in conjunction with various means for monitoring neural activity in the CNS via electrophysiology or bioimaging, or for manipulating targeted neuromodulatory pathways. It is also a robust assay for rapidly detecting sub-lethal effects on behavior caused by environmental stressors, toxins, or pesticides.
We show how the PER protocol is straightforward to implement using two procedures. One is suitable as a laboratory exercise for students or for quick assays of the effect of an experimental treatment. The other provides more thorough control of variables, which is important for studies of behavioral conditioning. We show how several measures of the behavioral response, ranging from binary yes/no to more continuous variables such as latency and duration of proboscis extension, can be used to test hypotheses. Finally, we discuss some pitfalls that researchers commonly encounter when they use the procedure for the first time.
Neuroscience, Issue 91, PER, conditioning, honey bee, olfaction, olfactory processing, learning, memory, toxin assay
A cGMP-applicable Expansion Method for Aggregates of Human Neural Stem and Progenitor Cells Derived From Pluripotent Stem Cells or Fetal Brain Tissue
Institutions: Cedars-Sinai Medical Center.
A cell expansion technique to amass large numbers of cells from a single specimen for research experiments and clinical trials would greatly benefit the stem cell community. Many current expansion methods are laborious and costly, and those involving complete dissociation may cause several stem and progenitor cell types to undergo differentiation or early senescence. To overcome these problems, we have developed an automated mechanical passaging method referred to as "chopping" that is simple and inexpensive. This technique avoids chemical or enzymatic dissociation into single cells and instead allows for the large-scale expansion of suspended, spheroid cultures that maintain constant cell/cell contact. The chopping method has primarily been used for fetal brain-derived neural progenitor cells or neurospheres, and has recently been published for use with neural stem cells derived from embryonic and induced pluripotent stem cells. The procedure involves seeding neurospheres onto a tissue culture Petri dish and subsequently passing a sharp, sterile blade through the cells, effectively automating the tedious process of manually dissociating each sphere. Suspending cells in culture provides a favorable surface area-to-volume ratio: over 500,000 cells can be grown within a single neurosphere less than 0.5 mm in diameter. In one T175 flask, over 50 million cells can grow in suspension cultures, compared to only 15 million in adherent cultures. Importantly, the chopping procedure has been used under current good manufacturing practice (cGMP), permitting mass quantity production of clinical-grade cell products.
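The quoted yields are easy to sanity-check: a sphere's surface-area-to-volume ratio is 3/r, and the flask comparison is a simple fold change. A small illustration using the figures from the text:

```python
def sphere_sa_to_vol(diameter_mm):
    """Surface-area-to-volume ratio (1/mm) of a sphere: 3 / r."""
    return 3.0 / (diameter_mm / 2.0)

def expansion_advantage(suspension_cells, adherent_cells):
    """Fold more cells per flask in suspension vs adherent culture."""
    return suspension_cells / adherent_cells
```

A 0.5 mm sphere has an SA/V ratio of 12 mm⁻¹, and the 50 million vs 15 million comparison corresponds to roughly a 3.3-fold yield advantage per T175 flask.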
Neuroscience, Issue 88, neural progenitor cell, neural precursor cell, neural stem cell, passaging, neurosphere, chopping, stem cell, neuroscience, suspension culture, good manufacturing practice, GMP
Measuring Respiratory Function in Mice Using Unrestrained Whole-body Plethysmography
Institutions: Monash Institute of Medical Research, Monash Medical Centre, Animal Resource Centre, Perth, Australia, Wake Forest Institute for Regenerative Medicine.
Respiratory dysfunction is one of the leading causes of morbidity and mortality in the world, and the rates of mortality continue to rise. Quantitative assessment of lung function in rodent models is an important tool in the development of future therapies. Commonly used techniques for assessing respiratory function include invasive plethysmography and forced oscillation. While these techniques provide valuable information, data collection can be fraught with artefacts and experimental variability due to the need for anesthesia and/or invasive instrumentation of the animal. In contrast, unrestrained whole-body plethysmography (UWBP) offers a precise, non-invasive, quantitative way to analyze respiratory parameters. This technique avoids the anesthesia and restraints common to traditional plethysmography techniques. This video will demonstrate the UWBP procedure, including the equipment set up, calibration, and lung function recording. It will explain how to analyze the collected data, as well as how to identify experimental outliers and artefacts that result from animal movement. The respiratory parameters obtained using this technique include tidal volume, minute volume, inspiratory duty cycle, inspiratory flow rate, and the ratio of inspiration time to expiration time. UWBP does not rely on specialized skills and is inexpensive to perform. A key feature of UWBP, and most appealing to potential users, is the ability to perform repeated measures of lung function on the same animal.
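The listed parameters derive from simple per-breath quantities: minute volume is tidal volume times breathing rate, inspiratory duty cycle is Ti/Ttot, and mean inspiratory flow is tidal volume over Ti. A minimal sketch with hypothetical breath timings (real UWBP software derives these from the chamber pressure waveform):

```python
def respiratory_params(ti_s, te_s, tidal_ml):
    """Derive UWBP-style parameters from one breath: inspiratory time
    Ti (s), expiratory time Te (s), and tidal volume (ml)."""
    t_total = ti_s + te_s
    rate_bpm = 60.0 / t_total
    return {
        "tidal_volume_ml": tidal_ml,
        "minute_volume_ml": tidal_ml * rate_bpm,  # TV * breaths/min
        "duty_cycle": ti_s / t_total,             # Ti / Ttot
        "insp_flow_ml_s": tidal_ml / ti_s,        # mean inspiratory flow
        "ti_te_ratio": ti_s / te_s,
    }
```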
Physiology, Issue 90, Unrestrained Whole Body Plethysmography, Lung function, Respiratory Disease, Rodents
Setting Limits on Supersymmetry Using Simplified Models
Institutions: University College London, CERN, Lawrence Berkeley National Laboratories.
Experimental limits on supersymmetry and similar theories are difficult to set because of the enormous available parameter space and difficult to generalize because of the complexity of single points. Therefore, more phenomenological, simplified models are becoming popular for setting experimental limits, as they have clearer physical interpretations. The use of these simplified model limits to set a real limit on a concrete theory has not, however, been demonstrated. This paper recasts simplified model limits into limits on a specific and complete supersymmetry model, minimal supergravity. Limits obtained under various physical assumptions are comparable to those produced by directed searches. A prescription is provided for calculating conservative and aggressive limits on additional theories. Using acceptance and efficiency tables along with the expected and observed numbers of events in various signal regions, LHC experimental results can be recast in this manner into almost any theoretical framework, including nonsupersymmetric theories with supersymmetry-like signatures.
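The recasting logic reduces to a yield calculation: for each signal region, the expected signal is the cross-section times integrated luminosity times acceptance times efficiency, compared against the model-independent upper limit on signal events. A schematic sketch (the numbers in the test are illustrative, and a real recast would combine signal regions and propagate uncertainties):

```python
def expected_signal(xsec_pb, lumi_ipb, acceptance, efficiency):
    """Expected signal yield in a signal region: sigma * L * A * eps."""
    return xsec_pb * lumi_ipb * acceptance * efficiency

def excluded(xsec_pb, lumi_ipb, acceptance, efficiency, n95_upper):
    """A parameter point is (conservatively) excluded if its expected
    yield exceeds the observed 95% CL upper limit on signal events."""
    return expected_signal(xsec_pb, lumi_ipb, acceptance, efficiency) > n95_upper
```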
Physics, Issue 81, high energy physics, particle physics, Supersymmetry, LHC, ATLAS, CMS, New Physics Limits, Simplified Models
Whole-Body Nanoparticle Aerosol Inhalation Exposures
Institutions: West Virginia University, West Virginia University, National Institute for Occupational Safety and Health.
Inhalation is the most likely exposure route for individuals working with aerosolizable engineered nanomaterials (ENM). To properly perform nanoparticle inhalation toxicology studies, the aerosols in a chamber housing the experimental animals must have: 1) a steady concentration maintained at a desired level for the entire exposure period; 2) a homogenous composition free of contaminants; and 3) a stable size distribution with a geometric mean diameter < 200 nm and a geometric standard deviation (σg) < 2.5 [5]. The generation of aerosols containing nanoparticles is quite challenging because nanoparticles easily agglomerate. This is largely due to very strong inter-particle forces and the formation of large fractal structures tens or hundreds of microns in size [6], which are difficult to break up. Several common aerosol generators, including nebulizers, fluidized beds, Venturi aspirators, and the Wright dust feed, were tested; however, none were able to produce nanoparticle aerosols that satisfied all criteria [5].
A whole-body nanoparticle aerosol inhalation exposure system was fabricated, validated, and utilized for nano-TiO₂ inhalation toxicology studies. Its critical components are: 1) a novel nano-TiO₂ aerosol generator; 2) a 0.5 m³ whole-body inhalation exposure chamber; and 3) a monitoring and control system. Nano-TiO₂ aerosols generated from bulk dry nano-TiO₂ powders (primary diameter of 21 nm, bulk density of 3.8 g/cm³) were delivered into the exposure chamber at a flow rate of 90 LPM (10.8 air changes/hr). Particle size distribution and mass concentration profiles were measured continuously with a scanning mobility particle sizer (SMPS) and an electrical low pressure impactor (ELPI). The aerosol mass concentration (C) was verified gravimetrically (mg/m³). The mass (M) of the collected particles was determined as M = Mpost − Mpre, where Mpre and Mpost are the masses of the filter before and after sampling (mg). The mass concentration was calculated as C = M/(Q×t), where Q is the sampling flow rate (m³/min) and t is the sampling time (min). The chamber pressure, temperature, relative humidity (RH), and O₂ concentration were monitored and controlled continuously. Nano-TiO₂ aerosols collected on Nuclepore filters were analyzed with a scanning electron microscope (SEM) and energy dispersive X-ray (EDX) analysis.
In summary, we report that the nanoparticle aerosols generated and delivered to our exposure chamber have: 1) a steady mass concentration; 2) a homogenous composition free of contaminants; and 3) a stable particle size distribution with a count-median aerodynamic diameter of 157 nm during aerosol generation. This system reliably and repeatedly creates test atmospheres that simulate occupational, environmental, or domestic ENM aerosol exposures.
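The gravimetric verification and chamber ventilation figures follow directly from the definitions above, C = M/(Q×t) with unit conversion from L/min to m³/min. A small worked sketch (the filter masses in the test are hypothetical; the 90 LPM flow and 0.5 m³ chamber volume are the figures from the text):

```python
def mass_concentration(m_pre_mg, m_post_mg, flow_lpm, minutes):
    """Gravimetric aerosol mass concentration C = M/(Q*t) in mg/m^3,
    with filter masses in mg and sampling flow in L/min."""
    mass_mg = m_post_mg - m_pre_mg
    volume_m3 = flow_lpm * minutes / 1000.0  # L -> m^3
    return mass_mg / volume_m3

def air_changes_per_hr(flow_lpm, chamber_m3):
    """Chamber air changes per hour for a given supply flow."""
    return flow_lpm * 60.0 / 1000.0 / chamber_m3
```

Note that 90 LPM into a 0.5 m³ chamber reproduces the 10.8 air changes/hr quoted above.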
Medicine, Issue 75, Physiology, Anatomy, Chemistry, Biomedical Engineering, Pharmacology, Titanium dioxide, engineered nanomaterials, nanoparticle, toxicology, inhalation exposure, aerosols, dry powder, animal model
Breathing-controlled Electrical Stimulation (BreEStim) for Management of Neuropathic Pain and Spasticity
Institutions: University of Texas Health Science Center at Houston, TIRR Memorial Hermann Hospital, TIRR Memorial Hermann Hospital.
Electrical stimulation (EStim) refers to the application of electrical current to muscles or nerves in order to achieve functional and therapeutic goals. It has been extensively used in various clinical settings. Based upon recent discoveries related to the systemic effects of voluntary breathing and intrinsic physiological interactions among systems during voluntary breathing, a new EStim protocol, Breathing-controlled Electrical Stimulation (BreEStim), has been developed to augment the effects of electrical stimulation. In BreEStim, a single-pulse electrical stimulus is triggered and delivered to the target area when the airflow rate of an isolated voluntary inspiration reaches the threshold. BreEStim integrates intrinsic physiological interactions that are activated during voluntary breathing and has demonstrated excellent clinical efficacy. Two representative applications of BreEStim are reported with detailed protocols: management of post-stroke finger flexor spasticity and neuropathic pain in spinal cord injury.
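The triggering rule can be sketched as a threshold crossing on the inspiratory airflow signal, with a refractory window so that a single inspiration delivers a single pulse. The sample data, threshold, and window length below are illustrative assumptions, not the device's actual parameters:

```python
def breestim_triggers(airflow, threshold, refractory=5):
    """Indices at which a single stimulus pulse would fire: the sample
    where inspiratory airflow first crosses the threshold, with a
    refractory window so one inspiration yields one pulse."""
    pulses, last = [], -refractory - 1
    for i in range(1, len(airflow)):
        crossed = airflow[i - 1] < threshold <= airflow[i]
        if crossed and i - last > refractory:
            pulses.append(i)
            last = i
    return pulses
```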
Medicine, Issue 71, Neuroscience, Neurobiology, Anatomy, Physiology, Behavior, electrical stimulation, BreEStim, electrode, voluntary breathing, respiration, inspiration, pain, neuropathic pain, pain management, spasticity, stroke, spinal cord injury, brain, central nervous system, CNS, clinical, electromyogram, neuromuscular electrical stimulation
A Novel Rescue Technique for Difficult Intubation and Difficult Ventilation
Institutions: Children’s Hospital of Michigan, St. Jude Children’s Research Hospital.
We describe a novel nonsurgical technique to maintain oxygenation and ventilation in a case of difficult intubation and difficult ventilation, which works especially well with a poor mask fit.
"Cannot intubate, cannot ventilate" (CICV) is a potentially life-threatening situation. In this video we present a simulation of the technique we used in a case of CICV where oxygenation and ventilation were maintained by inserting an endotracheal tube (ETT) nasally down to the level of the nasopharynx while sealing the mouth and nares for successful positive pressure ventilation.
A 13-year-old patient was taken to the operating room for incision and drainage of a neck abscess and direct laryngobronchoscopy. After preoxygenation, anesthesia was induced intravenously. Mask ventilation was found to be extremely difficult because of the swelling of the soft tissue. The face mask could not fit properly on the face due to significant facial swelling as well. A direct laryngoscopy was attempted with no visualization of the larynx. Oxygen saturation was difficult to maintain, with saturations falling to 80%. In order to oxygenate and ventilate the patient, an endotracheal tube was then inserted nasally after nasal spray with a nasal decongestant and lubricant. The tube was pushed gently and blindly into the hypopharynx. The mouth and nose of the patient were sealed by hand, and positive pressure ventilation was possible with 100% O₂, with good oxygen saturation during that period of time. Once the patient was stable and well sedated, a rigid bronchoscope was introduced by the otolaryngologist, showing extensive subglottic and epiglottic edema and a mass effect from the abscess, contributing to the airway compromise. The airway was secured with an ETT by the otolaryngologist. This video will show a simulation of the technique on a patient undergoing general anesthesia for dental restorations.
Medicine, Issue 47, difficult ventilation, difficult intubation, nasal, saturation
Psychophysiological Stress Assessment Using Biofeedback
Institutions: Cambridge Health Alliance, Harvard Medical School.
In the last half century, research in biofeedback has shown the extent to which the human mind can influence the functioning of the autonomic nervous system, previously thought to be outside of conscious control. By letting people observe signals from their own bodies, biofeedback enables them to develop greater awareness of their physiological and psychological reactions, such as stress, and to learn to modify these reactions. Biofeedback practitioners can facilitate this process by assessing people's reactions to mildly stressful events and formulating a biofeedback-based treatment plan. During stress assessment the practitioner first records a baseline for physiological readings, and then presents the client with several mild stressors, such as a cognitive, physical, and emotional stressor. A variety of stressors is presented in order to determine a person's stimulus-response specificity, or differences in each person's reaction to qualitatively different stimuli. This video will demonstrate the process of psychophysiological stress assessment using biofeedback and present general guidelines for treatment planning.
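The assessment logic amounts to comparing each stressor's physiological reading against the baseline and noting which stressor produces the largest change, i.e. the person's stimulus-response specificity. A minimal sketch with hypothetical heart-rate values (a real assessment would use multiple channels and recovery periods):

```python
def reactivity_profile(baseline, stressor_readings):
    """Change from baseline for each stressor (e.g. heart rate or skin
    conductance); the largest absolute change indicates the person's
    stimulus-response specificity."""
    deltas = {name: value - baseline for name, value in stressor_readings.items()}
    most_reactive = max(deltas, key=lambda k: abs(deltas[k]))
    return deltas, most_reactive
```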
Neuroscience, Issue 29, Stress, biofeedback, psychophysiological, assessment
Magnetic Resonance Derived Myocardial Strain Assessment Using Feature Tracking
Institutions: Cincinnati Children's Hospital Medical Center (CCHMC), Imaging Systems GmbH, Advanced Medical Imaging Development SRL, The Christ Hospital.
Purpose: An accurate and practical method to measure parameters such as strain in myocardial tissue is of great clinical value, since it has been shown that strain is a more sensitive and earlier marker of contractile dysfunction than the frequently used ejection fraction (EF). Current technologies for CMR are time consuming and difficult to implement in clinical practice. Feature tracking is a technology that can bring more automation and robustness to the quantitative analysis of medical images, with less time consumption than comparable methods.
Methods: An automatic or manual input in a single phase serves as an initialization from which the system starts to track the displacement of individual patterns representing anatomical structures over time. A distinguishing feature of this method is that the images do not need to be manipulated beforehand in any way, for example by tagging of CMR images.
Results: The method is very well suited for tracking muscular tissue, thereby allowing quantitative elaboration of the myocardium and also of blood flow.
Conclusions: This new method offers a robust and time-saving procedure to quantify myocardial tissue and blood with displacement, velocity, and deformation parameters on regular CMR imaging sequences. It can therefore be implemented in clinical practice.
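To make the tracking output concrete, the sketch below computes the engineering (Lagrangian) strain of a myocardial segment from two tracked endpoint positions. This is a minimal illustration of the standard strain definition, not the vendor's feature-tracking algorithm; the coordinates and segment lengths are hypothetical.

```python
import numpy as np

def lagrangian_strain(p0, q0, p1, q1):
    """Engineering (Lagrangian) strain of a segment whose endpoints were
    tracked from an initial frame (p0, q0) to a deformed frame (p1, q1).
    Negative values indicate shortening (contraction)."""
    L0 = np.linalg.norm(np.asarray(q0) - np.asarray(p0))  # initial length
    L1 = np.linalg.norm(np.asarray(q1) - np.asarray(p1))  # deformed length
    return (L1 - L0) / L0

# Hypothetical example: a 40 mm segment shortens to 32 mm during systole,
# giving -20% strain, a typical magnitude for normal longitudinal strain.
eps = lagrangian_strain((0, 0), (0, 40), (0, 2), (0, 34))
print(eps)  # -0.2
```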
Medicine, Issue 48, feature tracking, strain, displacement, CMR
A Protocol for Comprehensive Assessment of Bulbar Dysfunction in Amyotrophic Lateral Sclerosis (ALS)
Institutions: University of Toronto, Sunnybrook Health Science Centre, University of Nebraska-Lincoln, University of Nebraska Medical Center, University of Toronto.
Improved methods for assessing bulbar impairment are necessary for expediting diagnosis of bulbar dysfunction in ALS, for predicting disease progression across speech subsystems, and for addressing the critical need for sensitive outcome measures for ongoing experimental treatment trials. To address this need, we are obtaining longitudinal profiles of bulbar impairment in 100 individuals based on a comprehensive instrumentation-based assessment that yields objective measures. Using instrumental approaches to quantify speech-related behaviors is very important in a field that has primarily relied on subjective, auditory-perceptual forms of speech assessment1. Our assessment protocol measures performance across all of the speech subsystems: respiratory, phonatory (laryngeal), resonatory (velopharyngeal), and articulatory. The articulatory subsystem is divided into the facial components (jaw and lip) and the tongue. Prior research has suggested that each speech subsystem responds differently to neurological diseases such as ALS. The current protocol is designed to test the performance of each speech subsystem as independently from the other subsystems as possible. The speech subsystems are evaluated in the context of more global changes to speech performance; these system-level variables include speaking rate and intelligibility of speech.
The protocol requires specialized instrumentation, and commercial and custom software. The respiratory, phonatory, and resonatory subsystems are evaluated using pressure-flow (aerodynamic) and acoustic methods. The articulatory subsystem is assessed using 3D motion tracking techniques. The objective measures that are used to quantify bulbar impairment have been well established in the speech literature and show sensitivity to changes in bulbar function with disease progression. The result of the assessment is a comprehensive, across-subsystem performance profile for each participant. The profile, when compared to the same measures obtained from healthy controls, is used for diagnostic purposes. Currently, we are testing the sensitivity and specificity of these measures for diagnosis of ALS and for predicting the rate of disease progression. In the long term, the more refined endophenotype of bulbar ALS derived from this work is expected to strengthen future efforts to identify the genetic loci of ALS and improve diagnostic and treatment specificity of the disease as a whole. The objective assessment that is demonstrated in this video may be used to assess a broad range of speech motor impairments, including those related to stroke, traumatic brain injury, multiple sclerosis, and Parkinson disease.
Medicine, Issue 48, speech, assessment, subsystems, bulbar function, amyotrophic lateral sclerosis
Quantitative Autonomic Testing
Institutions: University of Massachusetts Medical School.
Disorders associated with dysfunction of the autonomic nervous system are quite common yet frequently unrecognized. Quantitative autonomic testing can be an invaluable tool for evaluating these disorders, both in the clinic and in research. A number of autonomic tests exist; however, only a few have been validated clinically or are quantitative. Here, a fully quantitative and clinically validated protocol for testing autonomic function is presented. As a bare minimum, a clinical autonomic laboratory should have a tilt table, an ECG monitor, a continuous noninvasive blood pressure monitor, a respiratory monitor, and a means of evaluating the sudomotor domain. Software for recording and evaluating autonomic tests is critical for correct interpretation of the data. The presented protocol evaluates the three major autonomic domains: cardiovagal, adrenergic, and sudomotor. The tests include deep breathing, the Valsalva maneuver, head-up tilt, and the quantitative sudomotor axon reflex test (QSART). The severity and distribution of dysautonomia are quantified using the Composite Autonomic Severity Score (CASS). A detailed protocol is provided, highlighting essential aspects of testing with emphasis on proper data acquisition, obtaining the relevant parameters, and unbiased evaluation of autonomic signals. Normative data and the CASS algorithm for interpreting results are provided as well.
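The CASS total itself is a simple sum of three domain subscores, which can be sketched as below. The subscore ranges (cardiovagal 0-3, adrenergic 0-4, sudomotor 0-3; total 0-10) follow the published CASS scheme; deriving each subscore from the raw deep-breathing, Valsalva, tilt, and QSART results requires age- and sex-adjusted normative data and is not shown here.

```python
def composite_autonomic_severity_score(cardiovagal, adrenergic, sudomotor):
    """Sum the three CASS domain subscores into a total severity score.

    Assumed subscore ranges (per the published CASS scheme):
      cardiovagal 0-3, adrenergic 0-4, sudomotor 0-3 -> total 0-10.
    """
    if not (0 <= cardiovagal <= 3 and 0 <= adrenergic <= 4 and 0 <= sudomotor <= 3):
        raise ValueError("subscore out of range")
    return cardiovagal + adrenergic + sudomotor

# Hypothetical patient: mild cardiovagal and sudomotor involvement,
# moderate adrenergic involvement.
print(composite_autonomic_severity_score(1, 2, 1))  # 4
```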
Medicine, Issue 53, Deep breathing, Valsalva maneuver, tilt test, sudomotor testing, Composite Autonomic Severity Score, CASS
Non-surgical Intratracheal Instillation of Mice with Analysis of Lungs and Lung Draining Lymph Nodes by Flow Cytometry
Institutions: University of Colorado School of Medicine, National Jewish Health , Colorado State University, National Jewish Health .
Phagocytic cells such as alveolar macrophages and lung dendritic cells (LDCs) continuously sample antigens from the alveolar spaces in the lungs. LDCs, in particular, are known to migrate to the lung draining lymph nodes (LDLNs), where they present inhaled antigens to T cells, initiating an appropriate immune response to a variety of immunogens1,2. To model interactions between the lungs and airborne antigens in mice, antigens can be administered intranasally, instilled intratracheally, or delivered as aerosols6. Delivery by each route involves distinct technical skills and limitations that need to be considered before designing an experiment. For example, intranasal and aerosolized exposure delivers antigens to both the lungs and the upper respiratory tract. Hence antigens can access the nasal-associated lymphoid tissue (NALT)7, potentially complicating interpretation of the results. In addition, swallowing, sneezing, and the breathing rate of the mouse may also lead to inconsistencies in the doses delivered. Although involvement of the upper respiratory tract may be preferred for some studies, it can complicate experiments focusing on events specifically initiated in the lungs. In this setting, the intratracheal (i.t.) route is preferable, as it delivers test materials directly into the lungs and bypasses the NALT. Many i.t. injection protocols involve either blind intubation of the trachea through the oral cavity or surgical exposure of the trachea to access the lungs. Herein, we describe a simple, consistent, non-surgical method for i.t. instillation. The opening of the trachea is visualized using a laryngoscope, and a bent gavage needle is then inserted directly into the trachea to deliver the inoculum. We also describe procedures for harvesting and processing LDLNs and lungs for analysis of antigen trafficking by flow cytometry.
Immunology, Issue 51, Intratracheal, mouse, lungs, lung draining lymph nodes, flow cytometry
Aseptic Laboratory Techniques: Plating Methods
Institutions: University of California, Los Angeles .
Microorganisms are present on all inanimate surfaces creating ubiquitous sources of possible contamination in the laboratory. Experimental success relies on the ability of a scientist to sterilize work surfaces and equipment as well as prevent contact of sterile instruments and solutions with non-sterile surfaces. Here we present the steps for several plating methods routinely used in the laboratory to isolate, propagate, or enumerate microorganisms such as bacteria and phage. All five methods incorporate aseptic technique, or procedures that maintain the sterility of experimental materials. Procedures described include (1) streak-plating bacterial cultures to isolate single colonies, (2) pour-plating and (3) spread-plating to enumerate viable bacterial colonies, (4) soft agar overlays to isolate phage and enumerate plaques, and (5) replica-plating to transfer cells from one plate to another in an identical spatial pattern. These procedures can be performed at the laboratory bench, provided they involve non-pathogenic strains of microorganisms (Biosafety Level 1, BSL-1). If working with BSL-2 organisms, then these manipulations must take place in a biosafety cabinet. Consult the most current edition of the Biosafety in Microbiological and Biomedical Laboratories (BMBL) as well as Material Safety Data Sheets (MSDS) for Infectious Substances to determine the biohazard classification as well as the safety precautions and containment facilities required for the microorganism in question. Bacterial strains and phage stocks can be obtained from research investigators, companies, and collections maintained by particular organizations such as the American Type Culture Collection (ATCC). It is recommended that non-pathogenic strains be used when learning the various plating methods. By following the procedures described in this protocol, students should be able to:
● Perform plating procedures without contaminating media.
● Isolate single bacterial colonies by the streak-plating method.
● Use pour-plating and spread-plating methods to determine the concentration of bacteria.
● Perform soft agar overlays when working with phage.
● Transfer bacterial cells from one plate to another using the replica-plating procedure.
● Given an experimental task, select the appropriate plating method.
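For the enumeration methods (pour- and spread-plating), the viable count is back-calculated from the colony count, the dilution plated, and the volume plated. The sketch below shows the standard CFU calculation with hypothetical counts; the 30-300 colony counting range is the usual rule of thumb.

```python
def cfu_per_ml(colonies, dilution_factor, volume_plated_ml):
    """Viable count from a pour- or spread-plate:
    CFU/mL = colonies / (dilution factor x volume plated).
    Counts are usually taken only from plates bearing 30-300 colonies."""
    return colonies / (dilution_factor * volume_plated_ml)

# Hypothetical example: 150 colonies on a plate spread with 0.1 mL of
# the 10^-6 dilution of an overnight culture.
print(cfu_per_ml(150, 1e-6, 0.1))  # 1.5e9 CFU/mL in the original culture
```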
Basic Protocols, Issue 63, Streak plates, pour plates, soft agar overlays, spread plates, replica plates, bacteria, colonies, phage, plaques, dilutions
Echo Particle Image Velocimetry
Institutions: University of New Hampshire.
The transport of mass, momentum, and energy in fluid flows is ultimately determined by spatiotemporal distributions of the fluid velocity field.1 Consequently, a prerequisite for understanding, predicting, and controlling fluid flows is the capability to measure the velocity field with adequate spatial and temporal resolution.2 For velocity measurements in optically opaque fluids or through optically opaque geometries, echo particle image velocimetry (EPIV) is an attractive diagnostic technique to generate "instantaneous" two-dimensional fields of velocity.3,4,5,6 In this paper, the operating protocol for an EPIV system built by integrating a commercial medical ultrasound machine7 with a PC running commercial particle image velocimetry (PIV) software8 is described, and validation measurements in Hagen-Poiseuille (i.e., laminar pipe) flow are reported.
For the EPIV measurements, a phased array probe connected to the medical ultrasound machine is used to generate a two-dimensional ultrasound image by pulsing the piezoelectric probe elements at different times. Each probe element transmits an ultrasound pulse into the fluid, and tracer particles in the fluid (either naturally occurring or seeded) reflect ultrasound echoes back to the probe, where they are recorded. The amplitude of the reflected ultrasound waves and their time delay relative to transmission are used to create what is known as B-mode (brightness mode) two-dimensional ultrasound images. Specifically, the time delay is used to determine the position of the scatterer in the fluid, and the amplitude is used to assign intensity to the scatterer. The time required to obtain a single B-mode image, δt, is determined by the time it takes to pulse all the elements of the phased array probe. For acquiring multiple B-mode images, the frame rate of the system in frames per second (fps) = 1/δt. (See 9 for a review of ultrasound imaging.)
For a typical EPIV experiment, the frame rate is between 20-60 fps, depending on flow conditions, and 100-1000 B-mode images of the spatial distribution of the tracer particles in the flow are acquired. Once acquired, the B-mode ultrasound images are transmitted via an ethernet connection to the PC running the commercial PIV software. Using the PIV software, tracer particle displacement fields, D(x,y) [pixels] (where x and y denote horizontal and vertical spatial position in the ultrasound image, respectively), are acquired by applying cross correlation algorithms to successive ultrasound B-mode images.10 The velocity fields, u(x,y) [m/s], are determined from the displacement fields, knowing the time step between image pairs, ΔT [s], and the image magnification, M. The time step between images is ΔT = 1/fps + D(x,y)/B, where B [pixels/s] is the rate at which the ultrasound beam sweeps across the image width. In the present study, M = 77 [μm/pixel], fps = 49.5 [1/s], and B = 25,047 [pixels/s]. Once acquired, the velocity fields can be analyzed to compute flow quantities of interest.
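The processing chain above can be sketched in a few lines. The following is a minimal illustration, not the commercial PIV software's implementation: an FFT-based circular cross correlation estimates the integer-pixel displacement of one synthetic interrogation window pair, and the sweep-corrected time step ΔT = 1/fps + D/B converts displacement to velocity using the study's stated values of M, fps, and B.

```python
import numpy as np

# Acquisition parameters stated in this study.
M = 77e-6        # image magnification [m/pixel]
FPS = 49.5       # frame rate [frames/s]
B = 25047.0      # sweep rate of the ultrasound beam [pixels/s]

def window_displacement(win_a, win_b):
    """Integer-pixel displacement of window b relative to window a,
    located at the peak of their FFT-based circular cross correlation."""
    corr = np.fft.ifft2(np.conj(np.fft.fft2(win_a)) * np.fft.fft2(win_b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Undo the circular wrap-around so negative shifts are recovered too.
    return tuple((p + n // 2) % n - n // 2 for p, n in zip(peak, corr.shape))

def velocity(d_pixels):
    """Velocity [m/s] from a displacement [pixels], using the
    sweep-corrected time step dT = 1/fps + D/B."""
    dt = 1.0 / FPS + d_pixels / B
    return M * d_pixels / dt

# Synthetic 32x32 interrogation windows: one "particle" shifted by (3, 5).
a = np.zeros((32, 32)); a[10, 12] = 1.0
b = np.zeros((32, 32)); b[13, 17] = 1.0
dy, dx = window_displacement(a, b)
print(dy, dx)        # 3 5
print(velocity(dx))  # horizontal velocity for a 5-pixel shift, ~0.019 m/s
```

A production PIV code would add subpixel peak fitting, overlapping windows, and outlier validation; this sketch shows only the core correlation and unit conversion.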
Mechanical Engineering, Issue 70, Physics, Engineering, Physical Sciences, Ultrasound, cross correlation, velocimetry, opaque fluids, particle, flow, fluid, EPIV
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3,4,5,6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL, in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing, is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Expired CO2 Measurement in Intubated or Spontaneously Breathing Patients from the Emergency Department
Institutions: Université Catholique de Louvain, Cliniques Universitaires Saint-Luc.
Carbon dioxide (CO2) and oxygen (O2) share the role of being the most important gases in the human body. The measurement of expired CO2 at the mouth has attracted growing clinical interest among physicians in the emergency department for various indications: (1) surveillance and monitoring of the intubated patient; (2) verification of the correct positioning of an endotracheal tube; (3) monitoring of a patient in cardiac arrest; (4) achieving normocapnia in intubated head trauma patients; (5) monitoring ventilation during procedural sedation. The video allows physicians to familiarize themselves with the use of capnography, and the text offers a review of the theory and principles involved. In particular, the importance of CO2 for the organism, the relevance of measuring expired CO2, the differences between arterial and expired CO2, and the material used in capnography, with its artifacts and pitfalls, will be reviewed. Since the main reluctance in the use of expired CO2 measurement is due to a lack of correct knowledge concerning the physiopathology of CO2 on the part of the physician, we hope that this explanation and the accompanying video sequences will help resolve this limitation.
Medicine, Issue 47, capnography, CO2, emergency medicine, end-tidal CO2
Propagation of Human Embryonic Stem (ES) Cells
Institutions: MGH - Massachusetts General Hospital.
Cellular Biology, Issue 1, ES, embryonic stem cells, tissue culture