JoVE Visualize
Pubmed Article
Can an integrated approach reduce child vulnerability to anaemia? Evidence from three African countries.
PUBLISHED: 01-01-2014
Addressing the complex, multi-factorial causes of childhood anaemia is best done through integrated packages of interventions. We hypothesized that due to reduced child vulnerability, a "buffering" of risk associated with known causes of anaemia would be observed among children living in areas benefiting from a community-based health and nutrition program intervention. Cross-sectional data on the nutrition and health status of children 24-59 mo (N=2405) were obtained in 2000 and 2004 from program evaluation surveys in Ghana, Malawi and Tanzania. Linear regression models estimated the association between haemoglobin and immediate, underlying and basic causes of child anaemia and variation in this association between years. Lower haemoglobin levels were observed in children assessed in 2000 compared to 2004 (difference -3.30 g/L), children from Tanzania (-9.15 g/L) and Malawi (-2.96 g/L) compared to Ghana, and the youngest (24-35 mo) compared to oldest age group (48-59 mo; -5.43 g/L). Children who were stunted, malaria positive and recently ill also had lower haemoglobin, independent of age, sex and other underlying and basic causes of anaemia. Despite ongoing morbidity, risk of lower haemoglobin decreased for children with malaria and recent illness, suggesting decreased vulnerability to their anaemia-producing effects. Stunting remained an independent and unbuffered risk factor. Reducing chronic undernutrition is required in order to further reduce child vulnerability and ensure maximum impact of anaemia control programs. Buffering the impact of child morbidity on haemoglobin levels, including malaria, may be achieved in certain settings.
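For readers who want to see the shape of this analysis, the sketch below (Python, with a hypothetical file name and column names) illustrates a linear model of haemoglobin with year-interaction terms, which is one way the "buffering" of a risk factor between survey years could be tested. It is not the authors' actual code.

```python
# A minimal sketch (hypothetical column names) of the kind of linear model
# described in the abstract: haemoglobin regressed on causes of anaemia,
# with year interactions to test for "buffering" of risk between surveys.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey.csv")  # one row per child, 2000 and 2004 pooled

# Main effects plus year interactions; a shrinking coefficient for
# malaria:year or recent_illness:year would indicate buffering.
model = smf.ols(
    "haemoglobin ~ C(year) + C(country) + C(age_group) + sex"
    " + stunted + malaria + recent_illness"
    " + stunted:C(year) + malaria:C(year) + recent_illness:C(year)",
    data=df,
).fit()
print(model.summary())
```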
One of the defining characteristics of autism spectrum disorder (ASD) is difficulty with language and communication.1 The onset of speech in children with ASD is usually delayed, and many children with ASD consistently produce language less frequently and of lower lexical and grammatical complexity than their typically developing (TD) peers.6,8,12,23 However, children with ASD also exhibit a significant social deficit, and researchers and clinicians continue to debate the extent to which the deficits in social interaction account for or contribute to the deficits in language production.5,14,19,25 Standardized assessments of language in children with ASD usually do include a comprehension component; however, many such comprehension tasks assess just one aspect of language (e.g., vocabulary),5 or include a significant motor component (e.g., pointing, act-out), and/or require children to deliberately choose between a number of alternatives. The latter two demands are known to be challenging for children with ASD.7,12,13,16 We present a method that can assess the language comprehension of young typically developing children (9-36 months) and children with autism.2,4,9,11,22 This method, Portable Intermodal Preferential Looking (P-IPL), projects side-by-side video images from a laptop onto a portable screen. The video images are paired first with a 'baseline' (nondirecting) audio, and then presented again paired with a 'test' linguistic audio that matches only one of the video images. Children's eye movements while watching the video are filmed and later coded. Children who understand the linguistic audio will look more quickly to, and longer at, the video that matches it.2,4,11,18,22,26 This paradigm includes a number of components that have recently been miniaturized (projector, camcorder, digitizer) to enable portability and easy setup in children's homes. This is a crucial point for assessing young children with ASD, who are frequently uncomfortable in new (e.g., laboratory) settings. Videos can be created to assess a wide range of specific components of linguistic knowledge, such as Subject-Verb-Object word order, wh-questions, and tense/aspect suffixes on verbs; videos can also assess principles of word learning such as a noun bias, a shape bias, and syntactic bootstrapping.10,14,17,21,24 Videos include characters and speech that are visually and acoustically salient and well tolerated by children with ASD.
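The comprehension measure itself reduces to a proportion-of-looking calculation. The following sketch assumes a hypothetical frame-by-frame coding file and column names; it illustrates the P-IPL logic, not the published coding scheme.

```python
# A minimal sketch (hypothetical data layout) of the core P-IPL measure:
# longer looking to the matching video during the test audio than during
# the baseline audio is taken as evidence of comprehension.
import pandas as pd

# One row per coded video frame: trial, phase ('baseline' or 'test'),
# and which screen the child fixated ('match', 'nonmatch', 'away').
frames = pd.read_csv("coded_frames.csv")

looking = frames[frames["screen"] != "away"]
prop_match = (
    looking.assign(match=looking["screen"].eq("match"))
    .groupby(["trial", "phase"])["match"]
    .mean()
    .unstack("phase")
)
# Comprehension is inferred when test > baseline across trials.
print(prop_match[["baseline", "test"]].mean())
```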
23 Related JoVE Articles!
A Method for Investigating Age-related Differences in the Functional Connectivity of Cognitive Control Networks Associated with Dimensional Change Card Sort Performance
Authors: Bianca DeBenedictis, J. Bruce Morton.
Institutions: University of Western Ontario.
The ability to adjust behavior to sudden changes in the environment develops gradually in childhood and adolescence. For example, in the Dimensional Change Card Sort task, participants switch from sorting cards one way, such as shape, to sorting them a different way, such as color. Adjusting behavior in this way exacts a small performance cost, or switch cost, such that responses are typically slower and more error-prone on switch trials in which the sorting rule changes as compared to repeat trials in which the sorting rule remains the same. The ability to flexibly adjust behavior is often said to develop gradually, in part because behavioral costs such as switch costs typically decrease with increasing age. Why aspects of higher-order cognition, such as behavioral flexibility, develop so gradually remains an open question. One hypothesis is that these changes occur in association with functional changes in broad-scale cognitive control networks. On this view, complex mental operations, such as switching, involve rapid interactions between several distributed brain regions, including those that update and maintain task rules, re-orient attention, and select behaviors. With development, functional connections between these regions strengthen, leading to faster and more efficient switching operations. The current video describes a method of testing this hypothesis through the collection and multivariate analysis of fMRI data from participants of different ages.
Behavior, Issue 87, Neurosciences, fMRI, Cognitive Control, Development, Functional Connectivity
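As a flavor of the two quantities this method relates, the sketch below computes a behavioral switch cost and a region-by-region correlation (functional connectivity) matrix from simulated data; all numbers are hypothetical, not from the study.

```python
# A minimal sketch (simulated data) of the two measures the method relates:
# a behavioral switch cost and the functional connectivity (correlation)
# between cognitive-control regions' fMRI time series.
import numpy as np

rng = np.random.default_rng(0)

# Behavioral switch cost: mean RT on switch trials minus repeat trials.
rt_switch = rng.normal(900, 120, size=40)   # ms, hypothetical
rt_repeat = rng.normal(820, 110, size=40)
switch_cost = rt_switch.mean() - rt_repeat.mean()

# Functional connectivity: correlations among ROI time series
# (rows = time points, columns = regions of interest).
roi_ts = rng.standard_normal((200, 4))
connectivity = np.corrcoef(roi_ts, rowvar=False)

print(f"switch cost: {switch_cost:.0f} ms")
print(connectivity.round(2))
```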
Methods for Quantitative Detection of Antibody-induced Complement Activation on Red Blood Cells
Authors: Elisabeth M. Meulenbroek, Diana Wouters, Sacha Zeerleder.
Institutions: University of Amsterdam, University of Amsterdam.
Antibodies against red blood cells (RBCs) can lead to complement activation resulting in an accelerated clearance via complement receptors in the liver (extravascular hemolysis) or leading to intravascular lysis of RBCs. Alloantibodies (e.g. ABO) or autoantibodies to RBC antigens (as seen in autoimmune hemolytic anemia, AIHA) leading to complement activation are potentially harmful and can be - especially when leading to intravascular lysis - fatal1. Currently, complement activation due to (auto)-antibodies on RBCs is assessed in vitro by using the Coombs test, reflecting complement deposition on RBCs, or by a nonquantitative hemolytic assay, reflecting RBC lysis1-4. However, to assess the efficacy of complement inhibitors, it is mandatory to have quantitative techniques. Here we describe two such techniques. First, an assay to detect C3 and C4 deposition on red blood cells that is induced by antibodies in patient serum is presented. For this, FACS analysis is used with fluorescently labeled anti-C3 or anti-C4 antibodies. Next, a quantitative hemolytic assay is described. In this assay, complement-mediated hemolysis induced by patient serum is measured making use of spectrophotometric detection of the released hemoglobin. Both of these assays are very reproducible and quantitative, facilitating studies of antibody-induced complement activation.
Immunology, Issue 83, Complement, red blood cells, auto-immune hemolytic anemia, hemolytic assay, FACS, antibodies, C1-inhibitor
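The hemolytic read-out reduces to a standard percent-lysis calculation. A minimal sketch, with hypothetical absorbance values and a fully lysed (e.g., water-lysed) control assumed as the 100% reference:

```python
# A minimal sketch of the spectrophotometric read-out described above:
# percent complement-mediated lysis relative to a buffer blank and a
# 100%-lysis control. Values are hypothetical.
def percent_lysis(a_sample: float, a_blank: float, a_total: float) -> float:
    """Released haemoglobin absorbance converted to % lysis."""
    return 100.0 * (a_sample - a_blank) / (a_total - a_blank)

print(percent_lysis(a_sample=0.62, a_blank=0.08, a_total=1.35))  # ~42.5%
```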
Isolation and Quantification of Botulinum Neurotoxin From Complex Matrices Using the BoTest Matrix Assays
Authors: F. Mark Dunning, Timothy M. Piazza, Füsûn N. Zeytin, Ward C. Tucker.
Institutions: BioSentinel Inc., Madison, WI.
Accurate detection and quantification of botulinum neurotoxin (BoNT) in complex matrices is required for pharmaceutical, environmental, and food sample testing. Rapid BoNT testing of foodstuffs is needed during outbreak forensics, patient diagnosis, and food safety testing while accurate potency testing is required for BoNT-based drug product manufacturing and patient safety. The widely used mouse bioassay for BoNT testing is highly sensitive but lacks the precision and throughput needed for rapid and routine BoNT testing. Furthermore, the bioassay's use of animals has resulted in calls by drug product regulatory authorities and animal-rights proponents in the US and abroad to replace the mouse bioassay for BoNT testing. Several in vitro replacement assays have been developed that work well with purified BoNT in simple buffers, but most have not been shown to be applicable to testing in highly complex matrices. Here, a protocol for the detection of BoNT in complex matrices using the BoTest Matrix assays is presented. The assay consists of three parts: The first part involves preparation of the samples for testing, the second part is an immunoprecipitation step using anti-BoNT antibody-coated paramagnetic beads to purify BoNT from the matrix, and the third part quantifies the isolated BoNT's proteolytic activity using a fluorogenic reporter. The protocol is written for high throughput testing in 96-well plates using both liquid and solid matrices and requires about 2 hr of manual preparation with total assay times of 4-26 hr depending on the sample type, toxin load, and desired sensitivity. Data are presented for BoNT/A testing with phosphate-buffered saline, a drug product, culture supernatant, 2% milk, and fresh tomatoes and includes discussion of critical parameters for assay success.
Neuroscience, Issue 85, Botulinum, food testing, detection, quantification, complex matrices, BoTest Matrix, Clostridium, potency testing
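Quantification against a standard curve is typically done with a four-parameter logistic fit. The sketch below uses simulated standards and illustrative units; the actual BoTest calibration constants are not shown here.

```python
# A minimal sketch (simulated values) of quantifying toxin activity from
# a fluorogenic read-out: fit a four-parameter logistic standard curve,
# then interpolate unknowns against it.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, bottom, top, ec50, hill):
    return bottom + (top - bottom) / (1.0 + (ec50 / x) ** hill)

toxin = np.array([0.1, 0.3, 1, 3, 10, 30, 100])   # pM, standards
signal = np.array([5, 9, 22, 48, 80, 95, 99])     # % reporter cleaved

params, _ = curve_fit(four_pl, toxin, signal, p0=[0, 100, 5, 1], maxfev=10000)
bottom, top, ec50, hill = params
print(f"EC50 ~ {ec50:.1f} pM")
```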
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
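As a flavor of the DoE workflow, the sketch below builds a small full-factorial design and indicates in comments how a response-surface model could be fit; the factor names and levels are illustrative, not the factors used in the study.

```python
# A minimal sketch of the DoE idea: enumerate a factorial design over a
# few hypothetical factors, run the experiments, then fit a model to the
# measured expression levels.
import itertools
import pandas as pd
import statsmodels.formula.api as smf

levels = {"promoter": [-1, 1], "leaf_age": [-1, 0, 1], "temp": [-1, 0, 1]}
design = pd.DataFrame(
    list(itertools.product(*levels.values())), columns=list(levels.keys())
)

# After running the experiments, append the measured response, e.g.:
# design["dsred"] = measured_fluorescence
# model = smf.ols("dsred ~ promoter + leaf_age + temp + leaf_age:temp"
#                 " + I(leaf_age**2)", data=design).fit()
print(design.head())
```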
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Authors: Daniel T. Claiborne, Jessica L. Prince, Eric Hunter.
Institutions: Emory University, Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized that this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated at acute time points from subtype C-infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it better suited to the study of subtype C sequences than previous recombination-based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
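Replication capacity is commonly summarized as the slope of log-transformed virus output during exponential spread. A minimal sketch with hypothetical p24 values (the study's actual read-out and normalization may differ):

```python
# A minimal sketch (hypothetical numbers) of one common way to summarize
# replication capacity: the slope of log10 virus output (e.g., p24 or
# reverse transcriptase activity) over the days of exponential spread.
import numpy as np

days = np.array([2, 4, 6, 8])
p24 = np.array([150.0, 900.0, 5200.0, 30000.0])  # pg/ml, hypothetical

slope, intercept = np.polyfit(days, np.log10(p24), 1)
print(f"replication slope: {slope:.2f} log10 p24/day")
# Slopes are typically normalized to a reference virus (e.g., unmodified
# MJ4) grown in the same experiment.
```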
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Authors: Sara Tremblay, Vincent Beaulé, Sébastien Proulx, Louis-Philippe Lafleur, Julien Doyon, Małgorzata Marjańska, Hugo Théoret.
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood 33. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner 41. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration 34. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunctions after stroke, which consists of bilateral stimulation of primary motor cortices 27,30,31. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
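The end-point comparison is a simple percent change in metabolite levels before and after stimulation. A minimal sketch with hypothetical GABA/creatine ratios (the study's actual quantification pipeline is more involved):

```python
# A minimal sketch (hypothetical values) of the end-point measure: percent
# change in a metabolite level (referenced to creatine) after stimulation.
def percent_change(pre: float, post: float) -> float:
    return 100.0 * (post - pre) / pre

gaba_cr_pre, gaba_cr_post = 0.082, 0.071   # GABA/Cr ratios, hypothetical
print(f"GABA change: {percent_change(gaba_cr_pre, gaba_cr_post):+.1f}%")
```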
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
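For the source reconstruction step, a minimum-norm estimate (named in the keywords) amounts to a regularized pseudo-inverse of the lead field derived from the head model. A minimal numerical sketch with random matrices and identity source/noise covariances assumed:

```python
# A minimal sketch (random matrices) of the minimum-norm estimate:
# sources J are recovered from channel data y via a regularized
# pseudo-inverse of the lead field L computed from the head model.
import numpy as np

rng = np.random.default_rng(0)
n_chan, n_src = 128, 5000
L = rng.standard_normal((n_chan, n_src))   # lead field from the head model
y = rng.standard_normal(n_chan)            # EEG at one time point
lam = 0.1                                  # regularization parameter

# J_hat = L^T (L L^T + lambda^2 I)^(-1) y, assuming identity covariances.
J_hat = L.T @ np.linalg.solve(L @ L.T + lam**2 * np.eye(n_chan), y)
print(J_hat.shape)
```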
Construction of Vapor Chambers Used to Expose Mice to Alcohol During the Equivalent of all Three Trimesters of Human Development
Authors: Russell A. Morton, Marvin R. Diaz, Lauren A. Topper, C. Fernando Valenzuela.
Institutions: University of New Mexico Health Sciences Center.
Exposure to alcohol during development can result in a constellation of morphological and behavioral abnormalities that are collectively known as Fetal Alcohol Spectrum Disorders (FASDs). At the most severe end of the spectrum is Fetal Alcohol Syndrome (FAS), characterized by growth retardation, craniofacial dysmorphology, and neurobehavioral deficits. Studies with animal models, including rodents, have elucidated many molecular and cellular mechanisms involved in the pathophysiology of FASDs. Ethanol administration to pregnant rodents has been used to model human exposure during the first and second trimesters of pregnancy. Third trimester ethanol consumption in humans has been modeled using neonatal rodents. However, few rodent studies have characterized the effect of ethanol exposure during the equivalent to all three trimesters of human pregnancy, a pattern of exposure that is common in pregnant women. Here, we show how to build vapor chambers from readily obtainable materials that can each accommodate up to six standard mouse cages. We describe a vapor chamber paradigm that can be used to model exposure to ethanol, with minimal handling, during all three trimesters. Our studies demonstrate that pregnant dams developed significant metabolic tolerance to ethanol. However, neonatal mice did not develop metabolic tolerance and the number of fetuses, fetus weight, placenta weight, number of pups/litter, number of dead pups/litter, and pup weight were not significantly affected by ethanol exposure. An important advantage of this paradigm is its applicability to studies with genetically-modified mice. Additionally, this paradigm minimizes handling of animals, a major confound in fetal alcohol research.
Medicine, Issue 89, fetal, ethanol, exposure, paradigm, vapor, development, alcoholism, teratogenic, animal, mouse, model
Ultrasound Assessment of Endothelial-Dependent Flow-Mediated Vasodilation of the Brachial Artery in Clinical Research
Authors: Hugh Alley, Christopher D. Owens, Warren J. Gasper, S. Marlene Grenon.
Institutions: University of California, San Francisco, Veterans Affairs Medical Center, San Francisco, Veterans Affairs Medical Center, San Francisco.
The vascular endothelium is a monolayer of cells that covers the interior of blood vessels and provides both structural and functional roles. The endothelium acts as a barrier, preventing leukocyte adhesion and aggregation, as well as controlling permeability to plasma components. Functionally, the endothelium affects vessel tone. Endothelial dysfunction is an imbalance between the chemical species that regulate vessel tone, thromboresistance, cellular proliferation, and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia. The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase of intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed, while subjects with endothelial damage experienced paradoxical vasoconstriction. There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia. This technique, known as endothelium-dependent flow-mediated vasodilation (FMD), has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results, and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator dependent and presents a steep learning curve. This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
Medicine, Issue 92, endothelial function, endothelial dysfunction, brachial artery, peripheral artery disease, ultrasound, vascular, endothelium, cardiovascular disease.
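The core FMD computation is the percent diameter change from baseline to peak hyperemia. A minimal sketch with hypothetical diameters:

```python
# A minimal sketch of the FMD computation: percent change in brachial
# artery diameter from baseline to peak hyperaemia. Values hypothetical.
def fmd_percent(baseline_mm: float, peak_mm: float) -> float:
    return 100.0 * (peak_mm - baseline_mm) / baseline_mm

print(f"FMD = {fmd_percent(baseline_mm=4.10, peak_mm=4.38):.1f}%")  # ~6.8%
```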
Measuring Attentional Biases for Threat in Children and Adults
Authors: Vanessa LoBue.
Institutions: Rutgers University.
Investigators have long been interested in the human propensity for the rapid detection of threatening stimuli. However, until recently, research in this domain has focused almost exclusively on adult participants, completely ignoring the topic of threat detection over the course of development. One of the biggest reasons for the lack of developmental work in this area is likely the absence of a reliable paradigm that can measure perceptual biases for threat in children. To address this issue, we recently designed a modified visual search paradigm similar to the standard adult paradigm that is appropriate for studying threat detection in preschool-aged participants. Here we describe this new procedure. In the general paradigm, we present participants with matrices of color photographs, and ask them to find and touch a target on the screen. Latency to touch the target is recorded. Using a touch-screen monitor makes the procedure simple and easy, allowing us to collect data in participants ranging from 3 years of age to adults. Thus far, the paradigm has consistently shown that both adults and children detect threatening stimuli (e.g., snakes, spiders, angry/fearful faces) more quickly than neutral stimuli (e.g., flowers, mushrooms, happy/neutral faces). Altogether, this procedure provides an important new tool for researchers interested in studying the development of attentional biases for threat.
Behavior, Issue 92, Detection, threat, attention, attentional bias, anxiety, visual search
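The paradigm's key contrast is touch latency for threatening versus neutral targets. A minimal sketch with simulated latencies (a paired t-test is one common analysis choice, not necessarily the authors'):

```python
# A minimal sketch (simulated latencies) of the paradigm's key contrast:
# latency to touch threatening vs. neutral targets.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lat_threat = rng.normal(1800, 300, size=24)    # ms, hypothetical
lat_neutral = rng.normal(2100, 300, size=24)

t, p = stats.ttest_rel(lat_threat, lat_neutral)
print(f"threat {lat_threat.mean():.0f} ms vs neutral "
      f"{lat_neutral.mean():.0f} ms (t={t:.2f}, p={p:.3f})")
```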
Viability Assays for Cells in Culture
Authors: Jessica M. Posimo, Ajay S. Unnithan, Amanda M. Gleixner, Hailey J. Choi, Yiran Jiang, Sree H. Pulugulla, Rehana K. Leak.
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16-bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
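The linearity claim can be checked by regressing assay signal on the number of cells plated. A minimal sketch with hypothetical plate readings:

```python
# A minimal sketch (hypothetical readings) of the linearity check described
# above: regress assay signal on cells plated and inspect R^2.
import numpy as np
from scipy import stats

cells = np.array([5000, 10000, 20000, 40000, 80000])
signal = np.array([1.1e4, 2.3e4, 4.4e4, 9.1e4, 1.8e5])  # a.u.

fit = stats.linregress(cells, signal)
print(f"R^2 = {fit.rvalue**2:.3f}, slope = {fit.slope:.2f} a.u./cell")
```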
Determining Soil-transmitted Helminth Infection Status and Physical Fitness of School-aged Children
Authors: Peiling Yap, Thomas Fürst, Ivan Müller, Susi Kriemler, Jürg Utzinger, Peter Steinmann.
Institutions: Swiss Tropical and Public Health Institute, Basel, Switzerland, University of Basel, Basel, Switzerland.
Soil-transmitted helminth (STH) infections are common. Indeed, more than 1 billion people are affected, mainly in the developing world where poverty prevails and hygiene behavior, water supply, and sanitation are often deficient1,2. Ascaris lumbricoides, Trichuris trichiura, and the two hookworm species, Ancylostoma duodenale and Necator americanus, are the most prevalent STHs3. The estimated global burden due to hookworm disease, ascariasis, and trichuriasis is 22.1, 10.5, and 6.4 million disability-adjusted life years (DALYs), respectively4. Furthermore, an estimated 30-100 million people are infected with Strongyloides stercoralis, the most neglected STH species of global significance, which arguably also causes a considerable public health impact5,6. Multiple-species infections (i.e., different STHs harbored in a single individual) are common, and infections have been linked to lowered productivity and thus the economic outlook of developing countries1,3. For the diagnosis of common STHs, the World Health Organization (WHO) recommends the Kato-Katz technique7,8, which is a relatively straightforward method for determining the prevalence and intensity of such infections. It facilitates the detection of parasite eggs that infected subjects pass in their feces. With regard to the diagnosis of S. stercoralis, there is currently no simple and accurate tool available. The Baermann technique is the most widely employed method for its diagnosis. The principle behind the Baermann technique is that active S. stercoralis larvae migrate out of an illuminated fresh fecal sample as the larvae are phototactic9. It requires less sophisticated laboratory materials and is less time consuming than culture and immunological methods5. Morbidities associated with STH infections range from acute but common symptoms, such as abdominal pain, diarrhea, and pruritus, to chronic symptoms, such as anemia, under- and malnutrition, and cognitive impairment10. Since the symptoms are generally unspecific and subtle, they often go unnoticed, are considered a normal condition by affected individuals, or are treated as symptoms of other diseases that might be more common in a given setting. Hence, it is conceivable that the true burden of STH infections is underestimated by assessment tools relying on self-declared signs and symptoms, as is usually the case in population-based surveys. In the late 1980s and early 1990s, Stephenson and colleagues highlighted the possibility of STH infections lowering the physical fitness of boys aged 6-12 years11,12. This line of scientific inquiry gained new momentum recently13,14,15. The 20-meter (m) shuttle run test was developed and validated by Léger et al.16 and is used worldwide to measure the aerobic fitness of children17. The test is easy to standardize and can be performed wherever a 20-m long and flat running course and an audio source are available, making its use attractive in resource-constrained settings13. To facilitate and standardize attempts at assessing whether STH infections have an effect on the physical fitness of school-aged children, we present methodologies for diagnosing STH infections and measuring physical fitness that are simple to execute and yet provide accurate and reproducible outcomes. This will help to generate new evidence regarding the health impact of STH infections.
Infection, Issue 66, Immunology, Medicine, Infectious Diseases, Soil-transmitted helminths, physical fitness, Kato-Katz technique, Baermann technique, 20-meter shuttle run test, children
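Two of the quantitative read-outs reduce to simple formulas. The sketch below assumes the standard 41.7 mg Kato-Katz template (multiplier of 24) and the Léger et al. shuttle-run equation for children; both are assumptions to be checked against the cited protocols.

```python
# A minimal sketch of two read-outs: Kato-Katz eggs per gram (EPG) and
# estimated VO2max from the final 20-m shuttle run speed. The multiplier
# of 24 assumes a 41.7 mg template; the VO2max formula is the Léger et al.
# equation for ages 6-17 (speed in km/h, age in years).
def eggs_per_gram(egg_count: int, multiplier: int = 24) -> int:
    return egg_count * multiplier

def vo2max_leger(speed_kmh: float, age_yr: float) -> float:
    return 31.025 + 3.238 * speed_kmh - 3.248 * age_yr \
        + 0.1536 * speed_kmh * age_yr

print(eggs_per_gram(18))                 # 432 EPG
print(round(vo2max_leger(10.0, 11), 1))  # ml O2/kg/min at the final stage
```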
A Novel Rescue Technique for Difficult Intubation and Difficult Ventilation
Authors: Maria M. Zestos, Dima Daaboul, Zulfiqar Ahmed, Nasser Durgham, Roland Kaddoum.
Institutions: Children’s Hospital of Michigan, St. Jude Children’s Research Hospital.
We describe a novel non-surgical technique to maintain oxygenation and ventilation in a case of difficult intubation and difficult ventilation, which works especially well with poor mask fit. "Cannot intubate, cannot ventilate" (CICV) is a potentially life-threatening situation. In this video we present a simulation of the technique we used in a case of CICV where oxygenation and ventilation were maintained by inserting an endotracheal tube (ETT) nasally down to the level of the naso-pharynx while sealing the mouth and nares for successful positive pressure ventilation. A 13-year-old patient was taken to the operating room for incision and drainage of a neck abscess and direct laryngobronchoscopy. After preoxygenation, anesthesia was induced intravenously. Mask ventilation was found to be extremely difficult because of the swelling of the soft tissue. The face mask could not fit properly on the face due to significant facial swelling as well. A direct laryngoscopy was attempted with no visualization of the larynx. Oxygen saturation was difficult to maintain, with saturations falling to 80%. In order to oxygenate and ventilate the patient, an endotracheal tube was then inserted nasally after nasal spray with nasal decongestant and lubricant. The tube was pushed gently and blindly into the hypopharynx. The mouth and nose of the patient were sealed by hand, and positive pressure ventilation with 100% O2 was possible, with good oxygen saturation during that period. Once the patient was stable and well sedated, a rigid bronchoscope was introduced by the otolaryngologist, showing extensive subglottic and epiglottic edema and a mass effect from the abscess, contributing to the airway compromise. The airway was then secured with an ETT by the otolaryngologist. This video shows a simulation of the technique on a patient undergoing general anesthesia for dental restorations.
Medicine, Issue 47, difficult ventilation, difficult intubation, nasal, saturation
Measurement Of Neuromagnetic Brain Function In Pre-school Children With Custom Sized MEG
Authors: Graciela Tesan, Blake W. Johnson, Melanie Reid, Rosalind Thornton, Stephen Crain.
Institutions: Macquarie University.
Magnetoencephalography (MEG) is a technique that detects magnetic fields associated with cortical activity [1]. The electrophysiological activity of the brain generates electric fields - that can be recorded using electroencephalography (EEG) - and their concomitant magnetic fields - detected by MEG. MEG signals are detected by specialized sensors known as superconducting quantum interference devices (SQUIDs). Superconducting sensors require cooling with liquid helium at -270 °C. They are contained inside a vacuum-insulated helmet called a dewar, which is filled with liquid helium. SQUIDs are placed in fixed positions inside the helmet dewar in the helium coolant, and a subject's head is placed inside the helmet dewar for MEG measurements. The helmet dewar must be sized to satisfy opposing constraints. Clearly, it must be large enough to fit most or all of the heads in the population that will be studied. However, the helmet must also be small enough to keep most of the SQUID sensors within range of the tiny cerebral fields that they are to measure. Conventional whole-head MEG systems are designed to accommodate more than 90% of adult heads. However, adult systems are not well suited for measuring brain function in pre-school children, whose heads have a radius several cm smaller than adults'. The KIT-Macquarie Brain Research Laboratory at Macquarie University uses a MEG system custom sized to fit the heads of pre-school children. This child system has 64 first-order axial gradiometers with a 50 mm baseline[2] and is contained inside a magnetically-shielded room (MSR) together with a conventional adult-sized MEG system [3,4]. There are three main advantages of the customized helmet dewar for studying children. First, the smaller radius of the sensor configuration brings the SQUID sensors into range of the neuromagnetic signals of children's heads. Second, the smaller helmet allows full insertion of a child's head into the dewar. Full insertion is prevented in adult dewar helmets because of the smaller crown-to-shoulder distance in children. These two factors are fundamental in recording brain activity using MEG because neuromagnetic signals attenuate rapidly with distance. Third, the customized child helmet aids in the symmetric positioning of the head and limits the freedom of movement of the child's head within the dewar. When used with a protocol that aligns the requirements of data collection with the motivational and behavioral capacities of children, these features significantly facilitate setup, positioning, and measurement of MEG signals.
Neuroscience, Issue 36, Magnetoencephalography, Pediatrics, Brain Mapping, Language, Brain Development, Cognitive Neuroscience, Language Acquisition, Linguistics
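The distance argument can be made concrete with a simple point-dipole approximation (an assumption for illustration, not a calculation from the paper): field strength falls off roughly as 1/r², so a child-sized helmet yields a substantial relative signal gain.

```python
# A minimal sketch of why helmet fit matters: under a point-dipole
# approximation the field falls off roughly as 1/r^2, so moving sensors
# 2 cm closer to a child's cortex gives a large relative gain.
adult_gap_m = 0.09   # source-to-sensor distance, hypothetical
child_gap_m = 0.07   # same source in a child-sized helmet

gain = (adult_gap_m / child_gap_m) ** 2
print(f"relative signal gain: {gain:.2f}x")  # ~1.65x
```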
Making Sense of Listening: The IMAP Test Battery
Authors: Johanna G. Barry, Melanie A. Ferguson, David R. Moore.
Institutions: MRC Institute of Hearing Research, National Biomedical Research Unit in Hearing.
The ability to hear is only the first step towards making sense of the range of information contained in an auditory signal. Of equal importance are the abilities to extract and use the information encoded in the auditory signal. We refer to these as listening skills (or auditory processing, AP). Deficits in these skills are associated with delayed language and literacy development, though the nature of the relevant deficits and their causal connection with these delays is hotly debated. When a child is referred to a health professional with normal hearing and unexplained difficulties in listening, or associated delays in language or literacy development, they should ideally be assessed with a combination of psychoacoustic (AP) tests, suitable for children and for use in a clinic, together with cognitive tests to measure attention, working memory, IQ, and language skills. Such a detailed examination needs to be relatively short and within the technical capability of any suitably qualified professional. Current tests for the presence of AP deficits tend to be poorly constructed and inadequately validated within the normal population. They have little or no reference to the presenting symptoms of the child, and typically include a linguistic component. Poor performance may thus reflect problems with language rather than with AP. To assist in the assessment of children with listening difficulties, pediatric audiologists need a single, standardized child-appropriate test battery based on the use of language-free stimuli. We present the IMAP test battery, which was developed at the MRC Institute of Hearing Research to supplement tests currently used to investigate cases of suspected AP deficits. IMAP assesses a range of relevant auditory and cognitive skills and takes about one hour to complete. It has been standardized in 1500 normally-hearing children from across the UK, aged 6-11 years. Since its development, it has been successfully used in a number of large-scale studies both in the UK and the USA. IMAP provides measures for separating out sensory from cognitive contributions to hearing. It further limits confounds due to procedural effects by presenting tests in a child-friendly game format. Stimulus generation, management of test protocols, and control of test presentation are mediated by the IHR-STAR software platform. This provides a standardized methodology for a range of applications and ensures replicable procedures across testers. IHR-STAR provides a flexible, user-programmable environment that currently has additional applications for hearing screening, mapping cochlear implant electrodes, and academic research or teaching.
Neuroscience, Issue 44, Listening skills, auditory processing, auditory psychophysics, clinical assessment, child-friendly testing
Assessment of Cerebral Lateralization in Children using Functional Transcranial Doppler Ultrasound (fTCD)
Authors: Dorothy V. M. Bishop, Nicholas A. Badcock, Georgina Holt.
Institutions: University of Oxford.
There are many unanswered questions about cerebral lateralization. In particular, it remains unclear which aspects of language and nonverbal ability are lateralized, whether there are any disadvantages associated with atypical patterns of cerebral lateralization, and whether cerebral lateralization develops with age. In the past, researchers interested in these questions tended to use handedness as a proxy measure for cerebral lateralization, but this is unsatisfactory because handedness is only a weak and indirect indicator of laterality of cognitive functions1. Other methods, such as fMRI, are expensive for large-scale studies, and not always feasible with children2. Here we describe the use of functional transcranial Doppler ultrasound (fTCD) as a cost-effective, non-invasive and reliable method for assessing cerebral lateralization. The procedure involves measuring blood flow in the middle cerebral artery via an ultrasound probe placed just in front of the ear. Our work builds on that of Rune Aaslid, who co-introduced TCD in 1982, and of Stefan Knecht, Michael Deppe and their colleagues at the University of Münster, who pioneered the use of simultaneous measurements of left and right middle cerebral artery blood flow, and devised a method of correcting for heartbeat activity. This made it possible to see a clear increase in left-sided blood flow during language generation, with lateralization agreeing well with that obtained using other methods3. The middle cerebral artery has a very wide vascular territory (see Figure 1) and the method does not provide useful information about localization within a hemisphere. Our experience suggests it is particularly sensitive to tasks that involve explicit or implicit speech production. The 'gold standard' task is a word generation task (e.g. think of as many words as you can that begin with the letter 'B')4, but this is not suitable for young children and others with limited literacy skills. Compared with other brain imaging methods, fTCD is relatively unaffected by movement artefacts from speaking, and so we are able to get a reliable result from tasks that involve describing pictures aloud5,6. Accordingly, we have developed a child-friendly task that involves looking at video-clips that tell a story, and then describing what was seen.
Neuroscience, Issue 43, functional transcranial Doppler ultrasound, cerebral lateralization, language, child
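The laterality measure itself is straightforward once the left and right velocity envelopes have been heart-cycle corrected and normalized. A minimal sketch with simulated envelopes and a hypothetical period of interest:

```python
# A minimal sketch (simulated envelopes) of the fTCD laterality measure:
# after heart-cycle correction and normalization, the left-right difference
# in blood flow velocity is averaged over the period of interest.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(0, 20, 0.1)                    # s, one epoch
task_on = (t > 5) & (t < 15)                 # hypothetical task window
left = 100 + 3.0 * task_on + rng.normal(0, 1, t.size)
right = 100 + 1.0 * task_on + rng.normal(0, 1, t.size)

li = np.mean(left[task_on] - right[task_on])
print(f"laterality index: {li:.2f} (positive = left-hemisphere dominance)")
```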
Assessment and Evaluation of the High Risk Neonate: The NICU Network Neurobehavioral Scale
Authors: Barry M. Lester, Lynne Andreozzi-Fontaine, Edward Tronick, Rosemarie Bigsby.
Institutions: Brown University, Women & Infants Hospital of Rhode Island, University of Massachusetts, Boston.
There has been a long-standing interest in the assessment of the neurobehavioral integrity of the newborn infant. The NICU Network Neurobehavioral Scale (NNNS) was developed as an assessment for the at-risk infant. These are infants who are at increased risk for poor developmental outcome because of insults during prenatal development, such as substance exposure or prematurity, or because of factors such as poverty, poor nutrition, or lack of prenatal care that can have adverse effects on the intrauterine environment and affect the developing fetus. The NNNS assesses the full range of infant neurobehavioral performance, including neurological integrity, behavioral functioning, and signs of stress/abstinence. The NNNS is a noninvasive neonatal assessment tool with demonstrated validity as a predictor, not only of medical outcomes such as cerebral palsy diagnosis, neurological abnormalities, and diseases with risks to the brain, but also of developmental outcomes such as mental and motor functioning, behavior problems, school readiness, and IQ. The NNNS can identify infants at high risk for abnormal developmental outcome and is an important clinical tool that enables medical researchers and health practitioners to identify these infants and develop intervention programs to optimize their development as early as possible. The video shows the NNNS procedures, examples of normal and abnormal performance, and the various clinical populations in which the exam can be used.
Behavior, Issue 90, NICU Network Neurobehavioral Scale, NNNS, High risk infant, Assessment, Evaluation, Prediction, Long term outcome
Derivation of Glial Restricted Precursors from E13 mice
Authors: André W. Phillips, Sina Falahati, Roshi DeSilva, Irina Shats, Joel Marx, Edwin Arauz, Douglas A. Kerr, Jeffrey D. Rothstein, Michael V. Johnston, Ali Fatemi.
Institutions: Johns Hopkins University, Johns Hopkins School of Medicine, University of Maryland , Biogen Idec, Johns Hopkins School of Medicine, Johns Hopkins School of Medicine.
This is a protocol for the derivation of glial restricted precursor (GRP) cells from the spinal cord of E13 mouse fetuses. These cells are early precursors within the oligodendrocytic cell lineage. Recently, these cells have been studied as a potential source for restorative therapies in white matter diseases. Periventricular leukomalacia (PVL) is the leading cause of non-genetic white matter disease in childhood and affects up to 50% of extremely premature infants. The data suggest a heightened susceptibility of the developing brain to hypoxia-ischemia, oxidative stress, and excitotoxicity that selectively targets nascent white matter. Glial restricted precursors (GRP), oligodendrocyte progenitor cells (OPC), and immature oligodendrocytes (preOL) seem to be key players in the development of PVL and are the subject of continuing studies. Furthermore, previous studies have identified a subset of CNS tissue with increased susceptibility to glutamate excitotoxicity, as well as a developmental pattern to this susceptibility. Our laboratory is currently investigating the role of oligodendrocyte progenitors in PVL and uses cells at the GRP stage of development. We utilize these derived GRP cells in several experimental paradigms to test their response to select stresses consistent with PVL. GRP cells can be manipulated in vitro into OPCs and preOL for transplantation experiments with mouse PVL models and for in vitro models of PVL-like insults, including hypoxia-ischemia. Using cultured cells and in vitro studies reduces variability between experiments, which facilitates interpretation of the data. Cultured cells also allow for enrichment of the GRP population while minimizing the impact of contaminating cells of non-GRP phenotype.
Neuroscience, Issue 64, Physiology, Medicine, periventricular leukomalacia, oligodendrocytes, glial restricted precursors, spinal cord, cell culture
Eye Tracking Young Children with Autism
Authors: Noah J. Sasson, Jed T. Elison.
Institutions: University of Texas at Dallas, University of North Carolina at Chapel Hill.
The rise of accessible commercial eye-tracking systems has fueled a rapid increase in their use in psychological and psychiatric research. By providing a direct, detailed and objective measure of gaze behavior, eye-tracking has become a valuable tool for examining abnormal perceptual strategies in clinical populations and has been used to identify disorder-specific characteristics1, promote early identification2, and inform treatment3. In particular, investigators of autism spectrum disorders (ASD) have benefited from integrating eye-tracking into their research paradigms4-7. Eye-tracking has largely been used in these studies to reveal mechanisms underlying impaired task performance8 and abnormal brain functioning9, particularly during the processing of social information1,10-11. While older children and adults with ASD comprise the preponderance of research in this area, eye-tracking may be especially useful for studying young children with the disorder as it offers a non-invasive tool for assessing and quantifying early-emerging developmental abnormalities2,12-13. Implementing eye-tracking with young children with ASD, however, is associated with a number of unique challenges, including issues with compliant behavior resulting from specific task demands and disorder-related psychosocial considerations. In this protocol, we detail methodological considerations for optimizing research design, data acquisition and psychometric analysis while eye-tracking young children with ASD. The provided recommendations are also designed to be more broadly applicable for eye-tracking children with other developmental disabilities. By offering guidelines for best practices in these areas based upon lessons derived from our own work, we hope to help other investigators make sound research design and analysis choices while avoiding common pitfalls that can compromise data acquisition while eye-tracking young children with ASD or other developmental difficulties.
Medicine, Issue 61, eye tracking, autism, neurodevelopmental disorders, toddlers, perception, attention, social cognition
Making MR Imaging Child's Play - Pediatric Neuroimaging Protocol, Guidelines and Procedure
Authors: Nora M. Raschle, Michelle Lee, Roman Buechler, Joanna A. Christodoulou, Maria Chang, Monica Vakil, Patrice L. Stering, Nadine Gaab.
Institutions: Children’s Hospital Boston, University of Zurich, Harvard, Harvard Medical School.
Within the last decade there has been an increase in the use of structural and functional magnetic resonance imaging (fMRI) to investigate the neural basis of human perception, cognition and behavior 1, 2. Moreover, this non-invasive imaging method has grown into a tool for clinicians and researchers to explore typical and atypical brain development. Although advances in neuroimaging tools and techniques are apparent, (f)MRI in young pediatric populations remains relatively infrequent 2. Practical as well as technical challenges when imaging children present clinicians and research teams with a unique set of problems 3, 2. To name just a few, the child participants are challenged by a need for motivation, alertness and cooperation. Anxiety may be an additional factor to be addressed. Researchers or clinicians need to consider time constraints, movement restriction, scanner background noise and unfamiliarity with the MR scanner environment2,4-10. A progressive use of functional and structural neuroimaging in younger age groups, however, could further add to our understanding of brain development. As an example, several research groups are currently working towards early detection of developmental disorders, potentially even before children present associated behavioral characteristics e.g.11. Various strategies and techniques have been reported as a means to ensure comfort and cooperation of young children during neuroimaging sessions. Play therapy 12, behavioral approaches 13, 14,15, 16-18 and simulation 19, the use of mock scanner areas 20,21, basic relaxation 22 and a combination of these techniques 23 have all been shown to improve the participant's compliance and thus MRI data quality. Even more importantly, these strategies have proven to increase the comfort of families and children involved 12. One of the main advantages of such techniques for clinical practice is the possibility of avoiding sedation or general anesthesia (GA) as a way to manage children's compliance during MR imaging sessions 19,20. In the current video report, we present a pediatric neuroimaging protocol with guidelines and procedures that have proven successful to date in young children.
Neuroscience, Issue 29, fMRI, imaging, development, children, pediatric neuroimaging, cognitive development, magnetic resonance imaging, pediatric imaging protocol, patient preparation, mock scanner
Building a Better Mosquito: Identifying the Genes Enabling Malaria and Dengue Fever Resistance in A. gambiae and A. aegypti Mosquitoes
Authors: George Dimopoulos.
Institutions: Johns Hopkins University.
In this interview, George Dimopoulos focuses on the physiological mechanisms used by mosquitoes to combat Plasmodium falciparum and dengue virus infections. Explanation is given for how key refractory genes, those genes conferring resistance to vector pathogens, are identified in the mosquito and how this knowledge can be used to generate transgenic mosquitoes that are unable to carry the malaria parasite or dengue virus.
Cellular Biology, Issue 5, Translational Research, mosquito, malaria, virus, dengue, genetics, injection, RNAi, transgenesis, transgenic
Population Replacement Strategies for Controlling Vector Populations and the Use of Wolbachia pipientis for Genetic Drive
Authors: Jason Rasgon.
Institutions: Johns Hopkins University.
In this video, Jason Rasgon discusses population replacement strategies to control vector-borne diseases such as malaria and dengue. "Population replacement" is the replacement of wild vector populations (that are competent to transmit pathogens) with those that are not competent to transmit pathogens. There are several theoretical strategies to accomplish this. One is to exploit the maternally-inherited symbiotic bacterium Wolbachia pipientis. Wolbachia is a widespread reproductive parasite that spreads in a selfish manner at the expense of its host's fitness. Jason Rasgon discusses, in detail, the basic biology of this bacterial symbiont and various ways to use it for control of vector-borne diseases.
Cellular Biology, Issue 5, mosquito, malaria, genetics, infectious disease, Wolbachia

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms display the videos with the most relevant content available, which can sometimes result in matches that are only loosely related.
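JoVE has not published its matching algorithm, but a standard baseline for this kind of abstract-to-video matching is TF-IDF vectors compared by cosine similarity, as in the sketch below (illustrative titles only, not JoVE's actual pipeline):

```python
# A minimal sketch (not JoVE's actual algorithm) of one standard way to
# rank videos against an abstract: TF-IDF vectors and cosine similarity.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

abstract = "childhood anaemia haemoglobin malaria stunting intervention"
video_texts = [
    "quantifying antibody-induced complement activation on red blood cells",
    "soil-transmitted helminth infection and physical fitness in children",
    "transient protein expression in tobacco",
]

tfidf = TfidfVectorizer().fit(video_texts + [abstract])
scores = cosine_similarity(
    tfidf.transform([abstract]), tfidf.transform(video_texts)
)[0]
for score, title in sorted(zip(scores, video_texts), reverse=True):
    print(f"{score:.2f}  {title}")
```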