JoVE Visualize
Related JoVE Video
 
Pubmed Article
High Incorrect Use of the Standard Error of the Mean (SEM) in Original Articles in Three Cardiovascular Journals Evaluated for 2012.
PLoS ONE
PUBLISHED: 01-01-2014
In biomedical journals authors sometimes use the standard error of the mean (SEM) for data description, which has been called inappropriate or incorrect.
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Published: 06-30-2014
ABSTRACT
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues change dramatically over development3. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
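The abstract above does not include code; as a rough illustration of how source reconstruction with an individual (rather than standard adult) head model is typically set up, the sketch below assumes the open-source MNE-Python package and a FreeSurfer-processed structural MRI. The subject name, file names, and parameter values are hypothetical placeholders, not files or settings from this protocol.

```python
# Sketch: EEG source reconstruction using an individual-MRI head model.
# Assumes MNE-Python and a FreeSurfer-processed structural MRI; all paths,
# subject names, and parameter values are hypothetical placeholders.
import mne

subjects_dir = "/data/freesurfer"   # FreeSurfer SUBJECTS_DIR (hypothetical)
subject = "child01"                 # individual pediatric subject (hypothetical)

# Preprocessed, epoched high-density EEG and the averaged evoked response
epochs = mne.read_epochs("child01-epo.fif")
epochs.set_eeg_reference("average", projection=True)  # reference needed for inverse modelling
evoked = epochs.average()
noise_cov = mne.compute_covariance(epochs, tmax=0.0)

# Forward model from the individual anatomy: source space, 3-layer BEM, coregistration
src = mne.setup_source_space(subject, spacing="oct6", subjects_dir=subjects_dir)
bem_model = mne.make_bem_model(subject, ico=4, conductivity=(0.3, 0.006, 0.3),
                               subjects_dir=subjects_dir)
bem = mne.make_bem_solution(bem_model)
fwd = mne.make_forward_solution(evoked.info, trans="child01-trans.fif",
                                src=src, bem=bem, eeg=True, meg=False)

# Distributed inverse solution (minimum norm / dSPM) on the cortical surface
inv = mne.minimum_norm.make_inverse_operator(evoked.info, fwd, noise_cov)
stc = mne.minimum_norm.apply_inverse(evoked, inv, lambda2=1.0 / 9.0, method="dSPM")
print(stc.data.shape)  # (n_sources, n_times) cortical current estimates
```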
24 Related JoVE Articles!
Manual Isolation of Adipose-derived Stem Cells from Human Lipoaspirates
Authors: Min Zhu, Sepideh Heydarkhan-Hagvall, Marc Hedrick, Prosper Benhaim, Patricia Zuk.
Institutions: Cytori Therapeutics Inc, David Geffen School of Medicine at UCLA, David Geffen School of Medicine at UCLA, David Geffen School of Medicine at UCLA, David Geffen School of Medicine at UCLA.
In 2001, researchers at the University of California, Los Angeles, described the isolation of a new population of adult stem cells from liposuctioned adipose tissue that they initially termed Processed Lipoaspirate Cells or PLA cells. Since then, these stem cells have been renamed as Adipose-derived Stem Cells or ASCs and have gone on to become one of the most popular adult stem cell populations in the fields of stem cell research and regenerative medicine. Thousands of articles now describe the use of ASCs in a variety of regenerative animal models, including bone regeneration, peripheral nerve repair and cardiovascular engineering. Recent articles have begun to describe the myriad of uses for ASCs in the clinic. The protocol shown in this article outlines the basic procedure for manually and enzymatically isolating ASCs from large amounts of lipoaspirates obtained from cosmetic procedures. This protocol can easily be scaled up or down to accommodate the volume of lipoaspirate and can be adapted to isolate ASCs from fat tissue obtained through abdominoplasties and other similar procedures.
Cellular Biology, Issue 79, Adipose Tissue, Stem Cells, Humans, Cell Biology, biology (general), enzymatic digestion, collagenase, cell isolation, Stromal Vascular Fraction (SVF), Adipose-derived Stem Cells, ASCs, lipoaspirate, liposuction
50585
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
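As a back-of-the-envelope illustration of the ~10-30 nm localization precision quoted above, the sketch below evaluates the widely used Thompson-Larson-Webb approximation for the localization error of a single fitted molecule. The protocol itself may estimate precision differently; the input numbers here are hypothetical.

```python
import math

def localization_precision(s_nm, pixel_nm, photons, bg_noise):
    """Thompson-Larson-Webb estimate of 2D localization error (nm).

    s_nm     -- standard deviation of the point-spread function (nm)
    pixel_nm -- effective camera pixel size in the sample plane (nm)
    photons  -- number of photons collected from the molecule
    bg_noise -- background noise per pixel (photons, standard deviation)
    """
    var = (s_nm**2 + pixel_nm**2 / 12.0) / photons \
          + 8.0 * math.pi * s_nm**4 * bg_noise**2 / (pixel_nm**2 * photons**2)
    return math.sqrt(var)

# Hypothetical values: 130 nm PSF width, 100 nm pixels, 500 photons, background noise of 10
print(round(localization_precision(130.0, 100.0, 500, 10), 1), "nm")  # ~18 nm
```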
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
50680
Barnes Maze Testing Strategies with Small and Large Rodent Models
Authors: Cheryl S. Rosenfeld, Sherry A. Ferguson.
Institutions: University of Missouri, Food and Drug Administration.
Spatial learning and memory of laboratory rodents is often assessed via navigational ability in mazes, the most popular of which are the water and dry-land (Barnes) mazes. Improved performance over sessions or trials is thought to reflect learning and memory of the escape cage/platform location. Considered less stressful than water mazes, the Barnes maze is a relatively simple design of a circular platform top with several holes equally spaced around the perimeter edge. All but one of the holes are false-bottomed or blind-ending, while one leads to an escape cage. Mildly aversive stimuli (e.g. bright overhead lights) provide motivation to locate the escape cage. Latency to locate the escape cage can be measured during the session; however, additional endpoints typically require video recording. From those video recordings, use of automated tracking software can generate a variety of endpoints that are similar to those produced in water mazes (e.g. distance traveled, velocity/speed, time spent in the correct quadrant, time spent moving/resting, and confirmation of latency). Type of search strategy (i.e. random, serial, or direct) can be categorized as well. Barnes maze construction and testing methodologies can differ for small rodents, such as mice, and large rodents, such as rats. For example, while extra-maze cues are effective for rats, smaller wild rodents may require intra-maze cues with a visual barrier around the maze. Appropriate stimuli must be identified which motivate the rodent to locate the escape cage. Both Barnes and water mazes can be time-consuming, as 4-7 test trials are typically required to detect improved learning and memory performance (e.g. shorter latencies or path lengths to locate the escape platform or cage) and/or differences between experimental groups. Even so, the Barnes maze is a widely employed behavioral assessment measuring spatial navigational abilities and their potential disruption by genetic or neurobehavioral manipulations, or by drug/toxicant exposure.
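As a minimal sketch of how the tracking-derived endpoints mentioned above (latency, distance traveled, velocity, time moving) can be computed from exported coordinates, the example below uses NumPy on a fabricated trajectory; the coordinate data, frame rate, and 2 cm/s movement threshold are hypothetical, not values prescribed by the protocol.

```python
import numpy as np

def maze_endpoints(x, y, fps, escape_frame):
    """Basic Barnes-maze endpoints from tracked coordinates (hypothetical data).

    x, y         -- arrays of animal-center coordinates (cm) per video frame
    fps          -- video frame rate (frames per second)
    escape_frame -- frame index at which the animal enters the escape cage
    """
    steps = np.hypot(np.diff(x), np.diff(y))        # per-frame displacement (cm)
    distance = steps.sum()                          # total path length (cm)
    latency = escape_frame / fps                    # seconds to locate the escape cage
    velocity = steps * fps                          # instantaneous speed (cm/s)
    moving_fraction = np.mean(velocity > 2.0)       # fraction of time moving (assumed threshold)
    return {"latency_s": latency,
            "distance_cm": distance,
            "mean_speed_cm_s": velocity.mean(),
            "fraction_moving": moving_fraction}

# Hypothetical 10 s track sampled at 30 frames per second
t = np.linspace(0, 10, 300)
print(maze_endpoints(50 * np.cos(t), 50 * np.sin(t), fps=30, escape_frame=299))
```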
Behavior, Issue 84, spatial navigation, rats, Peromyscus, mice, intra- and extra-maze cues, learning, memory, latency, search strategy, escape motivation
51194
Study of Phagolysosome Biogenesis in Live Macrophages
Authors: Marc Bronietzki, Bahram Kasmapour, Maximiliano Gabriel Gutierrez.
Institutions: Helmholtz Centre for Infection Research, National Institute for Medical Research.
Phagocytic cells play a major role in the innate immune system by removing and eliminating invading microorganisms in their phagosomes. Phagosome maturation is the complex and tightly regulated process during which a nascent phagosome undergoes drastic transformation through well-orchestrated interactions with various cellular organelles and compartments in the cytoplasm. This process, which is essential for the physiological function of phagocytic cells by endowing phagosomes with their lytic and bactericidal properties, culminates in fusion of phagosomes with lysosomes and biogenesis of phagolysosomes, which is considered to be the last and critical stage of maturation for phagosomes. In this report, we describe a live-cell imaging-based method for qualitative and quantitative analysis of the dynamic process of lysosome-to-phagosome content delivery, which is a hallmark of phagolysosome biogenesis. This approach uses IgG-coated microbeads as a model for phagocytosis and fluorophore-conjugated dextran molecules as a luminal lysosomal cargo probe, in order to follow the dynamic delivery of lysosomal content to the phagosomes in real time in live macrophages using time-lapse imaging and confocal laser scanning microscopy. Here we describe in detail the background, the preparation steps and the step-by-step experimental setup to enable easy and precise deployment of this method in other labs. Our described method is simple, robust, and most importantly, can be easily adapted to study phagosomal interactions and maturation in different systems and under various experimental settings, such as the use of various phagocytic cell types, loss-of-function experiments, different probes, and phagocytic particles.
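The quantitative readout described above boils down to measuring dextran fluorescence inside a phagosome region of interest across the time-lapse. The sketch below, on fabricated image data, shows one generic way to extract such a trace; the stack dimensions, ROI, and intensity values are hypothetical and do not come from this protocol.

```python
import numpy as np

def phagosome_intensity(stack, mask):
    """Mean lysosomal-cargo (dextran) fluorescence inside a phagosome ROI over time.

    stack -- 3D array (frames, height, width) of the cargo channel
    mask  -- 2D boolean array marking the phagosome ROI (True inside)
    """
    return np.array([frame[mask].mean() for frame in stack])

# Hypothetical 60-frame time-lapse with a circular ROI around one bead phagosome
stack = np.random.poisson(100, size=(60, 128, 128)).astype(float)
yy, xx = np.mgrid[:128, :128]
mask = (yy - 64) ** 2 + (xx - 64) ** 2 < 10 ** 2
trace = phagosome_intensity(stack, mask)
print(trace[:5])  # mean ROI intensity for the first five time points
```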
Immunology, Issue 85, Lysosome, Phagosome, phagolysosome, live-cell imaging, phagocytes, macrophages
51201
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
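To make the design-of-experiments workflow above concrete, the sketch below generates a coded three-factor full factorial design and fits a quadratic response-surface model by least squares using NumPy. The factors, response values, and effect sizes are fabricated for illustration; the published study used software-guided optimal designs rather than this simple full factorial.

```python
import itertools
import numpy as np

# Three hypothetical factors, coded to -1 / 0 / +1 levels
levels = [-1, 0, 1]
design = np.array(list(itertools.product(levels, repeat=3)))  # 27-run full factorial

# Fabricated measured response (e.g., fluorescent reporter signal) for each run
rng = np.random.default_rng(0)
y = 10 + 2 * design[:, 0] - 1.5 * design[:, 1] + 0.5 * design[:, 0] * design[:, 2] \
    + rng.normal(0, 0.3, len(design))

# Quadratic response-surface model: intercept, main effects, interactions, squared terms
x1, x2, x3 = design.T
X = np.column_stack([np.ones(len(design)), x1, x2, x3,
                     x1 * x2, x1 * x3, x2 * x3,
                     x1 ** 2, x2 ** 2, x3 ** 2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(coef, 2))  # fitted effect estimates recover the fabricated model
```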
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
51216
Recording Single Neurons' Action Potentials from Freely Moving Pigeons Across Three Stages of Learning
Authors: Sarah Starosta, Maik C. Stüttgen, Onur Güntürkün.
Institutions: Ruhr-University Bochum.
While the subject of learning has attracted immense interest from both behavioral and neural scientists, only relatively few investigators have observed single-neuron activity while animals are acquiring an operantly conditioned response, or when that response is extinguished. But even in these cases, observation periods usually encompass only a single stage of learning, i.e. acquisition or extinction, but not both (exceptions include protocols employing reversal learning; see Bingman et al.1 for an example). However, acquisition and extinction entail different learning mechanisms and are therefore expected to be accompanied by different types and/or loci of neural plasticity. Accordingly, we developed a behavioral paradigm which institutes three stages of learning in a single behavioral session and which is well suited for the simultaneous recording of single neurons' action potentials. Animals are trained on a single-interval forced choice task which requires mapping each of two possible choice responses to the presentation of different novel visual stimuli (acquisition). After having reached a predefined performance criterion, one of the two choice responses is no longer reinforced (extinction). Following a certain decrement in performance level, correct responses are reinforced again (reacquisition). By using a new set of stimuli in every session, animals can undergo the acquisition-extinction-reacquisition process repeatedly. Because all three stages of learning occur in a single behavioral session, the paradigm is ideal for the simultaneous observation of the spiking output of multiple single neurons. We use pigeons as model systems, but the task can easily be adapted to any other species capable of conditioned discrimination learning.
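The core of the behavioral paradigm above is a criterion-based switch between acquisition, extinction, and reacquisition within one session. The sketch below shows one possible way to code such a stage transition over a rolling accuracy window; the window size and criterion values are hypothetical and are not the criteria used in the published paradigm.

```python
from collections import deque

def next_stage(stage, recent_correct, acq_criterion=0.8, ext_criterion=0.4):
    """Advance the within-session learning stage from a rolling accuracy window.

    stage          -- 'acquisition', 'extinction', or 'reacquisition'
    recent_correct -- deque of 0/1 outcomes for the last N trials
    Criterion values are hypothetical, not those of the published paradigm.
    """
    if len(recent_correct) < recent_correct.maxlen:
        return stage                  # wait until the rolling window is full
    accuracy = sum(recent_correct) / len(recent_correct)
    if stage == "acquisition" and accuracy >= acq_criterion:
        return "extinction"           # stop reinforcing one choice response
    if stage == "extinction" and accuracy <= ext_criterion:
        return "reacquisition"        # reinforce correct responses again
    return stage

# Fabricated session: track the last 20 trial outcomes
window = deque(maxlen=20)
stage = "acquisition"
for outcome in [1] * 20 + [0] * 15 + [1] * 5:
    window.append(outcome)
    stage = next_stage(stage, window)
print(stage)  # ends in 'reacquisition' for this fabricated outcome sequence
```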
Neuroscience, Issue 88, pigeon, single unit recording, learning, memory, extinction, spike sorting, operant conditioning, reward, electrophysiology, animal cognition, model species
51283
Preparation of DNA-crosslinked Polyacrylamide Hydrogels
Authors: Michelle L. Previtera, Noshir A. Langrana.
Institutions: JFK Medical Center, Rutgers University, Rutgers University.
Mechanobiology is an emerging scientific area that addresses the critical role of physical cues in directing cell morphology and function. For example, the effect of tissue elasticity on cell function is a major area of mechanobiology research because tissue stiffness modulates with disease, development, and injury. Static tissue-mimicking materials, or materials that cannot alter stiffness once cells are plated, are predominately used to investigate the effects of tissue stiffness on cell functions. While information gathered from static studies is valuable, these studies are not indicative of the dynamic nature of the cellular microenvironment in vivo. To better address the effects of dynamic stiffness on cell function, we developed a DNA-crosslinked polyacrylamide hydrogel system (DNA gels). Unlike other dynamic substrates, DNA gels have the ability to decrease or increase in stiffness after fabrication without stimuli. DNA gels consist of DNA crosslinks that are polymerized into a polyacrylamide backbone. Adding and removing crosslinks via delivery of single-stranded DNA allows temporal, spatial, and reversible control of gel elasticity. We have shown in previous reports that dynamic modulation of DNA gel elasticity influences fibroblast and neuron behavior. In this report and video, we provide a schematic that describes the DNA gel crosslinking mechanisms and step-by-step instructions on the preparation of DNA gels.
Bioengineering, Issue 90, bioengineering (general), Elastic, viscoelastic, bis-acrylamide, substrate, stiffness, dynamic, static, neuron, fibroblast, compliance, ECM, mechanobiology, tunable
51323
The 5-Choice Serial Reaction Time Task: A Task of Attention and Impulse Control for Rodents
Authors: Samuel K. Asinof, Tracie A. Paine.
Institutions: Oberlin College.
This protocol describes the 5-choice serial reaction time task, which is an operant based task used to study attention and impulse control in rodents. Test day challenges, modifications to the standard task, can be used to systematically tax the neural systems controlling either attention or impulse control. Importantly, these challenges have consistent effects on behavior across laboratories in intact animals and can reveal either enhancements or deficits in cognitive function that are not apparent when rats are only tested on the standard task. The variety of behavioral measures that are collected can be used to determine if other factors (i.e., sedation, motivation deficits, locomotor impairments) are contributing to changes in performance. The versatility of the 5CSRTT is further enhanced because it is amenable to combination with pharmacological, molecular, and genetic techniques.
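The behavioral measures mentioned above are simple counts and ratios over trial outcomes. The sketch below computes a few commonly reported 5-CSRTT measures from a fabricated trial list; the category names and formulas follow common usage and exact definitions vary between laboratories.

```python
def csrtt_measures(trials):
    """Summarize common 5-CSRTT performance measures from a list of trial outcomes.

    Each trial is one of: 'correct', 'incorrect', 'omission', 'premature'.
    Definitions follow common usage; exact formulas vary between laboratories.
    """
    n = {k: trials.count(k) for k in ("correct", "incorrect", "omission", "premature")}
    responded = n["correct"] + n["incorrect"]
    completed = responded + n["omission"]
    return {
        "accuracy_pct": 100.0 * n["correct"] / responded if responded else float("nan"),
        "omission_pct": 100.0 * n["omission"] / completed if completed else float("nan"),
        "premature_responses": n["premature"],
    }

# Fabricated session of 100 trials
session = ["correct"] * 70 + ["incorrect"] * 10 + ["omission"] * 12 + ["premature"] * 8
print(csrtt_measures(session))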
Neuroscience, Issue 90, attention, impulse control, neuroscience, cognition, rodent
51574
Analysis of Nephron Composition and Function in the Adult Zebrafish Kidney
Authors: Kristen K. McCampbell, Kristin N. Springer, Rebecca A. Wingert.
Institutions: University of Notre Dame.
The zebrafish model has emerged as a relevant system to study kidney development, regeneration and disease. Both the embryonic and adult zebrafish kidneys are composed of functional units known as nephrons, which are highly conserved with other vertebrates, including mammals. Research in zebrafish has recently demonstrated that two distinctive phenomena transpire after adult nephrons incur damage: first, there is robust regeneration within existing nephrons that replaces the destroyed tubule epithelial cells; second, entirely new nephrons are produced from renal progenitors in a process known as neonephrogenesis. In contrast, humans and other mammals seem to have only a limited ability for nephron epithelial regeneration. To date, the mechanisms responsible for these kidney regeneration phenomena remain poorly understood. Since adult zebrafish kidneys undergo both nephron epithelial regeneration and neonephrogenesis, they provide an outstanding experimental paradigm to study these events. Further, there is a wide range of genetic and pharmacological tools available in the zebrafish model that can be used to delineate the cellular and molecular mechanisms that regulate renal regeneration. One essential aspect of such research is the evaluation of nephron structure and function. This protocol describes a set of labeling techniques that can be used to gauge renal composition and test nephron functionality in the adult zebrafish kidney. Thus, these methods are widely applicable to the future phenotypic characterization of adult zebrafish kidney injury paradigms, which include but are not limited to, nephrotoxicant exposure regimes or genetic methods of targeted cell death such as the nitroreductase mediated cell ablation technique. Further, these methods could be used to study genetic perturbations in adult kidney formation and could also be applied to assess renal status during chronic disease modeling.
Cellular Biology, Issue 90, zebrafish; kidney; nephron; nephrology; renal; regeneration; proximal tubule; distal tubule; segment; mesonephros; physiology; acute kidney injury (AKI)
51644
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
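Of the four segmentation categories described above, the automated approaches often start from simple intensity thresholding followed by connected-component analysis. The sketch below shows that generic starting point on a fabricated 3D volume, assuming the scikit-image library; it is not the custom-designed algorithm used in the article, and all sizes and intensities are hypothetical.

```python
import numpy as np
from skimage import filters, measure

# Fabricated 3D volume (z, y, x); real EM data would be loaded from disk
volume = np.random.normal(100, 10, size=(64, 256, 256))
volume[20:40, 100:150, 100:150] += 60          # fabricated bright feature of interest

# Global Otsu threshold followed by connected-component labeling
threshold = filters.threshold_otsu(volume)
binary = volume > threshold
labels = measure.label(binary)

# Keep only reasonably large components and report their volumes (in voxels)
for region in measure.regionprops(labels):
    if region.area > 1000:
        print(region.label, region.area)
```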
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
51673
Characterization of Surface Modifications by White Light Interferometry: Applications in Ion Sputtering, Laser Ablation, and Tribology Experiments
Authors: Sergey V. Baryshev, Robert A. Erck, Jerry F. Moore, Alexander V. Zinovev, C. Emil Tripa, Igor V. Veryovkin.
Institutions: Argonne National Laboratory, Argonne National Laboratory, MassThink LLC.
In materials science and engineering it is often necessary to obtain quantitative measurements of surface topography with micrometer lateral resolution. From the measured surface, 3D topographic maps can be subsequently analyzed using a variety of software packages to extract the information that is needed. In this article we describe how white light interferometry, and optical profilometry (OP) in general, combined with generic surface analysis software, can be used for materials science and engineering tasks. We demonstrate a number of applications of white light interferometry for investigation of surface modifications in mass spectrometry, and wear phenomena in tribology and lubrication. We characterize the products of the interaction of semiconductors and metals with energetic ions (sputtering), and laser irradiation (ablation), as well as ex situ measurements of wear of tribological test specimens. Specifically, we will discuss: ● Aspects of traditional ion sputtering-based mass spectrometry such as sputtering rate/yield measurements on Si and Cu and subsequent time-to-depth conversion. ● Results of quantitative characterization of the interaction of femtosecond laser irradiation with a semiconductor surface. These results are important for applications such as ablation mass spectrometry, where the quantities of evaporated material can be studied and controlled via pulse duration and energy per pulse. Thus, by determining the crater geometry one can define depth and lateral resolution versus experimental setup conditions. ● Measurements of surface roughness parameters in two dimensions, and quantitative measurements of the surface wear that occurs as a result of friction and wear tests. Some inherent drawbacks, possible artifacts, and uncertainty assessments of the white light interferometry approach will be discussed and explained.
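Two of the calculations mentioned above, surface roughness parameters and sputtering time-to-depth conversion, reduce to simple arithmetic on the measured height data. The sketch below illustrates both on fabricated numbers (the profile, crater depth, and sputter time are hypothetical, not values from the article).

```python
import numpy as np

def roughness(heights):
    """Arithmetic (Ra) and root-mean-square (Rq) roughness of a height profile (nm)."""
    dev = heights - heights.mean()
    return np.abs(dev).mean(), np.sqrt((dev ** 2).mean())

def sputter_rate(crater_depth_nm, sputter_time_s):
    """Average sputtering rate used for time-to-depth conversion (nm/s)."""
    return crater_depth_nm / sputter_time_s

# Fabricated interferometry line profile and crater measurement
profile = np.random.normal(0.0, 2.5, 2000)          # heights in nm
ra, rq = roughness(profile)
rate = sputter_rate(crater_depth_nm=850.0, sputter_time_s=1200.0)
print(f"Ra = {ra:.2f} nm, Rq = {rq:.2f} nm, sputter rate = {rate:.3f} nm/s")
# depth scale for a profiling experiment: depth(t) = rate * t
```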
Materials Science, Issue 72, Physics, Ion Beams (nuclear interactions), Light Reflection, Optical Properties, Semiconductor Materials, White Light Interferometry, Ion Sputtering, Laser Ablation, Femtosecond Lasers, Depth Profiling, Time-of-flight Mass Spectrometry, Tribology, Wear Analysis, Optical Profilometry, wear, friction, atomic force microscopy, AFM, scanning electron microscopy, SEM, imaging, visualization
50260
Telomere Length and Telomerase Activity; A Yin and Yang of Cell Senescence
Authors: Mary Derasmo Axelrad, Temuri Budagov, Gil Atzmon.
Institutions: Albert Einstein College of Medicine , Albert Einstein College of Medicine , Albert Einstein College of Medicine .
Telomeres are repeating DNA sequences at the ends of the chromosomes that are diverse in length and in humans can reach a length of 15,000 base pairs. The telomere serves as a bioprotective mechanism against chromosome attrition at each cell division. At a certain length, telomeres become too short to allow replication, a process that may lead to chromosome instability or cell death. Telomere length is regulated by two opposing mechanisms: attrition and elongation. Attrition occurs as each cell divides. In contrast, elongation is partially modulated by the enzyme telomerase, which adds repeating sequences to the ends of the chromosomes. In this way, telomerase could possibly reverse an aging mechanism and rejuvenate cell viability. These are crucial elements in maintaining cell life and are used to assess cellular aging. In this manuscript we will describe an accurate, short, sophisticated and cheap method to assess telomere length in multiple tissues and species. This method takes advantage of two key elements, the tandem repeat of the telomere sequence and the sensitivity of the qRT-PCR to detect differential copy numbers of tested samples. In addition, we will describe a simple assay to assess telomerase activity as a complementary backbone test for telomere length.
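Relative telomere length from the qRT-PCR approach described above is commonly expressed as a T/S ratio computed with the comparative 2^-ddCt method. The sketch below shows that arithmetic; the Ct values are hypothetical and the exact normalization used in the protocol may differ.

```python
def relative_telomere_length(ct_telo, ct_scg, ref_ct_telo, ref_ct_scg):
    """Relative telomere length (T/S ratio) from qPCR Ct values via 2^-ddCt.

    ct_telo / ct_scg         -- sample Ct values for the telomere and single-copy-gene assays
    ref_ct_telo / ref_ct_scg -- Ct values of a reference (calibrator) sample
    """
    d_ct_sample = ct_telo - ct_scg
    d_ct_ref = ref_ct_telo - ref_ct_scg
    return 2.0 ** -(d_ct_sample - d_ct_ref)

# Hypothetical Ct values: test sample vs. calibrator DNA
print(round(relative_telomere_length(14.2, 18.9, 15.0, 18.8), 2))  # ~1.87, i.e. longer telomeres
```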
Genetics, Issue 75, Molecular Biology, Cellular Biology, Medicine, Biomedical Engineering, Genomics, Telomere length, telomerase activity, telomerase, telomeres, telomere, DNA, PCR, polymerase chain reaction, qRT-PCR, sequencing, aging, telomerase assay
50246
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Authors: Todd C. Lorenz.
Institutions: University of California, Los Angeles .
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq DNA polymerase. PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that complicate the reaction, producing spurious results. When PCR fails, it can lead to many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction. By following this PCR guide, students should be able to: ● Set up reactions and thermal cycling conditions for a conventional PCR experiment ● Understand the function of various reaction components and their overall effect on a PCR experiment ● Design and optimize a PCR experiment for any DNA template ● Troubleshoot failed PCR experiments
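Primer design and melting temperature (Tm) are central to the optimization strategies listed above. As a small worked example, the sketch below computes GC content and a rough Tm using the Wallace rule, 2(A+T) + 4(G+C); the primer sequence is a hypothetical example, and nearest-neighbor methods are preferred for final designs.

```python
def gc_content(seq):
    """Fraction of G and C bases in a primer sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def wallace_tm(seq):
    """Rough melting temperature by the Wallace rule: 2(A+T) + 4(G+C), in degrees C.

    Useful only as a first estimate for short (~14-20 nt) primers.
    """
    seq = seq.upper()
    return 2 * (seq.count("A") + seq.count("T")) + 4 * (seq.count("G") + seq.count("C"))

primer = "ACGTGACCTAGGCATTGC"  # hypothetical 18-nt primer
print(f"GC = {gc_content(primer):.0%}, Tm (Wallace) = {wallace_tm(primer)} C")
```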
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
3998
Correlating Behavioral Responses to fMRI Signals from Human Prefrontal Cortex: Examining Cognitive Processes Using Task Analysis
Authors: Joseph F.X. DeSouza, Shima Ovaysikia, Laura K. Pynn.
Institutions: Centre for Vision Research, York University, Centre for Vision Research, York University.
The aim of this methods paper is to describe how to implement a neuroimaging technique to examine complementary brain processes engaged by two similar tasks. Participants' behavior during task performance in an fMRI scanner can then be correlated to the brain activity using the blood-oxygen-level-dependent (BOLD) signal. We measure behavior in order to sort correct trials, where the subject performed the task correctly, and then examine the brain signals related to correct performance. Conversely, if trials in which subjects did not perform the task correctly are included in the same analysis as the correct trials, we would introduce trials that do not reflect correct performance. Thus, in many cases these error trials can themselves be correlated with brain activity. We describe two complementary tasks that are used in our lab to examine the brain during suppression of an automatic response: the Stroop1 and anti-saccade tasks. The emotional Stroop paradigm instructs participants to report either the superimposed emotional 'word' across the affective faces or the facial 'expressions' of the face stimuli1,2. When the word and the facial expression refer to different emotions, a conflict arises between what must be said and what is automatically read. The participant has to resolve the conflict between the two simultaneously competing processes of word reading and facial expression recognition. Our urge to read out a word leads to strong 'stimulus-response (SR)' associations; hence inhibiting these strong SRs is difficult and participants are prone to making errors. Overcoming this conflict and directing attention away from the face or the word requires the subject to inhibit bottom-up processes, which typically direct attention to the more salient stimulus. Similarly, in the anti-saccade task3,4,5,6, an instruction cue directs attention to a peripheral stimulus location, but the eye movement must then be made to the mirror-opposite position. Yet again we measure behavior by recording the eye movements of participants, which allows for the sorting of the behavioral responses into correct and error trials7, which can then be correlated to brain activity. Neuroimaging now allows researchers to measure the different behaviors of correct and error trials that are indicative of different cognitive processes and to pinpoint the different neural networks involved.
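The trial-sorting step described above, splitting onsets into correct and error events so that separate regressors can be built, is straightforward to express in code. The sketch below uses fabricated behavioral data and generic response labels; it does not reproduce the lab's actual analysis pipeline.

```python
import numpy as np

def sort_trials(onsets, responses, correct_answers):
    """Split fMRI trial onsets into correct and error events for separate regressors.

    onsets          -- trial onset times in seconds (array)
    responses       -- the subject's recorded responses (e.g., key presses or saccade directions)
    correct_answers -- the response required on each trial
    """
    correct = np.array([r == c for r, c in zip(responses, correct_answers)])
    return onsets[correct], onsets[~correct]

# Fabricated behavioral log for 8 trials
onsets = np.array([0.0, 12.0, 24.0, 36.0, 48.0, 60.0, 72.0, 84.0])
responses = ["left", "right", "left", "left", "right", "left", "right", "right"]
answers   = ["left", "right", "right", "left", "right", "right", "right", "left"]
correct_onsets, error_onsets = sort_trials(onsets, responses, answers)
print("correct:", correct_onsets)
print("error:  ", error_onsets)
```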
Neuroscience, Issue 64, fMRI, eyetracking, BOLD, attention, inhibition, Magnetic Resonance Imaging, MRI
3237
Aseptic Laboratory Techniques: Plating Methods
Authors: Erin R. Sanders.
Institutions: University of California, Los Angeles .
Microorganisms are present on all inanimate surfaces creating ubiquitous sources of possible contamination in the laboratory. Experimental success relies on the ability of a scientist to sterilize work surfaces and equipment as well as prevent contact of sterile instruments and solutions with non-sterile surfaces. Here we present the steps for several plating methods routinely used in the laboratory to isolate, propagate, or enumerate microorganisms such as bacteria and phage. All five methods incorporate aseptic technique, or procedures that maintain the sterility of experimental materials. Procedures described include (1) streak-plating bacterial cultures to isolate single colonies, (2) pour-plating and (3) spread-plating to enumerate viable bacterial colonies, (4) soft agar overlays to isolate phage and enumerate plaques, and (5) replica-plating to transfer cells from one plate to another in an identical spatial pattern. These procedures can be performed at the laboratory bench, provided they involve non-pathogenic strains of microorganisms (Biosafety Level 1, BSL-1). If working with BSL-2 organisms, then these manipulations must take place in a biosafety cabinet. Consult the most current edition of the Biosafety in Microbiological and Biomedical Laboratories (BMBL) as well as Material Safety Data Sheets (MSDS) for Infectious Substances to determine the biohazard classification as well as the safety precautions and containment facilities required for the microorganism in question. Bacterial strains and phage stocks can be obtained from research investigators, companies, and collections maintained by particular organizations such as the American Type Culture Collection (ATCC). It is recommended that non-pathogenic strains be used when learning the various plating methods. By following the procedures described in this protocol, students should be able to: ● Perform plating procedures without contaminating media. ● Isolate single bacterial colonies by the streak-plating method. ● Use pour-plating and spread-plating methods to determine the concentration of bacteria. ● Perform soft agar overlays when working with phage. ● Transfer bacterial cells from one plate to another using the replica-plating procedure. ● Given an experimental task, select the appropriate plating method.
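The pour-plating and spread-plating methods listed above yield a viable count once colonies are counted on a countable plate. The sketch below shows the standard CFU/mL arithmetic; the colony count, dilution, and plated volume are hypothetical example numbers.

```python
def cfu_per_ml(colonies, dilution_factor, volume_plated_ml):
    """Viable count from a spread or pour plate.

    CFU/mL = colonies / (dilution factor x volume plated), where the dilution
    factor is the final dilution of the plated sample (e.g., 1e-6).
    """
    return colonies / (dilution_factor * volume_plated_ml)

# Hypothetical plate: 172 colonies on a 10^-6 dilution, 0.1 mL spread-plated
print(f"{cfu_per_ml(172, 1e-6, 0.1):.2e} CFU/mL")   # 1.72e+09 CFU/mL
```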
Basic Protocols, Issue 63, Streak plates, pour plates, soft agar overlays, spread plates, replica plates, bacteria, colonies, phage, plaques, dilutions
3064
The Importance of Correct Protein Concentration for Kinetics and Affinity Determination in Structure-function Analysis
Authors: Ewa Pol.
Institutions: GE Healthcare Bio-Sciences AB.
In this study, we explore the interaction between the bovine cysteine protease inhibitor cystatin B and a catalytically inactive form of papain (Fig. 1), a plant cysteine protease, by real-time label-free analysis using Biacore X100. Several cystatin B variants with point mutations in areas of interaction with papain are produced. For each cystatin B variant we determine its specific binding concentration using calibration-free concentration analysis (CFCA) and compare the values obtained with the total protein concentration as determined by A280. After that, the kinetics of each cystatin B variant binding to papain is measured using single-cycle kinetics (SCK). We show that one of the four cystatin B variants we examine is only partially active for binding. This partial activity, revealed by CFCA, translates to a significant difference in the association rate constant (ka) and affinity (KD) compared to the values calculated using the total protein concentration. Using CFCA in combination with kinetic analysis in a structure-function study contributes to obtaining reliable results and helps to reach the correct interpretation of the interaction mechanism.
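The dependence of ka and KD on the active analyte concentration, which is the point of the CFCA comparison above, can be seen from the standard 1:1 Langmuir binding model, where KD = kd/ka and the association-phase response depends on the concentration actually able to bind. The sketch below simulates that model with hypothetical rate constants and concentrations; it is a generic illustration, not the Biacore evaluation software's fitting routine.

```python
import numpy as np

def langmuir_association(t, conc, ka, kd, rmax):
    """SPR response during the association phase of a 1:1 binding model.

    R(t) = Rmax * C / (C + KD) * (1 - exp(-(ka*C + kd)*t)), with KD = kd/ka.
    """
    kobs = ka * conc + kd
    req = rmax * conc / (conc + kd / ka)
    return req * (1.0 - np.exp(-kobs * t))

# Hypothetical rate constants; note how an overestimated active concentration
# (total A280 value instead of the CFCA value) would bias the fitted ka and KD
ka, kd = 2.0e5, 1.0e-3            # 1/(M*s), 1/s
print(f"KD = {kd / ka:.1e} M")    # 5.0e-09 M
t = np.linspace(0, 300, 301)
resp = langmuir_association(t, conc=20e-9, ka=ka, kd=kd, rmax=100.0)
print(round(resp[-1], 1), "RU at the end of a 300 s injection")
```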
Cellular Biology, Issue 37, Protein interaction, Surface Plasmon Resonance, Biacore X100, CFCA, Cystatin B, Papain
1746
Optimization of the Ugi Reaction Using Parallel Synthesis and Automated Liquid Handling
Authors: Jean-Claude Bradley, Khalid Baig Mirza, Tom Osborne, Antony Williams, Kevin Owens.
Institutions: Drexel University, Mettler-Toledo, Chemspider.
The optimization of a Ugi reaction involving the mixing of furfurylamine, benzaldehyde, boc-glycine and t-butylisocyanide is described. Triplicate runs of 48 parallel experiments are reported, varying concentration, solvent and the excess of some of the reagents. The isolation of the product was achieved by a simple filtration and wash procedure. The highest yield obtained was 66% from 0.4 M methanol with 1.2 eq. of imine. This is significantly above the 49% yield obtained from the initial reaction under equimolar concentration at 0.4 M in methanol. Methanol solutions with reagent concentrations of 0.4M or 0.2M gave superior yields while all solvent systems at 0.07M performed poorly. At 0.2M, methanol and ethanol/methanol (60/40) mixtures were statistically equally good while THF/methanol (60/40) was poor and acetonitrile/methanol (60/40) was intermediate. Good reproducibility of the precipitate yields was obtained in these replicate experiments, allowing for subtle interaction effects to be positively identified.
Chemistry, Issue 21, Ugi Reaction, Automated Liquid Handling, Combinatorial Chemistry, organic chemistry, Mini-block, Open Notebook Science, reaction optimization, UsefulChem, MiniBlock, precipitate
942
Applying Microfluidics to Electrophysiology
Authors: David T. Eddington.
Institutions: University of Illinois, Chicago.
Microfluidics can be integrated with standard electrophysiology techniques to allow new experimental modalities. Specifically, the motivation for the microfluidic brain slice device is discussed, including how the device docks to standard perfusion chambers and the technique of passive pumping, which is used to deliver boluses of neuromodulators to the brain slice. By simplifying the device design, we are able to achieve a practical solution to the current unmet electrophysiology need of applying multiple neuromodulators across multiple regions of the brain slice. This is achieved by substituting the standard coverglass substrate of the perfusion chamber with a thin microfluidic device bonded to the coverglass substrate. This device is then attached to the perfusion chamber, and small holes connect the open well of the perfusion chamber to the microfluidic channels buried within the microfluidic substrate. These microfluidic channels are interfaced with ports drilled into the edge of the perfusion chamber to access and deliver stimulants. This project represents how the field of microfluidics is transitioning away from proof-of-concept device demonstrations and into practical solutions for unmet experimental and clinical needs.
Neuroscience, Issue 8, Biomedical Engineering, Microfluidics, Slice Recording, Electrophysiology, Neurotransmitter, Bioengineering
301
Use of Arabidopsis eceriferum Mutants to Explore Plant Cuticle Biosynthesis
Authors: Lacey Samuels, Allan DeBono, Patricia Lam, Miao Wen, Reinhard Jetter, Ljerka Kunst.
Institutions: University of British Columbia - UBC, University of British Columbia - UBC.
The plant cuticle is a waxy outer covering on plants that has a primary role in water conservation, but is also an important barrier against the entry of pathogenic microorganisms. The cuticle is made up of a tough crosslinked polymer called "cutin" and a protective wax layer that seals the plant surface. The waxy layer of the cuticle is obvious on many plants, appearing as a shiny film on the ivy leaf or as a dusty outer covering on the surface of a grape or a cabbage leaf thanks to light scattering crystals present in the wax. Because the cuticle is an essential adaptation of plants to a terrestrial environment, understanding the genes involved in plant cuticle formation has applications in both agriculture and forestry. Today, we'll show the analysis of plant cuticle mutants identified by forward and reverse genetics approaches.
Plant Biology, Issue 16, Annual Review, Cuticle, Arabidopsis, Eceriferum Mutants, Cryo-SEM, Gas Chromatography
709
Preparation of Artificial Bilayers for Electrophysiology Experiments
Authors: Ruchi Kapoor, Jung H. Kim, Helgi Ingolfson, Olaf Sparre Andersen.
Institutions: Weill Cornell Medical College of Cornell University.
Planar lipid bilayers, also called artificial lipid bilayers, allow you to study ion-conducting channels in a well-defined environment. These bilayers can be used for many different studies, such as the characterization of membrane-active peptides, the reconstitution of ion channels or investigations on how changes in lipid bilayer properties alter the function of bilayer-spanning channels. Here, we show how to form a planar bilayer and how to isolate small patches from the bilayer, and in a second video we will also demonstrate a procedure for using gramicidin channels to determine changes in lipid bilayer elastic properties. We also demonstrate the individual steps needed to prepare the bilayer chamber and the electrodes, and how to test that the bilayer is suitable for single-channel measurements.
Cellular Biology, Issue 20, Springer Protocols, Artificial Bilayers, Bilayer Patch Experiments, Lipid Bilayers, Bilayer Punch Electrodes, Electrophysiology
1033
Zebrafish Brain Ventricle Injection
Authors: Jennifer H. Gutzman, Hazel Sive.
Institutions: Whitehead Institute for Biochemical Research, MIT - Massachusetts Institute of Technology.
Proper brain ventricle formation during embryonic brain development is required for normal brain function. Brain ventricles are the highly conserved cavities within the brain that are filled with cerebrospinal fluid. In zebrafish, after neural tube formation, the neuroepithelium undergoes a series of constrictions and folds while it fills with fluid resulting in brain ventricle formation. In order to understand the process of ventricle formation, and the neuroepithelial shape changes that occur at the same time, we needed a way to visualize the ventricle space in comparison to the brain tissue. However, the nature of transparent zebrafish embryos makes it difficult to differentiate the tissue from the ventricle space. Therefore, we developed a brain ventricle injection technique where the ventricle space is filled with a fluorescent dye and imaged by brightfield and fluorescent microscopy. The brightfield and the fluorescent images are then processed and superimposed in Photoshop. This technique allows for visualization of the ventricle space with the fluorescent dye, in comparison to the shape of the neuroepithelium in the brightfield image. Brain ventricle injection in zebrafish can be employed from 18 hours post fertilization through early larval stages. We have used this technique extensively in our studies of brain ventricle formation and morphogenesis as well as in characterizing brain morphogenesis mutants (1-3).
Neuroscience, Issue 26, brain, ventricle, zebrafish, morphology, microinjection, development, imaging
1218
Improving IV Insulin Administration in a Community Hospital
Authors: Michael C. Magee.
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes.1-4 The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5 It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance. The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6 Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia. Despite multiple revisions of an IV insulin paper protocol, analysis of data from use of the paper protocol at WMC showed that, in terms of achieving normoglycemia while minimizing hypoglycemia, results were suboptimal. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from using a paper IV insulin protocol to a computerized glucose management system. By comparing blood glucose levels using the paper protocol to those of the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was witnessed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computerized glucose management system was well under 1%.
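The outcome metrics quoted above (percent of readings in the severe hypoglycemia, clinical hypoglycemia, hyperglycemia, and target categories) are simple proportions over the recorded readings. The sketch below computes them for fabricated readings; the 70-180 mg/dL target range is an assumption for illustration, since the abstract does not state the exact target used at WMC.

```python
def glucose_summary(readings, target=(70, 180)):
    """Classify blood glucose readings (mg/dL) into the categories used above.

    Severe hypoglycemia: BG < 40; clinical hypoglycemia: BG < 70;
    hyperglycemia: BG > 180; the 70-180 mg/dL target range is an assumed default.
    """
    n = len(readings)
    return {
        "severe_hypo_pct": 100 * sum(bg < 40 for bg in readings) / n,
        "hypo_pct": 100 * sum(bg < 70 for bg in readings) / n,
        "hyper_pct": 100 * sum(bg > 180 for bg in readings) / n,
        "in_target_pct": 100 * sum(target[0] <= bg <= target[1] for bg in readings) / n,
    }

# Fabricated readings from one patient-day
print(glucose_summary([65, 92, 110, 150, 145, 188, 230, 172, 130, 118]))
```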
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control
3705
A New Single Chamber Implantable Defibrillator with Atrial Sensing: A Practical Demonstration of Sensing and Ease of Implantation
Authors: Dietmar Bänsch, Ralph Schneider, Ibrahim Akin, Christoph A. Nienaber.
Institutions: University Hospital of Rostock, Germany.
Implantable cardioverter-defibrillators (ICDs) terminate ventricular tachycardia (VT) and ventricular fibrillation (VF) with high efficacy and can protect patients from sudden cardiac death (SCD). However, inappropriate shocks may occur if tachycardias are misdiagnosed. Inappropriate shocks are harmful and impair patient quality of life. The risk of inappropriate therapy increases with lower detection rates programmed in the ICD. Single-chamber detection poses greater risks for misdiagnosis when compared with dual-chamber devices that have the benefit of additional atrial information. However, using a dual-chamber device merely for the sake of detection is generally not accepted, since the risks associated with the second electrode may outweigh the benefits of detection. Therefore, BIOTRONIK developed a ventricular lead called the LinoxSMART S DX, which allows for the detection of atrial signals from two electrodes positioned at the atrial part of the ventricular electrode. This device contains two ring electrodes; one that contacts the atrial wall at the junction of the superior vena cava (SVC) and one positioned at the free floating part of the electrode in the atrium. The excellent signal quality can only be achieved by a special filter setting in the ICD (Lumax 540 and 740 VR-T DX, BIOTRONIK). Here, the ease of implantation of the system will be demonstrated.
Medicine, Issue 60, Implantable defibrillator, dual chamber, single chamber, tachycardia detection
3750
BioMEMS and Cellular Biology: Perspectives and Applications
Authors: Albert Folch.
Institutions: University of Washington.
The ability to culture cells has revolutionized hypothesis testing in basic cell and molecular biology research. It has become a standard methodology in drug screening, toxicology, and clinical assays, and is increasingly used in regenerative medicine. However, the traditional cell culture methodology essentially consisting of the immersion of a large population of cells in a homogeneous fluid medium and on a homogeneous flat substrate has become increasingly limiting both from a fundamental and practical perspective. Microfabrication technologies have enabled researchers to design, with micrometer control, the biochemical composition and topology of the substrate, and the medium composition, as well as the neighboring cell type in the surrounding cellular microenvironment. Additionally, microtechnology is conceptually well-suited for the development of fast, low-cost in vitro systems that allow for high-throughput culturing and analysis of cells under large numbers of conditions. In this interview, Albert Folch explains these limitations, how they can be overcome with soft lithography and microfluidics, and describes some relevant examples of research in his lab and future directions.
Biomedical Engineering, Issue 8, BioMEMS, Soft Lithography, Microfluidics, Agrin, Axon Guidance, Olfaction, Interview
300

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.
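The page does not disclose how the matching is implemented. For readers curious how abstract-to-video matching of this kind might be done, below is a minimal sketch of one plausible text-similarity approach, assuming the scikit-learn library; the titles, abstract text, and ranking are fabricated examples, not JoVE's actual algorithm or data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Fabricated corpora: one PubMed abstract vs. a few JoVE video descriptions
pubmed_abstract = ["standard error of the mean misused for data description in journals"]
jove_videos = [
    "EEG source analysis with pediatric head models",
    "statistics reporting standard deviation and standard error in articles",
    "isolation of adipose derived stem cells from lipoaspirates",
]

# Represent each text as a TF-IDF vector and rank videos by cosine similarity
vectorizer = TfidfVectorizer(stop_words="english")
video_vecs = vectorizer.fit_transform(jove_videos)
query_vec = vectorizer.transform(pubmed_abstract)
scores = cosine_similarity(query_vec, video_vecs).ravel()

ranked = sorted(zip(scores, jove_videos), reverse=True)
for score, title in ranked:
    print(f"{score:.2f}  {title}")
```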

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.