Pubmed Article
Do Earthquakes Shake Stock Markets?
PLoS ONE
PUBLISHED: 07-22-2015
This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.
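The return analysis described above can be illustrated with a minimal event-study sketch. This is a hypothetical illustration, not the paper's actual methodology: the function name `abnormal_return`, the estimation-window length, and the example numbers are all assumptions.

```python
# Hypothetical event-study sketch: the abnormal return on an earthquake day is
# the observed return minus the mean return over a preceding estimation window.

def abnormal_return(returns, event_index, estimation_window=250):
    """Return the event-day abnormal return given a list of daily returns."""
    start = max(0, event_index - estimation_window)
    baseline = returns[start:event_index]
    expected = sum(baseline) / len(baseline)
    return returns[event_index] - expected

# Example: flat 0.1% daily returns, then a -2% drop on the event day.
daily = [0.001] * 100 + [-0.02]
print(round(abnormal_return(daily, 100), 4))  # -0.021
```

A finding of market resilience would correspond to abnormal returns near zero around event dates.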
Authors: Keizo Takao, Tsuyoshi Miyakawa.
Published: 11-13-2006
ABSTRACT
Although the mouse genome has been fully sequenced, the functions of most of its genes are not yet known. Gene-targeting techniques, however, can be used to delete or manipulate a specific gene in mice. The influence of a given gene on a specific behavior can then be determined by conducting behavioral analyses of the mutant mice. The light/dark transition test is one of the most widely used tests for measuring anxiety-like behavior in mice. The test is based on the natural aversion of mice to brightly illuminated areas and on their spontaneous exploratory behavior in novel environments, and it is sensitive to anxiolytic drug treatment. The apparatus consists of a dark chamber and a brightly illuminated chamber. Mice are allowed to move freely between the two chambers. The number of entries into the bright chamber and the duration of time spent there are indices of bright-space anxiety in mice. To obtain phenotyping results for a strain of mutant mice that can be readily reproduced and compared with those of other mutants, behavioral test methods should be as nearly identical as possible between laboratories. The procedural differences that exist between laboratories, however, make it difficult to replicate or compare results among laboratories. Here, we present our protocol for the light/dark transition test as a movie so that its details can be demonstrated. In our laboratory, we have assessed more than 60 strains of mutant mice using the protocol shown in the movie. Those data will be disclosed as part of a public database that we are now constructing. Visualization of the protocol will facilitate understanding of the details of the entire experimental procedure, allowing for standardization of the protocols used across laboratories and comparisons of the behavioral phenotypes of various strains of mutant mice assessed using this test.
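The two indices named above (entries into the bright chamber and time spent there) can be computed from scored session data. This is an illustrative sketch only; the input format is an assumption, not the authors' scoring software, and it assumes the session starts in the dark chamber so every light bout counts as an entry.

```python
# Illustrative sketch (assumed data format): given a sequence of
# (chamber, duration_s) bouts from one session, compute the two standard
# indices: number of entries into the light chamber and total time there.

def light_dark_indices(bouts):
    """bouts: list of (chamber, duration_s) with chamber 'light' or 'dark'."""
    entries = sum(1 for chamber, _ in bouts if chamber == "light")
    light_time = sum(d for chamber, d in bouts if chamber == "light")
    return entries, light_time

session = [("dark", 120.0), ("light", 30.0), ("dark", 200.0), ("light", 45.5)]
entries, light_time = light_dark_indices(session)
print(entries, light_time)  # 2 75.5
```

Lower values of both indices are typically interpreted as higher anxiety-like behavior.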
Scalable High Throughput Selection From Phage-displayed Synthetic Antibody Libraries
Authors: Shane Miersch, Zhijian Li, Rachel Hanna, Megan E. McLaughlin, Michael Hornsby, Tet Matsuguchi, Marcin Paduch, Annika Sääf, Jim Wells, Shohei Koide, Anthony Kossiakoff, Sachdev S. Sidhu.
Institutions: The Recombinant Antibody Network, University of Toronto, University of California, San Francisco at Mission Bay, The University of Chicago.
The demand for antibodies that fulfill the needs of both basic and clinical research applications is high and will increase dramatically in the future. However, it is apparent that traditional monoclonal technologies alone are not up to this task. This has led to the development of alternative methods to satisfy the demand for high-quality, renewable affinity reagents against all accessible elements of the proteome. Toward this end, high throughput methods for conducting selections from phage-displayed synthetic antibody libraries have been devised for applications involving diverse antigens and optimized for rapid throughput and success. Herein, a protocol is described in detail that illustrates, with video demonstration, the parallel selection of Fab-phage clones from high diversity libraries against hundreds of targets using either a manual 96-channel liquid handler or an automated robotics system. Using this protocol, a single user can generate hundreds of antigens, select antibodies against them in parallel, and validate antibody binding within 6-8 weeks. Highlighted are: i) a viable antigen format, ii) pre-selection antigen characterization, iii) critical steps that influence the selection of specific and high affinity clones, and iv) ways of monitoring selection effectiveness and early stage antibody clone characterization. With this approach, we have obtained synthetic antibody fragments (Fabs) to many target classes including single-pass membrane receptors, secreted protein hormones, and multi-domain intracellular proteins. These fragments are readily converted to full-length antibodies and have been validated to exhibit high affinity and specificity. Further, they have been demonstrated to be functional in a variety of standard immunoassays including Western blotting, ELISA, cellular immunofluorescence, immunoprecipitation, and related assays.
This methodology will accelerate antibody discovery and ultimately bring us closer to realizing the goal of generating renewable, high quality antibodies to the proteome.
Immunology, Issue 95, Bacteria, Viruses, Amino Acids, Peptides, and Proteins, Nucleic Acids, Nucleotides, and Nucleosides, Life Sciences (General), phage display, synthetic antibodies, high throughput, antibody selection, scalable methodology
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Authors: Sara Tremblay, Vincent Beaulé, Sébastien Proulx, Louis-Philippe Lafleur, Julien Doyon, Małgorzata Marjańska, Hugo Théoret.
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been used increasingly over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood [33]. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner [41]. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration [34]. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunctions after stroke, which consists of bilateral stimulation of the primary motor cortices [27,30,31]. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. 
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
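The triage idea above (map dataset characteristics to one of the four categorical approaches) can be sketched as a small decision function. The thresholds and branching order here are hypothetical placeholders, not the scheme the authors actually propose.

```python
# Minimal decision sketch of the triage idea (thresholds are hypothetical):
# choose a segmentation approach from dataset characteristics such as
# signal-to-noise ratio and whether features have characteristic shapes.

def choose_approach(snr, characteristic_shapes, volume_fraction):
    """Return one of the four categorical approaches described in the text."""
    if snr > 10 and characteristic_shapes:
        return "automated custom algorithm + surface rendering"
    if snr > 5:
        return "semi-automated + surface rendering"
    if volume_fraction < 0.01:
        return "fully manual model building"
    return "manual tracing + surface rendering"

print(choose_approach(snr=12, characteristic_shapes=True, volume_fraction=0.05))
```

As the abstract notes, more than one approach may succeed for a given data set, so such a function would encode a recommended starting point rather than a unique answer.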
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues changes dramatically over development [3]. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis.
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Analysis of Volatile and Oxidation Sensitive Compounds Using a Cold Inlet System and Electron Impact Mass Spectrometry
Authors: Jens Sproß.
Institutions: Bielefeld University.
This video presents a protocol for the mass spectrometric analysis of volatile and oxidation sensitive compounds using electron impact ionization. The analysis of volatile and oxidation sensitive compounds by mass spectrometry is not easily achieved, as all state-of-the-art mass spectrometric methods require at least one sample preparation step, e.g., dissolution and dilution of the analyte (electrospray ionization), co-crystallization of the analyte with a matrix compound (matrix-assisted laser desorption/ionization), or transfer of the prepared samples into the ionization source of the mass spectrometer, to be conducted under atmospheric conditions. Here, the use of a sample inlet system is described which enables the analysis of volatile metal organyls, silanes, and phosphanes using a sector field mass spectrometer equipped with an electron impact ionization source. All sample preparation steps and the sample introduction into the ion source of the mass spectrometer take place either under air-free conditions or under vacuum, enabling the analysis of compounds highly susceptible to oxidation. The presented technique is of particular interest for inorganic chemists working with metal organyls, silanes, or phosphanes, which have to be handled under inert conditions, such as with the Schlenk technique. The principle of operation is presented in this video.
Chemistry, Issue 91, mass spectrometry, electron impact, inlet system, volatile, air sensitive
Evaluation of a Novel Laser-assisted Coronary Anastomotic Connector - the Trinity Clip - in a Porcine Off-pump Bypass Model
Authors: David Stecher, Glenn Bronkers, Jappe O.T. Noest, Cornelis A.F. Tulleken, Imo E. Hoefer, Lex A. van Herwerden, Gerard Pasterkamp, Marc P. Buijsrogge.
Institutions: University Medical Center Utrecht, Vascular Connect b.v.
To simplify and facilitate beating heart (i.e., off-pump), minimally invasive coronary artery bypass surgery, a new coronary anastomotic connector, the Trinity Clip, was developed based on the excimer laser-assisted nonocclusive anastomosis technique. The Trinity Clip connector enables simplified, sutureless, and nonocclusive connection of the graft to the coronary artery, and an excimer laser catheter punches the opening of the anastomosis. Consequently, owing to the completely nonocclusive anastomosis construction, coronary conditioning (i.e., occluding or shunting) is not necessary, in contrast to the conventional anastomotic technique, hence simplifying the off-pump bypass procedure. Prior to clinical application in coronary artery bypass grafting, the safety and quality of this novel connector will be evaluated in a long-term experimental porcine off-pump coronary artery bypass (OPCAB) study. In this paper, we describe how to evaluate the coronary anastomosis in the porcine OPCAB model using various techniques to assess its quality. Representative results are summarized and visually demonstrated.
Medicine, Issue 93, Anastomosis, coronary, anastomotic connector, anastomotic coupler, excimer laser-assisted nonocclusive anastomosis (ELANA), coronary artery bypass graft (CABG), off-pump coronary artery bypass (OPCAB), beating heart surgery, excimer laser, porcine model, experimental, medical device
Physical, Chemical and Biological Characterization of Six Biochars Produced for the Remediation of Contaminated Sites
Authors: Mackenzie J. Denyes, Michèle A. Parisien, Allison Rutter, Barbara A. Zeeb.
Institutions: Royal Military College of Canada, Queen's University.
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g. carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for the physical and chemical characteristics of biochar. Six biochars made from three different feedstocks and at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals and mercury as well as nutrients (phosphorous, nitrite, nitrate and ammonium as nitrogen). The protocol also includes the biological testing procedures, earthworm avoidance and germination assays. Based on the quality assurance / quality control (QA/QC) results of blanks, duplicates, standards and reference materials, all methods were determined adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI and there were few differences among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays.
Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements or remediation.
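A screening step like the one that flagged the Old biochar can be sketched as a threshold check. The limit values below are placeholders for illustration only; they are not the actual IBI criteria, and the measured concentrations are invented.

```python
# Hypothetical QA sketch (limit values are placeholders, NOT the actual IBI
# criteria): flag a biochar when any measured metal exceeds its limit, as
# happened for the "Old" biochar with arsenic, chromium, copper and lead.

LIMITS_MG_PER_KG = {"As": 13, "Cr": 93, "Cu": 143, "Pb": 121}  # placeholders

def exceedances(measured):
    """Return the metals whose measured concentration exceeds the limit."""
    return sorted(
        m for m, v in measured.items()
        if v > LIMITS_MG_PER_KG.get(m, float("inf"))
    )

old_biochar = {"As": 20.0, "Cr": 150.0, "Cu": 200.0, "Pb": 300.0}  # invented values
print(exceedances(old_biochar))  # ['As', 'Cr', 'Cu', 'Pb']
```

A biochar passing every threshold would return an empty list and could proceed to the biological assays.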
Environmental Sciences, Issue 93, biochar, characterization, carbon sequestration, remediation, International Biochar Initiative (IBI), soil amendment
Adapting Human Videofluoroscopic Swallow Study Methods to Detect and Characterize Dysphagia in Murine Disease Models
Authors: Teresa E. Lever, Sabrina M. Braun, Ryan T. Brooks, Rebecca A. Harris, Loren L. Littrell, Ryan M. Neff, Cameron J. Hinkel, Mitchell J. Allen, Mollie A. Ulsas.
Institutions: University of Missouri.
This study adapted human videofluoroscopic swallowing study (VFSS) methods for use with murine disease models for the purpose of facilitating translational dysphagia research. Successful outcomes are dependent upon three critical components: test chambers that permit self-feeding while standing unrestrained in a confined space, recipes that mask the aversive taste/odor of commercially-available oral contrast agents, and a step-by-step test protocol that permits quantification of swallow physiology. Elimination of one or more of these components will have a detrimental impact on the study results. Moreover, the energy level capability of the fluoroscopy system will determine which swallow parameters can be investigated. Most research centers have high energy fluoroscopes designed for use with people and larger animals, which results in exceptionally poor image quality when testing mice and other small rodents. Despite this limitation, we have identified seven VFSS parameters that are consistently quantifiable in mice when using a high energy fluoroscope in combination with the new murine VFSS protocol. We recently obtained a low energy fluoroscopy system with exceptionally high imaging resolution and magnification capabilities that was designed for use with mice and other small rodents. Preliminary work using this new system, in combination with the new murine VFSS protocol, has identified 13 swallow parameters that are consistently quantifiable in mice, which is nearly double the number obtained using conventional (i.e., high energy) fluoroscopes. Identification of additional swallow parameters is expected as we optimize the capabilities of this new system. Results thus far demonstrate the utility of using a low energy fluoroscopy system to detect and quantify subtle changes in swallow physiology that may otherwise be overlooked when using high energy fluoroscopes to investigate murine disease models.
Medicine, Issue 97, mouse, murine, rodent, swallowing, deglutition, dysphagia, videofluoroscopy, radiation, iohexol, barium, palatability, taste, translational, disease models
Establishment of Human Epithelial Enteroids and Colonoids from Whole Tissue and Biopsy
Authors: Maxime M. Mahe, Nambirajan Sundaram, Carey L. Watson, Noah F. Shroyer, Michael A. Helmrath.
Institutions: Cincinnati Children's Hospital Medical Center, Baylor College of Medicine.
The epithelium of the gastrointestinal tract is constantly renewed. This process is driven by the proliferation of intestinal stem cells (ISCs) and their progeny, which progressively migrate and differentiate toward the tips of the villi. These processes, essential for gastrointestinal homeostasis, have been extensively studied using multiple approaches. Ex vivo technologies, especially primary cell cultures, have proven promising for understanding intestinal epithelial functions. A long-term primary culture system for mouse intestinal crypts has been established to generate 3-dimensional epithelial organoids. These epithelial structures contain crypt- and villus-like domains reminiscent of normal gut epithelium. Commonly termed "enteroids" when derived from small intestine and "colonoids" when derived from colon, they differ from organoids, which also contain mesenchymal tissue. Additionally, these enteroids/colonoids continuously produce all cell types normally found within the intestinal epithelium. This in vitro organ-like culture system is rapidly becoming the new gold standard for investigation of intestinal stem cell biology and epithelial cell physiology. This technology has recently been transferred to the study of the human gut. The establishment of human-derived epithelial enteroids and colonoids from small intestine and colon has been possible through the utilization of specific culture media that allow their growth and maintenance over time. Here, we describe a method to establish a small intestinal and colon crypt-derived system from human whole tissue or biopsies. We emphasize the culture modalities that are essential for the successful growth and maintenance of human enteroids and colonoids.
Medicine, Issue 97, Intestinal stem cells, 3-dimensional cell culture, human, small intestine, colon, biopsy, enteroids, minigut, epithelial organoids, in-vitro, colonoids, enterospheres
Mass Spectrometric Approaches to Study Protein Structure and Interactions in Lyophilized Powders
Authors: Balakrishnan S. Moorthy, Lavanya K. Iyer, Elizabeth M. Topp.
Institutions: Purdue University.
Solid-state amide hydrogen/deuterium exchange (ssHDX-MS) and solid-state side-chain photolytic labeling (ssPL-MS), followed by mass spectrometric analysis, can be valuable for characterizing lyophilized formulations of protein therapeutics. Labeling followed by suitable proteolytic digestion allows the protein structure and interactions to be mapped with peptide-level resolution. Since the protein structural elements are stabilized by a network of chemical bonds from the main-chains and side-chains of amino acids, specific labeling of atoms in the amino acid residues provides insight into the structure and conformation of the protein. In contrast to routine methods used to study proteins in lyophilized solids (e.g., FTIR), ssHDX-MS and ssPL-MS provide quantitative and site-specific information. The extent of deuterium incorporation and the kinetic parameters can be related to rapidly and slowly exchanging amide pools (Nfast, Nslow) and directly reflect the degree of protein folding and structure in lyophilized formulations. Stable photolytic labeling does not undergo back-exchange, an advantage over ssHDX-MS. Here, we provide detailed protocols for both ssHDX-MS and ssPL-MS, using myoglobin (Mb) as a model protein in lyophilized formulations containing either trehalose or sorbitol.
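The fast/slow amide-pool description above corresponds to a two-pool exponential uptake model. The rate constants and pool sizes below are illustrative assumptions, not fitted values from the study.

```python
# Worked sketch of the two-pool exchange model implied by Nfast/Nslow
# (parameter values are illustrative): deuterium uptake D(t) rises as two
# amide pools exchange at fast and slow rates.
import math

def deuterium_uptake(t, n_fast, k_fast, n_slow, k_slow):
    """D(t) = Nfast*(1 - exp(-kfast*t)) + Nslow*(1 - exp(-kslow*t))"""
    return n_fast * (1 - math.exp(-k_fast * t)) + n_slow * (1 - math.exp(-k_slow * t))

# At long times, uptake plateaus near Nfast + Nslow.
print(round(deuterium_uptake(1000.0, n_fast=10, k_fast=1.0, n_slow=5, k_slow=0.01), 2))
```

In practice the parameters Nfast, Nslow and the rate constants would be obtained by nonlinear regression of measured uptake against exchange time.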
Chemistry, Issue 98, Amide hydrogen/deuterium exchange, photolytic labeling, mass spectrometry, lyophilized formulations, photo-leucine, solid-state, protein structure, protein conformation, protein dynamics, secondary structure, protein stability, excipients
Paw-Dragging: a Novel, Sensitive Analysis of the Mouse Cylinder Test
Authors: R. Brian Roome, Jacqueline L. Vanderluit.
Institutions: Memorial University of Newfoundland, McGill University.
The cylinder test is routinely used to predict focal ischemic damage to the forelimb motor cortex in rodents. When placed in the cylinder, rodents explore by rearing and touching the walls of the cylinder with their forelimb paws for postural support. Following ischemic injury to the forelimb sensorimotor cortex, rats rely more heavily on their unaffected forelimb paw for postural support, resulting in fewer touches with their affected paw, which is termed forelimb asymmetry. In contrast, focal ischemic damage in the mouse brain fails to produce comparably consistent deficits in forelimb asymmetry. While forelimb asymmetry deficits are infrequently observed, mice do demonstrate a novel post-stroke behaviour termed "paw-dragging". Paw-dragging is the tendency for a mouse to drag its affected paw along the cylinder wall rather than directly push off from the wall when dismounting from a rear to a four-legged stance. We have previously demonstrated that paw-dragging behaviour is highly sensitive to small cortical ischemic injuries to the forelimb motor cortex. Here we provide a detailed protocol for paw-dragging analysis. We define what a paw-drag is and demonstrate how to quantify paw-dragging behaviour. The cylinder test is a simple and inexpensive test to administer and does not require pre-training or food deprivation strategies. Paw-dragging analysis of the cylinder test fills a niche for predicting cortical ischemic injuries such as photothrombosis and Endothelin-1 (ET-1)-induced ischemia – two models that are ever-increasing in popularity and produce smaller focal injuries than middle cerebral artery occlusion. Finally, measuring paw-dragging behaviour in the cylinder test will allow studies of functional recovery after cortical injury using a wide cohort of transgenic mouse strains where previous forelimb asymmetry analysis has failed to detect consistent deficits.
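Quantifying the behaviour could reduce to scoring each wall contact and reporting a percentage, roughly as below. The event labels and scoring scheme are assumptions for illustration; they are not the authors' published scoring software.

```python
# Hypothetical scoring sketch (event labels are assumed): score each wall
# contact of the affected paw as a clean push-off or a paw-drag, then report
# the paw-drag percentage for the trial.

def paw_drag_percentage(contacts):
    """contacts: list of 'push' or 'drag' events for the affected forelimb."""
    if not contacts:
        return 0.0
    return 100.0 * contacts.count("drag") / len(contacts)

trial = ["push", "drag", "push", "drag", "drag"]  # 3 drags out of 5 contacts
print(paw_drag_percentage(trial))  # 60.0
```

A higher paw-drag percentage after injury, relative to baseline, would indicate a motor deficit in the affected forelimb.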
Behavior, Issue 98, Neuroscience, Medicine, brain, behavioural testing, mouse, cylinder test, focal ischemic stroke, forelimb motor cortex
High Throughput Quantitative Expression Screening and Purification Applied to Recombinant Disulfide-rich Venom Proteins Produced in E. coli
Authors: Natalie J. Saez, Hervé Nozach, Marilyne Blemont, Renaud Vincentelli.
Institutions: Aix-Marseille Université, Commissariat à l'énergie atomique et aux énergies alternatives (CEA) Saclay, France.
Escherichia coli (E. coli) is the most widely used expression system for the production of recombinant proteins for structural and functional studies. However, purifying proteins is sometimes challenging since many proteins are expressed in an insoluble form. When working with difficult or multiple targets it is therefore recommended to use high throughput (HTP) protein expression screening on a small scale (1-4 ml cultures) to quickly identify conditions for soluble expression. To cope with the various structural genomics programs of the lab, a quantitative (within a range of 0.1-100 mg/L culture of recombinant protein) and HTP protein expression screening protocol was implemented and validated on thousands of proteins. The protocols were automated with the use of a liquid handling robot but can also be performed manually without specialized equipment. Disulfide-rich venom proteins are gaining increasing recognition for their potential as therapeutic drug leads. They can be highly potent and selective, but their complex disulfide bond networks make them challenging to produce. As a member of the FP7 European Venomics project (www.venomics.eu), our challenge is to develop successful production strategies with the aim of producing thousands of novel venom proteins for functional characterization. Aided by the redox properties of disulfide bond isomerase DsbC, we adapted our HTP production pipeline for the expression of oxidized, functional venom peptides in the E. coli cytoplasm. The protocols are also applicable to the production of diverse disulfide-rich proteins. Here we demonstrate our pipeline applied to the production of animal venom proteins. With the protocols described herein it is likely that soluble disulfide-rich proteins will be obtained in as little as a week. 
Even from a small scale, there is the potential to use the purified proteins for validating the oxidation state by mass spectrometry, for characterization in pilot studies, or for sensitive micro-assays.
Bioengineering, Issue 89, E. coli, expression, recombinant, high throughput (HTP), purification, auto-induction, immobilized metal affinity chromatography (IMAC), tobacco etch virus protease (TEV) cleavage, disulfide bond isomerase C (DsbC) fusion, disulfide bonds, animal venom proteins/peptides
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
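The "software-guided setup of optimal experiment combinations" step starts from the space of all factor-level combinations. The sketch below enumerates a full-factorial design; the factor names and levels are hypothetical, and real DoE software would then select a reduced, optimal subset of these runs.

```python
# Minimal sketch of enumerating a candidate design space (factor names and
# levels are hypothetical): a full-factorial design lists every combination
# of factor levels, which DoE software then reduces to an optimal subset.
from itertools import product

factors = {
    "promoter": ["35S", "nos"],          # regulatory element in the construct
    "incubation_temp_C": [22, 25],       # incubation condition
    "plant_age_d": [35, 42, 49],         # plant growth/development parameter
}
design = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(design))  # 2 * 2 * 3 = 12 runs
```

Step-wise design augmentation, as described above, would add further runs to this initial design only where the fitted model remains uncertain.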
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Pulse Wave Velocity Testing in the Baltimore Longitudinal Study of Aging
Authors: Melissa David, Omar Malti, Majd AlGhatrif, Jeanette Wright, Marco Canepa, James B. Strait.
Institutions: National Institute of Aging.
Carotid-femoral pulse wave velocity is considered the gold standard for measurements of central arterial stiffness obtained through noninvasive methods [1]. Subjects are placed in the supine position and allowed to rest quietly for at least 10 min prior to the start of the exam. The proper cuff size is selected and a blood pressure is obtained using an oscillometric device. Once a resting blood pressure has been obtained, pressure waveforms are acquired from the right femoral and right common carotid arteries. The system then automatically calculates the pulse transit time between these two sites (using the carotid artery as a surrogate for the descending aorta). Body surface measurements are used to determine the distance traveled by the pulse wave between the two sampling sites. This distance is then divided by the pulse transit time resulting in the pulse wave velocity. The measurements are performed in triplicate and the average is used for analysis.
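The calculation described above (distance divided by transit time, averaged over triplicate measurements) can be worked through with illustrative numbers; the distances and transit times below are invented for the example.

```python
# Worked example of the PWV calculation (numbers are illustrative): PWV is
# the carotid-to-femoral path distance divided by the pulse transit time,
# averaged over triplicate measurements.

def pulse_wave_velocity(distance_m, transit_time_s):
    return distance_m / transit_time_s

# Triplicate (distance in m, transit time in s) measurements.
measurements = [(0.50, 0.062), (0.50, 0.060), (0.50, 0.064)]
pwvs = [pulse_wave_velocity(d, t) for d, t in measurements]
mean_pwv = sum(pwvs) / len(pwvs)
print(round(mean_pwv, 2), "m/s")  # 8.07 m/s
```

Higher mean PWV values indicate stiffer central arteries, which is why the averaged triplicate is the quantity carried forward for analysis.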
Medicine, Issue 84, Pulse Wave Velocity (PWV), Pulse Wave Analysis (PWA), Arterial stiffness, Aging, Cardiovascular, Carotid-femoral pulse
Morris Water Maze Experiment
Authors: Joseph Nunez.
Institutions: Michigan State University (MSU).
The Morris water maze is widely used to study spatial memory and learning. Animals are placed in a pool of water that is colored opaque with powdered non-fat milk or non-toxic tempera paint, where they must swim to a hidden escape platform. Because they are in opaque water, the animals cannot see the platform, and cannot rely on scent to find the escape route. Instead, they must rely on external/extra-maze cues. As the animals become more familiar with the task, they are able to find the platform more quickly. Developed by Richard G. Morris in 1984, this paradigm has become one of the "gold standards" of behavioral neuroscience.
Behavior, Issue 19, Declarative, Hippocampus, Memory, Procedural, Rodent, Spatial Learning
Single Molecule Methods for Monitoring Changes in Bilayer Elastic Properties
Authors: Helgi Ingolfson, Ruchi Kapoor, Shemille A. Collingwood, Olaf Sparre Andersen.
Institutions: Weill Cornell Medical College, Weill Cornell Medical College of Cornell University.
Membrane protein function is regulated by the cell membrane lipid composition. This regulation is due to a combination of specific lipid-protein interactions and more general lipid bilayer-protein interactions. These interactions are particularly important in pharmacological research, as many current pharmaceuticals on the market can alter the lipid bilayer material properties, which can lead to altered membrane protein function. The formation of gramicidin channels is dependent on conformational changes in gramicidin subunits, which are in turn dependent on the properties of the lipid bilayer. Hence, the gramicidin channel current reports altered bilayer properties caused by such compounds.
Cellular Biology, Issue 21, Springer Protocols, Membrane Biophysics, Gramicidin Channels, Artificial Bilayers, Bilayer Elastic Properties
Using the Gene Pulser MXcell Electroporation System to Transfect Primary Cells with High Efficiency
Authors: Adam M. McCoy, Michelle L. Collins, Luis A. Ugozzoli.
Institutions: Bio-Rad Laboratories, Inc.
It is becoming increasingly apparent that electroporation is the most effective way to introduce plasmid DNA or siRNA into primary cells. The Gene Pulser MXcell electroporation system and Gene Pulser electroporation buffer (Bio-Rad) were specifically developed to easily transfect nucleic acids into mammalian cells and difficult-to-transfect cells, such as primary and stem cells. We will demonstrate how to perform a simple experiment to quickly identify the best electroporation conditions. We will demonstrate how to run several samples through a range of electroporation conditions so that an experiment can be conducted at the same time as optimization is performed. We will also show how optimal conditions identified using 96-well electroporation plates can be used with standard electroporation cuvettes, facilitating the switch from electroporation plates to electroporation cuvettes while maintaining the same electroporation efficiency. In the video, we will also discuss some of the key factors that can lead to the success or failure of electroporation experiments.
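Laying out a range of electroporation conditions across a 96-well plate, as described above, amounts to enumerating a grid of parameter combinations and mapping them to wells. The sketch below is illustrative only (it is not Bio-Rad software), and the specific voltages and capacitances are invented values for demonstration.

```python
from itertools import product

# Illustrative sketch: a small optimization grid of electroporation
# conditions assigned to wells of a 96-well plate, in replicate rows.
# The voltage and capacitance values are assumptions, not recommendations.
voltages_v = [150, 200, 250, 300]
capacitances_uf = [250, 350, 500]

conditions = list(product(voltages_v, capacitances_uf))  # 4 x 3 = 12 conditions

# Each condition occupies one column; rows A and B are duplicate replicates.
plate_layout = {
    f"{row}{col + 1}": cond
    for row in "AB"
    for col, cond in enumerate(conditions)
}
```

Running all combinations side by side in one plate is what allows the experiment and the optimization to proceed at the same time; the best-performing well's condition can then be transferred to standard cuvettes.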
Cellular Biology, Issue 35, Primary cell electroporation, MEF, Bio-Rad, Gene Pulser MXcell, transfection, GFP
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise techniques1,5,6,7,8,9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. These advantages, however, come with a high barrier to entry, which has prevented more widespread application of multivariate approaches in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice. 
A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
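The distinction between the two approaches can be illustrated with synthetic data. In the pure-Python sketch below (all data and names are invented for illustration), a univariate analysis would test each "voxel" separately, whereas the multivariate analysis summarizes the shared covariance across voxels with a single spatial pattern, here the leading eigenvector of the voxel covariance matrix obtained by power iteration.

```python
import random

# Toy illustration: a shared "network" signal drives all voxels, plus noise.
random.seed(0)
n_subjects, n_voxels = 200, 3

data = []
for _ in range(n_subjects):
    network = random.gauss(0, 1)
    data.append([network + 0.3 * random.gauss(0, 1) for _ in range(n_voxels)])

# Center each voxel column and form the voxel-by-voxel covariance matrix.
means = [sum(row[j] for row in data) / n_subjects for j in range(n_voxels)]
centered = [[row[j] - means[j] for j in range(n_voxels)] for row in data]
cov = [[sum(r[i] * r[j] for r in centered) / (n_subjects - 1)
        for j in range(n_voxels)] for i in range(n_voxels)]

# Power iteration for the leading eigenvector: the covarying spatial pattern.
pattern = [1.0] * n_voxels
for _ in range(100):
    nxt = [sum(cov[i][j] * pattern[j] for j in range(n_voxels))
           for i in range(n_voxels)]
    norm = sum(x * x for x in nxt) ** 0.5
    pattern = [x / norm for x in nxt]

# Because one network drives every voxel, all loadings share the same sign.
same_sign = all(p * pattern[0] > 0 for p in pattern)
```

One pattern, one statistical test; the univariate alternative would run one test per voxel and then correct for multiple comparisons, which is the power disadvantage described above.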
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
Strategies for Study of Neuroprotection from Cold-preconditioning
Authors: Heidi M. Mitchell, David M. White, Richard P. Kraig.
Institutions: The University of Chicago Medical Center.
Neurological injury is a frequent cause of morbidity and mortality from general anesthesia and related surgical procedures that could be alleviated by the development of effective, easy-to-administer, and safe preconditioning treatments. We seek to define the neural immune signaling responsible for cold-preconditioning as a means to identify novel targets for therapeutics development to protect the brain before injury onset. Low-level pro-inflammatory mediator signaling changes over time are essential for cold-preconditioning neuroprotection. This signaling is consistent with the basic tenets of physiological conditioning hormesis, which require that irritative stimuli reach a threshold magnitude with sufficient time for adaptation to the stimuli for protection to become evident. Accordingly, delineation of the immune signaling involved in cold-preconditioning neuroprotection requires that biological systems, experimental manipulations, and technical capacities be highly reproducible and sensitive. Our approach is to use hippocampal slice cultures as an in vitro model that closely reflects the in vivo counterpart, with multi-synaptic neural networks influenced by mature and quiescent macroglia/microglia. This glial state is particularly important for microglia since they are the principal source of cytokines, which are operative in the femtomolar range. Also, slice cultures can be maintained in vitro for several weeks, which is sufficient time to evoke activating stimuli and assess adaptive responses. Finally, environmental conditions can be accurately controlled using slice cultures so that cytokine signaling of cold-preconditioning can be measured, mimicked, and modulated to dissect the critical node aspects. Cytokine signaling system analyses require the use of sensitive and reproducible multiplexed techniques. 
We use quantitative PCR for TNF-α to screen for microglial activation, followed by real-time qPCR array screening to assess tissue-wide cytokine changes. The latter is among the most sensitive and reproducible means to measure multiple cytokine system signaling changes simultaneously. Significant changes are confirmed with targeted qPCR and then protein detection. We probe for tissue-based cytokine protein changes using multiplexed microsphere flow cytometric assays based on Luminex technology. Cell-specific cytokine production is determined with double-label immunohistochemistry. Taken together, this brain tissue preparation and style of use, coupled to the suggested investigative strategies, may be an optimal approach for identifying potential targets for the development of novel therapeutics that could mimic the advantages of cold-preconditioning.
Neuroscience, Issue 43, innate immunity, hormesis, microglia, hippocampus, slice culture, immunohistochemistry, neural-immune, gene expression, real-time PCR
Western Blotting: Sample Preparation to Detection
Authors: Anna Eslami, Jesse Lujan.
Institutions: EMD Chemicals Inc..
Western blotting is an analytical technique used to detect specific proteins in a given sample of tissue homogenate or extract. It uses gel electrophoresis to separate native or denatured proteins by the length of the polypeptide (denaturing conditions) or by the 3-D structure of the protein (native/ non-denaturing conditions). The proteins are then transferred to a membrane (typically nitrocellulose or PVDF), where they are probed (detected) using antibodies specific to the target protein.
Basic Protocols, Issue 44, western blot, SDS-PAGE, electrophoresis, protein transfer, immunoblot, protein separation, PVDF, nitrocellulose, ECL
Modeling Neural Immune Signaling of Episodic and Chronic Migraine Using Spreading Depression In Vitro
Authors: Aya D. Pusic, Yelena Y. Grinberg, Heidi M. Mitchell, Richard P. Kraig.
Institutions: The University of Chicago Medical Center.
Migraine and its transformation to chronic migraine are healthcare burdens in need of improved treatment options. We seek to define how neural immune signaling modulates the susceptibility to migraine, modeled in vitro using spreading depression (SD), as a means to develop novel therapeutic targets for episodic and chronic migraine. SD is the likely cause of migraine aura and migraine pain. It is a paroxysmal loss of neuronal function triggered by initially increased neuronal activity, which slowly propagates within susceptible brain regions. Normal brain function is exquisitely sensitive to, and relies on, coincident low-level immune signaling. Thus, neural immune signaling likely affects electrical activity of SD, and therefore migraine. Pain perception studies of SD in whole animals are fraught with difficulties, but whole animals are well suited to examine systems biology aspects of migraine since SD activates trigeminal nociceptive pathways. However, whole animal studies alone cannot be used to decipher the cellular and neural circuit mechanisms of SD. Instead, in vitro preparations where environmental conditions can be controlled are necessary. Here, it is important to recognize the limitations of acute slices and the distinct advantages of hippocampal slice cultures. Acute brain slices cannot reveal subtle changes in immune signaling since preparing the slices alone triggers pro-inflammatory changes that last for days, epileptiform behavior due to the high levels of oxygen tension needed to vitalize the slices, and irreversible cell injury at anoxic slice centers. In contrast, we examine immune signaling in mature hippocampal slice cultures since the cultures closely parallel their in vivo counterpart with mature trisynaptic function; show quiescent astrocytes, microglia, and cytokine levels; and allow SD to be easily induced in an unanesthetized preparation. 
Furthermore, the slices are long-lived and SD can be induced on consecutive days without injury, making this preparation the only means to date capable of modeling the neuroimmune consequences of chronic SD, and thus perhaps chronic migraine. We use electrophysiological techniques and non-invasive imaging to measure neuronal cell and circuit functions coincident with SD. Neural immune gene expression variables are measured with qPCR screening, qPCR arrays, and, importantly, use of cDNA preamplification for detection of ultra-low level targets such as interferon-gamma using whole, regional, or specific cell enhanced (via laser dissection microscopy) sampling. Cytokine cascade signaling is further assessed with multiplexed phosphoprotein-related targets, with gene expression and phosphoprotein changes confirmed via cell-specific immunostaining. Pharmacological and siRNA strategies are used to mimic and modulate SD immune signaling.
Neuroscience, Issue 52, innate immunity, hormesis, microglia, T-cells, hippocampus, slice culture, gene expression, laser dissection microscopy, real-time qPCR, interferon-gamma
Aseptic Laboratory Techniques: Plating Methods
Authors: Erin R. Sanders.
Institutions: University of California, Los Angeles.
Microorganisms are present on all inanimate surfaces, creating ubiquitous sources of possible contamination in the laboratory. Experimental success relies on the ability of a scientist to sterilize work surfaces and equipment as well as prevent contact of sterile instruments and solutions with non-sterile surfaces. Here we present the steps for several plating methods routinely used in the laboratory to isolate, propagate, or enumerate microorganisms such as bacteria and phage. All five methods incorporate aseptic technique, or procedures that maintain the sterility of experimental materials. Procedures described include (1) streak-plating bacterial cultures to isolate single colonies, (2) pour-plating and (3) spread-plating to enumerate viable bacterial colonies, (4) soft agar overlays to isolate phage and enumerate plaques, and (5) replica-plating to transfer cells from one plate to another in an identical spatial pattern. These procedures can be performed at the laboratory bench, provided they involve non-pathogenic strains of microorganisms (Biosafety Level 1, BSL-1). If working with BSL-2 organisms, then these manipulations must take place in a biosafety cabinet. Consult the most current edition of the Biosafety in Microbiological and Biomedical Laboratories (BMBL) as well as Material Safety Data Sheets (MSDS) for Infectious Substances to determine the biohazard classification as well as the safety precautions and containment facilities required for the microorganism in question. Bacterial strains and phage stocks can be obtained from research investigators, companies, and collections maintained by particular organizations such as the American Type Culture Collection (ATCC). It is recommended that non-pathogenic strains be used when learning the various plating methods. By following the procedures described in this protocol, students should be able to:
● Perform plating procedures without contaminating media.
● Isolate single bacterial colonies by the streak-plating method.
● Use pour-plating and spread-plating methods to determine the concentration of bacteria.
● Perform soft agar overlays when working with phage.
● Transfer bacterial cells from one plate to another using the replica-plating procedure.
● Given an experimental task, select the appropriate plating method.
Basic Protocols, Issue 63, Streak plates, pour plates, soft agar overlays, spread plates, replica plates, bacteria, colonies, phage, plaques, dilutions
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3,4,5,6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Evaluation of Polymeric Gene Delivery Nanoparticles by Nanoparticle Tracking Analysis and High-throughput Flow Cytometry
Authors: Ron B. Shmueli, Nupura S. Bhise, Jordan J. Green.
Institutions: Johns Hopkins University School of Medicine.
Non-viral gene delivery using polymeric nanoparticles has emerged as an attractive approach for gene therapy to treat genetic diseases1 and as a technology for regenerative medicine2. Unlike viruses, which have significant safety issues, polymeric nanoparticles can be designed to be non-toxic, non-immunogenic, non-mutagenic, easier to synthesize, chemically versatile, capable of carrying larger nucleic acid cargo, and biodegradable and/or environmentally responsive. Cationic polymers self-assemble with negatively charged DNA via electrostatic interaction to form complexes on the order of 100 nm that are commonly termed polymeric nanoparticles. Examples of biomaterials used to form nanoscale polycationic gene delivery nanoparticles include polylysine, polyphosphoesters, poly(amidoamine)s and polyethylenimine (PEI), which is a non-degradable off-the-shelf cationic polymer commonly used for nucleic acid delivery1,3. Poly(beta-amino ester)s (PBAEs) are a newer class of cationic polymers4 that are hydrolytically degradable5,6 and have been shown to be effective at gene delivery to hard-to-transfect cell types such as human retinal endothelial cells (HRECs)7, mouse mammary epithelial cells8, human brain cancer cells9 and macrovascular (human umbilical vein, HUVECs) endothelial cells10. A new protocol to characterize polymeric nanoparticles utilizing nanoparticle tracking analysis (NTA) is described. In this approach, both the particle size distribution and the distribution of the number of plasmids per particle are obtained11. In addition, a high-throughput 96-well plate transfection assay for rapid screening of the transfection efficacy of polymeric nanoparticles is presented. In this protocol, poly(beta-amino ester)s (PBAEs) are used as model polymers and human retinal endothelial cells (HRECs) are used as model human cells. This protocol can be easily adapted to evaluate any polymeric nanoparticle and any cell type of interest in a multi-well plate format.
Biomedical Engineering, Issue 73, Bioengineering, Tissue Engineering, Cellular Biology, Medicine, Genetics, Biocompatible Materials, Biopolymers, Drug Delivery Systems, Nanotechnology, bioengineering (general), Therapeutics, Nanoparticle, poly(beta-amino ester), high-throughput, transfection, nanoparticle tracking analysis, biomaterial, gene delivery, flow cytometry
A Microplate Assay to Assess Chemical Effects on RBL-2H3 Mast Cell Degranulation: Effects of Triclosan without Use of an Organic Solvent
Authors: Lisa M. Weatherly, Rachel H. Kennedy, Juyoung Shim, Julie A. Gosse.
Institutions: University of Maine, Orono.
Mast cells play important roles in allergic disease and immune defense against parasites. Once activated (e.g. by an allergen), they degranulate, a process that results in the exocytosis of allergic mediators. Modulation of mast cell degranulation by drugs and toxicants may have positive or adverse effects on human health. Mast cell function has been dissected in detail with the use of rat basophilic leukemia mast cells (RBL-2H3), a widely accepted model of human mucosal mast cells3-5. The mast cell granule component and allergic mediator β-hexosaminidase, which is released linearly in tandem with histamine from mast cells6, can easily and reliably be measured through reaction with a fluorogenic substrate, yielding measurable fluorescence intensity in a microplate assay that is amenable to high-throughput studies1. We have adapted this degranulation assay, originally published by Naal et al.1, for the screening of drugs and toxicants, and demonstrate its use here. Triclosan is a broad-spectrum antibacterial agent that is present in many consumer products and has been found to be a therapeutic aid in human allergic skin disease7-11, although the mechanism for this effect is unknown. Here we demonstrate an assay for the effect of triclosan on mast cell degranulation. We recently showed that triclosan strongly affects mast cell function2. In an effort to avoid the use of an organic solvent, triclosan is dissolved directly into aqueous buffer with heat and stirring, and the resultant concentration is confirmed using UV-Vis spectrophotometry (using ε280 = 4,200 L/M/cm)12. This protocol has the potential to be used with a variety of chemicals to determine their effects on mast cell degranulation and, more broadly, their allergic potential.
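The UV-Vis confirmation step above is an application of the Beer-Lambert law, A = ε·c·l, so the concentration follows as c = A/(ε·l). In the sketch below, the extinction coefficient comes from the abstract (ε280 = 4,200 L/mol/cm for triclosan), while the 1 cm path length and the example absorbance are illustrative assumptions.

```python
# Beer-Lambert law: A = epsilon * c * l  =>  c = A / (epsilon * l)
EPSILON_280 = 4200.0   # L / (mol * cm), for triclosan at 280 nm (from the text)
PATH_LENGTH_CM = 1.0   # standard cuvette path length (assumed)

def triclosan_concentration_molar(absorbance_280):
    """Concentration (mol/L) inferred from absorbance at 280 nm."""
    return absorbance_280 / (EPSILON_280 * PATH_LENGTH_CM)

# e.g. an absorbance of 0.042 corresponds to 10 micromolar triclosan
conc = triclosan_concentration_molar(0.042)
```

This is why the spectrophotometric check works without an organic solvent: only the measured absorbance and the known extinction coefficient are needed to confirm how much triclosan actually dissolved in the aqueous buffer.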
Immunology, Issue 81, mast cell, basophil, degranulation, RBL-2H3, triclosan, irgasan, antibacterial, β-hexosaminidase, allergy, Asthma, toxicants, ionophore, antigen, fluorescence, microplate, UV-Vis
Fabrication of High Contrast Gratings for the Spectrum Splitting Dispersive Element in a Concentrated Photovoltaic System
Authors: Yuhan Yao, He Liu, Wei Wu.
Institutions: University of Southern California.
High contrast gratings are designed and fabricated, and their application is proposed in a parallel spectrum-splitting dispersive element that can improve the solar conversion efficiency of a concentrated photovoltaic system. The proposed system will also lower the solar cell cost in the concentrated photovoltaic system by replacing the expensive tandem solar cells with cost-effective single junction solar cells. The structures and parameters of the high contrast gratings for the dispersive elements were numerically optimized. The large-area fabrication of high contrast gratings was experimentally demonstrated using nanoimprint lithography and dry etching. The quality of the grating material and the performance of the fabricated device were both experimentally characterized. By analyzing the measurement results, possible side effects from the fabrication processes are discussed and several methods with the potential to improve the fabrication processes are proposed, which can help to increase the optical efficiency of the fabricated devices.
Engineering, Issue 101, Parallel spectrum splitting, dispersive element, high contrast grating, concentrated photovoltaic system, nanoimprint lithography, reactive ion etching
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply has no content relevant to the topic of a given abstract. In these cases, our algorithm displays the most relevant videos available, which can sometimes result in matches that are only loosely related.