Pubmed Article
A knowledge generation model via the hypernetwork.
PLoS ONE
PUBLISHED: 01-01-2014
The influence of the statistical properties of a network on knowledge diffusion has been studied extensively. However, network structure evolution and knowledge generation are processes that unfold simultaneously. By introducing the Cobb-Douglas production function and treating knowledge growth as the cooperative production of knowledge, in this paper we present two knowledge-generation dynamic evolving models based on different evolving mechanisms. The first model, named the "HDPH model," adopts the hyperedge growth and hyperdegree preferential attachment mechanisms. The second model, named the "KSPH model," adopts the hyperedge growth and knowledge stock preferential attachment mechanisms. We investigate the effect of the parameters (α, β) on the total knowledge stock of the two models. The hyperdegree distribution of the HDPH model can be analyzed theoretically with mean-field theory. The analytic result indicates that the hyperdegree distribution of the HDPH model obeys a power-law distribution with exponent γ = 2 + 1/m. Furthermore, we present the distributions of the knowledge stock for different parameters (α, β). The findings indicate that our proposed models could be helpful for a deeper understanding of scientific research cooperation.
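The two quantities the abstract leans on can be made concrete in a short sketch. This is a hypothetical illustration, not the authors' code: the two-factor Cobb-Douglas form and all names and values below are assumptions; only the exponent formula γ = 2 + 1/m is taken from the abstract.

```python
# Hypothetical sketch (not the authors' code) of the two quantities above:
# a two-factor Cobb-Douglas production of knowledge, and the claimed
# hyperdegree exponent of the HDPH model.

def cobb_douglas_output(stock_a, stock_b, alpha, beta, scale=1.0):
    """Cooperative knowledge output of two collaborators with given stocks."""
    return scale * (stock_a ** alpha) * (stock_b ** beta)

def hdph_exponent(m):
    """Power-law exponent gamma = 2 + 1/m, with m nodes joining per hyperedge."""
    return 2.0 + 1.0 / m

print(cobb_douglas_output(4.0, 9.0, 0.5, 0.5))  # -> 6.0
print(hdph_exponent(2))                          # -> 2.5
```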
Authors: William R. Brant, Siegbert Schmid, Guodong Du, Helen E. A. Brand, Wei Kong Pang, Vanessa K. Peterson, Zaiping Guo, Neeraj Sharma.
Published: 11-10-2014
ABSTRACT
Li-ion batteries are widely used in portable electronic devices and are considered promising candidates for higher-energy applications such as electric vehicles.1,2 However, many challenges, such as energy density and battery lifetimes, need to be overcome before this particular battery technology can be widely implemented in such applications.3 This research is challenging, and we outline a method to address these challenges using in situ neutron powder diffraction (NPD) to probe the crystal structure of electrodes undergoing electrochemical cycling (charge/discharge) in a battery. NPD data help determine the underlying structural mechanism responsible for a range of electrode properties, and this information can direct the development of better electrodes and batteries. We briefly review six types of battery designs custom-made for NPD experiments and detail the method to construct the ‘roll-over’ cell that we have successfully used on the high-intensity NPD instrument, WOMBAT, at the Australian Nuclear Science and Technology Organisation (ANSTO). The design considerations and materials used for cell construction are discussed in conjunction with aspects of the actual in situ NPD experiment, and initial directions are presented on how to analyze such complex in situ data.
Simulation of the Planetary Interior Differentiation Processes in the Laboratory
Authors: Yingwei Fei.
Institutions: Carnegie Institution of Washington.
A planetary interior is under high-pressure and high-temperature conditions and it has a layered structure. There are two important processes that led to that layered structure, (1) percolation of liquid metal in a solid silicate matrix by planet differentiation, and (2) inner core crystallization by subsequent planet cooling. We conduct high-pressure and high-temperature experiments to simulate both processes in the laboratory. Formation of percolative planetary core depends on the efficiency of melt percolation, which is controlled by the dihedral (wetting) angle. The percolation simulation includes heating the sample at high pressure to a target temperature at which iron-sulfur alloy is molten while the silicate remains solid, and then determining the true dihedral angle to evaluate the style of liquid migration in a crystalline matrix by 3D visualization. The 3D volume rendering is achieved by slicing the recovered sample with a focused ion beam (FIB) and taking SEM image of each slice with a FIB/SEM crossbeam instrument. The second set of experiments is designed to understand the inner core crystallization and element distribution between the liquid outer core and solid inner core by determining the melting temperature and element partitioning at high pressure. The melting experiments are conducted in the multi-anvil apparatus up to 27 GPa and extended to higher pressure in the diamond-anvil cell with laser-heating. We have developed techniques to recover small heated samples by precision FIB milling and obtain high-resolution images of the laser-heated spot that show melting texture at high pressure. By analyzing the chemical compositions of the coexisting liquid and solid phases, we precisely determine the liquidus curve, providing necessary data to understand the inner core crystallization process.
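The percolation outcome described above hinges on the dihedral (wetting) angle: in a crystalline matrix, an interconnected melt network generally forms when the angle is below roughly 60 degrees, while isolated melt pockets form above it. A trivial sketch of that criterion (the function name is invented; the ~60° threshold is the standard textbook value, not a number from this abstract):

```python
# Dihedral-angle criterion for melt percolation in a solid silicate matrix.
# Below ~60 degrees the liquid wets grain edges and forms a connected network;
# above it the melt remains trapped in isolated pockets.

def melt_network_forms(dihedral_angle_deg, threshold_deg=60.0):
    """True if liquid metal can percolate through the crystalline matrix."""
    return dihedral_angle_deg < threshold_deg

print(melt_network_forms(45.0))  # -> True  (efficient percolation expected)
print(melt_network_forms(85.0))  # -> False (melt trapped in pockets)
```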
Physics, Issue 81, Geophysics, Planetary Science, Geochemistry, Planetary interior, high-pressure, planet differentiation, 3D tomography
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
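The analysis pipeline described above is built around time-stamped behavioral event records. A minimal sketch of that data structure, in Python rather than the authors' MATLAB-based language: events are (timestamp, code) pairs, and a typical derived measure is the latency between a cue and the next head entry. The event codes and numbers here are invented for illustration.

```python
# Toy time-stamped event record analysis: compute the latency from each cue
# to the first subsequent response event, as an automated harvester might.

def latencies(events, cue="tone_on", response="head_entry"):
    """Latency from each cue to the first following response event."""
    out, cue_t = [], None
    for t, code in events:
        if code == cue:
            cue_t = t
        elif code == response and cue_t is not None:
            out.append(t - cue_t)
            cue_t = None  # only the first response after a cue counts
    return out

record = [(0.0, "tone_on"), (1.2, "head_entry"),
          (5.0, "tone_on"), (6.0, "head_entry"), (6.3, "head_entry")]
print(latencies(record))  # -> [1.2, 1.0]
```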
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
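The "software-guided setup of optimal experiment combinations" mentioned above starts from the space of all factor-level combinations. A minimal sketch of that starting point, a full-factorial design; the factor names and levels below are invented, not taken from the study, and real DoE software would then select an optimal fraction of these runs rather than all of them.

```python
# Full-factorial design generation: every combination of factor levels.
from itertools import product

factors = {
    "promoter": ["35S", "nos"],       # hypothetical regulatory elements
    "temperature_C": [22, 25],        # hypothetical incubation conditions
    "leaf_age": ["young", "old"],     # hypothetical plant parameter
}

def full_factorial(factors):
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in product(*(factors[n] for n in names))]

runs = full_factorial(factors)
print(len(runs))  # -> 8 (2 x 2 x 2 experiment combinations)
```

Fractional and augmented designs prune this grid while keeping the main effects and interactions of interest estimable, which is what makes the step-wise design augmentation described above tractable.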
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Setting-up an In Vitro Model of Rat Blood-brain Barrier (BBB): A Focus on BBB Impermeability and Receptor-mediated Transport
Authors: Yves Molino, Françoise Jabès, Emmanuelle Lacassagne, Nicolas Gaudin, Michel Khrestchatisky.
Institutions: VECT-HORUS SAS, CNRS, NICN UMR 7259.
The blood-brain barrier (BBB) specifically regulates molecular and cellular flux between the blood and the nervous tissue. Our aim was to develop and characterize a highly reproducible rat syngeneic in vitro model of the BBB using co-cultures of primary rat brain endothelial cells (RBEC) and astrocytes to study receptors involved in transcytosis across the endothelial cell monolayer. Astrocytes were isolated by mechanical dissection following trypsin digestion and were frozen for later co-culture. RBEC were isolated from 5-week-old rat cortices. The brains were cleaned of meninges and white matter, and mechanically dissociated following enzymatic digestion. Thereafter, the tissue homogenate was centrifuged in bovine serum albumin to separate vessel fragments from nervous tissue. The vessel fragments underwent a second enzymatic digestion to free endothelial cells from their extracellular matrix. The remaining contaminating cells, such as pericytes, were further eliminated by plating the microvessel fragments in puromycin-containing medium. They were then passaged onto filters for co-culture with astrocytes grown on the bottom of the wells. RBEC expressed high levels of tight junction (TJ) proteins such as occludin, claudin-5 and ZO-1 with a typical localization at the cell borders. The transendothelial electrical resistance (TEER) of brain endothelial monolayers, indicating the tightness of TJs, reached 300 ohm·cm2 on average. The endothelial permeability coefficient (Pe) for lucifer yellow (LY) was highly reproducible, with an average of 0.26 ± 0.11 x 10-3 cm/min. Brain endothelial cells organized in monolayers expressed the efflux transporter P-glycoprotein (P-gp), showed a polarized transport of rhodamine 123, a ligand for P-gp, and showed specific transport of transferrin-Cy3 and DiILDL across the endothelial cell monolayer.
In conclusion, we provide a protocol for setting up an in vitro BBB model that is highly reproducible due to the quality assurance methods, and that is suitable for research on BBB transporters and receptors.
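A Pe value like the lucifer-yellow figure quoted above is conventionally derived from clearance slopes: the filter-only contribution is removed in reciprocal space and the endothelial slope is normalized by filter area. The sketch below follows that common convention; the exact procedure used in this protocol, and all numbers here, are assumptions for illustration only.

```python
# Conventional permeability-coefficient calculation for filter-grown
# monolayers: 1/PS_endothelium = 1/PS_total - 1/PS_filter, then Pe = PS/area.
# Slopes are in ul/min; 1 ul = 1e-3 cm^3, giving Pe in cm/min.

def permeability_coefficient(ps_total, ps_filter, area_cm2):
    """Pe (cm/min) from total and filter-only clearance slopes (ul/min)."""
    ps_endothelium = 1.0 / (1.0 / ps_total - 1.0 / ps_filter)  # ul/min
    return ps_endothelium * 1e-3 / area_cm2                    # cm/min

# Invented example slopes for a 1.12 cm^2 Transwell-style filter:
pe = permeability_coefficient(ps_total=0.5, ps_filter=2.0, area_cm2=1.12)
print(round(pe * 1e3, 2), "x 10-3 cm/min")  # -> 0.6 x 10-3 cm/min
```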
Medicine, Issue 88, rat brain endothelial cells (RBEC), mouse, spinal cord, tight junction (TJ), receptor-mediated transport (RMT), low density lipoprotein (LDL), LDLR, transferrin, TfR, P-glycoprotein (P-gp), transendothelial electrical resistance (TEER)
Analysis of Nephron Composition and Function in the Adult Zebrafish Kidney
Authors: Kristen K. McCampbell, Kristin N. Springer, Rebecca A. Wingert.
Institutions: University of Notre Dame.
The zebrafish model has emerged as a relevant system to study kidney development, regeneration and disease. Both the embryonic and adult zebrafish kidneys are composed of functional units known as nephrons, which are highly conserved with other vertebrates, including mammals. Research in zebrafish has recently demonstrated that two distinctive phenomena transpire after adult nephrons incur damage: first, there is robust regeneration within existing nephrons that replaces the destroyed tubule epithelial cells; second, entirely new nephrons are produced from renal progenitors in a process known as neonephrogenesis. In contrast, humans and other mammals seem to have only a limited ability for nephron epithelial regeneration. To date, the mechanisms responsible for these kidney regeneration phenomena remain poorly understood. Since adult zebrafish kidneys undergo both nephron epithelial regeneration and neonephrogenesis, they provide an outstanding experimental paradigm to study these events. Further, there is a wide range of genetic and pharmacological tools available in the zebrafish model that can be used to delineate the cellular and molecular mechanisms that regulate renal regeneration. One essential aspect of such research is the evaluation of nephron structure and function. This protocol describes a set of labeling techniques that can be used to gauge renal composition and test nephron functionality in the adult zebrafish kidney. Thus, these methods are widely applicable to the future phenotypic characterization of adult zebrafish kidney injury paradigms, which include, but are not limited to, nephrotoxicant exposure regimes or genetic methods of targeted cell death such as the nitroreductase-mediated cell ablation technique. Further, these methods could be used to study genetic perturbations in adult kidney formation and could also be applied to assess renal status during chronic disease modeling.
Cellular Biology, Issue 90, zebrafish; kidney; nephron; nephrology; renal; regeneration; proximal tubule; distal tubule; segment; mesonephros; physiology; acute kidney injury (AKI)
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. 
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
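The fully automated end of the segmentation spectrum described above often starts from intensity thresholding, which is viable precisely when the data-set characteristics (high signal-to-noise, crisp features) permit it. A toy, pure-Python sketch of that idea on a small 2D "slice"; real EM volumes would of course use dedicated tooling.

```python
# Global intensity thresholding: the simplest automated segmentation,
# binarizing a slice into feature (1) and background (0) voxels.

def threshold_segment(image, cutoff):
    """Binarize: 1 where intensity exceeds cutoff, else 0."""
    return [[1 if px > cutoff else 0 for px in row] for row in image]

slice_ = [[10, 12, 200],
          [11, 210, 205],
          [9,  10,  12]]
mask = threshold_segment(slice_, 100)
print(mask)  # -> [[0, 0, 1], [0, 1, 1], [0, 0, 0]]
```

When noise, crowdedness, or heterogeneity break this assumption, the triage scheme above pushes the analysis toward semi-automated or manual tracing approaches instead.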
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
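The keyword list for this article mentions minimum-norm estimation, the inverse method commonly used to reconstruct cortical generators from channel data. A toy sketch of the regularized minimum-norm solution x = Lᵀ(LLᵀ + λI)⁻¹y for a leadfield L (sensors × sources); the dimensions and values below are invented, and real pipelines build L from the individual or age-specific head models discussed above.

```python
# Regularized minimum-norm inverse: distribute sensor data y over sources
# while penalizing total source power (Tikhonov regularization with lambda).
import numpy as np

def minimum_norm(leadfield, data, lam=1e-2):
    n_sensors = leadfield.shape[0]
    gram = leadfield @ leadfield.T + lam * np.eye(n_sensors)
    return leadfield.T @ np.linalg.solve(gram, data)

# Invented 2-sensor, 3-source leadfield:
L = np.array([[1.0, 0.0, 0.5],
              [0.0, 1.0, 0.5]])
y = np.array([1.0, 0.0])
x = minimum_norm(L, y)
print(x.shape)  # -> (3,)
```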
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Production of Haploid Zebrafish Embryos by In Vitro Fertilization
Authors: Paul T. Kroeger Jr., Shahram Jevin Poureetezadi, Robert McKee, Jonathan Jou, Rachel Miceli, Rebecca A. Wingert.
Institutions: University of Notre Dame.
The zebrafish has become a mainstream vertebrate model that is relevant for many disciplines of scientific study. Zebrafish are especially well suited for forward genetic analysis of developmental processes due to their external fertilization, embryonic size, rapid ontogeny, and optical clarity – a constellation of traits that enable the direct observation of events ranging from gastrulation to organogenesis with a basic stereomicroscope. Further, zebrafish embryos can survive for several days in the haploid state. The production of haploid embryos in vitro is a powerful tool for mutational analysis, as it enables the identification of recessive mutant alleles present in first generation (F1) female carriers following mutagenesis in the parental (P) generation. This approach eliminates the need to raise multiple generations (F2, F3, etc.), which involves breeding of mutant families, thus saving the researcher time and reducing the need for zebrafish colony space, labor, and husbandry costs. Although zebrafish have been used to conduct forward screens for the past several decades, there has been a steady expansion of transgenic and genome editing tools. These tools now offer a plethora of ways to create nuanced assays for next generation screens that can be used to further dissect the gene regulatory networks that drive vertebrate ontogeny. Here, we describe how to prepare haploid zebrafish embryos. This protocol can be implemented for novel future haploid screens, such as enhancer and suppressor screens, to address the mechanisms of development for a broad number of processes and tissues that form during early embryonic stages.
Developmental Biology, Issue 89, zebrafish, haploid, in vitro fertilization, forward genetic screen, saturation, recessive mutation, mutagenesis
Directed Dopaminergic Neuron Differentiation from Human Pluripotent Stem Cells
Authors: Pengbo Zhang, Ninuo Xia, Renee A. Reijo Pera.
Institutions: Stanford University School of Medicine, Stanford University School of Medicine.
Dopaminergic (DA) neurons in the substantia nigra pars compacta (also known as A9 DA neurons) are the specific cell type that is lost in Parkinson’s disease (PD). There is great interest in deriving A9 DA neurons from human pluripotent stem cells (hPSCs) for regenerative cell replacement therapy for PD. During neural development, A9 DA neurons originate from the floor plate (FP) precursors located at the ventral midline of the central nervous system. Here, we optimized the culture conditions for the stepwise differentiation of hPSCs to A9 DA neurons, which mimics embryonic DA neuron development. In our protocol, we first describe the efficient generation of FP precursor cells from hPSCs using a small molecule method, and then convert the FP cells to A9 DA neurons, which could be maintained in vitro for several months. This efficient, repeatable and controllable protocol works well in human embryonic stem cells (hESCs) and human induced pluripotent stem cells (hiPSCs) from healthy individuals and PD patients; the resulting A9 DA neurons can be used for in vitro disease modeling, drug screening, and in vivo cell transplantation therapy for PD.
Neuroscience, Issue 91, dopaminergic neuron, substantia nigra pars compacta, midbrain, Parkinson’s disease, directed differentiation, human pluripotent stem cells, floor plate
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. 
In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
Physical, Chemical and Biological Characterization of Six Biochars Produced for the Remediation of Contaminated Sites
Authors: Mackenzie J. Denyes, Michèle A. Parisien, Allison Rutter, Barbara A. Zeeb.
Institutions: Royal Military College of Canada, Queen's University.
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g. carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for physical and chemical characteristics of biochar. Six biochars made from three different feedstocks and at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals and mercury as well as nutrients (phosphorous, nitrite, nitrate and ammonium as nitrogen). The protocol also includes the biological testing procedures, earthworm avoidance and germination assays. Based on the quality assurance/quality control (QA/QC) results of blanks, duplicates, standards and reference materials, all methods were determined adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there were few differences among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays.
Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements or remediation.
Environmental Sciences, Issue 93, biochar, characterization, carbon sequestration, remediation, International Biochar Initiative (IBI), soil amendment
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
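The sequence-selection stage described above minimizes a potential energy over sequence space. A deliberately tiny illustration of that idea: greedy position-by-position selection against an invented per-position energy table. Real design engines like the one described use physics-based potentials over 3D structural templates and global optimization, not this toy independent-position model.

```python
# Toy sequence selection by energy minimization. The energy table is
# hypothetical: position -> {residue: energy}; lower energy = more stable.
ENERGY = {
    0: {"A": -1.0, "G": 0.5, "V": 0.1},
    1: {"A": 0.2, "G": -0.5, "V": 0.0},
    2: {"A": 0.3, "G": 0.3, "V": -1.5},
}

def greedy_design(energy_table):
    """Pick the lowest-energy residue at each position independently."""
    seq = [min(res, key=res.get) for _, res in sorted(energy_table.items())]
    total = sum(min(res.values()) for res in energy_table.values())
    return "".join(seq), total

print(greedy_design(ENERGY))  # -> ('AGV', -3.0)
```

In the real workflow the analogous output would then be passed to the fold-specificity and binding-affinity stages for re-ranking.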
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Movement Retraining using Real-time Feedback of Performance
Authors: Michael Anthony Hunt.
Institutions: University of British Columbia .
Any modification of movement - especially movement patterns that have been honed over a number of years - requires re-organization of the neuromuscular patterns responsible for governing the movement performance. This motor learning can be enhanced through a number of methods that are utilized in research and clinical settings alike. In general, verbal feedback of performance in real-time or knowledge of results following movement is commonly used clinically as a preliminary means of instilling motor learning. Depending on patient preference and learning style, visual feedback (e.g. through use of a mirror or different types of video) or proprioceptive guidance utilizing therapist touch, are used to supplement verbal instructions from the therapist. Indeed, a combination of these forms of feedback is commonplace in the clinical setting to facilitate motor learning and optimize outcomes. Laboratory-based, quantitative motion analysis has been a mainstay in research settings to provide accurate and objective analysis of a variety of movements in healthy and injured populations. While the actual mechanisms of capturing the movements may differ, all current motion analysis systems rely on the ability to track the movement of body segments and joints and to use established equations of motion to quantify key movement patterns. Due to limitations in acquisition and processing speed, analysis and description of the movements has traditionally occurred offline after completion of a given testing session. This paper will highlight a new supplement to standard motion analysis techniques that relies on the near instantaneous assessment and quantification of movement patterns and the display of specific movement characteristics to the patient during a movement analysis session. As a result, this novel technique can provide a new method of feedback delivery that has advantages over currently used feedback methods.
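All of the motion-analysis systems described above reduce tracked marker positions to joint and segment measures via standard equations of motion. A minimal sketch of one such computation, the planar joint angle at a middle marker (marker names like hip/knee/ankle here are illustrative, and real systems work in 3D with calibrated marker sets):

```python
# 2D joint angle at marker b, formed by the segments b->a and b->c.
import math

def joint_angle(a, b, c):
    """Angle at b (degrees) formed by points a-b-c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Invented planar coordinates: a nearly straight leg shows an angle
# slightly below 180 degrees (slight knee flexion).
hip, knee, ankle = (0.0, 1.0), (0.0, 0.5), (0.1, 0.0)
print(round(joint_angle(hip, knee, ankle), 1))
```

Computing such measures fast enough to display them during the session is exactly the real-time feedback supplement the paper highlights.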
Medicine, Issue 71, Biophysics, Anatomy, Physiology, Physics, Biomedical Engineering, Behavior, Psychology, Kinesiology, Physical Therapy, Musculoskeletal System, Biofeedback, biomechanics, gait, movement, walking, rehabilitation, clinical, training
Collecting Variable-concentration Isothermal Titration Calorimetry Datasets in Order to Determine Binding Mechanisms
Authors: Lee A. Freiburger, Anthony K. Mittermaier, Karine Auclair.
Institutions: McGill University.
Isothermal titration calorimetry (ITC) is commonly used to determine the thermodynamic parameters associated with the binding of a ligand to a host macromolecule. ITC has some advantages over common spectroscopic approaches for studying host/ligand interactions. For example, the heat released or absorbed when the two components interact is directly measured and does not require any exogenous reporters. Thus the binding enthalpy and the association constant (Ka) are directly obtained from ITC data, and can be used to compute the entropic contribution. Moreover, the shape of the isotherm is dependent on the c-value and the mechanistic model involved. The c-value is defined as c = n[P]tKa, where [P]t is the protein concentration, and n is the number of ligand binding sites within the host. In many cases, multiple binding sites for a given ligand are non-equivalent and ITC allows the characterization of the thermodynamic binding parameters for each individual binding site. This however requires that the correct binding model be used. This choice can be problematic if different models can fit the same experimental data. We have previously shown that this problem can be circumvented by performing experiments at several c-values. The multiple isotherms obtained at different c-values are fit simultaneously to separate models. The correct model is next identified based on the goodness of fit across the entire variable-c dataset. This process is applied here to the aminoglycoside resistance-causing enzyme aminoglycoside N-6'-acetyltransferase-Ii (AAC(6')-Ii). Although our methodology is applicable to any system, the necessity of this strategy is better demonstrated with a macromolecule-ligand system showing allostery or cooperativity, and when different binding models provide essentially identical fits to the same data. To our knowledge, there are no such systems commercially available. 
AAC(6')-Ii is a homodimer containing two active sites and shows cooperativity between the two subunits. However, ITC data obtained at a single c-value can be fit equally well to at least two different models: a two-sets-of-sites independent model and a two-site sequential (cooperative) model. By varying the c-value as explained above, it was established that the correct binding model for AAC(6')-Ii is the two-site sequential binding model. Herein, we describe the steps that must be taken when performing ITC experiments in order to obtain datasets suitable for variable-c analyses.
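The c-value relation c = n[P]tKa defined above directly dictates how a variable-c series is planned: with n and Ka fixed by the system, the experimenter adjusts the protein concentration. A minimal sketch of that arithmetic (the function names and the example Ka are illustrative, not values for AAC(6')-Ii):

```python
def c_value(n_sites, protein_conc_M, Ka_per_M):
    """c = n * [P]t * Ka, the unitless parameter governing isotherm shape."""
    return n_sites * protein_conc_M * Ka_per_M

def protein_conc_for_c(target_c, n_sites, Ka_per_M):
    """Protein concentration [P]t (in M) needed to reach a target c-value."""
    return target_c / (n_sites * Ka_per_M)

# A two-site host (n = 2) with an assumed Ka of 1e6 M^-1:
# 10 uM protein gives c = 20; a variable-c series spans low and high c.
series_concs_M = [protein_conc_for_c(c, 2, 1e6) for c in (1, 10, 100)]
```

Fitting isotherms collected at each of these concentrations simultaneously, as described above, is what discriminates between otherwise equally well-fitting models.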
Biochemistry, Issue 50, ITC, global fitting, cooperativity, binding model, ligand
Dissection of the Adult Zebrafish Kidney
Authors: Gary F. Gerlach, Lauran N. Schrader, Rebecca A. Wingert.
Institutions: University of Notre Dame.
Researchers working in the burgeoning field of adult stem cell biology seek to understand the signals that regulate the behavior and function of stem cells during normal homeostasis and disease states. The understanding of adult stem cells has far-reaching implications for the future of regenerative medicine1. For example, better knowledge about adult stem cell biology can facilitate the design of therapeutic strategies in which organs are triggered to heal themselves, or even the creation of methods for growing organs in vitro that can be transplanted into humans1. The zebrafish has become a powerful animal model for the study of vertebrate cell biology2. There has been extensive documentation and analysis of embryonic development in the zebrafish3. Only recently have scientists sought to document adult anatomy and surgical dissection techniques4, as there has been a progressive movement within the zebrafish community to broaden the applications of this research organism to adult studies. For example, there are expanding interests in using zebrafish to investigate the biology of adult stem cell populations and to make sophisticated adult models of diseases such as cancer5. Historically, isolation of the adult zebrafish kidney has been instrumental for studying hematopoiesis, as the kidney is the anatomical location of blood cell production in fish6,7. The kidney is composed of nephron functional units found in arborized arrangements, surrounded by hematopoietic tissue that is dispersed throughout the intervening spaces. The hematopoietic component consists of hematopoietic stem cells (HSCs) and their progeny that inhabit the kidney until they terminally differentiate8. In addition, it is now appreciated that a group of renal stem/progenitor cells (RPCs) also inhabit the zebrafish kidney organ and enable both kidney regeneration and growth, as observed in other fish species9-11.
In light of this new discovery, the zebrafish kidney is one organ that houses the location of two exciting opportunities for adult stem cell biology studies. It is clear that many outstanding questions could be well served with this experimental system. To encourage expansion of this field, it is beneficial to document detailed methods of visualizing and then isolating the adult zebrafish kidney organ. This protocol details our procedure for dissection of the adult kidney from both unfixed and fixed animals. Dissection of the kidney organ can be used to isolate and characterize hematopoietic and renal stem cells and their offspring using established techniques such as histology, fluorescence activated cell sorting (FACS)11,12, expression profiling13,14, and transplantation11,15. We hope that dissemination of this protocol will provide researchers with the knowledge to implement broader use of zebrafish studies that ultimately can be translated for human application.
Developmental Biology, Issue 54, kidney, blood, zebrafish, regeneration, adult stem cell, dissection
Aseptic Laboratory Techniques: Plating Methods
Authors: Erin R. Sanders.
Institutions: University of California, Los Angeles.
Microorganisms are present on all inanimate surfaces creating ubiquitous sources of possible contamination in the laboratory. Experimental success relies on the ability of a scientist to sterilize work surfaces and equipment as well as prevent contact of sterile instruments and solutions with non-sterile surfaces. Here we present the steps for several plating methods routinely used in the laboratory to isolate, propagate, or enumerate microorganisms such as bacteria and phage. All five methods incorporate aseptic technique, or procedures that maintain the sterility of experimental materials. Procedures described include (1) streak-plating bacterial cultures to isolate single colonies, (2) pour-plating and (3) spread-plating to enumerate viable bacterial colonies, (4) soft agar overlays to isolate phage and enumerate plaques, and (5) replica-plating to transfer cells from one plate to another in an identical spatial pattern. These procedures can be performed at the laboratory bench, provided they involve non-pathogenic strains of microorganisms (Biosafety Level 1, BSL-1). If working with BSL-2 organisms, then these manipulations must take place in a biosafety cabinet. Consult the most current edition of the Biosafety in Microbiological and Biomedical Laboratories (BMBL) as well as Material Safety Data Sheets (MSDS) for Infectious Substances to determine the biohazard classification as well as the safety precautions and containment facilities required for the microorganism in question. Bacterial strains and phage stocks can be obtained from research investigators, companies, and collections maintained by particular organizations such as the American Type Culture Collection (ATCC). It is recommended that non-pathogenic strains be used when learning the various plating methods. By following the procedures described in this protocol, students should be able to:
● Perform plating procedures without contaminating media.
● Isolate single bacterial colonies by the streak-plating method.
● Use pour-plating and spread-plating methods to determine the concentration of bacteria.
● Perform soft agar overlays when working with phage.
● Transfer bacterial cells from one plate to another using the replica-plating procedure.
● Given an experimental task, select the appropriate plating method.
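The viable-count arithmetic behind pour-plating and spread-plating, which the protocol above applies, divides the colony count on a countable plate by the product of the dilution and the volume plated. A minimal sketch, with illustrative numbers (the function name is ours):

```python
def cfu_per_ml(colony_count, total_dilution, volume_plated_ml):
    """Viable count (colony-forming units per ml) of the original sample.

    total_dilution is the overall dilution of the sample that was plated,
    e.g. 1e-6 for a 10^-6 serial dilution.
    """
    return colony_count / (total_dilution * volume_plated_ml)

# 150 colonies on a plate spread with 0.1 ml of a 10^-6 dilution:
original_titer = cfu_per_ml(150, 1e-6, 0.1)  # about 1.5e9 CFU/ml
```

Plates with roughly 30-300 colonies are conventionally considered countable; outside that range the estimate is unreliable.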
Basic Protocols, Issue 63, Streak plates, pour plates, soft agar overlays, spread plates, replica plates, bacteria, colonies, phage, plaques, dilutions
A Protocol for Computer-Based Protein Structure and Function Prediction
Authors: Ambrish Roy, Dong Xu, Jonathan Poisson, Yang Zhang.
Institutions: University of Michigan, University of Kansas.
Genome sequencing projects have deciphered millions of protein sequences, which require knowledge of their structure and function to improve the understanding of their biological roles. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules that are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms and protein-ligand binding sites. All predictions are tagged with a confidence score that estimates their accuracy in the absence of experimental data. To accommodate the special requests of end users, the server provides channels to accept user-specified inter-residue distance and contact maps to interactively guide the I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. The structural information can be collected by users based on experimental evidence or biological insight with the purpose of improving the quality of I-TASSER predictions. The server was ranked among the best programs for protein structure and function prediction in recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries who are using the on-line I-TASSER server.
Biochemistry, Issue 57, On-line server, I-TASSER, protein structure prediction, function prediction
RNA-seq Analysis of Transcriptomes in Thrombin-treated and Control Human Pulmonary Microvascular Endothelial Cells
Authors: Dilyara Cheranova, Margaret Gibson, Suman Chaudhary, Li Qin Zhang, Daniel P. Heruth, Dmitry N. Grigoryev, Shui Qing Ye.
Institutions: Children's Mercy Hospital and Clinics, School of Medicine, University of Missouri-Kansas City.
The characterization of gene expression in cells via measurement of mRNA levels is a useful tool in determining how the transcriptional machinery of the cell is affected by external signals (e.g. drug treatment), or how cells differ between a healthy state and a diseased state. With the advent and continuous refinement of next-generation DNA sequencing technology, RNA-sequencing (RNA-seq) has become an increasingly popular method of transcriptome analysis to catalog all species of transcripts, to determine the transcriptional structure of all expressed genes and to quantify the changing expression levels of the total set of transcripts in a given cell, tissue or organism1,2. RNA-seq is gradually replacing DNA microarrays as a preferred method for transcriptome analysis because it has the advantages of profiling a complete transcriptome, providing digital data (the copy number of any transcript) and not relying on any known genomic sequence3. Here, we present a complete and detailed protocol to apply RNA-seq to profile transcriptomes in human pulmonary microvascular endothelial cells with or without thrombin treatment. This protocol is based on our recently published study entitled "RNA-seq Reveals Novel Transcriptome of Genes and Their Isoforms in Human Pulmonary Microvascular Endothelial Cells Treated with Thrombin,"4 in which we successfully performed the first complete transcriptome analysis of human pulmonary microvascular endothelial cells treated with thrombin using RNA-seq. That study yielded unprecedented resources for further experimentation to gain insights into the molecular mechanisms underlying thrombin-mediated endothelial dysfunction in the pathogenesis of inflammatory conditions, cancer, diabetes, and coronary heart disease, and provides potential new leads for therapeutic targets for those diseases. The descriptive text of this protocol is divided into four parts.
The first part describes the treatment of human pulmonary microvascular endothelial cells with thrombin and RNA isolation, quality analysis and quantification. The second part describes library construction and sequencing. The third part describes the data analysis. The fourth part describes an RT-PCR validation assay. Representative results of several key steps are displayed. Useful tips or precautions to boost success in key steps are provided in the Discussion section. Although this protocol uses human pulmonary microvascular endothelial cells treated with thrombin, it can be generalized to profile transcriptomes in both mammalian and non-mammalian cells and in tissues treated with different stimuli or inhibitors, or to compare transcriptomes in cells or tissues between a healthy state and a disease state.
Genetics, Issue 72, Molecular Biology, Immunology, Medicine, Genomics, Proteins, RNA-seq, Next Generation DNA Sequencing, Transcriptome, Transcription, Thrombin, Endothelial cells, high-throughput, DNA, genomic DNA, RT-PCR, PCR
Right Ventricular Systolic Pressure Measurements in Combination with Harvest of Lung and Immune Tissue Samples in Mice
Authors: Wen-Chi Chen, Sung-Hyun Park, Carol Hoffman, Cecil Philip, Linda Robinson, James West, Gabriele Grunig.
Institutions: New York University School of Medicine, Tuxedo, Vanderbilt University Medical Center.
The function of the right heart is to pump blood through the lungs, thus linking right heart physiology and pulmonary vascular physiology. Inflammation is a common modifier of heart and lung function, by elaborating cellular infiltration, production of cytokines and growth factors, and by initiating remodeling processes 1. Compared to the left ventricle, the right ventricle is a low-pressure pump that operates in a relatively narrow zone of pressure changes. Increased pulmonary artery pressures are associated with increased pressure in the lung vascular bed and pulmonary hypertension 2. Pulmonary hypertension is often associated with inflammatory lung diseases, for example chronic obstructive pulmonary disease, or autoimmune diseases 3. Because pulmonary hypertension confers a poor prognosis for quality of life and life expectancy, much research is directed towards understanding the mechanisms that might be targets for pharmaceutical intervention 4. The main challenge for the development of effective management tools for pulmonary hypertension remains the complexity of the simultaneous understanding of molecular and cellular changes in the right heart, the lungs and the immune system. Here, we present a procedural workflow for the rapid and precise measurement of pressure changes in the right heart of mice and the simultaneous harvest of samples from heart, lungs and immune tissues. The method is based on the direct catheterization of the right ventricle via the jugular vein in closed-chest mice, first developed in the late 1990s as a surrogate measure of pressures in the pulmonary artery5-13. The organized team-approach facilitates a very rapid right heart catheterization technique. This makes it possible to perform the measurements in mice that spontaneously breathe room air.
The organization of the work-flow in distinct work-areas reduces time delay and opens the possibility to simultaneously perform physiology experiments and harvest immune, heart and lung tissues. The procedural workflow outlined here can be adapted for a wide variety of laboratory settings and study designs, from small, targeted experiments, to large drug screening assays. The simultaneous acquisition of cardiac physiology data that can be expanded to include echocardiography5,14-17 and harvest of heart, lung and immune tissues reduces the number of animals needed to obtain data that move the scientific knowledge basis forward. The procedural workflow presented here also provides an ideal basis for gaining knowledge of the networks that link immune, lung and heart function. The same principles outlined here can be adapted to study other or additional organs as needed.
Immunology, Issue 71, Medicine, Anatomy, Physiology, Cardiology, Surgery, Cardiovascular Abnormalities, Inflammation, Respiration Disorders, Immune System Diseases, Cardiac physiology, mouse, pulmonary hypertension, right heart function, lung immune response, lung inflammation, lung remodeling, catheterization, mice, tissue, animal model
Brain Imaging Investigation of the Neural Correlates of Observing Virtual Social Interactions
Authors: Keen Sung, Sanda Dolcos, Sophie Flor-Henry, Crystal Zhou, Claudia Gasior, Jennifer Argo, Florin Dolcos.
Institutions: University of Alberta, University of Illinois at Urbana-Champaign.
The ability to gauge social interactions is crucial in the assessment of others’ intentions. Factors such as facial expressions and body language affect our decisions in personal and professional life alike 1. These "friend or foe" judgements are often based on first impressions, which in turn may affect our decisions to "approach or avoid". Previous studies investigating the neural correlates of social cognition tended to use static facial stimuli 2. Here, we illustrate an experimental design in which whole-body animated characters were used in conjunction with functional magnetic resonance imaging (fMRI) recordings. Fifteen participants were presented with short movie-clips of guest-host interactions in a business setting, while fMRI data were recorded; at the end of each movie, participants also provided ratings of the host behaviour. This design more closely mimics real-life situations, and hence may contribute to a better understanding of the neural mechanisms of social interactions in healthy behaviour, and to gaining insight into possible causes of deficits in social behaviour in clinical conditions such as social anxiety and autism 3.
Neuroscience, Issue 53, Social Perception, Social Knowledge, Social Cognition Network, Non-Verbal Communication, Decision-Making, Event-Related fMRI
Combining Behavioral Endocrinology and Experimental Economics: Testosterone and Social Decision Making
Authors: Christoph Eisenegger, Michael Naef.
Institutions: University of Zurich, Royal Holloway, University of London.
Behavioral endocrinological research in humans as well as in animals suggests that testosterone plays a key role in social interactions. Studies in rodents have shown a direct link between testosterone and aggressive behavior1, and folk wisdom adapts these findings to humans, suggesting that testosterone induces antisocial, egoistic or even aggressive behavior2. However, many researchers doubt a direct testosterone-aggression link in humans, arguing instead that testosterone is primarily involved in status-related behavior3,4. As high status can also be achieved by aggressive and antisocial means, it can be difficult to distinguish between antisocial and status-seeking behavior. We therefore set up an experimental environment in which status can only be achieved by prosocial means. In a double-blind and placebo-controlled experiment, we administered a single sublingual dose of 0.5 mg of testosterone (with a hydroxypropyl-β-cyclodextrin carrier) to 121 women and investigated their social interaction behavior in an economic bargaining paradigm. Real monetary incentives are at stake in this paradigm; every player A receives a certain amount of money and has to make an offer to another player B on how to share the money. If B accepts, she gets what was offered and player A keeps the rest. If B refuses the offer, nobody gets anything. A status-seeking player A is expected to avoid being rejected by behaving in a prosocial way, i.e. by making higher offers. The results show that if expectations about the hormone are controlled for, testosterone administration leads to a significant increase in fair bargaining offers compared to placebo. The role of expectations is reflected in the fact that subjects who believe they received testosterone make lower offers than those who believe they were treated with a placebo.
These findings suggest that the experimental economics approach is sensitive enough to detect neurobiological effects as subtle as those achieved by administration of hormones. Moreover, the findings point to the importance of both psychosocial and neuroendocrine factors in determining the influence of testosterone on human social behavior.
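The bargaining paradigm described above is the classic ultimatum game; its payoff rule can be sketched in a few lines (function and variable names are ours, not from the study):

```python
def ultimatum_payoffs(endowment, offer, responder_accepts):
    """Payoffs (proposer, responder) in the ultimatum bargaining game.

    The proposer (player A) offers part of the endowment to the responder
    (player B); if B rejects, both players receive nothing.
    """
    if not 0 <= offer <= endowment:
        raise ValueError("offer must lie between 0 and the endowment")
    if responder_accepts:
        return endowment - offer, offer
    return 0, 0
```

A status-seeking proposer raises the offer to reduce the risk of rejection, which is exactly the prosocial channel the experiment isolates.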
Neuroscience, Issue 49, behavioral endocrinology, testosterone, social status, decision making
Using Learning Outcome Measures to assess Doctoral Nursing Education
Authors: Glenn H. Raup, Jeff King, Romana J. Hughes, Natasha Faidley.
Institutions: Harris College of Nursing and Health Sciences, Texas Christian University.
Education programs at all levels must be able to demonstrate successful program outcomes. Grades alone do not represent a comprehensive measurement methodology for assessing student learning outcomes at either the course or program level. The development and application of assessment rubrics provides an unequivocal measurement methodology to ensure a quality learning experience by providing a foundation for improvement based on qualitatively and quantitatively measurable aggregate course and program outcomes. Learning outcomes are the embodiment of the total learning experience and should incorporate assessment of both qualitative and quantitative program outcomes. The assessment of qualitative measures represents a challenge for educators at any level of a learning program. Nursing provides a unique challenge and opportunity, as it is the application of science through the art of caring. Quantification of desired student learning outcomes may be enhanced through the development of assessment rubrics designed to measure quantitative and qualitative aspects of the nursing education and learning process. They provide a mechanism for uniform assessment by nursing faculty of concepts and constructs that are otherwise difficult to describe and measure. A protocol is presented and applied to a doctoral nursing education program, with recommendations for application and transformation of the assessment rubric to other education programs. Through application of these specially designed rubrics, all aspects of an education program can be adequately assessed to provide information that facilitates the closure of the gap between desired and actual student learning outcomes for any desired educational competency.
Medicine, Issue 40, learning, outcomes, measurement, program, assessment, rubric
Batch Immunostaining for Large-Scale Protein Detection in the Whole Monkey Brain
Authors: Shahin Zangenehpour, Mark W. Burke, Avi Chaudhuri, Maurice Ptito.
Institutions: Montreal Neurological Institute, Université de Montréal, McGill University.
Immunohistochemistry (IHC) is one of the most widely used laboratory techniques for the detection of target proteins in situ. Questions concerning the expression pattern of a target protein across the entire brain are relatively easy to answer when using IHC in small brains, such as those of rodents. However, answering the same questions in large and convoluted brains, such as those of primates, presents a number of challenges. Here we present a systematic approach for immunodetection of target proteins in an adult monkey brain. This approach relies on the tissue embedding and sectioning methodology of NeuroScience Associates (NSA) as well as tools developed specifically for batch-staining of free-floating sections. It results in uniform staining of a set of sections which, at a particular interval, represents the entire brain. The resulting stained sections can be subjected to a wide variety of analytical procedures, such as measuring protein levels or quantifying the population of neurons expressing a particular protein.
Neuroscience, Issue 29, brain, immunohistochemistry, monkey, non-human primate, antibody, SMI32, FMRP, NeuN
Counting Human Neural Stem Cells
Authors: Steven Marchenko, Lisa Flanagan.
Institutions: University of California, Irvine (UCI).
Knowledge of the exact number of viable cells in a given volume of a cell suspension is required for many routine tissue culture manipulations, such as plating cells for immunocytochemistry or for cell transfections. This protocol describes a straightforward and fast method for differentiating between live and dead cells and quantifying the cell concentration and total cell number using a hemacytometer. This procedure first requires detaching cells from a growth surface and resuspending them in media. Next, the cells are diluted in a solution of Trypan blue (ideally to a concentration that will give 20-50 cells per quadrant) and placed in the hemacytometer. Finally, averaging the counts of viable cells in several randomly selected quadrants, dividing the average by the volume of one 1 mm2 quadrant (0.1 μl) and multiplying by the dilution factor gives the number of cells per μl. Multiplying this cell concentration by the total volume in μl gives the total cell number. This protocol describes counting human neural stem/precursor cells (hNSPCs), but can also be used for many other cell types.
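The counting arithmetic in the protocol above reduces to two steps: scale the average quadrant count by the quadrant volume and dilution factor, then multiply by the suspension volume. A minimal sketch (function names are ours):

```python
def cells_per_ul(quadrant_counts, dilution_factor, quadrant_volume_ul=0.1):
    """Cell concentration (cells/μl) from hemacytometer quadrant counts.

    Each 1 mm2 quadrant holds 0.1 μl; the average viable-cell count per
    quadrant is divided by that volume and multiplied by the Trypan blue
    dilution factor.
    """
    average = sum(quadrant_counts) / len(quadrant_counts)
    return average / quadrant_volume_ul * dilution_factor

def total_cells(concentration_per_ul, suspension_volume_ul):
    """Total viable cells in the whole suspension."""
    return concentration_per_ul * suspension_volume_ul

# Counts of 25, 30, 28 and 27 at a 1:2 dilution give 550 cells/μl.
conc = cells_per_ul([25, 30, 28, 27], dilution_factor=2)
```

Note that the 20-50 cells-per-quadrant target in the protocol keeps the average in a range where this estimate is statistically reliable.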
Issue 7, Basic Protocols, Stem Cells, Cell Culture, Cell Counting, Hemocytometer
Building a Better Mosquito: Identifying the Genes Enabling Malaria and Dengue Fever Resistance in A. gambiae and A. aegypti Mosquitoes
Authors: George Dimopoulos.
Institutions: Johns Hopkins University.
In this interview, George Dimopoulos focuses on the physiological mechanisms used by mosquitoes to combat Plasmodium falciparum and dengue virus infections. Explanation is given for how key refractory genes, those genes conferring resistance to vector pathogens, are identified in the mosquito and how this knowledge can be used to generate transgenic mosquitoes that are unable to carry the malaria parasite or dengue virus.
Cellular Biology, Issue 5, Translational Research, mosquito, malaria, virus, dengue, genetics, injection, RNAi, transgenesis, transgenic
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matches that are only loosely related.