Pubmed Article
esyN: network building, sharing and publishing.
PLoS ONE
PUBLISHED: 09-02-2014
The construction and analysis of networks is increasingly widespread in biological research. We have developed esyN ("easy networks") as a free and open source tool to facilitate the exchange of biological network models between researchers. esyN acts as a searchable database of user-created networks from any field. We have developed a simple companion web tool that enables users to view and edit networks using data from publicly available databases. Both normal interaction networks (graphs) and Petri nets can be created. In addition to its basic tools, esyN contains a number of logical templates that can be used to create models more easily. The ability to use previously published models as building blocks makes esyN a powerful tool for the construction of models and network graphs. Users are able to save their own projects online and share them either publicly or with a list of collaborators. The latter can be given the ability to edit the network themselves, allowing online collaboration on network construction. esyN is designed to facilitate unrestricted exchange of this increasingly important type of biological information. Ultimately, the aim of esyN is to bring the advantages of Open Source software development to the construction of biological networks.
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Published: 10-15-2014
ABSTRACT
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
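To make the quantification step concrete, here is a minimal Python sketch of binarizing and skeletonizing a single TATS image using scikit-image; the file name, filter settings, and size cutoff are placeholders, and this is a sketch of the general idea rather than the published workflow itself.

```python
# Minimal sketch of the binarization/skeletonization step described above,
# using scikit-image. File name and filter settings are illustrative only.
import numpy as np
from skimage import io, filters, morphology

img = io.imread("tats_confocal_slice.tif").astype(float)   # hypothetical input image
img = filters.gaussian(img, sigma=1)                       # mild denoising
binary = img > filters.threshold_otsu(img)                 # global Otsu threshold
binary = morphology.remove_small_objects(binary, min_size=50)
skeleton = morphology.skeletonize(binary)                  # 1-pixel-wide network

# Simple network descriptors: skeleton length (pixels) and tubule density
skeleton_length_px = int(skeleton.sum())
tats_density = binary.sum() / binary.size
print(skeleton_length_px, round(tats_density, 4))
```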
25 Related JoVE Articles!
Play Button
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
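As a rough illustration of approach (4), automated segmentation, the sketch below thresholds a 3D volume and labels connected components with SciPy; the stack name, threshold rule, and size cutoff are assumptions, not parameters from the article.

```python
# Hedged sketch of a simple automated segmentation pass: global thresholding
# followed by connected-component labeling of a 3D volume. The file name and
# size cutoff are placeholders, not values from the article.
import numpy as np
from scipy import ndimage
import tifffile

volume = tifffile.imread("fibsem_stack.tif").astype(np.float32)  # hypothetical 3D stack
threshold = volume.mean() + 2 * volume.std()                     # crude intensity cutoff
mask = volume > threshold

labels, n_objects = ndimage.label(mask)                          # 3D connected components
sizes = ndimage.sum(mask, labels, index=range(1, n_objects + 1)) # voxels per object
keep = np.isin(labels, np.nonzero(sizes >= 500)[0] + 1)          # drop small debris
print(n_objects, int(keep.sum()))
```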
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
A Computer-assisted Multi-electrode Patch-clamp System
Authors: Rodrigo Perin, Henry Markram.
Institutions: Ecole Polytechnique Federale de Lausanne.
The patch-clamp technique is today the most well-established method for recording electrical activity from individual neurons or their subcellular compartments. Nevertheless, achieving stable recordings, even from individual cells, remains a time-consuming procedure of considerable complexity. Automation of many steps, in conjunction with efficient information display, can greatly assist experimentalists in performing a larger number of recordings with greater reliability and in less time. To achieve large-scale recordings, we concluded that the most efficient approach is not to fully automate the process but to simplify the experimental steps and reduce the chances of human error, while efficiently incorporating the experimenter's experience and visual feedback. With these goals in mind, we developed a computer-assisted system that centralizes all the controls necessary for a multi-electrode patch-clamp experiment in a single interface, a commercially available wireless gamepad, while displaying experiment-related information and guidance cues on the computer screen. Here we describe the different components of the system, which allowed us to reduce the time required to achieve the recording configuration and substantially increase the chances of successfully recording large numbers of neurons simultaneously.
Neuroscience, Issue 80, Patch-clamp, automatic positioning, whole-cell, neuronal recording, in vitro, multi-electrode
Procedure for the Development of Multi-depth Circular Cross-sectional Endothelialized Microchannels-on-a-chip
Authors: Xiang Li, Samantha Marie Mearns, Manuela Martins-Green, Yuxin Liu.
Institutions: West Virginia University, University of California at Riverside.
Efforts have been focused on developing in vitro assays for the study of microvessels because in vivo animal studies are more time-consuming and expensive, and observation and quantification are very challenging. However, conventional in vitro microvessel assays have limitations in representing in vivo microvessels with respect to three-dimensional (3D) geometry and continuous fluid flow. Using a combination of a photolithographic reflowable photoresist technique, soft lithography, and microfluidics, we have developed a multi-depth, circular cross-sectional, endothelialized microchannels-on-a-chip device, which mimics the 3D geometry of in vivo microvessels and runs under controlled continuous perfusion flow. A positive reflowable photoresist was used to fabricate a master mold with a semicircular cross-sectional microchannel network. By aligning and bonding the two polydimethylsiloxane (PDMS) microchannel replicas cast from the master mold, a cylindrical microchannel network was created. The diameters of the microchannels can be well controlled. In addition, primary human umbilical vein endothelial cells (HUVECs) seeded inside the chip lined the inner surface of the microchannels under controlled perfusion lasting from 4 days to 2 weeks.
Bioengineering, Issue 80, Bioengineering, Tissue Engineering, Miniaturization, Microtechnology, Microfluidics, Reflow photoresist, PDMS, Perfusion flow, Primary endothelial cells
Microwave-assisted Functionalization of Poly(ethylene glycol) and On-resin Peptides for Use in Chain Polymerizations and Hydrogel Formation
Authors: Amy H. Van Hove, Brandon D. Wilson, Danielle S. W. Benoit.
Institutions: University of Rochester, University of Rochester Medical Center.
One of the main benefits to using poly(ethylene glycol) (PEG) macromers in hydrogel formation is synthetic versatility. The ability to draw from a large variety of PEG molecular weights and configurations (arm number, arm length, and branching pattern) affords researchers tight control over resulting hydrogel structures and properties, including Young’s modulus and mesh size. This video will illustrate a rapid, efficient, solvent-free, microwave-assisted method to methacrylate PEG precursors into poly(ethylene glycol) dimethacrylate (PEGDM). This synthetic method provides much-needed starting materials for applications in drug delivery and regenerative medicine. The demonstrated method is superior to traditional methacrylation methods as it is significantly faster and simpler, as well as more economical and environmentally friendly, using smaller amounts of reagents and solvents. We will also demonstrate an adaptation of this technique for on-resin methacrylamide functionalization of peptides. This on-resin method allows the N-terminus of peptides to be functionalized with methacrylamide groups prior to deprotection and cleavage from resin. This allows for selective addition of methacrylamide groups to the N-termini of the peptides while amino acids with reactive side groups (e.g. primary amine of lysine, primary alcohol of serine, secondary alcohols of threonine, and phenol of tyrosine) remain protected, preventing functionalization at multiple sites. This article will detail common analytical methods (proton Nuclear Magnetic Resonance spectroscopy (1H-NMR) and Matrix Assisted Laser Desorption Ionization Time of Flight mass spectrometry (MALDI-ToF)) to assess the efficiency of the functionalizations. Common pitfalls and suggested troubleshooting methods will be addressed, as will modifications of the technique which can be used to further tune macromer functionality and resulting hydrogel physical and chemical properties. Use of synthesized products for the formation of hydrogels for drug delivery and cell-material interaction studies will be demonstrated, with particular attention paid to modifying hydrogel composition to affect mesh size, controlling hydrogel stiffness and drug release.
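For readers unfamiliar with how functionalization efficiency is typically estimated from 1H-NMR, the sketch below compares a vinyl-proton integral with the PEG backbone integral; the peak assignments, molecular weight, and example integrals are illustrative assumptions, not values from this protocol.

```python
# Illustrative calculation of methacrylation efficiency from 1H-NMR integrals.
# Peak assignments (two vinyl protons per methacrylate end group, PEG backbone
# protons at ~3.6 ppm) follow common practice; all numbers are placeholders.
def methacrylation_efficiency(vinyl_integral, backbone_integral,
                              peg_mn=10000, repeat_mw=44, arms=2):
    backbone_protons = 4 * peg_mn / repeat_mw        # 4 H per ethylene glycol repeat
    vinyl_per_backbone_H = vinyl_integral / backbone_integral
    protons_observed = vinyl_per_backbone_H * backbone_protons
    expected_vinyl_protons = 2 * arms                # 2 vinyl H per end group
    return protons_observed / expected_vinyl_protons

print(f"{methacrylation_efficiency(0.0040, 1.0):.1%}")   # example integrals -> ~91%
```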
Chemistry, Issue 80, Poly(ethylene glycol), peptides, polymerization, polymers, methacrylation, peptide functionalization, 1H-NMR, MALDI-ToF, hydrogels, macromer synthesis
A Practical Guide to Phylogenetics for Nonexperts
Authors: Damien O'Halloran.
Institutions: The George Washington University.
Researchers across incredibly diverse fields are applying phylogenetics to their research questions. However, many of them are new to the topic, which presents inherent problems. Here we compile a practical introduction to phylogenetics for nonexperts. We outline, in a step-by-step manner, a pipeline for generating reliable phylogenies from gene sequence datasets. We begin with a user guide for similarity-search tools via online interfaces as well as local executables. Next, we explore programs for generating multiple sequence alignments, followed by protocols for using software to determine best-fit models of evolution. We then outline protocols for reconstructing phylogenetic relationships via maximum likelihood and Bayesian criteria, and finally describe tools for visualizing phylogenetic trees. While this is by no means an exhaustive description of phylogenetic approaches, it provides the reader with practical starting information on key software applications commonly utilized by phylogeneticists. Our vision is that this article serve both as a practical training tool for researchers embarking on phylogenetic studies and as an educational resource that can be incorporated into a classroom or teaching lab.
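A hedged Biopython sketch of the downstream alignment-to-tree step is shown below. As a lightweight stand-in for the maximum likelihood and Bayesian analyses described here, it builds a quick neighbor-joining tree; the input file name is a placeholder and assumes an alignment already produced with a tool such as MUSCLE.

```python
# Minimal sketch: read an existing multiple sequence alignment, build a
# neighbor-joining tree (a simple stand-in for ML/Bayesian inference),
# and visualize/export it with Biopython.
from Bio import AlignIO, Phylo
from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

alignment = AlignIO.read("my_genes_aligned.fasta", "fasta")   # hypothetical MSA
distances = DistanceCalculator("identity").get_distance(alignment)
tree = DistanceTreeConstructor().nj(distances)                # neighbor joining
Phylo.draw_ascii(tree)                                        # quick text visualization
Phylo.write(tree, "my_genes_nj.nwk", "newick")
```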
Basic Protocol, Issue 84, phylogenetics, multiple sequence alignments, phylogenetic tree, BLAST executables, basic local alignment search tool, Bayesian models
A Method for Investigating Age-related Differences in the Functional Connectivity of Cognitive Control Networks Associated with Dimensional Change Card Sort Performance
Authors: Bianca DeBenedictis, J. Bruce Morton.
Institutions: University of Western Ontario.
The ability to adjust behavior to sudden changes in the environment develops gradually in childhood and adolescence. For example, in the Dimensional Change Card Sort task, participants switch from sorting cards one way, such as shape, to sorting them a different way, such as color. Adjusting behavior in this way exacts a small performance cost, or switch cost, such that responses are typically slower and more error-prone on switch trials in which the sorting rule changes as compared to repeat trials in which the sorting rule remains the same. The ability to flexibly adjust behavior is often said to develop gradually, in part because behavioral costs such as switch costs typically decrease with increasing age. Why aspects of higher-order cognition, such as behavioral flexibility, develop so gradually remains an open question. One hypothesis is that these changes occur in association with functional changes in broad-scale cognitive control networks. On this view, complex mental operations, such as switching, involve rapid interactions between several distributed brain regions, including those that update and maintain task rules, re-orient attention, and select behaviors. With development, functional connections between these regions strengthen, leading to faster and more efficient switching operations. The current video describes a method of testing this hypothesis through the collection and multivariate analysis of fMRI data from participants of different ages.
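A minimal sketch of the kind of functional connectivity computation involved, pairwise correlation of ROI time courses followed by a Fisher z-transform, is given below; the time-series array is random stand-in data, not the multivariate analysis pipeline used in the article.

```python
# Hedged sketch of a seed/ROI functional connectivity matrix: pairwise
# correlations between ROI time courses, z-transformed for group comparison.
import numpy as np

n_timepoints, n_rois = 240, 12
roi_timeseries = np.random.randn(n_timepoints, n_rois)   # stand-in for extracted fMRI signals

connectivity = np.corrcoef(roi_timeseries.T)             # n_rois x n_rois matrix
np.fill_diagonal(connectivity, 0)

# Fisher z-transform so connectivity values can be averaged and compared across ages
z_connectivity = np.arctanh(connectivity)
print(z_connectivity.shape, z_connectivity[0, 1])
```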
Behavior, Issue 87, Neurosciences, fMRI, Cognitive Control, Development, Functional Connectivity
Generation of Shear Adhesion Map Using SynVivo Synthetic Microvascular Networks
Authors: Ashley M. Smith, Balabhaskar Prabhakarpandian, Kapil Pant.
Institutions: CFD Research Corporation.
Cell/particle adhesion assays are critical to understanding the biochemical interactions involved in disease pathophysiology and have important applications in the quest for the development of novel therapeutics. Assays using static conditions fail to capture the dependence of adhesion on shear, limiting their correlation with the in vivo environment. Parallel plate flow chambers that quantify adhesion under physiological fluid flow need multiple experiments for the generation of a shear adhesion map. In addition, they do not represent the in vivo scale and morphology and require large volumes (~ml) of reagents for experiments. In this study, we demonstrate the generation of a shear adhesion map from a single experiment using a microvascular network-based microfluidic device, SynVivo-SMN. This device recreates the complex in vivo vasculature, including geometric scale, morphological elements, flow features and cellular interactions, in an in vitro format, thereby providing a biologically realistic environment for basic and applied research in cellular behavior, drug delivery, and drug discovery. The assay was demonstrated by studying the interaction of 2 µm biotin-coated particles with the avidin-coated surfaces of the microchip. The entire range of shear observed in the microvasculature is obtained in a single assay, enabling an adhesion vs. shear map for the particles under physiological conditions.
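The sketch below illustrates, with placeholder numbers, how per-particle adhesion events can be binned by local shear rate to form the adhesion-versus-shear map described above; it is not the SynVivo analysis software.

```python
# Illustrative binning of particle adhesion counts by local wall shear rate.
# Input array is a placeholder for per-particle measurements from the microchip images.
import numpy as np

shear_rates = np.random.uniform(5, 500, size=1000)   # s^-1, one value per adhered particle
bins = np.linspace(0, 500, 11)
counts, edges = np.histogram(shear_rates, bins=bins)

for lo, hi, n in zip(edges[:-1], edges[1:], counts):
    print(f"{lo:5.0f}-{hi:5.0f} s^-1 : {n} adhered particles")
```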
Bioengineering, Issue 87, particle, adhesion, shear, microfluidics, vasculature, networks
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
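As a small illustration of working with time-stamped event records of this kind, the Python sketch below counts head entries per day from a hypothetical CSV file; the file name, column names, and event code are assumptions, and the published analysis code is MATLAB-based.

```python
# Sketch of a simple analysis on time-stamped behavioral event records:
# count head entries per day from a CSV of (time, event) rows.
import csv
from collections import Counter
from datetime import datetime

daily_entries = Counter()
with open("mouse01_events.csv", newline="") as f:
    for row in csv.DictReader(f):                      # hypothetical columns: time, event
        if row["event"] == "head_entry":
            t = datetime.fromisoformat(row["time"])
            daily_entries[t.date()] += 1

for day, n in sorted(daily_entries.items()):
    print(day, n)
```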
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Synthesis of an Intein-mediated Artificial Protein Hydrogel
Authors: Miguel A. Ramirez, Zhilei Chen.
Institutions: Texas A&M University, College Station.
We present the synthesis of a highly stable protein hydrogel mediated by a split-intein-catalyzed protein trans-splicing reaction. The building blocks of this hydrogel are two protein block-copolymers each containing a subunit of a trimeric protein that serves as a crosslinker and one half of a split intein. A highly hydrophilic random coil is inserted into one of the block-copolymers for water retention. Mixing of the two protein block copolymers triggers an intein trans-splicing reaction, yielding a polypeptide unit with crosslinkers at either end that rapidly self-assembles into a hydrogel. This hydrogel is very stable under both acidic and basic conditions, at temperatures up to 50 °C, and in organic solvents. The hydrogel rapidly reforms after shear-induced rupture. Incorporation of a "docking station peptide" into the hydrogel building block enables convenient incorporation of "docking protein"-tagged target proteins. The hydrogel is compatible with tissue culture growth media, supports the diffusion of 20 kDa molecules, and enables the immobilization of bioactive globular proteins. The application of the intein-mediated protein hydrogel as an organic-solvent-compatible biocatalyst was demonstrated by encapsulating the horseradish peroxidase enzyme and corroborating its activity.
Bioengineering, Issue 83, split-intein, self-assembly, shear-thinning, enzyme, immobilization, organic synthesis
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Authors: Phoebe Spetsieris, Yilong Ma, Shichun Peng, Ji Hyun Ko, Vijay Dhawan, Chris C. Tang, David Eidelberg.
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16. We present an example of the application of this methodology to FDG PET data of Parkinson's Disease patients and normal controls using our in-house software to derive a characteristic covariance pattern biomarker of disease.
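A hedged NumPy/scikit-learn sketch of the core SSM steps, log transformation, centering, PCA, and logistic-regression combination of component scores, is given below with random stand-in data; it is not the authors' in-house software.

```python
# Minimal sketch of the SSM idea: log transform, remove global and group means,
# extract principal components, and combine subject scores by logistic regression
# into a single disease-related pattern. Data shapes are placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

n_subjects, n_voxels = 40, 5000
images = np.random.rand(n_subjects, n_voxels) + 0.1      # stand-in FDG PET data
labels = np.array([0] * 20 + [1] * 20)                   # 0 = control, 1 = patient

log_data = np.log(images)
centered = log_data - log_data.mean(axis=1, keepdims=True)   # remove subject global mean
centered -= centered.mean(axis=0, keepdims=True)             # remove group mean image

pca = PCA(n_components=5).fit(centered)
scores = pca.transform(centered)                             # subject scores per component

clf = LogisticRegression().fit(scores, labels)               # combine components
composite_pattern = (clf.coef_ @ pca.components_).ravel()    # disease-related pattern
composite_scores = centered @ composite_pattern              # one score per subject
print(composite_scores.shape)
```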
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
Development of an Audio-based Virtual Gaming Environment to Assist with Navigation Skills in the Blind
Authors: Erin C. Connors, Lindsay A. Yazzolino, Jaime Sánchez, Lotfi B. Merabet.
Institutions: Massachusetts Eye and Ear Infirmary, Harvard Medical School, University of Chile .
Audio-based Environment Simulator (AbES) is virtual environment software designed to improve real world navigation skills in the blind. Using only audio based cues and set within the context of a video game metaphor, users gather relevant spatial information regarding a building's layout. This allows the user to develop an accurate spatial cognitive map of a large-scale three-dimensional space that can be manipulated for the purposes of a real indoor navigation task. After game play, participants are then assessed on their ability to navigate within the target physical building represented in the game. Preliminary results suggest that early blind users were able to acquire relevant information regarding the spatial layout of a previously unfamiliar building as indexed by their performance on a series of navigation tasks. These tasks included path finding through the virtual and physical building, as well as a series of drop off tasks. We find that the immersive and highly interactive nature of the AbES software appears to greatly engage the blind user to actively explore the virtual environment. Applications of this approach may extend to larger populations of visually impaired individuals.
Medicine, Issue 73, Behavior, Neuroscience, Anatomy, Physiology, Neurobiology, Ophthalmology, Psychology, Behavior and Behavior Mechanisms, Technology, Industry, virtual environments, action video games, blind, audio, rehabilitation, indoor navigation, spatial cognitive map, Audio-based Environment Simulator, virtual reality, cognitive psychology, clinical techniques
Modeling Biological Membranes with Circuit Boards and Measuring Electrical Signals in Axons: Student Laboratory Exercises
Authors: Martha M. Robinson, Jonathan M. Martin, Harold L. Atwood, Robin L. Cooper.
Institutions: University of Kentucky, University of Toronto.
This is a demonstration of how electrical models can be used to characterize biological membranes. This exercise also introduces biophysical terminology used in electrophysiology. The same equipment is used in the membrane model as on live preparations. Some properties of an isolated nerve cord are investigated: nerve action potentials, recruitment of neurons, and responsiveness of the nerve cord to environmental factors.
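The passive membrane behavior that the circuit-board model demonstrates can be summarized by the RC charging curve; the short sketch below computes it for illustrative parameter values.

```python
# Passive membrane as an RC circuit: a current step charges the "membrane"
# with time constant tau = R*C. Parameter values are illustrative.
import numpy as np

R = 10e6        # membrane resistance, ohms
C = 100e-12     # membrane capacitance, farads
I = 100e-12     # injected current step, amps
tau = R * C     # membrane time constant, seconds

t = np.linspace(0, 5 * tau, 500)
V = I * R * (1 - np.exp(-t / tau))    # charging curve across the membrane
print(f"tau = {tau*1e3:.1f} ms, V after 5*tau = {V[-1]*1e3:.2f} mV "
      f"(steady state {I*R*1e3:.1f} mV)")
```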
Basic Protocols, Issue 47, Invertebrate, Crayfish, Modeling, Student laboratory, Nerve cord
Plasma Lithography Surface Patterning for Creation of Cell Networks
Authors: Michael Junkin, Siu Ling Leung, Yongliang Yang, Yi Lu, Justin Volmering, Pak Kin Wong.
Institutions: University of Arizona.
Systematic manipulation of a cell microenvironment with micro- and nanoscale resolution is often required for deciphering various cellular and molecular phenomena. To address this requirement, we have developed a plasma lithography technique to manipulate the cellular microenvironment by creating a patterned surface with feature sizes ranging from 100 nm to millimeters. The goal of this technique is to be able to study, in a controlled way, the behaviors of individual cells as well as groups of cells and their interactions. This plasma lithography method is based on selective modification of the surface chemistry on a substrate by means of shielding the contact of low-temperature plasma with a physical mold. This selective shielding leaves a chemical pattern which can guide cell attachment and movement. This pattern, or surface template, can then be used to create networks of cells whose structure can mimic that found in nature and produces a controllable environment for experimental investigations. The technique is well suited to studying biological phenomena as it produces stable surface patterns on transparent polymeric substrates in a biocompatible manner. The surface patterns last for weeks to months and can thus guide interaction with cells for long time periods, which facilitates the study of long-term cellular processes, such as differentiation and adaptation. The modification to the surface is primarily chemical in nature and thus does not introduce topographical or physical interference for interpretation of results. It also does not involve any harsh or toxic substances to achieve patterning and is compatible with tissue culture. Furthermore, it can be applied to modify various types of polymeric substrates, which, owing to their tunable properties, are ideal for and widely used in biological applications. The resolution achievable is also beneficial, as isolation of specific processes such as migration, adhesion, or binding allows for discrete, clear observations at the single to multicell level. This method has been employed to form diverse networks of different cell types for investigations involving migration, signaling, tissue formation, and the behavior and interactions of neurons arranged in a network.
Bioengineering, Issue 52, Cell Network, Surface Patterning, Self-Organization, Developmental Biology, Tissue Engineering, Nanopattern, Micropattern, Self-Assembly, Cell Guidance, Neuron
A Protocol for Computer-Based Protein Structure and Function Prediction
Authors: Ambrish Roy, Dong Xu, Jonathan Poisson, Yang Zhang.
Institutions: University of Michigan , University of Kansas.
Genome sequencing projects have deciphered millions of protein sequences, which require knowledge of their structure and function to improve the understanding of their biological role. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules which are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms and protein-ligand binding sites. All the predictions are tagged with a confidence score that indicates how accurate the predictions are expected to be without knowledge of the experimental data. To facilitate the special requests of end users, the server provides channels to accept user-specified inter-residue distance and contact maps to interactively change the I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. The structural information could be collected by the users based on experimental evidence or biological insights with the purpose of improving the quality of I-TASSER predictions. The server was ranked as the best program for protein structure and function prediction in the recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries who are using the on-line I-TASSER server.
Biochemistry, Issue 57, On-line server, I-TASSER, protein structure prediction, function prediction
Assessment and Evaluation of the High Risk Neonate: The NICU Network Neurobehavioral Scale
Authors: Barry M. Lester, Lynne Andreozzi-Fontaine, Edward Tronick, Rosemarie Bigsby.
Institutions: Brown University, Women & Infants Hospital of Rhode Island, University of Massachusetts, Boston.
There has been a long-standing interest in the assessment of the neurobehavioral integrity of the newborn infant. The NICU Network Neurobehavioral Scale (NNNS) was developed as an assessment for the at-risk infant. These are infants who are at increased risk for poor developmental outcome because of insults during prenatal development, such as substance exposure or prematurity or factors such as poverty, poor nutrition or lack of prenatal care that can have adverse effects on the intrauterine environment and affect the developing fetus. The NNNS assesses the full range of infant neurobehavioral performance including neurological integrity, behavioral functioning, and signs of stress/abstinence. The NNNS is a noninvasive neonatal assessment tool with demonstrated validity as a predictor, not only of medical outcomes such as cerebral palsy diagnosis, neurological abnormalities, and diseases with risks to the brain, but also of developmental outcomes such as mental and motor functioning, behavior problems, school readiness, and IQ. The NNNS can identify infants at high risk for abnormal developmental outcome and is an important clinical tool that enables medical researchers and health practitioners to identify these infants and develop intervention programs to optimize the development of these infants as early as possible. The video shows the NNNS procedures, shows examples of normal and abnormal performance and the various clinical populations in which the exam can be used.
Behavior, Issue 90, NICU Network Neurobehavioral Scale, NNNS, High risk infant, Assessment, Evaluation, Prediction, Long term outcome
Functional Calcium Imaging in Developing Cortical Networks
Authors: Julia Dawitz, Tim Kroon, J.J. Johannes Hjorth, Rhiannon M. Meredith.
Institutions: VU University, Amsterdam.
A hallmark pattern of activity in developing nervous systems is spontaneous, synchronized network activity. Synchronized activity has been observed in intact spinal cord, brainstem, retina, cortex and dissociated neuronal culture preparations. During periods of spontaneous activity, neurons depolarize to fire single or bursts of action potentials, activating many ion channels. Depolarization activates voltage-gated calcium channels on dendrites and spines that mediate calcium influx. Highly synchronized electrical activity has been measured from local neuronal networks using field electrodes. This technique enables high temporal sampling rates but lower spatial resolution due to integrated read-out of multiple neurons at one electrode. Single cell resolution of neuronal activity is possible using patch-clamp electrophysiology on single neurons to measure firing activity. However, the ability to measure from a network is limited to the number of neurons patched simultaneously, and typically is only one or two neurons. The use of calcium-dependent fluorescent indicator dyes has enabled the measurement of synchronized activity across a network of cells. This technique gives both high spatial resolution and sufficient temporal sampling to record spontaneous activity of the developing network. A key feature of newly-forming cortical and hippocampal networks during pre- and early postnatal development is spontaneous, synchronized neuronal activity (Katz & Shatz, 1996; Khaziphov & Luhmann, 2006). This correlated network activity is believed to be essential for the generation of functional circuits in the developing nervous system (Spitzer, 2006). In both primate and rodent brain, early electrical and calcium network waves are observed pre- and postnatally in vivo and in vitro (Adelsberger et al., 2005; Garaschuk et al., 2000; Lamblin et al., 1999). These early activity patterns, which are known to control several developmental processes including neuronal differentiation, synaptogenesis and plasticity (Rakic & Komuro, 1995; Spitzer et al., 2004), are of critical importance for the correct development and maturation of the cortical circuitry. In this JoVE video, we demonstrate the methods used to image spontaneous activity in developing cortical networks. Calcium-sensitive indicators, such as Fura-2 AM ester, diffuse across the cell membrane, where intracellular esterase activity cleaves the AM esters to leave the cell-impermeant form of the indicator dye. The impermeant form of the indicator has carboxylic acid groups that can then detect and bind calcium ions intracellularly. The fluorescence of the calcium-sensitive dye is transiently altered upon binding to calcium. Single or multi-photon imaging techniques are used to measure the change in photons being emitted from the dye, and thus indicate an alteration in intracellular calcium. Furthermore, these calcium-dependent indicators can be combined with other fluorescent markers to investigate cell types within the active network.
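To illustrate the downstream quantification, the sketch below converts placeholder ROI fluorescence traces to dF/F and reads out the fraction of co-active cells per frame; the threshold and array shapes are assumptions rather than values from the protocol.

```python
# Hedged sketch: ROI fluorescence traces -> dF/F -> simple network-synchrony readout.
# The traces array stands in for intensities extracted from the imaging data.
import numpy as np

n_frames, n_cells = 600, 30
traces = np.random.rand(n_frames, n_cells) + 5.0          # placeholder fluorescence

f0 = np.percentile(traces, 10, axis=0)                    # per-cell baseline estimate
dff = (traces - f0) / f0                                  # dF/F

active = dff > 0.05                                       # crude event threshold (assumed)
fraction_coactive = active.mean(axis=1)                   # fraction of cells active per frame
print("peak co-activation:", fraction_coactive.max())
```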
Neuroscience, Issue 56, calcium, imaging, mouse, network, development, cortex, multiphoton
A Faster, High Resolution, mtPA-GFP-based Mitochondrial Fusion Assay Acquiring Kinetic Data of Multiple Cells in Parallel Using Confocal Microscopy
Authors: Alenka Lovy, Anthony J.A. Molina, Fernanda M. Cerqueira, Kyle Trudeau, Orian S. Shirihai.
Institutions: Tufts School of Medicine, Wake Forest Baptist Medical Center, Boston University Medical Center.
Mitochondrial fusion plays an essential role in mitochondrial calcium homeostasis, bioenergetics, autophagy and quality control. Fusion is quantified in living cells by photo-conversion of matrix targeted photoactivatable GFP (mtPAGFP) in a subset of mitochondria. The rate at which the photoconverted molecules equilibrate across the entire mitochondrial population is used as a measure of fusion activity. Thus far, measurements were performed using a single cell time lapse approach, quantifying the equilibration in one cell over an hour. Here, we scale up and automate a previously published live cell method based on using mtPAGFP and a low concentration of TMRE (15 nM). This method involves photoactivating a small portion of the mitochondrial network, collecting highly resolved stacks of confocal sections every 15 min for 1 hour, and quantifying the change in signal intensity. Depending on several factors, such as the ease of finding PAGFP-expressing cells and the signal of the photoactivated regions, it is possible to collect around 10 cells within the 15 min intervals. This provides a significant improvement in the time efficiency of this assay while maintaining the highly resolved subcellular quantification as well as the kinetic parameters necessary to capture the detail of mitochondrial behavior in its native cytoarchitectural environment. Mitochondrial dynamics play a role in many cellular processes including respiration, calcium regulation, and apoptosis1,2,3,13. The structure of the mitochondrial network affects the function of mitochondria, and the way they interact with the rest of the cell. Undergoing constant division and fusion, mitochondrial networks attain various shapes ranging from highly fused networks to more fragmented ones. Interestingly, Alzheimer's disease, Parkinson's disease, Charcot Marie Tooth 2A, and dominant optic atrophy have been correlated with altered mitochondrial morphology, namely fragmented networks4,10,13. Often, upon fragmentation, mitochondria become depolarized, and upon accumulation this leads to impaired cell function18. Mitochondrial fission has been shown to signal a cell to progress toward apoptosis. It can also provide a mechanism by which to separate depolarized and inactive mitochondria to keep the bulk of the network robust14. Fusion of mitochondria, on the other hand, leads to sharing of matrix proteins, solutes, mtDNA and the electrochemical gradient, and also seems to prevent progression to apoptosis9. How fission and fusion of mitochondria affects cell homeostasis and ultimately the functioning of the organism needs further understanding, and therefore the continuous development and optimization of how to gather information on these phenomena is necessary. Existing mitochondrial fusion assays have revealed various insights into mitochondrial physiology, each having its own advantages. The hybrid PEG fusion assay7 mixes two populations of differently labeled cells (mtRFP and mtYFP) and analyzes the amount of mixing and colocalization of fluorophores in fused, multinucleated cells. Although this method has yielded valuable information, not all cell types can fuse, and the conditions under which fusion is stimulated involve the use of toxic drugs that likely affect the normal fusion process. More recently, a cell-free technique has been devised, using isolated mitochondria to observe fusion events based on a luciferase assay1,5.
Two human cell lines are targeted with either the amino or a carboxy terminal part of Renilla luciferase along with a leucine zipper to ensure dimerization upon mixing. Mitochondria are isolated from each cell line, and fused. The fusion reaction can occur without the cytosol under physiological conditions in the presence of energy, appropriate temperature and inner mitochondrial membrane potential. Interestingly, the cytosol was found to modulate the extent of fusion, demonstrating that cell signaling regulates the fusion process 4,5. This assay will be very useful for high throughput screening to identify components of the fusion machinery and also pharmacological compounds that may affect mitochondrial dynamics. However, more detailed whole cell mitochondrial assays will be needed to complement this in vitro assay to observe these events within a cellular environment. A technique for monitoring whole-cell mitochondrial dynamics has been in use for some time and is based on a mitochondrially-targeted photoactivatable GFP (mtPAGFP)6,11. Upon expression of the mtPAGFP, a small portion of the mitochondrial network is photoactivated (10-20%), and the spread of the signal to the rest of the mitochondrial network is recorded every 15 minutes for 1 hour using time lapse confocal imaging. Each fusion event leads to a dilution of signal intensity, enabling quantification of the fusion rate. Although fusion and fission are continuously occurring in cells, this technique only monitors fusion as fission does not lead to a dilution of the PAGFP signal6. Co-labeling with low levels of TMRE (7-15 nM in INS1 cells) allows quantification of the membrane potential of mitochondria. When mitochondria are hyperpolarized they uptake more TMRE, and when they depolarize they lose the TMRE dye. Mitochondria that depolarize no longer have a sufficient membrane potential and tend not to fuse as efficiently if at all. Therefore, active fusing mitochondria can be tracked with these low levels of TMRE9,15. Accumulation of depolarized mitochondria that lack a TMRE signal may be a sign of phototoxicity or cell death. Higher concentrations of TMRE render mitochondria very sensitive to laser light, and therefore great care must be taken to avoid overlabeling with TMRE. If the effect of depolarization of mitochondria is the topic of interest, a technique using slightly higher levels of TMRE and more intense laser light can be used to depolarize mitochondria in a controlled fashion (Mitra and Lippincott-Schwartz, 2010). To ensure that toxicity due to TMRE is not an issue, we suggest exposing loaded cells (3-15 nM TMRE) to the imaging parameters that will be used in the assay (perhaps 7 stacks of 6 optical sections in a row), and assessing cell health after 2 hours. If the mitochondria appear too fragmented and cells are dying, other mitochondrial markers, such as dsRED or Mitotracker red could be used instead of TMRE. The mtPAGFP method has revealed details about mitochondrial network behavior that could not be visualized using other methods. For example, we now know that mitochondrial fusion can be full or transient, where matrix content can mix without changing the overall network morphology. Additionally, we know that the probability of fusion is independent of contact duration and organelle dimension, is influenced by organelle motility, membrane potential and history of previous fusion activity8,15,16,17. 
In this manuscript, we describe a methodology for scaling up the previously published protocol using mtPAGFP and 15nM TMRE8 in order to examine multiple cells at a time and improve the time efficiency of data collection without sacrificing the subcellular resolution. This has been made possible by the use of an automated microscope stage, and programmable image acquisition software. Zen software from Zeiss allows the user to mark and track several designated cells expressing mtPAGFP. Each of these cells can be photoactivated in a particular region of interest, and stacks of confocal slices can be monitored for mtPAGFP signal as well as TMRE at specified intervals. Other confocal systems could be used to perform this protocol provided there is an automated stage that is programmable, an incubator with CO2, and a means by which to photoactivate the PAGFP; either a multiphoton laser, or a 405 nm diode laser.
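A hedged sketch of the quantification described above follows: the mean photoactivated-region intensity at each 15 min time point is fitted with a single-exponential decay whose rate serves as a fusion readout. The intensity values are placeholders, not data from the article.

```python
# Fit the dilution of mtPAGFP signal in the photoactivated ROI with a
# single-exponential decay; the rate is an apparent fusion-rate readout.
# Intensity values below are illustrative placeholders.
import numpy as np
from scipy.optimize import curve_fit

time_min = np.array([0, 15, 30, 45, 60], dtype=float)
intensity = np.array([1.00, 0.72, 0.55, 0.44, 0.37])      # normalized ROI mean

def decay(t, rate, plateau):
    return plateau + (1 - plateau) * np.exp(-rate * t)

(rate, plateau), _ = curve_fit(decay, time_min, intensity, p0=(0.02, 0.3))
print(f"apparent fusion rate: {rate:.3f} per min, plateau: {plateau:.2f}")
```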
Molecular Biology, Issue 65, Genetics, Cellular Biology, Physics, confocal microscopy, mitochondria, fusion, TMRE, mtPAGFP, INS1, mitochondrial dynamics, mitochondrial morphology, mitochondrial network
Mapping Bacterial Functional Networks and Pathways in Escherichia Coli using Synthetic Genetic Arrays
Authors: Alla Gagarinova, Mohan Babu, Jack Greenblatt, Andrew Emili.
Institutions: University of Toronto, University of Regina.
Phenotypes are determined by a complex series of physical (e.g. protein-protein) and functional (e.g. gene-gene or genetic) interactions (GI)1. While physical interactions can indicate which bacterial proteins are associated as complexes, they do not necessarily reveal pathway-level functional relationships1. GI screens, in which the growth of double mutants bearing two deleted or inactivated genes is measured and compared to the corresponding single mutants, can illuminate epistatic dependencies between loci and hence provide a means to query and discover novel functional relationships2. Large-scale GI maps have been reported for eukaryotic organisms like yeast3-7, but GI information remains sparse for prokaryotes8, which hinders the functional annotation of bacterial genomes. To this end, we and others have developed high-throughput quantitative bacterial GI screening methods9, 10. Here, we present the key steps required to perform the quantitative E. coli Synthetic Genetic Array (eSGA) screening procedure at genome scale9, using natural bacterial conjugation and homologous recombination to systematically generate and measure the fitness of large numbers of double mutants in a colony array format. Briefly, a robot is used to transfer, through conjugation, chloramphenicol (Cm)-marked mutant alleles from engineered Hfr (High frequency of recombination) 'donor strains' into an ordered array of kanamycin (Kan)-marked F- recipient strains. Typically, we use loss-of-function single mutants bearing non-essential gene deletions (e.g. the 'Keio' collection11) and essential gene hypomorphic mutations (i.e. alleles conferring reduced protein expression, stability, or activity9, 12, 13) to query the functional associations of non-essential and essential genes, respectively. After conjugation and ensuing genetic exchange mediated by homologous recombination, the resulting double mutants are selected on solid medium containing both antibiotics. After outgrowth, the plates are digitally imaged and colony sizes are quantitatively scored using an in-house automated image processing system14. GIs are revealed when the growth rate of a double mutant is either significantly better or worse than expected9. Aggravating (or negative) GIs often result between loss-of-function mutations in pairs of genes from compensatory pathways that impinge on the same essential process2. Here, the loss of a single gene is buffered, such that either single mutant is viable. However, the loss of both pathways is deleterious and results in synthetic lethality or sickness (i.e. slow growth). Conversely, alleviating (or positive) interactions can occur between genes in the same pathway or protein complex2 as the deletion of either gene alone is often sufficient to perturb the normal function of the pathway or complex such that additional perturbations do not reduce activity, and hence growth, further. Overall, systematically identifying and analyzing GI networks can provide unbiased, global maps of the functional relationships between large numbers of genes, from which pathway-level information missed by other approaches can be inferred9.
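The multiplicative scoring logic described above can be written in a few lines; in the sketch below, fitness values are normalized colony sizes and the numbers are illustrative.

```python
# Multiplicative model for genetic-interaction scoring: the interaction score is
# the deviation of the observed double-mutant fitness from the product of the
# single-mutant fitnesses. Values are illustrative, normalized to wild type = 1.0.
def gi_score(w_double, w_a, w_b):
    expected = w_a * w_b
    return w_double - expected          # negative = aggravating, positive = alleviating

print(gi_score(w_double=0.15, w_a=0.8, w_b=0.7))   # -0.41 -> synthetic sick/lethal
print(gi_score(w_double=0.60, w_a=0.8, w_b=0.7))   # +0.04 -> mild alleviating
```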
Genetics, Issue 69, Molecular Biology, Medicine, Biochemistry, Microbiology, Aggravating, alleviating, conjugation, double mutant, Escherichia coli, genetic interaction, Gram-negative bacteria, homologous recombination, network, synthetic lethality or sickness, suppression
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
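A hedged MNE-Python sketch of the channel-to-source step is shown below: the individual or age-appropriate head model enters through the forward solution, and cortical sources are estimated by minimum-norm estimation. File names are placeholders and preprocessing is assumed to be complete; this is not the London Baby Lab pipeline itself.

```python
# Minimal minimum-norm source estimation sketch with MNE-Python.
# All file names are placeholders; filtering and artifact rejection are assumed done.
import mne

epochs = mne.read_epochs("infant_task-epo.fif")                  # preprocessed EEG epochs
evoked = epochs.average()
noise_cov = mne.compute_covariance(epochs, tmax=0.0)             # baseline noise covariance
fwd = mne.read_forward_solution("infant_mri-fwd.fif")            # head-model-based forward

inv = mne.minimum_norm.make_inverse_operator(evoked.info, fwd, noise_cov)
stc = mne.minimum_norm.apply_inverse(evoked, inv, lambda2=1.0 / 9.0, method="MNE")
print(stc.data.shape)   # (n_sources, n_times)
```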
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Trajectory Data Analyses for Pedestrian Space-time Activity Study
Authors: Feng Qi, Fei Du.
Institutions: Kean University, University of Wisconsin-Madison.
It is well recognized that human movement in the spatial and temporal dimensions has direct influence on disease transmission1-3. An infectious disease typically spreads via contact between infected and susceptible individuals in their overlapped activity spaces. Therefore, daily mobility-activity information can be used as an indicator to measure exposures to risk factors of infection. However, a major difficulty, and thus the reason for the paucity of studies of infectious disease transmission at the micro scale, arises from the lack of detailed individual mobility data. Previously, in transportation and tourism research, detailed space-time activity data often relied on the time-space diary technique, which requires subjects to actively record their activities in time and space. This is highly demanding for the participants, and collaboration from the participants greatly affects the quality of the data4. Modern technologies such as GPS and mobile communications have made possible the automatic collection of trajectory data. The data collected, however, is not ideal for modeling human space-time activities, limited by the accuracies of existing devices. There is also no readily available tool for efficient processing of the data for human behavior study. We present here a suite of methods and an integrated ArcGIS desktop-based visual interface for the pre-processing and spatiotemporal analyses of trajectory data. We provide examples of how such processing may be used to model human space-time activities, especially with error-rich pedestrian trajectory data, which could be useful in public health studies such as infectious disease transmission modeling. The procedure presented includes pre-processing, trajectory segmentation, activity space characterization, density estimation and visualization, and a few other exploratory analysis methods. Pre-processing is the cleaning of noisy raw trajectory data. We introduce an interactive visual pre-processing interface as well as an automatic module. Trajectory segmentation5 involves the identification of indoor and outdoor parts from pre-processed space-time tracks. Again, both interactive visual segmentation and automatic segmentation are supported. Segmented space-time tracks are then analyzed to derive characteristics of one's activity space, such as activity radius. Density estimation and visualization are used to examine large amounts of trajectory data to model hot spots and interactions. We demonstrate both density surface mapping6 and density volume rendering7. We also include a couple of other exploratory data analysis (EDA) and visualization tools, such as Google Earth animation support and connection analysis. The suite of analytical as well as visual methods presented in this paper may be applied to any trajectory data for space-time activity studies.
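As one concrete example of the activity-space characteristics mentioned above, the sketch below computes a simple activity radius, the mean great-circle distance of GPS fixes from their centroid; the coordinates are placeholders and the published workflow is ArcGIS-based.

```python
# Activity radius of a pedestrian track: mean great-circle distance of GPS
# fixes from their centroid. Coordinates below are illustrative placeholders.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

track = [(40.676, -74.229), (40.677, -74.231), (40.679, -74.228)]  # (lat, lon) fixes
lat_c = sum(p[0] for p in track) / len(track)
lon_c = sum(p[1] for p in track) / len(track)
radius = sum(haversine_m(lat, lon, lat_c, lon_c) for lat, lon in track) / len(track)
print(f"activity radius ~ {radius:.0f} m")
```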
Environmental Sciences, Issue 72, Computer Science, Behavior, Infectious Diseases, Geography, Cartography, Data Display, Disease Outbreaks, cartography, human behavior, Trajectory data, space-time activity, GPS, GIS, ArcGIS, spatiotemporal analysis, visualization, segmentation, density surface, density volume, exploratory data analysis, modelling
50130
Using Microwave and Macroscopic Samples of Dielectric Solids to Study the Photonic Properties of Disordered Photonic Bandgap Materials
Authors: Seyed Reza Hashemizad, Sam Tsitrin, Polin Yadak, Yingquan He, Daniel Cuneo, Eric Paul Williamson, Devin Liner, Weining Man.
Institutions: San Francisco State University.
Recently, disordered photonic materials have been suggested as an alternative to periodic crystals for the formation of a complete photonic bandgap (PBG). In this article, we describe methods for constructing and characterizing macroscopic disordered photonic structures using microwaves. The microwave regime offers the most convenient experimental sample size for building and testing PBG media. Easily manipulated dielectric lattice components offer flexibility in building various 2D structures on top of pre-printed plastic templates. Once built, the structures can be quickly modified with point and line defects to make freeform waveguides and filters. Testing is done using a widely available Vector Network Analyzer and pairs of microwave horn antennas. Due to the scale invariance of electromagnetic fields, the results obtained in the microwave region can be applied directly to the infrared and optical regions. Our approach is simple but delivers exciting new insight into the interaction of light with disordered matter. Our representative results include the first experimental demonstration of the existence of a complete and isotropic PBG in a two-dimensional (2D) hyperuniform disordered dielectric structure. Additionally, we demonstrate experimentally the ability of this novel photonic structure to guide electromagnetic (EM) waves through freeform waveguides of arbitrary shape.
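The scale-invariance argument can be made concrete with a back-of-the-envelope calculation: because Maxwell's equations contain no intrinsic length scale, shrinking a dielectric structure by a factor s (with unchanged dielectric constants) shifts its spectral features up in frequency by the same factor. The numbers in the sketch below are hypothetical and serve only to illustrate the scaling.

    # Hypothetical numbers illustrating scale invariance: spectral features
    # scale inversely with structure size when the dielectric constants are kept.
    C = 299_792_458.0                 # speed of light, m/s

    gap_centre_hz = 10.0e9            # hypothetical microwave-regime bandgap centre
    lattice_spacing_m = 0.01          # hypothetical centimetre-scale spacing

    target_wavelength_m = 1.55e-6     # an optical/telecom wavelength of interest
    target_freq_hz = C / target_wavelength_m

    s = target_freq_hz / gap_centre_hz                 # required shrink factor
    print(f"shrink by ~{s:,.0f}x -> spacing ≈ {lattice_spacing_m / s * 1e9:.0f} nm")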
Physics, Issue 91, optics and photonics, photonic crystals, photonic bandgap, hyperuniform, disordered media, waveguides
51614
Designing and Implementing Nervous System Simulations on LEGO Robots
Authors: Daniel Blustein, Nikolai Rosenthal, Joseph Ayers.
Institutions: Northeastern University, Bremen University of Applied Sciences.
We present a method for using the commercially available LEGO Mindstorms NXT robotics platform to test systems-level neuroscience hypotheses. The first step of the method is to develop a nervous system simulation of specific reflexive behaviors of an appropriate model organism; here we use the American lobster. Exteroceptive reflexes mediated by decussating (crossing) neural connections can explain an animal's taxis towards or away from a stimulus, as described by Braitenberg, and are particularly well suited for investigation using the NXT platform1. The nervous system simulation is programmed using LabVIEW software on the LEGO Mindstorms platform. Once the nervous system is tuned properly, behavioral experiments are run on the robot and on the animal under identical environmental conditions. By controlling the sensory milieu experienced by the specimens, differences in behavioral outputs can be observed. These differences may point to specific deficiencies in the nervous system model and serve to inform the iteration of the model for the particular behavior under study. This method allows for the experimental manipulation of electronic nervous systems and serves as a way to explore neuroscience hypotheses, specifically regarding the neurophysiological basis of simple innate reflexive behaviors. The LEGO Mindstorms NXT kit provides an affordable and efficient platform on which to test preliminary biomimetic robot control schemes. The approach is also well suited to the high school classroom, serving as the foundation for a hands-on, inquiry-based biorobotics curriculum.
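The decussating wiring described above is essentially Braitenberg's crossed excitatory scheme: the left sensor drives the right motor and vice versa, steering the vehicle toward the stimulus. The plain-Python simulation below illustrates that principle with made-up gains and a point-source stimulus; it is not the authors' LabVIEW/NXT controller.

    # Illustrative Braitenberg-style vehicle with decussating (crossed)
    # excitatory sensor-to-motor connections; gains and geometry are made up.
    import math

    def stimulus(x, y, source=(5.0, 5.0)):
        """Intensity of a point-source stimulus, falling off with distance."""
        d = math.hypot(source[0] - x, source[1] - y)
        return 1.0 / (1.0 + d * d)

    def step(x, y, heading, dt=0.1, gain=10.0, offset=1.0):
        # Two sensors mounted to the left and right of the heading direction
        left_s = stimulus(x + math.cos(heading + 0.5), y + math.sin(heading + 0.5))
        right_s = stimulus(x + math.cos(heading - 0.5), y + math.sin(heading - 0.5))
        # Decussation: left sensor excites right motor, right sensor excites left motor
        left_motor = offset + gain * right_s
        right_motor = offset + gain * left_s
        heading += (right_motor - left_motor) * dt      # differential-drive turning
        speed = 0.5 * (left_motor + right_motor)
        return x + speed * math.cos(heading) * dt, y + speed * math.sin(heading) * dt, heading

    x, y, heading = 0.0, 0.0, 0.0
    for _ in range(300):
        x, y, heading = step(x, y, heading)
    print(f"final distance to source ≈ {math.hypot(x - 5.0, y - 5.0):.2f}")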
Neuroscience, Issue 75, Neurobiology, Bioengineering, Behavior, Mechanical Engineering, Computer Science, Marine Biology, Biomimetics, Marine Science, Neurosciences, Synthetic Biology, Robotics, robots, Modeling, models, Sensory Fusion, nervous system, Educational Tools, programming, software, lobster, Homarus americanus, animal model
50519
Analyzing and Building Nucleic Acid Structures with 3DNA
Authors: Andrew V. Colasanti, Xiang-Jun Lu, Wilma K. Olson.
Institutions: Rutgers - The State University of New Jersey, Columbia University.
The 3DNA software package is a popular and versatile bioinformatics tool with capabilities to analyze, construct, and visualize three-dimensional nucleic acid structures. This article presents detailed protocols for a subset of new and popular features available in 3DNA, applicable to both individual structures and ensembles of related structures. Protocol 1 lists the set of instructions needed to download and install the software. This is followed, in Protocol 2, by the analysis of a nucleic acid structure, including the assignment of base pairs and the determination of rigid-body parameters that describe the structure and, in Protocol 3, by a description of the reconstruction of an atomic model of a structure from its rigid-body parameters. The most recent version of 3DNA, version 2.1, has new features for the analysis and manipulation of ensembles of structures, such as those deduced from nuclear magnetic resonance (NMR) measurements and molecular dynamics (MD) simulations; these features are presented in Protocols 4 and 5. In addition to the 3DNA stand-alone software package, the w3DNA web server, located at http://w3dna.rutgers.edu, provides a user-friendly interface to selected features of the software. Protocol 6 demonstrates a novel feature of the site for building models of long DNA molecules decorated with bound proteins at user-specified locations.
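As a rough sketch of the analysis step in Protocol 2, the snippet below drives the classic 3DNA command-line pair, find_pair followed by analyze, from Python. The PDB file name is a placeholder, and the exact invocation can vary between 3DNA versions, so treat this as illustrative rather than authoritative.

    # Illustrative wrapper around the 3DNA command-line tools; assumes
    # `find_pair` and `analyze` are installed and on PATH (see Protocol 1).
    import subprocess

    pdb_file = "my_structure.pdb"          # placeholder input structure

    # Identify base pairs and write the input file expected by `analyze`
    subprocess.run(["find_pair", pdb_file, "structure.inp"], check=True)

    # Compute rigid-body base-pair and step parameters (written to an .out file)
    subprocess.run(["analyze", "structure.inp"], check=True)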
Genetics, Issue 74, Molecular Biology, Biochemistry, Bioengineering, Biophysics, Genomics, Chemical Biology, Quantitative Biology, conformational analysis, DNA, high-resolution structures, model building, molecular dynamics, nucleic acid structure, RNA, visualization, bioinformatics, three-dimensional, 3DNA, software
4401
In Vivo 2-Photon Calcium Imaging in Layer 2/3 of Mice
Authors: Peyman Golshani, Carlos Portera-Cailliau.
Institutions: University of California, Los Angeles.
To understand the network dynamics of microcircuits in the neocortex, it is essential to record the activity of a large number of neurons simultaneously. In vivo two-photon calcium imaging is the only method that allows one to record the activity of a dense neuronal population with single-cell resolution. The method consists of implanting a cranial imaging window, injecting a fluorescent calcium indicator dye that is taken up by large numbers of neurons, and finally recording the activity of those neurons with time-lapse calcium imaging using an in vivo two-photon microscope. Co-injection of astrocyte-specific dyes allows one to differentiate neurons from astrocytes. The technique can be performed in mice expressing fluorescent molecules in specific subpopulations of neurons to better understand the network interactions of different groups of cells.
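The abstract stops at data acquisition, but a common first step with the resulting time-lapse movies is to convert each cell's fluorescence trace into a ΔF/F signal. The NumPy sketch below does this on a synthetic trace; it is a generic illustration, not part of the authors' protocol.

    # Generic ΔF/F computation on a synthetic fluorescence trace; not part of
    # the authors' protocol.
    import numpy as np

    rng = np.random.default_rng(0)
    n_frames = 600                                        # synthetic time-lapse movie
    f = 100.0 + rng.normal(0.0, 2.0, n_frames)            # baseline fluorescence + noise
    f[200:220] += 40.0 * np.exp(-np.arange(20) / 8.0)     # a calcium-transient-like event

    f0 = np.percentile(f, 20)                             # robust baseline estimate
    dff = (f - f0) / f0                                   # ΔF/F trace
    print(f"baseline F0 ≈ {f0:.1f}, peak ΔF/F ≈ {dff.max():.2f}")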
Neuroscience, Issue 13, 2-photon, two-photon, GFP mice, craniotomy, spine dynamics, cranial window
681
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

To build these video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In both situations, our algorithm still displays the most relevant videos it can find, which can sometimes result in matches that are only loosely related.