Bounds on the average sensitivity of nested canalizing functions.
PUBLISHED: 01-01-2013
Nested canalizing Boolean functions (NCF) play an important role in biologically motivated regulatory networks and in signal processing, in particular describing stack filters. It has been conjectured that NCFs have a stabilizing effect on the network dynamics. It is well known that the average sensitivity plays a central role for the stability of (random) Boolean networks. Here we provide a tight upper bound on the average sensitivity of NCFs as a function of the number of relevant input variables. As conjectured in literature this bound is smaller than 4/3. This shows that a large number of functions appearing in biological networks belong to a class that has low average sensitivity, which is even close to a tight lower bound.
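The quantities discussed above are easy to probe numerically. The following Python sketch brute-forces the average sensitivity of a small Boolean function; the two example functions are illustrative choices, not functions drawn from the paper:

```python
from itertools import product

def average_sensitivity(f, n):
    """Average sensitivity of an n-input Boolean function:
    s(f) = E_x[ #{i : f(x) != f(x with bit i flipped)} ],
    with x uniform over {0,1}^n, computed by brute force."""
    total = 0
    for x in product((0, 1), repeat=n):
        for i in range(n):
            y = list(x)
            y[i] ^= 1                      # flip the i-th input
            if f(*x) != f(*y):
                total += 1
    return total / 2 ** n

# A small nested canalizing function: x1 AND (x2 OR x3)
ncf = lambda a, b, c: a & (b | c)
# Parity is maximally sensitive and is not canalizing in any variable
xor = lambda a, b, c: a ^ b ^ c
```

For this NCF the average sensitivity works out to 1.25, below the 4/3 bound, while three-input parity reaches the maximum possible value of 3.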
The antigenic diversity of HIV-1 has long been an obstacle to vaccine design, and this variability is especially pronounced in the V3 loop of the virus' surface envelope glycoprotein. We previously proposed that the crown of the V3 loop, although dynamic and sequence variable, is constrained throughout the population of HIV-1 viruses to an immunologically relevant β-hairpin tertiary structure. Importantly, there are thousands of different V3 loop crown sequences in circulating HIV-1 viruses, making 3D structural characterization of trends across the diversity of viruses difficult or impossible by crystallography or NMR. Our previous successful studies with folding of the V3 crown1, 2 used the ab initio algorithm 3 accessible in the ICM-Pro molecular modeling software package (Molsoft LLC, La Jolla, CA) and suggested that the crown of the V3 loop, specifically from positions 10 to 22, benefits sufficiently from the flexibility and length of its flanking stems to behave to a large degree as if it were an unconstrained peptide freely folding in solution. As such, rapid ab initio folding of just this portion of the V3 loop of any individual strain of the 60,000+ circulating HIV-1 strains can be informative. Here, we folded the V3 loop of the R2 strain to gain insight into the structural basis of its unique properties. R2 bears a rare V3 loop sequence thought to be responsible for the exquisite sensitivity of this strain to neutralization by patient sera and monoclonal antibodies4, 5. The strain mediates CD4-independent infection and appears to elicit broadly neutralizing antibodies. We demonstrate how evaluation of the results of the folding can be informative for associating observed structures in the folding with the immunological activities observed for R2.
Microwave-assisted Functionalization of Poly(ethylene glycol) and On-resin Peptides for Use in Chain Polymerizations and Hydrogel Formation
Authors: Amy H. Van Hove, Brandon D. Wilson, Danielle S. W. Benoit.
Institutions: University of Rochester, University of Rochester Medical Center.
One of the main benefits to using poly(ethylene glycol) (PEG) macromers in hydrogel formation is synthetic versatility. The ability to draw from a large variety of PEG molecular weights and configurations (arm number, arm length, and branching pattern) affords researchers tight control over resulting hydrogel structures and properties, including Young’s modulus and mesh size. This video will illustrate a rapid, efficient, solvent-free, microwave-assisted method to methacrylate PEG precursors into poly(ethylene glycol) dimethacrylate (PEGDM). This synthetic method provides much-needed starting materials for applications in drug delivery and regenerative medicine. The demonstrated method is superior to traditional methacrylation methods as it is significantly faster and simpler, as well as more economical and environmentally friendly, using smaller amounts of reagents and solvents. We will also demonstrate an adaptation of this technique for on-resin methacrylamide functionalization of peptides. This on-resin method allows the N-terminus of peptides to be functionalized with methacrylamide groups prior to deprotection and cleavage from resin. This allows for selective addition of methacrylamide groups to the N-termini of the peptides while amino acids with reactive side groups (e.g. primary amine of lysine, primary alcohol of serine, secondary alcohols of threonine, and phenol of tyrosine) remain protected, preventing functionalization at multiple sites. This article will detail common analytical methods (proton Nuclear Magnetic Resonance spectroscopy (1H-NMR) and Matrix Assisted Laser Desorption Ionization Time of Flight mass spectrometry (MALDI-ToF)) to assess the efficiency of the functionalizations. Common pitfalls and suggested troubleshooting methods will be addressed, as will modifications of the technique which can be used to further tune macromer functionality and resulting hydrogel physical and chemical properties.
Use of synthesized products for the formation of hydrogels for drug delivery and cell-material interaction studies will be demonstrated, with particular attention paid to modifying hydrogel composition to affect mesh size, controlling hydrogel stiffness and drug release.
Chemistry, Issue 80, Poly(ethylene glycol), peptides, polymerization, polymers, methacrylation, peptide functionalization, 1H-NMR, MALDI-ToF, hydrogels, macromer synthesis
A Method for Investigating Age-related Differences in the Functional Connectivity of Cognitive Control Networks Associated with Dimensional Change Card Sort Performance
Authors: Bianca DeBenedictis, J. Bruce Morton.
Institutions: University of Western Ontario.
The ability to adjust behavior to sudden changes in the environment develops gradually in childhood and adolescence. For example, in the Dimensional Change Card Sort task, participants switch from sorting cards one way, such as shape, to sorting them a different way, such as color. Adjusting behavior in this way exacts a small performance cost, or switch cost, such that responses are typically slower and more error-prone on switch trials in which the sorting rule changes as compared to repeat trials in which the sorting rule remains the same. The ability to flexibly adjust behavior is often said to develop gradually, in part because behavioral costs such as switch costs typically decrease with increasing age. Why aspects of higher-order cognition, such as behavioral flexibility, develop so gradually remains an open question. One hypothesis is that these changes occur in association with functional changes in broad-scale cognitive control networks. On this view, complex mental operations, such as switching, involve rapid interactions between several distributed brain regions, including those that update and maintain task rules, re-orient attention, and select behaviors. With development, functional connections between these regions strengthen, leading to faster and more efficient switching operations. The current video describes a method of testing this hypothesis through the collection and multivariate analysis of fMRI data from participants of different ages.
Behavior, Issue 87, Neurosciences, fMRI, Cognitive Control, Development, Functional Connectivity
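The behavioral switch cost described above is simply the difference in mean reaction time between switch and repeat trials. A minimal Python sketch; the trial-record format below is hypothetical, not the task's actual output:

```python
def switch_cost(trials):
    """Mean RT on switch trials minus mean RT on repeat trials (ms).

    `trials` is a list of (trial_type, rt_ms) pairs with trial_type
    either 'switch' or 'repeat' -- an illustrative record format."""
    def mean_rt(kind):
        rts = [rt for t, rt in trials if t == kind]
        return sum(rts) / len(rts)
    return mean_rt('switch') - mean_rt('repeat')

# Toy data: responses are slower on switch trials
trials = [('repeat', 500), ('repeat', 520), ('switch', 600), ('switch', 640)]
```

Here the switch cost is 110 ms; developmentally, this difference typically shrinks with increasing age.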
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
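Although the system's analysis software is written in a MATLAB-based language, the core idea of harvesting time-stamped event records can be sketched in Python; the record format and event code below are hypothetical stand-ins for the system's actual log:

```python
def events_per_hour(records, code, t_start, t_end):
    """Rate of one event type inside a time window.

    `records` is a list of (timestamp_s, event_code) pairs, an
    illustrative stand-in for the time-stamped event log."""
    n = sum(1 for t, c in records if c == code and t_start <= t < t_end)
    return n / ((t_end - t_start) / 3600.0)

# Toy log: head entries and a feeding event over the first ~33 min
log = [(100, 'HEAD_ENTRY'), (200, 'FEED'), (900, 'HEAD_ENTRY'),
       (1500, 'HEAD_ENTRY'), (2000, 'HEAD_ENTRY')]
```

In the first half hour (0-1800 s) this log contains three head entries, a rate of 6 per hour; routines like this, run several times a day, are what let the progress of individual mice be visualized and quantified daily.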
Preparation of Segmented Microtubules to Study Motions Driven by the Disassembling Microtubule Ends
Authors: Vladimir A. Volkov, Anatoly V. Zaytsev, Ekaterina L. Grishchuk.
Institutions: Russian Academy of Sciences, Federal Research Center of Pediatric Hematology, Oncology and Immunology, Moscow, Russia, University of Pennsylvania.
Microtubule depolymerization can provide force to transport different protein complexes and protein-coated beads in vitro. The underlying mechanisms are thought to play a vital role in the microtubule-dependent chromosome motions during cell division, but the relevant proteins and their exact roles are ill-defined. Thus, there is a growing need to develop assays with which to study such motility in vitro using purified components and a defined biochemical milieu. Microtubules, however, are inherently unstable polymers; their switching between growth and shortening is stochastic and difficult to control. The protocols we describe here take advantage of segmented microtubules that are made with photoablatable stabilizing caps. Depolymerization of such segmented microtubules can be triggered with high temporal and spatial resolution, thereby assisting studies of motility at the disassembling microtubule ends. This technique can be used to carry out a quantitative analysis of the number of molecules in the fluorescently-labeled protein complexes, which move processively with dynamic microtubule ends. To optimize the signal-to-noise ratio in this and other quantitative fluorescent assays, coverslips should be treated to reduce nonspecific adsorption of soluble fluorescently-labeled proteins. Detailed protocols are provided to take into account the unevenness of fluorescent illumination, and to determine the intensity of a single fluorophore using an equidistant Gaussian fit. Finally, we describe the use of segmented microtubules to study microtubule-dependent motions of protein-coated microbeads, providing insights into the ability of different motor and nonmotor proteins to couple microtubule depolymerization to processive cargo motion.
Basic Protocol, Issue 85, microscopy flow chamber, single-molecule fluorescence, laser trap, microtubule-binding protein, microtubule-dependent motor, microtubule tip-tracking
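Correcting for uneven illumination before single-fluorophore quantification can be illustrated with a simple flat-field normalization. This is a generic sketch of the idea, not the published protocol's exact procedure:

```python
def flat_field_correct(raw, flat, dark):
    """Normalize a line of pixel intensities for uneven illumination:
    corrected = (raw - dark) / (flat - dark) * mean(flat - dark).

    raw: image pixels; flat: image of a uniform fluorescent field;
    dark: camera offset recorded with the shutter closed."""
    gain = [f - d for f, d in zip(flat, dark)]
    mean_gain = sum(gain) / len(gain)
    return [(r - d) / g * mean_gain
            for r, d, g in zip(raw, dark, gain)]

# Two pixels of equal true brightness seen under 2x different illumination:
corrected = flat_field_correct(raw=[110, 210], flat=[60, 110], dark=[10, 10])
```

Both pixels come out at 150, so single-fluorophore intensities measured anywhere in the field become comparable before the Gaussian fitting step.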
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
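As a toy illustration of the DoE setup stage, the sketch below enumerates a full-factorial design in Python. The factor names and levels are invented for illustration, and the study itself used software-guided optimal (not exhaustive) experiment combinations:

```python
from itertools import product

def full_factorial(factors):
    """Every combination of factor levels (a full-factorial design).

    `factors` maps factor name -> list of levels."""
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in product(*(factors[n] for n in names))]

# Hypothetical screening factors for transient expression:
runs = full_factorial({'promoter': ['35S', 'nos'],
                       'incubation_temp_C': [22, 25, 28]})
```

This yields 6 runs; in practice, optimal-design software prunes such grids down to a feasible number of batches while keeping the parameter effects estimable.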
Using Informational Connectivity to Measure the Synchronous Emergence of fMRI Multi-voxel Information Across Time
Authors: Marc N. Coutanche, Sharon L. Thompson-Schill.
Institutions: University of Pennsylvania.
It is now appreciated that condition-relevant information can be present within distributed patterns of functional magnetic resonance imaging (fMRI) brain activity, even for conditions with similar levels of univariate activation. Multi-voxel pattern (MVP) analysis has been used to decode this information with great success. fMRI investigators also often seek to understand how brain regions interact in interconnected networks, and use functional connectivity (FC) to identify regions that have correlated responses over time. Just as univariate analyses can be insensitive to information in MVPs, FC may not fully characterize the brain networks that process conditions with characteristic MVP signatures. The method described here, informational connectivity (IC), can identify regions with correlated changes in MVP-discriminability across time, revealing connectivity that is not accessible to FC. The method can be exploratory, using searchlights to identify seed-connected areas, or planned, between pre-selected regions-of-interest. The results can elucidate networks of regions that process MVP-related conditions, can break down MVPA searchlight maps into separate networks, or can be compared across tasks and patient groups.
Neuroscience, Issue 89, fMRI, MVPA, connectivity, informational connectivity, functional connectivity, networks, multi-voxel pattern analysis, decoding, classification, method, multivariate
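At its core, IC correlates trial-by-trial multi-voxel discriminability between regions. A minimal Python sketch with toy numbers; the published method's exact discriminability measure and correlation statistic may differ:

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy per-trial MVP discriminability (e.g. classifier evidence for the
# correct condition) in a seed region and a candidate target region:
seed = [0.9, 0.2, 0.7, 0.4, 0.8]
target = [0.8, 0.1, 0.6, 0.3, 0.9]
```

Here the two regions' discriminability rises and falls together (r ≈ 0.97); that synchrony is the signature IC looks for, and it can be present even when the regions' raw activation time courses are uncorrelated.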
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
Setting-up an In Vitro Model of Rat Blood-brain Barrier (BBB): A Focus on BBB Impermeability and Receptor-mediated Transport
Authors: Yves Molino, Françoise Jabès, Emmanuelle Lacassagne, Nicolas Gaudin, Michel Khrestchatisky.
Institutions: VECT-HORUS SAS, CNRS, NICN UMR 7259.
The blood-brain barrier (BBB) specifically regulates molecular and cellular flux between the blood and the nervous tissue. Our aim was to develop and characterize a highly reproducible rat syngeneic in vitro model of the BBB using co-cultures of primary rat brain endothelial cells (RBEC) and astrocytes to study receptors involved in transcytosis across the endothelial cell monolayer. Astrocytes were isolated by mechanical dissection following trypsin digestion and were frozen for later co-culture. RBEC were isolated from 5-week-old rat cortices. The brains were cleaned of meninges and white matter, and mechanically dissociated following enzymatic digestion. Thereafter, the tissue homogenate was centrifuged in bovine serum albumin to separate vessel fragments from nervous tissue. The vessel fragments underwent a second enzymatic digestion to free endothelial cells from their extracellular matrix. The remaining contaminating cells such as pericytes were further eliminated by plating the microvessel fragments in puromycin-containing medium. They were then passaged onto filters for co-culture with astrocytes grown on the bottom of the wells. RBEC expressed high levels of tight junction (TJ) proteins such as occludin, claudin-5 and ZO-1, with a typical localization at the cell borders. The transendothelial electrical resistance (TEER) of the brain endothelial monolayers, indicating the tightness of the TJs, reached 300 ohm·cm2 on average. The endothelial permeability coefficient (Pe) for Lucifer yellow (LY) was highly reproducible, with an average of 0.26 ± 0.11 × 10^-3 cm/min. Brain endothelial cells organized in monolayers expressed the efflux transporter P-glycoprotein (P-gp), showed a polarized transport of rhodamine 123, a ligand for P-gp, and showed specific transport of transferrin-Cy3 and DiILDL across the endothelial cell monolayer.
In conclusion, we provide a protocol for setting up an in vitro BBB model that is highly reproducible due to the quality assurance methods, and that is suitable for research on BBB transporters and receptors.
Medicine, Issue 88, rat brain endothelial cells (RBEC), mouse, spinal cord, tight junction (TJ), receptor-mediated transport (RMT), low density lipoprotein (LDL), LDLR, transferrin, TfR, P-glycoprotein (P-gp), transendothelial electrical resistance (TEER),
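The in vitro permeability coefficient is commonly derived by subtracting the empty filter's contribution from the total clearance. A Python sketch of that standard calculation; the numerical inputs are illustrative, not data from this protocol:

```python
def endothelial_permeability(ps_total, ps_filter, area_cm2):
    """Endothelial permeability coefficient Pe (cm/min).

    ps_total: clearance slope of filter + cell monolayer (ml/min)
    ps_filter: clearance slope of an empty coated filter (ml/min)
    1/PSe = 1/PStotal - 1/PSfilter, then Pe = PSe / filter area."""
    ps_e = 1.0 / (1.0 / ps_total - 1.0 / ps_filter)
    return ps_e / area_cm2

# Illustrative slopes for a 12-well format filter (~1.12 cm2):
pe = endothelial_permeability(ps_total=0.00028, ps_filter=0.010,
                              area_cm2=1.12)
```

With these example numbers Pe comes out near 0.26 × 10^-3 cm/min, the order of magnitude reported for Lucifer yellow above; removing the filter's own resistance is what makes Pe a property of the monolayer alone.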
Linear Amplification Mediated PCR – Localization of Genetic Elements and Characterization of Unknown Flanking DNA
Authors: Richard Gabriel, Ina Kutschera, Cynthia C Bartholomae, Christof von Kalle, Manfred Schmidt.
Institutions: National Center for Tumor Diseases (NCT) and German Cancer Research Center (DKFZ).
Linear-amplification mediated PCR (LAM-PCR) has been developed to study hematopoiesis in gene-corrected cells of patients treated by gene therapy with integrating vector systems. Due to the stable integration of retroviral vectors, integration sites can be used to study the clonal fate of individual cells and their progeny. LAM-PCR for the first time provided evidence that leukemia in gene therapy treated patients originated from provirus-induced overexpression of a neighboring proto-oncogene. The high sensitivity and specificity of LAM-PCR compared to existing methods like inverse PCR and ligation mediated (LM)-PCR is achieved by an initial preamplification step (linear PCR of 100 cycles) using biotinylated vector-specific primers, which allows subsequent reaction steps to be carried out on solid phase (magnetic beads). LAM-PCR is currently the most sensitive method available to identify unknown DNA located in the proximity of known DNA. Recently, a variant of LAM-PCR has been developed that circumvents restriction digestion, thus abrogating the retrieval bias of integration sites and enabling a comprehensive analysis of provirus locations in host genomes. The following protocol explains step-by-step the amplification of both 3’- and 5’- sequences adjacent to the integrated lentiviral vector.
Genetics, Issue 88, gene therapy, integrome, integration site analysis, LAM-PCR, retroviral vectors, lentiviral vectors, AAV, deep sequencing, clonal inventory, mutagenesis screen
Determination of Protein-ligand Interactions Using Differential Scanning Fluorimetry
Authors: Mirella Vivoli, Halina R. Novak, Jennifer A. Littlechild, Nicholas J. Harmer.
Institutions: University of Exeter.
A wide range of methods are currently available for determining the dissociation constant between a protein and interacting small molecules. However, most of these require access to specialist equipment, and often require a degree of expertise to effectively establish reliable experiments and analyze data. Differential scanning fluorimetry (DSF) is being increasingly used as a robust method for initial screening of proteins for interacting small molecules, either for identifying physiological partners or for hit discovery. This technique has the advantage that it requires only a PCR machine suitable for quantitative PCR, and so suitable instrumentation is available in most institutions; an excellent range of protocols are already available; and there are strong precedents in the literature for multiple uses of the method. Past work has proposed several means of calculating dissociation constants from DSF data, but these are mathematically demanding. Here, we demonstrate a method for estimating dissociation constants from a moderate amount of DSF experimental data. These data can typically be collected and analyzed within a single day. We demonstrate how different models can be used to fit data collected from simple binding events, and where cooperative binding or independent binding sites are present. Finally, we present an example of data analysis in a case where standard models do not apply. These methods are illustrated with data collected on commercially available control proteins, and two proteins from our research program. Overall, our method provides a straightforward way for researchers to rapidly gain further insight into protein-ligand interactions using DSF.
Biophysics, Issue 91, differential scanning fluorimetry, dissociation constant, protein-ligand interactions, StepOne, cooperativity, WcbI.
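A crude version of the dissociation-constant estimation can be done with a single-site model and a grid search, avoiding specialist fitting software. This sketch uses synthetic melting-temperature data, not data from the article:

```python
def tm_model(conc, tm0, dtm_max, kd):
    """Single-site binding model for the melting temperature:
    Tm(L) = Tm0 + dTm_max * L / (Kd + L)."""
    return tm0 + dtm_max * conc / (kd + conc)

def fit_kd(concs, tms, tm0, dtm_max, kd_grid):
    """Grid-search the Kd minimizing the sum of squared residuals."""
    return min(kd_grid,
               key=lambda kd: sum((tm_model(c, tm0, dtm_max, kd) - t) ** 2
                                  for c, t in zip(concs, tms)))

# Synthetic DSF data generated with Kd = 50 uM, Tm0 = 55 C, dTm_max = 8 C:
concs = [1, 5, 10, 50, 100, 500]
tms = [tm_model(c, 55.0, 8.0, 50.0) for c in concs]
kd_est = fit_kd(concs, tms, 55.0, 8.0, [0.5 * i for i in range(1, 401)])
```

The grid search recovers Kd = 50 µM on this noiseless data; with real measurements a nonlinear least-squares fit over Tm0, dTm_max and Kd together is preferable, and the article's models for cooperative or multi-site binding require correspondingly richer equations.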
Enhanced Northern Blot Detection of Small RNA Species in Drosophila Melanogaster
Authors: Pietro Laneve, Angela Giangrande.
Institutions: Institut de Génétique et de Biologie Moléculaire et Cellulaire, Istituto Italiano di Tecnologia.
The last decades have witnessed an explosion of scientific interest in gene expression control mechanisms at the RNA level. This branch of molecular biology has been greatly fueled by the discovery of noncoding RNAs as major players in post-transcriptional regulation. Such a revolutionary perspective has been accompanied and triggered by the development of powerful technologies for profiling short RNA expression, both at the high-throughput level (genome-wide identification) and at the single-candidate level (steady-state accumulation of specific species). Although several state-of-the-art strategies are currently available for dosing or visualizing such fleeting molecules, the Northern blot assay remains a method of choice in molecular biology for the immediate and accurate evaluation of RNA expression. It represents a first step toward the application of more sophisticated, costly technologies and, in many cases, remains a preferential method to easily gain insights into RNA biology. Here we overview an efficient protocol (enhanced Northern blot) for detecting weakly expressed microRNAs (or other small regulatory RNA species) from Drosophila melanogaster whole embryos, manually dissected larval/adult tissues or in vitro cultured cells. A very limited amount of RNA is required, and the use of material from flow cytometry-isolated cells can also be envisaged.
Molecular Biology, Issue 90, Northern blotting, Noncoding RNAs, microRNAs, rasiRNA, Gene expression, Gcm/Glide, Drosophila melanogaster
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Creating Dynamic Images of Short-lived Dopamine Fluctuations with lp-ntPET: Dopamine Movies of Cigarette Smoking
Authors: Evan D. Morris, Su Jin Kim, Jenna M. Sullivan, Shuo Wang, Marc D. Normandin, Cristian C. Constantinescu, Kelly P. Cosgrove.
Institutions: Yale University, Massachusetts General Hospital, University of California, Irvine.
We describe experimental and statistical steps for creating dopamine movies of the brain from dynamic PET data. The movies represent minute-to-minute fluctuations of dopamine induced by smoking a cigarette. The smoker is imaged during a natural smoking experience while other possible confounding effects (such as head motion, expectation, novelty, or aversion to smoking repeatedly) are minimized. We present the details of our unique analysis. Conventional methods for PET analysis estimate time-invariant kinetic model parameters which cannot capture short-term fluctuations in neurotransmitter release. Our analysis - yielding a dopamine movie - is based on our work with kinetic models and other decomposition techniques that allow for time-varying parameters 1-7. This aspect of the analysis - temporal variation - is key to our work. Because our model is also linear in parameters, it is practical, computationally, to apply at the voxel level. The analysis technique comprises five main steps: pre-processing, modeling, statistical comparison, masking and visualization. Preprocessing is applied to the PET data with a unique 'HYPR' spatial filter 8 that reduces spatial noise but preserves critical temporal information. Modeling identifies the time-varying function that best describes the dopamine effect on 11C-raclopride uptake. The statistical step compares the fit of our (lp-ntPET) model 7 to a conventional model 9. Masking restricts treatment to those voxels best described by the new model. Visualization maps the dopamine function at each voxel to a color scale and produces a dopamine movie. Interim results and sample dopamine movies of cigarette smoking are presented.
Behavior, Issue 78, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Medicine, Anatomy, Physiology, Image Processing, Computer-Assisted, Receptors, Dopamine, Dopamine, Functional Neuroimaging, Binding, Competitive, mathematical modeling (systems analysis), Neurotransmission, transient, dopamine release, PET, modeling, linear, time-invariant, smoking, F-test, ventral-striatum, clinical techniques
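The statistical comparison step reduces, at each voxel, to an F-test between nested models. A minimal Python sketch; the parameter counts and residuals below are illustrative, not values from the study:

```python
def f_statistic(rss_null, rss_full, p_null, p_full, n_frames):
    """F statistic comparing a nested (conventional) kinetic model
    against a fuller (time-varying) model at one voxel.

    rss_*: residual sums of squares of each fit;
    p_*: number of parameters in each model;
    n_frames: number of PET time frames."""
    num = (rss_null - rss_full) / (p_full - p_null)
    den = rss_full / (n_frames - p_full)
    return num / den

# e.g. a 4-parameter conventional fit vs a 7-parameter time-varying fit
# over 60 PET frames (all numbers invented for illustration):
f = f_statistic(rss_null=120.0, rss_full=90.0, p_null=4, p_full=7,
                n_frames=60)
```

Voxels whose F value exceeds the chosen threshold enter the mask and are visualized; for these toy inputs f ≈ 5.89.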
Amplifying and Quantifying HIV-1 RNA in HIV Infected Individuals with Viral Loads Below the Limit of Detection by Standard Clinical Assays
Authors: Helene Mens, Mary Kearney, Ann Wiegand, Jonathan Spindler, Frank Maldarelli, John W. Mellors, John M. Coffin.
Institutions: NCI-Frederick, University of Pittsburgh, Tufts University.
Amplifying viral genes and quantifying HIV-1 RNA in HIV-1 infected individuals with viral loads below the limit of detection by standard assays (below 50-75 copies/ml) is necessary to gain insight into viral dynamics and virus-host interactions in patients who naturally control the infection and those who are on combination antiretroviral therapy (cART). Here we describe how to amplify viral genomes by single genome sequencing (the SGS protocol) 13, 19 and how to accurately quantify HIV-1 RNA in patients with low viral loads (the single-copy assay (SCA) protocol) 12, 20. The single-copy assay is a real-time PCR assay with sensitivity depending on the volume of plasma being assayed. If a single virus genome is detected in 7 ml of plasma, then the RNA copy number is reported to be 0.3 copies/ml. The assay has an internal control testing for the efficiency of RNA extraction, and controls for possible amplification from DNA or contamination. Patient samples are measured in triplicate. The single-genome sequencing assay (SGS), now widely used and considered to be non-labor intensive 3, 7, 12, 14, 15, is a limiting dilution assay, in which endpoint-diluted cDNA product is spread over a 96-well plate. According to a Poisson distribution, when less than 1/3 of the wells give product, there is an 80% chance that a PCR product is the result of amplification from a single cDNA molecule. SGS has the advantage over cloning of not being subject to resampling and not being biased by PCR-introduced recombination 19. However, the amplification success of SCA and SGS depends on primer design. Both assays were developed for HIV-1 subtype B, but can be adapted for other subtypes and other regions of the genome by changing primers, probes, and standards.
Immunology, Issue 55, single genome sequencing, SGS, real-time PCR, single-copy assay, SCA, HIV-1, ultra-sensitive, RNA extraction
Bioengineering Human Microvascular Networks in Immunodeficient Mice
Authors: Ruei-Zeng Lin, Juan M. Melero-Martin.
Institutions: Harvard Medical School.
The future of tissue engineering and cell-based therapies for tissue regeneration will likely rely on our ability to generate functional vascular networks in vivo. In this regard, the search for experimental models to build blood vessel networks in vivo is of utmost importance 1. The feasibility of bioengineering microvascular networks in vivo was first shown using human tissue-derived mature endothelial cells (ECs) 2-4; however, such autologous endothelial cells present problems for wide clinical use because they are difficult to obtain in sufficient quantities and require harvesting from existing vasculature. These limitations have instigated the search for other sources of ECs. The identification of endothelial colony-forming cells (ECFCs) in blood presented an opportunity to obtain ECs non-invasively 5-7. We and other authors have shown that adult and cord blood-derived ECFCs have the capacity to form functional vascular networks in vivo 7-11. Importantly, these studies have also shown that to obtain stable and durable vascular networks, ECFCs require co-implantation with perivascular cells. The assay we describe here illustrates this concept: we show how human cord blood-derived ECFCs can be combined with bone marrow-derived mesenchymal stem cells (MSCs) as a single cell suspension in a collagen/fibronectin/fibrinogen gel to form a functional human vascular network within 7 days after implantation into an immunodeficient mouse. The presence of human ECFC-lined lumens containing host erythrocytes throughout the implants indicates not only the de novo formation of a vascular network, but also the development of functional anastomoses with the host circulatory system. This murine model of a bioengineered human vascular network is ideally suited for studies of the cellular and molecular mechanisms of human vascular network formation and for the development of strategies to vascularize engineered tissues.
Bioengineering, Issue 53, vascular network, blood vessel, vasculogenesis, angiogenesis, endothelial progenitor cells, endothelial colony-forming cells, mesenchymal stem cells, collagen gel, fibrin gel, tissue engineering, regenerative medicine
A Faster, High Resolution, mtPA-GFP-based Mitochondrial Fusion Assay Acquiring Kinetic Data of Multiple Cells in Parallel Using Confocal Microscopy
Authors: Alenka Lovy, Anthony J.A. Molina, Fernanda M. Cerqueira, Kyle Trudeau, Orian S. Shirihai.
Institutions: Tufts School of Medicine, Wake Forest Baptist Medical Center, Boston University Medical Center.
Mitochondrial fusion plays an essential role in mitochondrial calcium homeostasis, bioenergetics, autophagy and quality control. Fusion is quantified in living cells by photo-conversion of matrix-targeted photoactivatable GFP (mtPAGFP) in a subset of mitochondria. The rate at which the photoconverted molecules equilibrate across the entire mitochondrial population is used as a measure of fusion activity. Thus far, measurements have been performed using a single-cell time-lapse approach, quantifying the equilibration in one cell over an hour. Here, we scale up and automate a previously published live cell method based on using mtPAGFP and a low concentration of TMRE (15 nM). This method involves photoactivating a small portion of the mitochondrial network, collecting highly resolved stacks of confocal sections every 15 min for 1 hour, and quantifying the change in signal intensity. Depending on several factors, such as the ease of finding PAGFP-expressing cells and the signal of the photoactivated regions, it is possible to collect around 10 cells within the 15 min intervals. This provides a significant improvement in the time efficiency of this assay while maintaining the highly resolved subcellular quantification as well as the kinetic parameters necessary to capture the detail of mitochondrial behavior in its native cytoarchitectural environment. Mitochondrial dynamics play a role in many cellular processes including respiration, calcium regulation, and apoptosis1,2,3,13. The structure of the mitochondrial network affects the function of mitochondria and the way they interact with the rest of the cell. Undergoing constant division and fusion, mitochondrial networks attain various shapes, ranging from highly fused networks to more fragmented ones. Interestingly, Alzheimer's disease, Parkinson's disease, Charcot-Marie-Tooth 2A, and dominant optic atrophy have been correlated with altered mitochondrial morphology, namely fragmented networks4,10,13.
Often, upon fragmentation, mitochondria become depolarized, and the accumulation of such mitochondria leads to impaired cell function18. Mitochondrial fission has been shown to signal a cell to progress toward apoptosis. It can also provide a mechanism by which to separate depolarized and inactive mitochondria to keep the bulk of the network robust14. Fusion of mitochondria, on the other hand, leads to sharing of matrix proteins, solutes, mtDNA and the electrochemical gradient, and also seems to prevent progression to apoptosis9. How fission and fusion of mitochondria affect cell homeostasis and ultimately the functioning of the organism needs further understanding, and therefore the continuous development and optimization of methods to gather information on these phenomena is necessary. Existing mitochondrial fusion assays have revealed various insights into mitochondrial physiology, each having its own advantages. The hybrid PEG fusion assay7 mixes two populations of differently labeled cells (mtRFP and mtYFP) and analyzes the amount of mixing and colocalization of fluorophores in fused, multinucleated cells. Although this method has yielded valuable information, not all cell types can fuse, and the conditions under which fusion is stimulated involve the use of toxic drugs that likely affect the normal fusion process. More recently, a cell-free technique has been devised, using isolated mitochondria to observe fusion events based on a luciferase assay1,5. Two human cell lines are targeted with either the amino- or the carboxy-terminal part of Renilla luciferase along with a leucine zipper to ensure dimerization upon mixing. Mitochondria are isolated from each cell line and fused. The fusion reaction can occur without the cytosol under physiological conditions in the presence of energy, appropriate temperature and inner mitochondrial membrane potential.
Interestingly, the cytosol was found to modulate the extent of fusion, demonstrating that cell signaling regulates the fusion process 4,5. This assay will be very useful for high-throughput screening to identify components of the fusion machinery and also pharmacological compounds that may affect mitochondrial dynamics. However, more detailed whole-cell mitochondrial assays will be needed to complement this in vitro assay to observe these events within a cellular environment. A technique for monitoring whole-cell mitochondrial dynamics has been in use for some time and is based on a mitochondrially targeted photoactivatable GFP (mtPAGFP)6,11. Upon expression of the mtPAGFP, a small portion of the mitochondrial network is photoactivated (10-20%), and the spread of the signal to the rest of the mitochondrial network is recorded every 15 minutes for 1 hour using time-lapse confocal imaging. Each fusion event leads to a dilution of signal intensity, enabling quantification of the fusion rate. Although fusion and fission are continuously occurring in cells, this technique only monitors fusion, as fission does not lead to a dilution of the PAGFP signal6. Co-labeling with low levels of TMRE (7-15 nM in INS1 cells) allows quantification of the membrane potential of mitochondria. When mitochondria are hyperpolarized they take up more TMRE, and when they depolarize they lose the TMRE dye. Mitochondria that depolarize no longer have a sufficient membrane potential and tend not to fuse efficiently, if at all. Therefore, actively fusing mitochondria can be tracked with these low levels of TMRE9,15. Accumulation of depolarized mitochondria that lack a TMRE signal may be a sign of phototoxicity or cell death. Higher concentrations of TMRE render mitochondria very sensitive to laser light, and therefore great care must be taken to avoid overlabeling with TMRE.
If the effect of depolarization of mitochondria is the topic of interest, a technique using slightly higher levels of TMRE and more intense laser light can be used to depolarize mitochondria in a controlled fashion (Mitra and Lippincott-Schwartz, 2010). To ensure that toxicity due to TMRE is not an issue, we suggest exposing loaded cells (3-15 nM TMRE) to the imaging parameters that will be used in the assay (perhaps 7 stacks of 6 optical sections in a row), and assessing cell health after 2 hours. If the mitochondria appear too fragmented and cells are dying, other mitochondrial markers, such as dsRED or MitoTracker Red, could be used instead of TMRE. The mtPAGFP method has revealed details about mitochondrial network behavior that could not be visualized using other methods. For example, we now know that mitochondrial fusion can be full or transient, where matrix content can mix without changing the overall network morphology. Additionally, we know that the probability of fusion is independent of contact duration and organelle dimension, but is influenced by organelle motility, membrane potential and the history of previous fusion activity8,15,16,17. In this manuscript, we describe a methodology for scaling up the previously published protocol using mtPAGFP and 15 nM TMRE8 in order to examine multiple cells at a time and improve the time efficiency of data collection without sacrificing subcellular resolution. This has been made possible by the use of an automated microscope stage and programmable image acquisition software. Zen software from Zeiss allows the user to mark and track several designated cells expressing mtPAGFP. Each of these cells can be photoactivated in a particular region of interest, and stacks of confocal slices can be monitored for mtPAGFP signal as well as TMRE at specified intervals.
Other confocal systems could be used to perform this protocol provided there is a programmable automated stage, an incubator with CO2, and a means by which to photoactivate the PAGFP: either a multiphoton laser or a 405 nm diode laser.
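The dilution-based readout described above can be reduced to a single apparent rate constant by fitting an exponential decay to the mean intensity of the photoactivated region at the 15-min timepoints. This is a simplified illustration with hypothetical numbers; the published quantification is more involved:

```python
import numpy as np

def fusion_rate(timepoints_min, roi_mean_intensity):
    """Estimate an apparent fusion (dilution) rate constant from the
    mean mtPAGFP intensity of the photoactivated region over time.
    Fits I(t) = I0 * exp(-k t) by log-linear least squares; larger k
    means faster equilibration of photoconverted GFP, i.e. more fusion."""
    t = np.asarray(timepoints_min, dtype=float)
    log_i = np.log(np.asarray(roi_mean_intensity, dtype=float))
    slope, _intercept = np.polyfit(t, log_i, 1)
    return -slope  # per minute

# Hypothetical mean intensities at the five acquisition timepoints:
t = [0, 15, 30, 45, 60]
intensity = [1000, 820, 672, 551, 452]
print(round(fusion_rate(t, intensity), 4))
```

Comparing this rate constant between conditions (e.g. drug-treated vs. control cells) is one simple way to summarize the equilibration curves the assay produces.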
Molecular Biology, Issue 65, Genetics, Cellular Biology, Physics, confocal microscopy, mitochondria, fusion, TMRE, mtPAGFP, INS1, mitochondrial dynamics, mitochondrial morphology, mitochondrial network
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Authors: Todd C. Lorenz.
Institutions: University of California, Los Angeles.
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq DNA polymerase. PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that complicate the reaction, producing spurious results. When PCR fails, it can generate many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction.
By following this PCR guide, students should be able to:
● Set up reactions and thermal cycling conditions for a conventional PCR experiment
● Understand the function of various reaction components and their overall effect on a PCR experiment
● Design and optimize a PCR experiment for any DNA template
● Troubleshoot failed PCR experiments
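For the primer-design and melting-temperature steps, two widely used rule-of-thumb estimates can be sketched as follows. These are rough approximations only; they ignore salt and primer concentration, which nearest-neighbor thermodynamic methods account for:

```python
def wallace_tm(primer):
    """Wallace rule for short oligos (< ~14 nt):
    Tm = 2(A+T) + 4(G+C), in degrees C."""
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def long_primer_tm(primer):
    """Common approximation for primers >= 14 nt:
    Tm = 64.9 + 41 * (G+C - 16.4) / N, in degrees C."""
    p = primer.upper()
    gc = p.count("G") + p.count("C")
    return 64.9 + 41.0 * (gc - 16.4) / len(p)

print(wallace_tm("ACGTACGTACGT"))                    # 12-mer: 2*6 + 4*6 = 36
print(round(long_primer_tm("ACGTACGTACGTACGTACGT"), 1))  # 20-mer, 50% GC
```

Such estimates are useful for a first pass at matching the Tm of a primer pair before refining annealing temperature empirically, as the troubleshooting strategies in the protocol describe.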
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
Affinity Purification of Influenza Virus Ribonucleoprotein Complexes from the Chromatin of Infected Cells
Authors: Geoffrey P. Chase, Martin Schwemmle.
Institutions: Universitätsklinikum Freiburg.
Like all negative-strand RNA viruses, the genome of influenza viruses is packaged in the form of viral ribonucleoprotein complexes (vRNP), in which the single-stranded genome is encapsidated by the nucleoprotein (NP), and associated with the trimeric polymerase complex consisting of the PA, PB1, and PB2 subunits. However, in contrast to most RNA viruses, influenza viruses perform viral RNA synthesis in the nuclei of infected cells. Interestingly, viral mRNA synthesis uses cellular pre-mRNAs as primers, and it has been proposed that this process takes place on chromatin1. Interactions between the viral polymerase and the host RNA polymerase II, as well as between NP and host nucleosomes have also been characterized1,2. Recently, the generation of recombinant influenza viruses encoding a One-Strep-Tag genetically fused to the C-terminus of the PB2 subunit of the viral polymerase (rWSN-PB2-Strep3) has been described. These recombinant viruses allow the purification of PB2-containing complexes, including vRNPs, from infected cells. To obtain purified vRNPs, cell cultures are infected, and vRNPs are affinity purified from lysates derived from these cells. However, the lysis procedures used to date have been based on one-step detergent lysis, which, despite the presence of a general nuclease, often extract chromatin-bound material only inefficiently. Our preliminary work suggested that a large portion of nuclear vRNPs were not extracted during traditional cell lysis, and therefore could not be affinity purified. To increase this extraction efficiency, and to separate chromatin-bound from non-chromatin-bound nuclear vRNPs, we adapted a step-wise subcellular extraction protocol to influenza virus-infected cells. Briefly, this procedure first separates the nuclei from the cell and then extracts soluble nuclear proteins (here termed the "nucleoplasmic" fraction). 
The remaining insoluble nuclear material is then digested with Benzonase, a nonspecific DNA/RNA nuclease, followed by two salt extraction steps: first using 150 mM NaCl (termed "ch150"), then 500 mM NaCl ("ch500") (Fig. 1). These salt extraction steps were chosen based on our observation that 500 mM NaCl was sufficient to solubilize over 85% of nuclear vRNPs yet still allow binding of tagged vRNPs to the affinity matrix. After subcellular fractionation of infected cells, it is possible to affinity purify PB2-tagged vRNPs from each individual fraction and analyze their protein and RNA components using Western blot and primer extension, respectively. Recently, we utilized this method to discover that vRNP export complexes form at late timepoints after infection on the chromatin fraction extracted with 500 mM NaCl (ch500)3.
Virology, Issue 64, Immunology, Molecular Biology, Influenza A virus, affinity purification, subcellular fractionation, chromatin, vRNP complexes, polymerase
Optical Recording of Suprathreshold Neural Activity with Single-cell and Single-spike Resolution
Authors: Gayathri Nattar Ranganathan, Helmut J. Koester.
Institutions: The University of Texas at Austin.
Signaling of information in the vertebrate central nervous system is often carried by populations of neurons rather than individual neurons. Propagation of suprathreshold spiking activity likewise involves populations of neurons. Empirical studies addressing cortical function directly thus require recordings from populations of neurons with high resolution. Here we describe an optical method and a deconvolution algorithm to record neural activity from up to 100 neurons with single-cell and single-spike resolution. This method relies on detection of the transient increases in intracellular somatic calcium concentration associated with suprathreshold electrical spikes (action potentials) in cortical neurons. High temporal resolution of the optical recordings is achieved by a fast random-access scanning technique using acousto-optical deflectors (AODs)1. Two-photon excitation of the calcium-sensitive dye results in high spatial resolution in opaque brain tissue2. Reconstruction of spikes from the fluorescence calcium recordings is achieved by a maximum-likelihood method. Simultaneous electrophysiological and optical recordings indicate that our method reliably detects spikes (>97% spike detection efficiency), has a low rate of false positive spike detection (<0.003 spikes/sec), and a high temporal precision (about 3 msec)3. This optical method of spike detection can be used to record neural activity in vitro and in anesthetized animals in vivo3,4.
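The intuition behind spike reconstruction from calcium traces can be conveyed with a much-simplified stand-in for the maximum-likelihood method (which models the indicator response explicitly): flag frames where the fluorescence trace jumps by more than a threshold, since each action potential produces a fast rise followed by a slow decay. All names and numbers below are illustrative:

```python
import numpy as np

def detect_spikes(dff, frame_rate_hz, jump_threshold=0.1, refractory_s=0.01):
    """Toy spike detection on a fluorescence (dF/F) trace: flag frames
    where the one-frame increase exceeds jump_threshold, enforcing a
    short refractory period between detections. A simplified stand-in
    for the maximum-likelihood reconstruction described in the text."""
    dff = np.asarray(dff, dtype=float)
    jumps = np.diff(dff)
    candidates = np.where(jumps > jump_threshold)[0] + 1
    refractory_frames = max(1, int(refractory_s * frame_rate_hz))
    spikes, last = [], -refractory_frames
    for idx in candidates:
        if idx - last >= refractory_frames:
            spikes.append(int(idx))
            last = idx
    return spikes

# Synthetic trace: baseline with two calcium transients (fast rise, slow decay)
trace = [0, 0, 0.3, 0.25, 0.2, 0.15, 0.1, 0.05, 0.4, 0.33, 0.27, 0.2]
print(detect_spikes(trace, frame_rate_hz=500))  # rises at frames 2 and 8
```

A likelihood-based approach improves on this by weighing each candidate transient against a template of the indicator's rise and decay kinetics, which is what gives the published method its high detection efficiency and temporal precision.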
Neuroscience, Issue 67, functional calcium imaging, spatiotemporal patterns of activity, dithered random-access scanning
Mapping Bacterial Functional Networks and Pathways in Escherichia coli using Synthetic Genetic Arrays
Authors: Alla Gagarinova, Mohan Babu, Jack Greenblatt, Andrew Emili.
Institutions: University of Toronto, University of Regina.
Phenotypes are determined by a complex series of physical (e.g. protein-protein) and functional (e.g. gene-gene or genetic) interactions (GI)1. While physical interactions can indicate which bacterial proteins are associated as complexes, they do not necessarily reveal pathway-level functional relationships1. GI screens, in which the growth of double mutants bearing two deleted or inactivated genes is measured and compared to the corresponding single mutants, can illuminate epistatic dependencies between loci and hence provide a means to query and discover novel functional relationships2. Large-scale GI maps have been reported for eukaryotic organisms like yeast3-7, but GI information remains sparse for prokaryotes8, which hinders the functional annotation of bacterial genomes. To this end, we and others have developed high-throughput quantitative bacterial GI screening methods9, 10. Here, we present the key steps required to perform the quantitative E. coli Synthetic Genetic Array (eSGA) screening procedure on a genome scale9, using natural bacterial conjugation and homologous recombination to systematically generate and measure the fitness of large numbers of double mutants in a colony array format. Briefly, a robot is used to transfer, through conjugation, chloramphenicol (Cm)-marked mutant alleles from engineered Hfr (high frequency of recombination) 'donor strains' into an ordered array of kanamycin (Kan)-marked F- recipient strains. Typically, we use loss-of-function single mutants bearing non-essential gene deletions (e.g. the 'Keio' collection11) and essential gene hypomorphic mutations (i.e. alleles conferring reduced protein expression, stability, or activity9, 12, 13) to query the functional associations of non-essential and essential genes, respectively. After conjugation and the ensuing genetic exchange mediated by homologous recombination, the resulting double mutants are selected on solid medium containing both antibiotics.
After outgrowth, the plates are digitally imaged and colony sizes are quantitatively scored using an in-house automated image processing system14. GIs are revealed when the growth rate of a double mutant is either significantly better or worse than expected9. Aggravating (or negative) GIs often result between loss-of-function mutations in pairs of genes from compensatory pathways that impinge on the same essential process2. Here, the loss of a single gene is buffered, such that either single mutant is viable. However, the loss of both pathways is deleterious and results in synthetic lethality or sickness (i.e. slow growth). Conversely, alleviating (or positive) interactions can occur between genes in the same pathway or protein complex2 as the deletion of either gene alone is often sufficient to perturb the normal function of the pathway or complex such that additional perturbations do not reduce activity, and hence growth, further. Overall, systematically identifying and analyzing GI networks can provide unbiased, global maps of the functional relationships between large numbers of genes, from which pathway-level information missed by other approaches can be inferred9.
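The notions of aggravating and alleviating interactions can be made concrete with the commonly used multiplicative null model. This is a generic sketch with hypothetical fitness values; the in-house scoring pipeline cited above is more elaborate:

```python
def gi_score(fitness_a, fitness_b, fitness_double):
    """Genetic-interaction score under the common multiplicative model:
    eps = W_ab - W_a * W_b, with fitness normalized to wild type (1.0).
    eps < 0: aggravating (synthetic sick/lethal, compensatory pathways);
    eps > 0: alleviating (same pathway or complex); eps ~ 0: no interaction."""
    return fitness_double - fitness_a * fitness_b

# Hypothetical colony-size fitness values normalized to wild type:
print(round(gi_score(0.8, 0.9, 0.10), 2))  # aggravating: 0.10 - 0.72 = -0.62
print(round(gi_score(0.5, 0.6, 0.50), 2))  # alleviating: 0.50 - 0.30 = 0.2
```

In practice the single-mutant fitness values come from the same colony-size imaging used for the double mutants, so the expected value and the deviation from it are estimated within one screen.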
Genetics, Issue 69, Molecular Biology, Medicine, Biochemistry, Microbiology, Aggravating, alleviating, conjugation, double mutant, Escherichia coli, genetic interaction, Gram-negative bacteria, homologous recombination, network, synthetic lethality or sickness, suppression
Constant Pressure-controlled Extrusion Method for the Preparation of Nano-sized Lipid Vesicles
Authors: Leslie A. Morton, Jonel P. Saludes, Hang Yin.
Institutions: University of Colorado Boulder.
Liposomes are artificially prepared vesicles consisting of natural and synthetic phospholipids that are widely used as a cell membrane-mimicking platform to study protein-protein and protein-lipid interactions3, monitor drug delivery4,5, and encapsulation4. Phospholipids naturally form curved lipid bilayers, distinguishing liposomes from micelles.6 Liposomes are traditionally classified by size and number of bilayers, i.e. large unilamellar vesicles (LUVs), small unilamellar vesicles (SUVs) and multilamellar vesicles (MLVs)7. In particular, the preparation of homogeneous liposomes of various sizes is important for studying membrane curvature, which plays a vital role in cell signaling, endo- and exocytosis, membrane fusion, and protein trafficking8. Several groups analyze how proteins modulate processes that involve membrane curvature and thus prepare liposomes of diameters <100 - 400 nm to study their behavior on cell functions3. Others focus on liposome-drug encapsulation, studying liposomes as vehicles to carry and deliver a drug of interest9. Drug encapsulation can be achieved as reported during liposome formation9. Our extrusion step should not affect the encapsulated drug for two reasons: (1) drug encapsulation is achieved prior to this step, and (2) liposomes retain their natural biophysical stability, securely carrying the drug in the aqueous core. These research goals further suggest the need for an optimized method to design stable sub-micron lipid vesicles. Nonetheless, the current liposome preparation technologies (sonication10, freeze-and-thaw10, sedimentation) do not allow preparation of liposomes with highly curved surfaces (i.e. diameter <100 nm) with high consistency and efficiency10,5, which limits biophysical studies in the emerging field of membrane curvature sensing. Herein, we present a robust preparation method for a variety of biologically relevant liposomes.
Manual extrusion using gas-tight syringes and polycarbonate membranes10,5 is a common practice, but heterogeneity is often observed when using pore sizes <100 nm due to the variability of the manual pressure applied. We employed a constant pressure-controlled extrusion apparatus to prepare synthetic liposomes whose diameters range between 30 and 400 nm. Dynamic light scattering (DLS)10, electron microscopy11 and nanoparticle tracking analysis (NTA)12 were used to quantify the liposome sizes as described in our protocol, with commercial polystyrene (PS) beads used as a calibration standard. A near linear correlation was observed between the employed pore sizes and the experimentally determined liposome diameters, indicating high fidelity of our pressure-controlled liposome preparation method. Further, we have shown that this lipid vesicle preparation method is generally applicable across a range of liposome sizes. Lastly, we have also demonstrated in a time course study that the prepared liposomes were stable for up to 16 hours. A representative nano-sized liposome preparation protocol is demonstrated below.
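The "near linear correlation" between pore size and measured diameter can be quantified with an ordinary least-squares fit. The diameters below are hypothetical stand-ins for DLS measurements, used only to show the calculation:

```python
import numpy as np

def fit_pore_vs_diameter(pore_nm, measured_nm):
    """Least-squares line through measured liposome diameter vs.
    extrusion membrane pore size; returns (slope, intercept, r).
    A slope near 1 and r near 1 indicate high-fidelity sizing."""
    slope, intercept = np.polyfit(pore_nm, measured_nm, 1)
    r = np.corrcoef(pore_nm, measured_nm)[0, 1]
    return slope, intercept, r

# Pore sizes from the protocol's range, with hypothetical DLS diameters:
pores = [30, 50, 100, 200, 400]
diam = [42, 58, 105, 198, 390]
slope, intercept, r = fit_pore_vs_diameter(pores, diam)
print(round(slope, 2), round(r, 3))
```

With real DLS or NTA data, the same fit (slope, intercept, and correlation coefficient) provides a compact summary of how faithfully the extrusion pore size dictates the final vesicle diameter.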
Bioengineering, Issue 64, Biomedical Engineering, Liposomes, particle extrusion, nano-sized vesicles, dynamic light scattering (DLS), nanoparticle tracking analysis (NTA)
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Applications of EEG Neuroimaging Data: Event-related Potentials, Spectral Power, and Multiscale Entropy
Authors: Jennifer J. Heisz, Anthony R. McIntosh.
Institutions: Baycrest.
When considering human neuroimaging data, an appreciation of signal variability represents a fundamental innovation in the way we think about brain signal. Typically, researchers represent the brain's response as the mean across repeated experimental trials and disregard signal fluctuations over time as "noise". However, it is becoming clear that brain signal variability conveys meaningful functional information about neural network dynamics. This article describes the novel method of multiscale entropy (MSE) for quantifying brain signal variability. MSE may be particularly informative of neural network dynamics because it shows timescale dependence and sensitivity to linear and nonlinear dynamics in the data.
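A minimal sketch of the MSE computation: coarse-grain the signal by non-overlapping window averages at each timescale, then take the sample entropy of each coarse-grained series, with the tolerance r fixed from the original signal's standard deviation. Parameter choices (m = 2, r = 0.15 SD) are conventional but illustrative:

```python
import numpy as np

def sample_entropy(x, m, r):
    """SampEn(m, r): negative log of the conditional probability that
    sequences matching for m points (Chebyshev distance <= r) also
    match for m + 1 points; self-matches are excluded."""
    x = np.asarray(x, dtype=float)
    def count_matches(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        hits = 0
        for i in range(len(templates) - 1):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            hits += int(np.sum(dist <= r))
        return hits
    return -np.log(count_matches(m + 1) / count_matches(m))

def multiscale_entropy(x, scales=(1, 2, 3, 4, 5), m=2, r_factor=0.15):
    """MSE curve: sample entropy of the coarse-grained series at each
    timescale, with tolerance fixed at r_factor * SD of the original."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()
    curve = []
    for s in scales:
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)
        curve.append(sample_entropy(coarse, m, r))
    return curve

# White noise: its variability is uninformative, so entropy falls off
# quickly with timescale; temporally structured signals decay more slowly.
rng = np.random.default_rng(0)
mse = multiscale_entropy(rng.normal(size=1000))
print([round(v, 2) for v in mse])
```

The shape of the curve across timescales, rather than any single value, is what distinguishes "meaningful" variability from noise: white noise yields a monotonically falling curve, while signals with long-range temporal structure hold their entropy across scales.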
Neuroscience, Issue 76, Neurobiology, Anatomy, Physiology, Medicine, Biomedical Engineering, Electroencephalography, EEG, electroencephalogram, Multiscale entropy, sample entropy, MEG, neuroimaging, variability, noise, timescale, non-linear, brain signal, information theory, brain, imaging
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
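The first analysis step rests on Gabor filters: oriented band-pass kernels that respond strongly to linear tissue structures at a chosen angle. A generic real-valued Gabor kernel can be built directly; the parameters below are illustrative, not those tuned for mammographic analysis:

```python
import numpy as np

def gabor_kernel(wavelength, theta, sigma, half_size, gamma=0.5):
    """Real (cosine) Gabor kernel: a Gaussian envelope times a cosine
    grating at orientation theta. gamma sets the envelope aspect ratio,
    elongating the filter along the preferred orientation. The kernel
    is mean-subtracted so flat regions give zero response."""
    ax = np.arange(-half_size, half_size + 1)
    xx, yy = np.meshgrid(ax, ax)
    xr = xx * np.cos(theta) + yy * np.sin(theta)    # rotated coordinates
    yr = -xx * np.sin(theta) + yy * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    kernel = envelope * carrier
    return kernel - kernel.mean()                   # zero DC response

k = gabor_kernel(wavelength=8, theta=np.pi / 4, sigma=4, half_size=10)
print(k.shape)  # (21, 21)
```

Convolving an image with a bank of such kernels at many angles and taking, per pixel, the angle of maximum response yields the orientation field that the phase-portrait analysis then searches for radiating, node-like patterns.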
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. 
In summary, the combined protocol strategy can be readily applied to quantitative TATS network studies of physiological myocyte adaptation or disease-related changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
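The binarize-skeletonize-quantify workflow described in the abstract can be illustrated with a minimal sketch. This is not the authors' published pipeline: it assumes only that a skeletonized network can be represented as a set of pixel coordinates, and both the cross-shaped test network and the `analyze_skeleton` helper are hypothetical stand-ins for what dedicated image-analysis tools would produce from a thresholded and thinned fluorescence image.

```python
# Hypothetical sketch: quantifying a skeletonized tubule network given as
# a set of (x, y) pixel coordinates. In a real workflow the skeleton would
# come from binarizing and thinning a confocal image; here we hand-code a
# small cross (one transverse tubule crossed by one axial tubule).

def analyze_skeleton(pixels):
    """Classify skeleton pixels by their 4-connected neighbor count."""
    pixels = set(pixels)

    def neighbors(p):
        x, y = p
        return sum((x + dx, y + dy) in pixels
                   for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)])

    endpoints = [p for p in pixels if neighbors(p) == 1]
    branches = [p for p in pixels if neighbors(p) >= 3]
    return {"length_px": len(pixels),      # total skeleton length in pixels
            "endpoints": len(endpoints),   # free tubule ends
            "branch_points": len(branches)}  # network junctions

# Schematic cross-shaped network on a 5x5 grid:
cross = [(x, 2) for x in range(5)] + [(2, y) for y in range(5)]
print(analyze_skeleton(cross))
# → {'length_px': 9, 'endpoints': 4, 'branch_points': 1}
```

The same neighbor-counting idea underlies skeleton-graph analysis in open-source imaging libraries; real data would additionally need calibration from pixels to micrometers.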
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
Designing and Implementing Nervous System Simulations on LEGO Robots
Authors: Daniel Blustein, Nikolai Rosenthal, Joseph Ayers.
Institutions: Northeastern University, Bremen University of Applied Sciences.
We present a method to use the commercially available LEGO Mindstorms NXT robotics platform to test systems-level neuroscience hypotheses. The first step of the method is to develop a nervous system simulation of specific reflexive behaviors of an appropriate model organism; here we use the American lobster. Exteroceptive reflexes mediated by decussating (crossing) neural connections can explain an animal's taxis towards or away from a stimulus, as described by Braitenberg, and are particularly well suited for investigation using the NXT platform.1 The nervous system simulation is programmed using LabVIEW software on the LEGO Mindstorms platform. Once the nervous system is tuned properly, behavioral experiments are run on the robot and on the animal under identical environmental conditions. By controlling the sensory milieu experienced by the specimens, differences in behavioral outputs can be observed. These differences may point to specific deficiencies in the nervous system model and serve to inform the iteration of the model for the particular behavior under study. This method allows for the experimental manipulation of electronic nervous systems and serves as a way to explore neuroscience hypotheses, specifically regarding the neurophysiological basis of simple innate reflexive behaviors. The LEGO Mindstorms NXT kit provides an affordable and efficient platform on which to test preliminary biomimetic robot control schemes. The approach is also well suited for the high school classroom, serving as the foundation for a hands-on, inquiry-based biorobotics curriculum.
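The decussating-reflex idea in this abstract can be sketched as a Braitenberg-style vehicle: crossing each sensor's excitatory connection to the opposite motor makes the robot turn toward a stimulus (positive taxis). The simulation below is a hypothetical illustration, not the authors' LabVIEW implementation; the sensor geometry, gain, and update rule are all assumptions chosen for clarity.

```python
import math

# Hypothetical Braitenberg-style vehicle with decussating (crossed)
# excitatory sensor-to-motor connections; the stimulus source is fixed
# at the origin.

def intensity(x, y):
    """Stimulus strength at (x, y), falling off with squared distance."""
    return 1.0 / (1.0 + x * x + y * y)

def step(x, y, heading, dt=0.2, offset=0.3, gain=5.0):
    """One update of a differential-drive vehicle with crossed wiring."""
    # Two sensors mounted 0.5 rad to the left and right of the heading.
    lx = x + offset * math.cos(heading + 0.5)
    ly = y + offset * math.sin(heading + 0.5)
    rx = x + offset * math.cos(heading - 0.5)
    ry = y + offset * math.sin(heading - 0.5)
    # Decussating connections: the LEFT sensor excites the RIGHT motor
    # and vice versa, so the brighter side turns the vehicle toward it.
    right_motor = gain * intensity(lx, ly)
    left_motor = gain * intensity(rx, ry)
    speed = 0.5 * (left_motor + right_motor)
    turn = right_motor - left_motor  # differential drive turning rate
    heading += turn * dt
    x += speed * math.cos(heading) * dt
    y += speed * math.sin(heading) * dt
    return x, y, heading

# Robot at (0, -2) facing +x: the stimulus lies to its left, so one
# update should increase the heading (a left turn, toward the source).
_, _, new_heading = step(0.0, -2.0, 0.0)
print(new_heading > 0.0)  # True
```

Uncrossing the connections (each sensor driving its own side's motor) flips the sign of `turn` and yields negative taxis, the avoidance behavior Braitenberg contrasts with approach.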
Neuroscience, Issue 75, Neurobiology, Bioengineering, Behavior, Mechanical Engineering, Computer Science, Marine Biology, Biomimetics, Marine Science, Neurosciences, Synthetic Biology, Robotics, robots, Modeling, models, Sensory Fusion, nervous system, Educational Tools, programming, software, lobster, Homarus americanus, animal model
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X
