Pubmed Article
Performance enhancement of MC-CDMA system through novel sensitive bit algorithm aided turbo multi user detection.
PUBLISHED: 02-26-2015
Multi carrier code division multiple access (MC-CDMA) is a promising multi carrier modulation (MCM) technique for high data rate wireless communication over frequency selective fading channels. The MC-CDMA system is a combination of code division multiple access (CDMA) and orthogonal frequency division multiplexing (OFDM). The OFDM part reduces multipath fading and inter symbol interference (ISI), and the CDMA part increases spectrum utilization. Advantages of this technique are its robustness to multipath propagation and improved security with minimized ISI. Nevertheless, due to the loss of orthogonality at the receiver in a mobile environment, multiple access interference (MAI) appears. MAI is one of the factors that degrade the bit error rate (BER) performance of MC-CDMA systems. Multiuser detection (MUD) and turbo coding are the two dominant techniques for enhancing the BER performance of MC-CDMA systems by overcoming MAI effects. In this paper, a low complexity iterative soft sensitive bits algorithm (SBA) aided logarithmic maximum a-posteriori (Log-MAP) turbo MUD is proposed. Simulation results show that the proposed method provides better BER performance with low complexity decoding, by mitigating the detrimental effects of MAI.
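The MAI problem described above comes from losing code orthogonality at the receiver. As a minimal, illustrative sketch (not the paper's SBA-aided turbo MUD), the following shows how two users sharing a channel with orthogonal Walsh-Hadamard spreading codes can be separated perfectly by correlation; it is exactly this separation that breaks down in a frequency-selective mobile channel:

```python
import numpy as np

def walsh_hadamard(n):
    """Build an n x n Walsh-Hadamard matrix (n a power of 2) by Sylvester's construction."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

H = walsh_hadamard(4)
code_a, code_b = H[1], H[2]                 # orthogonal spreading codes
symbol_a, symbol_b = 1, -1                  # BPSK symbols for users A and B
composite = symbol_a * code_a + symbol_b * code_b   # superposed chips on the channel

# Over an ideal channel, correlating with each user's code removes the other user.
est_a = composite @ code_a / len(code_a)
est_b = composite @ code_b / len(code_b)
```

With an ideal channel, `est_a` and `est_b` recover the transmitted symbols exactly; any distortion that breaks the orthogonality of the codes leaks one user's symbol into the other's correlator output, which is the MAI the proposed turbo MUD mitigates.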
Authors: Jacopo Tessadori, Michela Chiappalone.
Published: 03-02-2015
Information coding in the Central Nervous System (CNS) remains unexplored. There is mounting evidence that, even at a very low level, the representation of a given stimulus might be dependent on context and history. If this is actually the case, bi-directional interactions between the brain (or, if need be, a reduced model of it) and the sensory-motor system can shed light on how the encoding and decoding of information are performed. Here an experimental system is introduced and described in which the activity of a neuronal element (i.e., a network of neurons extracted from embryonic mammalian hippocampi) is given context and used to control the movement of an artificial agent, while environmental information is fed back to the culture as a sequence of electrical stimuli. This architecture allows quick selection of diverse encoding, decoding, and learning algorithms to test different hypotheses on the computational properties of neuronal networks.
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken on average 15 months before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
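The orientation analysis at the heart of the method above can be illustrated in miniature. The sketch below is not the authors' pipeline (which uses a multi-scale Gabor filter bank plus phase portraits); it builds a single-scale complex Gabor kernel, applies it to a synthetic striped patch at a set of candidate orientations, and picks the orientation with the strongest response. All kernel parameters are arbitrary example values.

```python
import numpy as np

def gabor_kernel(theta, ksize=21, sigma=4.0, lam=8.0):
    """Complex Gabor kernel: a Gaussian window times a plane wave along angle theta."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)      # coordinate along the wave direction
    return np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.exp(2j * np.pi * xr / lam)

# Synthetic patch: stripes varying along the 45-degree direction, with
# wavelength matched to the kernel (8 pixels along that direction).
yy, xx = np.mgrid[0:64, 0:64]
image = np.cos(2 * np.pi * (xx + yy) / (8 * np.sqrt(2)))
patch = image[22:43, 22:43]                          # 21 x 21 window

# Filter-bank response at the patch; the best-matching orientation wins.
angles = np.deg2rad(np.arange(0, 180, 15))
responses = [np.abs(np.sum(patch * gabor_kernel(a))) for a in angles]
best_angle_deg = 15 * int(np.argmax(responses))
```

In the real method, the per-pixel dominant orientation over the whole mammogram feeds the phase-portrait analysis that flags node-like sites.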
Measuring Diffusion Coefficients via Two-photon Fluorescence Recovery After Photobleaching
Authors: Kelley D. Sullivan, Edward B. Brown.
Institutions: University of Rochester.
Multiphoton fluorescence recovery after photobleaching (MP-FRAP) is a microscopy technique used to measure the diffusion coefficient (or analogous transport parameters) of macromolecules, and can be applied to both in vitro and in vivo biological systems. MP-FRAP is performed by photobleaching a region of interest within a fluorescent sample using an intense laser flash, then attenuating the beam and monitoring the fluorescence as still-fluorescent molecules from outside the region of interest diffuse in to replace the photobleached molecules. We will begin our demonstration by aligning the laser beam through the Pockels cell (laser modulator) and along the optical path through the laser scan box and objective lens to the sample. For simplicity, we will use a sample of aqueous fluorescent dye. We will then determine the proper experimental parameters for our sample, including monitor and bleaching powers, bleach duration, bin widths (for photon counting), and fluorescence recovery time. Next, we will describe the procedure for taking recovery curves, a process that can be largely automated via LabVIEW (National Instruments, Austin, TX) for enhanced throughput. Finally, the diffusion coefficient is determined by fitting the recovery data to the appropriate mathematical model using a least-squares fitting algorithm, readily programmable using software such as MATLAB (The Mathworks, Natick, MA).
Cellular Biology, Issue 36, Diffusion, fluorescence recovery after photobleaching, MP-FRAP, FPR, multi-photon
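The final fitting step can be sketched as follows. The block below fits a simple single-component exponential recovery model to synthetic data, purely for illustration: the actual MP-FRAP analysis fits a multi-photon diffusion recovery model, and the paper does the fit in MATLAB rather than Python. The spot radius `w` and all parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def recovery(t, f_inf, f0, tau):
    """Illustrative recovery model: fluorescence rises from f0 back toward f_inf."""
    return f_inf - (f_inf - f0) * np.exp(-t / tau)

rng = np.random.default_rng(0)
t = np.linspace(0.01, 5.0, 200)                 # seconds after the bleach flash
data = recovery(t, 1.0, 0.4, 0.8) + rng.normal(0, 0.01, t.size)  # noisy synthetic curve

# Least-squares fit, recovering the characteristic recovery time tau.
(f_inf, f0, tau), _ = curve_fit(recovery, t, data, p0=[1.0, 0.5, 1.0])

# A recovery time tau and a bleach-spot radius w give a diffusion-coefficient
# estimate of the familiar form D ~ w^2 / (4 tau); w here is a made-up value.
w = 0.5e-6                                       # meters
D = w**2 / (4 * tau)
```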
High-throughput Image Analysis of Tumor Spheroids: A User-friendly Software Application to Measure the Size of Spheroids Automatically and Accurately
Authors: Wenjin Chen, Chung Wong, Evan Vosburgh, Arnold J. Levine, David J. Foran, Eugenia Y. Xu.
Institutions: Raymond and Beverly Sackler Foundation, New Jersey; Rutgers University; Institute for Advanced Study, New Jersey.
The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application – SpheroidSizer – which measures the major and minor axial lengths of the imaged 3D tumor spheroids automatically and accurately, calculates the volume of each individual 3D tumor spheroid, and then outputs the results in two different forms in spreadsheets for easy manipulation in subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images, providing a high-throughput computation and quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and noisy background that often plague automated image processing in high-throughput screens. The complementary “Manual Initialize” and “Hand Draw” tools give SpheroidSizer the flexibility to deal with various types of spheroids and diverse image quality. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process.
Implementing this software should help 3D tumor spheroids become a routine in vitro model for drug screens in industry and academia.
Cancer Biology, Issue 89, computer programming, high-throughput, image analysis, tumor spheroids, 3D, software application, cancer therapy, drug screen, neuroendocrine tumor cell line, BON-1, cancer research
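Once the major and minor axial lengths are measured, a volume follows from an ellipsoid-of-revolution approximation. The formula below is a common convention; the exact expression SpheroidSizer implements may differ.

```python
import math

def spheroid_volume(major, minor):
    """Volume of an ellipsoid of revolution, taking the measured major and minor
    axial lengths as diameters: V = (4/3) * pi * (major/2) * (minor/2)^2."""
    return (4.0 / 3.0) * math.pi * (major / 2.0) * (minor / 2.0) ** 2

# Example: a spheroid measuring 400 um by 300 um across.
volume_um3 = spheroid_volume(400.0, 300.0)
```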
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam scanning electron microscopy (FIB-SEM) of mildly stained samples, and serial block face scanning electron microscopy (SBF-SEM) of heavily stained samples. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful.
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
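A semi-automated route of the kind listed in category (3) can be shown in miniature: a global threshold followed by connected-component labeling and a size filter. This toy example on synthetic 2D data stands in for the real, far harder 3D problem; all values are illustrative.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)

# Synthetic 2D "slice": two bright rectangular features on a noisy background.
img = rng.normal(0.1, 0.05, (64, 64))
img[10:20, 10:20] += 1.0
img[40:55, 30:45] += 1.0

# Threshold, label connected components, then keep only sufficiently large ones.
mask = img > 0.5
labels, n = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, range(1, n + 1))
keep = [i + 1 for i, s in enumerate(sizes) if s >= 50]
```

Real EM volumes rarely admit a single global threshold, which is exactly why the triage among manual, semi-automated, and custom automated approaches matters.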
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues changes dramatically over development [3]. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis.
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Reduced-gravity Environment Hardware Demonstrations of a Prototype Miniaturized Flow Cytometer and Companion Microfluidic Mixing Technology
Authors: William S. Phipps, Zhizhong Yin, Candice Bae, Julia Z. Sharpe, Andrew M. Bishara, Emily S. Nelson, Aaron S. Weaver, Daniel Brown, Terri L. McKay, DeVon Griffin, Eugene Y. Chan.
Institutions: DNA Medicine Institute, Harvard Medical School, NASA Glenn Research Center, ZIN Technologies.
Until recently, astronaut blood samples were collected in-flight, transported to Earth on the Space Shuttle, and analyzed in terrestrial laboratories. If humans are to travel beyond low Earth orbit, a transition towards space-ready, point-of-care (POC) testing is required. Such testing needs to be comprehensive, easy to perform in a reduced-gravity environment, and unaffected by the stresses of launch and spaceflight. Countless POC devices have been developed to mimic laboratory scale counterparts, but most have narrow applications and few have demonstrable use in an in-flight, reduced-gravity environment. In fact, demonstrations of biomedical diagnostics in reduced gravity are limited altogether, making component choice and certain logistical challenges difficult to approach when seeking to test new technology. To help fill the void, we are presenting a modular method for the construction and operation of a prototype blood diagnostic device and its associated parabolic flight test rig that meet the standards for flight-testing onboard a parabolic flight, reduced-gravity aircraft. The method first focuses on rig assembly for in-flight, reduced-gravity testing of a flow cytometer and a companion microfluidic mixing chip. Components are adaptable to other designs and some custom components, such as a microvolume sample loader and the micromixer, may be of particular interest. The method then shifts focus to flight preparation, by offering guidelines and suggestions to prepare for a successful flight test with regard to user training, development of a standard operating procedure (SOP), and other issues. Finally, in-flight experimental procedures specific to our demonstrations are described.
Cellular Biology, Issue 93, Point-of-care, prototype, diagnostics, spaceflight, reduced gravity, parabolic flight, flow cytometry, fluorescence, cell counting, micromixing, spiral-vortex, blood mixing
Tracking the Mammary Architectural Features and Detecting Breast Cancer with Magnetic Resonance Diffusion Tensor Imaging
Authors: Noam Nissan, Edna Furman-Haran, Myra Feinberg-Shapiro, Dov Grobgeld, Erez Eyal, Tania Zehavi, Hadassa Degani.
Institutions: Weizmann Institute of Science; Meir Medical Center.
Breast cancer is the most common cause of cancer among women worldwide. Early detection of breast cancer has a critical role in improving the quality of life and survival of breast cancer patients. In this paper a new approach for the detection of breast cancer is described, based on tracking the mammary architectural elements using diffusion tensor imaging (DTI). The paper focuses on the scanning protocols and image processing algorithms and software that were designed to fit the diffusion properties of the mammary fibroglandular tissue and its changes during malignant transformation. The final output yields pixel by pixel vector maps that track the architecture of the entire mammary ductal glandular trees and parametric maps of the diffusion tensor coefficients and anisotropy indices. The efficiency of the method to detect breast cancer was tested by scanning women volunteers including 68 patients with breast cancer confirmed by histopathology findings. Regions with cancer cells exhibited a marked reduction in the diffusion coefficients and in the maximal anisotropy index as compared to the normal breast tissue, providing an intrinsic contrast for delineating the boundaries of malignant growth. Overall, the sensitivity of the DTI parameters to detect breast cancer was found to be high, particularly in dense breasts, and comparable to the current standard breast MRI method that requires injection of a contrast agent. Thus, this method offers a completely non-invasive, safe and sensitive tool for breast cancer detection.
Medicine, Issue 94, Magnetic Resonance Imaging, breast, breast cancer, diagnosis, water diffusion, diffusion tensor imaging
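The parametric maps described above derive from the eigenvalues of a per-voxel diffusion tensor. As a sketch, the block below eigen-decomposes a hypothetical 3x3 tensor and computes fractional anisotropy (FA), a standard anisotropy index; the maximal anisotropy index used in the paper is a related but distinct quantity.

```python
import numpy as np

def tensor_metrics(D):
    """Eigenvalues, mean diffusivity, and fractional anisotropy of a 3x3 tensor."""
    lam = np.linalg.eigvalsh(D)                          # ascending eigenvalues
    md = lam.mean()                                      # mean diffusivity
    fa = np.sqrt(1.5 * np.sum((lam - md) ** 2) / np.sum(lam ** 2))
    return lam, md, fa

# Hypothetical tensor (units of 1e-3 mm^2/s): diffusion twice as fast along one
# axis, as might occur along a mammary duct; cancerous regions show reduced
# diffusivity and anisotropy relative to such normal tissue.
D = np.diag([2.0, 1.0, 1.0])
lam, md, fa = tensor_metrics(D)
```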
Isolation and Quantitative Immunocytochemical Characterization of Primary Myogenic Cells and Fibroblasts from Human Skeletal Muscle
Authors: Chibeza C. Agley, Anthea M. Rowlerson, Cristiana P. Velloso, Norman L. Lazarus, Stephen D. R. Harridge.
Institutions: King's College London, Cambridge Stem Cell Institute.
The repair and regeneration of skeletal muscle requires the action of satellite cells, which are the resident muscle stem cells. These can be isolated from human muscle biopsy samples using enzymatic digestion and their myogenic properties studied in culture. Quantitatively, the two main adherent cell types obtained from enzymatic digestion are: (i) the satellite cells (termed myogenic cells or muscle precursor cells), identified initially as CD56+ and later as CD56+/desmin+ cells, and (ii) muscle-derived fibroblasts, identified as CD56−/TE-7+. Fibroblasts proliferate very efficiently in culture, and in mixed cell populations these cells may overrun myogenic cells to dominate the culture. The isolation and purification of different cell types from human muscle is thus an important methodological consideration when trying to investigate the innate behavior of either cell type in culture. Here we describe a system of sorting based on the gentle enzymatic digestion of cells using collagenase and dispase followed by magnetic activated cell sorting (MACS), which gives both a high purity (>95% myogenic cells) and a good yield (~2.8 x 10^6 ± 8.87 x 10^5 cells/g tissue after 7 days in vitro) for experiments in culture. This approach is based on incubating the mixed muscle-derived cell population with magnetic microbeads conjugated to an antibody against CD56 and then passing the cells through a magnetic field. CD56+ cells bound to microbeads are retained by the field, whereas CD56− cells pass unimpeded through the column. Cell suspensions from any stage of the sorting process can be plated and cultured. Following a given intervention, cell morphology and the expression and localization of proteins, including nuclear transcription factors, can be quantified using immunofluorescent labeling with specific antibodies and an image processing and analysis package.
Developmental Biology, Issue 95, Stem cell Biology, Tissue Engineering, Stem Cells, Satellite Cells, Skeletal Muscle, Adipocytes, Myogenic Cells, Myoblasts, Fibroblasts, Magnetic Activated Cell Sorting, Image Analysis
Adapting Human Videofluoroscopic Swallow Study Methods to Detect and Characterize Dysphagia in Murine Disease Models
Authors: Teresa E. Lever, Sabrina M. Braun, Ryan T. Brooks, Rebecca A. Harris, Loren L. Littrell, Ryan M. Neff, Cameron J. Hinkel, Mitchell J. Allen, Mollie A. Ulsas.
Institutions: University of Missouri.
This study adapted human videofluoroscopic swallowing study (VFSS) methods for use with murine disease models for the purpose of facilitating translational dysphagia research. Successful outcomes are dependent upon three critical components: test chambers that permit self-feeding while standing unrestrained in a confined space, recipes that mask the aversive taste/odor of commercially-available oral contrast agents, and a step-by-step test protocol that permits quantification of swallow physiology. Elimination of one or more of these components will have a detrimental impact on the study results. Moreover, the energy level capability of the fluoroscopy system will determine which swallow parameters can be investigated. Most research centers have high energy fluoroscopes designed for use with people and larger animals, which results in exceptionally poor image quality when testing mice and other small rodents. Despite this limitation, we have identified seven VFSS parameters that are consistently quantifiable in mice when using a high energy fluoroscope in combination with the new murine VFSS protocol. We recently obtained a low energy fluoroscopy system with exceptionally high imaging resolution and magnification capabilities that was designed for use with mice and other small rodents. Preliminary work using this new system, in combination with the new murine VFSS protocol, has identified 13 swallow parameters that are consistently quantifiable in mice, which is nearly double the number obtained using conventional (i.e., high energy) fluoroscopes. Identification of additional swallow parameters is expected as we optimize the capabilities of this new system. Results thus far demonstrate the utility of using a low energy fluoroscopy system to detect and quantify subtle changes in swallow physiology that may otherwise be overlooked when using high energy fluoroscopes to investigate murine disease models.
Medicine, Issue 97, mouse, murine, rodent, swallowing, deglutition, dysphagia, videofluoroscopy, radiation, iohexol, barium, palatability, taste, translational, disease models
Automated Quantification of Hematopoietic Cell – Stromal Cell Interactions in Histological Images of Undecalcified Bone
Authors: Sandra Zehentmeier, Zoltan Cseresnyes, Juan Escribano Navarro, Raluca A. Niesner, Anja E. Hauser.
Institutions: German Rheumatism Research Center, a Leibniz Institute; Max-Delbrück Center for Molecular Medicine; Wimasis GmbH; Charité - University of Medicine.
Confocal microscopy is the method of choice for the analysis of localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, as in many cases it relies on manual counting, thus bearing the risk of introducing a rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization between two cells results from random positioning, especially when cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells is presented. This also includes a neighborhood analysis, to obtain information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool, which is suitable for testing this hypothesis in the case of hematopoietic as well as stromal cells, is used. This approach is not limited to the bone marrow, and can be extended to other tissues to permit reproducible, quantitative analysis of histological data.
Developmental Biology, Issue 98, Image analysis, neighborhood analysis, bone marrow, stromal cells, bone marrow niches, simulation, bone cryosectioning, bone histology
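The random-positioning test described above can be sketched as a simple Monte Carlo simulation: count the observed contacts, then compare against a null distribution generated by repeatedly re-drawing one cell type's positions at random. The real tool works on segmented images and respects tissue geometry; all numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(42)

def contact_count(pos_a, pos_b, radius):
    """Number of A cells whose nearest B cell lies within `radius` (2D positions)."""
    d = np.linalg.norm(pos_a[:, None, :] - pos_b[None, :, :], axis=-1)
    return int(np.sum(d.min(axis=1) <= radius))

field, n_a, n_b, radius = 1000.0, 50, 200, 15.0
pos_a = rng.uniform(0, field, (n_a, 2))      # would come from the segmented image
observed = contact_count(pos_a, rng.uniform(0, field, (n_b, 2)), radius)

# Null distribution: re-draw the B positions many times with A fixed.
null = [contact_count(pos_a, rng.uniform(0, field, (n_b, 2)), radius)
        for _ in range(500)]
p_value = (1 + sum(c >= observed for c in null)) / (1 + len(null))
```

A small `p_value` would indicate more contacts than random placement explains, i.e., a preferential association between the two cell types.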
Development of a Quantitative Recombinase Polymerase Amplification Assay with an Internal Positive Control
Authors: Zachary A. Crannell, Brittany Rohrman, Rebecca Richards-Kortum.
Institutions: Rice University.
It was recently demonstrated that recombinase polymerase amplification (RPA), an isothermal amplification platform for pathogen detection, may be used to quantify DNA sample concentration using a standard curve. In this manuscript, a detailed protocol for developing and implementing a real-time quantitative recombinase polymerase amplification assay (qRPA assay) is provided. Using HIV-1 DNA quantification as an example, the assembly of real-time RPA reactions, the design of an internal positive control (IPC) sequence, and co-amplification of the IPC and target of interest are all described. Instructions and data processing scripts for the construction of a standard curve using data from multiple experiments are provided, which may be used to predict the concentration of unknown samples or assess the performance of the assay. Finally, an alternative method for collecting real-time fluorescence data with a microscope and a stage heater as a step towards developing a point-of-care qRPA assay is described. The protocol and scripts provided may be used for the development of a qRPA assay for any DNA target of interest.
Genetics, Issue 97, recombinase polymerase amplification, isothermal amplification, quantitative, diagnostic, HIV-1, viral load
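The standard-curve idea is the quantitative core of the assay: the threshold-crossing time falls roughly linearly with the log of the starting copy number, so a linear fit over known standards can be inverted for unknown samples. The calibration numbers below are hypothetical, chosen only to illustrate the fit:

```python
import numpy as np

# Hypothetical calibration: threshold-crossing times (min) for known
# HIV-1 DNA inputs (copies per reaction).
conc = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
t_threshold = np.array([11.2, 9.6, 8.1, 6.5, 5.0])

# Standard curve: threshold time vs. log10(concentration), fit by least squares.
slope, intercept = np.polyfit(np.log10(conc), t_threshold, 1)

def predict_concentration(t_min):
    """Invert the standard curve: estimated copies/reaction from a threshold time."""
    return 10 ** ((t_min - intercept) / slope)

unknown = predict_concentration(7.3)   # falls between the 1e4 and 1e5 standards
```

In the protocol itself, the internal positive control guards against calling a slow or failed reaction a low concentration.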
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Authors: Sara Tremblay, Vincent Beaulé, Sébastien Proulx, Louis-Philippe Lafleur, Julien Doyon, Małgorzata Marjańska, Hugo Théoret.
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood [33]. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner [41]. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration [34]. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunctions after stroke, which consists of bilateral stimulation of primary motor cortices [27,30,31]. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
Time Multiplexing Super Resolving Technique for Imaging from a Moving Platform
Authors: Asaf Ilovitsh, Shlomo Zach, Zeev Zalevsky.
Institutions: Bar-Ilan University, Kfar Saba, Israel.
We propose a method for increasing the resolution of an object and overcoming the diffraction limit of an optical system installed on top of a moving imaging system, such as an airborne platform or satellite. The resolution improvement is obtained in a two-step process. First, three low resolution, differently defocused images are captured and the optical phase is retrieved using an improved iterative Gerchberg-Saxton based algorithm. The phase retrieval allows the field to be numerically back-propagated to the aperture plane. Second, the imaging system is shifted and the first step is repeated. The obtained optical fields at the aperture plane are combined and a synthetically increased lens aperture is generated along the direction of movement, yielding higher imaging resolution. The method resembles a well-known approach from the microwave regime called Synthetic Aperture Radar (SAR), in which the antenna size is synthetically increased along the platform propagation direction. The proposed method is demonstrated through a laboratory experiment.
Physics, Issue 84, Superresolution, Fourier optics, Remote Sensing and Sensors, Digital Image Processing, optics, resolution
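The phase-retrieval core of the method is a Gerchberg-Saxton iteration. The block below implements the classic two-plane variant (not the authors' improved, defocus-based version): it bounces between the object and Fourier planes, enforcing the measured amplitude in each while keeping the evolving phase.

```python
import numpy as np

def gerchberg_saxton(amp_obj, amp_ft, n_iter=200, seed=0):
    """Classic two-plane Gerchberg-Saxton phase retrieval."""
    rng = np.random.default_rng(seed)
    field = amp_obj * np.exp(1j * rng.uniform(0, 2 * np.pi, amp_obj.shape))
    for _ in range(n_iter):
        F = np.fft.fft2(field)
        F = amp_ft * np.exp(1j * np.angle(F))              # Fourier-plane constraint
        field = np.fft.ifft2(F)
        field = amp_obj * np.exp(1j * np.angle(field))     # object-plane constraint
    return field

# Synthetic test case: amplitudes measured from a known complex field.
rng = np.random.default_rng(1)
true = rng.random((32, 32)) * np.exp(1j * rng.uniform(0, 2 * np.pi, (32, 32)))
amp_obj, amp_ft = np.abs(true), np.abs(np.fft.fft2(true))

rec = gerchberg_saxton(amp_obj, amp_ft)
err = np.linalg.norm(np.abs(np.fft.fft2(rec)) - amp_ft)    # Fourier-amplitude mismatch
```

The Fourier-amplitude error is non-increasing across iterations, and the retrieved phase is what permits the numerical back-propagation to the aperture plane described above.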
High Sensitivity 5-hydroxymethylcytosine Detection in Balb/C Brain Tissue
Authors: Theodore Davis, Romualdas Vaisvila.
Institutions: New England Biolabs.
DNA hydroxymethylation is a long known modification of DNA, but has recently become a focus in epigenetic research. Mammalian DNA is enzymatically modified at the 5th carbon position of cytosine (C) residues to 5-mC, predominantly in the context of CpG dinucleotides. 5-mC is amenable to enzymatic oxidation to 5-hmC by the Tet family of enzymes, which are believed to be involved in development and disease. Currently, the biological role of 5-hmC is not fully understood, but it is generating a lot of interest due to its potential as a biomarker, owing to several groundbreaking studies identifying 5-hydroxymethylcytosine in mouse embryonic stem (ES) and neuronal cells. Research techniques, including bisulfite sequencing methods, are unable to easily distinguish between 5-mC and 5-hmC. A few protocols exist that can measure global amounts of 5-hydroxymethylcytosine in the genome, including liquid chromatography coupled with mass spectrometry analysis or thin layer chromatography of single nucleosides digested from genomic DNA. Antibodies that target 5-hydroxymethylcytosine also exist, which can be used for dot blot analysis, immunofluorescence, or precipitation of hydroxymethylated DNA, but these antibodies do not have single base resolution. In addition, resolution depends on the size of the immunoprecipitated DNA and, for microarray experiments, on probe design. Since it is unknown exactly where 5-hydroxymethylcytosine exists in the genome or what its role in epigenetic regulation is, new techniques are required that can identify locus-specific hydroxymethylation. The EpiMark 5-hmC and 5-mC Analysis Kit provides a solution for distinguishing between these two modifications at specific loci: it is a simple and robust method for the identification and quantitation of 5-methylcytosine and 5-hydroxymethylcytosine within a specific DNA locus.
This enzymatic approach utilizes the differential methylation sensitivity of the isoschizomers MspI and HpaII in a simple 3-step protocol. Genomic DNA of interest is treated with T4-BGT, adding a glucose moiety to 5-hydroxymethylcytosine. This reaction is sequence-independent, therefore all 5-hmC will be glucosylated; unmodified or 5-mC containing DNA will not be affected. This glucosylation is then followed by restriction endonuclease digestion. MspI and HpaII recognize the same sequence (CCGG) but are sensitive to different methylation states. HpaII cleaves only a completely unmodified site: any modification (5-mC, 5-hmC or 5-ghmC) at either cytosine blocks cleavage. MspI recognizes and cleaves 5-mC and 5-hmC, but not 5-ghmC. The third part of the protocol is interrogation of the locus by PCR. As little as 20 ng of input DNA can be used. Amplification of the experimental (glucosylated and digested) and control (mock glucosylated and digested) target DNA with primers flanking a CCGG site of interest (100-200 bp) is performed. If the CpG site contains 5-hydroxymethylcytosine, a band is detected after glucosylation and digestion, but not in the non-glucosylated control reaction. Real time PCR will give an approximation of how much hydroxymethylcytosine is in this particular site. In this experiment, we will analyze the 5-hydroxymethylcytosine amount in a mouse Balb/C brain sample by end point PCR.
Neuroscience, Issue 48, EpiMark, Epigenetics, 5-hydroxymethylcytosine, 5-methylcytosine, methylation, hydroxymethylation
Guide Wire Assisted Catheterization and Colored Dye Injection for Vascular Mapping of Monochorionic Twin Placentas
Authors: Eric B. Jelin, Samuel C. Schecter, Kelly D. Gonzales, Shinjiro Hirose, Hanmin Lee, Geoffrey A. Machin, Larry Rand, Vickie A. Feldstein.
Institutions: University of California, San Francisco, University of Alberta, University of California, San Francisco, University of California, San Francisco.
Monochorionic (MC) twin pregnancies are associated with significantly higher morbidity and mortality rates than dichorionic twins. Approximately 50% of MC twin pregnancies develop complications arising from the shared placenta and associated vascular connections1. Severe twin-to-twin transfusion syndrome (TTTS) is reported to account for approximately 20% of these complications2,3. Inter-twin vascular connections occur in almost all MC placentas and are related to the prognosis and outcome of these high-risk twin pregnancies. The number, size, and type of connections have been implicated in the development of TTTS and other MC twin conditions. Three types of inter-twin vascular connections occur: 1) artery-to-vein connections (AVs), in which a branch artery carrying deoxygenated blood from one twin courses along the fetal surface of the placenta and dives into a placental cotyledon. Blood flows via a deep intraparenchymal capillary network into a draining vein that emerges at the fetal surface of the placenta and brings oxygenated blood toward the other twin. There is unidirectional flow from the twin supplying the afferent artery toward the twin receiving the efferent vein; 2) artery-to-artery connections (AAs), in which a branch artery from each twin meets directly on the superficial placental surface, resulting in a vessel with pulsatile bidirectional flow; and 3) vein-to-vein connections (VVs), in which a branch vein from each twin meets directly on the superficial placental surface, allowing low-pressure bidirectional flow. In utero obstetric sonography with targeted Doppler interrogation has been used to identify the presence of AV and AA connections4. Prenatally detected AAs that have been confirmed by postnatal placental injection studies have been shown to be associated with an improved prognosis for both twins5. 
Furthermore, fetoscopic laser ablation of inter-twin vascular connections on the fetal surface of the shared placenta is now the preferred treatment for early, severe TTTS. Postnatal placental injection studies provide a valuable method to confirm the accuracy of prenatal Doppler ultrasound findings and the efficacy of fetal laser therapy6. Using colored dyes separately hand-injected into the arterial and venous circulations of each twin, the technique highlights and delineates AVs, AAs, and VVs. This definitive demonstration of MC placental vascular anatomy may then be correlated with Doppler ultrasound findings and neonatal outcome to enhance our understanding of the pathophysiology of MC twinning and its sequelae. Here we demonstrate our placental injection technique.
Medicine, Issue 55, placenta, monochorionic twins, vascular mapping, twin-to-twin transfusion syndrome (TTTS), obstetrics, fetal surgery
Simultaneous Electroencephalography, Real-time Measurement of Lactate Concentration and Optogenetic Manipulation of Neuronal Activity in the Rodent Cerebral Cortex
Authors: William C. Clegern, Michele E. Moore, Michelle A. Schmidt, Jonathan Wisor.
Institutions: Washington State University.
Although the brain represents less than 5% of the body by mass, it utilizes approximately one quarter of the glucose used by the body at rest1. The function of non-rapid eye movement sleep (NREMS), the largest portion of sleep by time, is uncertain. However, one salient feature of NREMS is a significant reduction in the rate of cerebral glucose utilization relative to wakefulness2-4. This and other findings have led to the widely held belief that sleep serves a function related to cerebral metabolism. Yet, the mechanisms underlying the reduction in cerebral glucose metabolism during NREMS remain to be elucidated. One phenomenon associated with NREMS that might impact cerebral metabolic rate is the occurrence of slow waves, oscillations at frequencies less than 4 Hz, in the electroencephalogram5,6. These slow waves, detected at the level of the skull or cerebral cortical surface, reflect the oscillations of underlying neurons between a depolarized/up state and a hyperpolarized/down state7. During the down state, cells do not fire action potentials for intervals of up to several hundred milliseconds. Restoration of ionic concentration gradients subsequent to action potentials represents a significant metabolic load on the cell8; the absence of action potentials during the down states associated with NREMS may therefore contribute to reduced metabolism relative to wake. Two technical challenges had to be addressed in order for this hypothetical relationship to be tested. First, it was necessary to measure cerebral glycolytic metabolism with a temporal resolution reflective of the dynamics of the cerebral EEG (that is, over seconds rather than minutes). To do so, we measured the concentration of lactate, the product of aerobic glycolysis and therefore a readout of the rate of glucose metabolism, in the brains of mice. Lactate was measured using a lactate oxidase based real-time sensor embedded in the frontal cortex. 
The sensing mechanism consists of a platinum-iridium electrode surrounded by a layer of lactate oxidase molecules. Metabolism of lactate by lactate oxidase produces hydrogen peroxide, which generates a current in the platinum-iridium electrode. A ramping up of cerebral glycolysis thus increases the concentration of substrate for lactate oxidase, which is then reflected as an increased current at the sensing electrode. Second, it was necessary to measure these variables while manipulating the excitability of the cerebral cortex, in order to isolate this variable from other facets of NREMS. We devised an experimental system for simultaneous measurement of neuronal activity via the electroencephalogram, measurement of glycolytic flux via a lactate biosensor, and manipulation of cerebral cortical neuronal activity via optogenetic activation of pyramidal neurons. We have utilized this system to document the relationship between sleep-related electroencephalographic waveforms and the moment-to-moment dynamics of lactate concentration in the cerebral cortex. The protocol may be useful for any individual interested in studying, in freely behaving rodents, the relationship between neuronal activity measured at the electroencephalographic level and cellular energetics within the brain.
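Because the sensing current scales with lactate concentration, such amperometric sensors are typically calibrated against known lactate standards before implantation. The sketch below illustrates a simple two-point linear calibration; the linear current-concentration model and all numbers are assumptions for demonstration, not measured sensor characteristics.

```python
def make_calibration(i_low, c_low, i_high, c_high):
    """Build a current (nA) -> concentration (mM) converter from two standards."""
    slope = (i_high - i_low) / (c_high - c_low)  # sensor sensitivity, nA per mM
    def current_to_lactate(i_na):
        # invert the assumed linear response i = i_low + slope * (c - c_low)
        return c_low + (i_na - i_low) / slope
    return current_to_lactate

# Hypothetical standards: a 0.5 mM solution reads 12 nA, a 2.0 mM solution 42 nA
to_mm = make_calibration(12.0, 0.5, 42.0, 2.0)
```

Applying `to_mm` to each raw current sample then yields a lactate time series on the same seconds-scale time base as the EEG.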
Neuroscience, Issue 70, Physiology, Anatomy, Medicine, Pharmacology, Surgery, Sleep, rapid eye movement, glucose, glycolysis, pyramidal neurons, channelrhodopsin, optogenetics, optogenetic stimulation, electroencephalogram, EEG, EMG, brain, animal model
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). However, studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3,4,5,6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
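A hallmark of categorical perception along a morph continuum is a sharp transition in forced-choice identification responses, with the category boundary conventionally taken at the 50% crossover. The sketch below locates that crossover by linear interpolation; the response proportions are invented for illustration, and a real CP analysis would typically fit a full psychometric (e.g. logistic) function instead.

```python
# Proportion of "human" responses at 11 morph steps (0 = avatar, 1 = human).
# These values are hypothetical example data, not results from the study.
steps = [i / 10 for i in range(11)]
p_human = [0.02, 0.03, 0.05, 0.08, 0.15, 0.40, 0.78, 0.92, 0.96, 0.98, 0.99]

def boundary(xs, ps):
    """Return the morph level where responses cross 50% (the category boundary)."""
    for (x0, y0), (x1, y1) in zip(zip(xs, ps), zip(xs[1:], ps[1:])):
        if y0 < 0.5 <= y1:
            # linear interpolation between the two bracketing steps
            return x0 + (0.5 - y0) * (x1 - x0) / (y1 - y0)
    return None  # no crossover: responses are not categorical over this range
```

A steep identification function around the boundary, together with better discrimination of stimulus pairs that straddle it, is the usual behavioral evidence for CP.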
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Authors: Hans-Peter Müller, Jan Kassubek.
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls. DTI data analysis is performed in several complementary ways, i.e. voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level, in order to identify differences in FA along WM structures and to define regional patterns of WM alterations. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA-maps after stereotaxic alignment, in a longitudinal analysis on an individual-subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by application of a controlled elimination of gradient directions with high noise levels. In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
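For reference, the FA metric used in such voxelwise comparisons follows directly from the three eigenvalues of the fitted diffusion tensor. A minimal per-voxel sketch of the standard definition:

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """FA from the diffusion-tensor eigenvalues: 0 = isotropic, 1 = maximally anisotropic."""
    mean = (l1 + l2 + l3) / 3.0
    num = (l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2
    den = l1 ** 2 + l2 ** 2 + l3 ** 2
    if den == 0:
        return 0.0  # background voxel with no diffusion signal
    return math.sqrt(1.5 * num / den)
```

Isotropic diffusion (equal eigenvalues, as in CSF) gives FA near 0, while diffusion confined to a single axis (as in coherent fiber bundles) drives FA toward 1, which is why FA differences along tracts are a sensitive marker of WM alteration.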
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Test Samples for Optimizing STORM Super-Resolution Microscopy
Authors: Daniel J. Metcalf, Rebecca Edwards, Neelam Kumarswami, Alex E. Knight.
Institutions: National Physical Laboratory.
STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy techniques. However, because the image is built up molecule by molecule rather than acquired in a single exposure, there are significant challenges for users in trying to optimize their image acquisition. In order to aid this process, and to gain more insight into how STORM works, we present the preparation of 3 test samples and the methodology of acquiring and processing STORM super-resolution images with typical resolutions of 30-50 nm. By combining the test samples with the use of the freely available rainSTORM processing software, it is possible to obtain a great deal of information about image quality and resolution. Using these metrics it is then possible to optimize the imaging procedure from the optics, to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of some common problems that result in poor image quality, such as lateral drift, where the sample moves during image acquisition, and density-related problems resulting in the 'mislocalization' phenomenon.
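The attainable resolution in localization microscopy is ultimately set by how precisely each single molecule can be localized, which depends chiefly on photon counts, pixel size, and background. A commonly used theoretical estimate (the Thompson-Larson-Webb approximation) is sketched below; the numeric values in the test are typical order-of-magnitude inputs, not measurements from this protocol.

```python
import math

def localization_precision(s_nm, n_photons, pixel_nm, bg_photons):
    """Approximate 1D localization precision (nm) of a single-molecule fit.

    s_nm:       standard deviation of the point-spread function, in nm
    n_photons:  photons detected from the molecule
    pixel_nm:   back-projected camera pixel size, in nm
    bg_photons: background photons per pixel
    """
    var = (s_nm ** 2 + pixel_nm ** 2 / 12.0) / n_photons \
          + 8.0 * math.pi * s_nm ** 4 * bg_photons ** 2 / (pixel_nm ** 2 * n_photons ** 2)
    return math.sqrt(var)
```

The formula makes the practical trade-offs explicit: brighter dyes (larger photon counts) and lower background improve precision, which is why dye choice and buffer conditions feature so prominently in STORM optimization.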
Molecular Biology, Issue 79, Genetics, Bioengineering, Biomedical Engineering, Biophysics, Basic Protocols, HeLa Cells, Actin Cytoskeleton, Coated Vesicles, Receptor, Epidermal Growth Factor, Actins, Fluorescence, Endocytosis, Microscopy, STORM, super-resolution microscopy, nanoscopy, cell biology, fluorescence microscopy, test samples, resolution, actin filaments, fiducial markers, epidermal growth factor, cell, imaging
Simultaneous Scalp Electroencephalography (EEG), Electromyography (EMG), and Whole-body Segmental Inertial Recording for Multi-modal Neural Decoding
Authors: Thomas C. Bulea, Atilla Kilicarslan, Recep Ozdemir, William H. Paloski, Jose L. Contreras-Vidal.
Institutions: National Institutes of Health, University of Houston, University of Houston, University of Houston, University of Houston.
Recent studies support the involvement of supraspinal networks in control of bipedal human walking. Part of this evidence encompasses studies, including our previous work, demonstrating that gait kinematics and limb coordination during treadmill walking can be inferred from the scalp electroencephalogram (EEG) with reasonably high decoding accuracies. These results provide impetus for development of non-invasive brain-machine-interface (BMI) systems for use in restoration and/or augmentation of gait, a primary goal of rehabilitation research. To date, studies examining EEG decoding of activity during gait have been limited to treadmill walking in a controlled environment. However, to be practically viable, a BMI system must be applicable for use in everyday locomotor tasks such as over ground walking and turning. Here, we present a novel protocol for non-invasive collection of brain activity (EEG), muscle activity (electromyography (EMG)), and whole-body kinematic data (head, torso, and limb trajectories) during both treadmill and over ground walking tasks. By collecting these data in an uncontrolled environment, insight can be gained regarding the feasibility of decoding unconstrained gait and surface EMG from scalp EEG.
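Linear EEG decoders of gait kinematics typically predict each output sample from the current and several preceding samples of every EEG channel, so the first processing step is a time-lag feature embedding. A minimal sketch of that embedding is shown below; the function name is illustrative and not taken from the authors' analysis code.

```python
def lag_embed(channels, n_lags):
    """Build lagged feature rows for a linear decoder.

    channels: list of equal-length signals (one list of samples per EEG channel)
    n_lags:   number of past samples to include per channel

    Returns one feature row per usable time point t >= n_lags:
    [ch0[t], ch0[t-1], ..., ch0[t-n_lags], ch1[t], ...]
    """
    t_len = len(channels[0])
    rows = []
    for t in range(n_lags, t_len):
        row = []
        for ch in channels:
            row.extend(ch[t - lag] for lag in range(n_lags + 1))
        rows.append(row)
    return rows
```

Each row is then regressed against the simultaneously recorded kinematic variable (e.g. a limb trajectory coordinate) to fit the decoder weights.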
Behavior, Issue 77, Neuroscience, Neurobiology, Medicine, Anatomy, Physiology, Biomedical Engineering, Molecular Biology, Electroencephalography, EEG, Electromyography, EMG, electroencephalograph, gait, brain-computer interface, brain machine interface, neural decoding, over-ground walking, robotic gait, brain, imaging, clinical techniques
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
High Throughput Microfluidic Rapid and Low Cost Prototyping Packaging Methods
Authors: Amine Miled, Mohamad Sawan.
Institutions: Polytechnique Montreal.
In this work, 3 different packaging and assembly techniques are presented. They can be classified into two categories: one-time use and reusable packaging techniques. The one-time use packaging technique employs UV-based and temperature-curing epoxies to connect microtubes to access holes, wire-bonding for integrated circuit connections, and silver epoxy for electrical connections. This method is based on a robust assembly technique that can withstand relatively high pressures, close to 1 psi, and does not need any support to strengthen the microfluidic architecture. Reusable packaging techniques consist of PDMS-based microtube interconnectors and anisotropic adhesive films for electrical connections. These devices are more sensitive and fragile. Consequently, a Plexiglas support is added to the microfluidic structure to improve the electrical contact when anisotropic adhesive films are used, and also to strengthen the microfluidic architecture. In addition, a micromanipulator is needed to hold the tubes while a thin PDMS layer is used to connect them to the access holes. Different PDMS layer thicknesses, ranging from 0.45-3 mm, are tested to compare adherence versus injection rate. Applied injection rates are varied from 50-300 μl/hr for the 0.45-3 mm PDMS layers, respectively. These techniques are mainly applicable to low-pressure applications; however, they can be extended to high-pressure ones through an oxygen-plasma process that permanently seals the PDMS to glass substrates. The main advantage of this technique, besides its reusability, is that the device remains observable when the microchannel length is very short (in the range of 3 mm or less).
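When choosing an injection rate against a packaging method's pressure limit, a quick order-of-magnitude check of the channel pressure drop is useful. The sketch below uses the Hagen-Poiseuille approximation for a wide, shallow rectangular channel (valid for width much greater than height); the channel dimensions in the test are illustrative assumptions, not the authors' device geometry.

```python
def pressure_drop_psi(q_ul_per_hr, length_mm, width_um, height_um, mu_pa_s=1.0e-3):
    """Approximate pressure drop (psi) across a rectangular microchannel.

    Uses dP = 12 * mu * L * Q / (w * h^3), the wide-channel (w >> h) limit,
    with water's viscosity (~1 mPa*s) as the default working fluid.
    """
    q = q_ul_per_hr * 1e-9 / 3600.0  # convert ul/hr to m^3/s
    l = length_mm * 1e-3             # mm to m
    w = width_um * 1e-6              # um to m
    h = height_um * 1e-6             # um to m
    dp_pa = 12.0 * mu_pa_s * l * q / (w * h ** 3)
    return dp_pa / 6894.76           # Pa to psi
```

For the flow rates quoted above (50-300 μl/hr) and typical channel cross-sections, the estimated drop stays well below the ~1 psi that the one-time use packaging tolerates, which is consistent with these techniques being aimed at low-pressure applications.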
Bioengineering, Issue 82, Microfluidics, PDMS, Lab-on-chip, Rapid-Prototyping, Microfabrication
Fabrication of High Contrast Gratings for the Spectrum Splitting Dispersive Element in a Concentrated Photovoltaic System
Authors: Yuhan Yao, He Liu, Wei Wu.
Institutions: University of Southern California.
High contrast gratings are designed and fabricated, and their application is proposed in a parallel spectrum splitting dispersive element that can improve the solar conversion efficiency of a concentrated photovoltaic system. The proposed system will also lower the solar cell cost in the concentrated photovoltaic system by replacing the expensive tandem solar cells with cost-effective single junction solar cells. The structures and the parameters of high contrast gratings for the dispersive elements were numerically optimized. The large-area fabrication of high contrast gratings was experimentally demonstrated using nanoimprint lithography and dry etching. The quality of the grating material and the performance of the fabricated device were both experimentally characterized. By analyzing the measurement results, the possible side effects of the fabrication processes are discussed, and several methods with the potential to improve the fabrication processes are proposed, which can help to increase the optical efficiency of the fabricated devices.
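The spectrum-splitting function of such a dispersive element follows from the grating equation, which steers each wavelength into a different diffraction angle so that matched single-junction cells can be placed side by side. A minimal sketch of the first-order angle is below; the grating period used in the test is an example value, not the fabricated device's parameter.

```python
import math

def diffraction_angle_deg(wavelength_nm, period_nm, order=1, incidence_deg=0.0):
    """First-order (by default) diffraction angle from the grating equation
    m * lambda = d * (sin(theta_m) - sin(theta_i)), in degrees.
    Returns None if the requested order is evanescent (does not propagate)."""
    s = order * wavelength_nm / period_nm + math.sin(math.radians(incidence_deg))
    if abs(s) > 1.0:
        return None  # |sin(theta)| > 1: no propagating diffracted order
    return math.degrees(math.asin(s))
```

Because the diffraction angle grows with wavelength, short- and long-wavelength bands of the concentrated sunlight separate spatially, which is the basis for routing each band to a cell with a matching bandgap.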
Engineering, Issue 101, Parallel spectrum splitting, dispersive element, high contrast grating, concentrated photovoltaic system, nanoimprint lithography, reactive ion etching
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matches that are only loosely related.