Obtaining in vivo human brain tissue volumetrics from MRI is often complicated by various technical and biological issues. These challenges are exacerbated when significant brain atrophy and age-related white matter changes (e.g., leukoaraiosis) are present. Lesion Explorer (LE) is an accurate and reliable neuroimaging pipeline specifically developed to address such issues, which are commonly observed on MRI of Alzheimer's disease patients and normal elderly individuals. The pipeline is a complex set of semi-automatic procedures which has been previously validated in a series of internal and external reliability tests [1,2]. However, LE's accuracy and reliability are highly dependent on properly trained manual operators to execute commands, identify distinct anatomical landmarks, and manually edit/verify various computer-generated segmentation outputs.
LE can be divided into three main components, each requiring a set of commands and manual operations: 1) Brain-Sizer, 2) SABRE, and 3) Lesion-Seg. Brain-Sizer's manual operations involve editing of the automatic skull-stripped total intracranial vault (TIV) extraction mask, designation of ventricular cerebrospinal fluid (vCSF), and removal of subtentorial structures. The SABRE component requires checking of image alignment along the anterior-posterior commissure (ACPC) plane and identification of several anatomical landmarks required for regional parcellation. Finally, the Lesion-Seg component involves manual checking of the automatic lesion segmentation of subcortical hyperintensities (SH) for false positive errors.
While on-site training of the LE pipeline is preferable, readily available visual teaching tools with interactive training images are a viable alternative. Developed to ensure a high degree of accuracy and reliability, the following is a step-by-step, video-guided, standardized protocol for LE's manual procedures.
Related JoVE Articles
High-speed Particle Image Velocimetry Near Surfaces
Institutions: University of Michigan.
Multi-dimensional and transient flows play a key role in many areas of science, engineering, and health sciences but are often not well understood. The complex nature of these flows may be studied using particle image velocimetry (PIV), a laser-based imaging technique for optically accessible flows. Though many forms of PIV exist that extend the technique beyond the original planar two-component velocity measurement capabilities, the basic PIV system consists of a light source (laser), a camera, tracer particles, and analysis algorithms. The imaging and recording parameters, the light source, and the algorithms are adjusted to optimize the recording for the flow of interest and obtain valid velocity data.
Common PIV investigations measure two-component velocities in a plane at a few frames per second. However, recent developments in instrumentation have facilitated high-frame rate (> 1 kHz) measurements capable of resolving transient flows with high temporal resolution. Therefore, high-frame rate measurements have enabled investigations on the evolution of the structure and dynamics of highly transient flows. These investigations play a critical role in understanding the fundamental physics of complex flows.
A detailed description for performing high-resolution, high-speed planar PIV to study a transient flow near the surface of a flat plate is presented here. Details for adjusting the parameter constraints such as image and recording properties, the laser sheet properties, and processing algorithms to adapt PIV for any flow of interest are included.
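The heart of the analysis algorithms mentioned above is a cross-correlation between corresponding interrogation windows of an image pair: the location of the correlation peak gives the mean particle displacement. A minimal sketch of that step on synthetic data (not the authors' processing code; window size and seed are arbitrary):

```python
import numpy as np

def piv_displacement(win_a, win_b):
    """Estimate the mean particle displacement between two interrogation
    windows by locating the peak of their 2D cross-correlation."""
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    # FFT-based circular cross-correlation (fine for small shifts)
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map the peak index to a signed pixel shift
    return tuple(int(p) if p <= s // 2 else int(p - s)
                 for p, s in zip(peak, corr.shape))

# Synthetic example: a particle image shifted by (2, 3) pixels
rng = np.random.default_rng(0)
frame_a = rng.random((32, 32))
frame_b = np.roll(frame_a, shift=(2, 3), axis=(0, 1))
print(piv_displacement(frame_a, frame_b))  # (2, 3)
```

In a real PIV evaluation this is repeated for every interrogation window of the image pair, and the inter-frame time converts pixel shifts into velocities.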
Physics, Issue 76, Mechanical Engineering, Fluid Mechanics, flow measurement, fluid heat transfer, internal flow in turbomachinery (applications), boundary layer flow (general), flow visualization (instrumentation), laser instruments (design and operation), Boundary layer, micro-PIV, optical laser diagnostics, internal combustion engines, flow, fluids, particle, velocimetry, visualization
3D Orbital Tracking in a Modified Two-photon Microscope: An Application to the Tracking of Intracellular Vesicles
Institutions: University of California, Irvine.
The objective of this video protocol is to discuss how to perform and analyze a three-dimensional fluorescent orbital particle tracking experiment using a modified two-photon microscope [1]. As opposed to conventional approaches (raster scan or wide field based on a stack of frames), 3D orbital tracking allows one to localize and follow, with high spatial (10 nm accuracy) and temporal resolution (50 Hz frequency response), the 3D displacement of a moving fluorescent particle on length scales of hundreds of microns [2]. The method is based on a feedback algorithm that controls the hardware of a two-photon laser scanning microscope in order to perform a circular orbit around the object to be tracked: the feedback mechanism maintains the fluorescent object in the center by controlling the displacement of the scanning beam [3-5]. To demonstrate the advantages of this technique, we followed a fast-moving organelle, the lysosome, within a living cell [6,7]. Cells were plated according to standard protocols and stained using a commercially available lysosome dye. We discuss briefly the hardware configuration and, in more detail, the control software used to perform a 3D orbital tracking experiment inside living cells, including the parameters required to control the scanning microscope and enable the motion of the beam in a closed orbit around the particle. We conclude by demonstrating how this method can be effectively used to track the fast motion of a labeled lysosome along microtubules in 3D within a live cell. Lysosomes can move with speeds in the range of 0.4-0.5 µm/sec, typically displaying directed motion along the microtubule network [8].
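The feedback principle can be illustrated with a toy calculation: when the particle is off-center, the fluorescence recorded along one circular orbit is modulated, and the phase of the first Fourier harmonic of that modulation gives the direction in which the orbit center must be moved. A simplified sketch (the function and synthetic signal are illustrative, not the microscope control code):

```python
import numpy as np

def orbit_offset_direction(intensities, angles):
    """Phase of the first Fourier harmonic of the fluorescence recorded
    along one circular scan; it points from the orbit center toward the
    particle and drives the feedback that re-centers the orbit."""
    c1 = np.sum(intensities * np.exp(1j * angles))
    return np.angle(c1)

# Synthetic orbit: a particle displaced along +y modulates the signal,
# peaking where the beam passes closest to it (angle pi/2)
angles = np.linspace(0, 2 * np.pi, 64, endpoint=False)
signal = 1.0 + 0.3 * np.cos(angles - np.pi / 2)
print(round(float(orbit_offset_direction(signal, angles)), 4))  # 1.5708
```

In the actual instrument this estimate, together with the modulation depth, is fed back to the scan mirrors (and to the objective or stage for the axial direction) at each orbit.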
Bioengineering, Issue 92, fluorescence, single particle tracking, laser scanning microscope, two-photon, vesicle transport, live-cell imaging, optics
Micron-scale Resolution Optical Tomography of Entire Mouse Brains with Confocal Light Sheet Microscopy
Institutions: European Laboratory for Non-linear Spectroscopy (LENS), University Campus Bio-medico of Rome, University of Cassino, National Institute of Optics (CNR-INO), Allen Institute for Brain Science, University of Florence, ICON Foundation, Sesto Fiorentino, Italy.
Understanding the architecture of the mammalian brain at single-cell resolution is one of the key issues of neuroscience. However, mapping neuronal somata and projections throughout the whole brain is still challenging for imaging and data management technologies. Indeed, macroscopic volumes need to be reconstructed with high resolution and contrast in a reasonable time, producing datasets in the terabyte range. We recently demonstrated an optical method (confocal light sheet microscopy, CLSM) capable of obtaining micron-scale reconstructions of entire mouse brains labeled with enhanced green fluorescent protein (EGFP). Combining light sheet illumination and confocal detection, CLSM allows deep imaging inside macroscopic cleared specimens with high contrast and speed. Here we describe the complete experimental pipeline for obtaining comprehensive and human-readable images of entire mouse brains labeled with fluorescent proteins. The clearing and mounting procedures are described, together with the steps to perform an optical tomography of the whole volume by acquiring many parallel adjacent stacks. We show the use of open-source, custom-made software tools enabling stitching of the multiple stacks and multi-resolution data navigation. Finally, we illustrate some examples of brain maps: the cerebellum of an L7-GFP transgenic mouse, in which all Purkinje cells are selectively labeled, and the whole brain of a thy1-GFP-M mouse, characterized by random sparse neuronal labeling.
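The multi-resolution navigation of such terabyte-scale stitched volumes relies on precomputing the data at progressively coarser scales. A minimal sketch of such a pyramid using 2x2x2 mean pooling (illustrative only; not the authors' open-source tools):

```python
import numpy as np

def pyramid(volume, levels=3):
    """Multi-resolution pyramid for navigating a large image volume:
    each level halves every axis by 2x2x2 mean pooling."""
    out = [volume]
    for _ in range(levels - 1):
        v = out[-1]
        # trim to even sizes, then average non-overlapping 2x2x2 blocks
        v = v[: v.shape[0] // 2 * 2, : v.shape[1] // 2 * 2, : v.shape[2] // 2 * 2]
        v = v.reshape(v.shape[0] // 2, 2,
                      v.shape[1] // 2, 2,
                      v.shape[2] // 2, 2).mean(axis=(1, 3, 5))
        out.append(v)
    return out

levels = pyramid(np.ones((8, 8, 8)))
print([v.shape for v in levels])  # [(8, 8, 8), (4, 4, 4), (2, 2, 2)]
```

A viewer then loads only the level (and the sub-region) matching the current zoom, so navigation stays interactive regardless of the full dataset size.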
Neuroscience, Issue 80, Microscopy, Neuroanatomy, Connectomics, Light sheet microscopy, Whole-brain imaging
4D Imaging of Protein Aggregation in Live Cells
Institutions: Hebrew University of Jerusalem .
One of the key tasks of any living cell is maintaining the proper folding of newly synthesized proteins in the face of ever-changing environmental conditions and an intracellular environment that is tightly packed, sticky, and hazardous to protein stability [1]. The ability to dynamically balance protein production, folding, and degradation demands highly specialized quality control machinery, whose absolute necessity is observed best when it malfunctions. Diseases such as ALS, Alzheimer's, Parkinson's, and certain forms of cystic fibrosis have a direct link to protein folding quality control components [2], and therefore future therapeutic development requires a basic understanding of the underlying processes. Our experimental challenge is to understand how cells integrate damage signals and mount responses that are tailored to diverse circumstances.
The primary reason why protein misfolding represents an existential threat to the cell is the propensity of incorrectly folded proteins to aggregate, thus causing a global perturbation of the crowded and delicate intracellular folding environment [1]. The folding health, or "proteostasis," of the cellular proteome is maintained, even under the duress of aging, stress, and oxidative damage, by the coordinated action of different mechanistic units in an elaborate quality control system [3,4]. A specialized machinery of molecular chaperones can bind non-native polypeptides and promote their folding into the native state [1], target them for degradation by the ubiquitin-proteasome system [5], or direct them to protective aggregation inclusions [6-9]. In eukaryotes, the cytosolic aggregation quality control load is partitioned between two compartments [8-10]: the juxtanuclear quality control compartment (JUNQ) and the insoluble protein deposit (IPOD) (Figure 1 - model). Proteins that are ubiquitinated by the protein folding quality control machinery are delivered to the JUNQ, where they are processed for degradation by the proteasome. Misfolded proteins that are not ubiquitinated are diverted to the IPOD, where they are actively aggregated in a protective compartment.
Up until this point, the methodological paradigm of live-cell fluorescence microscopy has largely been to label proteins and track their locations in the cell at specific time-points and usually in two dimensions. As new technologies have begun to grant experimenters unprecedented access to the submicron scale in living cells, the dynamic architecture of the cytosol has come into view as a challenging new frontier for experimental characterization. We present a method for rapidly monitoring the 3D spatial distributions of multiple fluorescently labeled proteins in the yeast cytosol over time. 3D timelapse (4D imaging) is not merely a technical challenge; rather, it also facilitates a dramatic shift in the conceptual framework used to analyze cellular structure.
We utilize a cytosolic folding sensor protein in live yeast to visualize distinct fates for misfolded proteins in cellular aggregation quality control, using rapid 4D fluorescent imaging. The temperature-sensitive mutant of the Ubc9 protein (Ubc9ts) [10-12] is extremely effective both as a sensor of cellular proteostasis and as a physiological model for tracking aggregation quality control. As with most ts proteins, Ubc9ts is fully folded and functional at permissive temperatures due to active cellular chaperones. Above 30 °C, or when the cell faces misfolding stress, Ubc9ts misfolds and follows the fate of a native globular protein that has been misfolded due to mutation, heat denaturation, or oxidative damage. By fusing it to GFP or other fluorophores, it can be tracked in 3D as it forms Stress Foci, or is directed to the JUNQ or IPOD.
Cellular Biology, Issue 74, Molecular Biology, Genetics, Proteins, Aggregation quality control, protein folding quality control, GFP, JUNQ (juxtanuclear quality control compartment), IPOD (insoluble protein deposit), proteostasis sensor, 4D live cell imaging, live cells, laser, cell biology, protein folding, Ubc9ts, yeast, assay, cell, imaging
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Institutions: University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion.
Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases, using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
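The first stage of such methods, estimating the locally dominant texture orientation with a bank of Gabor filters, can be sketched as follows (filter parameters and the synthetic patch are illustrative, not those of the published pipeline):

```python
import numpy as np

def gabor_kernel(theta, wavelength=4.0, sigma=2.0, size=9):
    """Complex Gabor kernel tuned to orientation theta: a Gaussian
    envelope times a complex sinusoid oscillating along theta."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    gauss = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    return gauss * np.exp(2j * np.pi * xr / wavelength)

def dominant_orientation(patch, n_angles=8):
    """Angle of the filter-bank kernel with the strongest response on a
    patch (patch and kernel must have the same size)."""
    thetas = np.linspace(0, np.pi, n_angles, endpoint=False)
    mags = [abs(np.sum(patch * gabor_kernel(t).conj())) for t in thetas]
    return float(thetas[int(np.argmax(mags))])

# Synthetic 9x9 patch: vertical stripes (intensity varies along x)
y, x = np.mgrid[-4:5, -4:5]
patch = np.cos(2 * np.pi * x / 4.0)
print(dominant_orientation(patch))  # 0.0
```

In the full method this orientation field, computed at every pixel, feeds the phase-portrait analysis that flags node-like sites of radiating tissue patterns.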
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
How to Ignite an Atmospheric Pressure Microwave Plasma Torch without Any Additional Igniters
Institutions: University of Stuttgart.
This movie shows how an atmospheric pressure plasma torch can be ignited by microwave power with no additional igniters. After ignition of the plasma, stable and continuous operation of the plasma is possible, and the plasma torch can be used for many different applications. On the one hand, the hot plasma (3,600 K gas temperature) can be used for chemical processes; on the other hand, the cold afterglow (temperatures down to almost room temperature) can be applied to surface processes. For example, chemical syntheses are interesting volume processes. Here the microwave plasma torch can be used for the decomposition of waste gases which are harmful and contribute to global warming but are needed as etching gases in growing industry sectors such as semiconductor manufacturing. Another application is the dissociation of CO2: surplus electrical energy from renewable energy sources can be used to dissociate CO2 to CO and O2. The CO can be further processed to gaseous or liquid higher hydrocarbons, thereby providing chemical storage of the energy, synthetic fuels, or platform chemicals for the chemical industry. Applications of the afterglow of the plasma torch include the treatment of surfaces to increase the adhesion of lacquer, glue, or paint, and the sterilization or decontamination of different kinds of surfaces. The movie explains how to ignite the plasma solely by microwave power without any additional igniters, e.g., electric sparks. The microwave plasma torch is based on a combination of two resonators: a coaxial one which provides the ignition of the plasma and a cylindrical one which guarantees continuous and stable operation of the plasma after ignition. The plasma can be operated in a long microwave-transparent tube for volume processes or shaped by orifices for surface treatment purposes.
Engineering, Issue 98, atmospheric pressure plasma, microwave plasma, plasma ignition, resonator structure, coaxial resonator, cylindrical resonator, plasma torch, stable plasma operation, continuous plasma operation, high speed camera
Electric Cell-substrate Impedance Sensing for the Quantification of Endothelial Proliferation, Barrier Function, and Motility
Institutions: Institute for Cardiovascular Research, VU University Medical Center.
Electric Cell-substrate Impedance Sensing (ECIS) is an in vitro impedance measuring system used to quantify the behavior of cells within adherent cell layers. To this end, cells are grown in special culture chambers on top of opposing, circular gold electrodes. A constant small alternating current is applied between the electrodes and the potential across them is measured. The insulating properties of the cell membrane create a resistance to the electrical current flow, resulting in an increased electrical potential between the electrodes. Measuring cellular impedance in this manner allows the automated study of cell attachment, growth, morphology, function, and motility. Although the ECIS measurement itself is straightforward and easy to learn, the underlying theory is complex, and selection of the right settings and correct analysis and interpretation of the data are not self-evident. Yet a clear protocol describing the individual steps from experimental design through preparation, realization, and analysis of the experiment is not available. In this article the basic measurement principle as well as possible applications, experimental considerations, advantages, and limitations of the ECIS system are discussed. A guide is provided for the study of cell attachment, spreading, and proliferation; quantification of cell behavior in a confluent layer, with regard to barrier function, cell motility, and the quality of cell-cell and cell-substrate adhesions; and quantification of wound healing and cellular responses to vasoactive stimuli. Representative results are discussed based on human microvascular endothelial cells (MVEC) and human umbilical vein endothelial cells (HUVEC), but are applicable to all adherently growing cells.
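The basic readout described above reduces to Ohm's law plus a normalization convention; a minimal sketch with hypothetical numbers (not instrument software):

```python
def impedance_magnitude(v_rms, i_rms):
    """|Z| = V / I for a constant-current impedance measurement."""
    return v_rms / i_rms

def normalize(trace):
    """Divide an impedance trace by its first (e.g. cell-free) value, as
    ECIS data are commonly reported."""
    return [z / trace[0] for z in trace]

# 1 uA applied; the measured potential rises as cells attach and spread,
# because the insulating membranes impede the current
i_rms = 1e-6                      # A
voltages = [2e-3, 4e-3, 8e-3]     # V, three time points
trace = [impedance_magnitude(v, i_rms) for v in voltages]
print(trace)             # impedance in ohm at each time point
print(normalize(trace))  # fold change relative to the first point
```

Real ECIS instruments measure at multiple frequencies and fit a cell-layer model to separate barrier resistance from cell-substrate and membrane contributions; the sketch above shows only the raw single-frequency quantity.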
Bioengineering, Issue 85, ECIS, Impedance Spectroscopy, Resistance, TEER, Endothelial Barrier, Cell Adhesions, Focal Adhesions, Proliferation, Migration, Motility, Wound Healing
Metabolomic Analysis of Rat Brain by High Resolution Nuclear Magnetic Resonance Spectroscopy of Tissue Extracts
Institutions: Aix-Marseille Université.
Studies of gene expression on the RNA and protein levels have long been used to explore biological processes underlying disease. More recently, genomics and proteomics have been complemented by comprehensive quantitative analysis of the metabolite pool present in biological systems. This strategy, termed metabolomics, strives to provide a global characterization of the small-molecule complement involved in metabolism. While the genome and the proteome define the tasks cells can perform, the metabolome is part of the actual phenotype. Among the methods currently used in metabolomics, spectroscopic techniques are of special interest because they allow one to simultaneously analyze a large number of metabolites without prior selection for specific biochemical pathways, thus enabling a broad unbiased approach. Here, an optimized experimental protocol for metabolomic analysis by high-resolution NMR spectroscopy is presented, which is the method of choice for efficient quantification of tissue metabolites. Important strengths of this method are (i) the use of crude extracts, without the need to purify the sample and/or separate metabolites; (ii) the intrinsically quantitative nature of NMR, permitting quantitation of all metabolites represented by an NMR spectrum with one reference compound only; and (iii) the nondestructive nature of NMR enabling repeated use of the same sample for multiple measurements. The dynamic range of metabolite concentrations that can be covered is considerable due to the linear response of NMR signals, although metabolites occurring at extremely low concentrations may be difficult to detect. For the least abundant compounds, the highly sensitive mass spectrometry method may be advantageous although this technique requires more intricate sample preparation and quantification procedures than NMR spectroscopy. 
We present here an NMR protocol adjusted to rat brain analysis; however, the same protocol can be applied to other tissues with minor modifications.
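The single-reference quantitation that makes NMR attractive here follows from the signal area being proportional to concentration times the number of contributing protons. A minimal sketch with hypothetical peak areas:

```python
def metabolite_concentration(area, n_protons, ref_area, ref_n_protons, ref_conc_mM):
    """NMR signal area scales with concentration times the number of
    contributing protons, so a single reference compound of known
    concentration calibrates every metabolite in the spectrum."""
    return (area / ref_area) * (ref_n_protons / n_protons) * ref_conc_mM

# Hypothetical numbers: a metabolite's 3-proton resonance with twice the
# area of a 9-proton reference resonance at 1.0 mM
print(metabolite_concentration(area=2.0, n_protons=3,
                               ref_area=1.0, ref_n_protons=9,
                               ref_conc_mM=1.0))  # 6.0
```

In practice the areas come from integrating or fitting the resolved resonances, and the same reference (e.g. an internal standard of known concentration) serves every metabolite in the extract, which is the quantitative strength noted in point (ii) above.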
Neuroscience, Issue 91, metabolomics, brain tissue, rodents, neurochemistry, tissue extracts, NMR spectroscopy, quantitative metabolite analysis, cerebral metabolism, metabolic profile
Quasi-light Storage for Optical Data Packets
Institutions: Hochschule für Telekommunikation, Leipzig.
Today's telecommunication is based on optical packets which transmit the information in optical fiber networks around the world. Currently, the processing of the signals is done in the electrical domain. Direct storage in the optical domain would avoid the transfer of the packets to the electrical and back to the optical domain in every network node and, therefore, increase the speed and possibly reduce the energy consumption of telecommunications. However, light consists of photons which propagate with the speed of light in vacuum. Thus, the storage of light is a big challenge. There exist some methods to slow down the speed of the light, or to store it in excitations of a medium. However, these methods cannot be used for the storage of optical data packets used in telecommunications networks. Here we show how the time-frequency-coherence, which holds for every signal and therefore for optical packets as well, can be exploited to build an optical memory. We will review the background and show in detail and through examples, how a frequency comb can be used for the copying of an optical packet which enters the memory. One of these time domain copies is then extracted from the memory by a time domain switch. We will show this method for intensity as well as for phase modulated signals.
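The time-frequency coherence exploited here can be illustrated with a discrete toy analogue: sampling a packet's spectrum with a frequency comb of line spacing Δf produces copies of the packet in the time domain spaced 1/Δf apart (here, every N/spacing samples; all parameters are arbitrary):

```python
import numpy as np

N, spacing = 256, 8            # signal length and comb line spacing (bins)
t = np.arange(N)
packet = np.exp(-0.5 * ((t - 32) / 4.0) ** 2)   # a short "optical packet"

spectrum = np.fft.fft(packet)
comb = np.zeros(N)
comb[::spacing] = 1.0          # frequency comb: keep every 8th bin

# Sampling the spectrum replicates the packet every N/spacing = 32
# samples in time, each copy scaled by 1/spacing
copies = np.fft.ifft(spectrum * comb).real
```

One of these time-domain copies can then be gated out by a time-domain switch, which is the extraction step of the quasi-light-storage scheme described above.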
Physics, Issue 84, optical communications, Optical Light Storage, stimulated Brillouin scattering, Optical Signal Processing, optical data packets, telecommunications
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood [33]. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner [41]. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration [34]. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR-compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We describe the impact of a protocol that has shown great promise for the treatment of motor dysfunction after stroke, which consists of bilateral stimulation of the primary motor cortices [27,30,31]. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
A Cost-effective and Reliable Method to Predict Mechanical Stress in Single-use and Standard Pumps
Institutions: Zurich University of Applied Sciences, Levitronix Ltd., SOPAT Ltd., Technische Universität Berlin.
Pumps are mainly used when transferring sterile culture broths in biopharmaceutical and biotechnological production processes. However, during the pumping process shear forces occur which can lead to qualitative and/or quantitative product loss. To calculate the mechanical stress with limited experimental expense, an oil-water emulsion system was used, whose suitability for drop size detection in bioreactors has been demonstrated [1]. As drop breakup in the oil-water emulsion system is a function of mechanical stress, drop sizes need to be counted over the experimental time of the shear stress investigations. In previous studies, inline endoscopy has been shown to be an accurate and reliable measurement technique for drop size detection in liquid/liquid dispersions. The aim of this protocol is to show the suitability of the inline endoscopy technique for drop size measurements in pumping processes. To express the drop size, the Sauter mean diameter d32 was used as the representative diameter of drops in the oil-water emulsion. The results showed low variation in the Sauter mean diameters, quantified by standard deviations below 15%, indicating the reliability of the measurement technique.
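The Sauter mean diameter itself is a simple ratio of moments of the measured drop-size distribution; a minimal sketch with made-up diameters:

```python
def sauter_mean_diameter(diameters):
    """d32 = sum(d_i^3) / sum(d_i^2): the diameter of a drop whose
    volume-to-surface-area ratio matches that of the whole population."""
    num = sum(d ** 3 for d in diameters)
    den = sum(d ** 2 for d in diameters)
    return num / den

# Hypothetical drop diameters (e.g. in micrometers) from image analysis
print(round(sauter_mean_diameter([1.0, 2.0, 3.0]), 3))  # 2.571
```

Because d32 weights large drops heavily, a decrease in d32 over pumping time is a sensitive indicator of shear-induced drop breakup.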
Engineering, Issue 102, Inline endoscopy, Drop size measurement, Emulsion, Single-use, Magnetically levitated centrifugal pumps
Investigating the Spreading and Toxicity of Prion-like Proteins Using the Metazoan Model Organism C. elegans
Institutions: Northwestern University.
Prions are unconventional self-propagating proteinaceous particles, devoid of any coding nucleic acid. These proteinaceous seeds serve as templates for the conversion and replication of their benign cellular isoform. Accumulating evidence suggests that many protein aggregates can act as self-propagating templates and corrupt the folding of cognate proteins. Although aggregates can be functional under certain circumstances, this process often leads to the disruption of cellular protein homeostasis (proteostasis), eventually leading to devastating diseases such as Alzheimer's disease (AD), Parkinson's disease (PD), amyotrophic lateral sclerosis (ALS), or transmissible spongiform encephalopathies (TSEs). The exact mechanisms of prion propagation and cell-to-cell spreading of protein aggregates are still subjects of intense investigation. To further this knowledge, a new metazoan model for expression of the prion domain of the cytosolic yeast prion protein Sup35 has recently been established in Caenorhabditis elegans. This prion model offers several advantages, as it allows direct monitoring of the fluorescently tagged prion domain in living animals and ease of genetic approaches. Described here are methods to study prion-like behavior of protein aggregates and to identify modifiers of prion-induced toxicity using C. elegans.
Cellular Biology, Issue 95, Caenorhabditis elegans, neurodegenerative diseases, protein misfolding diseases, prion-like spreading, cell-to-cell transmission, protein aggregation, non-cell autonomous toxicity, proteostasis
Micro 3D Printing Using a Digital Projector and its Application in the Study of Soft Materials Mechanics
Institutions: Massachusetts Institute of Technology.
Buckling is a classical topic in mechanics. While buckling has long been studied as one of the major structural failure modes [1], it has recently drawn new attention as a unique mechanism for pattern transformation. Nature is full of examples in which a wealth of exotic patterns are formed through mechanical instability [2-5]. Inspired by this elegant mechanism, many studies have demonstrated the creation and transformation of patterns using soft materials such as elastomers and hydrogels [6-11]. Swelling gels are of particular interest because they can spontaneously trigger mechanical instability to create various patterns without the need for external force [6-10]. Recently, we reported full control over the buckling pattern of micro-scale tubular gels using projection micro-stereolithography (PμSL), a three-dimensional (3D) manufacturing technology capable of rapidly converting computer-generated 3D models into physical objects at high resolution [12,13]. Here we present a simple method to build a simplified PμSL system using a commercially available digital data projector to study swelling-induced buckling instability for controlled pattern transformation.
A simple desktop 3D printer is built using an off-the-shelf digital data projector and simple optical components such as a convex lens and a mirror [14]. Cross-sectional images extracted from a 3D solid model are projected onto the photosensitive resin surface in sequence, polymerizing liquid resin into the desired 3D solid structure in a layer-by-layer fashion. Even with this simple configuration and easy process, arbitrary 3D objects can be readily fabricated with sub-100 μm resolution.
This desktop 3D printer holds potential for the study of soft-material mechanics by offering a great opportunity to explore various 3D geometries. We use this system to fabricate tubular hydrogel structures with different dimensions. Fixed to the substrate at the bottom, the tubular gel develops inhomogeneous stress during swelling, which gives rise to buckling instability. Various wavy patterns appear along the circumference of the tube when the gel structures undergo buckling. Experiments show that circumferential buckling of a desired mode can be created in a controlled manner. Pattern transformation of three-dimensionally structured tubular gels has significant implications not only in mechanics and materials science, but also in many other emerging fields such as tunable metamaterials.
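The layer-by-layer principle can be sketched by generating the binary cross-section masks that the projector would display for a simple solid (a sphere here; the geometry, grid, and layer height are illustrative):

```python
import numpy as np

def slice_mask(z, radius=10.0, grid=64):
    """Binary cross-section of a sphere at height z: the black-and-white
    image projected to cure one layer of resin."""
    if abs(z) >= radius:
        return np.zeros((grid, grid), dtype=bool)
    r_z = (radius ** 2 - z ** 2) ** 0.5       # radius of the circular slice
    lin = np.linspace(-radius, radius, grid)
    x, y = np.meshgrid(lin, lin)
    return x ** 2 + y ** 2 <= r_z ** 2

# One projection mask per 0.5-unit layer, bottom to top
masks = [slice_mask(z) for z in np.arange(-10.0, 10.5, 0.5)]
```

A real PμSL run would rasterize each mask to the projector, cure the layer, lower the stage by one layer height, and repeat; the z-step and in-plane grid together set the achievable resolution.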
Mechanical Engineering, Issue 69, Materials Science, Physics, Chemical Engineering, 3D printing, stereo-lithography, photo-polymerization, gel, swelling, elastic instability, buckling, pattern formation
Reconstruction of 3-Dimensional Histology Volume and its Application to Study Mouse Mammary Glands
Institutions: University of Toronto, Sunnybrook Research Institute, Medical University of South Carolina, University of Manitoba.
Histology volume reconstruction facilitates the study of 3D shape and volume change of an organ at the level of macrostructures made up of cells. It can also be used to investigate and validate novel techniques and algorithms in volumetric medical imaging and therapies. Creating 3D high-resolution atlases of different organs [1-3] is another application of histology volume reconstruction, providing a resource for investigating tissue structures and the spatial relationships between various cellular features. We present an image registration approach for histology volume reconstruction which uses a set of optical blockface images. The reconstructed histology volume represents a reliable shape of the processed specimen with no propagated post-processing registration error. The Hematoxylin and Eosin (H&E) stained sections of two mouse mammary glands were registered to their corresponding blockface images using boundary points extracted from the edges of the specimen in the histology and blockface images. The accuracy of the registration was visually evaluated. The alignment of the macrostructures of the mammary glands was also visually assessed at high resolution.
This study delineates the different steps of this image registration pipeline, ranging from excision of the mammary gland through to 3D histology volume reconstruction. While 2D histology images reveal the structural differences between pairs of sections, the 3D histology volume provides the ability to visualize the differences in shape and volume of the mammary glands.
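At its core, registering boundary points to a blockface image is a paired-point rigid alignment; a generic sketch using the Kabsch/Procrustes least-squares solution (a stand-in for the authors' registration step, not their implementation):

```python
import numpy as np

def rigid_register(moving, fixed):
    """Least-squares rotation + translation aligning paired 2D boundary
    points (Kabsch algorithm): fixed ~ moving @ r.T + t."""
    mu_m, mu_f = moving.mean(axis=0), fixed.mean(axis=0)
    h = (moving - mu_m).T @ (fixed - mu_f)        # cross-covariance
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))        # guard against reflection
    r = vt.T @ np.diag([1.0, d]) @ u.T
    t = mu_f - r @ mu_m
    return r, t

# Recover a known 0.3 rad rotation and (2, -1) translation
theta = 0.3
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
pts = np.array([[0.0, 0.0], [5.0, 0.0], [0.0, 3.0], [4.0, 4.0]])
target = pts @ rot.T + np.array([2.0, -1.0])
r, t = rigid_register(pts, target)
```

Registering each section independently to its own blockface image, as described above, is what prevents section-to-section registration errors from propagating along the stack.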
Bioengineering, Issue 89, Histology Volume Reconstruction, Transgenic Mouse Model, Image Registration, Digital Histology, Image Processing, Mouse Mammary Gland
Transcript and Metabolite Profiling for the Evaluation of Tobacco Tree and Poplar as Feedstock for the Bio-based Industry
Institutions: Max Planck Institute for Molecular Plant Physiology, Royal Holloway, University of London, VIB, UGhent, ETH Zurich, EMPA, Royal Institute of Technology (KTH), European Research and Project Office GmbH, ABBA Gaia S.L., Pflanzenöltechnologie, Capax Environmental Services, Green Fuels, Neutral Consulting Ltd, University of Melbourne.
The global demand for food, feed, energy, and water poses extraordinary challenges for future generations. It is evident that robust platforms for the exploration of renewable resources are necessary to overcome these challenges. Within the multinational framework MultiBioPro we are developing biorefinery pipelines to maximize the use of plant biomass. More specifically, we use poplar and tobacco tree (Nicotiana glauca) as target crop species for improving saccharification, isoprenoid, long-chain hydrocarbon contents, fiber quality, and suberin and lignin contents. The methods used to obtain these outputs include GC-MS, LC-MS, and RNA sequencing platforms. The metabolite pipelines are well-established tools for generating these types of data, but have the limitation that only well-characterized metabolites can be used. Deep sequencing will allow us to include all transcripts present during the developmental stages of the tobacco tree leaf, but the reads have to be mapped back to the sequence of Nicotiana tabacum. With these set-ups, we aim at a basic understanding of the underlying processes and at establishing an industrial framework to exploit the outcomes. In the longer term, we believe that the data generated here will provide the means for a sustainable biorefinery process using poplar and tobacco tree as raw materials. To date, the basal levels of metabolites in the samples have been analyzed, and the protocols utilized are provided in this article.
Environmental Sciences, Issue 87, botany, plants, Biorefining, Poplar, Tobacco tree, Arabidopsis, suberin, lignin, cell walls, biomass, long-chain hydrocarbons, isoprenoids, Nicotiana glauca, systems biology
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
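The quoted ~10-30 nm precision can be related to photon counts with the widely used estimate of Thompson et al. (2002); the parameter values below are hypothetical illustrations, not measurements from this protocol:

```python
import numpy as np

def localization_precision_nm(s, n_photons, pixel, bg):
    """Approximate lateral localization precision (Thompson et al., 2002):
    sigma^2 = s^2/N + a^2/(12 N) + 8*pi*s^4*b^2/(a^2*N^2), where s is the
    PSF standard deviation, a the pixel size, N the detected photon count,
    and b the background photons per pixel (all lengths in nm)."""
    return np.sqrt(s**2 / n_photons
                   + pixel**2 / (12 * n_photons)
                   + 8 * np.pi * s**4 * bg**2 / (pixel**2 * n_photons**2))

# e.g. a 120 nm PSF, 100 detected photons, 100 nm pixels, and 1 background
# photon per pixel land in the ~10-30 nm regime quoted above
sigma = localization_precision_nm(s=120.0, n_photons=100, pixel=100.0, bg=1.0)
print(f"{sigma:.1f} nm")
```

The dominant term is the shot-noise contribution s/sqrt(N), which is why maximizing detected photons per molecule matters more than optics alone.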
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Phage Phenomics: Physiological Approaches to Characterize Novel Viral Proteins
Institutions: San Diego State University, San Diego State University, San Diego State University, San Diego State University, San Diego State University, Argonne National Laboratory, Broad Institute.
Current investigations into phage-host interactions are dependent on extrapolating knowledge from (meta)genomes. Interestingly, 60 - 95% of all phage sequences share no homology to currently annotated proteins. As a result, a large proportion of phage genes are annotated as hypothetical. This reality heavily affects the annotation of both structural and auxiliary metabolic genes. Here we present phenomic methods designed to capture the physiological response(s) of a selected host during expression of one of these unknown phage genes. Multi-phenotype Assay Plates (MAPs) are used to monitor the diversity of host substrate utilization and subsequent biomass formation, while metabolomics provides by-product analysis by monitoring metabolite abundance and diversity. Both tools are used simultaneously to provide a phenotypic profile associated with expression of a single putative phage open reading frame (ORF). Representative results for both methods are compared, highlighting the phenotypic profile differences of a host carrying either putative structural or metabolic phage genes. In addition, the visualization techniques and high throughput computational pipelines that facilitated experimental analysis are presented.
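One common way to reduce a MAP growth curve to a single phenotype value is the area under the curve (AUC). The sketch below uses hypothetical OD600 readings and a plain trapezoidal sum; it is illustrative only and not part of the published pipeline:

```python
import numpy as np

# Hypothetical hourly OD600 readings: a host expressing a putative phage
# ORF (delayed growth) vs. an empty-vector control on the same substrate.
hours = np.arange(0.0, 24.0)
orf = 0.05 + 0.9 / (1.0 + np.exp(-(hours - 12.0) / 2.0))
control = 0.05 + 0.9 / (1.0 + np.exp(-(hours - 8.0) / 2.0))

def auc(t, y):
    """Trapezoidal area under a growth curve."""
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(t)))

# A lower AUC for the ORF-carrying host suggests that expression of the
# phage gene impairs growth on this substrate.
auc_orf, auc_control = auc(hours, orf), auc(hours, control)
print(auc_orf < auc_control)
```

Comparing AUCs across all wells of a plate yields the substrate-by-substrate phenotypic profile described in the text.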
Immunology, Issue 100, phenomics, phage, viral metagenome, Multi-phenotype Assay Plates (MAPs), continuous culture, metabolomics
Scalable High Throughput Selection From Phage-displayed Synthetic Antibody Libraries
Institutions: The Recombinant Antibody Network, University of Toronto, University of California, San Francisco at Mission Bay, The University of Chicago.
The demand for antibodies that fulfill the needs of both basic and clinical research applications is high and will dramatically increase in the future. However, it is apparent that traditional monoclonal technologies are not, on their own, up to this task. This has led to the development of alternate methods to satisfy the demand for high-quality and renewable affinity reagents to all accessible elements of the proteome. Toward this end, high throughput methods for conducting selections from phage-displayed synthetic antibody libraries have been devised for applications involving diverse antigens and optimized for rapid throughput and success. Herein, a protocol is described in detail that illustrates with video demonstration the parallel selection of Fab-phage clones from high-diversity libraries against hundreds of targets using either a manual 96-channel liquid handler or an automated robotics system. Using this protocol, a single user can generate hundreds of antigens, select antibodies against them in parallel, and validate antibody binding within 6-8 weeks. Highlighted are: i) a viable antigen format, ii) pre-selection antigen characterization, iii) critical steps that influence the selection of specific and high-affinity clones, and iv) ways of monitoring selection effectiveness and early-stage antibody clone characterization. With this approach, we have obtained synthetic antibody fragments (Fabs) to many target classes including single-pass membrane receptors, secreted protein hormones, and multi-domain intracellular proteins. These fragments are readily converted to full-length antibodies and have been validated to exhibit high affinity and specificity. Further, they have been demonstrated to be functional in a variety of standard immunoassays including Western blotting, ELISA, cellular immunofluorescence, immunoprecipitation, and related assays. This methodology will accelerate antibody discovery and ultimately bring us closer to realizing the goal of generating renewable, high-quality antibodies to the proteome.
Immunology, Issue 95, Bacteria, Viruses, Amino Acids, Peptides, and Proteins, Nucleic Acids, Nucleotides, and Nucleosides, Life Sciences (General), phage display, synthetic antibodies, high throughput, antibody selection, scalable methodology
A Coupled Experiment-finite Element Modeling Methodology for Assessing High Strain Rate Mechanical Response of Soft Biomaterials
Institutions: Mississippi State University, Mississippi State University.
This study offers a combined experimental and finite element (FE) simulation approach for examining the mechanical behavior of soft biomaterials (e.g. brain, liver, tendon, fat, etc.) when exposed to high strain rates. This study utilized a Split-Hopkinson Pressure Bar (SHPB) to generate strain rates of 100-1,500 sec⁻¹. The SHPB employed a striker bar consisting of a viscoelastic material (polycarbonate). A sample of the biomaterial was obtained shortly postmortem and prepared for SHPB testing. The specimen was interposed between the incident and transmitted bars, and the pneumatic components of the SHPB were activated to drive the striker bar toward the incident bar. The resulting impact generated a compressive stress wave (i.e. incident wave) that traveled through the incident bar. When the compressive stress wave reached the end of the incident bar, a portion continued forward through the sample and transmitted bar (i.e. transmitted wave) while another portion reversed through the incident bar as a tensile wave (i.e. reflected wave). These waves were measured using strain gages mounted on the incident and transmitted bars. The true stress-strain behavior of the sample was determined from equations based on wave propagation and dynamic force equilibrium. The experimental stress-strain response was three-dimensional in nature because the specimen bulged. As such, the hydrostatic stress (first invariant) was used to generate the stress-strain response. In order to extract the uniaxial (one-dimensional) mechanical response of the tissue, an iterative coupled optimization was performed using experimental results and Finite Element Analysis (FEA), which contained an Internal State Variable (ISV) material model used for the tissue. The ISV material model used in the FE simulations of the experimental setup was iteratively calibrated (i.e. optimized) to the experimental data such that the experiment and FEA strain gage values and first invariants of stress were in good agreement.
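For reference, the classical one-dimensional wave relations behind this reduction, written for an elastic bar of modulus \(E_b\), cross-section \(A_b\), and wave speed \(c_0\), with a specimen of length \(L_s\) and area \(A_s\), give the specimen strain rate, strain, and stress from the reflected and transmitted gage signals \(\varepsilon_r\) and \(\varepsilon_t\). Note that with viscoelastic (polycarbonate) bar components, these elastic-bar relations additionally require a correction for viscoelastic wave dispersion:

```latex
\dot{\varepsilon}(t) = -\frac{2 c_0}{L_s}\,\varepsilon_r(t), \qquad
\varepsilon(t) = -\frac{2 c_0}{L_s}\int_0^{t}\varepsilon_r(\tau)\,d\tau, \qquad
\sigma(t) = E_b\,\frac{A_b}{A_s}\,\varepsilon_t(t)
```

These relations assume dynamic force equilibrium across the specimen, which is why the incident/reflected and transmitted signals must be checked for consistency before the stress-strain curve is accepted.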
Bioengineering, Issue 99, Split-Hopkinson Pressure Bar, High Strain Rate, Finite Element Modeling, Soft Biomaterials, Dynamic Experiments, Internal State Variable Modeling, Brain, Liver, Tendon, Fat
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences in WM involvement patterns across different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls.
DTI data analysis is performed in a multivariate fashion, i.e. voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA maps after stereotaxic alignment, in a longitudinal analysis on an individual subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by controlled elimination of gradient directions with high noise levels.
In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
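As a concrete anchor for the FA metric used throughout, here is the standard computation from diffusion tensor eigenvalues; the eigenvalue magnitudes below are typical made-up values, not data from this study:

```python
import numpy as np

def fractional_anisotropy(eigvals):
    """FA = sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda|| for the
    three eigenvalues of a diffusion tensor; 0 = isotropic diffusion,
    1 = maximally anisotropic (single-direction) diffusion."""
    l = np.asarray(eigvals, dtype=float)
    denom = np.sqrt(np.sum(l**2))
    if denom == 0.0:
        return 0.0
    return float(np.sqrt(1.5) * np.sqrt(np.sum((l - l.mean())**2)) / denom)

fa_wm = fractional_anisotropy([1.7e-3, 0.3e-3, 0.3e-3])   # WM-like: high FA
fa_iso = fractional_anisotropy([1.0e-3, 1.0e-3, 1.0e-3])  # isotropic -> 0.0
print(round(fa_wm, 2), fa_iso)
```

Because FA is a ratio, it is independent of the overall diffusivity scale, which is what makes voxelwise group comparison after spatial normalization meaningful.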
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
High Throughput Quantitative Expression Screening and Purification Applied to Recombinant Disulfide-rich Venom Proteins Produced in E. coli
Institutions: Aix-Marseille Université, Commissariat à l'énergie atomique et aux énergies alternatives (CEA) Saclay, France.
Escherichia coli (E. coli) is the most widely used expression system for the production of recombinant proteins for structural and functional studies. However, purifying proteins is sometimes challenging since many proteins are expressed in an insoluble form. When working with difficult or multiple targets it is therefore recommended to use high throughput (HTP) protein expression screening on a small scale (1-4 ml cultures) to quickly identify conditions for soluble expression. To cope with the various structural genomics programs of the lab, a quantitative (within a range of 0.1-100 mg/L culture of recombinant protein) and HTP protein expression screening protocol was implemented and validated on thousands of proteins. The protocols were automated with the use of a liquid handling robot but can also be performed manually without specialized equipment.
Disulfide-rich venom proteins are gaining increasing recognition for their potential as therapeutic drug leads. They can be highly potent and selective, but their complex disulfide bond networks make them challenging to produce. As a member of the FP7 European Venomics project (www.venomics.eu), our challenge is to develop successful production strategies with the aim of producing thousands of novel venom proteins for functional characterization. Aided by the redox properties of the disulfide bond isomerase DsbC, we adapted our HTP production pipeline for the expression of oxidized, functional venom peptides in the E. coli cytoplasm. The protocols are also applicable to the production of diverse disulfide-rich proteins. Here we demonstrate our pipeline applied to the production of animal venom proteins. With the protocols described herein it is likely that soluble disulfide-rich proteins will be obtained in as little as a week. Even at a small scale, there is the potential to use the purified proteins for validating the oxidation state by mass spectrometry, for characterization in pilot studies, or for sensitive micro-assays.
Bioengineering, Issue 89, E. coli, expression, recombinant, high throughput (HTP), purification, auto-induction, immobilized metal affinity chromatography (IMAC), tobacco etch virus protease (TEV) cleavage, disulfide bond isomerase C (DsbC) fusion, disulfide bonds, animal venom proteins/peptides
Purifying the Impure: Sequencing Metagenomes and Metatranscriptomes from Complex Animal-associated Samples
Institutions: San Diego State University, DOE Joint Genome Institute, University of Colorado, University of Colorado.
The accessibility of high-throughput sequencing has revolutionized many fields of biology. In order to better understand host-associated viral and microbial communities, a comprehensive workflow for DNA and RNA extraction was developed. The workflow concurrently generates viral and microbial metagenomes, as well as metatranscriptomes, from a single sample for next-generation sequencing. The coupling of these approaches provides an overview of both the taxonomical characteristics and the community-encoded functions. The presented methods use Cystic Fibrosis (CF) sputum, a problematic sample type, because it is exceptionally viscous and contains high amounts of mucins, free neutrophil DNA, and other unknown contaminants. The protocols described here target these problems and successfully recover viral and microbial DNA with minimal human DNA contamination. To complement the metagenomics studies, a metatranscriptomics protocol was optimized to recover both microbial and host mRNA that contains relatively few ribosomal RNA (rRNA) sequences. An overview of the data characteristics is presented to serve as a reference for assessing the success of the methods. Additional CF sputum samples were also collected to (i) evaluate the consistency of the microbiome profiles across seven consecutive days within a single patient, and (ii) compare the consistency of the metagenomic approach with 16S ribosomal RNA gene-based sequencing. The results showed that daily fluctuation of microbial profiles without antibiotic perturbation was minimal, and the taxonomy profiles of the common CF-associated bacteria were highly similar between the 16S rDNA libraries and metagenomes generated from the hypotonic lysis (HL)-derived DNA. However, the differences between 16S rDNA taxonomical profiles generated from total DNA and HL-derived DNA suggest that the hypotonic lysis and washing steps help to remove not only human-derived DNA but also microbial extracellular DNA that may misrepresent the actual microbial profiles.
Molecular Biology, Issue 94, virome, microbiome, metagenomics, metatranscriptomics, cystic fibrosis, mucosal-surface
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
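As a toy illustration of category (4), the snippet below applies a global Otsu threshold to separate a bright structure from background in a synthetic volume. This is a generic textbook method, not one of the custom algorithms developed in the study, and real EM volumes typically need far more than a single global threshold:

```python
import numpy as np

def otsu_threshold(img, nbins=256):
    """Global Otsu threshold: choose the gray level that maximizes the
    between-class variance between background and foreground voxels."""
    hist, edges = np.histogram(img.ravel(), bins=nbins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p = hist / hist.sum()
    omega = np.cumsum(p)                 # class-0 (background) probability
    mu = np.cumsum(p * centers)          # cumulative class mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return centers[np.nanargmax(np.where(np.isfinite(sigma_b), sigma_b, np.nan))]

# synthetic volume: dim noisy background plus a brighter cubic "organelle"
rng = np.random.default_rng(0)
vol = rng.normal(0.2, 0.05, size=(32, 32, 32))
vol[8:24, 8:24, 8:24] = rng.normal(0.8, 0.05, size=(16, 16, 16))
t = otsu_threshold(vol)
mask = vol > t      # binary segmentation for surface rendering
```

Global thresholding works here only because the synthetic intensity distribution is cleanly bimodal; low signal-to-noise or crowded features, as discussed above, quickly push a data set toward the manual or semi-automated categories.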
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
Measuring the Bending Stiffness of Bacterial Cells Using an Optical Trap
Institutions: Princeton University, Princeton University.
We developed a protocol to measure the bending rigidity of filamentous rod-shaped bacteria. Forces are applied with an optical trap, a microscopic three-dimensional spring made of light that is formed when a high-intensity laser beam is focused to a very small spot by a microscope's objective lens. To bend a cell, we first bind live bacteria to a chemically-treated coverslip. As these cells grow, the middle of the cells remains bound to the coverslip but the growing ends are free of this restraint. By inducing filamentous growth with the drug cephalexin, we are able to identify cells in which one end of the cell was stuck to the surface while the other end remained unattached and susceptible to bending forces. A bending force is then applied with an optical trap by binding a polylysine-coated bead to the tip of a growing cell. Both the force and the displacement of the bead are recorded and the bending stiffness of the cell is the slope of this relationship.
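The final step, extracting the bending stiffness as the slope of the force-displacement relation, can be sketched with a linear least-squares fit. The readings below are hypothetical numbers for illustration, not measured data:

```python
import numpy as np

# Hypothetical trap force (pN) vs. bead displacement (um) readings.
displacement_um = np.array([0.0, 0.1, 0.2, 0.3, 0.4, 0.5])
force_pN = np.array([0.00, 0.52, 0.98, 1.55, 2.01, 2.49])

# The bending stiffness is the slope of the force-displacement line;
# a near-zero intercept indicates the unloaded baseline was well zeroed.
stiffness, intercept = np.polyfit(displacement_um, force_pN, 1)
print(f"bending stiffness ~ {stiffness:.2f} pN/um")
```

In practice one would also verify that the response stays in the linear (small-deflection) regime before reporting a single stiffness value.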
Microbiology, Issue 38, optical trap, cell mechanics, E. coli, cell bending
Patient-specific Modeling of the Heart: Estimation of Ventricular Fiber Orientations
Institutions: Johns Hopkins University.
Patient-specific simulations of heart (dys)function aimed at personalizing cardiac therapy are hampered by the absence of in vivo imaging technology for clinically acquiring myocardial fiber orientations. The objective of this project was to develop a methodology to estimate cardiac fiber orientations from in vivo images of patient heart geometries. An accurate representation of ventricular geometry and fiber orientations was reconstructed, respectively, from high-resolution ex vivo structural magnetic resonance (MR) and diffusion tensor (DT) MR images of a normal human heart, referred to as the atlas. Ventricular geometry of a patient heart was extracted, via semiautomatic segmentation, from an in vivo computed tomography (CT) image. Using image transformation algorithms, the atlas ventricular geometry was deformed to match that of the patient. Finally, the deformation field was applied to the atlas fiber orientations to obtain an estimate of patient fiber orientations. The accuracy of the fiber estimates was assessed using six normal and three failing canine hearts. The mean absolute difference between inclination angles of acquired and estimated fiber orientations was 15.4°. Computational simulations of ventricular activation maps and pseudo-ECGs in sinus rhythm and ventricular tachycardia indicated that there are no significant differences between estimated and acquired fiber orientations at a clinically observable level. The new insights obtained from the project will pave the way for the development of patient-specific models of the heart that can aid physicians in personalized diagnosis and decisions regarding electrophysiological interventions.
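The reported error metric, the mean absolute difference between inclination angles, can be computed as below. Since fiber orientations are axial quantities (defined modulo 180°), differences must be wrapped; the angle values here are hypothetical:

```python
import numpy as np

def mean_abs_angle_diff(a_deg, b_deg):
    """Mean absolute difference between inclination angles (degrees),
    wrapping because fiber axes are defined only modulo 180 degrees."""
    d = np.abs(np.asarray(a_deg) - np.asarray(b_deg)) % 180.0
    return float(np.mean(np.minimum(d, 180.0 - d)))

# hypothetical acquired vs. estimated inclination angles (degrees)
acquired = np.array([10.0, 80.0, -60.0, 45.0])
estimated = np.array([25.0, -85.0, -50.0, 40.0])
print(mean_abs_angle_diff(acquired, estimated))  # 80 vs -85 wraps to 15
```

Without the wrap, the 80° vs. -85° pair would contribute an error of 165° instead of 15° and badly inflate the mean.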
Bioengineering, Issue 71, Biomedical Engineering, Medicine, Anatomy, Physiology, Cardiology, Myocytes, Cardiac, Image Processing, Computer-Assisted, Magnetic Resonance Imaging, MRI, Diffusion Magnetic Resonance Imaging, Cardiac Electrophysiology, computerized simulation (general), mathematical modeling (systems analysis), Cardiomyocyte, biomedical image processing, patient-specific modeling, Electrophysiology, simulation
The Preparation of Electrohydrodynamic Bridges from Polar Dielectric Liquids
Institutions: Wetsus - Centre of Excellence for Sustainable Water Technology, IRCAM GmbH, Graz University of Technology.
Horizontal and vertical liquid bridges are simple and powerful tools for exploring the interaction of high intensity electric fields (8-20 kV/cm) and polar dielectric liquids. These bridges are unique from capillary bridges in that they exhibit extensibility beyond a few millimeters, have complex bi-directional mass transfer patterns, and emit non-Planck infrared radiation. A number of common solvents can form such bridges as well as low conductivity solutions and colloidal suspensions. The macroscopic behavior is governed by electrohydrodynamics and provides a means of studying fluid flow phenomena without the presence of rigid walls. Prior to the onset of a liquid bridge several important phenomena can be observed including advancing meniscus height (electrowetting), bulk fluid circulation (the Sumoto effect), and the ejection of charged droplets (electrospray). The interaction between surface, polarization, and displacement forces can be directly examined by varying applied voltage and bridge length. The electric field, assisted by gravity, stabilizes the liquid bridge against Rayleigh-Plateau instabilities. Construction of basic apparatus for both vertical and horizontal orientation along with operational examples, including thermographic images, for three liquids (e.g., water, DMSO, and glycerol) is presented.
Physics, Issue 91, floating water bridge, polar dielectric liquids, liquid bridge, electrohydrodynamics, thermography, dielectrophoresis, electrowetting, Sumoto effect, Armstrong effect