JoVE Visualize
Related JoVE Video
Pubmed Article
Optimization of tagged MRI for quantification of liver stiffness using computer simulated data.
PUBLISHED: 01-01-2014
The heartbeat has been proposed as an intrinsic source of motion that can be used in combination with tagged Magnetic Resonance Imaging (MRI) to measure displacements induced in the liver as an index of liver stiffness. Optimizing a tagged MRI acquisition protocol in terms of sensitivity to these displacements, which are in the order of pixel size, is necessary to develop the method as a quantification tool for staging fibrosis. We reproduced a study of cardiac-induced strain in the liver at 3T and simulated tagged MR images with different grid tag patterns to evaluate the performance of the Harmonic Phase (HARP) image analysis method and its dependence on the parameters of tag spacing and grid angle. The Partial Volume Effect (PVE), T1 relaxation, and different levels of noise were taken into account. Four displacement fields of increasing intensity were created and applied to the tagged MR images of the liver. These fields simulated the deformation at different liver stiffnesses. An Error Index (EI) was calculated to evaluate the estimation accuracy for various parameter values. In the absence of noise, the estimation accuracy of the displacement fields increased as tag spacings decreased. EIs for each of the four displacement fields were lower at 0° and the local minima of the EI were found to correspond to multiples of pixel size. The accuracy of the estimation decreased for increasing levels of added noise; as the level increased, the improved estimation caused by decreasing the tag spacing tended to zero. The optimal tag spacing turned out to be a compromise between the smallest tag period that is a multiple of the pixel size and is achievable in a real acquisition and the tag spacing that guarantees an accurate liver displacement measure in the presence of realistic levels of noise.
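The optimization above scores HARP-estimated displacement fields against the applied (ground-truth) fields through an Error Index (EI). The abstract does not give the exact EI definition, so the sketch below uses a generic normalized RMS error as a stand-in, applied to hypothetical displacement fields; it is an illustration of the evaluation idea, not the authors' metric.

```python
import numpy as np

def error_index(estimated, applied):
    """Generic stand-in for an Error Index: normalized RMS difference between
    the estimated and the applied (ground-truth) displacement fields.
    The EI used in the paper may be defined differently."""
    estimated = np.asarray(estimated, dtype=float)
    applied = np.asarray(applied, dtype=float)
    rms_error = np.sqrt(np.mean((estimated - applied) ** 2))
    rms_applied = np.sqrt(np.mean(applied ** 2))
    return rms_error / rms_applied

# Hypothetical example: a 64x64 sub-pixel displacement field (in pixels)
# and a noisy "estimate" of it.
rng = np.random.default_rng(0)
applied = 0.8 * np.ones((64, 64))                       # uniform 0.8-pixel displacement
estimated = applied + 0.05 * rng.standard_normal((64, 64))
print(f"EI = {error_index(estimated, applied):.3f}")
```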
Authors: Fijoy Vadakkumpadan, Hermenegild Arevalo, Natalia A. Trayanova.
Published: 01-08-2013
Patient-specific simulations of heart (dys)function aimed at personalizing cardiac therapy are hampered by the absence of in vivo imaging technology for clinically acquiring myocardial fiber orientations. The objective of this project was to develop a methodology to estimate cardiac fiber orientations from in vivo images of patient heart geometries. An accurate representation of ventricular geometry and fiber orientations was reconstructed, respectively, from high-resolution ex vivo structural magnetic resonance (MR) and diffusion tensor (DT) MR images of a normal human heart, referred to as the atlas. Ventricular geometry of a patient heart was extracted, via semiautomatic segmentation, from an in vivo computed tomography (CT) image. Using image transformation algorithms, the atlas ventricular geometry was deformed to match that of the patient. Finally, the deformation field was applied to the atlas fiber orientations to obtain an estimate of patient fiber orientations. The accuracy of the fiber estimates was assessed using six normal and three failing canine hearts. The mean absolute difference between inclination angles of acquired and estimated fiber orientations was 15.4°. Computational simulations of ventricular activation maps and pseudo-ECGs in sinus rhythm and ventricular tachycardia indicated that there are no significant differences between estimated and acquired fiber orientations at a clinically observable level. The new insights obtained from the project will pave the way for the development of patient-specific models of the heart that can aid physicians in personalized diagnosis and decisions regarding electrophysiological interventions.
25 Related JoVE Articles!
Measuring Ascending Aortic Stiffness In Vivo in Mice Using Ultrasound
Authors: Maggie M. Kuo, Viachaslau Barodka, Theodore P. Abraham, Jochen Steppan, Artin A. Shoukas, Mark Butlin, Alberto Avolio, Dan E. Berkowitz, Lakshmi Santhanam.
Institutions: Johns Hopkins University, Johns Hopkins University, Johns Hopkins University, Macquarie University.
We present a protocol for measuring in vivo aortic stiffness in mice using high-resolution ultrasound imaging. Aortic diameter is measured by ultrasound and aortic blood pressure is measured invasively with a solid-state pressure catheter. Blood pressure is raised and then lowered incrementally by intravenous infusion of the vasoactive drugs phenylephrine and sodium nitroprusside. Aortic diameter is measured for each pressure step to characterize the pressure-diameter relationship of the ascending aorta. Stiffness indices derived from the pressure-diameter relationship can be calculated from the data collected. Calculation of arterial compliance is described in this protocol. This technique can be used to investigate mechanisms underlying increased aortic stiffness associated with cardiovascular disease and aging. The technique produces a physiologically relevant measure of stiffness compared to ex vivo approaches because physiological influences on aortic stiffness are incorporated in the measurement. The primary limitation of this technique is the measurement error introduced by the movement of the aorta during the cardiac cycle. This motion can be compensated for by adjusting the location of the probe to follow the aortic movement, as well as by making multiple measurements of the aortic pressure-diameter relationship and expanding the experimental group size.
Medicine, Issue 94, Aortic stiffness, ultrasound, in vivo, aortic compliance, elastic modulus, mouse model, cardiovascular disease
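The stiffness indices in the aortic-stiffness protocol above are derived from the pressure-diameter relationship. As a minimal sketch, compliance and distensibility can be computed from paired pressure and diameter readings as below; the readings and the specific indices shown are illustrative assumptions, not the authors' data or their exact formulas.

```python
import numpy as np

# Hypothetical paired readings from one pressure ramp:
# aortic pressure (mmHg, solid-state catheter) and ascending-aortic
# diameter (mm, ultrasound) at each pressure step.
pressure_mmHg = np.array([60.0, 80.0, 100.0, 120.0, 140.0])
diameter_mm   = np.array([1.30, 1.38, 1.45, 1.50, 1.53])

# Standard definitions (not necessarily the authors' exact indices):
# diameter compliance C_d = dD/dP, cross-sectional compliance C_a = dA/dP
# with A = pi*D^2/4, and distensibility = C_a / A at the lowest pressure.
area_mm2 = np.pi * diameter_mm ** 2 / 4.0

C_d = np.polyfit(pressure_mmHg, diameter_mm, 1)[0]    # mm / mmHg (least-squares slope)
C_a = np.polyfit(pressure_mmHg, area_mm2, 1)[0]       # mm^2 / mmHg
distensibility = C_a / area_mm2[0]                    # 1 / mmHg

print(f"diameter compliance        : {C_d:.4f} mm/mmHg")
print(f"cross-sectional compliance : {C_a:.4f} mm^2/mmHg")
print(f"distensibility             : {distensibility:.5f} 1/mmHg")
```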
In vivo 19F MRI for Cell Tracking
Authors: Mangala Srinivas, Philipp Boehm-Sturm, Markus Aswendt, Eberhard D. Pracht, Carl G. Figdor, I. Jolanda de Vries, Mathias Hoehn.
Institutions: Radboud University Medical Center, Max Planck Institute for Neurological Research, German Center for Neurodegenerative Diseases (DZNE).
In vivo 19F MRI allows quantitative cell tracking without the use of ionizing radiation. It is a noninvasive technique that can be applied to humans. Here, we describe a general protocol for cell labeling, imaging, and image processing. The technique is applicable to various cell types and animal models, although here we focus on a typical mouse model for tracking murine immune cells. The most important issues for cell labeling are described, as these are relevant to all models. Similarly, key imaging parameters are listed, although the details will vary depending on the MRI system and the individual setup. Finally, we include an image processing protocol for quantification. Variations for this, and other parts of the protocol, are assessed in the Discussion section. Based on the detailed procedure described here, the user will need to adapt the protocol for each specific cell type, cell label, animal model, and imaging setup. Note that the protocol can also be adapted for human use, as long as clinical restrictions are met.
Medicine, Issue 81, Animal Models, Immune System Diseases, MRI, 19F MRI, Cell Tracking, Quantification, Cell Label, In vivo Imaging
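Quantification in 19F MRI cell tracking is commonly done by comparing the 19F signal in a region of interest against a reference of known fluorine content and dividing by the average label per cell. The protocol above defers the details to its image-processing section; the sketch below only illustrates that general idea, with entirely hypothetical numbers.

```python
def estimate_cell_number(signal_roi, signal_ref, f19_spins_ref, f19_spins_per_cell):
    """Estimate the number of labeled cells from 19F MR signal by comparison
    with an external reference of known 19F content (a common quantification
    scheme; the workflow in the protocol above may differ in detail)."""
    spins_in_roi = signal_roi / signal_ref * f19_spins_ref
    return spins_in_roi / f19_spins_per_cell

# Hypothetical values: ROI and reference signals (a.u.), 19F spins in the
# reference capillary, and mean 19F load per cell from a separate calibration.
cells = estimate_cell_number(signal_roi=2.4e3,
                             signal_ref=1.0e3,
                             f19_spins_ref=5.0e18,
                             f19_spins_per_cell=1.0e13)
print(f"~{cells:.2e} labeled cells in the ROI")
```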
In situ Compressive Loading and Correlative Noninvasive Imaging of the Bone-periodontal Ligament-tooth Fibrous Joint
Authors: Andrew T. Jang, Jeremy D. Lin, Youngho Seo, Sergey Etchin, Arno Merkle, Kevin Fahey, Sunita P. Ho.
Institutions: University of California San Francisco, University of California San Francisco, Xradia Inc..
This study demonstrates a novel biomechanics testing protocol. The advantages of this protocol include the use of an in situ loading device coupled to a high resolution X-ray microscope, thus enabling visualization of internal structural elements under simulated physiological loads and wet conditions. Experimental specimens will include intact bone-periodontal ligament (PDL)-tooth fibrous joints. Results will illustrate three important features of the protocol as they can be applied to organ-level biomechanics: 1) reactionary force vs. displacement: tooth displacement within the alveolar socket and its reactionary response to loading, 2) three-dimensional (3D) spatial configuration and morphometrics: geometric relationship of the tooth with the alveolar socket, and 3) changes in readouts 1 and 2 due to a change in loading axis, i.e. from concentric to eccentric loads. Efficacy of the proposed protocol will be evaluated by coupling mechanical testing readouts to 3D morphometrics and overall biomechanics of the joint. In addition, this technique emphasizes the need to equilibrate experimental conditions, specifically reactionary loads, prior to acquiring tomograms of fibrous joints. It should be noted that the proposed protocol is limited to testing specimens under ex vivo conditions, and that use of contrast agents to visualize soft tissue mechanical response could lead to erroneous conclusions about tissue and organ-level biomechanics.
Bioengineering, Issue 85, biomechanics, bone-periodontal ligament-tooth complex, concentric loads, eccentric loads, contrast agent
Shrinkage of Dental Composite in Simulated Cavity Measured with Digital Image Correlation
Authors: Jianying Li, Preetanjali Thakur, Alex S. L. Fok.
Institutions: University of Minnesota.
Polymerization shrinkage of dental resin composites can lead to restoration debonding or cracked tooth tissues in composite-restored teeth. In order to understand where and how shrinkage strain and stress develop in such restored teeth, Digital Image Correlation (DIC) was used to provide a comprehensive view of the displacement and strain distributions within model restorations that had undergone polymerization shrinkage. Specimens with model cavities were made of cylindrical glass rods with both diameter and length being 10 mm. The dimensions of the mesial-occlusal-distal (MOD) cavity prepared in each specimen measured 3 mm and 2 mm in width and depth, respectively. After filling the cavity with resin composite, the surface under observation was sprayed first with a thin layer of white paint and then with fine black charcoal powder to create high-contrast speckles. Pictures of that surface were then taken before curing and 5 min after. Finally, the two pictures were correlated using DIC software to calculate the displacement and strain distributions. The resin composite shrank vertically towards the bottom of the cavity, with the top center portion of the restoration having the largest downward displacement. At the same time, it shrank horizontally towards its vertical midline. Shrinkage of the composite stretched the material in the vicinity of the “tooth-restoration” interface, resulting in cuspal deflections and high tensile strains around the restoration. Material close to the cavity walls or floor had direct strains mostly in the directions perpendicular to the interfaces. Summation of the two direct strain components showed a relatively uniform distribution around the restoration, and its magnitude was approximately equal to the volumetric shrinkage strain of the material.
Medicine, Issue 89, image processing, computer-assisted, polymer matrix composites, testing of materials (composite materials), dental composite restoration, polymerization shrinkage, digital image correlation, full-field strain measurement, interfacial debonding
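In the DIC study above, the direct strains are spatial derivatives of the measured displacement fields, and the abstract notes that the sum of the two direct strain components approximates the shrinkage strain. A minimal sketch of that post-processing step is given below; the displacement maps, dimensions, and shrinkage value are synthetic assumptions for illustration only.

```python
import numpy as np

def direct_strains(u_x, u_y, pixel_size_mm):
    """In-plane direct strains from DIC displacement fields (displacements in mm).
    Returns (e_xx, e_yy) as dimensionless strain maps."""
    # np.gradient differentiates along axis 0 (rows, y) then axis 1 (columns, x)
    _, du_x_dx = np.gradient(u_x, pixel_size_mm)
    du_y_dy, _ = np.gradient(u_y, pixel_size_mm)
    return du_x_dx, du_y_dy

# Hypothetical displacement fields for a 3 mm x 2 mm restoration imaged at
# 0.01 mm/pixel: uniform 1% linear shrinkage toward the vertical midline (x = 0)
# and toward the cavity floor.
pixel_size = 0.01                                    # mm per pixel (assumed)
ny, nx = 200, 300                                    # 2 mm x 3 mm region
x = (np.arange(nx) - nx / 2) * pixel_size            # mm, zero at the midline
y = np.arange(ny) * pixel_size                       # mm, increasing with depth
X, Y = np.meshgrid(x, y)
shrink = -0.01                                       # assumed linear shrinkage strain
u_x, u_y = shrink * X, shrink * Y                    # displacement maps (mm)

e_xx, e_yy = direct_strains(u_x, u_y, pixel_size)
print("mean e_xx + e_yy =", float(np.mean(e_xx + e_yy)))   # ~ -0.02 (in-plane sum)
```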
Confocal Imaging of Confined Quiescent and Flowing Colloid-polymer Mixtures
Authors: Rahul Pandey, Melissa Spannuth, Jacinta C. Conrad.
Institutions: University of Houston.
The behavior of confined colloidal suspensions with attractive interparticle interactions is critical to the rational design of materials for directed assembly1-3, drug delivery4, improved hydrocarbon recovery5-7, and flowable electrodes for energy storage8. Suspensions containing fluorescent colloids and non-adsorbing polymers are appealing model systems, as the ratio of the polymer radius of gyration to the particle radius and concentration of polymer control the range and strength of the interparticle attraction, respectively. By tuning the polymer properties and the volume fraction of the colloids, colloid fluids, fluids of clusters, gels, crystals, and glasses can be obtained9. Confocal microscopy, a variant of fluorescence microscopy, allows an optically transparent and fluorescent sample to be imaged with high spatial and temporal resolution in three dimensions. In this technique, a small pinhole or slit blocks the emitted fluorescent light from regions of the sample that are outside the focal volume of the microscope optical system. As a result, only a thin section of the sample in the focal plane is imaged. This technique is particularly well suited to probe the structure and dynamics in dense colloidal suspensions at the single-particle scale: the particles are large enough to be resolved using visible light and diffuse slowly enough to be captured at typical scan speeds of commercial confocal systems10. Improvements in scan speeds and analysis algorithms have also enabled quantitative confocal imaging of flowing suspensions11-16,37. In this paper, we demonstrate confocal microscopy experiments to probe the confined phase behavior and flow properties of colloid-polymer mixtures. We first prepare colloid-polymer mixtures that are density- and refractive-index matched. Next, we report a standard protocol for imaging quiescent dense colloid-polymer mixtures under varying confinement in thin wedge-shaped cells. Finally, we demonstrate a protocol for imaging colloid-polymer mixtures during microchannel flow.
Chemistry, Issue 87, confocal microscopy, particle tracking, colloids, suspensions, confinement, gelation, microfluidics, image correlation, dynamics, suspension flow
Quantification of Global Diastolic Function by Kinematic Modeling-based Analysis of Transmitral Flow via the Parametrized Diastolic Filling Formalism
Authors: Sina Mossahebi, Simeng Zhu, Howard Chen, Leonid Shmuylovich, Erina Ghosh, Sándor J. Kovács.
Institutions: Washington University in St. Louis, Washington University in St. Louis, Washington University in St. Louis, Washington University in St. Louis, Washington University in St. Louis.
Quantitative cardiac function assessment remains a challenge for physiologists and clinicians. Although historically invasive methods have comprised the only means available, the development of noninvasive imaging modalities (echocardiography, MRI, CT) having high temporal and spatial resolution provides a new window for quantitative diastolic function assessment. Echocardiography is the agreed upon standard for diastolic function assessment, but indexes in current clinical use merely utilize selected features of chamber dimension (M-mode) or blood/tissue motion (Doppler) waveforms without incorporating the physiologic causal determinants of the motion itself. The recognition that all left ventricles (LV) initiate filling by serving as mechanical suction pumps allows global diastolic function to be assessed based on laws of motion that apply to all chambers. What differentiates one heart from another are the parameters of the equation of motion that governs filling. Accordingly, development of the Parametrized Diastolic Filling (PDF) formalism has shown that the entire range of clinically observed early transmitral flow (Doppler E-wave) patterns is extremely well fit by the laws of damped oscillatory motion. This permits analysis of individual E-waves in accordance with a causal mechanism (recoil-initiated suction) that yields three (numerically) unique lumped parameters whose physiologic analogues are chamber stiffness (k), viscoelasticity/relaxation (c), and load (xo). The recording of transmitral flow (Doppler E-waves) is standard practice in clinical cardiology and, therefore, the echocardiographic recording method is only briefly reviewed. Our focus is on determination of the PDF parameters from routinely recorded E-wave data. As the highlighted results indicate, once the PDF parameters have been obtained from a suitable number of load-varying E-waves, the investigator is free to use the parameters or construct indexes from the parameters (such as stored energy ½kxo², maximum A-V pressure gradient kxo, load-independent index of diastolic function, etc.) and select the aspect of physiology or pathophysiology to be quantified.
Bioengineering, Issue 91, cardiovascular physiology, ventricular mechanics, diastolic function, mathematical modeling, Doppler echocardiography, hemodynamics, biomechanics
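The PDF formalism above models each Doppler E-wave as the velocity of a damped oscillator released from rest, parametrized by stiffness k, damping c, and load xo. The sketch below shows the core fitting step applied to a synthetic E-wave contour; it is a hedged illustration with made-up, physiologic-range parameter values, not the authors' implementation or patient data.

```python
import numpy as np
from scipy.optimize import curve_fit

def pdf_ewave_velocity(t, k, c, x0):
    """Velocity of an underdamped oscillator (per unit mass) released from rest
    at displacement x0: the PDF picture of the transmitral E-wave contour.
    k: stiffness (1/s^2), c: viscoelastic damping (1/s), x0: load (cm)."""
    omega = np.sqrt(np.maximum(k - c ** 2 / 4.0, 1e-6))   # underdamped angular frequency
    return x0 * (k / omega) * np.exp(-c * t / 2.0) * np.sin(omega * t)

# Synthetic "measured" E-wave contour (hypothetical values)
t = np.linspace(0.0, 0.25, 120)                            # s
true_k, true_c, true_x0 = 200.0, 18.0, 9.0                 # 1/s^2, 1/s, cm
rng = np.random.default_rng(1)
v_measured = (pdf_ewave_velocity(t, true_k, true_c, true_x0)
              + 2.0 * rng.standard_normal(t.size))         # cm/s, with noise

# Least-squares fit of the three PDF parameters to the contour
(k_fit, c_fit, x0_fit), _ = curve_fit(pdf_ewave_velocity, t, v_measured,
                                      p0=(150.0, 10.0, 8.0))
print(f"k = {k_fit:.0f} 1/s^2, c = {c_fit:.1f} 1/s, x0 = {x0_fit:.2f} cm")
print(f"stored energy 1/2*k*x0^2 = {0.5 * k_fit * x0_fit ** 2:.0f}")
print(f"peak driving-force proxy k*x0 = {k_fit * x0_fit:.0f}")
```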
A Mouse Model for Pathogen-induced Chronic Inflammation at Local and Systemic Sites
Authors: George Papadopoulos, Carolyn D. Kramer, Connie S. Slocum, Ellen O. Weinberg, Ning Hua, Cynthia V. Gudino, James A. Hamilton, Caroline A. Genco.
Institutions: Boston University School of Medicine, Boston University School of Medicine.
Chronic inflammation is a major driver of pathological tissue damage and a unifying characteristic of many chronic diseases in humans including neoplastic, autoimmune, and chronic inflammatory diseases. Emerging evidence implicates pathogen-induced chronic inflammation in the development and progression of chronic diseases with a wide variety of clinical manifestations. Due to the complex and multifactorial etiology of chronic disease, designing experiments for proof of causality and the establishment of mechanistic links is nearly impossible in humans. An advantage of using animal models is that both genetic and environmental factors that may influence the course of a particular disease can be controlled. Thus, designing relevant animal models of infection represents a key step in identifying host and pathogen specific mechanisms that contribute to chronic inflammation. Here we describe a mouse model of pathogen-induced chronic inflammation at local and systemic sites following infection with the oral pathogen Porphyromonas gingivalis, a bacterium closely associated with human periodontal disease. Oral infection of specific-pathogen free mice induces a local inflammatory response resulting in destruction of tooth supporting alveolar bone, a hallmark of periodontal disease. In an established mouse model of atherosclerosis, infection with P. gingivalis accelerates inflammatory plaque deposition within the aortic sinus and innominate artery, accompanied by activation of the vascular endothelium, an increased immune cell infiltrate, and elevated expression of inflammatory mediators within lesions. We detail methodologies for the assessment of inflammation at local and systemic sites. The use of transgenic mice and defined bacterial mutants makes this model particularly suitable for identifying both host and microbial factors involved in the initiation, progression, and outcome of disease. Additionally, the model can be used to screen for novel therapeutic strategies, including vaccination and pharmacological intervention.
Immunology, Issue 90, Pathogen-Induced Chronic Inflammation; Porphyromonas gingivalis; Oral Bone Loss; Periodontal Disease; Atherosclerosis; Chronic Inflammation; Host-Pathogen Interaction; microCT; MRI
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Authors: Sara Tremblay, Vincent Beaulé, Sébastien Proulx, Louis-Philippe Lafleur, Julien Doyon, Małgorzata Marjańska, Hugo Théoret.
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood 33. To help improve this understanding, proton magnetic resonance spectroscopy (1H-MRS) can be used as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner 41. In fact, a recent study demonstrated that 1H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration 34. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR compatible stimulator) with 1H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunctions after stroke, which consists of bilateral stimulation of primary motor cortices 27,30,31. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues change dramatically over development3. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis.
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
From Fast Fluorescence Imaging to Molecular Diffusion Law on Live Cell Membranes in a Commercial Microscope
Authors: Carmine Di Rienzo, Enrico Gratton, Fabio Beltram, Francesco Cardarelli.
Institutions: Scuola Normale Superiore, Instituto Italiano di Tecnologia, University of California, Irvine.
It has become increasingly evident that the spatial distribution and the motion of membrane components like lipids and proteins are key factors in the regulation of many cellular functions. However, due to the fast dynamics and the tiny structures involved, a very high spatio-temporal resolution is required to capture the real behavior of molecules. Here we present the experimental protocol for studying the dynamics of fluorescently-labeled plasma-membrane proteins and lipids in live cells with high spatiotemporal resolution. Notably, this approach doesn’t need to track each molecule, but it calculates population behavior using all molecules in a given region of the membrane. The starting point is fast imaging of a given region of the membrane. Afterwards, a complete spatio-temporal autocorrelation function is calculated by correlating acquired images at increasing time delays, for example every 2, 3, ..., n repetitions. It is possible to demonstrate that the width of the peak of the spatial autocorrelation function increases with increasing time delay as a function of particle movement due to diffusion. Therefore, fitting of the series of autocorrelation functions enables extraction of the actual protein mean square displacement from imaging (iMSD), here presented in the form of apparent diffusivity vs average displacement. This yields a quantitative view of the average dynamics of single molecules with nanometer accuracy. By using a GFP-tagged variant of the Transferrin Receptor (TfR) and an ATTO488 labeled 1-palmitoyl-2-hydroxy-sn-glycero-3-phosphoethanolamine (PPE) it is possible to observe the spatiotemporal regulation of protein and lipid diffusion on µm-sized membrane regions in the micro-to-milli-second time range.
Bioengineering, Issue 92, fluorescence, protein dynamics, lipid dynamics, membrane heterogeneity, transient confinement, single molecule, GFP
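The iMSD approach above computes a spatiotemporal image correlation at increasing lag times and tracks how the width of the correlation peak grows with lag. The sketch below reproduces that pipeline on a synthetic stack of diffusing point emitters; every number (pixel size, frame time, diffusivity, PSF width, fitting window) is an illustrative assumption, and the fitting details differ from the published analysis.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.optimize import curve_fit

def mean_correlation(stack, lag, step=4):
    """Average spatial cross-correlation (via FFT) of frame pairs separated by 'lag'."""
    acc = np.zeros(stack.shape[1:])
    pairs = range(0, stack.shape[0] - lag, step)
    for i in pairs:
        a = stack[i] - stack[i].mean()
        b = stack[i + lag] - stack[i + lag].mean()
        acc += np.real(np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))))
    return np.fft.fftshift(acc / len(pairs))

def gauss2d(coords, amp, var, offset):
    """Isotropic 2D Gaussian used to fit the central correlation peak."""
    x, y = coords
    return amp * np.exp(-(x ** 2 + y ** 2) / (2.0 * var)) + offset

# Synthetic stack: point emitters diffusing with D = 0.5 um^2/s (all values assumed)
rng = np.random.default_rng(6)
n_frames, n_part, size = 200, 400, 128
pixel_um, dt_s, D_true, psf_px = 0.1, 0.02, 0.5, 2.0
pos = rng.uniform(0, size, (n_part, 2))                      # particle positions (pixels)
stack = np.zeros((n_frames, size, size))
for t in range(n_frames):
    pos += rng.normal(0.0, np.sqrt(2 * D_true * dt_s) / pixel_um, pos.shape)
    idx = np.round(pos).astype(int) % size                   # periodic wrap
    frame = np.zeros((size, size))
    np.add.at(frame, (idx[:, 1], idx[:, 0]), 1.0)
    stack[t] = gaussian_filter(frame, psf_px)                # blur with a model PSF

# iMSD: correlation-peak variance (um^2) vs lag time, then apparent D from the slope
half = 20
yw, xw = np.indices((2 * half + 1, 2 * half + 1)) - half
lags = np.arange(1, 9)
imsd = []
for lag in lags:
    corr = mean_correlation(stack, lag)
    win = corr[size // 2 - half:size // 2 + half + 1,
               size // 2 - half:size // 2 + half + 1]
    (amp, var_px, off), _ = curve_fit(gauss2d, (xw.ravel(), yw.ravel()),
                                      win.ravel(), p0=(win.max(), 10.0, 0.0))
    imsd.append(var_px * pixel_um ** 2)                      # per-axis variance in um^2

slope = np.polyfit(lags * dt_s, imsd, 1)[0]                  # d(sigma^2)/dt = 2*D per axis
print(f"apparent D ~ {slope / 2:.2f} um^2/s (simulated {D_true})")
```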
A Guide to Modern Quantitative Fluorescent Western Blotting with Troubleshooting Strategies
Authors: Samantha L. Eaton, Maica Llavero Hurtado, Karla J. Oldknow, Laura C. Graham, Thomas W. Marchant, Thomas H. Gillingwater, Colin Farquharson, Thomas M. Wishart.
Institutions: University of Edinburgh, University of Edinburgh, University of Edinburgh, University of Edinburgh.
The late 1970s saw the first publicly reported use of the western blot, a technique for assessing the presence and relative abundance of specific proteins within complex biological samples. Since then, western blotting methodology has become a common component of the molecular biologist's experimental repertoire. A cursory search of PubMed using the term “western blot” suggests that in excess of two hundred and twenty thousand published manuscripts have made use of this technique by the year 2014. Importantly, the last ten years have seen technical imaging advances coupled with the development of sensitive fluorescent labels which have improved sensitivity and yielded even greater ranges of linear detection. The result is a now truly Quantifiable Fluorescence-based Western Blot (QFWB) that allows biologists to carry out comparative expression analysis with greater sensitivity and accuracy than ever before. Many “optimized” western blotting methodologies exist and are utilized in different laboratories. These often prove difficult to implement due to the requirement of subtle but undocumented procedural amendments. This protocol provides a comprehensive description of an established and robust QFWB method, complete with troubleshooting strategies.
Basic Protocols, Issue 93, western blotting, fluorescent, LI-COR, protein, quantitative analysis, loading control
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Test Samples for Optimizing STORM Super-Resolution Microscopy
Authors: Daniel J. Metcalf, Rebecca Edwards, Neelam Kumarswami, Alex E. Knight.
Institutions: National Physical Laboratory.
STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy techniques. However, as the image is acquired in a very different way than normal, by building up an image molecule-by-molecule, there are some significant challenges for users in trying to optimize their image acquisition. In order to aid this process and gain more insight into how STORM works, we present the preparation of 3 test samples and the methodology of acquiring and processing STORM super-resolution images with typical resolutions of 30-50 nm. By combining the test samples with the use of the freely available rainSTORM processing software it is possible to obtain a great deal of information about image quality and resolution. Using these metrics it is then possible to optimize the imaging procedure from the optics, to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of some common problems that result in poor image quality, such as lateral drift, where the sample moves during image acquisition, and density-related problems resulting in the 'mislocalization' phenomenon.
Molecular Biology, Issue 79, Genetics, Bioengineering, Biomedical Engineering, Biophysics, Basic Protocols, HeLa Cells, Actin Cytoskeleton, Coated Vesicles, Receptor, Epidermal Growth Factor, Actins, Fluorescence, Endocytosis, Microscopy, STORM, super-resolution microscopy, nanoscopy, cell biology, fluorescence microscopy, test samples, resolution, actin filaments, fiducial markers, epidermal growth factor, cell, imaging
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Magnetic Resonance Derived Myocardial Strain Assessment Using Feature Tracking
Authors: Kan N. Hor, Rolf Baumann, Gianni Pedrizzetti, Gianni Tonti, William M. Gottliebson, Michael Taylor, D. Woodrow Benson, Wojciech Mazur.
Institutions: Cincinnati Children's Hospital Medical Center (CCHMC), Imaging Systems GmbH, Advanced Medical Imaging Development SRL, The Christ Hospital.
Purpose: An accurate and practical method to measure parameters like strain in myocardial tissue is of great clinical value, since it has been shown that strain is a more sensitive and earlier marker of contractile dysfunction than the frequently used ejection fraction (EF). Current technologies for CMR are time-consuming and difficult to implement in clinical practice. Feature tracking is a technology that can bring more automation and robustness to the quantitative analysis of medical images, with less time consumption than comparable methods. Methods: An automatic or manual input in a single phase serves as an initialization from which the system starts to track the displacement of individual patterns representing anatomical structures over time. A distinctive feature of this method is that the images do not need to be manipulated in any way beforehand, e.g. by tagging of CMR images. Results: The method is very well suited to tracking muscular tissue, thereby allowing quantitative analysis of the myocardium and also of blood flow. Conclusions: This new method offers a robust and time-saving procedure for quantifying myocardial tissue and blood with displacement, velocity and deformation parameters on regular CMR imaging sequences. It can therefore be implemented in clinical practice.
Medicine, Issue 48, feature tracking, strain, displacement, CMR
Magnetic Resonance Imaging Quantification of Pulmonary Perfusion using Calibrated Arterial Spin Labeling
Authors: Tatsuya J. Arai, G. Kim Prisk, Sebastiaan Holverda, Rui Carlos Sá, Rebecca J. Theilmann, A. Cortney Henderson, Matthew V. Cronin, Richard B. Buxton, Susan R. Hopkins.
Institutions: University of California San Diego - UCSD, University of California San Diego - UCSD, University of California San Diego - UCSD.
This protocol demonstrates an MR imaging method to measure the spatial distribution of pulmonary blood flow in healthy subjects during normoxia (inspired O2 fraction (FIO2) = 0.21), hypoxia (FIO2 = 0.125), and hyperoxia (FIO2 = 1.00). In addition, the physiological responses of the subject are monitored in the MR scan environment. MR images were obtained on a 1.5 T GE MRI scanner during a breath hold from a sagittal slice in the right lung at functional residual capacity. An arterial spin labeling sequence (ASL-FAIRER) was used to measure the spatial distribution of pulmonary blood flow 1,2 and a multi-echo fast gradient echo (mGRE) sequence 3 was used to quantify the regional proton (i.e. H2O) density, allowing the quantification of density-normalized perfusion for each voxel (milliliters blood per minute per gram lung tissue). With a pneumatic switching valve and facemask equipped with a 2-way non-rebreathing valve, different oxygen concentrations were introduced to the subject in the MR scanner through the inspired gas tubing. A metabolic cart collected expiratory gas via expiratory tubing. Mixed expiratory O2 and CO2 concentrations, oxygen consumption, carbon dioxide production, respiratory exchange ratio, respiratory frequency and tidal volume were measured. Heart rate and oxygen saturation were monitored using pulse-oximetry. Data obtained from a normal subject showed that, as expected, heart rate was higher in hypoxia (60 bpm) than during normoxia (51) or hyperoxia (50) and the arterial oxygen saturation (SpO2) was reduced during hypoxia to 86%. Mean ventilation was 8.31 L/min BTPS during hypoxia, 7.04 L/min during normoxia, and 6.64 L/min during hyperoxia. Tidal volume was 0.76 L during hypoxia, 0.69 L during normoxia, and 0.67 L during hyperoxia. Representative quantified ASL data showed that the mean density-normalized perfusion was 8.86 ml/min/g during hypoxia, 8.26 ml/min/g during normoxia and 8.46 ml/min/g during hyperoxia. In this subject, the relative dispersion4, an index of global heterogeneity, was increased in hypoxia (1.07 during hypoxia, 0.85 during normoxia, and 0.87 during hyperoxia) while the fractal dimension (Ds), another index of heterogeneity reflecting vascular branching structure, was unchanged (1.24 during hypoxia, 1.26 during normoxia, and 1.26 during hyperoxia). Overview. This protocol will demonstrate the acquisition of data to measure the distribution of pulmonary perfusion noninvasively under conditions of normoxia, hypoxia, and hyperoxia using a magnetic resonance imaging technique known as arterial spin labeling (ASL). Rationale: Measurement of pulmonary blood flow and lung proton density using MR techniques offers high-spatial-resolution images that can be quantified, as well as the ability to perform repeated measurements under several different physiological conditions. In human studies, PET, SPECT, and CT are commonly used as the alternative techniques. However, these techniques involve exposure to ionizing radiation, and thus are not suitable for repeated measurements in human subjects.
Medicine, Issue 51, arterial spin labeling, lung proton density, functional lung imaging, hypoxic pulmonary vasoconstriction, oxygen consumption, ventilation, magnetic resonance imaging
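In the perfusion protocol above, density-normalized perfusion is the ASL-measured flow per voxel divided by the mGRE-derived tissue density, and global heterogeneity is summarized by the relative dispersion (standard deviation divided by the mean). A minimal sketch of those two calculations, using entirely hypothetical voxel maps, follows.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical single-slice maps within a lung mask (values illustrative only):
# ASL perfusion in ml blood / min / cm^3 of voxel, and proton-density-derived
# tissue density in g / cm^3 of voxel from the mGRE acquisition.
mask = np.ones((64, 64), dtype=bool)
asl_flow = np.clip(rng.normal(1.6, 0.5, (64, 64)), 0.1, None)    # ml/min/cm^3
density = np.clip(rng.normal(0.20, 0.04, (64, 64)), 0.05, None)  # g/cm^3

# Density-normalized perfusion (ml blood / min / g tissue), voxel by voxel
dnp = asl_flow[mask] / density[mask]

mean_dnp = dnp.mean()
relative_dispersion = dnp.std() / mean_dnp    # global heterogeneity index (SD/mean)

print(f"mean density-normalized perfusion: {mean_dnp:.2f} ml/min/g")
print(f"relative dispersion (RD): {relative_dispersion:.2f}")
```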
ReAsH/FlAsH Labeling and Image Analysis of Tetracysteine Sensor Proteins in Cells
Authors: Sevgi Irtegun, Yasmin M. Ramdzan, Terrence D. Mulhern, Danny M. Hatters.
Institutions: Bio21 Molecular Science and Biotechnology Institute.
Fluorescent proteins and dyes are essential tools for the study of protein trafficking, localization and function in cells. While fluorescent proteins such as green fluorescence protein (GFP) have been extensively used as fusion partners to proteins to track the properties of a protein of interest1, recent developments with smaller tags enable new functionalities of proteins to be examined in cells such as conformational change and protein-association 2, 3. One small tag system involves a tetracysteine motif (CCXXCC) genetically inserted into a target protein, which binds to biarsenical dyes, ReAsH (red fluorescent) and FlAsH (green fluorescent), with high specificity even in live cells 2. The TC/biarsenical dye system offers far less steric constraints to the host protein than fluorescent proteins which has enabled several new approaches to measure conformational change and protein-protein interactions 4-7. We recently developed a novel application of TC tags as sensors of oligomerization in cells expressing mutant huntingtin, which when mutated aggregates in neurons in Huntington disease 7. Huntingtin was tagged with two fluorescent dyes, one a fluorescent protein to track protein location, and the second a TC tag which only binds biarsenical dyes in monomers. Hence, changes in colocalization between protein and biarsenical dye reactivity enabled submicroscopic oligomer content to be spatially mapped within cells. Here, we describe how to label TC-tagged proteins fused to a fluorescent protein (Cherry, GFP or CFP) with FlAsH or ReAsH in live mammalian cells and how to quantify the two color fluorescence (Cherry/FlAsH, CFP/FlAsH or GFP/ReAsH combinations).
Cell Biology, Issue 54, tetracysteine, TC, ReAsH, FlAsH, biarsenical dyes, fluorescence, imaging, confocal microscopy, ImageJ, GFP
Magnetic Resonance Elastography Methodology for the Evaluation of Tissue Engineered Construct Growth
Authors: Evan T. Curtis, Simeng Zhang, Vahid Khalilzad-Sharghi, Thomas Boulet, Shadi F. Othman.
Institutions: University of Nebraska-Lincoln, University of Nebraska-Lincoln.
Traditional mechanical testing often results in the destruction of the sample, and in the case of long term tissue engineered construct studies, the use of destructive assessment is not acceptable. A proposed alternative is the use of an imaging process called magnetic resonance elastography. Elastography is a nondestructive method for determining the engineered outcome by measuring local mechanical property values (i.e., complex shear modulus), which are essential markers for identifying the structure and functionality of a tissue. As a noninvasive means for evaluation, the monitoring of engineered constructs with imaging modalities such as magnetic resonance imaging (MRI) has seen increasing interest in the past decade1. For example, the magnetic resonance (MR) techniques of diffusion and relaxometry have been able to characterize the changes in chemical and physical properties during engineered tissue development2. The method proposed in the following protocol uses microscopic magnetic resonance elastography (μMRE) as a noninvasive MR based technique for measuring the mechanical properties of small soft tissues3. MRE is achieved by coupling a sonic mechanical actuator with the tissue of interest and recording the shear wave propagation with an MR scanner4. Recently, μMRE has been applied in tissue engineering to acquire essential growth information that is traditionally measured using destructive mechanical macroscopic techniques5. In the following procedure, elastography is achieved through the imaging of engineered constructs with a modified Hahn spin-echo sequence coupled with a mechanical actuator. As shown in Figure 1, the modified sequence synchronizes image acquisition with the transmission of external shear waves; subsequently, the motion is sensitized through the use of oscillating bipolar pairs. Following collection of images with positive and negative motion sensitization, complex division of the data produce a shear wave image. Then, the image is assessed using an inversion algorithm to generate a shear stiffness map6. The resulting measurements at each voxel have been shown to strongly correlate (R2>0.9914) with data collected using dynamic mechanical analysis7. In this study, elastography is integrated into the tissue development process for monitoring human mesenchymal stem cell (hMSC) differentiation into adipogenic and osteogenic constructs as shown in Figure 2.
Bioengineering, Issue 60, mesenchymal stem cells, tissue engineering (TE), regenerative medicine, adipose TE, magnetic resonance elastography (MRE), biomechanics, elasticity
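The MRE protocol above reconstructs a shear-stiffness map by applying an inversion algorithm to the measured shear-wave displacement image. As a generic stand-in for that step (not the specific inversion cited by the authors), the sketch below applies a direct algebraic Helmholtz inversion, mu ≈ -ρ·ω²·u / ∇²u, to a synthetic plane shear wave; material properties and geometry are assumptions.

```python
import numpy as np

def helmholtz_inversion(wave, dx_m, freq_hz, rho=1000.0):
    """Estimate shear stiffness (Pa) from a single-frequency shear-wave image
    via direct algebraic Helmholtz inversion: mu ~ -rho * omega^2 * u / laplacian(u).
    A generic MRE inversion; the protocol's algorithm may differ."""
    omega = 2.0 * np.pi * freq_hz
    lap = (np.gradient(np.gradient(wave, dx_m, axis=0), dx_m, axis=0) +
           np.gradient(np.gradient(wave, dx_m, axis=1), dx_m, axis=1))
    return -rho * omega ** 2 * wave / (lap + 1e-12)

# Synthetic test: plane shear wave at 500 Hz in a medium with mu = 5 kPa (assumed)
mu_true, rho, freq = 5000.0, 1000.0, 500.0           # Pa, kg/m^3, Hz
wavelength = np.sqrt(mu_true / rho) / freq            # m (lambda = c/f, c = sqrt(mu/rho))
dx = 100e-6                                           # 100 um voxels
x = np.arange(128) * dx
wave = np.sin(2 * np.pi * x / wavelength)[None, :] * np.ones((128, 1))

mu_map = helmholtz_inversion(wave, dx, freq, rho)
valid = np.abs(wave) > 0.2            # avoid voxels near zero crossings (unstable division)
print(f"median recovered stiffness: {np.median(mu_map[valid]):.0f} Pa (true {mu_true:.0f} Pa)")
```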
Echo Particle Image Velocimetry
Authors: Nicholas DeMarchi, Christopher White.
Institutions: University of New Hampshire.
The transport of mass, momentum, and energy in fluid flows is ultimately determined by spatiotemporal distributions of the fluid velocity field.1 Consequently, a prerequisite for understanding, predicting, and controlling fluid flows is the capability to measure the velocity field with adequate spatial and temporal resolution.2 For velocity measurements in optically opaque fluids or through optically opaque geometries, echo particle image velocimetry (EPIV) is an attractive diagnostic technique to generate "instantaneous" two-dimensional fields of velocity.3,4,5,6 In this paper, the operating protocol for an EPIV system built by integrating a commercial medical ultrasound machine7 with a PC running commercial particle image velocimetry (PIV) software8 is described, and validation measurements in Hagen-Poiseuille (i.e., laminar pipe) flow are reported. For the EPIV measurements, a phased array probe connected to the medical ultrasound machine is used to generate a two-dimensional ultrasound image by pulsing the piezoelectric probe elements at different times. Each probe element transmits an ultrasound pulse into the fluid, and tracer particles in the fluid (either naturally occurring or seeded) reflect ultrasound echoes back to the probe where they are recorded. The amplitude of the reflected ultrasound waves and their time delay relative to transmission are used to create what is known as B-mode (brightness mode) two-dimensional ultrasound images. Specifically, the time delay is used to determine the position of the scatterer in the fluid and the amplitude is used to assign intensity to the scatterer. The time required to obtain a single B-mode image, δt, is determined by the time it takes to pulse all the elements of the phased array probe. For acquiring multiple B-mode images, the frame rate of the system in frames per second is fps = 1/δt. (See 9 for a review of ultrasound imaging.) For a typical EPIV experiment, the frame rate is between 20-60 fps, depending on flow conditions, and 100-1000 B-mode images of the spatial distribution of the tracer particles in the flow are acquired. Once acquired, the B-mode ultrasound images are transmitted via an ethernet connection to the PC running the PIV commercial software. Using the PIV software, tracer particle displacement fields, D(x,y)[pixels], (where x and y denote horizontal and vertical spatial position in the ultrasound image, respectively) are acquired by applying cross-correlation algorithms to successive ultrasound B-mode images.10 The velocity fields, u(x,y)[m/s], are determined from the displacement fields, knowing the time step between image pairs, ΔT[s], and the image magnification, M[meter/pixel], i.e., u(x,y) = MD(x,y)/ΔT. The time step between images is ΔT = 1/fps + D(x,y)/B, where B[pixels/s] is the rate at which the ultrasound beam sweeps across the image width. In the present study, M = 77[μm/pixel], fps = 49.5[1/s], and B = 25,047[pixels/s]. Once acquired, the velocity fields can be analyzed to compute flow quantities of interest.
Mechanical Engineering, Issue 70, Physics, Engineering, Physical Sciences, Ultrasound, cross correlation, velocimetry, opaque fluids, particle, flow, fluid, EPIV
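The velocity conversion in the EPIV abstract above is given explicitly: u(x,y) = M·D(x,y)/ΔT with ΔT = 1/fps + D(x,y)/B, and M = 77 μm/pixel, fps = 49.5 s⁻¹, B = 25,047 pixels/s. A short sketch of that conversion, applied to a hypothetical parabolic (laminar pipe) displacement profile, is shown below.

```python
import numpy as np

# Values quoted in the abstract above
M = 77e-6          # image magnification [m/pixel]
fps = 49.5         # B-mode frame rate [1/s]
B = 25047.0        # sweep rate of the ultrasound beam across the image [pixels/s]

def epiv_velocity(displacement_px):
    """Convert a PIV displacement field D(x,y) [pixels] between successive
    B-mode images into a velocity field u(x,y) [m/s], accounting for the
    finite beam sweep time:  dT = 1/fps + D/B,  u = M * D / dT."""
    displacement_px = np.asarray(displacement_px, dtype=float)
    dT = 1.0 / fps + displacement_px / B
    return M * displacement_px / dT

# Hypothetical example: parabolic displacement profile across the pipe,
# peaking at 8 pixels per frame pair on the centerline.
r = np.linspace(-1.0, 1.0, 9)          # normalized radial position
D_px = 8.0 * (1.0 - r ** 2)            # pixels
u = epiv_velocity(D_px)
print(np.round(u, 4))                  # m/s, ~0.03 m/s on the centerline
```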
Reduction in Left Ventricular Wall Stress and Improvement in Function in Failing Hearts using Algisyl-LVR
Authors: Lik Chuan Lee, Zhang Zhihong, Andrew Hinson, Julius M. Guccione.
Institutions: UCSF/VA Medical Center, LoneStar Heart, Inc..
Injection of Algisyl-LVR, a treatment under clinical development, is intended to treat patients with dilated cardiomyopathy. This treatment was recently used for the first time in patients who had symptomatic heart failure. In all patients, cardiac function of the left ventricle (LV) improved significantly, as manifested by consistent reduction of the LV volume and wall stress. Here we describe this novel treatment procedure and the methods used to quantify its effects on LV wall stress and function. Algisyl-LVR is a biopolymer gel consisting of Na+-Alginate and Ca2+-Alginate. The treatment procedure was carried out by mixing these two components and then combining them into one syringe for intramyocardial injections. This mixture was injected at 10 to 19 locations mid-way between the base and apex of the LV free wall in patients. Magnetic resonance imaging (MRI), together with mathematical modeling, was used to quantify the effects of this treatment in patients before treatment and at various time points during recovery. The epicardial and endocardial surfaces were first digitized from the MR images to reconstruct the LV geometry at end-systole and at end-diastole. Left ventricular cavity volumes were then measured from these reconstructed surfaces. Mathematical models of the LV were created from these MRI-reconstructed surfaces to calculate regional myofiber stress. Each LV model was constructed so that 1) it deforms according to a previously validated stress-strain relationship of the myocardium, and 2) the predicted LV cavity volume from these models matches the corresponding MRI-measured volume at end-diastole and end-systole. Diastolic filling was simulated by loading the LV endocardial surface with a prescribed end-diastolic pressure. Systolic contraction was simulated by concurrently loading the endocardial surface with a prescribed end-systolic pressure and adding active contraction in the myofiber direction. Regional myofiber stress at end-diastole and end-systole was computed from the deformed LV based on the stress-strain relationship.
Medicine, Issue 74, Biomedical Engineering, Anatomy, Physiology, Biophysics, Molecular Biology, Surgery, Cardiology, Cardiovascular Diseases, bioinjection, ventricular wall stress, mathematical model, heart failure, cardiac function, myocardium, left ventricle, LV, MRI, imaging, clinical techniques
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary , University of Calgary .
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on the average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
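The detection method above begins by analyzing oriented tissue structures with Gabor filters. The sketch below builds a small bank of real Gabor kernels and records, per pixel, the strongest response magnitude and the orientation that produced it; the filter parameters and the striped test image are illustrative assumptions, not the authors' settings.

```python
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(sigma, wavelength, theta, size=None):
    """Real (cosine) Gabor kernel oriented at angle theta (radians)."""
    if size is None:
        size = int(4 * sigma) | 1                        # odd support
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)           # axis of oscillation
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(xr ** 2 + yr ** 2) / (2 * sigma ** 2)) * np.cos(2 * np.pi * xr / wavelength)
    return g - g.mean()                                   # remove DC response

def oriented_response(image, n_angles=12, sigma=4.0, wavelength=8.0):
    """Per-pixel maximum Gabor magnitude and the orientation achieving it."""
    angles = np.linspace(0.0, np.pi, n_angles, endpoint=False)
    responses = np.stack([np.abs(convolve(image, gabor_kernel(sigma, wavelength, a)))
                          for a in angles])
    best = responses.argmax(axis=0)
    return responses.max(axis=0), angles[best]

# Hypothetical test image: oriented stripes at 30 degrees plus noise
rng = np.random.default_rng(4)
yy, xx = np.mgrid[0:128, 0:128]
theta0 = np.deg2rad(30.0)
image = np.sin(2 * np.pi * (xx * np.cos(theta0) + yy * np.sin(theta0)) / 8.0)
image = image + 0.3 * rng.standard_normal(image.shape)

magnitude, orientation = oriented_response(image)
print(f"dominant orientation ~ {np.rad2deg(np.median(orientation)):.1f} deg")
```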
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Authors: Hans-Peter Müller, Jan Kassubek.
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls. DTI data analysis is performed in a variate fashion, i.e. voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA-maps after stereotaxic alignment, in a longitudinal analysis at an individual subject basis reveal information about the progression of neurological disorders. Further quality improvement of DTI based results can be obtained during preprocessing by application of a controlled elimination of gradient directions with high noise levels. In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
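Fractional anisotropy (FA), the voxelwise metric compared across groups in the DTI work above, is computed from the eigenvalues of the diffusion tensor. The sketch below uses the standard FA definition applied to one hypothetical white-matter-like tensor; it illustrates the formula, not the authors' specific processing pipeline.

```python
import numpy as np

def fractional_anisotropy(tensor):
    """Fractional anisotropy of a 3x3 diffusion tensor (standard definition):
    FA = sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda||."""
    eigvals = np.linalg.eigvalsh(tensor)      # real eigenvalues of the symmetric tensor
    md = eigvals.mean()                       # mean diffusivity
    num = np.sqrt(((eigvals - md) ** 2).sum())
    den = np.sqrt((eigvals ** 2).sum())
    return np.sqrt(1.5) * num / den if den > 0 else 0.0

# Hypothetical coherent white-matter tensor (mm^2/s), principal axis along x
D = np.diag([1.6e-3, 0.4e-3, 0.4e-3])
print(f"FA = {fractional_anisotropy(D):.2f}")   # ~0.71 for this example
```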
Determining 3D Flow Fields via Multi-camera Light Field Imaging
Authors: Tadd T. Truscott, Jesse Belden, Joseph R. Nielson, David J. Daily, Scott L. Thomson.
Institutions: Brigham Young University, Naval Undersea Warfare Center, Newport, RI.
In the field of fluid mechanics, the resolution of computational schemes has outpaced experimental methods and widened the gap between predicted and observed phenomena in fluid flows. Thus, a need exists for an accessible method capable of resolving three-dimensional (3D) data sets for a range of problems. We present a novel technique for performing quantitative 3D imaging of many types of flow fields. The 3D technique enables investigation of complicated velocity fields and bubbly flows. Measurements of these types present a variety of challenges to the instrument. For instance, optically dense bubbly multiphase flows cannot be readily imaged by traditional, non-invasive flow measurement techniques due to the bubbles occluding optical access to the interior regions of the volume of interest. By using Light Field Imaging we are able to reparameterize images captured by an array of cameras to reconstruct a 3D volumetric map for every time instance, despite partial occlusions in the volume. The technique makes use of an algorithm known as synthetic aperture (SA) refocusing, whereby a 3D focal stack is generated by combining images from several cameras post-capture 1. Light Field Imaging allows for the capture of angular as well as spatial information about the light rays, and hence enables 3D scene reconstruction. Quantitative information can then be extracted from the 3D reconstructions using a variety of processing algorithms. In particular, we have developed measurement methods based on Light Field Imaging for performing 3D particle image velocimetry (PIV), extracting bubbles in a 3D field and tracking the boundary of a flickering flame. We present the fundamentals of the Light Field Imaging methodology in the context of our setup for performing 3DPIV of the airflow passing over a set of synthetic vocal folds, and show representative results from application of the technique to a bubble-entraining plunging jet.
Physics, Issue 73, Mechanical Engineering, Fluid Mechanics, Engineering, synthetic aperture imaging, light field, camera array, particle image velocimetry, three dimensional, vector fields, image processing, auto calibration, vocal chords, bubbles, flow, fluids
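Synthetic aperture refocusing, the core of the light-field method above, shifts each camera's image by the disparity expected for a chosen focal plane and then averages, so scatterers at that depth reinforce while material at other depths blurs out. The toy sketch below uses a hypothetical 1D camera array and a simple pinhole disparity model; the geometry and numbers are assumptions, not the authors' rig.

```python
import numpy as np

def refocus(images, camera_x_m, depth_m, pixel_m, focal_m):
    """Synthetic-aperture refocus: shift each camera image by the disparity
    expected for a plane at 'depth_m', then average. Pinhole model with
    horizontal-only baselines (illustrative geometry only)."""
    shifted = []
    for img, cx in zip(images, camera_x_m):
        disparity_px = int(round(cx * focal_m / depth_m / pixel_m))
        shifted.append(np.roll(img, -disparity_px, axis=1))   # shift along x
    return np.mean(shifted, axis=0)

# Hypothetical 5-camera horizontal array imaging one bright particle at 0.5 m
rng = np.random.default_rng(5)
cams_x = np.linspace(-0.02, 0.02, 5)        # m, camera baselines
f, pix, depth_true = 0.05, 50e-6, 0.5        # focal length [m], pixel [m], depth [m]
images = []
for cx in cams_x:
    img = np.zeros((64, 256))
    col = 128 + int(round(cx * f / depth_true / pix))   # particle column in this view
    img[32, col] = 1.0
    images.append(img + 0.05 * rng.random(img.shape))   # mild background

sharp = refocus(images, cams_x, depth_m=0.5, pixel_m=pix, focal_m=f)
blurred = refocus(images, cams_x, depth_m=0.4, pixel_m=pix, focal_m=f)
print("peak when focused at the true depth :", round(float(sharp.max()), 2))
print("peak when focused at the wrong depth:", round(float(blurred.max()), 2))
```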
Fluorescent Labeling of COS-7 Expressing SNAP-tag Fusion Proteins for Live Cell Imaging
Authors: Christopher R. Provost, Luo Sun.
Institutions: New England Biolabs.
SNAP-tag and CLIP-tag protein labeling systems enable the specific, covalent attachment of molecules, including fluorescent dyes, to a protein of interest in live cells. These systems offer a broad selection of fluorescent substrates optimized for a range of imaging instrumentation. Once cloned and expressed, the tagged protein can be used with a variety of substrates for numerous downstream applications without having to clone again. There are two steps to using this system: cloning and expression of the protein of interest as a SNAP-tag fusion, and labeling of the fusion with the SNAP-tag substrate of choice. The SNAP-tag is a small protein based on human O6-alkylguanine-DNA-alkyltransferase (hAGT), a DNA repair protein. SNAP-tag labels are dyes conjugated to guanine or chloropyrimidine leaving groups via a benzyl linker. In the labeling reaction, the substituted benzyl group of the substrate is covalently attached to the SNAP-tag. CLIP-tag is a modified version of SNAP-tag, engineered to react with benzylcytosine rather than benzylguanine derivatives. When used in conjunction with SNAP-tag, CLIP-tag enables the orthogonal and complementary labeling of two proteins simultaneously in the same cells.
Cellular Biology, Issue 39, fluorescence, labeling, imaging, SNAP-tag, tag, microscopy, AGT, surface, intracellular, fusion
Laparoscopic Left Liver Sectoriectomy of Caroli's Disease Limited to Segment II and III
Authors: Luigi Boni, Gianlorenzo Dionigi, Francesca Rovera, Matteo Di Giuseppe.
Institutions: University of Insubria, University of Insubria.
Caroli's disease is defined as an abnormal dilatation of the intra-hepatic bile ducts. Its incidence is extremely low (1 in 1,000,000 population) and in most cases the whole liver is involved, so that liver transplantation is the treatment of choice. When the dilatation is limited to the left or right lobe, liver resection can be performed. For many years the standard approach for liver resection has been a formal laparotomy by means of a large abdominal incision, which is characterized by significant post-operative morbidity. More recently, a minimally invasive, laparoscopic approach has been proposed as a possible surgical technique for liver resection for both benign and malignant diseases. The main benefit of the minimally invasive approach is a significant reduction of the surgical trauma, which allows a faster recovery and fewer post-operative complications. This video shows a case of Caroli's disease in a 58-year-old male admitted to the gastroenterology department for sudden onset of abdominal pain associated with fever (>38 °C), nausea and shivering. Abdominal ultrasound demonstrated a significant dilatation of the left-sided intra-hepatic bile ducts with no evidence of gallbladder or common bile duct stones. These findings were confirmed by abdominal high-resolution computed tomography. Laparoscopic left sectoriectomy was planned. Five trocars and a 30° optic were used; exploration of the abdominal cavity showed no adhesions or evidence of other diseases. In order to control blood inflow to the liver, a vascular clamp was placed on the hepatic pedicle (Pringle's manoeuvre). Parenchymal division was carried out with the combined use of a 5 mm bipolar forceps and a 5 mm ultrasonic dissector. A severely dilated left hepatic duct was isolated and divided using a 45 mm endoscopic vascular stapler. Liver dissection was continued up to the isolation of the main left portal branch, which was then divided with a further 45 mm vascular stapler cartridge. At this point the left liver remained attached only by the left hepatic vein: division of the triangular ligament was performed using a monopolar hook, and the hepatic vein was isolated and then divided using a vascular stapler. Haemostasis was refined by application of argon beam coagulation, and no bleeding was revealed even after removal of the vascular clamp (total Pringle's time 27 minutes). The postoperative course was uneventful; minimal elevation of the liver function tests was recorded on post-operative day 1 but had returned to normal by discharge on post-operative day 3.
Medicine, Issue 24, Laparoscopy, Liver resection, Caroli's disease, Left sectoriectomy

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.