Pubmed Article
Systematic computation of nonlinear cellular and molecular dynamics with low-power CytoMimetic circuits: a simulation study.
PLoS ONE
PUBLISHED: 02-05-2013
This paper presents a novel method for the systematic implementation of low-power microelectronic circuits aimed at computing nonlinear cellular and molecular dynamics. The method proposed is based on the Nonlinear Bernoulli Cell Formalism (NBCF), an advanced mathematical framework stemming from the Bernoulli Cell Formalism (BCF) originally exploited for the modular synthesis and analysis of linear, time-invariant, high dynamic range, logarithmic filters. Our approach identifies and exploits the striking similarities existing between the NBCF and coupled nonlinear ordinary differential equations (ODEs) typically appearing in models of naturally encountered biochemical systems. The resulting continuous-time, continuous-value, low-power CytoMimetic electronic circuits succeed in simulating cellular and molecular dynamics rapidly and with good accuracy. The application of the method is illustrated by synthesising for the first time microelectronic CytoMimetic topologies which successfully simulate: 1) a nonlinear intracellular calcium oscillations model for several Hill coefficient values and 2) a gene-protein regulatory system model. The dynamic behaviours generated by the proposed CytoMimetic circuits are compared and found to be in very good agreement with their biological counterparts. The circuits exploit the exponential law codifying the low-power subthreshold operation regime and have been simulated with realistic parameters from a commercially available CMOS process. They occupy an area of a fraction of a square-millimetre, while consuming between 1 and 12 microwatts of power. Simulation results for fabrication-related variability are also presented.
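A minimal sketch of the kind of coupled nonlinear ODE system such circuits emulate is a gene-protein negative-feedback loop integrated by forward Euler. The model form, the Hill repression term, and every parameter value below are illustrative assumptions, not the paper's circuit equations:

```python
# Minimal gene-protein regulatory model of the coupled-ODE form that
# CytoMimetic circuits emulate. All parameter values are illustrative.
alpha, beta = 2.0, 1.0        # transcription and translation rates
K, n = 1.0, 4                 # Hill constant and Hill coefficient
d_m, d_p = 0.5, 0.3           # mRNA and protein degradation rates

m = p = 0.0                   # mRNA and protein concentrations
dt = 0.001
for _ in range(int(100 / dt)):                       # forward-Euler integration
    dmdt = alpha / (1.0 + (p / K) ** n) - d_m * m    # repressed transcription
    dpdt = beta * m - d_p * p                        # translation minus decay
    m += dmdt * dt
    p += dpdt * dt
```

Sweeping the Hill coefficient n in such a sketch mirrors, in software, the paper's exploration of several Hill coefficient values in hardware.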
Authors: Kelley D. Sullivan, Edward B. Brown.
Published: 02-26-2010
ABSTRACT
Multi-fluorescence recovery after photobleaching is a microscopy technique used to measure the diffusion coefficient (or analogous transport parameters) of macromolecules, and can be applied to both in vitro and in vivo biological systems. Multi-fluorescence recovery after photobleaching is performed by photobleaching a region of interest within a fluorescent sample using an intense laser flash, then attenuating the beam and monitoring the fluorescence as still-fluorescent molecules from outside the region of interest diffuse in to replace the photobleached molecules. We will begin our demonstration by aligning the laser beam through the Pockels Cell (laser modulator) and along the optical path through the laser scan box and objective lens to the sample. For simplicity, we will use a sample of aqueous fluorescent dye. We will then determine the proper experimental parameters for our sample, including monitor and bleaching powers, bleach duration, bin widths (for photon counting), and fluorescence recovery time. Next, we will describe the procedure for taking recovery curves, a process that can be largely automated via LabVIEW (National Instruments, Austin, TX) for enhanced throughput. Finally, the diffusion coefficient is determined by fitting the recovery data to the appropriate mathematical model using a least-squares fitting algorithm, readily programmable using software such as MATLAB (The Mathworks, Natick, MA).
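The final least-squares fitting step can be sketched with a simple grid search. The single-exponential recovery below is an illustrative stand-in for the full diffusion model, and the bleach radius w and all other values are assumed:

```python
import math

def recovery(t, f0, finf, tau):
    # single-exponential recovery curve (illustrative stand-in for the
    # full diffusion model fitted in the protocol)
    return f0 + (finf - f0) * (1.0 - math.exp(-t / tau))

def fit_tau(times, data, f0, finf, candidates):
    # least-squares fit of the recovery time over a grid of candidates
    best, best_sse = None, float("inf")
    for tau in candidates:
        sse = sum((recovery(t, f0, finf, tau) - y) ** 2
                  for t, y in zip(times, data))
        if sse < best_sse:
            best, best_sse = tau, sse
    return best

# synthetic recovery curve with a known 0.5 s recovery time
times = [0.05 * i for i in range(1, 200)]
data = [recovery(t, 0.2, 1.0, 0.5) for t in times]
tau = fit_tau(times, data, 0.2, 1.0, [0.05 * k for k in range(1, 41)])
w = 0.5e-4                     # assumed 1/e^2 bleach radius in cm
D = w * w / (4.0 * tau)        # characteristic-time estimate of D
```

In practice the data would be fit to the appropriate multiphoton diffusion model with a proper optimizer (e.g. MATLAB least squares, as the abstract describes); the grid search above only illustrates the residual-minimization idea.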
Optimize Flue Gas Settings to Promote Microalgae Growth in Photobioreactors via Computer Simulations
Authors: Lian He, Amelia B Chen, Yi Yu, Leah Kucera, Yinjie Tang.
Institutions: Washington University in St. Louis, Wuhan University of China.
Flue gas from power plants can promote algal cultivation and reduce greenhouse gas emissions1. Microalgae not only capture solar energy more efficiently than plants3, but also synthesize advanced biofuels2-4. Generally, atmospheric CO2 is not a sufficient source for supporting maximal algal growth5. On the other hand, the high concentrations of CO2 in industrial exhaust gases have adverse effects on algal physiology. Consequently, both cultivation conditions (such as nutrients and light) and the control of the flue gas flow into the photo-bioreactors are important for developing an efficient “flue gas to algae” system. Researchers have proposed different photobioreactor configurations4,6 and cultivation strategies7,8 with flue gas. Here, we present a protocol that demonstrates how to use models to predict microalgal growth in response to flue gas settings. We perform both experimental illustration and model simulations to determine the favorable conditions for algal growth with flue gas. We develop a Monod-based model coupled with mass transfer and light intensity equations to simulate microalgal growth in a homogeneous photo-bioreactor. The model simulation compares algal growth and flue gas consumption under different flue-gas settings. The model illustrates: 1) how algal growth is influenced by different volumetric mass transfer coefficients of CO2; 2) how we can find the optimal CO2 concentration for algal growth via the dynamic optimization approach (DOA); 3) how we can design a rectangular on-off flue gas pulse to promote algal biomass growth and to reduce the usage of flue gas. On the experimental side, we present a protocol for growing Chlorella under flue gas (generated by natural gas combustion). The experimental results qualitatively validate the model predictions that high-frequency flue gas pulses can significantly improve algal cultivation.
Environmental Sciences, Issue 80, Microbiology, Cellular Biology, Marine Biology, Primary Cell Culture, Chlorella, CO2, mass transfer, Monod model, On-off pulse, Simulink
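The Monod-plus-mass-transfer structure described above can be sketched as a two-state Euler integration. All parameter values are illustrative placeholders, not the paper's fitted constants:

```python
# Monod growth coupled to gas-liquid CO2 mass transfer; parameters assumed.
mu_max, Ks = 0.12, 0.3        # max growth rate (1/hr), half-saturation (g/L)
kLa, c_sat = 4.0, 1.0         # mass transfer coeff. (1/hr), saturated CO2 (g/L)
yield_c = 1.8                 # g CO2 consumed per g biomass formed

dt, X, C = 0.01, 0.05, 0.5    # time step (hr), biomass (g/L), dissolved CO2 (g/L)
for _ in range(int(48 / dt)):                     # 48 hr of cultivation
    mu = mu_max * C / (Ks + C)                    # Monod specific growth rate
    dX = mu * X                                   # biomass accumulation
    dC = kLa * (c_sat - C) - yield_c * mu * X     # transfer in minus uptake
    X += dX * dt
    C += dC * dt
```

Varying kLa or replacing the constant c_sat with an on-off pulse schedule reproduces, in miniature, the comparisons the model simulation performs.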
High Throughput Microfluidic Rapid and Low Cost Prototyping Packaging Methods
Authors: Amine Miled, Mohamad Sawan.
Institutions: Polytechnique Montreal.
In this work, three different packaging and assembly techniques are presented. They can be classified into two categories: one-time-use and reusable packaging techniques. The one-time-use packaging technique employs UV-based and temperature-curing epoxies to connect microtubes to access holes, wire-bonding for integrated circuit connections, and silver epoxy for electrical connections. This method is based on a robust assembly technique that can support relatively high pressures, close to 1 psi, and does not need any support to strengthen the microfluidic architecture. Reusable packaging techniques consist of PDMS-based microtube interconnectors and anisotropic adhesive films for electrical connections. These devices are more sensitive and fragile. Consequently, a Plexiglas support is added to the microfluidic structure to improve the electrical contact when anisotropic adhesive films are used, and also to strengthen the microfluidic architecture. In addition, a micromanipulator is needed to hold the tubes in place while a thin PDMS layer is used to connect them to the access holes. Different PDMS layer thicknesses, ranging from 0.45-3 mm, are tested to compare adherence against injection rate. Applied injection rates are varied from 50-300 μl/hr for the 0.45-3 mm PDMS layers, respectively. These techniques are mainly applicable for low-pressure applications. However, they can be extended to high-pressure ones through an oxygen-plasma process that permanently seals the PDMS to glass substrates. The main advantage of this technique, besides being reusable, is that the device remains observable when the microchannel length is very short (in the range of 3 mm or less).
Bioengineering, Issue 82, Microfluidics, PDMS, Lab-on-chip, Rapid-Prototyping, Microfabrication
Experimental Measurement of Settling Velocity of Spherical Particles in Unconfined and Confined Surfactant-based Shear Thinning Viscoelastic Fluids
Authors: Sahil Malhotra, Mukul M. Sharma.
Institutions: The University of Texas at Austin.
An experimental study is performed to measure the terminal settling velocities of spherical particles in surfactant-based shear-thinning viscoelastic (VES) fluids. The measurements are made for particles settling in unbounded fluids and in fluids between parallel walls. VES fluids over a wide range of rheological properties are prepared and rheologically characterized. The rheological characterization involves steady shear-viscosity and dynamic oscillatory-shear measurements to quantify the viscous and elastic properties, respectively. The settling velocities under unbounded conditions are measured in beakers with diameters at least 25 times the particle diameter. For measuring settling velocities between parallel walls, two experimental cells with different wall spacings are constructed. Spherical particles of varying sizes are gently dropped into the fluids and allowed to settle. The process is recorded with a high-resolution video camera and the trajectory of the particle is extracted using image-analysis software. Terminal settling velocities are calculated from these data. The impact of elasticity on settling velocity in unbounded fluids is quantified by comparing the experimental settling velocity to the settling velocity calculated from the inelastic drag predictions of Renaud et al.1 Results show that the elasticity of fluids can increase or decrease the settling velocity. The magnitude of the reduction/increase is a function of the rheological properties of the fluids and the properties of the particles. Confining walls are observed to cause a retardation effect on settling, and the retardation is measured in terms of wall factors.
Physics, Issue 83, chemical engineering, settling velocity, Reynolds number, shear thinning, wall retardation
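As a point of reference for such measurements, a Newtonian Stokes-law baseline with a classical tube-axis wall correction can be computed. This is an illustrative stand-in, not the Renaud et al. inelastic prediction used in the paper, and all property values are assumed:

```python
def stokes_velocity(d, rho_p, rho_f, mu, g=9.81):
    # terminal settling velocity of a sphere in creeping (Stokes) flow
    return (rho_p - rho_f) * g * d ** 2 / (18.0 * mu)

def faxen_wall_factor(d, D):
    # Faxen correction for a sphere settling on the axis of a tube of
    # diameter D; used here as a simple stand-in for parallel-wall factors
    r = d / D
    return 1.0 - 2.104 * r + 2.089 * r ** 3 - 0.948 * r ** 5

# assumed case: 2 mm glass bead (2500 kg/m^3) in a 0.5 Pa.s fluid
v_inf = stokes_velocity(2e-3, 2500.0, 1000.0, 0.5)
v_conf = v_inf * faxen_wall_factor(2e-3, 20e-3)   # 20 mm confinement
```

The wall factor in the paper is the measured ratio v_conf/v_inf; the Faxén series above shows the expected retardation trend as the particle-to-gap ratio grows.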
Designing Silk-silk Protein Alloy Materials for Biomedical Applications
Authors: Xiao Hu, Solomon Duki, Joseph Forys, Jeffrey Hettinger, Justin Buchicchio, Tabbetha Dobbins, Catherine Yang.
Institutions: Rowan University, Cooper Medical School of Rowan University.
Fibrous proteins display different sequences and structures that have been used for various applications in biomedical fields such as biosensors, nanomedicine, tissue regeneration, and drug delivery. Designing materials based on the molecular-scale interactions between these proteins will help generate new multifunctional protein alloy biomaterials with tunable properties. Such alloy material systems also provide advantages over traditional synthetic polymers due to the materials' biodegradability, biocompatibility, and tunability in the body. This article uses protein blends of wild tussah silk (Antheraea pernyi) and domestic mulberry silk (Bombyx mori) as an example to provide useful protocols regarding these topics, including how to predict protein-protein interactions by computational methods, how to produce protein alloy solutions, how to verify alloy systems by thermal analysis, and how to fabricate variable alloy materials, including optical materials with diffraction gratings, electric materials with circuit coatings, and pharmaceutical materials for drug release and delivery. These methods can provide important information for designing the next generation of multifunctional biomaterials based on different protein alloys.
Bioengineering, Issue 90, protein alloys, biomaterials, biomedical, silk blends, computational simulation, implantable electronic devices
Measurement of Coherence Decay in GaMnAs Using Femtosecond Four-wave Mixing
Authors: Daniel Webber, Tristan de Boer, Murat Yildirim, Sam March, Reuble Mathew, Angela Gamouras, Xinyu Liu, Margaret Dobrowolska, Jacek Furdyna, Kimberley Hall.
Institutions: Dalhousie University, University of Notre Dame.
The application of femtosecond four-wave mixing to the study of fundamental properties of diluted magnetic semiconductors ((s,p)-d hybridization, spin-flip scattering) is described, using experiments on GaMnAs as a prototype III-Mn-V system. Spectrally-resolved and time-resolved experimental configurations are described, including the use of zero-background autocorrelation techniques for pulse optimization. The etching process used to prepare GaMnAs samples for four-wave mixing experiments is also highlighted. The high temporal resolution of this technique, afforded by the use of short (20 fs) optical pulses, permits the rapid spin-flip scattering process in this system to be studied directly in the time domain, providing new insight into the strong exchange coupling responsible for carrier-mediated ferromagnetism. We also show that spectral resolution of the four-wave mixing signal allows one to extract clear signatures of (s,p)-d hybridization in this system, unlike linear spectroscopy techniques. This increased sensitivity is due to the nonlinearity of the technique, which suppresses defect-related contributions to the optical response. This method may be used to measure the time scale for coherence decay (tied to the fastest scattering processes) in a wide variety of semiconductor systems of interest for next generation electronics and optoelectronics.
Physics, Issue 82, Four-wave mixing, spin-flip scattering, ultrafast, GaMnAs, diluted magnetic semiconductor, photon echo, dephasing, GaAs, low temperature grown semiconductor, exchange, ferromagnetic
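Extracting a coherence-decay time from a four-wave mixing trace amounts to an exponential fit. The sketch below uses a synthetic trace and a log-linear least-squares fit; the factor relating the signal decay constant to T2 is an assumption that depends on whether the broadening is homogeneous or inhomogeneous:

```python
import math

def fit_decay(delays, signal):
    # log-linear least squares for S(tau) = S0 * exp(-tau / tau_d)
    ys = [math.log(s) for s in signal]
    n = len(delays)
    mx, my = sum(delays) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(delays, ys))
             / sum((x - mx) ** 2 for x in delays))
    return -1.0 / slope                 # signal decay constant tau_d

# synthetic FWM signal with a 50 fs decay constant
delays = [5.0 * i for i in range(1, 40)]         # delay in fs
signal = [math.exp(-t / 50.0) for t in delays]
tau_d = fit_decay(delays, signal)
T2 = 2.0 * tau_d   # assumes homogeneous broadening; a photon echo would differ
```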
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
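The "software-guided setup of optimal experiment combinations" starts from a candidate set such as a full factorial. The factor names and levels below are hypothetical, chosen only to illustrate the enumeration step:

```python
from itertools import product

# hypothetical factors for a transient-expression DoE (names assumed)
factors = {
    "incubation_temp_C": [22, 25, 28],
    "plant_age_d": [35, 42, 49],
    "light_umol_m2_s": [75, 150],
}
# full-factorial candidate set: every combination of factor levels
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
# 3 x 3 x 2 = 18 candidate runs; DoE software would then select an
# information-maximizing subset and augment the design step-wise
```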
Quantification of Global Diastolic Function by Kinematic Modeling-based Analysis of Transmitral Flow via the Parametrized Diastolic Filling Formalism
Authors: Sina Mossahebi, Simeng Zhu, Howard Chen, Leonid Shmuylovich, Erina Ghosh, Sándor J. Kovács.
Institutions: Washington University in St. Louis.
Quantitative cardiac function assessment remains a challenge for physiologists and clinicians. Although historically invasive methods have comprised the only means available, the development of noninvasive imaging modalities (echocardiography, MRI, CT) having high temporal and spatial resolution provide a new window for quantitative diastolic function assessment. Echocardiography is the agreed upon standard for diastolic function assessment, but indexes in current clinical use merely utilize selected features of chamber dimension (M-mode) or blood/tissue motion (Doppler) waveforms without incorporating the physiologic causal determinants of the motion itself. The recognition that all left ventricles (LV) initiate filling by serving as mechanical suction pumps allows global diastolic function to be assessed based on laws of motion that apply to all chambers. What differentiates one heart from another are the parameters of the equation of motion that governs filling. Accordingly, development of the Parametrized Diastolic Filling (PDF) formalism has shown that the entire range of clinically observed early transmitral flow (Doppler E-wave) patterns are extremely well fit by the laws of damped oscillatory motion. This permits analysis of individual E-waves in accordance with a causal mechanism (recoil-initiated suction) that yields three (numerically) unique lumped parameters whose physiologic analogues are chamber stiffness (k), viscoelasticity/relaxation (c), and load (xo). The recording of transmitral flow (Doppler E-waves) is standard practice in clinical cardiology and, therefore, the echocardiographic recording method is only briefly reviewed. Our focus is on determination of the PDF parameters from routinely recorded E-wave data. 
As the highlighted results indicate, once the PDF parameters have been obtained from a suitable number of load-varying E-waves, the investigator is free to use the parameters or construct indexes from the parameters (such as the stored energy (1/2)kxo², the maximum A-V pressure gradient kxo, the load-independent index of diastolic function, etc.) and select the aspect of physiology or pathophysiology to be quantified.
Bioengineering, Issue 91, cardiovascular physiology, ventricular mechanics, diastolic function, mathematical modeling, Doppler echocardiography, hemodynamics, biomechanics
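The damped-oscillator kinematics behind the PDF formalism can be sketched directly. The parameter values are illustrative, not clinically fitted:

```python
import math

def e_wave_velocity(t, k, c, xo):
    # underdamped solution of x'' + c*x' + k*x = 0 with x(0) = xo, x'(0) = 0;
    # the transmitral E-wave contour is proportional to the filling speed
    w = math.sqrt(k - c * c / 4.0)       # damped natural frequency
    return xo * (k / w) * math.exp(-c * t / 2.0) * math.sin(w * t)

# illustrative PDF-style parameters: stiffness k, relaxation c, load xo
k, c, xo = 225.0, 14.0, 0.12
times = [0.001 * i for i in range(400)]
contour = [e_wave_velocity(t, k, c, xo) for t in times]
peak = max(contour)                      # analogue of E-wave peak velocity
energy = 0.5 * k * xo ** 2               # stored-energy index (1/2)k*xo^2
```

Fitting this closed-form contour to a recorded Doppler E-wave is what yields the (k, c, xo) triplet described in the abstract.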
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. 
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
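The semi-automated and automated approaches typically begin with a binary mask. A minimal sketch of global thresholding followed by a 4-connected component count, in pure Python on a toy image (not the paper's pipeline):

```python
def threshold(img, level):
    # global intensity threshold: 1 = feature, 0 = background
    return [[1 if v >= level else 0 for v in row] for row in img]

def count_components(mask):
    # count 4-connected foreground components via iterative flood fill
    rows, cols = len(mask), len(mask[0])
    seen, n = set(), 0
    for i in range(rows):
        for j in range(cols):
            if mask[i][j] and (i, j) not in seen:
                n += 1
                stack = [(i, j)]
                while stack:
                    a, b = stack.pop()
                    if not (0 <= a < rows and 0 <= b < cols):
                        continue
                    if (a, b) in seen or not mask[a][b]:
                        continue
                    seen.add((a, b))
                    stack += [(a + 1, b), (a - 1, b), (a, b + 1), (a, b - 1)]
    return n

# toy 3x3 "image" with three bright features
img = [[10, 200, 10], [10, 210, 10], [180, 10, 190]]
mask = threshold(img, 128)
```

Real EM volumes need far more than a global threshold, which is exactly why the triage scheme weighs signal-to-noise, crowdedness, and feature shape before choosing an approach.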
The Preparation of Electrohydrodynamic Bridges from Polar Dielectric Liquids
Authors: Adam D. Wexler, Mónica López Sáenz, Oliver Schreer, Jakob Woisetschläger, Elmar C. Fuchs.
Institutions: Wetsus - Centre of Excellence for Sustainable Water Technology, IRCAM GmbH, Graz University of Technology.
Horizontal and vertical liquid bridges are simple and powerful tools for exploring the interaction of high intensity electric fields (8-20 kV/cm) and polar dielectric liquids. These bridges are distinct from capillary bridges in that they exhibit extensibility beyond a few millimeters, have complex bi-directional mass transfer patterns, and emit non-Planck infrared radiation. A number of common solvents can form such bridges, as well as low conductivity solutions and colloidal suspensions. The macroscopic behavior is governed by electrohydrodynamics and provides a means of studying fluid flow phenomena without the presence of rigid walls. Prior to the onset of a liquid bridge, several important phenomena can be observed, including advancing meniscus height (electrowetting), bulk fluid circulation (the Sumoto effect), and the ejection of charged droplets (electrospray). The interaction between surface, polarization, and displacement forces can be directly examined by varying applied voltage and bridge length. The electric field, assisted by gravity, stabilizes the liquid bridge against Rayleigh-Plateau instabilities. Construction of basic apparatus for both vertical and horizontal orientations, along with operational examples, including thermographic images, for three liquids (water, DMSO, and glycerol) is presented.
Physics, Issue 91, floating water bridge, polar dielectric liquids, liquid bridge, electrohydrodynamics, thermography, dielectrophoresis, electrowetting, Sumoto effect, Armstrong effect
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
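The quoted ~10-30 nm localization precision can be estimated with a Thompson-style shot-noise formula. Treat this as an illustrative approximation with assumed numbers (PSF width, photon count, pixel size, background), not FPALM's exact analysis:

```python
import math

def localization_precision(s, n, a, b):
    # Thompson/Larson/Webb-style estimate of 2D localization precision (nm):
    # PSF standard deviation s (nm), N detected photons n, pixel size a (nm),
    # background noise b (photons per pixel). Illustrative approximation only.
    var = s ** 2 / n + a ** 2 / (12.0 * n) \
        + 8.0 * math.pi * s ** 4 * b ** 2 / (a ** 2 * n ** 2)
    return math.sqrt(var)

# assumed values: 130 nm PSF width, 300 photons, 100 nm pixels, background 2
precision = localization_precision(130.0, 300, 100.0, 2.0)
```

The photon-count scaling is the key design point: collecting more photons per molecule tightens the localization, which is why probe choice and excitation settings dominate the protocol.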
Viability Assays for Cells in Culture
Authors: Jessica M. Posimo, Ajay S. Unnithan, Amanda M. Gleixner, Hailey J. Choi, Yiran Jiang, Sree H. Pulugulla, Rehana K. Leak.
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16-bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
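The claim that each assay's signal is linear in the number of cells plated can be checked with an ordinary least-squares fit. The plating series and signal values below are hypothetical:

```python
def linfit(xs, ys):
    # ordinary least-squares line through (cells plated, assay signal)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
    ss_tot = sum((y - my) ** 2 for y in ys)
    r2 = 1.0 - ss_res / ss_tot             # coefficient of determination
    return slope, intercept, r2

# hypothetical plating series and background-corrected signals
cells = [5000, 10000, 20000, 40000, 80000]
signal = [1.1e4, 2.0e4, 4.2e4, 7.9e4, 1.62e5]
slope, intercept, r2 = linfit(cells, signal)
```

An r² close to 1 over the plating range supports using signal strength as a proxy for cell number; a drop in r² after treatment would flag the proportionality caveat raised in the abstract.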
The Generation of Higher-order Laguerre-Gauss Optical Beams for High-precision Interferometry
Authors: Ludovico Carbone, Paul Fulda, Charlotte Bond, Frank Brueckner, Daniel Brown, Mengyao Wang, Deepali Lodhia, Rebecca Palmer, Andreas Freise.
Institutions: University of Birmingham.
Thermal noise in high-reflectivity mirrors is a major impediment for several types of high-precision interferometric experiments that aim to reach the standard quantum limit or to cool mechanical systems to their quantum ground state. This is, for example, the case for future gravitational wave observatories, whose sensitivity to gravitational wave signals is expected to be limited, in the most sensitive frequency band, by atomic vibration of their mirror masses. One promising approach being pursued to overcome this limitation is to employ higher-order Laguerre-Gauss (LG) optical beams in place of the conventionally used fundamental mode. Owing to their more homogeneous light intensity distribution these beams average more effectively over the thermally driven fluctuations of the mirror surface, which in turn reduces the uncertainty in the mirror position sensed by the laser light. We demonstrate a promising method to generate higher-order LG beams by shaping a fundamental Gaussian beam with the help of diffractive optical elements. We show that with conventional sensing and control techniques that are known for stabilizing fundamental laser beams, higher-order LG modes can be purified and stabilized just as well at a comparably high level. A set of diagnostic tools allows us to control and tailor the properties of generated LG beams. This enabled us to produce an LG beam with the highest purity reported to date. The demonstrated compatibility of higher-order LG modes with standard interferometry techniques and with the use of standard spherical optics makes them an ideal candidate for application in a future generation of high-precision interferometry.
Physics, Issue 78, Optics, Astronomy, Astrophysics, Gravitational waves, Laser interferometry, Metrology, Thermal noise, Laguerre-Gauss modes, interferometry
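The radial intensity profile of a higher-order LG mode follows from the generalized Laguerre polynomial. The sketch below (spot size and mode indices assumed, normalization omitted) shows the multi-ring structure that underlies the improved averaging over mirror-surface fluctuations:

```python
import math

def genlaguerre(p, a, x):
    # generalized Laguerre polynomial L_p^a(x) via the standard recurrence
    # (k+1) L_{k+1} = (2k+1+a-x) L_k - (k+a) L_{k-1}
    if p == 0:
        return 1.0
    lm1, l = 1.0, 1.0 + a - x
    for k in range(1, p):
        lm1, l = l, ((2 * k + 1 + a - x) * l - (k + a) * lm1) / (k + 1)
    return l

def lg_intensity(r, p, l, w):
    # radial intensity profile of an LG(p, l) mode with spot size w;
    # overall normalization constant omitted (only the shape matters here)
    u = 2.0 * r ** 2 / w ** 2
    return u ** abs(l) * genlaguerre(p, abs(l), u) ** 2 * math.exp(-u)

# an LG(3,3)-type mode sampled radially (w = 1, arbitrary units)
profile = [lg_intensity(0.05 * i, 3, 3, 1.0) for i in range(80)]
```

The on-axis null and concentric rings of such a mode spread the optical power over more of the mirror surface than the fundamental Gaussian does.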
Simulation, Fabrication and Characterization of THz Metamaterial Absorbers
Authors: James P. Grant, Iain J.H. McCrindle, David R.S. Cumming.
Institutions: University of Glasgow.
Metamaterials (MM), artificial materials engineered to have properties that may not be found in nature, have been widely explored since the first theoretical1 and experimental demonstration2 of their unique properties. MMs can provide a highly controllable electromagnetic response, and to date have been demonstrated in every technologically relevant spectral range including the optical3, near IR4, mid IR5, THz6, mm-wave7, microwave8 and radio9 bands. Applications include perfect lenses10, sensors11, telecommunications12, invisibility cloaks13 and filters14,15. We have recently developed single band16, dual band17 and broadband18 THz metamaterial absorber devices capable of greater than 80% absorption at the resonance peak. The concept of a MM absorber is especially important at THz frequencies, where it is difficult to find strong frequency-selective THz absorbers19. In our MM absorber the THz radiation is absorbed in a thickness of ~λ/20, overcoming the thickness limitation of traditional quarter-wavelength absorbers. MM absorbers naturally lend themselves to THz detection applications, such as thermal sensors, and if integrated with suitable THz sources (e.g. QCLs), could lead to compact, highly sensitive, low cost, real time THz imaging systems.
Materials Science, Issue 70, Physics, Engineering, Metamaterial, terahertz, sensing, fabrication, clean room, simulation, FTIR, spectroscopy
Patient-specific Modeling of the Heart: Estimation of Ventricular Fiber Orientations
Authors: Fijoy Vadakkumpadan, Hermenegild Arevalo, Natalia A. Trayanova.
Institutions: Johns Hopkins University.
Patient-specific simulations of heart (dys)function aimed at personalizing cardiac therapy are hampered by the absence of in vivo imaging technology for clinically acquiring myocardial fiber orientations. The objective of this project was to develop a methodology to estimate cardiac fiber orientations from in vivo images of patient heart geometries. An accurate representation of ventricular geometry and fiber orientations was reconstructed, respectively, from high-resolution ex vivo structural magnetic resonance (MR) and diffusion tensor (DT) MR images of a normal human heart, referred to as the atlas. Ventricular geometry of a patient heart was extracted, via semiautomatic segmentation, from an in vivo computed tomography (CT) image. Using image transformation algorithms, the atlas ventricular geometry was deformed to match that of the patient. Finally, the deformation field was applied to the atlas fiber orientations to obtain an estimate of patient fiber orientations. The accuracy of the fiber estimates was assessed using six normal and three failing canine hearts. The mean absolute difference between inclination angles of acquired and estimated fiber orientations was 15.4 °. Computational simulations of ventricular activation maps and pseudo-ECGs in sinus rhythm and ventricular tachycardia indicated that there are no significant differences between estimated and acquired fiber orientations at a clinically observable level. The new insights obtained from the project will pave the way for the development of patient-specific models of the heart that can aid physicians in personalized diagnosis and decisions regarding electrophysiological interventions.
Bioengineering, Issue 71, Biomedical Engineering, Medicine, Anatomy, Physiology, Cardiology, Myocytes, Cardiac, Image Processing, Computer-Assisted, Magnetic Resonance Imaging, MRI, Diffusion Magnetic Resonance Imaging, Cardiac Electrophysiology, computerized simulation (general), mathematical modeling (systems analysis), Cardiomyocyte, biomedical image processing, patient-specific modeling, Electrophysiology, simulation
Applications of EEG Neuroimaging Data: Event-related Potentials, Spectral Power, and Multiscale Entropy
Authors: Jennifer J. Heisz, Anthony R. McIntosh.
Institutions: Baycrest.
When considering human neuroimaging data, an appreciation of signal variability represents a fundamental innovation in the way we think about brain signals. Typically, researchers represent the brain's response as the mean across repeated experimental trials and disregard signal fluctuations over time as "noise". However, it is becoming clear that brain signal variability conveys meaningful functional information about neural network dynamics. This article describes the novel method of multiscale entropy (MSE) for quantifying brain signal variability. MSE may be particularly informative about neural network dynamics because it shows timescale dependence and sensitivity to linear and nonlinear dynamics in the data.
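The MSE computation described above follows a standard two-step recipe: coarse-grain the signal at each timescale (non-overlapping window averages), then compute the sample entropy of each coarse-grained series. A minimal sketch, where the parameter choices m = 2 and r = 0.15 × SD are common defaults rather than values taken from the article:

```python
import numpy as np

def sample_entropy(x, m=2, r_frac=0.15):
    """Sample entropy SampEn(m, r) of a 1-D signal; r = r_frac * SD(x)."""
    x = np.asarray(x, dtype=float)
    r = r_frac * np.std(x)
    n = len(x)

    def match_count(mm):
        # All overlapping templates of length mm.
        templates = np.array([x[i:i + mm] for i in range(n - mm)])
        count = 0
        for i in range(len(templates)):
            # Chebyshev distance to all later templates (no self-matches).
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b = match_count(m)      # matches of length m
    a = match_count(m + 1)  # matches of length m + 1
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, max_scale=5, m=2, r_frac=0.15):
    """Coarse-grain the signal at each scale, then take its sample entropy."""
    x = np.asarray(x, dtype=float)
    mse = []
    for tau in range(1, max_scale + 1):
        n = (len(x) // tau) * tau
        coarse = x[:n].reshape(-1, tau).mean(axis=1)
        mse.append(sample_entropy(coarse, m, r_frac))
    return mse
```

White noise, for instance, yields finite positive entropy at every scale, while the shape of the MSE curve across scales is what distinguishes signal types.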
Neuroscience, Issue 76, Neurobiology, Anatomy, Physiology, Medicine, Biomedical Engineering, Electroencephalography, EEG, electroencephalogram, Multiscale entropy, sample entropy, MEG, neuroimaging, variability, noise, timescale, non-linear, brain signal, information theory, brain, imaging
Fabrication And Characterization Of Photonic Crystal Slow Light Waveguides And Cavities
Authors: Christopher Paul Reardon, Isabella H. Rey, Karl Welna, Liam O'Faolain, Thomas F. Krauss.
Institutions: University of St Andrews.
Slow light has been one of the hot topics in the photonics community in the past decade, generating great interest both from a fundamental point of view and for its considerable potential for practical applications. Slow light photonic crystal waveguides, in particular, have played a major part and have been successfully employed for delaying optical signals1-4 and for the enhancement of both linear5-7 and nonlinear devices.8-11 Photonic crystal cavities achieve similar effects to those of slow light waveguides, but over a reduced bandwidth. These cavities offer a high Q-factor/volume ratio for the realization of optically12 and electrically13 pumped ultra-low threshold lasers and the enhancement of nonlinear effects.14-16 Furthermore, passive filters17 and modulators18-19 have been demonstrated, exhibiting ultra-narrow linewidth, high free-spectral range and record values of low energy consumption. To attain these exciting results, a robust repeatable fabrication protocol must be developed. In this paper we take an in-depth look at our fabrication protocol which employs electron-beam lithography for the definition of photonic crystal patterns and uses wet and dry etching techniques. Our optimised fabrication recipe results in photonic crystals that do not suffer from vertical asymmetry and exhibit very good edge-wall roughness. We discuss the results of varying the etching parameters and the detrimental effects that they can have on a device, leading to a diagnostic route that can be taken to identify and eliminate similar issues. The key to evaluating slow light waveguides is the passive characterization of transmission and group index spectra.
Various methods have been reported, most notably resolving the Fabry-Perot fringes of the transmission spectrum20-21 and interferometric techniques.22-25 Here, we describe a direct, broadband measurement technique combining spectral interferometry with Fourier transform analysis.26 Our method stands out for its simplicity and power: we can characterise a bare photonic crystal with access waveguides, without need for on-chip interference components, and the setup consists only of a Mach-Zehnder interferometer, with no moving parts or delay scans. When characterising photonic crystal cavities, techniques involving internal sources21 or external waveguides directly coupled to the cavity27 impact on the performance of the cavity itself, thereby distorting the measurement. Here, we describe a novel and non-intrusive technique that makes use of a cross-polarised probe beam and is known as resonant scattering (RS), where the probe is coupled out-of-plane into the cavity through an objective. The technique was first demonstrated by McCutcheon et al.28 and further developed by Galli et al.29
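For the Fabry-Perot fringe approach mentioned above, the group index follows from the fringe spacing (the free spectral range) via the textbook relation n_g = λ²/(2·L·Δλ), where L is the waveguide length. A minimal sketch (variable names are ours):

```python
def group_index(wavelength, fringe_spacing, length):
    """Group index from the free spectral range of Fabry-Perot fringes:
    n_g = lambda^2 / (2 * L * delta_lambda).  All lengths in meters."""
    return wavelength**2 / (2 * length * fringe_spacing)
```

For example, fringes spaced by about 0.4 nm on a 100 µm waveguide at 1,550 nm correspond to a group index of roughly 30, a typical slow-light value.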
Physics, Issue 69, Optics and Photonics, Astronomy, light scattering, light transmission, optical waveguides, photonics, photonic crystals, Slow-light, Cavities, Waveguides, Silicon, SOI, Fabrication, Characterization
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary , University of Calgary .
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
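The first analysis step above, extracting oriented structure with a bank of Gabor filters, can be sketched as follows. This is a generic real Gabor filter bank with assumed parameters (kernel size, σ, wavelength); it is not the specific multi-resolution design used in the paper:

```python
import numpy as np

def gabor_kernel(theta, ksize=21, sigma=3.0, wavelength=8.0):
    """Zero-mean real Gabor kernel tuned to gratings whose wavevector
    points along angle theta (radians)."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    g = np.exp(-(x**2 + y**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)
    return g - g.mean()

def convolve2d_same(img, kern):
    """FFT-based 2-D convolution, cropped to the input image size."""
    s0 = img.shape[0] + kern.shape[0] - 1
    s1 = img.shape[1] + kern.shape[1] - 1
    full = np.fft.irfft2(np.fft.rfft2(img, (s0, s1)) * np.fft.rfft2(kern, (s0, s1)),
                         (s0, s1))
    h0, h1 = kern.shape[0] // 2, kern.shape[1] // 2
    return full[h0:h0 + img.shape[0], h1:h1 + img.shape[1]]

def orientation_field(img, n_angles=8):
    """Per-pixel dominant texture orientation: the angle of the Gabor
    filter with the strongest response, plus that response magnitude."""
    angles = np.linspace(0, np.pi, n_angles, endpoint=False)
    responses = np.stack([np.abs(convolve2d_same(img, gabor_kernel(a)))
                          for a in angles])
    return angles[responses.argmax(axis=0)], responses.max(axis=0)
```

On a synthetic grating varying along x, the strongest response comes from the θ = 0 filter, so the recovered orientation at interior pixels is 0.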
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
Microwave Photonics Systems Based on Whispering-gallery-mode Resonators
Authors: Aurélien Coillet, Rémi Henriet, Kien Phan Huy, Maxime Jacquot, Luca Furfaro, Irina Balakireva, Laurent Larger, Yanne K. Chembo.
Institutions: FEMTO-ST Institute.
Microwave photonics systems rely fundamentally on the interaction between microwave and optical signals. These systems are extremely promising for various areas of technology and applied science, such as aerospace and communication engineering, sensing, metrology, nonlinear photonics, and quantum optics. In this article, we present the principal techniques used in our lab to build microwave photonics systems based on ultra-high Q whispering gallery mode resonators. First detailed in this article is the protocol for resonator polishing, which is based on a grind-and-polish technique close to the ones used to polish optical components such as lenses or telescope mirrors. Then, a white light interferometric profilometer measures surface roughness, which is a key parameter to characterize the quality of the polishing. In order to launch light into the resonator, a tapered silica fiber with a diameter in the micrometer range is used. To reach such small diameters, we adopt the "flame-brushing" technique, simultaneously using computer-controlled motors to pull the fiber apart and a blowtorch to heat the fiber area to be tapered. The resonator and the tapered fiber are then brought close together to visualize the resonance signal of the whispering gallery modes using a wavelength-scanning laser. By increasing the optical power in the resonator, nonlinear phenomena are triggered until the formation of a Kerr optical frequency comb is observed with a spectrum made of equidistant spectral lines. These Kerr comb spectra have exceptional characteristics that are suitable for several applications in science and technology. We consider the application related to ultra-stable microwave frequency synthesis and demonstrate the generation of a Kerr comb with GHz intermodal frequency.
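Once the resonance signal of a whispering-gallery mode is visible on the wavelength scan, its loaded quality factor can be estimated directly from the dip as Q ≈ λ₀/Δλ_FWHM. A minimal estimator, assuming a single well-resolved Lorentzian-like dip on a dense wavelength grid (both assumptions ours):

```python
import numpy as np

def loaded_q(wavelengths, transmission):
    """Loaded Q of a resonance dip: center wavelength divided by the
    full width at half depth.  Wavelength units cancel out."""
    t = np.asarray(transmission, dtype=float)
    i0 = t.argmin()                      # dip center
    half = (t.max() + t[i0]) / 2.0       # half-depth level
    below = np.where(t <= half)[0]       # contiguous for a single dip
    fwhm = wavelengths[below[-1]] - wavelengths[below[0]]
    return wavelengths[i0] / fwhm
```

On a synthetic Lorentzian dip of 0.01 nm width at 1,550 nm, this recovers Q ≈ 1.55 × 10⁵.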
Physics, Issue 78, Optics, Engineering, Electrical Engineering, Mechanical Engineering, Microwaves, nonlinear optics, optical fibers, microwave photonics, whispering-gallery-mode resonator, resonator
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
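The sequence-selection stage described above is, at heart, an optimization over sequence space. Protein WISDOM uses rigorous energy models and optimization methods; the following is only a toy Monte Carlo illustration with a hypothetical mismatch "energy" of our own invention, to show the shape of such a search:

```python
import random

AMINO = "ACDEFGHIKLMNPQRSTVWY"  # the 20 standard amino acids

def toy_energy(seq, target):
    """Hypothetical stand-in for a potential energy function:
    the number of positions differing from a 'stable' target sequence."""
    return sum(a != b for a, b in zip(seq, target))

def select_sequence(target, n_iter=2000, seed=1):
    """Greedy Monte Carlo search in sequence space: propose point
    mutations and keep any that do not raise the energy."""
    rng = random.Random(seed)
    seq = [rng.choice(AMINO) for _ in target]  # random starting sequence
    best = toy_energy(seq, target)
    for _ in range(n_iter):
        i = rng.randrange(len(seq))
        old = seq[i]
        seq[i] = rng.choice(AMINO)
        e = toy_energy(seq, target)
        if e <= best:
            best = e          # accept the mutation
        else:
            seq[i] = old      # revert it
    return "".join(seq), best
```

Real sequence selection replaces the toy energy with physics-based potentials and the greedy search with rigorous optimization, but the loop structure of propose/score/accept is the same.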
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Gradient Echo Quantum Memory in Warm Atomic Vapor
Authors: Olivier Pinel, Mahdi Hosseini, Ben M. Sparkes, Jesse L. Everett, Daniel Higginbottom, Geoff T. Campbell, Ping Koy Lam, Ben C. Buchler.
Institutions: The Australian National University.
Gradient echo memory (GEM) is a protocol for storing optical quantum states of light in atomic ensembles. The primary motivation for such a technology is that quantum key distribution (QKD), which uses Heisenberg uncertainty to guarantee security of cryptographic keys, is limited in transmission distance. The development of a quantum repeater is a possible path to extend QKD range, but a repeater will need a quantum memory. In our experiments we use a gas of rubidium 87 vapor that is contained in a warm gas cell. This makes the scheme particularly simple. It is also a highly versatile scheme that enables in-memory refinement of the stored state, such as frequency shifting and bandwidth manipulation. The basis of the GEM protocol is to absorb the light into an ensemble of atoms that has been prepared in a magnetic field gradient. The reversal of this gradient leads to rephasing of the atomic polarization and thus recall of the stored optical state. We will outline how we prepare the atoms and this gradient and also describe some of the pitfalls that need to be avoided, in particular four-wave mixing, which can give rise to optical gain.
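The dephase-then-rephase logic of GEM can be captured in a few lines: atoms at different positions see different detunings under the gradient, so the ensemble polarization dephases; flipping the gradient at time t_s time-reverses the accumulated phase, and the collective signal revives at 2·t_s. A toy numerical illustration (uniform detuning distribution, no decay or four-wave mixing; both simplifications ours):

```python
import numpy as np

def gem_signal(detunings, t_switch, times):
    """|mean atomic coherence| vs. time for a detuning gradient that is
    flipped at t_switch.  Before the flip each atom accumulates phase
    delta * t; after the flip the detuning sign reverses, so the net
    phase is delta * (2 * t_switch - t) and an echo forms at 2 * t_switch."""
    sig = []
    for t in times:
        phase = detunings * t if t <= t_switch else detunings * (2 * t_switch - t)
        sig.append(abs(np.mean(np.exp(1j * phase))))
    return np.array(sig)
```

The signal starts fully in phase, collapses as the gradient dephases the ensemble, and returns to unity at the echo time, which is the recall event of the memory.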
Physics, Issue 81, quantum memory, photon echo, rubidium vapor, gas cell, optical memory, gradient echo memory (GEM)
From Fast Fluorescence Imaging to Molecular Diffusion Law on Live Cell Membranes in a Commercial Microscope
Authors: Carmine Di Rienzo, Enrico Gratton, Fabio Beltram, Francesco Cardarelli.
Institutions: Scuola Normale Superiore, Instituto Italiano di Tecnologia, University of California, Irvine.
It has become increasingly evident that the spatial distribution and the motion of membrane components like lipids and proteins are key factors in the regulation of many cellular functions. However, due to the fast dynamics and the tiny structures involved, a very high spatio-temporal resolution is required to catch the real behavior of molecules. Here we present the experimental protocol for studying the dynamics of fluorescently-labeled plasma-membrane proteins and lipids in live cells with high spatiotemporal resolution. Notably, this approach does not require tracking each molecule; instead, it calculates population behavior using all molecules in a given region of the membrane. The starting point is fast imaging of a given region on the membrane. Afterwards, a complete spatio-temporal autocorrelation function is calculated by correlating acquired images at increasing time delays (for example, images 2, 3, ... n repetitions apart). It is possible to demonstrate that the width of the peak of the spatial autocorrelation function increases with increasing time delay as a result of particle movement due to diffusion. Therefore, fitting the series of autocorrelation functions enables extraction of the actual protein mean square displacement from imaging (iMSD), here presented in the form of apparent diffusivity vs. average displacement. This yields a quantitative view of the average dynamics of single molecules with nanometer accuracy. By using a GFP-tagged variant of the Transferrin Receptor (TfR) and an ATTO488-labeled 1-palmitoyl-2-hydroxy-sn-glycero-3-phosphoethanolamine (PPE), it is possible to observe the spatiotemporal regulation of protein and lipid diffusion on µm-sized membrane regions in the micro-to-millisecond time range.
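The core computation above, a spatio-temporal image correlation whose peak broadens with lag, can be sketched with FFTs. This is a bare-bones version: it omits the windowing, background handling, and Gaussian peak fitting that the actual iMSD analysis requires, and the second-moment width is only a crude proxy for the fitted peak variance:

```python
import numpy as np

def spatial_correlation(stack, lag):
    """Average spatial cross-correlation of frames `lag` apart,
    computed in the Fourier domain and shifted so the peak is centered."""
    acc = np.zeros(stack.shape[1:])
    n = stack.shape[0] - lag
    for t in range(n):
        f1 = np.fft.fft2(stack[t] - stack[t].mean())
        f2 = np.fft.fft2(stack[t + lag] - stack[t + lag].mean())
        acc += np.fft.fftshift(np.fft.ifft2(f1 * np.conj(f2)).real)
    return acc / n

def peak_width_sq(corr):
    """Second moment of the positive part of the correlation peak,
    a rough proxy for the squared peak width used in iMSD."""
    w = np.clip(corr, 0, None)
    w = w / w.sum()
    y, x = np.indices(corr.shape)
    cy, cx = corr.shape[0] // 2, corr.shape[1] // 2
    return float(np.sum(w * ((x - cx) ** 2 + (y - cy) ** 2)))
```

Plotting the fitted peak width squared against lag time is what yields the iMSD curve, and hence the apparent diffusivity, in the real analysis.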
Bioengineering, Issue 92, fluorescence, protein dynamics, lipid dynamics, membrane heterogeneity, transient confinement, single molecule, GFP
Designing and Implementing Nervous System Simulations on LEGO Robots
Authors: Daniel Blustein, Nikolai Rosenthal, Joseph Ayers.
Institutions: Northeastern University, Bremen University of Applied Sciences.
We present a method to use the commercially available LEGO Mindstorms NXT robotics platform to test systems level neuroscience hypotheses. The first step of the method is to develop a nervous system simulation of specific reflexive behaviors of an appropriate model organism; here we use the American Lobster. Exteroceptive reflexes mediated by decussating (crossing) neural connections can explain an animal's taxis towards or away from a stimulus as described by Braitenberg and are particularly well suited for investigation using the NXT platform.1 The nervous system simulation is programmed using LabVIEW software on the LEGO Mindstorms platform. Once the nervous system is tuned properly, behavioral experiments are run on the robot and on the animal under identical environmental conditions. By controlling the sensory milieu experienced by the specimens, differences in behavioral outputs can be observed. These differences may point to specific deficiencies in the nervous system model and serve to inform the iteration of the model for the particular behavior under study. This method allows for the experimental manipulation of electronic nervous systems and serves as a way to explore neuroscience hypotheses specifically regarding the neurophysiological basis of simple innate reflexive behaviors. The LEGO Mindstorms NXT kit provides an affordable and efficient platform on which to test preliminary biomimetic robot control schemes. The approach is also well suited for the high school classroom to serve as the foundation for a hands-on inquiry-based biorobotics curriculum.
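The decussating reflex described above is the classic Braitenberg "vehicle 2b": each sensor excites the contralateral motor, so the side nearer the stimulus drives the far wheel harder and the vehicle turns toward the source. A toy differential-drive simulation (all parameters, the saturating intensity law, and the sensor geometry are our assumptions, not values from the LEGO implementation):

```python
import math

def braitenberg_step(x, y, heading, source, dt=0.1, gain=1.0, base=0.2,
                     sensor_offset=0.1, track=0.1):
    """One time step of a differential-drive vehicle with crossed
    (decussating) excitatory sensor-to-motor wiring."""
    readings = []
    for ang in (math.radians(30), math.radians(-30)):   # left, right sensors
        sx = x + sensor_offset * math.cos(heading + ang)
        sy = y + sensor_offset * math.sin(heading + ang)
        d2 = (sx - source[0]) ** 2 + (sy - source[1]) ** 2
        readings.append(min(5.0, 1.0 / (1e-6 + d2)))    # saturating intensity
    left_sensor, right_sensor = readings
    # Decussation: each sensor drives the contralateral motor.
    left_motor = base + gain * right_sensor
    right_motor = base + gain * left_sensor
    v = (left_motor + right_motor) / 2.0                # forward speed
    omega = (right_motor - left_motor) / track          # turn rate
    heading += omega * dt
    return (x + v * math.cos(heading) * dt,
            y + v * math.sin(heading) * dt,
            heading)
```

Iterating this step from an offset start makes the vehicle curve toward the source, which is the positive taxis the crossed wiring predicts; swapping the motor assignments would produce avoidance instead.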
Neuroscience, Issue 75, Neurobiology, Bioengineering, Behavior, Mechanical Engineering, Computer Science, Marine Biology, Biomimetics, Marine Science, Neurosciences, Synthetic Biology, Robotics, robots, Modeling, models, Sensory Fusion, nervous system, Educational Tools, programming, software, lobster, Homarus americanus, animal model
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there is no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms display the most relevant videos available, which can sometimes result in matched videos with only a slight relation.