DNA transfection has been invaluable to the biological sciences, and recent advances in organotypic brain slice preparation make it possible to investigate the effects of heterologous genes easily while preserving many aspects of in vivo biology. There has been increasing interest in transfecting terminally differentiated neurons, for which conventional transfection methods are fraught with difficulties such as low yield and significant loss of viability. Biolistic transfection can circumvent many of these difficulties, yet only recently has the technique been modified for use in mammalian tissues.
New modifications to the accelerator chamber have improved the gene gun's firing accuracy and penetration depth, allow the use of lower gas pressures (50 psi) without loss of transfection efficiency, and confine the spread of particles to a focused region within 3 mm. In addition, the technique is straightforward and faster to perform than tedious microinjection. Both transient and stable expression are possible with nanoparticle bombardment; episomal expression can be detected within 24 hr, and cell survival has been shown to be equal or superior to that obtained with conventional methods. The technique has one crucial advantage, however: transfection is confined to a single restricted radius, enabling the user to anatomically isolate the effects of the heterologous gene. Here we present an in-depth protocol for preparing viable adult organotypic slices and transfecting them regioselectively with an improved gene gun.
Customization of Aspergillus niger Morphology Through Addition of Talc Micro Particles
Institutions: Technische Universität Braunschweig.
The filamentous fungus A. niger
is widely used in a broad range of industrial processes, from the food to the pharmaceutical industry. One of the most intriguing and often uncontrollable characteristics of this filamentous organism is its complex morphology, which ranges from dense spherical pellets to viscous mycelia (Figure 1
). Various process parameters and ingredients are known to influence fungal morphology 1
. Since optimal productivity correlates strongly with a specific morphological form, the fungal morphology often represents the bottleneck of productivity in industrial production.
A straightforward and elegant approach to precise control of morphological shape is the addition of insoluble inorganic micro particles (such as hydrous magnesium silicate, aluminum oxide or titanium silicate oxide) to the culture medium, which contributes to increased enzyme production 2-6
. Since there is an obvious correlation between micro particle dependent morphology and enzyme production, it is desirable to link productivity and morphological appearance mathematically. A quantitatively precise and holistic morphological description is therefore targeted.
Thus, we present a method to generate and characterize micro particle dependent morphological structures and to correlate fungal morphology with productivity (Figure 1
) which possibly contributes to a better understanding of the morphogenesis of filamentous microorganisms.
The recombinant strain A. niger
SKAn1015 is cultivated for 72 h in a 3 L stirred tank bioreactor. By adding talc micro particles at concentrations of 1 g/L, 3 g/L and 10 g/L prior to inoculation, a variety of morphological structures is reproducibly generated. Sterile samples are taken after 24, 48 and 72 hours to determine growth progress and the activity of the produced enzyme. The product is the high-value enzyme β-fructofuranosidase, an important biocatalyst for neo-sugar formation in the food and pharmaceutical industries, which catalyzes, among other reactions, the conversion of sucrose to glucose 7-9
. Therefore, quantifying the glucose released after adding sucrose indicates the amount of β-fructofuranosidase produced. Glucose quantification is performed with a GOD/POD assay 10
, which is modified for high-throughput analysis in 96-well microtiter plates.
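The GOD/POD readout reduces to a linear standard curve of absorbance against glucose concentration; the standard values, absorbances and dilution factor below are illustrative, not measured data. A minimal sketch in Python:

```python
import numpy as np

# Hypothetical GOD/POD standard curve: absorbance at the assay wavelength
# versus glucose concentration (g/L). Values are illustrative only.
std_glucose = np.array([0.0, 0.5, 1.0, 2.0, 4.0])       # g/L
std_abs     = np.array([0.02, 0.27, 0.52, 1.02, 2.02])  # absorbance

# Fit a linear standard curve: A = m*c + b
m, b = np.polyfit(std_glucose, std_abs, 1)

def glucose_conc(absorbance, dilution=1.0):
    """Convert a well absorbance to glucose concentration (g/L)."""
    return (absorbance - b) / m * dilution

# Glucose released from sucrose reflects beta-fructofuranosidase activity.
sample_abs = 0.77  # absorbance of a 1:10 diluted sample (illustrative)
sample_glucose = glucose_conc(sample_abs, dilution=10)  # g/L, undiluted
```

In a 96-well workflow the same two lines of arithmetic are simply vectorized over the whole plate of absorbances.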
Fungal morphology after 72 hours is examined by microscopy and characterized by digital image analysis. Particle shape factors for fungal macro morphology, such as Feret's diameter, projected area, perimeter, circularity, aspect ratio, roundness and solidity, are calculated with the open-source image processing program ImageJ. Relevant parameters are combined into a dimensionless Morphology number (Mn) 11
, which enables a comprehensive characterization of fungal morphology. The close correlation between the Morphology number and productivity is highlighted by mathematical regression.
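The listed descriptors have standard ImageJ definitions (e.g. circularity = 4πA/P², roundness = 4A/(π·major²), solidity = A/A_convex). The exact Mn definition is given in ref. 11; the combination sketched below is only an illustrative stand-in that, like Mn, equals 1 for a solid circular pellet and decreases for elongated, ragged mycelial structures:

```python
import math

def shape_factors(area, perimeter, convex_area, major, minor):
    """ImageJ-style shape descriptors for one projected pellet/mycelium."""
    return {
        "circularity": 4 * math.pi * area / perimeter**2,  # 1.0 = circle
        "aspect_ratio": major / minor,
        "roundness": 4 * area / (math.pi * major**2),
        "solidity": area / convex_area,
    }

def morphology_number(sf):
    """Illustrative dimensionless combination of shape factors (the exact
    Mn definition is in ref. 11): 1.0 for a solid circle, tending to 0
    for elongated, ragged structures."""
    compact = (sf["circularity"] * sf["solidity"] * sf["roundness"]) ** (1 / 3)
    return compact / sf["aspect_ratio"]

# A perfect circular pellet of radius r gives Mn = 1 under this definition.
r = 50.0  # pixels
sf = shape_factors(area=math.pi * r**2, perimeter=2 * math.pi * r,
                   convex_area=math.pi * r**2, major=2 * r, minor=2 * r)
```

In practice, the descriptors come straight from ImageJ's particle analysis output, and only the final combination step is computed downstream.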
Immunology, Issue 61, morphology engineering, Morphology number (Mn), filamentous fungi, fructofuranosidase, micro particles, image analysis
Polycrystalline Silicon Thin-film Solar cells with Plasmonic-enhanced Light-trapping
Institutions: University of New South Wales .
One of the major approaches to cheaper solar cells is to reduce the amount of semiconductor material used in their fabrication by making the cells thinner. To compensate for lower light absorption, such physically thin devices have to incorporate light-trapping, which increases their optical thickness. Light scattering by textured surfaces is a common technique, but it cannot be applied universally to all solar cell technologies. Some cells, for example those made of evaporated silicon, are planar as produced and require an alternative light-trapping approach suitable for planar devices. Metal nanoparticles formed on the planar silicon cell surface, which scatter light due to surface plasmon resonance, are an effective option.
The paper presents a fabrication procedure of evaporated polycrystalline silicon solar cells with plasmonic light-trapping and demonstrates how the cell quantum efficiency improves due to presence of metal nanoparticles.
To fabricate the cells, a film consisting of alternating boron- and phosphorus-doped silicon layers is deposited on a glass substrate by electron beam evaporation. The initially amorphous film is crystallised, and electronic defects are mitigated by annealing and hydrogen passivation. Metal grid contacts are applied to the layers of opposite polarity to extract the electricity generated by the cell. Typically, such a ~2 μm thick cell has a short-circuit current density (Jsc
) of 14-16 mA/cm2
, which can be increased up to 17-18 mA/cm2
(~25% higher) after application of a simple diffuse back reflector made of white paint.
To implement plasmonic light-trapping, a silver nanoparticle array is formed on the metallised silicon surface of the cell. A precursor silver film is deposited on the cell by thermal evaporation and annealed at 230 °C to form silver nanoparticles. Nanoparticle size and coverage, which affect plasmonic light-scattering, can be tuned for enhanced cell performance by varying the precursor film thickness and its annealing conditions. An optimised nanoparticle array alone results in a cell Jsc
enhancement of about 28%, similar to the effect of the diffuse reflector. The photocurrent can be further increased by coating the nanoparticles by a low refractive index dielectric, like MgF2
, and applying the diffused reflector. The complete plasmonic cell structure comprises the polycrystalline silicon film, a silver nanoparticle array, a layer of MgF2
, and a diffuse reflector. The Jsc
for such cell is 21-23 mA/cm2
, up to 45% higher than Jsc
of the original cell without light-trapping or ~25% higher than Jsc
for the cell with the diffuse reflector only.
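The quoted Jsc values follow from the standard relation Jsc = q ∫ EQE(λ)·Φ(λ) dλ, where Φ is the incident photon flux. The sketch below uses smooth illustrative stand-ins for the EQE curve and the solar spectrum; a real calculation should use the measured EQE and the tabulated ASTM G-173 AM1.5G data:

```python
import numpy as np

wl = np.linspace(300.0, 1100.0, 801)                 # wavelength, nm (1 nm step)
eqe = 0.6 * np.exp(-((wl - 650.0) / 250.0) ** 2)     # illustrative EQE curve
irr = 1.2 * np.exp(-((wl - 600.0) / 300.0) ** 2)     # illustrative irradiance, W m^-2 nm^-1

q = 1.602e-19                                        # elementary charge, C
h, c = 6.626e-34, 2.998e8                            # Planck constant, speed of light (SI)
photon_flux = irr * (wl * 1e-9) / (h * c)            # photons m^-2 s^-1 nm^-1

# Jsc = q * integral(EQE * flux d(lambda)); trapezoidal rule over wavelength.
integrand = eqe * photon_flux
jsc = q * np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(wl)) * 0.1
# factor 0.1 converts A m^-2 to mA cm^-2
```

Scaling the EQE curve by the measured plasmonic enhancement then reproduces the relative Jsc gains quoted above.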
Light-trapping in silicon solar cells is commonly achieved via light scattering at textured interfaces. Scattered light travels through a cell at oblique angles for a longer distance, and when such angles exceed the critical angle at the cell interfaces, the light is permanently trapped in the cell by total internal reflection (Animation 1: Light-trapping)
. Although this scheme works well for most solar cells, there are developing technologies where ultra-thin Si layers are produced planar (e.g. layer-transfer technologies and epitaxial c-Si layers) 1
or when such layers are not compatible with textured substrates (e.g. evaporated silicon) 2
For such originally planar Si layers, alternative light-trapping approaches, such as a diffuse white paint reflector 3
, silicon plasma texturing 4
or high refractive index nanoparticle reflector 5
have been suggested.
Metal nanoparticles can effectively scatter incident light into a higher refractive index material, like silicon, due to the surface plasmon resonance effect 6
. They can also be easily formed on the planar silicon cell surface, thus offering a light-trapping approach alternative to texturing. For a nanoparticle located at the air-silicon interface, the fraction of scattered light coupled into silicon exceeds 95%, and a large fraction of that light is scattered at angles above the critical angle, providing a nearly ideal light-trapping condition (Animation 2: Plasmons on NP)
. The resonance can be tuned to the wavelength region, which is most important for a particular cell material and design, by varying the nanoparticle average size, surface coverage and local dielectric environment 6,7
. Theoretical design principles of plasmonic nanoparticle solar cells have been suggested 8
. In practice, an Ag nanoparticle array is an ideal light-trapping partner for poly-Si thin-film solar cells because most of these design principles are naturally met. The simplest way of forming nanoparticles, thermal annealing of a thin precursor Ag film, results in a random array with a relatively wide size and shape distribution. This is particularly suitable for light-trapping because such an array has a wide resonance peak covering the wavelength range of 700-900 nm, which is important for poly-Si solar cell performance. The nanoparticle array can only be located on the rear poly-Si cell surface, thus avoiding the destructive interference between incident and scattered light which occurs for front-located nanoparticles 9
. Moreover, poly-Si thin-film cells do not require a passivating layer, so the flat, base-shaped nanoparticles that naturally result from thermal annealing of a metal film can be placed directly on silicon, which further increases plasmonic scattering efficiency due to surface plasmon-polariton resonance 10
.
The cell with the plasmonic nanoparticle array as described above can have a photocurrent about 28% higher than the original cell. However, the array still transmits a significant amount of light, which escapes through the rear of the cell and does not contribute to the current. This loss can be mitigated by adding a rear reflector to catch the transmitted light and redirect it back into the cell. Provided there is sufficient distance between the reflector and the nanoparticles (a few hundred nanometers), the reflected light experiences one more plasmonic scattering event while passing through the nanoparticle array on re-entering the cell, and the reflector itself can be made diffuse; both effects further facilitate light scattering and hence light-trapping. Importantly, the Ag nanoparticles have to be encapsulated in an inert, low refractive index dielectric, like MgF2
, to isolate them from the rear reflector and avoid mechanical and chemical damage 7
. Low refractive index for this cladding layer is required to maintain a high coupling fraction into silicon and larger scattering angles, which are ensured by the high optical contrast between the media on both sides of the nanoparticle, silicon and dielectric 6
. The photocurrent of the plasmonic cell with the diffuse rear reflector can be up to 45% higher than the current of the original cell or up to 25% higher than the current of an equivalent cell with the diffuse reflector only.
Physics, Issue 65, Materials Science, Photovoltaics, Silicon thin-film solar cells, light-trapping, metal nanoparticles, surface plasmons
Micro-drive Array for Chronic in vivo Recording: Tetrode Assembly
Institutions: MIT - Massachusetts Institute of Technology, MIT - Massachusetts Institute of Technology.
The tetrode, a bundle of four electrodes, has proven to be a valuable tool for the simultaneous recording of multiple neurons in vivo. The differential amplitude of action potential signatures across the channels of a tetrode allows for the isolation of single-unit activity from multi-unit signals. The ability to precisely control the stereotaxic location and depth of the tetrode is critical for studying coordinated neural activity across brain regions. In combination with a micro-drive array, it is possible to achieve precise placement and stable control of many tetrodes over the course of days to weeks. In this protocol, we demonstrate how to fabricate and condition tetrodes using basic tools and materials, install the tetrodes into a multi-drive tetrode array for chronic in vivo recording in the rat, make ground wire connections to the micro-drive array, and attach a protective cone onto the micro-drive array in order to protect the tetrodes from physical contact with the environment.
Neuroscience, Issue 26, fabrication, micro-drive array, tetrode, electrophysiology, multiple neuronal recordings, in vivo recording, systems neuroscience, hippocampus, coordinated neural activity, cortex, rat brain
Knowing What Counts: Unbiased Stereology in the Non-human Primate Brain
Institutions: University of Montreal, University of Montreal, Stereology Resource Center.
The non-human primate is an important translational species for understanding the normal function and disease processes of the human brain. Unbiased stereology, the method accepted as state-of-the-art for quantification of biological objects in tissue sections2
, generates reliable structural data for biological features in the mammalian brain3
. The key components of the approach are unbiased (systematic-random) sampling of anatomically defined structures (reference spaces), combined with quantification of cell numbers and size, fiber and capillary lengths, surface areas, regional volumes and spatial distributions of biological objects within the reference space4
. Among the advantages of these stereological approaches over previous methods is the avoidance of all known sources of systematic (non-random) error arising from faulty assumptions and non-verifiable models. This study documents a biological application of computerized stereology to estimate the total neuronal population in the frontal cortex of the vervet monkey brain (Chlorocebus aethiops sabeus
), with assistance from two commercially available stereology programs, BioQuant Life Sciences and Stereologer (Figure 1). In addition to contrasting and comparing results from the BioQuant and Stereologer
systems, this study provides a detailed protocol for the Stereologer system.
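The systematic-random sampling described above leads to the optical fractionator estimator, in which the raw disector count is scaled by the reciprocals of the sampling fractions: N = ΣQ⁻ · (1/ssf) · (1/asf) · (1/tsf). A sketch with hypothetical sampling parameters (not values from this study):

```python
def optical_fractionator(sum_q, ssf, asf, tsf):
    """Unbiased estimate of total cell number in a reference space:
    N = sum(Q-) * (1/ssf) * (1/asf) * (1/tsf), where
    ssf = section sampling fraction, asf = area sampling fraction,
    tsf = thickness sampling fraction."""
    return sum_q / (ssf * asf * tsf)

# Illustrative example: 450 neurons counted in disectors, every 10th
# section sampled, frames covering 1% of each section, disector height
# covering half of the mounted section thickness.
n_est = optical_fractionator(sum_q=450, ssf=1 / 10, asf=0.01, tsf=0.5)
```

Because each fraction is known by design rather than assumed, the estimate carries no systematic error from tissue shrinkage models or object shape assumptions.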
Neuroscience, Issue 27, Stereology, brain bank, systematic sampling, non-human primate, cryostat, antigen preserve
Quantification of Global Diastolic Function by Kinematic Modeling-based Analysis of Transmitral Flow via the Parametrized Diastolic Filling Formalism
Institutions: Washington University in St. Louis, Washington University in St. Louis, Washington University in St. Louis, Washington University in St. Louis, Washington University in St. Louis.
Quantitative cardiac function assessment remains a challenge for physiologists and clinicians. Although invasive methods historically comprised the only means available, the development of noninvasive imaging modalities (echocardiography, MRI, CT) with high temporal and spatial resolution provides a new window for quantitative diastolic function assessment. Echocardiography is the agreed-upon standard for diastolic function assessment, but the indexes in current clinical use merely utilize selected features of chamber dimension (M-mode) or blood/tissue motion (Doppler) waveforms without incorporating the physiologic causal determinants of the motion itself. The recognition that all left ventricles (LV) initiate filling by serving as mechanical suction pumps allows global diastolic function to be assessed based on laws of motion that apply to all chambers. What differentiates one heart from another are the parameters of the equation of motion that governs filling. Accordingly, development of the Parametrized Diastolic Filling (PDF) formalism has shown that the entire range of clinically observed early transmitral flow (Doppler E-wave) patterns is extremely well fit by the laws of damped oscillatory motion. This permits analysis of individual E-waves in accordance with a causal mechanism (recoil-initiated suction) that yields three (numerically) unique lumped parameters whose physiologic analogues are chamber stiffness (k
), viscoelasticity/relaxation (c
), and load (xo
). The recording of transmitral flow (Doppler E-waves) is standard practice in clinical cardiology and, therefore, the echocardiographic recording method is only briefly reviewed. Our focus is on determination of the PDF parameters from routinely recorded E-wave data. As the highlighted results indicate, once the PDF parameters have been obtained from a suitable number of load varying E-waves, the investigator is free to use the parameters or construct indexes from the parameters (such as stored energy 1/2kxo2
, maximum A-V pressure gradient kxo
, load independent index of diastolic function, etc
.) and select the aspect of physiology or pathophysiology to be quantified.
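The equation of motion underlying the PDF formalism is that of a damped harmonic oscillator, x'' + c·x' + k·x = 0 (m = 1 by convention) with initial displacement xo and zero initial velocity; the E-wave contour is the resulting velocity. A minimal sketch of the underdamped (c² < 4k) closed-form solution, with illustrative parameter values rather than patient data:

```python
import numpy as np

def e_wave_velocity(t, k, c, xo):
    """Velocity of the PDF oscillator x'' + c x' + k x = 0 (m = 1),
    released from x(0) = xo with v(0) = 0, in the underdamped regime:
    v(t) = -(k*xo/w) * exp(-c t / 2) * sin(w t), w = sqrt(k - c^2/4)."""
    w = np.sqrt(k - c**2 / 4.0)
    return -(k * xo / w) * np.exp(-c * t / 2.0) * np.sin(w * t)

# Illustrative PDF parameters (stiffness, damping, load), not patient data.
k, c, xo = 200.0, 14.0, 10.0
t = np.linspace(0.0, 0.25, 500)          # s
v = -e_wave_velocity(t, k, c, xo)        # sign flipped so the E-wave is positive
peak = v.max()                           # peak E-wave velocity
```

Fitting k, c and xo to a recorded E-wave then amounts to least-squares matching of this contour to the Doppler envelope, after which derived indexes such as ½·k·xo² follow directly.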
Bioengineering, Issue 91, cardiovascular physiology, ventricular mechanics, diastolic function, mathematical modeling, Doppler echocardiography, hemodynamics, biomechanics
A Toolkit to Enable Hydrocarbon Conversion in Aqueous Environments
Institutions: Delft University of Technology, Delft University of Technology.
This work puts forward a toolkit that enables the conversion of alkanes by Escherichia coli
and presents a proof of principle of its applicability. The toolkit consists of multiple standard interchangeable parts (BioBricks)9
addressing the conversion of alkanes, regulation of gene expression and survival in toxic hydrocarbon-rich environments.
A three-step pathway for alkane degradation was implemented in E. coli
to enable the conversion of medium- and long-chain alkanes to their respective alkanols, alkanals and ultimately alkanoic-acids. The latter were metabolized via the native β-oxidation pathway. To facilitate the oxidation of medium-chain alkanes (C5-C13) and cycloalkanes (C5-C8), four genes (alkB2
) of the alkane hydroxylase system from Gordonia
were transformed into E. coli
. For the conversion of long-chain alkanes (C15-C36), the ladA
gene from Geobacillus thermodenitrificans
was implemented. For the required further steps of the degradation process, ADH
and ALDH (
originating from G. thermodenitrificans
) were introduced10,11
. The activity was measured by resting cell assays. For each oxidative step, enzyme activity was observed.
To optimize the process efficiency, the expression was only induced under low glucose conditions: a substrate-regulated promoter, pCaiF, was used. pCaiF is present in E. coli
K12 and regulates the expression of the genes involved in the degradation of non-glucose carbon sources.
The last part of the toolkit - targeting survival - was implemented using solvent tolerance genes, PhPFDα and β, both from Pyrococcus horikoshii
OT3. Organic solvents can induce cell stress and decreased survivability by negatively affecting protein folding. As chaperones, PhPFDα and β improve the protein folding process e.g.
in the presence of alkanes. The expression of these genes led to improved hydrocarbon tolerance, as shown by an increased growth rate (up to 50%) in the presence of 10% n-hexane in the culture medium.
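The reported improvement can be quantified as the specific growth rate μ, the slope of ln(OD600) versus time during exponential growth. The OD values below are synthetic, constructed only to illustrate the calculation of a 50% increase:

```python
import numpy as np

# Synthetic OD600 readings during exponential growth (illustrative only).
t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])    # h
od_plain = 0.05 * np.exp(0.40 * t)          # without the PhPFD chaperones
od_phpfd = 0.05 * np.exp(0.60 * t)          # with PhPFD alpha/beta expressed

def specific_growth_rate(t, od):
    """Slope of ln(OD600) vs. time during exponential phase (h^-1)."""
    mu, _intercept = np.polyfit(t, np.log(od), 1)
    return mu

mu0 = specific_growth_rate(t, od_plain)
mu1 = specific_growth_rate(t, od_phpfd)
improvement = (mu1 - mu0) / mu0 * 100.0     # percent increase in growth rate
```

With real data, only readings from the log-linear portion of the growth curve should enter the fit.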
Summarizing, the results indicate that the toolkit enables E. coli
to convert and tolerate hydrocarbons in aqueous environments. As such, it represents an initial step towards a sustainable solution for oil-remediation using a synthetic biology approach.
Bioengineering, Issue 68, Microbiology, Biochemistry, Chemistry, Chemical Engineering, Oil remediation, alkane metabolism, alkane hydroxylase system, resting cell assay, prefoldin, Escherichia coli, synthetic biology, homologous interaction mapping, mathematical model, BioBrick, iGEM
Creating Dynamic Images of Short-lived Dopamine Fluctuations with lp-ntPET: Dopamine Movies of Cigarette Smoking
Institutions: Yale University, Yale University, Yale University, Yale University, Massachusetts General Hospital, University of California, Irvine.
We describe experimental and statistical steps for creating dopamine movies of the brain from dynamic PET data. The movies represent minute-to-minute fluctuations of dopamine induced by smoking a cigarette. The smoker is imaged during a natural smoking experience while other possible confounding effects (such as head motion, expectation, novelty, or aversion to smoking repeatedly) are minimized.
We present the details of our unique analysis. Conventional methods for PET analysis estimate time-invariant kinetic model parameters which cannot capture short-term fluctuations in neurotransmitter release. Our analysis - yielding a dopamine movie - is based on our work with kinetic models and other decomposition techniques that allow for time-varying parameters 1-7
. This aspect of the analysis - temporal-variation - is key to our work. Because our model is also linear in parameters, it is practical, computationally, to apply at the voxel level. The analysis technique is comprised of five main steps: pre-processing, modeling, statistical comparison, masking and visualization. Preprocessing is applied to the PET data with a unique 'HYPR' spatial filter 8
that reduces spatial noise but preserves critical temporal information. Modeling identifies the time-varying function that best describes the dopamine effect on 11C-raclopride uptake. The statistical step compares the fit of our (lp-ntPET) model 7
to a conventional model 9
. Masking restricts treatment to those voxels best described by the new model. Visualization maps the dopamine function at each voxel to a color scale and produces a dopamine movie. Interim results and sample dopamine movies of cigarette smoking are presented.
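The masking step keeps only those voxels where the richer time-varying model is statistically justified; a standard way to decide this for nested least-squares models is an F-test on the residual sums of squares. A sketch (the frame count and parameter numbers are illustrative, not the study's exact values):

```python
def f_statistic(rss_conventional, rss_lpntpet, p_conventional, p_lpntpet, n_frames):
    """F statistic for nested model comparison at a single voxel:
    F = [(RSS0 - RSS1) / (p1 - p0)] / [RSS1 / (n - p1)],
    where model 1 (more parameters) must nest model 0. The voxel is kept
    in the mask if F exceeds the chosen critical value."""
    num = (rss_conventional - rss_lpntpet) / (p_lpntpet - p_conventional)
    den = rss_lpntpet / (n_frames - p_lpntpet)
    return num / den

# Illustrative numbers: 60 PET frames, a 4-parameter conventional model,
# a 7-parameter time-varying (lp-ntPET-style) model.
F = f_statistic(rss_conventional=120.0, rss_lpntpet=80.0,
                p_conventional=4, p_lpntpet=7, n_frames=60)
```

The resulting F map, thresholded at the critical value for (3, 53) degrees of freedom in this example, defines the mask applied before visualization.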
Behavior, Issue 78, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Medicine, Anatomy, Physiology, Image Processing, Computer-Assisted, Receptors, Dopamine, Dopamine, Functional Neuroimaging, Binding, Competitive, mathematical modeling (systems analysis), Neurotransmission, transient, dopamine release, PET, modeling, linear, time-invariant, smoking, F-test, ventral-striatum, clinical techniques
Sputter Growth and Characterization of Metamagnetic B2-ordered FeRh Epilayers
Institutions: University of Leeds, University of Leeds, University of Edinburgh, Northeastern University, Northeastern University.
Chemically ordered alloys are useful in a variety of magnetic nanotechnologies. They are most conveniently prepared at an industrial scale using sputtering techniques. Here we describe a method for preparing epitaxial thin films of B2-ordered FeRh by sputter deposition onto single crystal MgO substrates. Deposition at a slow rate onto a heated substrate allows time for the adatoms to both settle into a lattice with a well-defined epitaxial relationship with the substrate and also to find their proper places in the Fe and Rh sublattices of the B2 structure. The structure is conveniently characterized with X-ray reflectometry and diffraction and can be visualised directly using transmission electron micrograph cross-sections. B2-ordered FeRh exhibits an unusual metamagnetic phase transition: the ground state is antiferromagnetic but the alloy transforms into a ferromagnet on heating with a typical transition temperature of about 380 K. This is accompanied by a 1% volume expansion of the unit cell: isotropic in bulk, but laterally clamped in an epilayer. The presence of the antiferromagnetic ground state and the associated first order phase transition is very sensitive to the correct equiatomic stoichiometry and proper B2 ordering, and so is a convenient means to demonstrate the quality of the layers that can be deposited with this approach. We also give some examples of the various techniques by which the change in phase can be detected.
Physics, Issue 80, Sputtering, epitaxial growth, magnetism, ordered alloys
High Resolution Electron Microscopy of the Helicobacter pylori Cag Type IV Secretion System Pili Produced in Varying Conditions of Iron Availability
Institutions: Vanderbilt University School of Medicine, U. S. Dept. of Veterans Affairs.
H. pylori
is a helical-shaped, Gram-negative bacterium that colonizes the gastric niche of half of the human population1,2
. H. pylori
is the primary cause of gastric cancer, the second leading cause of cancer-related deaths worldwide3
. One virulence factor that has been associated with increased risk of gastric disease is the Cag-pathogenicity island, a 40-kb region within the chromosome of H. pylori
that encodes a type IV secretion system and the cognate effector molecule, CagA4,5
. The Cag-T4SS is responsible for translocating CagA and peptidoglycan into host epithelial cells5,6
. The activity of the Cag-T4SS results in numerous changes in host cell biology including upregulation of cytokine expression, activation of proinflammatory pathways, cytoskeletal remodeling, and induction of oncogenic cell-signaling networks5-8
. The Cag-T4SS is a macromolecular machine comprised of sub-assembly components spanning the inner and outer membrane and extending outward from the cell into the extracellular space. The extracellular portion of the Cag-T4SS is referred to as the “pilus”5
. Numerous studies have demonstrated that the Cag-T4SS pili are formed at the host-pathogen interface9,10
. However, the environmental features that regulate the biogenesis of this important organelle remain largely obscure. Recently, we reported that conditions of low iron availability increased the Cag-T4SS activity and pilus biogenesis. Here we present an optimized protocol to grow H. pylori
in varying conditions of iron availability prior to co-culture with human gastric epithelial cells. Further, we present the comprehensive protocol for visualization of the hyper-piliated phenotype exhibited in iron restricted conditions by high resolution scanning electron microscopy analyses.
Infection, Issue 93, Helicobacter pylori, iron acquisition, cag pathogenicity island, type IV secretion, pili
Fabricating Metamaterials Using the Fiber Drawing Method
Institutions: University of Sydney .
Metamaterials are man-made composite materials, fabricated by assembling components much smaller than the wavelength at which they operate 1
. They owe their electromagnetic properties to the structure of their constituents, instead of the atoms that compose them. For example, sub-wavelength metal wires can be arranged to possess an effective electric permittivity that is either positive or negative at a given frequency, in contrast to the metals themselves 2
. This unprecedented control over the behaviour of light can potentially lead to a number of novel devices, such as invisibility cloaks 3
, negative refractive index materials 4
, and lenses that resolve objects below the diffraction limit 5
. However, metamaterials operating at optical, mid-infrared and terahertz frequencies are conventionally made using nano- and micro-fabrication techniques that are expensive and produce samples that are at most a few centimetres in size 6-7
. Here we present a fabrication method to produce hundreds of meters of metal wire metamaterials in fiber form, which exhibit a terahertz plasmonic response 8
. We combine the stack-and-draw technique used to produce microstructured polymer optical fiber 9
with the Taylor-wire process 10
, using indium wires inside polymethylmethacrylate (PMMA) tubes. PMMA is chosen because it is an easy to handle, drawable dielectric with suitable optical properties in the terahertz region; indium because it has a melting temperature of 156.6 °C which is appropriate for codrawing with PMMA. We include an indium wire of 1 mm diameter and 99.99% purity in a PMMA tube with 1 mm inner diameter (ID) and 12 mm outside diameter (OD) which is sealed at one end. The tube is evacuated and drawn down to an outer diameter of 1.2 mm. The resulting fiber is then cut into smaller pieces, and stacked into a larger PMMA tube. This stack is sealed at one end and fed into a furnace while being rapidly drawn, reducing the diameter of the structure by a factor of 10, and increasing the length by a factor of 100. Such fibers possess features on the micro- and nano- scale, are inherently flexible, mass-producible, and can be woven to exhibit electromagnetic properties that are not found in nature. They represent a promising platform for a number of novel devices from terahertz to optical frequencies, such as invisible fibers, woven negative refractive index cloths, and super-resolving lenses.
Physics, Issue 68, Optics, Photonics, Materials Science, Fiber drawing, metamaterials, polymer optical fiber, microstructured fibers
Nanofabrication of Gate-defined GaAs/AlGaAs Lateral Quantum Dots
Institutions: Université de Sherbrooke.
A quantum computer is a computer composed of quantum bits (qubits) that takes advantage of quantum effects, such as superposition of states and entanglement, to solve certain problems exponentially faster than with the best known algorithms on a classical computer. Gate-defined lateral quantum dots on GaAs/AlGaAs are one of many avenues explored for the implementation of a qubit. When properly fabricated, such a device is able to trap a small number of electrons in a certain region of space. The spin states of these electrons can then be used to implement the logical 0 and 1 of the quantum bit. Given the nanometer scale of these quantum dots, cleanroom facilities offering specialized equipment, such as scanning electron microscopes and e-beam evaporators, are required for their fabrication. Great care must be taken throughout the fabrication process to maintain cleanliness of the sample surface and to avoid damaging the fragile gates of the structure. This paper presents the detailed fabrication protocol of gate-defined lateral quantum dots from the wafer to a working device. Characterization methods and representative results are also briefly discussed. Although this paper concentrates on double quantum dots, the fabrication process remains the same for single or triple dots or even arrays of quantum dots. Moreover, the protocol can be adapted to fabricate lateral quantum dots on other substrates, such as Si/SiGe.
Physics, Issue 81, Nanostructures, Quantum Dots, Nanotechnology, Electronics, microelectronics, solid state physics, Nanofabrication, Nanoelectronics, Spin qubit, Lateral quantum dot
Bimolecular Fluorescence Complementation (BiFC) Assay for Protein-Protein Interaction in Onion Cells Using the Helios Gene Gun
Institutions: University of Maryland.
Investigation of gene function in diverse organisms relies on knowledge of how the gene products interact with each other in their normal cellular environment. The Bimolecular Fluorescence Complementation (BiFC) Assay1
allows researchers to visualize protein-protein interactions in living cells and has become an essential research tool. This assay is based on the facilitated association of two fragments of a fluorescent protein (GFP) that are each fused to a potential interacting protein partner. The interaction of the two protein partners facilitates the association of the N-terminal and C-terminal fragments of GFP, leading to fluorescence. For plant researchers, onion epidermal cells are an ideal experimental system for conducting the BiFC assay because of the ease of obtaining and preparing onion tissues and the direct visualization of fluorescence with minimal background fluorescence. The Helios Gene Gun (BioRad) is commonly used for bombarding plasmid DNA into onion cells. We demonstrate the use of the Helios Gene Gun to introduce plasmid constructs for two interacting Arabidopsis thaliana
transcription factors, SEUSS (SEU) and LEUNIG HOMOLOG (LUH)2
and the visualization of their interactions mediated by BiFC in onion epidermal cells.
Plant Biology, Issue 40, Bimolecular Fluorescence Complementation (BiFC), particle bombardment, protein-protein interaction, onion cells, Helios Gene Gun
Preparation of Gene Gun Bullets and Biolistic Transfection of Neurons in Slice Culture
Institutions: University of California, Davis.
Biolistic transfection is a physical means of transfecting cells by bombarding tissue with high-velocity DNA-coated particles. We provide a detailed protocol for biolistic transfection of rat hippocampal slices, from the initial preparation of DNA-coated bullets to the final shooting of the organotypic slice cultures using a gene gun. Gene gun transfection is an efficient and easy means of transfecting neurons and is especially useful for fluorescently labeling a small subset of cells in a tissue slice. In this video, we first outline the steps required to coat gold particles with DNA. We next demonstrate how to line the inside of plastic tubing with the gold/DNA bullets, and how to cut this tubing to obtain the plastic cartridges for loading into the gene gun. Finally, we perform biolistic transfection of rat hippocampal slice cultures, demonstrating handling of the Bio-Rad Helios gene gun, and offering troubleshooting advice to obtain healthy and optimally transfected tissue slices.
neuroscience, issue 12, biolistics, cotransfection, hippocampal slice, organotypic, fluorescence microscopy, live imaging
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1
. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2
, because the composition and spatial configuration of head tissues change dramatically over development3.
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis.
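The minimum-norm source estimate underlying this kind of analysis can be sketched in a few lines; the lead field, noiseless data, and regularization value below are illustrative toys, not the article's actual pipeline:

```python
import numpy as np

def minimum_norm_estimate(L, m, lam=1e-2):
    """Regularized minimum-norm inverse: s_hat = L.T @ (L @ L.T + lam*I)^-1 @ m."""
    G = L @ L.T + lam * np.eye(L.shape[0])
    return L.T @ np.linalg.solve(G, m)

# Toy setting: 4 EEG channels, 6 candidate cortical sources
rng = np.random.default_rng(0)
L = rng.standard_normal((4, 6))        # lead field (channels x sources)
s_true = np.zeros(6)
s_true[2] = 1.0                        # a single active source
m = L @ s_true                         # noiseless sensor measurements
s_hat = minimum_norm_estimate(L, m)    # estimated source amplitudes
```

As lam approaches 0 the estimate converges to the pseudoinverse solution; in practice lam is chosen from the noise level of the recording.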
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g.
, signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
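As a concrete illustration of the semi-automated route (approach 3), intensity thresholding followed by connected-component labeling can be sketched as below; the toy volume, threshold, and use of scipy are assumptions for the sketch, not the tools used in the article:

```python
import numpy as np
from scipy import ndimage

def threshold_segment(volume, thresh):
    """Binarize a 3D intensity volume, then label connected components."""
    binary = volume > thresh
    labels, n_features = ndimage.label(binary)
    return labels, n_features

# Toy volume: two bright blobs on a dark background
vol = np.zeros((20, 20, 20))
vol[2:5, 2:5, 2:5] = 1.0
vol[10:14, 10:14, 10:14] = 1.0
labels, n = threshold_segment(vol, 0.5)   # n counts the segmented features
```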
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Atomically Defined Templates for Epitaxial Growth of Complex Oxide Thin Films
Institutions: University of Twente.
Atomically defined substrate surfaces are a prerequisite for the epitaxial growth of complex oxide thin films. In this protocol, two approaches to obtain such surfaces are described. The first approach is the preparation of single-terminated perovskite SrTiO3 (001) and DyScO3 (110) substrates. Wet etching was used to selectively remove one of the two possible surface terminations, while an annealing step was used to increase the smoothness of the surface. The resulting single-terminated surfaces allow for the heteroepitaxial growth of perovskite oxide thin films with high crystalline quality and well-defined interfaces between substrate and film. In the second approach, seed layers for epitaxial film growth on arbitrary substrates were created by Langmuir-Blodgett (LB) deposition of nanosheets. As a model system, Ca2Nb3O10 nanosheets were used, prepared by delamination of their layered parent compound HCa2Nb3O10. A key advantage of creating seed layers with nanosheets is that relatively expensive and size-limited single crystalline substrates can be replaced by virtually any substrate material.
Chemistry, Issue 94, Substrates, oxides, perovskites, epitaxy, thin films, single termination, surface treatment, nanosheets, Langmuir-Blodgett
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo
protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo
protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (https://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy over the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
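The rank-ordering step can be pictured with a deliberately simplified sketch: score candidate sequences with an energy function and sort ascending. The per-residue table and candidate sequences here are invented for illustration; Protein WISDOM's actual stages use physics-based potentials that this toy does not reproduce:

```python
def toy_energy(seq):
    """Invented per-residue score standing in for a potential-energy model."""
    table = {"A": -1.0, "L": -1.5, "G": -0.2, "P": 0.5, "K": -0.8}
    return sum(table.get(aa, 0.0) for aa in seq)

# Hypothetical candidate sequences from a sequence selection stage
candidates = ["ALLGK", "PPGGA", "LLLKA", "GAPKA"]
ranked = sorted(candidates, key=toy_energy)   # lowest (best) energy first
```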
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+
release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes is an essential prerequisite towards better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open access software tools. In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
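The binarize-then-quantify idea behind the TATS workflow can be illustrated with a toy density readout in numpy; the image, mask, and threshold are fabricated for the sketch, and the article's actual pipeline additionally skeletonizes the binary image before network analysis:

```python
import numpy as np

def tats_density(image, cell_mask, thresh):
    """Fraction of cell-interior pixels classified as TATS membrane
    after simple intensity thresholding."""
    binary = (image > thresh) & cell_mask
    return binary.sum() / cell_mask.sum()

# Toy image: a 'cell' with one bright vertical tubule every 5 pixels
img = np.zeros((50, 50))
img[:, ::5] = 1.0
mask = np.ones((50, 50), dtype=bool)
d = tats_density(img, mask, 0.5)   # 10 of 50 columns are tubule pixels
```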
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
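The kind of time-stamped event record such a system logs, and one statistic derived from it (the choice proportion that a matching protocol compares against the programmed reward ratio), can be sketched as follows; the event labels and timestamps are hypothetical, not the system's actual log format:

```python
from collections import Counter

# Hypothetical time-stamped event record: (time_s, event_label)
events = [
    (1.2, "poke_hopper1"), (2.0, "poke_hopper2"), (3.1, "poke_hopper1"),
    (4.5, "poke_hopper1"), (5.0, "feed_hopper1"), (6.3, "poke_hopper2"),
]

def choice_proportion(record):
    """Proportion of head entries at hopper 1 among all pokes."""
    pokes = Counter(label for _, label in record if label.startswith("poke"))
    return pokes["poke_hopper1"] / sum(pokes.values())

r = choice_proportion(events)   # 3 of 5 pokes at hopper 1
```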
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Fundamental Technical Elements of Freeze-fracture/Freeze-etch in Biological Electron Microscopy
Institutions: The University of North Carolina at Chapel Hill.
Freeze-fracture/freeze-etch describes a process whereby specimens, typically biological or nanomaterial in nature, are frozen, fractured, and replicated to generate a carbon/platinum “cast” intended for examination by transmission electron microscopy. Specimens are subjected to ultrarapid freezing rates, often in the presence of cryoprotective agents to limit ice crystal formation, with subsequent fracturing of the specimen at liquid nitrogen cooled temperatures under high vacuum. The resultant fractured surface is replicated and stabilized by evaporation of carbon and platinum from an angle that confers surface three-dimensional detail to the cast. This technique has proved particularly enlightening for the investigation of cell membranes and their specializations and has contributed considerably to the understanding of how cellular form relates to cell function. In this report, we survey the instrument requirements and technical protocol for performing freeze-fracture, the associated nomenclature and characteristics of fracture planes, variations on the conventional procedure, and criteria for interpretation of freeze-fracture images. This technique has been widely used for ultrastructural investigation in many areas of cell biology and holds promise as an emerging imaging technique for molecular, nanotechnology, and materials science studies.
Biophysics, Issue 91, Freeze-fracture; Freeze-etch; Membranes; Intercellular junctions; Materials science; Nanotechnology; Electron microscopy
High-throughput Fluorometric Measurement of Potential Soil Extracellular Enzyme Activities
Institutions: Colorado State University, Oak Ridge National Laboratory, University of Colorado.
Microbes in soils and other environments produce extracellular enzymes to depolymerize and hydrolyze organic macromolecules so that they can be assimilated for energy and nutrients. Measuring soil microbial enzyme activity is crucial in understanding soil ecosystem functional dynamics. The general concept of the fluorescence enzyme assay is that synthetic C-, N-, or P-rich substrates bound with a fluorescent dye are added to soil samples. When intact, the labeled substrates do not fluoresce. Enzyme activity is measured as the increase in fluorescence as the fluorescent dyes are cleaved from their substrates, which allows them to fluoresce. Enzyme measurements can be expressed in units of molarity or activity. To perform this assay, soil slurries are prepared by combining soil with a pH buffer. The pH buffer (typically a 50 mM sodium acetate or 50 mM Tris buffer), is chosen for the buffer's particular acid dissociation constant (pKa) to best match the soil sample pH. The soil slurries are inoculated with a nonlimiting amount of fluorescently labeled (i.e.
C-, N-, or P-rich) substrate. Using soil slurries in the assay serves to minimize limitations on enzyme and substrate diffusion. Therefore, this assay controls for differences in substrate limitation, diffusion rates, and soil pH conditions; thus detecting potential enzyme activity rates as a function of the difference in enzyme concentrations (per sample).
Fluorescence enzyme assays are typically more sensitive than spectrophotometric (i.e.
colorimetric) assays, but can suffer from interference caused by impurities and the instability of many fluorescent compounds when exposed to light; so caution is required when handling fluorescent substrates. Likewise, this method only assesses potential enzyme activities under laboratory conditions when substrates are not limiting. Caution should be used when interpreting the data representing cross-site comparisons with differing temperatures or soil types, as in situ
soil type and temperature can influence enzyme kinetics.
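The conversion from a fluorescence increase to a potential activity rate commonly takes the form below; the function signature and all numbers are illustrative assumptions, not values from the article, and real assays additionally correct for quenching and for substrate and homogenate controls:

```python
def potential_activity(net_fluorescence, emission_coef, assay_vol_ml,
                       homogenate_vol_ml, time_h, soil_g):
    """Potential enzyme activity (nmol per g dry soil per hour) from the net
    fluorescence increase of a dye-labeled substrate. emission_coef is the
    fluorescence units produced per nmol of free dye (standard-curve slope)."""
    return (net_fluorescence * assay_vol_ml) / (
        emission_coef * homogenate_vol_ml * time_h * soil_g)

# Illustrative numbers only
a = potential_activity(net_fluorescence=5000.0, emission_coef=1.2e4,
                       assay_vol_ml=91.0, homogenate_vol_ml=0.2,
                       time_h=3.0, soil_g=1.0)
```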
Environmental Sciences, Issue 81, Ecological and Environmental Phenomena, Environment, Biochemistry, Environmental Microbiology, Soil Microbiology, Ecology, Eukaryota, Archaea, Bacteria, Soil extracellular enzyme activities (EEAs), fluorometric enzyme assays, substrate degradation, 4-methylumbelliferone (MUB), 7-amino-4-methylcoumarin (MUC), enzyme temperature kinetics, soil
Transient Gene Expression in Tobacco using Gibson Assembly and the Gene Gun
Institutions: Harvard University, Harvard Medical School, Delft University of Technology.
In order to target a single protein to multiple subcellular organelles, plants typically duplicate the relevant genes and express each gene separately using complex regulatory strategies, including differential promoters and/or signal sequences. Metabolic engineers and synthetic biologists interested in targeting enzymes to a particular organelle are faced with a challenge: for a protein that is to be localized to more than one organelle, the engineer must clone the same gene multiple times. This work presents a solution to this challenge: harnessing alternative splicing of mRNA. This technology takes advantage of established chloroplast and peroxisome targeting sequences and combines them into a single mRNA that is alternatively spliced. Some splice variants are sent to the chloroplast, some to the peroxisome, and some to the cytosol. Here the system is designed for multiple-organelle targeting with alternative splicing. In this work, GFP was expected to be expressed in the chloroplast, cytosol, and peroxisome by a series of rationally designed 5’ mRNA tags. These tags have the potential to reduce the amount of cloning required when heterologous genes need to be expressed in multiple subcellular organelles. The constructs were designed in previous work11
, and were cloned using Gibson assembly, a ligation independent cloning method that does not require restriction enzymes. The resultant plasmids were introduced into Nicotiana benthamiana
epidermal leaf cells with a modified Gene Gun protocol. Finally, transformed leaves were observed with confocal microscopy.
Environmental Sciences, Issue 86, Plant Leaves, Synthetic Biology, Plants, Genetically Modified, DNA, Plant, RNA, Gene Targeting, Plant Physiological Processes, Genes, Gene gun, Gibson assembly, Nicotiana benthamiana, Alternative splicing, confocal microscopy, chloroplast, peroxisome
Localized RNAi and Ectopic Gene Expression in the Medicinal Leech
Institutions: University of California San Diego - UCSD, University of California San Diego - UCSD.
In this video, we show the use of a pneumatic capillary gun for the accurate biolistic delivery of reagents into live tissue. We use the procedure to perturb gene expression patterns in selected segments of leech embryos, leaving the untreated segments as internal controls.
The pneumatic capillary gun can be used to reach internal layers of cells at early stages of development without opening the specimen. As a method for localized introduction of substances into living tissues, biolistic delivery with the gun has several advantages: it is fast, contact-free, and non-destructive. In addition, a single capillary gun can be used for independent delivery of different substances. The delivery region can have lateral dimensions of ~50-150 µm and extends over ~15 µm around the mean penetration depth, which is adjustable between 0 and 50 µm. This approach can target a limited number of cells in a selected location, making it intermediate between single-cell knockdown by microinjection and systemic knockdown through extracellular injections or genetic approaches.
For knocking down or knocking in the expression of the axon guidance molecule Netrin, which is naturally expressed by some central neurons and in the ventral body wall, but not the dorsal domain, we deliver molecules of dsRNA or plasmid-DNA into the body wall and central ganglia. This procedure includes the following steps: (i) preparation of the experimental setup for a specific assay (adjusting the accelerating pressure), (ii) coating the particles with molecules of dsRNA or DNA, (iii) loading the coated particles into the gun, up to two reagents in one assay, (iv) preparing the animals for the particle delivery, (v) delivery of coated particles into the target tissue (body wall or ganglia), and (vi) processing the embryos (immunostaining, immunohistochemistry and neuronal labeling) to visualize the results, usually 2 to 3 days after the delivery.
When the particles were coated with netrin dsRNA, they caused clearly visible knockdown of netrin expression that occurred only in cells containing particles (usually 1-2 particles per cell). Particles coated with a plasmid encoding EGFP induced fluorescence in neuronal cells when the particles lodged in the cell nuclei.
Neuroscience, Issue 14, leech, netrin, axon guidance, development, mechanosensory neurons, gene gun, RNAi
Organotypic Culture of Adult Rabbit Retina
Institutions: MGH - Massachusetts General Hospital.
Organotypic culture systems of functional neural tissues are important tools in neurobiological research. Ideally, such a system should be compatible with imaging techniques, genetic manipulation, and electrophysiological recording. Here we present a simple interphase tissue culture system for adult rabbit retina that requires no specialized equipment and very little maintenance. We demonstrate the dissection and incubation of rabbit retina and particle-mediated gene transfer of plasmids encoding GFP or a variety of subcellular markers into retinal ganglion cells. Rabbit retinas cultured this way can be kept alive for up to 6 days with little change to the overall anatomical structure or to the morphology of individual ganglion and amacrine cells.
Neuroscience, Issue 3, retina, dissection, neuron, ganglion
Basics of Multivariate Analysis in Neuroimaging Data
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise, techniques1,5,6,7,8,9
. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice. A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
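The covariance idea can be made concrete with a few lines of numpy: a principal component analysis of the region-by-region covariance matrix recovers a shared activation pattern. The data here are synthetic, and PCA stands in for the broader family of multivariate decompositions discussed in the article:

```python
import numpy as np

# Synthetic data: 8 subjects x 5 brain regions sharing one covarying pattern
rng = np.random.default_rng(1)
pattern = np.array([1.0, -0.5, 0.8, 0.0, 0.3])        # network "signature"
scores = rng.standard_normal(8)                        # per-subject expression
data = np.outer(scores, pattern) + 0.05 * rng.standard_normal((8, 5))

cov = np.cov(data, rowvar=False)                       # region x region covariance
eigvals, eigvecs = np.linalg.eigh(cov)
pc1 = eigvecs[:, -1]                                   # dominant covariance pattern
alignment = abs(pc1 @ pattern) / np.linalg.norm(pattern)   # |cos| angle to truth
```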
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
Gene-gun Transfection of Hippocampal Neurons
Institutions: Brown University.
Neuroscience, Issue 1, brain, hippocampus, neuron, transfection, gene-gun
Predicting the Effectiveness of Population Replacement Strategy Using Mathematical Modeling
Institutions: University of California, Los Angeles.
Charles Taylor and John Marshall explain the utility of mathematical modeling for evaluating the effectiveness of population replacement strategy. Insight is given into how computational models can provide information on the population dynamics of mosquitoes and the spread of transposable elements through A. gambiae subspecies. The ethical considerations of releasing genetically modified mosquitoes into the wild are discussed.
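A deliberately minimal deterministic recursion shows the core trade-off such models quantify: a transmission advantage (drive) versus a fitness cost. This toy is not the model discussed by Taylor and Marshall; the parameter names and values are invented for illustration:

```python
def spread(p0, drive=0.5, cost=0.1, generations=50):
    """Frequency p of a transposable element that over-transmits by a factor
    (1 + drive) while imposing a fitness cost on carriers:
    p' = p(1+drive)(1-cost) / (p(1+drive)(1-cost) + (1-p))."""
    p = p0
    for _ in range(generations):
        w = p * (1 + drive) * (1 - cost)
        p = w / (w + (1 - p))
    return p

p_final = spread(0.01)   # (1+drive)(1-cost) > 1, so the element spreads
```

When (1 + drive)(1 - cost) exceeds 1 the element sweeps to high frequency; when drive is zero the cost alone drives it out.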
Cellular Biology, Issue 5, mosquito, malaria, population, replacement, modeling, infectious disease