JoVE Visualize
Related JoVE Video
Pubmed Article
Leuconostoc mesenteroides growth in food products: prediction and sensitivity analysis by adaptive-network-based fuzzy inference systems.
PUBLISHED: 01-01-2013
An adaptive-network-based fuzzy inference system (ANFIS) was compared with an artificial neural network (ANN) in terms of accuracy in predicting the combined effects of temperature (10.5 to 24.5°C), pH level (5.5 to 7.5), sodium chloride level (0.25% to 6.25%) and sodium nitrite level (0 to 200 ppm) on the growth rate of Leuconostoc mesenteroides under aerobic and anaerobic conditions.
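For readers who want to prototype a comparable predictive model, the sketch below fits a small feed-forward neural network (scikit-learn's MLPRegressor, standing in for the ANN comparator; ANFIS is not shown) to synthetic temperature/pH/NaCl/nitrite data spanning the ranges above. The data and the toy response function are illustrative assumptions, not the study's measurements.

```python
# Minimal sketch (not the authors' model): predicting growth rate from
# temperature, pH, NaCl and nitrite with a small neural network.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# columns: temperature (degC), pH, NaCl (%), NaNO2 (ppm)
X = np.column_stack([
    rng.uniform(10.5, 24.5, 200),
    rng.uniform(5.5, 7.5, 200),
    rng.uniform(0.25, 6.25, 200),
    rng.uniform(0, 200, 200),
])
# toy growth-rate response (1/h); a real study would use measured rates
y = 0.02 * X[:, 0] - 0.05 * np.abs(X[:, 1] - 6.5) - 0.03 * X[:, 2] - 0.0005 * X[:, 3]

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0))
model.fit(X, y)
print(model.predict([[18.0, 6.0, 2.0, 50.0]]))  # predicted growth rate for one condition
```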
Authors: Mackenzie J. Denyes, Michèle A. Parisien, Allison Rutter, Barbara A. Zeeb.
Published: 11-28-2014
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g. carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for physical and chemical characteristics for biochar. Six biochars made from three different feedstocks and at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals and mercury as well as nutrients (phosphorus, nitrite and nitrate and ammonium as nitrogen). The protocol also includes the biological testing procedures, earthworm avoidance and germination assays. Based on the quality assurance / quality control (QA/QC) results of blanks, duplicates, standards and reference materials, all methods were determined adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI and there were few differences among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays. Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements or remediation.
28 Related JoVE Articles!
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
A Strategy for Sensitive, Large Scale Quantitative Metabolomics
Authors: Xiaojing Liu, Zheng Ser, Ahmad A. Cluntun, Samantha J. Mentch, Jason W. Locasale.
Institutions: Cornell University, Cornell University.
Metabolite profiling has been a valuable asset in the study of metabolism in health and disease. However, current platforms have different limiting factors, such as labor intensive sample preparations, low detection limits, slow scan speeds, intensive method optimization for each metabolite, and the inability to measure both positively and negatively charged ions in single experiments. Therefore, a novel metabolomics protocol could advance metabolomics studies. Amide-based hydrophilic chromatography enables polar metabolite analysis without any chemical derivatization. High resolution MS using the Q-Exactive (QE-MS) has improved ion optics, increased scan speeds (256 msec at resolution 70,000), and has the capability of carrying out positive/negative switching. Using a cold methanol extraction strategy, and coupling an amide column with QE-MS enables robust detection of 168 targeted polar metabolites and thousands of additional features simultaneously.  Data processing is carried out with commercially available software in a highly efficient way, and unknown features extracted from the mass spectra can be queried in databases.
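As a hedged illustration of the final database-query step, the snippet below matches measured m/z features against a small metabolite list within a ppm tolerance; the reference masses, tolerance, and function name are examples, not part of the published workflow.

```python
# Hedged sketch: matching measured m/z features to a small metabolite list
# within a ppm tolerance, as one might do after QE-MS peak picking.
METABOLITES = {          # name: approximate monoisotopic [M-H]- m/z
    "lactate": 89.0244,
    "pyruvate": 87.0088,
    "glutamate": 146.0459,
}

def match_feature(mz, tol_ppm=5.0):
    """Return metabolite names whose library m/z lies within tol_ppm of mz."""
    hits = []
    for name, ref in METABOLITES.items():
        if abs(mz - ref) / ref * 1e6 <= tol_ppm:
            hits.append(name)
    return hits

print(match_feature(89.0246))   # -> ['lactate'] at roughly 2 ppm error
```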
Chemistry, Issue 87, high-resolution mass spectrometry, metabolomics, positive/negative switching, low mass calibration, Orbitrap
Enteric Bacterial Invasion Of Intestinal Epithelial Cells In Vitro Is Dramatically Enhanced Using a Vertical Diffusion Chamber Model
Authors: Neveda Naz, Dominic C. Mills, Brendan W. Wren, Nick Dorrell.
Institutions: London School of Hygiene & Tropical Medicine.
The interactions of bacterial pathogens with host cells have been investigated extensively using in vitro cell culture methods. However as such cell culture assays are performed under aerobic conditions, these in vitro models may not accurately represent the in vivo environment in which the host-pathogen interactions take place. We have developed an in vitro model of infection that permits the coculture of bacteria and host cells under different medium and gas conditions. The Vertical Diffusion Chamber (VDC) model mimics the conditions in the human intestine where bacteria will be under conditions of very low oxygen whilst tissue will be supplied with oxygen from the blood stream. Placing polarized intestinal epithelial cell (IEC) monolayers grown in Snapwell inserts into a VDC creates separate apical and basolateral compartments. The basolateral compartment is filled with cell culture medium, sealed and perfused with oxygen whilst the apical compartment is filled with broth, kept open and incubated under microaerobic conditions. Both Caco-2 and T84 IECs can be maintained in the VDC under these conditions without any apparent detrimental effects on cell survival or monolayer integrity. Coculturing experiments performed with different C. jejuni wild-type strains and different IEC lines in the VDC model with microaerobic conditions in the apical compartment reproducibly result in an increase in the number of interacting (almost 10-fold) and intracellular (almost 100-fold) bacteria compared to aerobic culture conditions1. The environment created in the VDC model more closely mimics the environment encountered by C. jejuni in the human intestine and highlights the importance of performing in vitro infection assays under conditions that more closely mimic the in vivo reality. We propose that use of the VDC model will allow new interpretations of the interactions between bacterial pathogens and host cells.
Infection, Issue 80, Gram-Negative Bacteria, Bacterial Infections, Gastrointestinal Diseases, Campylobacter jejuni, bacterial invasion, intestinal epithelial cells, models of infection
Culturing and Maintaining Clostridium difficile in an Anaerobic Environment
Authors: Adrianne N. Edwards, Jose M. Suárez, Shonna M. McBride.
Institutions: Emory University School of Medicine.
Clostridium difficile is a Gram-positive, anaerobic, sporogenic bacterium that is primarily responsible for antibiotic associated diarrhea (AAD) and is a significant nosocomial pathogen. C. difficile is notoriously difficult to isolate and cultivate and is extremely sensitive to even low levels of oxygen in the environment. Here, methods for isolating C. difficile from fecal samples and subsequently culturing C. difficile for preparation of glycerol stocks for long-term storage are presented. Techniques for preparing and enumerating spore stocks in the laboratory for a variety of downstream applications including microscopy and animal studies are also described. These techniques necessitate an anaerobic chamber, which maintains a consistent anaerobic environment to ensure proper conditions for optimal C. difficile growth. We provide protocols for transferring materials in and out of the chamber without causing significant oxygen contamination along with suggestions for regular maintenance required to sustain the appropriate anaerobic environment for efficient and consistent C. difficile cultivation.
Immunology, Issue 79, Genetics, Bacteria, Anaerobic, Gram-Positive Endospore-Forming Rods, Spores, Bacterial, Gram-Positive Bacterial Infections, Clostridium Infections, Bacteriology, Clostridium difficile, Gram-positive, anaerobic chamber, spore, culturing, maintenance, cell culture
Microwave-assisted Functionalization of Poly(ethylene glycol) and On-resin Peptides for Use in Chain Polymerizations and Hydrogel Formation
Authors: Amy H. Van Hove, Brandon D. Wilson, Danielle S. W. Benoit.
Institutions: University of Rochester, University of Rochester, University of Rochester Medical Center.
One of the main benefits to using poly(ethylene glycol) (PEG) macromers in hydrogel formation is synthetic versatility. The ability to draw from a large variety of PEG molecular weights and configurations (arm number, arm length, and branching pattern) affords researchers tight control over resulting hydrogel structures and properties, including Young’s modulus and mesh size. This video will illustrate a rapid, efficient, solvent-free, microwave-assisted method to methacrylate PEG precursors into poly(ethylene glycol) dimethacrylate (PEGDM). This synthetic method provides much-needed starting materials for applications in drug delivery and regenerative medicine. The demonstrated method is superior to traditional methacrylation methods as it is significantly faster and simpler, as well as more economical and environmentally friendly, using smaller amounts of reagents and solvents. We will also demonstrate an adaptation of this technique for on-resin methacrylamide functionalization of peptides. This on-resin method allows the N-terminus of peptides to be functionalized with methacrylamide groups prior to deprotection and cleavage from resin. This allows for selective addition of methacrylamide groups to the N-termini of the peptides while amino acids with reactive side groups (e.g. primary amine of lysine, primary alcohol of serine, secondary alcohols of threonine, and phenol of tyrosine) remain protected, preventing functionalization at multiple sites. This article will detail common analytical methods (proton Nuclear Magnetic Resonance spectroscopy (1H-NMR) and Matrix Assisted Laser Desorption Ionization Time of Flight mass spectrometry (MALDI-ToF)) to assess the efficiency of the functionalizations. Common pitfalls and suggested troubleshooting methods will be addressed, as will modifications of the technique which can be used to further tune macromer functionality and resulting hydrogel physical and chemical properties. Use of synthesized products for the formation of hydrogels for drug delivery and cell-material interaction studies will be demonstrated, with particular attention paid to modifying hydrogel composition to affect mesh size, controlling hydrogel stiffness and drug release.
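To make the efficiency assessment concrete, here is a minimal worked example of estimating percent methacrylation from 1H-NMR integrals; the peak assignments, proton counts, and default value are assumptions for illustration and will differ with PEG molecular weight and architecture.

```python
# Illustrative calculation (assumed peak assignments, not the authors' exact values):
# percent methacrylation of PEG from 1H-NMR integrals by comparing the methacrylate
# vinyl proton signals to the PEG backbone signal.
def percent_methacrylation(vinyl_integral, backbone_integral,
                           vinyl_protons=2, backbone_protons_per_end=454):
    """
    vinyl_integral:    combined integral of the two vinyl protons (~5.6 and 6.1 ppm)
    backbone_integral: integral of the PEG -OCH2CH2- backbone signal (~3.6 ppm)
    backbone_protons_per_end: backbone protons per chain end (454 assumed for a
                              10 kDa linear PEG; depends on MW and arm number)
    """
    ends_expected = backbone_integral / backbone_protons_per_end
    ends_functionalized = vinyl_integral / vinyl_protons
    return 100.0 * ends_functionalized / ends_expected

# e.g. integrals normalized so the backbone signal equals its proton count:
print(round(percent_methacrylation(vinyl_integral=1.9, backbone_integral=454.0), 1))  # ~95%
```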
Chemistry, Issue 80, Poly(ethylene glycol), peptides, polymerization, polymers, methacrylation, peptide functionalization, 1H-NMR, MALDI-ToF, hydrogels, macromer synthesis
Training Synesthetic Letter-color Associations by Reading in Color
Authors: Olympia Colizoli, Jaap M. J. Murre, Romke Rouw.
Institutions: University of Amsterdam.
Synesthesia is a rare condition in which a stimulus from one modality automatically and consistently triggers unusual sensations in the same and/or other modalities. A relatively common and well-studied type is grapheme-color synesthesia, defined as the consistent experience of color when viewing, hearing and thinking about letters, words and numbers. We describe our method for investigating to what extent synesthetic associations between letters and colors can be learned by reading in color in nonsynesthetes. Reading in color is a special method for training associations in the sense that the associations are learned implicitly while the reader reads text as he or she normally would and it does not require explicit computer-directed training methods. In this protocol, participants are given specially prepared books to read in which four high-frequency letters are paired with four high-frequency colors. Participants receive unique sets of letter-color pairs based on their pre-existing preferences for colored letters. A modified Stroop task is administered before and after reading in order to test for learned letter-color associations and changes in brain activation. In addition to objective testing, a reading experience questionnaire is administered that is designed to probe for differences in subjective experience. A subset of questions may predict how well an individual learned the associations from reading in color. Importantly, we are not claiming that this method will cause each individual to develop grapheme-color synesthesia, only that it is possible for certain individuals to form letter-color associations by reading in color and these associations are similar in some aspects to those seen in developmental grapheme-color synesthetes. The method is quite flexible and can be used to investigate different aspects and outcomes of training synesthetic associations, including learning-induced changes in brain function and structure.
Behavior, Issue 84, synesthesia, training, learning, reading, vision, memory, cognition
Mouse Genome Engineering Using Designer Nucleases
Authors: Mario Hermann, Tomas Cermak, Daniel F. Voytas, Pawel Pelczar.
Institutions: University of Zurich, University of Minnesota.
Transgenic mice carrying site-specific genome modifications (knockout, knock-in) are of vital importance for dissecting complex biological systems as well as for modeling human diseases and testing therapeutic strategies. Recent advances in the use of designer nucleases such as zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and the clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated (Cas) 9 system for site-specific genome engineering open the possibility to perform rapid targeted genome modification in virtually any laboratory species without the need to rely on embryonic stem (ES) cell technology. A genome editing experiment typically starts with identification of designer nuclease target sites within a gene of interest followed by construction of custom DNA-binding domains to direct nuclease activity to the investigator-defined genomic locus. Designer nuclease plasmids are in vitro transcribed to generate mRNA for microinjection of fertilized mouse oocytes. Here, we provide a protocol for achieving targeted genome modification by direct injection of TALEN mRNA into fertilized mouse oocytes.
Genetics, Issue 86, Oocyte microinjection, Designer nucleases, ZFN, TALEN, Genome Engineering
A New Approach for the Comparative Analysis of Multiprotein Complexes Based on 15N Metabolic Labeling and Quantitative Mass Spectrometry
Authors: Kerstin Trompelt, Janina Steinbeck, Mia Terashima, Michael Hippler.
Institutions: University of Münster, Carnegie Institution for Science.
The introduced protocol provides a tool for the analysis of multiprotein complexes in the thylakoid membrane, by revealing insights into complex composition under different conditions. In this protocol the approach is demonstrated by comparing the composition of the protein complex responsible for cyclic electron flow (CEF) in Chlamydomonas reinhardtii, isolated from genetically different strains. The procedure comprises the isolation of thylakoid membranes, followed by their separation into multiprotein complexes by sucrose density gradient centrifugation, SDS-PAGE, immunodetection and comparative, quantitative mass spectrometry (MS) based on differential metabolic labeling (14N/15N) of the analyzed strains. Detergent-solubilized thylakoid membranes are loaded on sucrose density gradients at equal chlorophyll concentration. After ultracentrifugation, the gradients are separated into fractions, which are analyzed by mass spectrometry on an equal-volume basis. This approach allows the investigation of the composition within the gradient fractions and, moreover, the analysis of the migration behavior of different proteins, with a particular focus on ANR1, CAS, and PGRL1. Furthermore, the method is validated by confirming the results with immunoblotting and by supporting the findings from previous studies (the identification and PSI-dependent migration of proteins that were previously described to be part of the CEF-supercomplex such as PGRL1, FNR, and cyt f). Notably, this approach is applicable to a broad range of questions and can be adopted, for example, for comparative analyses of multiprotein complex composition isolated under distinct environmental conditions.
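A minimal sketch of the quantitative step, assuming peptide-level 15N/14N intensity ratios have already been extracted from the spectra; protein names and values are placeholders, not data from the study.

```python
# Minimal sketch of relative quantification from differential 14N/15N labeling:
# per-protein peptide heavy/light ratios are combined into a median ratio
# between the two strains. Values are invented for illustration.
from statistics import median

peptide_ratios = {  # protein -> list of 15N/14N intensity ratios across its peptides
    "PGRL1": [1.8, 2.1, 1.9],
    "FNR":   [1.1, 0.9, 1.0],
    "cyt f": [0.95, 1.05],
}

for protein, ratios in peptide_ratios.items():
    r = median(ratios)
    label = "enriched" if r > 1.5 else "unchanged" if r > 0.67 else "depleted"
    print(f"{protein}: median 15N/14N ratio = {r:.2f} ({label})")
```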
Microbiology, Issue 85, Sucrose density gradients, Chlamydomonas, multiprotein complexes, 15N metabolic labeling, thylakoids
Isolation and Quantification of Botulinum Neurotoxin From Complex Matrices Using the BoTest Matrix Assays
Authors: F. Mark Dunning, Timothy M. Piazza, Füsûn N. Zeytin, Ward C. Tucker.
Institutions: BioSentinel Inc., Madison, WI.
Accurate detection and quantification of botulinum neurotoxin (BoNT) in complex matrices is required for pharmaceutical, environmental, and food sample testing. Rapid BoNT testing of foodstuffs is needed during outbreak forensics, patient diagnosis, and food safety testing, while accurate potency testing is required for BoNT-based drug product manufacturing and patient safety. The widely used mouse bioassay for BoNT testing is highly sensitive but lacks the precision and throughput needed for rapid and routine BoNT testing. Furthermore, the bioassay's use of animals has resulted in calls by drug product regulatory authorities and animal-rights proponents in the US and abroad to replace the mouse bioassay for BoNT testing. Several in vitro replacement assays have been developed that work well with purified BoNT in simple buffers, but most have not been shown to be applicable to testing in highly complex matrices. Here, a protocol for the detection of BoNT in complex matrices using the BoTest Matrix assays is presented. The assay consists of three parts: The first part involves preparation of the samples for testing, the second part is an immunoprecipitation step using anti-BoNT antibody-coated paramagnetic beads to purify BoNT from the matrix, and the third part quantifies the isolated BoNT's proteolytic activity using a fluorogenic reporter. The protocol is written for high throughput testing in 96-well plates using both liquid and solid matrices and requires about 2 hr of manual preparation with total assay times of 4-26 hr depending on the sample type, toxin load, and desired sensitivity. Data are presented for BoNT/A testing with phosphate-buffered saline, a drug product, culture supernatant, 2% milk, and fresh tomatoes, with discussion of critical parameters for assay success.
Neuroscience, Issue 85, Botulinum, food testing, detection, quantification, complex matrices, BoTest Matrix, Clostridium, potency testing
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
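The following sketch illustrates the DoE principle rather than the authors' software or design: a two-level full factorial design for three assumed factors, with a main-effects model fit by least squares to synthetic response data. Real studies would use dedicated DoE software, fractional and augmented designs, and more factors.

```python
# Hedged DoE sketch: 2^3 full factorial design in coded units, fit of a
# main-effects linear model to synthetic expression data.
import itertools
import numpy as np

factors = ["promoter_strength", "plant_age", "incubation_temp"]   # assumed factor names
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))  # 8 runs

rng = np.random.default_rng(1)
response = 10 + 3*design[:, 0] + 1.5*design[:, 1] - 2*design[:, 2] \
           + rng.normal(0, 0.5, len(design))                      # toy response

X = np.column_stack([np.ones(len(design)), design])               # intercept + main effects
coef, *_ = np.linalg.lstsq(X, response, rcond=None)
for name, b in zip(["intercept"] + factors, coef):
    print(f"{name}: effect = {b:.2f}")
```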
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Measurement of Lifespan in Drosophila melanogaster
Authors: Nancy J. Linford, Ceyda Bilgir, Jennifer Ro, Scott D. Pletcher.
Institutions: University of Michigan , University of Michigan .
Aging is a phenomenon that results in steady physiological deterioration in nearly all organisms in which it has been examined, leading to reduced physical performance and increased risk of disease. Individual aging is manifest at the population level as an increase in age-dependent mortality, which is often measured in the laboratory by observing lifespan in large cohorts of age-matched individuals. Experiments that seek to quantify the extent to which genetic or environmental manipulations impact lifespan in simple model organisms have been remarkably successful for understanding the aspects of aging that are conserved across taxa and for inspiring new strategies for extending lifespan and preventing age-associated disease in mammals. The vinegar fly, Drosophila melanogaster, is an attractive model organism for studying the mechanisms of aging due to its relatively short lifespan, convenient husbandry, and facile genetics. However, demographic measures of aging, including age-specific survival and mortality, are extraordinarily susceptible to even minor variations in experimental design and environment, and the maintenance of strict laboratory practices for the duration of aging experiments is required. These considerations, together with the need to practice careful control of genetic background, are essential for generating robust measurements. Indeed, there are many notable controversies surrounding inference from longevity experiments in yeast, worms, flies and mice that have been traced to environmental or genetic artifacts1-4. In this protocol, we describe a set of procedures that have been optimized over many years of measuring longevity in Drosophila using laboratory vials. We also describe the use of the dLife software, which was developed by our laboratory and is available for download. dLife accelerates throughput and promotes good practices by incorporating optimal experimental design, simplifying fly handling and data collection, and standardizing data analysis. We will also discuss the many potential pitfalls in the design, collection, and interpretation of lifespan data, and we provide steps to avoid these dangers.
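As a simple illustration of the demographic measures mentioned above (and not a substitute for the dLife software), the snippet below computes age-specific survival and interval mortality from invented daily death counts.

```python
# Illustrative calculation: age-specific survival and mortality from death
# counts in a cohort of age-matched flies (counts invented; no censoring assumed).
deaths_per_day = {5: 0, 10: 2, 15: 5, 20: 12, 25: 30, 30: 40, 35: 11}
n0 = sum(deaths_per_day.values())  # cohort size

alive = n0
for day, deaths in sorted(deaths_per_day.items()):
    survival = (alive - deaths) / n0            # fraction still alive after this interval
    mortality = deaths / alive if alive else 0  # per-interval age-specific mortality
    alive -= deaths
    print(f"day {day:2d}: survival = {survival:.2f}, interval mortality = {mortality:.2f}")
```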
Developmental Biology, Issue 71, Cellular Biology, Molecular Biology, Anatomy, Physiology, Entomology, longevity, lifespan, aging, Drosophila melanogaster, fruit fly, Drosophila, mortality, animal model
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
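One common way to summarize identification responses collected along such a morph continuum is to fit a logistic function and read off the category boundary; the sketch below does this with invented response proportions and is offered only as an illustration of the analysis idea, not as the protocol's analysis.

```python
# Sketch: locate a category boundary along the DHL by fitting a logistic
# function to proportion-"human" responses across morph steps (data invented).
import numpy as np
from scipy.optimize import curve_fit

morph_step = np.arange(1, 12)                                   # avatar ... human continuum
p_human = np.array([.02, .03, .05, .08, .15, .40, .75, .90, .95, .98, .99])

def logistic(x, x0, k):
    return 1.0 / (1.0 + np.exp(-k * (x - x0)))

(x0, k), _ = curve_fit(logistic, morph_step, p_human, p0=[6.0, 1.0])
print(f"estimated category boundary at morph step {x0:.2f} (slope {k:.2f})")
```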
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Metabolomic Analysis of Rat Brain by High Resolution Nuclear Magnetic Resonance Spectroscopy of Tissue Extracts
Authors: Norbert W. Lutz, Evelyne Béraud, Patrick J. Cozzone.
Institutions: Aix-Marseille Université, Aix-Marseille Université.
Studies of gene expression on the RNA and protein levels have long been used to explore biological processes underlying disease. More recently, genomics and proteomics have been complemented by comprehensive quantitative analysis of the metabolite pool present in biological systems. This strategy, termed metabolomics, strives to provide a global characterization of the small-molecule complement involved in metabolism. While the genome and the proteome define the tasks cells can perform, the metabolome is part of the actual phenotype. Among the methods currently used in metabolomics, spectroscopic techniques are of special interest because they allow one to simultaneously analyze a large number of metabolites without prior selection for specific biochemical pathways, thus enabling a broad unbiased approach. Here, an optimized experimental protocol for metabolomic analysis by high-resolution NMR spectroscopy is presented, which is the method of choice for efficient quantification of tissue metabolites. Important strengths of this method are (i) the use of crude extracts, without the need to purify the sample and/or separate metabolites; (ii) the intrinsically quantitative nature of NMR, permitting quantitation of all metabolites represented by an NMR spectrum with one reference compound only; and (iii) the nondestructive nature of NMR enabling repeated use of the same sample for multiple measurements. The dynamic range of metabolite concentrations that can be covered is considerable due to the linear response of NMR signals, although metabolites occurring at extremely low concentrations may be difficult to detect. For the least abundant compounds, the highly sensitive mass spectrometry method may be advantageous although this technique requires more intricate sample preparation and quantification procedures than NMR spectroscopy. We present here an NMR protocol adjusted to rat brain analysis; however, the same protocol can be applied to other tissues with minor modifications.
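A minimal worked example of the single-reference quantitation property noted in (ii), assuming a fully relaxed, quantitative spectrum and known proton counts; the peak values and reference compound are illustrative only.

```python
# Worked example: metabolite concentration from its NMR integral relative to an
# internal standard of known concentration (numbers are illustrative).
def metabolite_concentration(peak_integral, n_protons,
                             ref_integral, ref_n_protons, ref_conc_mM):
    """Concentration (mM) = (I/N) / (I_ref/N_ref) * C_ref for a quantitative spectrum."""
    return (peak_integral / n_protons) / (ref_integral / ref_n_protons) * ref_conc_mM

# e.g. lactate CH3 doublet (3 H) vs. a TSP reference (9 H) at 0.5 mM:
print(metabolite_concentration(peak_integral=12.0, n_protons=3,
                               ref_integral=4.5, ref_n_protons=9, ref_conc_mM=0.5))  # 4.0 mM
```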
Neuroscience, Issue 91, metabolomics, brain tissue, rodents, neurochemistry, tissue extracts, NMR spectroscopy, quantitative metabolite analysis, cerebral metabolism, metabolic profile
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
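As a toy illustration of the simplest end of these approaches (global thresholding followed by connected-component labeling), the sketch below segments a synthetic volume with scipy; real EM data require the far more involved, data-set-dependent strategies described above.

```python
# Hedged sketch of a simple semi-automated step: intensity thresholding plus
# connected-component labeling of a 3D volume. Not a full EM segmentation pipeline.
import numpy as np
from scipy import ndimage

volume = np.random.default_rng(2).normal(0, 0.5, (64, 64, 64))  # stand-in for tomogram data
volume[20:40, 20:40, 20:40] += 4.0                               # one bright "feature"

mask = volume > 2.0                                              # global intensity threshold
labels, n_objects = ndimage.label(mask)                          # connected components
sizes = ndimage.sum(mask, labels, index=range(1, n_objects + 1))
print(f"{n_objects} object(s); largest = {int(sizes.max())} voxels")
```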
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Transgenic Rodent Assay for Quantifying Male Germ Cell Mutant Frequency
Authors: Jason M. O'Brien, Marc A. Beal, John D. Gingerich, Lynda Soper, George R. Douglas, Carole L. Yauk, Francesco Marchetti.
Institutions: Environmental Health Centre.
De novo mutations arise mostly in the male germline and may contribute to adverse health outcomes in subsequent generations. Traditional methods for assessing the induction of germ cell mutations require the use of large numbers of animals, making them impractical. As such, germ cell mutagenicity is rarely assessed during chemical testing and risk assessment. Herein, we describe an in vivo male germ cell mutation assay using a transgenic rodent model that is based on a recently approved Organisation for Economic Co-operation and Development (OECD) test guideline. This method uses an in vitro positive selection assay to measure in vivo mutations induced in a transgenic λgt10 vector bearing a reporter gene directly in the germ cells of exposed males. We further describe how the detection of mutations in the transgene recovered from germ cells can be used to characterize the stage-specific sensitivity of the various spermatogenic cell types to mutagen exposure by controlling three experimental parameters: the duration of exposure (administration time), the time between exposure and sample collection (sampling time), and the cell population collected for analysis. Because a large number of germ cells can be assayed from a single male, this method has superior sensitivity compared with traditional methods, requires fewer animals and therefore much less time and resources.
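The assay's core readout reduces to a simple ratio, illustrated below with invented plaque counts: mutant frequency is the number of mutant plaques recovered under positive selection divided by the total plaque-forming units screened.

```python
# Illustrative mutant-frequency calculation (counts invented, not from the study).
def mutant_frequency(mutant_pfu, total_pfu):
    return mutant_pfu / total_pfu

control = mutant_frequency(mutant_pfu=14, total_pfu=500_000)
treated = mutant_frequency(mutant_pfu=96, total_pfu=480_000)
print(f"control: {control:.2e}, treated: {treated:.2e}, "
      f"fold increase: {treated / control:.1f}")
```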
Genetics, Issue 90, sperm, spermatogonia, male germ cells, spermatogenesis, de novo mutation, OECD TG 488, transgenic rodent mutation assay, N-ethyl-N-nitrosourea, genetic toxicology
Intravital Microscopy of the Inguinal Lymph Node
Authors: Stephanie L. Sellers, Geoffrey W. Payne.
Institutions: University of Northern British Columbia, University of Northern British Columbia.
Lymph nodes (LNs), located throughout the body, are an integral component of the immune system. They serve as a site for induction of the adaptive immune response and therefore the development of effector cells. As such, LNs are key to fighting invading pathogens and maintaining health. The choice of LN to study is dictated by accessibility and the desired model; the inguinal lymph node is well situated and easily supports studies of biologically relevant models of skin and genital mucosal infection. The inguinal LN, like all LNs, has an extensive microvascular network supplying it with blood. In general, this microvascular network includes the main feed arteriole of the LN that subsequently branches and feeds high endothelial venules (HEVs). HEVs are specialized for facilitating the trafficking of immune cells into the LN during both homeostasis and infection. How HEVs regulate trafficking into the LN under both of these circumstances is an area of intense exploration. The LN feed arteriole has direct upstream influence on the HEVs and is the main supply of nutrients and cell-rich blood into the LN. Furthermore, changes in the feed arteriole are implicated in facilitating induction of adaptive immune response. The LN microvasculature has obvious importance in maintaining an optimal blood supply to the LN and regulating immune cell influx into the LN, which are crucial elements in proper LN function and subsequently immune response. The ability to study the LN microvasculature in vivo is key to elucidating how the immune system and the microvasculature interact and influence one another within the LN. Here, we present a method for in vivo imaging of the inguinal lymph node. We focus on imaging of the microvasculature of the LN, paying particular attention to methods that ensure the study of healthy vessels, the ability to maintain imaging of viable vessels over a number of hours, and quantification of vessel magnitude. Methods for perfusion of the microvasculature with vasoactive drugs as well as the potential to trace and quantify cellular traffic are also presented. Intravital microscopy of the inguinal LN allows direct evaluation of microvascular functionality and real-time observation of the direct interface between immune cells, the LN, and the microcirculation. This technique has the potential to be combined with many immunological techniques and fluorescent cell labelling, and can be adapted to study the vasculature of other LNs.
Immunology, Issue 50, Intravital microscopy, lymph node, arteriole, vasculature, cellular trafficking, immune response
Modeling Neural Immune Signaling of Episodic and Chronic Migraine Using Spreading Depression In Vitro
Authors: Aya D. Pusic, Yelena Y. Grinberg, Heidi M. Mitchell, Richard P. Kraig.
Institutions: The University of Chicago Medical Center, The University of Chicago Medical Center.
Migraine and its transformation to chronic migraine are healthcare burdens in need of improved treatment options. We seek to define how neural immune signaling modulates the susceptibility to migraine, modeled in vitro using spreading depression (SD), as a means to develop novel therapeutic targets for episodic and chronic migraine. SD is the likely cause of migraine aura and migraine pain. It is a paroxysmal loss of neuronal function triggered by initially increased neuronal activity, which slowly propagates within susceptible brain regions. Normal brain function is exquisitely sensitive to, and relies on, coincident low-level immune signaling. Thus, neural immune signaling likely affects electrical activity of SD, and therefore migraine. Pain perception studies of SD in whole animals are fraught with difficulties, but whole animals are well suited to examine systems biology aspects of migraine since SD activates trigeminal nociceptive pathways. However, whole animal studies alone cannot be used to decipher the cellular and neural circuit mechanisms of SD. Instead, in vitro preparations where environmental conditions can be controlled are necessary. Here, it is important to recognize limitations of acute slices and distinct advantages of hippocampal slice cultures. Acute brain slices cannot reveal subtle changes in immune signaling since preparing the slices alone triggers: pro-inflammatory changes that last days, epileptiform behavior due to high levels of oxygen tension needed to vitalize the slices, and irreversible cell injury at anoxic slice centers. In contrast, we examine immune signaling in mature hippocampal slice cultures since the cultures closely parallel their in vivo counterpart with mature trisynaptic function; show quiescent astrocytes, microglia, and cytokine levels; and SD is easily induced in an unanesthetized preparation. Furthermore, the slices are long-lived and SD can be induced on consecutive days without injury, making this preparation the sole means to-date capable of modeling the neuroimmune consequences of chronic SD, and thus perhaps chronic migraine. We use electrophysiological techniques and non-invasive imaging to measure neuronal cell and circuit functions coincident with SD. Neural immune gene expression variables are measured with qPCR screening, qPCR arrays, and, importantly, use of cDNA preamplification for detection of ultra-low level targets such as interferon-gamma using whole, regional, or specific cell enhanced (via laser dissection microscopy) sampling. Cytokine cascade signaling is further assessed with multiplexed phosphoprotein related targets with gene expression and phosphoprotein changes confirmed via cell-specific immunostaining. Pharmacological and siRNA strategies are used to mimic and modulate SD immune signaling.
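For the qPCR screening step, relative expression is commonly summarized with the 2^-ΔΔCt method; the sketch below shows that calculation with invented Ct values and an assumed housekeeping reference, and is not drawn from the published data.

```python
# Sketch of relative gene-expression analysis using the 2^-ddCt method
# (Ct values invented; genes are examples, not the study's measurements).
def relative_expression(ct_target_treated, ct_ref_treated,
                        ct_target_control, ct_ref_control):
    d_ct_treated = ct_target_treated - ct_ref_treated
    d_ct_control = ct_target_control - ct_ref_control
    return 2 ** -(d_ct_treated - d_ct_control)

# e.g. interferon-gamma after spreading depression vs. sham, normalized to a housekeeping gene
print(relative_expression(ct_target_treated=27.1, ct_ref_treated=18.0,
                          ct_target_control=29.5, ct_ref_control=18.2))  # ~4.6-fold
```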
Neuroscience, Issue 52, innate immunity, hormesis, microglia, T-cells, hippocampus, slice culture, gene expression, laser dissection microscopy, real-time qPCR, interferon-gamma
Creating Objects and Object Categories for Studying Perception and Perceptual Learning
Authors: Karin Hauffen, Eugene Bart, Mark Brady, Daniel Kersten, Jay Hegdé.
Institutions: Georgia Health Sciences University, Georgia Health Sciences University, Georgia Health Sciences University, Palo Alto Research Center, Palo Alto Research Center, University of Minnesota .
In order to quantitatively study object perception, be it perception by biological systems or by machines, one needs to create objects and object categories with precisely definable, preferably naturalistic, properties1. Furthermore, for studies on perceptual learning, it is useful to create novel objects and object categories (or object classes) with such properties2. Many innovative and useful methods currently exist for creating novel objects and object categories3-6 (also see refs. 7,8). However, generally speaking, the existing methods have three broad types of shortcomings. First, shape variations are generally imposed by the experimenter5,9,10, and may therefore be different from the variability in natural categories, and optimized for a particular recognition algorithm. It would be desirable to have the variations arise independently of the externally imposed constraints. Second, the existing methods have difficulty capturing the shape complexity of natural objects11-13. If the goal is to study natural object perception, it is desirable for objects and object categories to be naturalistic, so as to avoid possible confounds and special cases. Third, it is generally hard to quantitatively measure the available information in the stimuli created by conventional methods. It would be desirable to create objects and object categories where the available information can be precisely measured and, where necessary, systematically manipulated (or 'tuned'). This allows one to formulate the underlying object recognition tasks in quantitative terms. Here we describe a set of algorithms, or methods, that meet all three of the above criteria. Virtual morphogenesis (VM) creates novel, naturalistic virtual 3-D objects called 'digital embryos' by simulating the biological process of embryogenesis14. Virtual phylogenesis (VP) creates novel, naturalistic object categories by simulating the evolutionary process of natural selection9,12,13. Objects and object categories created by these simulations can be further manipulated by various morphing methods to generate systematic variations of shape characteristics15,16. The VP and morphing methods can also be applied, in principle, to novel virtual objects other than digital embryos, or to virtual versions of real-world objects9,13. Virtual objects created in this fashion can be rendered as visual images using a conventional graphical toolkit, with desired manipulations of surface texture, illumination, size, viewpoint and background. The virtual objects can also be 'printed' as haptic objects using a conventional 3-D prototyper. We also describe some implementations of these computational algorithms to help illustrate the potential utility of the algorithms. It is important to distinguish the algorithms from their implementations. The implementations are demonstrations offered solely as a 'proof of principle' of the underlying algorithms. It is important to note that, in general, an implementation of a computational algorithm often has limitations that the algorithm itself does not have. Together, these methods represent a set of powerful and flexible tools for studying object recognition and perceptual learning by biological and computational systems alike. With appropriate extensions, these methods may also prove useful in the study of morphogenesis and phylogenesis.
Neuroscience, Issue 69, machine learning, brain, classification, category learning, cross-modal perception, 3-D prototyping, inference
Determining the Contribution of the Energy Systems During Exercise
Authors: Guilherme G. Artioli, Rômulo C. Bertuzzi, Hamilton Roschel, Sandro H. Mendes, Antonio H. Lancha Jr., Emerson Franchini.
Institutions: University of Sao Paulo, University of Sao Paulo, University of Sao Paulo, University of Sao Paulo.
One of the most important aspects of the metabolic demand is the relative contribution of the energy systems to the total energy required for a given physical activity. Although some sports are relatively easy to be reproduced in a laboratory (e.g., running and cycling), a number of sports are much more difficult to be reproduced and studied in controlled situations. This method presents how to assess the differential contribution of the energy systems in sports that are difficult to mimic in controlled laboratory conditions. The concepts shown here can be adapted to virtually any sport. The following physiologic variables will be needed: rest oxygen consumption, exercise oxygen consumption, post-exercise oxygen consumption, rest plasma lactate concentration and post-exercise plasma peak lactate. To calculate the contribution of the aerobic metabolism, you will need the oxygen consumption at rest and during the exercise. By using the trapezoidal method, calculate the area under the curve of oxygen consumption during exercise, subtracting the area corresponding to the rest oxygen consumption. To calculate the contribution of the alactic anaerobic metabolism, the post-exercise oxygen consumption curve has to be adjusted to a mono or a bi-exponential model (chosen by the one that best fits). Then, use the terms of the fitted equation to calculate anaerobic alactic metabolism, as follows: ATP-CP metabolism = A1 (mL . s-1) x t1 (s). Finally, to calculate the contribution of the lactic anaerobic system, multiply peak plasma lactate by 3 and by the athlete’s body mass (the result in mL is then converted to L and into kJ). The method can be used for both continuous and intermittent exercise. This is a very interesting approach as it can be adapted to exercises and sports that are difficult to be mimicked in controlled environments. Also, this is the only available method capable of distinguishing the contribution of three different energy systems. Thus, the method allows the study of sports with great similarity to real situations, providing desirable ecological validity to the study.
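The sketch below turns the three calculations into code with synthetic data and commonly used constants (3 ml O2 per kg body mass per mM of lactate, and roughly 20.9 kJ per liter of O2); variable names such as A1 and t1 follow the text, but all numbers are illustrative, not measured values.

```python
# Worked numerical sketch of the three energy-system calculations (synthetic data).
import numpy as np
from scipy.optimize import curve_fit
from scipy.integrate import trapezoid

# --- aerobic: area under exercise VO2 above resting VO2 (trapezoidal method) ---
t_ex = np.arange(0, 181, 10)                              # s
vo2_ex = 0.3 + 2.2 * (1 - np.exp(-t_ex / 30))             # L/min, toy on-kinetics
vo2_rest = 0.3                                            # L/min
aerobic_L = trapezoid((vo2_ex - vo2_rest) / 60.0, t_ex)   # liters of O2

# --- alactic: A1 * t1 from a mono-exponential fit of post-exercise VO2 ---
t_rec = np.arange(0, 301, 15)
epoc = 0.3 + 1.6 * np.exp(-t_rec / 45)                    # L/min, toy recovery data
(a1, t1), _ = curve_fit(lambda t, a, tau: vo2_rest + a * np.exp(-t / tau),
                        t_rec, epoc, p0=[1.0, 30.0])
alactic_L = (a1 / 60.0) * t1                              # liters of O2

# --- lactic: 3 ml O2 per kg body mass per mM of peak lactate (as in the text;
#     some implementations subtract the resting lactate value) ---
body_mass_kg, lactate_peak = 70.0, 9.0
lactic_L = 3.0 * lactate_peak * body_mass_kg / 1000.0     # liters of O2

kj_per_L_O2 = 20.9                                        # approximate energetic equivalent
for name, liters in [("aerobic", aerobic_L), ("alactic", alactic_L), ("lactic", lactic_L)]:
    print(f"{name}: {liters:.2f} L O2  ~ {liters * kj_per_L_O2:.0f} kJ")
```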
Physiology, Issue 61, aerobic metabolism, anaerobic alactic metabolism, anaerobic lactic metabolism, exercise, athletes, mathematical model
Analytical Techniques for Assaying Nitric Oxide Bioactivity
Authors: Hong Jiang, Deepa Parthasarathy, Ashley C. Torregrossa, Asad Mian, Nathan S. Bryan.
Institutions: University of Texas Health Science Center at Houston , Baylor College of Medicine .
Nitric oxide (NO) is a diatomic free radical that is extremely short lived in biological systems (less than 1 second in circulating blood)1. NO may be considered one of the most important signaling molecules produced in our body, regulating essential functions including but not limited to regulation of blood pressure, immune response and neural communication. Therefore, its accurate detection and quantification in biological matrices is critical to understanding the role of NO in health and disease. With such a short physiological half-life of NO, alternative strategies for the detection of reaction products of NO biochemistry have been developed. The quantification of relevant NO metabolites in multiple biological compartments provides valuable information with regard to in vivo NO production, bioavailability and metabolism. Simply sampling a single compartment such as blood or plasma may not always provide an accurate assessment of whole body NO status, particularly in tissues. The ability to compare blood with select tissues in experimental animals will help bridge the gap between basic science and clinical medicine as far as diagnostic and prognostic utility of NO biomarkers in health and disease. Therefore, extrapolation of plasma or blood NO status to specific tissues of interest is no longer a valid approach. As a result, methods continue to be developed and validated which allow the detection and quantification of NO and NO-related products/metabolites in multiple compartments of experimental animals in vivo. The established paradigm of NO biochemistry from production by NO synthases to activation of soluble guanylyl cyclase (sGC) to eventual oxidation to nitrite (NO2-) and nitrate (NO3-) may only represent part of NO's effects in vivo. The interaction of NO and NO-derived metabolites with protein thiols, secondary amines, and metals to form S-nitrosothiols (RSNOs), N-nitrosamines (RNNOs), and nitrosyl-heme respectively represent cGMP-independent effects of NO and are likely just as important physiologically as activation of sGC by NO. A true understanding of NO in physiology is derived from in vivo experiments sampling multiple compartments simultaneously. Nitric oxide (NO) methodology is a complex and often confusing science and the focus of many debates and discussion concerning NO biochemistry. The elucidation of new mechanisms and signaling pathways involving NO hinges on our ability to specifically, selectively and sensitively detect and quantify NO and all relevant NO products and metabolites in complex biological matrices. Here, we present a method for the rapid and sensitive analysis of nitrite and nitrate by HPLC as well as detection of free NO in biological samples using in vitro ozone-based chemiluminescence with chemical derivatization to determine the molecular source of NO as well as ex vivo with organ bath myography.
Medicine, Issue 64, Molecular Biology, Nitric oxide, nitrite, nitrate, endothelium derived relaxing factor, HPLC, chemiluminescence
Use of Artificial Sputum Medium to Test Antibiotic Efficacy Against Pseudomonas aeruginosa in Conditions More Relevant to the Cystic Fibrosis Lung
Authors: Sebastian Kirchner, Joanne L Fothergill, Elli A. Wright, Chloe E. James, Eilidh Mowat, Craig Winstanley.
Institutions: University of Liverpool , University of Liverpool .
There is growing concern about the relevance of in vitro antimicrobial susceptibility tests when applied to isolates of P. aeruginosa from cystic fibrosis (CF) patients. Existing methods rely on single or a few isolates grown aerobically and planktonically. Predetermined cut-offs are used to define whether the bacteria are sensitive or resistant to any given antibiotic1. However, during chronic lung infections in CF, P. aeruginosa populations exist in biofilms and there is evidence that the environment is largely microaerophilic2. The stark difference in conditions between bacteria in the lung and those during diagnostic testing has called into question the reliability and even relevance of these tests3. Artificial sputum medium (ASM) is a culture medium containing the components of CF patient sputum, including amino acids, mucin and free DNA. P. aeruginosa growth in ASM mimics growth during CF infections, with the formation of self-aggregating biofilm structures and population divergence4,5,6. The aim of this study was to develop a microtitre-plate assay to study antimicrobial susceptibility of P. aeruginosa based on growth in ASM, which is applicable to both microaerophilic and aerobic conditions. An ASM assay was developed in a microtitre plate format. P. aeruginosa biofilms were allowed to develop for 3 days prior to incubation with antimicrobial agents at different concentrations for 24 hours. After biofilm disruption, cell viability was measured by staining with resazurin. This assay was used to ascertain the sessile cell minimum inhibitory concentration (SMIC) of tobramycin for 15 different P. aeruginosa isolates under aerobic and microaerophilic conditions and SMIC values were compared to those obtained with standard broth growth. Whilst there was some evidence for increased MIC values for isolates grown in ASM when compared to their planktonic counterparts, the biggest differences were found with bacteria tested in microaerophilic conditions, which showed greatly increased resistance (up to >128-fold) to tobramycin in the ASM system when compared to assays carried out under aerobic conditions. The lack of association between current susceptibility testing methods and clinical outcome has cast doubt on the validity of current methods3. Several in vitro models have been used previously to study P. aeruginosa biofilms7, 8. However, these methods rely on surface-attached biofilms, whereas the ASM biofilms resemble those observed in the CF lung9. In addition, reduced oxygen concentration in the mucus has been shown to alter the behavior of P. aeruginosa2 and affect antibiotic susceptibility10. Therefore, using ASM under microaerophilic conditions may provide a more realistic environment in which to study antimicrobial susceptibility.
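A hedged sketch of how an SMIC might be read out from the resazurin viability data: the lowest tested concentration at which biofilm viability falls below a chosen fraction of the untreated control. The cutoff, concentrations, and viability values are examples, not the study's criteria.

```python
# Hedged sketch: reading an SMIC from viability-vs-concentration data
# (cutoff and all values are illustrative assumptions).
def smic(viability_by_conc, cutoff=0.1):
    """viability_by_conc: {tobramycin mg/L: fraction of untreated-control signal}."""
    for conc in sorted(viability_by_conc):
        if viability_by_conc[conc] <= cutoff:
            return conc
    return None  # not reached within the tested range

aerobic         = {1: 0.9, 2: 0.7, 4: 0.3, 8: 0.05, 16: 0.02}
microaerophilic = {1: 1.0, 2: 0.95, 4: 0.9, 8: 0.8, 16: 0.6}
print("SMIC aerobic:", smic(aerobic), "mg/L;",
      "SMIC microaerophilic:", smic(microaerophilic), "mg/L (None = above tested range)")
```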
Immunology, Issue 64, Microbiology, Pseudomonas aeruginosa, antimicrobial susceptibility, artificial sputum media, lung infection, cystic fibrosis, diagnostics, plankton
Detection of Bacteria Using Fluorogenic DNAzymes
Authors: Sergio D. Aguirre, M. Monsur Ali, Pushpinder Kanda, Yingfu Li.
Institutions: McMaster University , McMaster University .
Outbreaks linked to food-borne and hospital-acquired pathogens account for millions of deaths and hospitalizations as well as colossal economic losses each and every year. Prevention of such outbreaks and minimization of the impact of an ongoing epidemic place an ever-increasing demand for analytical methods that can accurately identify culprit pathogens at the earliest stage. Although there is a large array of effective methods for pathogen detection, none of them can satisfy all the following five premier requirements embodied for an ideal detection method: high specificity (detecting only the bacterium of interest), high sensitivity (capable of detecting as low as a single live bacterial cell), short time-to-results (minutes to hours), great operational simplicity (no need for lengthy sampling procedures and the use of specialized equipment), and cost effectiveness. For example, classical microbiological methods are highly specific but require a long time (days to weeks) to acquire a definitive result.1 PCR- and antibody-based techniques offer shorter waiting times (hours to days), but they require the use of expensive reagents and/or sophisticated equipment.2-4 Consequently, there is still a great demand for scientific research towards developing innovative bacterial detection methods that offer improved characteristics in one or more of the aforementioned requirements. Our laboratory is interested in examining the potential of DNAzymes as a novel class of molecular probes for biosensing applications including bacterial detection.5 DNAzymes (also known as deoxyribozymes or DNA enzymes) are man-made single-stranded DNA molecules with the capability of catalyzing chemical reactions.6-8 These molecules can be isolated from a vast random-sequence DNA pool (which contains as many as 10^16 individual sequences) by a process known as "in vitro selection" or "SELEX" (systematic evolution of ligands by exponential enrichment).9-16 These special DNA molecules have been widely examined in recent years as molecular tools for biosensing applications.6-8 Our laboratory has established in vitro selection procedures for isolating RNA-cleaving fluorescent DNAzymes (RFDs; Fig. 1) and investigated the use of RFDs as analytical tools.17-29 RFDs catalyze the cleavage of a DNA-RNA chimeric substrate at a single ribonucleotide junction (R) that is flanked by a fluorophore (F) and a quencher (Q). The close proximity of F and Q renders the uncleaved substrate minimally fluorescent. However, the cleavage event leads to the separation of F and Q, which is accompanied by a significant increase in fluorescence intensity. More recently, we developed a method of isolating RFDs for bacterial detection.5 These special RFDs were isolated to "light up" in the presence of the crude extracellular mixture (CEM) left behind by a specific type of bacteria in their environment or in the media in which they are cultured (Fig. 1). The use of a crude mixture circumvents the tedious process of purifying and identifying a suitable target from the microbe of interest for biosensor development (which could take months or years to complete). The use of extracellular targets means the assaying procedure is simple because there is no need for steps to obtain intracellular targets. Using the above approach, we derived an RFD that cleaves its substrate (FS1; Fig. 2A) only in the presence of the CEM produced by E. coli (CEM-EC).5 This E. coli-sensing RFD, named RFD-EC1 (Fig. 2A), was found to be strictly responsive to CEM-EC but nonresponsive to CEMs from a host of other bacteria (Fig. 3). Here we present the key experimental procedures for setting up E. coli detection assays using RFD-EC1 and representative results.
Biochemistry, Issue 63, Immunology, Fluorogenic DNAzymes, E. coli, biosensor, bacterial detection
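The RFD-EC1 assay reports E. coli through the rise in fluorescence that accompanies substrate cleavage. The article does not prescribe an analysis script; the following Python sketch only illustrates how a fold-increase over a buffer-only control might be computed and thresholded. The readings, the 3-fold cutoff, and the function names are hypothetical and are not part of the published protocol.

# Hypothetical analysis sketch for an RFD-EC1 fluorescence assay.
# Readings (relative fluorescence units) and the 3-fold cutoff are illustrative assumptions.

def fold_increase(signal_rfu, background_rfu):
    """Fluorescence of a CEM-treated reaction relative to a buffer-only control."""
    return signal_rfu / background_rfu

def classify_sample(signal_rfu, background_rfu, cutoff=3.0):
    """Call a sample positive if cleavage raises fluorescence past the cutoff."""
    return "E. coli positive" if fold_increase(signal_rfu, background_rfu) >= cutoff else "negative"

# Uncleaved substrate stays quenched (F next to Q); cleavage after CEM-EC exposure releases F from Q.
background = 1000.0
for name, rfu in {"CEM-EC": 5200.0, "CEM from another bacterium": 1150.0}.items():
    print(name, classify_sample(rfu, background))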
Play Button
Measuring Oral Fatty Acid Thresholds, Fat Perception, Fatty Food Liking, and Papillae Density in Humans
Authors: Rivkeh Y. Haryono, Madeline A. Sprajcer, Russell S. J. Keast.
Institutions: Deakin University.
Emerging evidence from a number of laboratories indicates that humans can identify fatty acids in the oral cavity, presumably via fatty acid receptors housed on taste cells. Previous research has shown that an individual's oral sensitivity to fatty acid, specifically oleic acid (C18:1), is associated with body mass index (BMI), dietary fat consumption, and the ability to identify fat in foods. We have developed a reliable and reproducible method to assess oral chemoreception of fatty acids, using a milk and C18:1 emulsion together with an ascending forced-choice triangle procedure (see the sketch after this entry). In parallel, a food matrix has been developed to assess an individual's ability to perceive fat, in addition to a simple method to assess fatty food liking. As an added measure, tongue photography is used to assess papillae density, with higher density often being associated with increased taste sensitivity.
Neuroscience, Issue 88, taste, overweight and obesity, dietary fat, fatty acid, diet, fatty food liking, detection threshold
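The abstract refers to an ascending forced-choice triangle procedure but does not spell out the concentration series or the stopping rule. The Python sketch below simulates one common convention, taking the threshold as the lowest C18:1 concentration identified correctly in three consecutive triangle trials; the concentration values, the three-correct rule, and the simulated participant are assumptions made purely for illustration.

import random

# Ascending C18:1 concentration series in the milk emulsion (illustrative values only).
CONCENTRATIONS_MM = [0.02, 0.06, 1.0, 1.4, 2.0, 2.8, 3.8, 5.0, 6.4, 9.8, 12.0]

def triangle_trial(concentration, participant):
    """Present two control samples and one spiked sample; return True if the odd sample is picked."""
    odd_position = random.choice([0, 1, 2])
    return participant(concentration, odd_position) == odd_position

def ascending_threshold(participant, required_correct=3):
    """Step up the series; threshold = lowest concentration passed `required_correct` times in a row."""
    for conc in CONCENTRATIONS_MM:
        if all(triangle_trial(conc, participant) for _ in range(required_correct)):
            return conc
    return None  # no threshold found within the tested range

def simulated_participant(conc, odd_position):
    """Toy participant: detects the fatty acid above 2.0 mM, otherwise guesses at random."""
    return odd_position if conc > 2.0 else random.choice([0, 1, 2])

print("Estimated detection threshold (mM):", ascending_threshold(simulated_participant))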
Play Button
Designing and Implementing Nervous System Simulations on LEGO Robots
Authors: Daniel Blustein, Nikolai Rosenthal, Joseph Ayers.
Institutions: Northeastern University, Bremen University of Applied Sciences.
We present a method to use the commercially available LEGO Mindstorms NXT robotics platform to test systems-level neuroscience hypotheses. The first step of the method is to develop a nervous system simulation of specific reflexive behaviors of an appropriate model organism; here we use the American lobster (Homarus americanus). Exteroceptive reflexes mediated by decussating (crossing) neural connections can explain an animal's taxis towards or away from a stimulus, as described by Braitenberg, and are particularly well suited for investigation using the NXT platform1 (see the crossed-wiring sketch after this entry). The nervous system simulation is programmed using LabVIEW software on the LEGO Mindstorms platform. Once the nervous system is tuned properly, behavioral experiments are run on the robot and on the animal under identical environmental conditions. By controlling the sensory milieu experienced by the specimens, differences in behavioral outputs can be observed. These differences may point to specific deficiencies in the nervous system model and serve to inform the iteration of the model for the particular behavior under study. This method allows for the experimental manipulation of electronic nervous systems and serves as a way to explore neuroscience hypotheses, specifically regarding the neurophysiological basis of simple innate reflexive behaviors. The LEGO Mindstorms NXT kit provides an affordable and efficient platform on which to test preliminary biomimetic robot control schemes. The approach is also well suited for the high school classroom, serving as the foundation for a hands-on, inquiry-based biorobotics curriculum.
Neuroscience, Issue 75, Neurobiology, Bioengineering, Behavior, Mechanical Engineering, Computer Science, Marine Biology, Biomimetics, Marine Science, Neurosciences, Synthetic Biology, Robotics, robots, Modeling, models, Sensory Fusion, nervous system, Educational Tools, programming, software, lobster, Homarus americanus, animal model
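The decussating sensor-to-motor connections mentioned above are the core of a Braitenberg-style controller. The published simulation is written in LabVIEW on the NXT; the Python sketch below is not the authors' code but a minimal, language-neutral rendering of the same crossed-wiring idea, with an arbitrary gain value chosen for illustration.

# Braitenberg-style crossed wiring: each sensor drives the contralateral motor.
# With excitatory gains the robot turns toward the stimulus (positive taxis);
# flipping the sign of the gain produces avoidance. The gain value is an arbitrary assumption.

def crossed_controller(left_sensor, right_sensor, gain=1.0):
    """Decussating connections: left sensor -> right motor, right sensor -> left motor."""
    left_motor = gain * right_sensor
    right_motor = gain * left_sensor
    return left_motor, right_motor

# A stimulus off to the robot's left excites the left sensor more strongly,
# so the right motor spins faster and the robot steers toward the stimulus.
print(crossed_controller(left_sensor=0.8, right_sensor=0.3))  # -> (0.3, 0.8)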
Play Button
Freezing, Thawing, and Packaging Cells for Transport
Authors: Richard Ricardo, Katy Phelan.
Institutions: Molecular Pathology Laboratory Network, Inc.
Cultured mammalian cells are used extensively in cell biology studies. Preserving the structure, function, behavior, and biology of cells in culture requires a number of special skills. This video describes the basic skills required to freeze and store cells and how to recover frozen stocks.
Basic Protocols, Issue 17, Current Protocols Wiley, Freezing Cells, Cell Culture, Thawing Cells, Storage of Cells, Suspension Cells, Adherent Cells
Play Button
Trypsinizing and Subculturing Mammalian Cells
Authors: Richard Ricardo, Katy Phelan.
Institutions: Molecular Pathology Laboratory Network, Inc.
As cells reach confluency, they must be subcultured or passaged. Failure to subculture confluent cells results in a reduced mitotic index and eventually in cell death. The first step in subculturing is to detach cells from the surface of the primary culture vessel by trypsinization or mechanical means. The resultant cell suspension is then subdivided, or reseeded, into fresh cultures. Secondary cultures are checked for growth and fed periodically, and may be subsequently subcultured to produce tertiary cultures. The time between passages varies with the cell line and depends on its growth rate (a back-of-the-envelope timing estimate follows this entry).
Basic Protocols, Issue 16, Current Protocols Wiley, Cell Culture, Cell Passaging, Trypsinizing Cells, Adherent Cells, Suspension Cells
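The abstract notes only that passage timing depends on the growth rate. One simple way to anticipate the next passage, assuming roughly exponential growth between passages, is from the split ratio and the line's doubling time. The Python sketch below is a back-of-the-envelope illustration; the example split ratio and doubling time are assumptions, not recommendations for any particular cell line.

import math

def hours_to_confluence(split_ratio, doubling_time_h):
    """Time for a 1:split_ratio subculture to regain its pre-split density,
    assuming exponential growth N(t) = N0 * 2**(t / doubling_time)."""
    return doubling_time_h * math.log2(split_ratio)

# Example with illustrative values: a 1:8 split of a line that doubles every 24 h
print(hours_to_confluence(split_ratio=8, doubling_time_h=24) / 24, "days")  # -> 3.0 days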
Play Button
Counting and Determining the Viability of Cultured Cells
Authors: Richard Ricardo, Katy Phelan.
Institutions: Molecular Pathology Laboratory Network, Inc.
Determining the number of cells in culture is important for standardizing culture conditions and for performing accurate quantitation experiments. A hemacytometer is a thick glass slide with a central area designed as a counting chamber. A cell suspension is applied to a chamber of defined area and depth, and the cells are counted so that cell density can be calculated (see the calculation sketch after this entry).
Basic Protocols, Issue 16, Current Protocols Wiley, Cell Counting, Cell Culture, Trypan Blue, Cell Viability
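The density calculation implied above follows from the chamber geometry: each large corner square of a standard hemacytometer covers 1 mm x 1 mm under a 0.1 mm-deep chamber, i.e. 0.1 uL, so the average count per square times 10^4 (times any dilution factor) gives cells per mL. The Python sketch below also includes the trypan blue viability calculation suggested by the keywords; the counts themselves are hypothetical.

def cells_per_ml(counts_per_square, dilution_factor=1.0):
    """Standard hemacytometer: each large square holds 0.1 uL, so multiply the mean count by 1e4."""
    mean_count = sum(counts_per_square) / len(counts_per_square)
    return mean_count * dilution_factor * 1e4

def percent_viable(live, dead):
    """Trypan blue exclusion: unstained (live) cells as a percentage of all cells counted."""
    return 100.0 * live / (live + dead)

# Hypothetical counts from four corner squares of a 1:2 trypan blue dilution
live_counts = [48, 52, 50, 46]
print("Density:   %.2e cells/mL" % cells_per_ml(live_counts, dilution_factor=2))
print("Viability: %.1f%%" % percent_viable(live=196, dead=14))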
Play Button
Brain Slice Stimulation Using a Microfluidic Network and Standard Perfusion Chamber
Authors: Javeed Shaikh Mohammed, Hugo Caicedo, Christopher P. Fall, David T. Eddington.
Institutions: University of Illinois, Chicago.
We have demonstrated the fabrication of a two-level microfluidic device that can be easily integrated with existing electrophysiology setups. The two-level microfluidic device is fabricated using a two-step standard negative-resist lithography process.1 The first level contains microchannels with inlet and outlet ports at each end. The second level contains microscale circular holes located midway along the channel length and centered along the channel width. A passive pumping method is used to pump fluids from the inlet port to the outlet port.2 The microfluidic device is integrated with off-the-shelf perfusion chambers and allows seamless integration with the electrophysiology setup. The fluids introduced at the inlet ports flow through the microchannels towards the outlet ports and also escape through the circular openings on top of the microchannels into the bath of the perfusion chamber. Thus the bottom surface of a brain slice placed in the perfusion chamber bath, above the microfluidic device, can be exposed to different neurotransmitters. The microscale thickness of the device and the transparent materials used to make it [a glass coverslip and PDMS (polydimethylsiloxane)] allow microscopy of the brain slice. The microfluidic device allows spatial and temporal modulation of the chemical stimuli delivered to brain slice microenvironments (an order-of-magnitude flow estimate follows this entry).
Neuroscience, Issue 8, Biomedical Engineering, Microfluidics, Slice Recording, Soft Lithography, Electrophysiology, Neurotransmitter, Bioengineering
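Passive pumping works because the small inlet drop has a higher Laplace pressure than the larger drop at the outlet. The article gives no dimensions or flow rates; the Python sketch below only estimates the order of magnitude under two textbook approximations, treating each drop as a spherical cap (pressure excess ~ 2*gamma/R) and the channel as a shallow rectangular duct (hydraulic resistance ~ 12*mu*L / (w*h^3*(1 - 0.63*h/w))). Every geometric value in the example is an assumption.

# Order-of-magnitude estimate of passive-pumping flow in a rectangular microchannel.
# All geometry values below are illustrative assumptions, not taken from the article.

GAMMA = 0.072   # surface tension of water, N/m
MU = 1.0e-3     # viscosity of water, Pa*s

def laplace_pressure(radius_m, gamma=GAMMA):
    """Pressure excess behind a spherical droplet surface (Young-Laplace, ~2*gamma/R)."""
    return 2.0 * gamma / radius_m

def channel_resistance(length_m, width_m, height_m, mu=MU):
    """Approximate hydraulic resistance of a shallow rectangular channel (height < width)."""
    return 12.0 * mu * length_m / (width_m * height_m**3 * (1.0 - 0.63 * height_m / width_m))

def passive_pumping_flow(r_inlet_m, r_outlet_m, length_m, width_m, height_m):
    """Flow rate (m^3/s) driven by the pressure difference between the inlet and outlet drops."""
    dp = laplace_pressure(r_inlet_m) - laplace_pressure(r_outlet_m)
    return dp / channel_resistance(length_m, width_m, height_m)

# Example: 0.25 mm inlet drop, 2 mm outlet drop, 10 mm x 500 um x 100 um channel
q = passive_pumping_flow(0.25e-3, 2.0e-3, 10e-3, 500e-6, 100e-6)
print("Instantaneous flow: %.1f uL/min (decays as the inlet drop drains)" % (q * 1e9 * 60))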

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library relevant to the topic of a given abstract. In these situations the algorithm still displays the closest matches available, which can sometimes result in videos that are only loosely related to the abstract (an illustrative matching sketch follows).
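JoVE does not disclose how abstracts are matched to videos; purely as an illustration of the kind of text-similarity ranking described above, the Python sketch below scores candidate video descriptions against an abstract with TF-IDF cosine similarity using scikit-learn. The toy texts, the choice of TF-IDF, and the assumption that anything like this mirrors JoVE's pipeline are assumptions for illustration only.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def rank_videos(abstract, video_descriptions, top_n=10):
    """Rank candidate video descriptions by TF-IDF cosine similarity to an abstract."""
    corpus = [abstract] + list(video_descriptions)
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(corpus)
    scores = cosine_similarity(tfidf[0:1], tfidf[1:]).ravel()
    ranked = sorted(zip(video_descriptions, scores), key=lambda pair: pair[1], reverse=True)
    return ranked[:top_n]

# Toy example with made-up descriptions
videos = [
    "Counting and determining the viability of cultured cells with a hemacytometer",
    "Designing and implementing nervous system simulations on LEGO robots",
]
abstract = "Cell viability was assessed by trypan blue exclusion and hemacytometer counting."
for title, score in rank_videos(abstract, videos):
    print("%.2f  %s" % (score, title))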