The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g., carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for physical and chemical characteristics for biochar. Six biochars made from three different feedstocks and at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants, including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals and mercury, as well as nutrients (phosphorus, and nitrite, nitrate, and ammonium as nitrogen). The protocol also includes the biological testing procedures: earthworm avoidance and germination assays. Based on the quality assurance/quality control (QA/QC) results of blanks, duplicates, standards, and reference materials, all methods were determined adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there was little difference among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays.
Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements or remediation.
Accuracy in Dental Medicine, A New Way to Measure Trueness and Precision
Institutions: University of Zürich.
Reference scanners are used in dental medicine to verify many procedures. The main interest is to verify impression methods, as they serve as a base for dental restorations. The current limitation of many reference scanners is the lack of accuracy when scanning large objects such as full dental arches, or the limited ability to assess detailed tooth surfaces. A new reference scanner, based on the focus variation scanning technique, was evaluated with regard to highest local and general accuracy. A specific scanning protocol was tested to scan original tooth surfaces from dental impressions. Also, different model materials were verified. The results showed a high scanning accuracy of the reference scanner, with a mean deviation of 5.3 ± 1.1 µm for trueness and 1.6 ± 0.6 µm for precision in the case of full arch scans. Current dental impression methods showed much higher deviations (trueness: 20.4 ± 2.2 µm, precision: 12.5 ± 2.5 µm) than the internal scanning accuracy of the reference scanner. Smaller objects such as single tooth surfaces can be scanned with even higher accuracy, enabling the system to assess erosive and abrasive tooth surface loss. The reference scanner can be used to measure differences in many dental research fields. The different magnification levels, combined with a high local and general accuracy, can be used to assess changes of single teeth or restorations up to full arch changes.
Medicine, Issue 86, Laboratories, Dental, Calibration, Technology, Dental impression, Accuracy, Trueness, Precision, Full arch scan, Abrasion
Imaging Cleared Intact Biological Systems at a Cellular Level by 3DISCO
Institutions: Genentech, Inc., Genentech, Inc., Genentech, Inc.
Tissue clearing and subsequent imaging of transparent organs is a powerful method to analyze fluorescently labeled cells and molecules in 3D, in intact organs. Unlike traditional histological methods, where the tissue of interest is sectioned for fluorescent imaging, 3D imaging of cleared tissue allows examination of labeled cells and molecules in the entire specimen. To this end, optically opaque tissues should be rendered transparent by matching the refractive indices throughout the tissue. Subsequently, the tissue can be imaged at once using laser-scanning microscopes to obtain a complete high-resolution 3D image of the specimen. A growing list of tissue clearing protocols including 3DISCO, CLARITY, Sca/e, ClearT2, and SeeDB provide new ways for researchers to image their tissue of interest as a whole. Among them, 3DISCO is a highly reproducible and straightforward method, which can clear different types of tissues and can be utilized with various microscopy techniques. This protocol describes this straightforward procedure and presents its various applications. It also discusses the limitations and possible difficulties and how to overcome them.
Neuroscience, Issue 89, 3D imaging, tissue clearing, transparent tissue, intact organs, optical clearing, histology, laser scanning, light-sheet microscopy, fluorescent imaging, 3DISCO, ultramicroscope
Coordinate Mapping of Hyolaryngeal Mechanics in Swallowing
Institutions: Georgia Regents University, New York University, Georgia Regents University, Georgia Regents University.
Characterizing hyolaryngeal movement is important to dysphagia research. Prior methods require multiple measurements to obtain one kinematic measurement whereas coordinate mapping of hyolaryngeal mechanics using Modified Barium Swallow (MBS) uses one set of coordinates to calculate multiple variables of interest. For demonstration purposes, ten kinematic measurements were generated from one set of coordinates to determine differences in swallowing two different bolus types. Calculations of hyoid excursion against the vertebrae and mandible are correlated to determine the importance of axes of reference.
To demonstrate coordinate mapping methodology, 40 MBS studies were randomly selected from a dataset of healthy normal subjects with no known swallowing impairment. A 5 ml thin-liquid swallow and a 5 ml pudding swallow were measured from each subject. Nine coordinates, mapping the cranial base, mandible, vertebrae, and elements of the hyolaryngeal complex, were recorded at the frames of minimum and maximum hyolaryngeal excursion. Coordinates were mathematically converted into ten variables of hyolaryngeal mechanics.
Inter-rater reliability was evaluated by intraclass correlation coefficients (ICC). Two-tailed t-tests were used to evaluate differences in kinematics by bolus viscosity. Hyoid excursion measurements against different axes of reference were correlated. Inter-rater reliability among six raters for the 18 coordinates ranged from ICC = 0.90 - 0.97. A slate of ten kinematic measurements was compared by subject between the six raters. One outlier was rejected, and the mean of the remaining reliability scores was ICC = 0.91 (0.84 - 0.96, 95% CI). Two-tailed t-tests with Bonferroni corrections comparing ten kinematic variables (5 ml thin-liquid vs. 5 ml pudding swallows) showed statistically significant differences in hyoid excursion, superior laryngeal movement, and pharyngeal shortening (p < 0.005). Pearson correlations of hyoid excursion measurements from two different axes of reference were r = 0.62, r² = 0.38 (thin-liquid) and r = 0.52, r² = 0.27 (pudding).
Obtaining landmark coordinates is a reliable method to generate multiple kinematic variables from video fluoroscopic images useful in dysphagia research.
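As a concrete sketch of the coordinate-to-kinematics conversion described above, hyoid excursion might be computed by expressing the hyoid's position in a reference frame anchored to the vertebrae and taking the distance moved between the frames of minimum and maximum excursion. The landmark choices (C4/C2 corners) and the 2D local frame are illustrative assumptions, not the authors' exact implementation:

```python
import math

def hyoid_excursion(hyoid_min, hyoid_max, origin, axis_point):
    """Distance (same units as the coordinates) that the hyoid moves
    between the frames of minimum and maximum excursion, expressed in a
    local frame anchored at `origin` with its long axis toward
    `axis_point` (e.g. a C4 corner and a C2 corner of the vertebrae)."""
    # Unit vector along the chosen vertebral reference axis
    ax, ay = axis_point[0] - origin[0], axis_point[1] - origin[1]
    norm = math.hypot(ax, ay)
    ux, uy = ax / norm, ay / norm

    def to_local(point):
        # Rotate/translate a landmark into (perpendicular, along-axis) form
        dx, dy = point[0] - origin[0], point[1] - origin[1]
        return (dx * uy - dy * ux, dx * ux + dy * uy)

    (x0, y0), (x1, y1) = to_local(hyoid_min), to_local(hyoid_max)
    return math.hypot(x1 - x0, y1 - y0)

# Example: hyoid moves from (0, 0) to (3, 4) mm against a vertical axis
excursion = hyoid_excursion((0.0, 0.0), (3.0, 4.0), (0.0, 0.0), (0.0, 10.0))
```

With real MBS data, the same local-frame helper could be reused to express excursion against a mandibular axis instead, which is how measurements from different axes of reference can be compared.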
Medicine, Issue 87, videofluoroscopy, modified barium swallow studies, hyolaryngeal kinematics, deglutition, dysphagia, dysphagia research, hyolaryngeal complex
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Institutions: Emory University, Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated from acute time points from subtype C infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate for the study of subtype C sequences than previous recombination based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
Development of Amelogenin-chitosan Hydrogel for In Vitro Enamel Regrowth with a Dense Interface
Institutions: University of Southern California.
Biomimetic enamel reconstruction is a significant topic in material science and dentistry as a novel approach for the treatment of dental caries or erosion. Amelogenin has been proven to be a critical protein for controlling the organized growth of apatite crystals. In this paper, we present a detailed protocol for superficial enamel reconstruction by using a novel amelogenin-chitosan hydrogel. Compared to other conventional treatments, such as topical fluoride and mouthwash, this method not only has the potential to prevent the development of dental caries but also promotes significant and durable enamel restoration. The organized enamel-like microstructure regulated by amelogenin assemblies can significantly improve the mechanical properties of etched enamel, while the dense enamel-restoration interface formed by an in situ regrowth of apatite crystals can improve the effectiveness and durability of restorations. Furthermore, chitosan hydrogel is easy to use and can suppress bacterial infection, which is the major risk factor for the occurrence of dental caries. Therefore, this biocompatible and biodegradable amelogenin-chitosan hydrogel shows promise as a biomaterial for the prevention, restoration, and treatment of defective enamel.
Bioengineering, Issue 89, Enamel, Amelogenin, Chitosan hydrogel, Apatite, Biomimetic, Erosion, Superficial enamel reconstruction, Dense interface
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues change dramatically over development [3]. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
High-throughput Assay to Phenotype Salmonella enterica Typhimurium Association, Invasion, and Replication in Macrophages
Institutions: Texas A&M University, Texas A&M University System Health Science Center, University of California, Irvine, University of California, Davis.
Salmonella species are zoonotic pathogens and leading causes of foodborne illness in humans and livestock [1]. Understanding the mechanisms underlying Salmonella-host interactions is important to elucidate the molecular pathogenesis of Salmonella infection. The gentamicin protection assay to phenotype Salmonella association, invasion, and replication in phagocytic cells was adapted to allow high-throughput screening to define the roles of deletion mutants of Salmonella enterica serotype Typhimurium in host interactions using RAW 264.7 murine macrophages.
Under this protocol, the variance in measurements is significantly reduced compared to the standard protocol, because wild-type and multiple mutant strains can be tested in the same culture dish at the same time. The use of multichannel pipettes increases the throughput and enhances precision. Furthermore, concerns related to using fewer host cells per well in a 96-well culture dish were addressed. Here, the protocol of the modified in vitro Salmonella invasion assay using phagocytic cells was successfully employed to phenotype 38 individual Salmonella deletion mutants for association, invasion, and intracellular replication. The in vitro phenotypes are presented, some of which were subsequently confirmed to have in vivo phenotypes in an animal model. Thus, the modified, standardized assay to phenotype Salmonella association, invasion, and replication in macrophages with high-throughput capacity could be utilized more broadly to study bacterial-host interactions.
Infectious Diseases, Issue 90, Salmonella enterica Typhimurium, association, invasion, replication, phenotype, intracellular pathogens, macrophages
Behavioral and Locomotor Measurements Using an Open Field Activity Monitoring System for Skeletal Muscle Diseases
Institutions: Children's National Medical Center, George Washington University School of Medicine and Health Sciences.
The open field activity monitoring system comprehensively assesses locomotor and behavioral activity levels of mice. It is a useful tool for assessing locomotive impairment in animal models of neuromuscular disease and efficacy of therapeutic drugs that may improve locomotion and/or muscle function. The open field activity measurement provides a different measure than muscle strength, which is commonly assessed by grip strength measurements. It can also show how drugs may affect other body systems when used with additional outcome measures. In addition, measures such as total distance traveled mirror the 6 min walk test, a clinical trial outcome measure. However, open field activity monitoring is also associated with significant challenges: open field activity measurements vary according to animal strain, age, sex, and circadian rhythm. In addition, room temperature, humidity, lighting, noise, and even odor can affect assessment outcomes. Overall, this manuscript provides a well-tested and standardized open field activity SOP for preclinical trials in animal models of neuromuscular diseases. We discuss important considerations, typical results, and data analysis, and detail the strengths and weaknesses of open field testing. In addition, we provide recommendations for optimal study design when using open field activity in a preclinical trial.
Behavior, Issue 91, open field activity, functional testing, behavioral testing, skeletal muscle, congenital muscular dystrophy, muscular dystrophy
Determination of Protein-ligand Interactions Using Differential Scanning Fluorimetry
Institutions: University of Exeter.
A wide range of methods are currently available for determining the dissociation constant between a protein and interacting small molecules. However, most of these require access to specialist equipment, and often require a degree of expertise to effectively establish reliable experiments and analyze data. Differential scanning fluorimetry (DSF) is being increasingly used as a robust method for initial screening of proteins for interacting small molecules, either for identifying physiological partners or for hit discovery. This technique has the advantage that it requires only a PCR machine suitable for quantitative PCR, and so suitable instrumentation is available in most institutions; an excellent range of protocols are already available; and there are strong precedents in the literature for multiple uses of the method. Past work has proposed several means of calculating dissociation constants from DSF data, but these are mathematically demanding. Here, we demonstrate a method for estimating dissociation constants from a moderate amount of DSF experimental data. These data can typically be collected and analyzed within a single day. We demonstrate how different models can be used to fit data collected from simple binding events, and where cooperative binding or independent binding sites are present. Finally, we present an example of data analysis in a case where standard models do not apply. These methods are illustrated with data collected on commercially available control proteins, and two proteins from our research program. Overall, our method provides a straightforward way for researchers to rapidly gain further insight into protein-ligand interactions using DSF.
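As an illustration of the kind of estimation described above, a simple empirical single-site saturation model, Tm = Tm0 + ΔTm·[L]/(Kd + [L]), can be fitted to melting temperatures measured across a ligand dilution series. The model form and the grid-search fit below are a minimal sketch under our own assumptions; as the article notes, rigorous treatments of Tm-shift data are more mathematically demanding:

```python
import numpy as np

def fit_kd(conc, tm, kd_grid=None):
    """Fit Tm = Tm0 + dTm * [L] / (Kd + [L]) to DSF melting temperatures.
    For any fixed Kd the model is linear in (Tm0, dTm), so each trial Kd
    needs only a linear least-squares solve; the Kd giving the smallest
    residual wins."""
    conc, tm = np.asarray(conc, float), np.asarray(tm, float)
    if kd_grid is None:
        kd_grid = np.logspace(-1, 4, 2000)  # trial Kd values, same units as conc
    best = None
    for kd in kd_grid:
        s = conc / (kd + conc)              # fractional saturation at each [L]
        A = np.column_stack([np.ones_like(s), s])
        coef = np.linalg.lstsq(A, tm, rcond=None)[0]
        rss = float(np.sum((A @ coef - tm) ** 2))
        if best is None or rss < best[0]:
            best = (rss, coef[0], coef[1], kd)
    _, tm0, d_tm, kd = best
    return tm0, d_tm, kd

# Noise-free synthetic dilution series with a known Kd of 50 (e.g. in uM)
conc = np.array([0.0, 10.0, 25.0, 50.0, 100.0, 250.0, 500.0, 1000.0])
tm_obs = 48.0 + 6.0 * conc / (50.0 + conc)
tm0, d_tm, kd = fit_kd(conc, tm_obs)
```

The grid-plus-linear-solve structure avoids a full nonlinear optimizer, which keeps the fit robust to poor starting guesses at the cost of Kd resolution set by the grid spacing.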
Biophysics, Issue 91, differential scanning fluorimetry, dissociation constant, protein-ligand interactions, StepOne, cooperativity, WcbI.
Confocal Time Lapse Imaging as an Efficient Method for the Cytocompatibility Evaluation of Dental Composites
Institutions: UMR CNRS 5615, Université Lyon1, Hospices Civils de Lyon, APHP, Hôpital Rothschild.
It is generally accepted that in vitro cell-material interaction is a useful criterion in the evaluation of dental material biocompatibility. The objective of this study was to use 3D CLSM time lapse confocal imaging to assess the in vitro biocompatibility of dental composites. This method provides an accurate and sensitive indication of the viable cell rate in contact with dental composite extracts. The ELS extra low shrinkage, a dental composite used for direct restoration, was taken as an example. In vitro assessment was performed on cultured primary human gingival fibroblast cells using Live/Dead staining. Images were obtained with the FV10i confocal biological inverted system and analyzed with the FV10-ASW 3.1 software. Image analysis showed very slight cytotoxicity in the presence of the tested composite after 5 hours of time lapse. A slight decrease of cell viability was shown in contact with the tested composite extracts compared to control cells. The findings highlight the use of 3D CLSM time lapse imaging as a sensitive method to qualitatively and quantitatively evaluate the biocompatibility behavior of dental composites.
Medicine, Issue 93, In vitro biocompatibility, dental composites, Live/Dead staining, 3D imaging, Confocal Microscopy, Time lapse imaging
Measurement of Greenhouse Gas Flux from Agricultural Soils Using Static Chambers
Institutions: University of Wisconsin-Madison, University of Wisconsin-Madison, University of Wisconsin-Madison, University of Wisconsin-Madison, USDA-ARS Dairy Forage Research Center, USDA-ARS Pasture Systems Watershed Management Research Unit.
Measurement of greenhouse gas (GHG) fluxes between the soil and the atmosphere, in both managed and unmanaged ecosystems, is critical to understanding the biogeochemical drivers of climate change and to the development and evaluation of GHG mitigation strategies based on modulation of landscape management practices. The static chamber-based method described here is based on trapping gases emitted from the soil surface within a chamber and collecting samples from the chamber headspace at regular intervals for analysis by gas chromatography. Change in gas concentration over time is used to calculate flux. This method can be utilized to measure landscape-based flux of carbon dioxide, nitrous oxide, and methane, and to estimate differences between treatments or explore system dynamics over seasons or years. Infrastructure requirements are modest, but a comprehensive experimental design is essential. This method is easily deployed in the field, conforms to established guidelines, and produces data suitable to large-scale GHG emissions studies.
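The flux computation described above (change in headspace concentration over time, scaled by chamber geometry) can be sketched as follows. The chamber dimensions, sampling schedule, and strictly linear fit are illustrative assumptions, not values or choices taken from the article:

```python
import numpy as np

# Hypothetical chamber geometry -- not values from the article
CHAMBER_VOLUME_L = 10.0   # trapped headspace volume (L)
CHAMBER_AREA_M2 = 0.05    # soil surface enclosed (m^2)

def linear_flux(minutes, ppm, mol_mass_g, temp_k=293.15, pressure_atm=1.0):
    """Gas flux in mg m^-2 hr^-1 from headspace samples taken at regular
    intervals: fit dC/dt by least squares, then convert the ppm rate to a
    mass rate using the ideal gas law (n = PV / RT)."""
    slope_ppm_per_min = np.polyfit(minutes, ppm, 1)[0]
    r = 0.08206  # gas constant, L atm mol^-1 K^-1
    mol_headspace = pressure_atm * CHAMBER_VOLUME_L / (r * temp_k)
    # ppm is a mole fraction x 1e6; convert the rate to mg of gas per hour
    mg_per_hr = slope_ppm_per_min * 1e-6 * mol_headspace * mol_mass_g * 1e3 * 60
    return mg_per_hr / CHAMBER_AREA_M2

# Example: CO2 (44 g/mol) rising linearly from 400 to 430 ppm over 30 min
flux = linear_flux([0, 10, 20, 30], [400, 410, 420, 430], mol_mass_g=44.0)
```

In practice, chamber protocols also check the linearity of the concentration series before accepting the fitted slope, since chamber saturation can flatten the curve late in the deployment.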
Environmental Sciences, Issue 90, greenhouse gas, trace gas, gas flux, static chamber, soil, field, agriculture, climate
Using Informational Connectivity to Measure the Synchronous Emergence of fMRI Multi-voxel Information Across Time
Institutions: University of Pennsylvania.
It is now appreciated that condition-relevant information can be present within distributed patterns of functional magnetic resonance imaging (fMRI) brain activity, even for conditions with similar levels of univariate activation. Multi-voxel pattern (MVP) analysis has been used to decode this information with great success. FMRI investigators also often seek to understand how brain regions interact in interconnected networks, and use functional connectivity (FC) to identify regions that have correlated responses over time. Just as univariate analyses can be insensitive to information in MVPs, FC may not fully characterize the brain networks that process conditions with characteristic MVP signatures. The method described here, informational connectivity (IC), can identify regions with correlated changes in MVP-discriminability across time, revealing connectivity that is not accessible to FC. The method can be exploratory, using searchlights to identify seed-connected areas, or planned, between pre-selected regions of interest. The results can elucidate networks of regions that process MVP-related conditions, can break down MVPA searchlight maps into separate networks, or can be compared across tasks and patient groups.
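A minimal sketch of the idea: per-timepoint pattern discriminability is estimated in each region (here with a simple prototype-correlation stand-in for classifier evidence), and IC is the correlation of those discriminability time courses, in contrast to FC's correlation of mean activation. The function names and the discriminability measure are our assumptions, not the authors' implementation:

```python
import numpy as np

def discriminability(patterns, prototype_match, prototype_other):
    """Per-timepoint MVP discriminability for one region: how much more
    each volume's multi-voxel pattern correlates with its own condition's
    prototype than with the other condition's (a simple stand-in for
    classifier evidence)."""
    def corr(a, b):
        return np.corrcoef(a, b)[0, 1]
    return np.array([corr(p, prototype_match) - corr(p, prototype_other)
                     for p in patterns])

def informational_connectivity(disc_a, disc_b):
    """IC between two regions: correlation of their discriminability time
    courses (contrast with FC, which correlates mean activation)."""
    return np.corrcoef(disc_a, disc_b)[0, 1]

# Toy example: a pattern identical to its own prototype scores +2
evidence = discriminability([[1, 0, 1, 0]], [1, 0, 1, 0], [0, 1, 0, 1])
# Two regions whose discriminability rises and falls together -> IC = 1
ic = informational_connectivity([1.0, 2.0, 3.0, 4.0], [2.0, 4.0, 6.0, 8.0])
```

Two regions can show high IC while their mean time courses are uncorrelated, which is exactly the case FC misses.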
Neuroscience, Issue 89, fMRI, MVPA, connectivity, informational connectivity, functional connectivity, networks, multi-voxel pattern analysis, decoding, classification, method, multivariate
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Shrinkage of Dental Composite in Simulated Cavity Measured with Digital Image Correlation
Institutions: University of Minnesota.
Polymerization shrinkage of dental resin composites can lead to restoration debonding or cracked tooth tissues in composite-restored teeth. In order to understand where and how shrinkage strain and stress develop in such restored teeth, Digital Image Correlation (DIC) was used to provide a comprehensive view of the displacement and strain distributions within model restorations that had undergone polymerization shrinkage.
Specimens with model cavities were made of cylindrical glass rods with both diameter and length being 10 mm. The dimensions of the mesial-occlusal-distal (MOD) cavity prepared in each specimen measured 3 mm and 2 mm in width and depth, respectively. After filling the cavity with resin composite, the surface under observation was sprayed with first a thin layer of white paint and then fine black charcoal powder to create high-contrast speckles. Pictures of that surface were then taken before curing and 5 min after. Finally, the two pictures were correlated using DIC software to calculate the displacement and strain distributions.
The resin composite shrank vertically towards the bottom of the cavity, with the top center portion of the restoration having the largest downward displacement. At the same time, it shrank horizontally towards its vertical midline. Shrinkage of the composite stretched the material in the vicinity of the “tooth-restoration” interface, resulting in cuspal deflections and high tensile strains around the restoration. Material close to the cavity walls or floor had direct strains mostly in the directions perpendicular to the interfaces. Summation of the two direct strain components showed a relatively uniform distribution around the restoration, and its magnitude was approximately equal to the volumetric shrinkage strain of the material.
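The strain computation described above can be sketched as follows: direct strains are spatial derivatives of the DIC displacement fields, and their sum approximates the in-plane part of the volumetric shrinkage strain. The grid spacing, field names, and small-strain assumption are illustrative, not taken from the article's DIC software:

```python
import numpy as np

def direct_strains(u, v, spacing_mm):
    """Small-strain estimates from DIC displacement fields on a regular
    grid: u and v hold the x- and y-displacements (mm) at each grid
    point. Returns (exx, eyy); their sum approximates the in-plane part
    of the volumetric shrinkage strain."""
    dv_dy = np.gradient(v, spacing_mm)[0]   # rows run along y
    du_dx = np.gradient(u, spacing_mm)[1]   # columns run along x
    return du_dx, dv_dy

# Uniform 1% contraction toward the origin, sampled on a 0.5 mm grid
xs = np.arange(10) * 0.5
X, Y = np.meshgrid(xs, xs)
exx, eyy = direct_strains(-0.01 * X, -0.01 * Y, 0.5)
```

For a uniform contraction each direct strain is -0.01 everywhere, so the summed field is flat, mirroring the relatively uniform distribution the study reports around the restoration.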
Medicine, Issue 89, image processing, computer-assisted, polymer matrix composites, testing of materials (composite materials), dental composite restoration, polymerization shrinkage, digital image correlation, full-field strain measurement, interfacial debonding
Manufacturing and Using Piggy-back Multibarrel Electrodes for In vivo Pharmacological Manipulations of Neural Responses
Institutions: University of Colorado Medical Campus.
In vivo recordings from single neurons allow an investigator to examine the firing properties of neurons, for example in response to sensory stimuli. Neurons typically receive multiple excitatory and inhibitory afferent and/or efferent inputs that integrate with each other, and the ultimate measured response properties of the neuron are driven by the neural integrations of these inputs. To study information processing in neural systems, it is necessary to understand the various inputs to a neuron or neural system, and the specific properties of these inputs. A powerful and technically relatively simple method to assess the functional role of certain inputs that a given neuron is receiving is to dynamically and reversibly suppress or eliminate these inputs, and measure the changes in the neuron's output caused by this manipulation. This can be accomplished by pharmacologically altering the neuron's immediate environment with piggy-back multibarrel electrodes. These electrodes consist of a single barrel recording electrode and a multibarrel drug electrode that can carry up to 4 different synaptic agonists or antagonists. The pharmacological agents can be applied iontophoretically at desired times during the experiment, allowing for time-controlled delivery and reversible reconfiguration of synaptic inputs. As such, pharmacological manipulation of the microenvironment represents a powerful and unparalleled method to test specific hypotheses about neural circuit function.
Here we describe how piggy-back electrodes are manufactured, and how they are used during in vivo experiments. The piggy-back system allows an investigator to combine a single barrel recording electrode of any arbitrary property (resistance, tip size, shape, etc.) with a multibarrel drug electrode. This is a major advantage over standard multi-electrodes, where all barrels have more or less similar shapes and properties. Multibarrel electrodes were first introduced over 40 years ago [1-3], and have undergone a number of design improvements [2,3] until the piggy-back type was introduced in the 1980s [4,5]. Here we present a set of important improvements in the laboratory production of piggy-back electrodes that allow for deep brain penetration in intact in vivo animal preparations due to a relatively thin electrode shaft that causes minimal damage. Furthermore, these electrodes are characterized by low noise recordings, and have low resistance drug barrels for very effective iontophoresis of the desired pharmacological agents.
Neuroscience, Issue 71, Biophysics, Physiology, Neurobiology, Medicine, Pharmacology, Mechanical Engineering, Electrical Engineering, Piggyback electrode, iontophoresis, iontophoresis pump, single cell recording, neural excitation, neural inhibition, in vivo electrophysiology
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis [1,2] proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings [3-6]. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) [7]. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
C. elegans Chemotaxis Assay
Institutions: Queen's University , Queen's University , Queen's University .
Many organisms use chemotaxis to seek out food sources, avoid noxious substances, and find mates. Caenorhabditis elegans has impressive chemotaxis behavior. The premise behind testing the response of the worms to an odorant is to place them in an area and observe the movement evoked in response to the odorant. Even with the many available assays, optimizing the worms' starting location relative to both the control and test areas, minimizing the interaction of worms with each other, and maintaining a significant sample size remains a work in progress [1-10]. The method described here aims to address these issues by modifying the assay developed by Bargmann et al. A Petri dish is divided into four quadrants: two opposite quadrants are marked "Test" and the other two are designated "Control". Anesthetic is placed at all test and control sites. The worms are placed in the center of the plate, with a circle marked around the origin to ensure that non-motile worms are ignored. Utilizing a four-quadrant system rather than one [2] or two [1] quadrants eliminates bias in the movement of the worms, as they are equidistant from test and control samples regardless of which side of the origin they began on. This circumvents the problem of worms being forced to travel through a cluster of other worms to respond to an odorant, which can delay worms or force them to take a more circuitous route, yielding an incorrect interpretation of their intended path. This method also offers practical advantages: a larger sample size, and the ability to run the assay unattended and score the worms once the allotted time has expired.
Behavior, Issue 74, Biochemistry, Cellular Biology, Molecular Biology, Developmental Biology, Physiology, Anatomy, Chemical Engineering, chemotaxis, Caenorhabditis elegans, C. elegans, chemotaxis assay, nematode, chemotactic index, worm movement, animal model
Intact Histological Characterization of Brain-implanted Microdevices and Surrounding Tissue
Institutions: Purdue University, Purdue University.
Research into the design and utilization of brain-implanted microdevices, such as microelectrode arrays, aims to produce clinically relevant devices that interface chronically with surrounding brain tissue. Tissue surrounding these implants is thought to react to the presence of the devices over time, which includes the formation of an insulating "glial scar" around the devices. However, histological analysis of these tissue changes is typically performed after explanting the device, in a process that can disrupt the morphology of the tissue of interest.
Here we demonstrate a protocol in which cortical-implanted devices are collected intact in surrounding rodent brain tissue. We describe how, once perfused with fixative, brains are removed and sliced in such a way as to avoid explanting devices. We outline fluorescent antibody labeling and optical clearing methods useful for producing an informative, yet thick tissue section. Finally, we demonstrate the mounting and imaging of these tissue sections in order to investigate the biological interface around brain-implanted devices.
Neurobiology, Issue 72, Neuroscience, Biomedical Engineering, Medicine, Central Nervous System, Brain, Neuroglia, Neurons, Immunohistochemistry (IHC), Histocytological Preparation Techniques, Microscopy, Confocal, nondestructive testing, bioengineering (man-machine systems), bionics, histology, brain implants, microelectrode arrays, immunohistochemistry, neuroprosthetics, brain machine interface, microscopy, thick tissue, optical clearing, animal model
Telomere Length and Telomerase Activity; A Yin and Yang of Cell Senescence
Institutions: Albert Einstein College of Medicine , Albert Einstein College of Medicine , Albert Einstein College of Medicine .
Telomeres are repeating DNA sequences at the ends of chromosomes; they vary in length and in humans can reach 15,000 base pairs. The telomere serves as a bioprotective mechanism against chromosome attrition at each cell division. At a certain length, telomeres become too short to allow replication, a process that may lead to chromosome instability or cell death. Telomere length is regulated by two opposing mechanisms: attrition and elongation. Attrition occurs as each cell divides. In contrast, elongation is partially modulated by the enzyme telomerase, which adds repeating sequences to the ends of the chromosomes. In this way, telomerase can potentially reverse an aging mechanism and rejuvenate cell viability. These are crucial elements in maintaining cell life and are used to assess cellular aging. In this manuscript we describe an accurate, rapid, and inexpensive method to assess telomere length in multiple tissues and species. This method takes advantage of two key elements: the tandem repeat of the telomere sequence, and the sensitivity of qRT-PCR to detect differential copy numbers in tested samples. In addition, we describe a simple assay to assess telomerase activity as a complementary test for telomere length.
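qRT-PCR telomere measurements are commonly reported as a relative T/S ratio (telomere signal over a single-copy gene signal, normalized to a reference sample) via ΔΔCt relative quantification. The abstract does not spell out the calculation, so the sketch below assumes that convention and 100% amplification efficiency:

```python
def relative_ts_ratio(ct_telo, ct_scg, ref_ct_telo, ref_ct_scg):
    """Relative telomere length (T/S ratio) by the ddCt method.

    ct_telo / ct_scg: Ct values for the telomere and single-copy-gene
    reactions of the sample; ref_* are the same for the reference sample.
    Assumes ~100% PCR efficiency (amplicon doubles each cycle).
    """
    d_sample = ct_telo - ct_scg          # dCt of the sample
    d_ref = ref_ct_telo - ref_ct_scg     # dCt of the reference
    return 2 ** -(d_sample - d_ref)      # 2^(-ddCt)
```

A sample whose telomere reaction crosses threshold one cycle earlier than the reference (at equal single-copy-gene signal) yields a T/S ratio of 2, i.e. roughly twice the telomeric template.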
Genetics, Issue 75, Molecular Biology, Cellular Biology, Medicine, Biomedical Engineering, Genomics, Telomere length, telomerase activity, telomerase, telomeres, telomere, DNA, PCR, polymerase chain reaction, qRT-PCR, sequencing, aging, telomerase assay
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and of protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence-selection optimization stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold-specificity stage and a binding-affinity stage. A rank-ordered list of the sequences for each step of the process, along with the relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of these methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Measuring Frailty in HIV-infected Individuals. Identification of Frail Patients is the First Step to Amelioration and Reversal of Frailty
Institutions: University of Arizona, University of Arizona.
A simple, validated protocol consisting of a battery of tests is available to identify elderly patients with frailty syndrome. This syndrome of decreased reserve and resistance to stressors increases in incidence with increasing age. In the elderly, frailty may follow a step-wise loss of function from non-frail to pre-frail to frail. We studied frailty in HIV-infected patients and found that ~20% are frail according to the Fried phenotype, using stringent criteria developed for the elderly [1,2]. In HIV infection the syndrome occurs at a younger age.
HIV patients were checked for: 1) unintentional weight loss; 2) slowness, as determined by walking speed; 3) weakness, as measured by a grip dynamometer; 4) exhaustion, by responses to a depression scale; and 5) low physical activity, as determined by assessing kilocalories expended in a week's time. Pre-frailty was present if any two of the five criteria were abnormal, and frailty was present if any three were abnormal.
The tests take approximately 10-15 min to complete and can be performed by medical assistants during routine clinic visits. Test results are scored by referring to standard tables. Understanding which of the five components contribute to frailty in an individual patient allows the clinician to address relevant underlying problems, many of which are not evident in routine HIV clinic visits.
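The two-of-five / three-of-five classification rule stated above is simple enough to encode directly. A minimal sketch (the criterion names are hypothetical identifiers; the cutoffs for scoring each criterion abnormal come from the standard tables, not from this code):

```python
# The five Fried phenotype criteria described in the protocol.
FRIED_CRITERIA = ("weight_loss", "slowness", "weakness",
                  "exhaustion", "low_activity")

def frailty_status(abnormal):
    """Classify a patient from the set of criteria scored abnormal.

    abnormal: iterable of criterion names (subset of FRIED_CRITERIA)
    that fell outside the standard-table cutoffs.
    """
    n = len(set(abnormal) & set(FRIED_CRITERIA))
    if n >= 3:
        return "frail"        # any three (or more) of five abnormal
    if n >= 2:
        return "pre-frail"    # any two of five abnormal
    return "non-frail"
```

For example, a patient abnormal on slowness, weakness, and exhaustion classifies as "frail"; abnormal on slowness and weakness alone as "pre-frail".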
Medicine, Issue 77, Infection, Virology, Infectious Diseases, Anatomy, Physiology, Molecular Biology, Biomedical Engineering, Retroviridae Infections, Body Weight Changes, Diagnostic Techniques and Procedures, Physical Examination, Muscle Strength, Behavior, Virus Diseases, Pathological Conditions, Signs and Symptoms, Diagnosis, Musculoskeletal and Neural Physiological Phenomena, HIV, HIV-1, AIDS, Frailty, Depression, Weight Loss, Weakness, Slowness, Exhaustion, Aging, clinical techniques
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
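The quoted ~10-30 nm precision depends chiefly on the number of photons detected per molecule. One widely used estimate of 1D localization precision is the Thompson-Larson-Webb formula; the sketch below implements that estimate as an illustration, not necessarily the exact analysis used in this protocol, and the example parameter values are assumptions:

```python
import math

def localization_precision(s, a, n_photons, b):
    """Approximate 1D localization precision (Thompson et al. 2002).

    s: standard deviation of the PSF (nm)
    a: camera pixel size in sample space (nm)
    n_photons: photons detected from the molecule
    b: background noise standard deviation (photons per pixel)
    """
    var = ((s**2 + a**2 / 12) / n_photons
           + 8 * math.pi * s**4 * b**2 / (a**2 * n_photons**2))
    return math.sqrt(var)
```

With, say, a 130 nm PSF, 100 nm pixels, 500 detected photons, and a background of 2 photons/pixel, the estimate comes out below 10 nm, consistent with the precision range quoted above; fewer photons or higher background pushes it toward the tens of nanometers.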
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
Osteopathic Manipulative Treatment as a Useful Adjunctive Tool for Pneumonia
Institutions: New York Institute of Technology College of Osteopathic Medicine.
Pneumonia, the inflammatory state of lung tissue primarily due to microbial infection, claimed 52,306 lives in the United States in 2007 [1] and resulted in the hospitalization of 1.1 million patients [2]. With an average in-patient hospital stay of five days [2], pneumonia and influenza comprise a significant financial burden, costing the United States $40.2 billion in 2005 [3]. Under the current Infectious Diseases Society of America/American Thoracic Society guidelines, standard-of-care recommendations include the rapid administration of an appropriate antibiotic regimen, fluid replacement, and ventilation (if necessary). Non-standard therapies include the use of corticosteroids and statins; however, these therapies lack conclusive supporting evidence [4] (Figure 1).
Osteopathic Manipulative Treatment (OMT) is a cost-effective adjunctive treatment for pneumonia that has been shown to reduce patients' length of hospital stay, duration of intravenous antibiotics, and incidence of respiratory failure or death when compared to subjects who received conventional care alone [5]. The use of manual manipulation techniques for pneumonia was first recorded as early as the Spanish influenza pandemic of 1918, when patients treated with standard medical care had an estimated mortality rate of 33%, compared to a 10% mortality rate in patients treated by osteopathic physicians [6]. When applied to the management of pneumonia, manual manipulation techniques bolster lymphatic flow, respiratory function, and immunological defense by targeting anatomical structures involved in these systems [7-10].
The objective of this review video-article is three-fold: a) summarize the findings of randomized controlled studies on the efficacy of OMT in adult patients with diagnosed pneumonia, b) demonstrate established protocols utilized by osteopathic physicians treating pneumonia, and c) elucidate the physiological mechanisms behind manual manipulation of the respiratory and lymphatic systems. Specifically, we discuss and demonstrate four routine techniques that address autonomics, lymph drainage, and rib cage mobility: 1) Rib Raising, 2) Thoracic Pump, 3) Doming of the Thoracic Diaphragm, and 4) Muscle Energy for Rib 1 [5,11].
Medicine, Issue 87, Pneumonia, osteopathic manipulative medicine (OMM) and techniques (OMT), lymphatic, rib raising, thoracic pump, muscle energy, doming diaphragm, alternative treatment
High-throughput Fluorometric Measurement of Potential Soil Extracellular Enzyme Activities
Institutions: Colorado State University, Oak Ridge National Laboratory, University of Colorado.
Microbes in soils and other environments produce extracellular enzymes to depolymerize and hydrolyze organic macromolecules so that they can be assimilated for energy and nutrients. Measuring soil microbial enzyme activity is crucial to understanding soil ecosystem functional dynamics. The general concept of the fluorescence enzyme assay is that synthetic C-, N-, or P-rich substrates bound to a fluorescent dye are added to soil samples. When intact, the labeled substrates do not fluoresce. Enzyme activity is measured as the increase in fluorescence as the fluorescent dyes are cleaved from their substrates, which allows them to fluoresce. Enzyme measurements can be expressed in units of molarity or activity. To perform this assay, soil slurries are prepared by combining soil with a pH buffer. The pH buffer (typically a 50 mM sodium acetate or 50 mM Tris buffer) is chosen for its particular acid dissociation constant (pKa) to best match the soil sample pH. The soil slurries are inoculated with a nonlimiting amount of fluorescently labeled (i.e. C-, N-, or P-rich) substrate. Using soil slurries in the assay serves to minimize limitations on enzyme and substrate diffusion. The assay thus controls for differences in substrate limitation, diffusion rates, and soil pH, detecting potential enzyme activity rates as a function of the difference in enzyme concentrations among samples.
Fluorescence enzyme assays are typically more sensitive than spectrophotometric (i.e. colorimetric) assays, but can suffer from interference caused by impurities and from the instability of many fluorescent compounds when exposed to light, so caution is required when handling fluorescent substrates. Likewise, this method only assesses potential enzyme activities under laboratory conditions in which substrates are not limiting. Caution should be used when interpreting data from cross-site comparisons with differing temperatures or soil types, as in situ soil type and temperature can influence enzyme kinetics.
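Converting raw fluorescence into a potential activity rate typically involves a fluorophore standard curve and normalization by dry soil mass and incubation time. The sketch below is a simplified version of that calculation (full protocols also correct for quench and homogenate controls, which are folded into `net_fluor` here as an assumption):

```python
def potential_activity(net_fluor, slope, dry_mass_g, hours):
    """Potential extracellular enzyme activity from plate fluorescence.

    net_fluor: sample fluorescence minus substrate and soil controls (RFU)
    slope: standard-curve slope (RFU per nmol of free fluorophore,
           e.g. MUB or MUC, measured in the same soil slurry matrix)
    dry_mass_g: dry-equivalent soil mass in the assay well (g)
    hours: incubation time (h)

    Returns nmol of substrate cleaved per g dry soil per hour.
    """
    nmol_cleaved = net_fluor / slope
    return nmol_cleaved / (dry_mass_g * hours)
```

For example, a net signal of 5,000 RFU with a standard-curve slope of 100 RFU/nmol, 1 g dry soil, and a 3 h incubation gives ~16.7 nmol g⁻¹ h⁻¹.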
Environmental Sciences, Issue 81, Ecological and Environmental Phenomena, Environment, Biochemistry, Environmental Microbiology, Soil Microbiology, Ecology, Eukaryota, Archaea, Bacteria, Soil extracellular enzyme activities (EEAs), fluorometric enzyme assays, substrate degradation, 4-methylumbelliferone (MUB), 7-amino-4-methylcoumarin (MUC), enzyme temperature kinetics, soil
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) contrast the effects of the proposed technique and that of using intact limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring inertial properties of known geometrical solids. When direct measurements of inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity compared with inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia property estimates on study outcomes. For stance, either one of the two inertial models investigated in our study would likely lead to similar outcomes with an inverse dynamics assessment.
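In the pendulum-oscillation technique, the moment of inertia about the pivot follows from the period of small oscillations, and the parallel-axis theorem shifts it to the center of mass (located with the reaction board). The abstract does not state the working equations, so this sketch assumes the standard physical-pendulum relation:

```python
import math

def moment_of_inertia_cm(mass_kg, period_s, d_m, g=9.81):
    """Moment of inertia of a prosthesis about its center of mass.

    mass_kg: prosthesis mass
    period_s: period of small oscillations about the pivot
    d_m: distance from pivot to center of mass (from reaction board)

    Physical pendulum: T = 2*pi*sqrt(I_pivot / (m*g*d)), so
    I_pivot = m*g*d*T^2 / (4*pi^2); subtract m*d^2 (parallel axis)
    to refer the result to the center of mass.
    """
    i_pivot = mass_kg * g * d_m * period_s**2 / (4 * math.pi**2)
    return i_pivot - mass_kg * d_m**2
```

As a sanity check, a 1.5 kg prosthesis pivoted 0.2 m from its CM with a measured period of ~1.215 s yields I_cm ≈ 0.05 kg·m².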
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
Improving IV Insulin Administration in a Community Hospital
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predict poor outcomes [1-4]. The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes" [5]. It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance.
The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control [6]. Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others [7,8], Wyoming Medical Center (WMC) recognized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia.
Despite multiple revisions of a paper-based IV insulin protocol, analysis of usage data at WMC showed that results were suboptimal in terms of achieving normoglycemia while minimizing hypoglycemia. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from the paper IV insulin protocol to a computerized glucose management system. By comparing blood glucose levels under the paper protocol with those under the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, in the first five months after implementation of the computerized system there was a substantial increase in the time spent within the target blood glucose range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL). The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the computerized glucose management system was well under 1%.
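The comparison metrics reported above can be tallied directly from a series of BG readings. A minimal sketch, assuming the target range is 70-180 mg/dL (the abstract names the hypoglycemia and hyperglycemia thresholds but not the target band explicitly):

```python
def glucose_metrics(readings_mg_dl):
    """Fraction of BG readings falling in each band used in the comparison.

    readings_mg_dl: list of blood glucose readings (mg/dL).
    """
    n = len(readings_mg_dl)
    if n == 0:
        raise ValueError("no readings")

    def frac(pred):
        return sum(1 for bg in readings_mg_dl if pred(bg)) / n

    return {
        "in_target_70_180": frac(lambda bg: 70 <= bg <= 180),
        "severe_hypo_lt_40": frac(lambda bg: bg < 40),   # BG < 40 mg/dL
        "hypo_lt_70": frac(lambda bg: bg < 70),          # BG < 70 mg/dL
        "hyper_gt_180": frac(lambda bg: bg > 180),       # BG > 180 mg/dL
    }
```

Running this over readings collected under each protocol gives the in-target percentage (>75% for the computerized system) and the hypoglycemia prevalence (<1%) directly.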
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control