Eutrophication is a water quality issue in lakes worldwide, and there is a critical need to identify and control nutrient sources. Internal phosphorus (P) loading from lake sediments can account for a substantial portion of the total P load in eutrophic, and some mesotrophic, lakes. Laboratory determination of P release rates from sediment cores is one approach for determining the role of internal P loading and guiding management decisions. Two principal alternatives to experimental determination of sediment P release exist for estimating internal load: in situ measurements of changes in hypolimnetic P over time and P mass balance. The experimental approach using laboratory-based sediment incubations to quantify internal P load is a direct method, making it a valuable tool for lake management and restoration.
Laboratory incubations of sediment cores can help determine the relative importance of internal vs. external P loads, as well as be used to answer a variety of lake management and research questions. We illustrate the use of sediment core incubations to assess the effectiveness of an aluminum sulfate (alum) treatment for reducing sediment P release. Other research questions that can be investigated using this approach include the effects of sediment resuspension and bioturbation on P release.
The approach also has limitations. Assumptions must be made with respect to: extrapolating results from sediment cores to the entire lake; deciding over what time periods to measure nutrient release; and addressing possible core tube artifacts. A comprehensive dissolved oxygen monitoring strategy to assess temporal and spatial redox status in the lake provides greater confidence in annual P loads estimated from sediment core incubations.
Movement Retraining using Real-time Feedback of Performance
Institutions: University of British Columbia .
Any modification of movement - especially movement patterns that have been honed over a number of years - requires re-organization of the neuromuscular patterns responsible for governing the movement performance. This motor learning can be enhanced through a number of methods that are utilized in research and clinical settings alike. In general, verbal feedback of performance in real-time or knowledge of results following movement is commonly used clinically as a preliminary means of instilling motor learning. Depending on patient preference and learning style, visual feedback (e.g. through use of a mirror or different types of video) or proprioceptive guidance utilizing therapist touch is used to supplement verbal instructions from the therapist. Indeed, a combination of these forms of feedback is commonplace in the clinical setting to facilitate motor learning and optimize outcomes.
Laboratory-based, quantitative motion analysis has been a mainstay in research settings to provide accurate and objective analysis of a variety of movements in healthy and injured populations. While the actual mechanisms of capturing the movements may differ, all current motion analysis systems rely on the ability to track the movement of body segments and joints and to use established equations of motion to quantify key movement patterns. Due to limitations in acquisition and processing speed, analysis and description of the movements have traditionally occurred offline after completion of a given testing session.
This paper will highlight a new supplement to standard motion analysis techniques that relies on the near instantaneous assessment and quantification of movement patterns and the display of specific movement characteristics to the patient during a movement analysis session. As a result, this novel technique can provide a new method of feedback delivery that has advantages over currently used feedback methods.
Medicine, Issue 71, Biophysics, Anatomy, Physiology, Physics, Biomedical Engineering, Behavior, Psychology, Kinesiology, Physical Therapy, Musculoskeletal System, Biofeedback, biomechanics, gait, movement, walking, rehabilitation, clinical, training
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Institutions: Emory University, Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated from acute time points from subtype C infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate to the study of subtype C sequences than previous recombination based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance, to track transmission of viral strains already resistant to ART, is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource-limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against the Viroseq genotyping method, an HIV-1 drug resistance test approved by the United States Food and Drug Administration (FDA). Limitations of the method include that it is not automated and that it failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Peptide-based Identification of Functional Motifs and their Binding Partners
Institutions: Morehouse School of Medicine, Institute for Systems Biology, Universiti Sains Malaysia.
Specific short peptides derived from motifs found in full-length proteins, in our case HIV-1 Nef, not only retain their biological function, but can also competitively inhibit the function of the full-length protein. A set of 20 Nef scanning peptides, each 20 amino acids in length and overlapping its neighbor by 10 amino acids, was used to identify motifs in Nef responsible for its induction of apoptosis. Peptides containing these apoptotic motifs induced apoptosis at levels comparable to the full-length Nef protein. A second peptide, derived from the Secretion Modification Region (SMR) of Nef, retained the ability to interact with cellular proteins involved in Nef's secretion in exosomes (exNef). This SMRwt peptide was used as the "bait" protein in co-immunoprecipitation experiments to isolate cellular proteins that bind specifically to Nef's SMR motif. Protein transfection and antibody inhibition were used to physically disrupt the interaction between Nef and mortalin, one of the isolated SMR-binding proteins, and the effect was measured with a fluorescence-based exNef secretion assay. The SMRwt peptide's ability to outcompete full-length Nef for cellular proteins that bind the SMR motif makes it the first inhibitor of exNef secretion. Thus, by employing the techniques described here, which utilize the unique properties of specific short peptides derived from motifs found in full-length proteins, one may accelerate the identification of functional motifs in proteins and the development of peptide-based inhibitors of pathogenic functions.
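The tiling scheme described above (20-mers, each overlapping its neighbor by 10 residues) can be sketched in a few lines of Python. The function name and the demo sequence are illustrative, not the actual Nef peptide set:

```python
def scanning_peptides(sequence, length=20, overlap=10):
    """Tile a protein sequence with overlapping peptides.

    Each peptide is `length` residues long and shares `overlap`
    residues with its neighbor, so the start positions advance by
    (length - overlap) residues per peptide.
    """
    step = length - overlap
    return [sequence[i:i + length]
            for i in range(0, len(sequence) - length + 1, step)]

# A hypothetical 206-residue sequence tiled in 20-mers with
# 10-residue overlap gives peptides starting at 0, 10, ..., 180.
demo = "A" * 206
peptides = scanning_peptides(demo)
print(len(peptides))          # → 19
print(len(peptides[0]))       # → 20
```

Note that with this simple scheme any residues past the last full window are left uncovered; in practice a final C-terminal peptide is often added to close the gap.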
Virology, Issue 76, Biochemistry, Immunology, Infection, Infectious Diseases, Molecular Biology, Medicine, Genetics, Microbiology, Genomics, Proteins, Exosomes, HIV, Peptides, Exocytosis, protein trafficking, secretion, HIV-1, Nef, Secretion Modification Region, SMR, peptide, AIDS, assay
Development of Cell-type specific anti-HIV gp120 aptamers for siRNA delivery
Institutions: Beckman Research Institute of City of Hope, Beckman Research Institute of City of Hope, Beckman Research Institute of City of Hope.
The global epidemic of infection by HIV has created an urgent need for new classes of antiretroviral agents. The potent ability of small interfering (si)RNAs to inhibit the expression of complementary RNA transcripts is being exploited as a new class of therapeutics for a variety of diseases including HIV. Many previous reports have shown that novel RNAi-based anti-HIV/AIDS therapeutic strategies have considerable promise; however, a key obstacle to the successful therapeutic application and clinical translation of siRNAs is efficient delivery. Particularly, considering the safety and efficacy of RNAi-based therapeutics, it is highly desirable to develop a targeted intracellular siRNA delivery approach to specific cell populations or tissues. The HIV-1 gp120 protein, a glycoprotein envelope on the surface of HIV-1, plays an important role in viral entry into CD4 cells. The interaction of gp120 and CD4 that triggers HIV-1 entry and initiates cell fusion has been validated as a clinically relevant anti-viral strategy for drug discovery.
Herein, we first discuss the selection and identification of 2'-F modified anti-HIV gp120 RNA aptamers. Using a conventional nitrocellulose filter SELEX method, several new aptamers with nanomolar affinity were isolated from an RNA library containing a 50-nt random region. In order to obtain bound species with higher affinity, the selection stringency was carefully controlled by adjusting the conditions. The selected aptamers can specifically bind and be rapidly internalized into cells expressing the HIV-1 envelope protein. Additionally, the aptamers alone can neutralize HIV-1 infectivity. Based upon the best aptamer, A-1, we also created a novel dual inhibitory function anti-gp120 aptamer-siRNA chimera in which both the aptamer and the siRNA portions have potent anti-HIV activities. Further, we utilized the gp120 aptamer-siRNA chimeras for cell-type specific delivery of the siRNA into HIV-1 infected cells. This dual function chimera shows considerable potential for combining various nucleic acid therapeutic agents (aptamer and siRNA) in suppressing HIV-1 infection, making the aptamer-siRNA chimeras attractive therapeutic candidates for patients failing highly active antiretroviral therapy (HAART).
Immunology, Issue 52, SELEX (Systematic Evolution of Ligands by EXponential enrichment), RNA aptamer, HIV-1 gp120, RNAi (RNA interference), siRNA (small interfering RNA), cell-type specific delivery
Rapid Screening of HIV Reverse Transcriptase and Integrase Inhibitors
Institutions: National Cancer Institute.
Although a number of anti-HIV drugs have been approved, there are still problems with toxicity and drug resistance. This demonstrates a need to identify new compounds that can inhibit infection by the common drug-resistant HIV-1 strains with minimal toxicity. Here we describe an efficient assay that can be used to rapidly determine the cellular cytotoxicity and efficacy of a compound against WT and mutant viral strains.
The desired target cell line is seeded in a 96-well plate and, after a 24 hr incubation, serial dilutions of the compounds to be tested are added. No further manipulations are necessary for cellular cytotoxicity assays; for anti-HIV assays a predetermined amount of either a WT or drug-resistant HIV-1 vector that expresses luciferase is added to the cells. Cytotoxicity is measured by using an ATP-dependent luminescence assay and the impact of the compounds on infectivity is measured by determining the amount of luciferase in the presence or the absence of the putative inhibitors.
This screening assay takes 4 days to complete and multiple compounds can be screened in parallel. Compounds are screened in triplicate and the data are normalized to the infectivity/ATP levels in the absence of target compounds. This technique provides a quick and accurate measurement of the efficacy and toxicity of potential anti-HIV compounds.
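The normalization step described above can be sketched as follows. The function names, the triplicate readings, and the untreated-control mean are hypothetical; only the arithmetic (signal expressed relative to wells with no compound) follows the text:

```python
def normalize(signals, untreated_mean):
    """Express raw luciferase (or ATP luminescence) readings as a
    fraction of the mean signal in untreated control wells."""
    return [s / untreated_mean for s in signals]

def percent_inhibition(treated, untreated_mean):
    """Percent reduction in signal relative to untreated wells."""
    return [100.0 * (1.0 - s / untreated_mean) for s in treated]

# Hypothetical triplicate luciferase readings (RLU) for one compound
# concentration, with untreated wells averaging 20,000 RLU.
treated = [5000.0, 5500.0, 4500.0]
untreated_mean = 20000.0
inhib = percent_inhibition(treated, untreated_mean)
mean_inhib = sum(inhib) / len(inhib)
print(round(mean_inhib, 1))  # → 75.0
```

The same normalization applies to the cytotoxicity readout, with ATP luminescence in place of luciferase and compound-free wells as the reference.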
Immunology, Issue 86, HIV, cytotoxicity, infectivity, luciferase, drug resistance, integrase, reverse transcriptase
Collection, Isolation, and Flow Cytometric Analysis of Human Endocervical Samples
Institutions: University of Manitoba, University of Manitoba.
Despite the public health importance of mucosal pathogens (including HIV), relatively little is known about mucosal immunity, particularly at the female genital tract (FGT). Because heterosexual transmission now represents the dominant mechanism of HIV transmission, and given the continual spread of sexually transmitted infections (STIs), it is critical to understand the interplay between host and pathogen at the genital mucosa. The substantial gaps in knowledge around FGT immunity are partially due to the difficulty in successfully collecting and processing mucosal samples. In order to facilitate studies with sufficient sample size, collection techniques must be minimally invasive and efficient. To this end, a protocol for the collection of cervical cytobrush samples and subsequent isolation of cervical mononuclear cells (CMC) has been optimized. Using ex vivo flow cytometry-based immunophenotyping, it is possible to accurately and reliably quantify CMC lymphocyte/monocyte population frequencies and phenotypes. This technique can be coupled with the collection of cervical-vaginal lavage (CVL), which contains soluble immune mediators including cytokines, chemokines and anti-proteases, all of which can be used to determine the anti- or pro-inflammatory environment in the vagina.
Medicine, Issue 89, mucosal, immunology, FGT, lavage, cervical, CMC
Measuring Frailty in HIV-infected Individuals. Identification of Frail Patients is the First Step to Amelioration and Reversal of Frailty
Institutions: University of Arizona, University of Arizona.
A simple, validated protocol consisting of a battery of tests is available to identify elderly patients with frailty syndrome. This syndrome of decreased reserve and resistance to stressors increases in incidence with increasing age. In the elderly, frailty may pursue a step-wise loss of function from non-frail to pre-frail to frail. We studied frailty in HIV-infected patients and found that ~20% are frail based on the Fried phenotype, using stringent criteria developed for the elderly1,2. In HIV infection the syndrome occurs at a younger age.
HIV patients were checked for 1) unintentional weight loss; 2) slowness as determined by walking speed; 3) weakness as measured by a grip dynamometer; 4) exhaustion as determined by responses to a depression scale; and 5) low physical activity as determined by assessing kilocalories expended in a week's time. Pre-frailty was present with any two of the five criteria and frailty was present if any three of the five criteria were abnormal.
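The scoring rule above (two of five criteria for pre-frailty, three or more for frailty) amounts to a small classifier. The function name and boolean inputs here are illustrative; the cutoffs follow the protocol as stated:

```python
def frailty_status(weight_loss, slowness, weakness, exhaustion, low_activity):
    """Classify frailty phenotype from five boolean criteria
    (True = abnormal result on that test).

    0-1 abnormal criteria -> "non-frail"
    exactly 2             -> "pre-frail"
    3 or more             -> "frail"
    """
    n_abnormal = sum([weight_loss, slowness, weakness,
                      exhaustion, low_activity])
    if n_abnormal >= 3:
        return "frail"
    if n_abnormal == 2:
        return "pre-frail"
    return "non-frail"

# A patient with slow walking speed, weak grip, and low activity
# meets three criteria and is classified as frail.
print(frailty_status(False, True, True, False, True))  # → frail
```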
The tests take approximately 10-15 min to complete and they can be performed by medical assistants during routine clinic visits. Test results are scored by referring to standard tables. Understanding which of the five components contribute to frailty in an individual patient can allow the clinician to address relevant underlying problems, many of which are not evident in routine HIV clinic visits.
Medicine, Issue 77, Infection, Virology, Infectious Diseases, Anatomy, Physiology, Molecular Biology, Biomedical Engineering, Retroviridae Infections, Body Weight Changes, Diagnostic Techniques and Procedures, Physical Examination, Muscle Strength, Behavior, Virus Diseases, Pathological Conditions, Signs and Symptoms, Diagnosis, Musculoskeletal and Neural Physiological Phenomena, HIV, HIV-1, AIDS, Frailty, Depression, Weight Loss, Weakness, Slowness, Exhaustion, Aging, clinical techniques
New Tools to Expand Regulatory T Cells from HIV-1-infected Individuals
Institutions: Ragon Institute of MGH, MIT, and Harvard, Massachusetts General Hospital.
CD4+ Regulatory T cells (Tregs) are potent immune modulators and serve an important function in human immune homeostasis. Depletion of Tregs has led to measurable increases in antigen-specific T cell responses in vaccine settings for cancer and infectious pathogens. However, their role in HIV-1 immuno-pathogenesis remains controversial, as they could either serve to suppress deleterious HIV-1-associated immune activation and thus slow HIV-1 disease progression or alternatively suppress HIV-1-specific immunity and thereby promote virus spread. Understanding and modulating Treg function in the context of HIV-1 could lead to potential new strategies for immunotherapy or HIV vaccines. However, important open questions remain about their role in the context of HIV-1 infection, which need to be carefully studied.
Tregs represent roughly 5% of human CD4+ T cells in the peripheral blood, and studying this population has proven difficult, especially in HIV-1 infected individuals, in whom HIV-1-associated depletion of CD4 T cells, and with it of Tregs, occurs. The characterization of regulatory T cells in individuals with advanced HIV-1 disease, or in tissue samples, for which only very small biological samples can be obtained, is therefore extremely challenging. We propose a technical solution to overcome these limitations using isolation and expansion of Tregs from HIV-1-positive individuals.
Here we describe an easy and robust method to successfully expand Tregs isolated from HIV-1-infected individuals in vitro. Flow-sorted CD3+ Tregs were stimulated with anti-CD3/anti-CD28 coated beads and cultured in the presence of IL-2. The expanded Tregs expressed high levels of FOXP3, CTLA4 and HELIOS compared to conventional T cells and were shown to be highly suppressive. Easier access to large numbers of Tregs will allow researchers to address important questions concerning their role in HIV-1 immunopathogenesis. We believe answering these questions may provide useful insight for the development of an effective HIV-1 vaccine.
Infection, Issue 75, Infectious Diseases, Medicine, Immunology, Virology, Cellular Biology, Molecular Biology, Lymphocytes, T-Lymphocytes, Regulatory, HIV, Culture Techniques, flow cytometry, cell culture, Treg expansion, regulatory T cells, CD4+ T cells, Tregs, HIV-1, virus, HIV-1 infection, AIDS, clinical techniques
Preparation and Use of HIV-1 Infected Primary CD4+ T-Cells as Target Cells in Natural Killer Cell Cytotoxic Assays
Institutions: Rush University Medical Center.
Natural killer (NK) cells are a vital component of the innate immune response to virus-infected cells. It is important to understand the ability of NK cells to recognize and lyse HIV-1 infected cells because identifying any aberrancy in NK cell function against HIV-infected cells could potentially lead to therapies that would enhance their cytolytic activity. There is a need to use HIV-infected primary T-cell blasts as target cells rather than infected T-cell lines in the cytotoxicity assays. T-cell lines, even without infection, are quite susceptible to NK cell lysis. Furthermore, it is necessary to use autologous primary cells to prevent major histocompatibility complex class I mismatches between the target and effector cell that will result in lysis. Early studies evaluating NK cell cytolytic responses to primary HIV-infected cells failed to show significant killing of the infected cells1,2. However, using HIV-1 infected primary T-cells as target cells in NK cell functional assays has been difficult due to the presence of contaminating uninfected cells3. This inconsistent infected cell to uninfected cell ratio will result in variation in NK cell killing between samples that may not be due to variability in donor NK cell function. Thus, it would be beneficial to work with a purified infected cell population in order to standardize the effector to target cell ratios between experiments3,4. Here we demonstrate the isolation of a highly purified population of HIV-1 infected cells by taking advantage of HIV-1's ability to down-modulate CD4 on infected cells and the availability of commercial kits to remove dead or dying cells3-6. The purified infected primary T-cell blasts can then be used as targets in either a degranulation or cytotoxic assay with purified NK cells as the effector population5-7. Use of NK cells as effectors in a degranulation assay evaluates the ability of an NK cell to release the lytic contents of specialized lysosomes8 called "cytolytic granules". By staining with a fluorochrome-conjugated antibody against CD107a, a lysosomal membrane protein that becomes expressed on the NK cell surface when the cytolytic granules fuse to the plasma membrane, we can determine what percentage of NK cells degranulate in response to target cell recognition. Alternatively, NK cell lytic activity can be evaluated in a cytotoxic assay that allows for the determination of the percentage of target cells lysed by release of 51Cr from within the target cell in the presence of NK cells.
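The 51Cr-release readout is conventionally converted to percent specific lysis with the standard formula sketched below. The formula is the field's usual calculation rather than one stated explicitly in this abstract, and the counts-per-minute values are made up:

```python
def percent_specific_lysis(experimental, spontaneous, maximum):
    """Standard 51Cr-release calculation:

    %lysis = 100 * (experimental - spontaneous) / (maximum - spontaneous)

    experimental: release from targets co-cultured with NK effectors
    spontaneous:  release from targets incubated alone
    maximum:      release from fully (e.g. detergent-) lysed targets
    """
    return 100.0 * (experimental - spontaneous) / (maximum - spontaneous)

# Hypothetical counts-per-minute values for one effector:target ratio.
print(round(percent_specific_lysis(1500.0, 500.0, 4500.0), 1))  # → 25.0
```

Subtracting spontaneous release from both numerator and denominator corrects for label leaking from targets in the absence of any NK cell activity.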
Immunology, Issue 49, innate immunity, HIV-1, natural killer cell, cytolytic assay, degranulation assay, primary lymphocytes
Physical, Chemical and Biological Characterization of Six Biochars Produced for the Remediation of Contaminated Sites
Institutions: Royal Military College of Canada, Queen's University.
The physical and chemical properties of biochar vary based on feedstock sources and production conditions, making it possible to engineer biochars with specific functions (e.g. carbon sequestration, soil quality improvements, or contaminant sorption). In 2013, the International Biochar Initiative (IBI) made publicly available their Standardized Product Definition and Product Testing Guidelines (Version 1.1), which set standards for physical and chemical characteristics for biochar. Six biochars made from three different feedstocks and at two temperatures were analyzed for characteristics related to their use as a soil amendment. The protocol describes analyses of the feedstocks and biochars and includes: cation exchange capacity (CEC), specific surface area (SSA), organic carbon (OC) and moisture percentage, pH, particle size distribution, and proximate and ultimate analysis. Also described in the protocol are the analyses of the feedstocks and biochars for contaminants including polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), metals and mercury as well as nutrients (phosphorous, nitrite, nitrate and ammonium as nitrogen). The protocol also includes the biological testing procedures, earthworm avoidance and germination assays. Based on the quality assurance / quality control (QA/QC) results of blanks, duplicates, standards and reference materials, all methods were determined adequate for use with biochar and feedstock materials. All biochars and feedstocks were well within the criteria set by the IBI, and there were few differences among biochars, except in the case of the biochar produced from construction waste materials. This biochar (referred to as Old biochar) was determined to have elevated levels of arsenic, chromium, copper, and lead, and failed the earthworm avoidance and germination assays. Based on these results, Old biochar would not be appropriate for use as a soil amendment for carbon sequestration, substrate quality improvements or remediation.
Environmental Sciences, Issue 93, biochar, characterization, carbon sequestration, remediation, International Biochar Initiative (IBI), soil amendment
A Neuroscientific Approach to the Examination of Concussions in Student-Athletes
Institutions: Elon University, Elon University, Duquesne University, Elon University.
Concussions are occurring at alarming rates in the United States and have become a serious public health concern. The CDC estimates that 1.6 to 3.8 million concussions occur in sports and recreational activities annually. Concussion as defined by the 2013 Concussion Consensus Statement “may be caused either by a direct blow to the head, face, neck or elsewhere on the body with an ‘impulsive’ force transmitted to the head.” Concussions leave the individual with both short- and long-term effects. The short-term effects of sport-related concussions may include changes in playing ability, confusion, memory disturbance, loss of consciousness, slowing of reaction time, loss of coordination, headaches, dizziness, vomiting, changes in sleep patterns and mood changes. These symptoms typically resolve in a matter of days. However, while some individuals recover from a single concussion rather quickly, many experience lingering effects that can last for weeks or months. The factors governing concussion susceptibility and subsequent recovery times are not well known or understood at this time. Several factors have been suggested, including the individual’s concussion history, the severity of the initial injury, history of migraines, history of learning disabilities, history of psychiatric comorbidities, and possibly, genetic factors. Many studies have individually investigated particular factors related to the short- and long-term effects of concussions, the recovery time course, and susceptibility. What has not been clearly established is an effective multifaceted approach to concussion evaluation that would yield valuable information related to the etiology, functional changes, and recovery. The purpose of this manuscript is to show one such multifaceted approach, which examines concussions using computerized neurocognitive testing, event-related potentials, somatosensory perceptual responses, balance assessment, gait assessment and genetic testing.
Medicine, Issue 94, Concussions, Student-Athletes, Mild Traumatic Brain Injury, Genetics, Cognitive Function, Balance, Gait, Somatosensory
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3,4,5,6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
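As a minimal illustration of the simplest automated route named above (thresholding followed by labeling), the sketch below segments a synthetic 3D volume with scipy. The volume, threshold value, and function name are invented for this example; they are not drawn from the article's data sets or its custom algorithms.

```python
import numpy as np
from scipy import ndimage

def segment_volume(volume, threshold):
    """Binarize a 3D volume by intensity and label its connected foreground regions."""
    mask = volume > threshold                  # crude feature extraction by thresholding
    labels, n_regions = ndimage.label(mask)    # one integer label per connected region
    return labels, n_regions

# Synthetic volume: two bright cubic "features" in a dark background.
vol = np.zeros((20, 20, 20))
vol[2:5, 2:5, 2:5] = 1.0
vol[10:14, 10:14, 10:14] = 1.0

labels, n_regions = segment_volume(vol, threshold=0.5)
print(n_regions)  # the two synthetic features are recovered as separate regions
```

Real volumes with low signal-to-noise or crowded features defeat such a naive global threshold, which is exactly why the triage scheme above weighs data set characteristics before committing to an approach.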
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues change dramatically over development3.
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
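The keywords name minimum-norm estimation as the source analysis method; the sketch below shows the underlying regularized linear inverse step numerically. The random leadfield here stands in for one computed from an individual or age-appropriate head model, so this is a schematic of the math only, not of the London Baby Lab pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_sources = 64, 500
L = rng.standard_normal((n_channels, n_sources))  # leadfield (gain) matrix, sources -> sensors

# Simulate sensor data produced by a single active cortical source.
x_true = np.zeros(n_sources)
x_true[42] = 1.0
y = L @ x_true

# Regularized L2 minimum-norm estimate: x_hat = L^T (L L^T + lam*I)^{-1} y
lam = 1e-6
x_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_channels), y)

peak = int(np.argmax(np.abs(x_hat)))  # index of the strongest estimated source
print(peak)
```

The estimate spreads activity over many sources (the problem is underdetermined, 500 unknowns from 64 channels), but the peak sits at the simulated generator; the quality of the leadfield, hence of the head model, directly bounds this localization.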
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls.
DTI data analysis is performed in a varied fashion, i.e. voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA-maps after stereotaxic alignment, in a longitudinal analysis on an individual subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by application of a controlled elimination of gradient directions with high noise levels.
In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
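The FA metric that these analyses compare is a simple function of the diffusion tensor's eigenvalues. The worked example below shows that definition at its two extremes; the function name and inputs are illustrative, not from the article's processing chain.

```python
import numpy as np

def fractional_anisotropy(evals):
    """FA = sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda|| for tensor eigenvalues."""
    evals = np.asarray(evals, dtype=float)
    md = evals.mean()                            # mean diffusivity
    num = np.sqrt(((evals - md) ** 2).sum())     # deviation from isotropic diffusion
    den = np.sqrt((evals ** 2).sum())
    return np.sqrt(1.5) * num / den

print(fractional_anisotropy([1.0, 1.0, 1.0]))  # isotropic diffusion: FA = 0
print(fractional_anisotropy([1.0, 0.0, 0.0]))  # diffusion along one axis only: FA = 1
```

Coherent WM fiber bundles constrain water diffusion to the fiber direction, pushing FA toward 1; microstructural degeneration relaxes that constraint, which is why regional FA decreases are the group-level signature sought above.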
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16. We present an example of the application of this methodology to FDG PET data of Parkinson's Disease patients and normal controls using our in-house software to derive a characteristic covariance pattern biomarker of disease.
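The core SSM preprocessing and decomposition steps described above (log conversion, mean centering, PCA into patterns and subject scores) can be sketched numerically as follows. This is a toy on simulated data, not the authors' in-house software, and all names and dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n_subjects, n_voxels = 20, 1000
data = np.exp(rng.standard_normal((n_subjects, n_voxels)))  # positive "image" intensities

# 1) Logarithmic conversion turns global multiplicative scaling into an additive offset.
log_data = np.log(data)

# 2) Mean centering: remove each subject's global mean and the group mean image,
#    leaving each subject's residual profile.
srp = log_data - log_data.mean(axis=1, keepdims=True)
srp -= srp.mean(axis=0, keepdims=True)

# 3) PCA of the residual profiles via SVD: right singular vectors are the spatial
#    covariance patterns (GIS analogues); projections give the scalar subject scores.
U, S, Vt = np.linalg.svd(srp, full_matrices=False)
patterns = Vt       # each row: one spatially distributed covariance pattern
scores = U * S      # each column: subject scores expressing one pattern

print(patterns.shape, scores.shape)
```

In the full methodology, the scores (not the raw images) feed the downstream logistic regression, bootstrap cross-validation, and prospective scoring steps.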
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
High Efficiency Differentiation of Human Pluripotent Stem Cells to Cardiomyocytes and Characterization by Flow Cytometry
Institutions: Medical College of Wisconsin, Stanford University School of Medicine, Medical College of Wisconsin, Hong Kong University, Johns Hopkins University School of Medicine, Medical College of Wisconsin.
There is an urgent need to develop approaches for repairing the damaged heart, discovering new therapeutic drugs that do not have toxic effects on the heart, and improving strategies to accurately model heart disease. The potential of exploiting human induced pluripotent stem cell (hiPSC) technology to generate cardiac muscle “in a dish” for these applications continues to generate high enthusiasm. In recent years, the ability to efficiently generate cardiomyogenic cells from human pluripotent stem cells (hPSCs) has greatly improved, offering us new opportunities to model very early stages of human cardiac development not otherwise accessible. In contrast to many previous methods, the cardiomyocyte differentiation protocol described here does not require cell aggregation or the addition of Activin A or BMP4 and robustly generates cultures of cells that are highly positive for cardiac troponin I and T (TNNI3, TNNT2), iroquois-class homeodomain protein IRX-4 (IRX4), myosin regulatory light chain 2, ventricular/cardiac muscle isoform (MLC2v) and myosin regulatory light chain 2, atrial isoform (MLC2a) by day 10 across all human embryonic stem cell (hESC) and hiPSC lines tested to date. Cells can be passaged and maintained for more than 90 days in culture. The strategy is technically simple to implement and cost-effective. Characterization of cardiomyocytes derived from pluripotent cells often includes the analysis of reference markers, both at the mRNA and protein level. For protein analysis, flow cytometry is a powerful analytical tool for assessing quality of cells in culture and determining subpopulation homogeneity. However, technical variation in sample preparation can significantly affect quality of flow cytometry data. Thus, standardization of staining protocols should facilitate comparisons among various differentiation strategies. Accordingly, optimized staining protocols for the analysis of IRX4, MLC2v, MLC2a, TNNI3, and TNNT2 by flow cytometry are described.
Cellular Biology, Issue 91, human induced pluripotent stem cell, flow cytometry, directed differentiation, cardiomyocyte, IRX4, TNNI3, TNNT2, MLC2v, MLC2a
Analysis of Nephron Composition and Function in the Adult Zebrafish Kidney
Institutions: University of Notre Dame.
The zebrafish model has emerged as a relevant system to study kidney development, regeneration and disease. Both the embryonic and adult zebrafish kidneys are composed of functional units known as nephrons, which are highly conserved with other vertebrates, including mammals. Research in zebrafish has recently demonstrated that two distinctive phenomena transpire after adult nephrons incur damage: first, there is robust regeneration within existing nephrons that replaces the destroyed tubule epithelial cells; second, entirely new nephrons are produced from renal progenitors in a process known as neonephrogenesis. In contrast, humans and other mammals seem to have only a limited ability for nephron epithelial regeneration. To date, the mechanisms responsible for these kidney regeneration phenomena remain poorly understood. Since adult zebrafish kidneys undergo both nephron epithelial regeneration and neonephrogenesis, they provide an outstanding experimental paradigm to study these events. Further, there is a wide range of genetic and pharmacological tools available in the zebrafish model that can be used to delineate the cellular and molecular mechanisms that regulate renal regeneration. One essential aspect of such research is the evaluation of nephron structure and function. This protocol describes a set of labeling techniques that can be used to gauge renal composition and test nephron functionality in the adult zebrafish kidney. Thus, these methods are widely applicable to the future phenotypic characterization of adult zebrafish kidney injury paradigms, which include, but are not limited to, nephrotoxicant exposure regimes or genetic methods of targeted cell death such as the nitroreductase mediated cell ablation technique. Further, these methods could be used to study genetic perturbations in adult kidney formation and could also be applied to assess renal status during chronic disease modeling.
Cellular Biology, Issue 90, zebrafish, kidney, nephron, nephrology, renal, regeneration, proximal tubule, distal tubule, segment, mesonephros, physiology, acute kidney injury (AKI)
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center, Virginia Commonwealth University, Virginia Commonwealth University, Virginia Commonwealth University.
In this paper we present an automated system, based mainly on computed tomography (CT) images, that consists of two main components: midline shift estimation and intracranial pressure (ICP) pre-screening. To estimate the midline shift, an estimate of the ideal midline is first obtained from the symmetry of the skull and anatomical features in the brain CT scan. The ventricles are then segmented from the CT scan and used as a guide for the identification of the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in evaluation. In the second component, additional features related to ICP are extracted, such as texture information and blood amount from the CT scans, and other recorded features, such as age and injury severity score, are also incorporated to estimate ICP. Machine learning techniques, including feature selection and classification methods such as Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the prediction shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step to help physicians make decisions, such as recommending for or against invasive ICP monitoring.
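The article builds its ICP prediction model in RapidMiner; the sketch below shows the same idea, an SVM classifier trained on extracted features, using scikit-learn instead. The features are simulated stand-ins (not real CT-derived measurements), and the labeling rule is invented, so every name and value here is illustrative.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200
# Simulated 4-feature vectors standing in for extracted CT and clinical
# features (e.g. midline shift, blood amount, age, injury severity score).
X = rng.standard_normal((n, 4))
# Toy labeling rule: two of the features jointly drive the "elevated ICP" class.
y = (X[:, 0] + X[:, 1] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)     # RBF-kernel SVM, as in the ICP component
accuracy = clf.score(X_te, y_te)            # held-out classification accuracy
print(round(accuracy, 2))
```

Held-out accuracy, rather than training accuracy, is the appropriate measure here; a pre-screening tool is only useful if it generalizes to patients outside the training set.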
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
Spatial Multiobjective Optimization of Agricultural Conservation Practices using a SWAT Model and an Evolutionary Algorithm
Institutions: University of Washington, Iowa State University, North Carolina A&T University, Iowa Geological and Water Survey.
Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economics methods of finding the lowest-cost solution in the watershed context assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically-based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires the development of a simulation-optimization framework where the model becomes an integral part of optimization.
Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions, iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed while simultaneously minimizing the cost of conservation practices. A recent and expanding body of research attempts to use similar methods, integrating water quality models with broadly defined evolutionary optimization methods3,4,9,10,13-15,17-19,22,23,25. In this application, we demonstrate a program which follows Rabotyagov et al.'s approach and integrates the modern and commonly used SWAT water quality model7 with the multiobjective evolutionary algorithm SPEA226, and a user-specified set of conservation practices and their costs, to search for the complete tradeoff frontiers between the costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for the selection of watershed configurations achieving specified water quality improvement goals and the production of maps of optimized placement of conservation practices.
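The selection/recombination/mutation loop described above can be sketched as a toy: here the SWAT model is replaced by a trivial additive pollution function and SPEA2 by a bare-bones nondominated-selection loop, so all numbers, names, and operators are illustrative stand-ins, not the article's program.

```python
import numpy as np

rng = np.random.default_rng(0)
n_fields, pop_size = 12, 40
cost = rng.uniform(1.0, 5.0, n_fields)       # cost of installing a practice on each field
abatement = rng.uniform(1.0, 5.0, n_fields)  # pollution removed by that practice
base_load = abatement.sum()                  # watershed load with no practices installed

def objectives(x):
    """(total cost, remaining pollution) for a 0/1 allocation vector; minimize both."""
    return float(cost @ x), float(base_load - abatement @ x)

def nondominated(objs):
    """Indices of candidates not strictly dominated by any other candidate."""
    keep = []
    for i, (c1, p1) in enumerate(objs):
        if not any(c2 <= c1 and p2 <= p1 and (c2 < c1 or p2 < p1)
                   for j, (c2, p2) in enumerate(objs) if j != i):
            keep.append(i)
    return keep

pop = rng.integers(0, 2, (pop_size, n_fields))   # random initial allocations
for _ in range(50):
    objs = [objectives(x) for x in pop]
    parents = pop[nondominated(objs)]            # selection: keep the current front
    children = []
    while len(parents) + len(children) < pop_size:
        a = parents[rng.integers(len(parents))]
        b = parents[rng.integers(len(parents))]
        cut = rng.integers(1, n_fields)
        child = np.concatenate([a[:cut], b[cut:]])   # one-point recombination
        flip = rng.random(n_fields) < 0.05           # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.vstack([parents] + children) if children else parents

frontier = sorted({objectives(x) for x in pop[nondominated([objectives(x) for x in pop])]})
print(len(frontier))  # number of distinct points on the approximated tradeoff frontier
```

Each point on `frontier` pairs a total cost with the best remaining pollution found at that cost, which is exactly the cost/water-quality tradeoff curve the full SWAT-SPEA2 program produces at watershed scale.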
Environmental Sciences, Issue 70, Plant Biology, Civil Engineering, Forest Sciences, Water quality, multiobjective optimization, evolutionary algorithms, cost efficiency, agriculture, development
Predicting the Effectiveness of Population Replacement Strategy Using Mathematical Modeling
Institutions: University of California, Los Angeles.
Charles Taylor and John Marshall explain the utility of mathematical modeling for evaluating the effectiveness of population replacement strategy. Insight is given into how computational models can provide information on the population dynamics of mosquitoes and the spread of transposable elements through A. gambiae subspecies. The ethical considerations of releasing genetically modified mosquitoes into the wild are discussed.
Cellular Biology, Issue 5, mosquito, malaria, population, replacement, modeling, infectious disease
Basics of Multivariate Analysis in Neuroimaging Data
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise, techniques1,5,6,7,8,9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice. A conceptual introduction is followed with a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
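The statistical-power argument above can be made concrete with a toy simulation (not the ADNI data): when a group difference is spread weakly across many voxels, a single score summarizing the whole spatial pattern separates the groups better than the best individual voxel does. All names and values here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_per_group, n_voxels = 30, 100
effect = 0.3                                # small per-voxel group difference
patients = rng.standard_normal((n_per_group, n_voxels)) + effect
controls = rng.standard_normal((n_per_group, n_voxels))

def cohens_d(a, b):
    """Standardized mean difference between two samples."""
    pooled = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    return (a.mean() - b.mean()) / pooled

# Univariate view: the best effect size found by testing voxels one at a time.
d_uni = max(abs(cohens_d(patients[:, v], controls[:, v])) for v in range(n_voxels))

# Multivariate view: project every subject onto the group mean-difference pattern
# and compare the resulting scalar scores. (Deriving the pattern from the same
# data inflates the effect; in practice, as the text notes, scores are applied
# prospectively to independent datasets.)
w = patients.mean(axis=0) - controls.mean(axis=0)
d_multi = abs(cohens_d(patients @ w, controls @ w))

print(d_uni < d_multi)
```

The same pooling of weak distributed signal into one score is also what spares multivariate analyses the stringent voxel-wise multiple-comparison corrections mentioned above.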
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
Population Replacement Strategies for Controlling Vector Populations and the Use of Wolbachia pipientis for Genetic Drive
Institutions: Johns Hopkins University.
In this video, Jason Rasgon discusses population replacement strategies to control vector-borne diseases such as malaria and dengue. "Population replacement" is the replacement of wild vector populations (that are competent to transmit pathogens) with those that are not competent to transmit pathogens. There are several theoretical strategies to accomplish this. One is to exploit the maternally inherited symbiotic bacterium Wolbachia pipientis. Wolbachia is a widespread reproductive parasite that spreads in a selfish manner at the expense of its host's fitness. Jason Rasgon discusses, in detail, the basic biology of this bacterial symbiont and various ways to use it for control of vector-borne diseases.
Cellular Biology, Issue 5, mosquito, malaria, genetics, infectious disease, Wolbachia
Interview: Glycolipid Antigen Presentation by CD1d and the Therapeutic Potential of NKT cell Activation
Institutions: La Jolla Institute for Allergy and Immunology.
Natural Killer T (NKT) cells are critical determinants of the immune response to cancer, regulation of autoimmune disease, clearance of infectious agents, and the development of atherosclerotic plaques. In this interview, Mitch Kronenberg discusses his laboratory's efforts to understand the mechanism through which NKT cells are activated by glycolipid antigens. Central to these studies is CD1d - the antigen-presenting molecule that presents glycolipids to NKT cells. The advent of CD1d tetramer technology, a technique developed by the Kronenberg lab, is critical for the sorting and identification of subsets of specific glycolipid-reactive T cells. Mitch explains how glycolipid agonists are being used as therapeutic agents to activate NKT cells in cancer patients and how CD1d tetramers can be used to assess the state of the NKT cell population in vivo following glycolipid agonist therapy. The current status of ongoing clinical trials using these agonists is discussed, as well as Mitch's predictions for areas in the field of immunology that will have emerging importance in the near future.
Immunology, Issue 10, Natural Killer T cells, NKT cells, CD1 Tetramers, antigen presentation, glycolipid antigens, CD1d, Mucosal Immunity, Translational Research
Ole Isacson: Development of New Therapies for Parkinson's Disease
Institutions: Harvard Medical School.
Medicine, Issue 3, Parkinson's disease, Neuroscience, dopamine, neuron, L-DOPA, stem cell, transplantation