Hepatitis C Virus (HCV) affects 3% of the world's population and causes serious liver ailments including chronic hepatitis, cirrhosis, and hepatocellular carcinoma. HCV is an enveloped RNA virus belonging to the family Flaviviridae. Current treatment is not fully effective and causes adverse side effects, and no HCV vaccine is available. Thus, continued effort is required to develop a vaccine and better therapy. An HCV cell culture system is critical for studying the various stages of HCV growth, including viral entry, genome replication, packaging, and egress. In the procedure presented here, we used a wild-type intragenotype 2a chimeric virus, FNX-HCV, and a recombinant FNX-Rluc virus carrying a Renilla luciferase reporter gene to study virus replication. A human hepatoma cell line (Huh-7 based) was used for transfection of in vitro transcribed HCV genomic RNAs. Cell-free culture supernatants, protein lysates, and total RNA were harvested at various time points post-transfection to assess HCV growth. HCV genome replication status was evaluated by quantitative RT-PCR and by visualizing the presence of HCV double-stranded RNA. HCV protein expression was verified by Western blot and immunofluorescence assays using antibodies specific for the HCV NS3 and NS5A proteins. HCV RNA-transfected cells released infectious particles into the culture supernatant, and the viral titer was measured. Luciferase assays were utilized to assess the replication level and infectivity of the reporter HCV. In conclusion, we present various virological assays for characterizing different stages of the HCV replication cycle.
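The qRT-PCR readout described above is typically expressed as a relative quantity. As an illustrative sketch only (not the authors' actual analysis pipeline; the Ct values and the choice of reference gene are hypothetical), the widely used 2^-ΔΔCt method can be computed as:

```python
def delta_delta_ct(ct_target, ct_ref, ct_target_cal, ct_ref_cal):
    """Relative HCV RNA level by the 2^-ddCt method.
    ct_target: Ct of HCV RNA at the time point of interest
    ct_ref:    Ct of a housekeeping gene at that time point
    *_cal:     the same values for the calibrator sample
               (e.g. an early post-transfection time point)"""
    d_ct_sample = ct_target - ct_ref
    d_ct_cal = ct_target_cal - ct_ref_cal
    return 2.0 ** -(d_ct_sample - d_ct_cal)

# Example: HCV Ct drops from 28 to 22 while the reference gene stays constant,
# i.e. a 64-fold increase in HCV RNA relative to the calibrator.
fold = delta_delta_ct(22.0, 18.0, 28.0, 18.0)  # → 64.0
```

A lower Ct means more template, so a falling HCV Ct against a stable reference indicates active genome replication.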
Fecal Microbiota Transplantation via Colonoscopy for Recurrent C. difficile Infection
Institutions: Brigham and Women's Hospital.
Fecal Microbiota Transplantation (FMT) is a safe and highly effective treatment for recurrent and refractory C. difficile infection (CDI). Various methods of FMT administration have been reported in the literature, including nasogastric tube, upper endoscopy, enema, and colonoscopy. FMT via colonoscopy yields excellent cure rates and is well tolerated; we have found that patients consider it an acceptable mode of delivery. At our Center, we have initiated a fecal transplant program for patients with recurrent or refractory CDI. We have developed a protocol using an iterative process of revision and have performed 24 fecal transplants on 22 patients, with success rates comparable to the current published literature. A systematic approach to patient and donor screening, preparation of stool, and delivery of the stool maximizes therapeutic success. Here we detail each step of the FMT protocol, which can be carried out at any endoscopy center with a high degree of safety and success.
Immunology, Issue 94, C.difficile, colonoscopy, fecal transplant, stool, diarrhea, microbiota
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) contrast the effects of the proposed technique and that of using intact limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring inertial properties of known geometrical solids. When direct measurements of inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity compared with inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia property estimates on study outcomes. For stance, either one of the two inertial models investigated in our study would likely lead to similar outcomes with an inverse dynamics assessment.
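The oscillation technique rests on the compound-pendulum relation between a segment's oscillation period and its moment of inertia. A minimal sketch of that calculation (the apparatus-specific correction terms of an actual oscillation rig are omitted, and the numeric values are illustrative only):

```python
import math

def inertia_from_oscillation(period_s, mass_kg, dist_pivot_to_com_m, g=9.81):
    """Moment of inertia of a suspended prosthesis from its oscillation period.
    Compound-pendulum relation: T = 2*pi*sqrt(I_pivot / (m*g*d)), so
    I_pivot = m*g*d*T^2 / (4*pi^2); the parallel-axis theorem then gives the
    inertia about the center of mass (COM)."""
    i_pivot = mass_kg * g * dist_pivot_to_com_m * period_s ** 2 / (4 * math.pi ** 2)
    i_com = i_pivot - mass_kg * dist_pivot_to_com_m ** 2
    return i_com

# Hypothetical prosthesis: 1.5 kg, COM 0.25 m below the pivot, 1.2 s period
i_com = inertia_from_oscillation(1.2, 1.5, 0.25)  # kg*m^2 about the COM
```

The mass and COM location themselves would come from a scale and the reaction-board measurement, respectively.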
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
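A time-stamped event record of the kind described can be summarized in a few lines. The sketch below uses Python rather than the authors' MATLAB-based toolchain, and the event codes and data are hypothetical:

```python
from collections import Counter

# Hypothetical event codes; the real system defines its own event dictionary.
POKE_LEFT, POKE_RIGHT, PELLET = 1, 2, 3

def summarize(events):
    """events: time-sorted list of (timestamp_s, code) tuples.
    Returns event counts per code and the inter-poke intervals, leaving the
    raw event trail intact so every summary statistic can be traced back."""
    counts = Counter(code for _, code in events)
    poke_times = [t for t, c in events if c in (POKE_LEFT, POKE_RIGHT)]
    intervals = [b - a for a, b in zip(poke_times, poke_times[1:])]
    return counts, intervals

events = [(0.0, POKE_LEFT), (1.5, POKE_RIGHT), (1.6, PELLET), (4.0, POKE_LEFT)]
counts, intervals = summarize(events)  # counts[POKE_LEFT] == 2, intervals == [1.5, 2.5]
```

Keeping raw (timestamp, code) tuples as the single source of truth is what preserves the full data trail from raw events through to published statistics.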
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
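To make the DoE idea concrete, the sketch below enumerates a two-level full factorial design. The factor names and levels are hypothetical, and the study itself used software-guided optimal designs with step-wise augmentation rather than a plain full factorial:

```python
from itertools import product

# Hypothetical two-level factors for a transient-expression experiment.
factors = {
    "promoter": ["35S", "double-35S"],
    "leaf_age": ["young", "old"],
    "incubation_temp_C": [22, 25],
}

# Full factorial: every combination of factor levels (2^3 = 8 runs here).
runs = [dict(zip(factors, combo)) for combo in product(*factors.values())]
print(len(runs))  # → 8
```

The run count grows exponentially with the number of factors, which is exactly why the modular, software-guided designs described above are preferred for larger factor sets.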
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Studying Food Reward and Motivation in Humans
Institutions: University of Cambridge, University of Cambridge, University of Cambridge, Addenbrooke's Hospital.
A key challenge in studying reward processing in humans is to go beyond subjective self-report measures and quantify different aspects of reward, such as hedonics, motivation, and goal value, in more objective ways. This is particularly relevant for the understanding of overeating and obesity as well as their potential treatments. This paper describes a set of measures of food-related motivation using handgrip force as a motivational readout. These methods can be used to examine changes in food-related motivation with metabolic (satiety) and pharmacological manipulations, and to evaluate interventions targeted at overeating and obesity. However, to understand food-related decision making in a complex food environment, it is essential to ascertain the reward goal values that guide people's decisions and behavioral choices. These values are hidden, but they can be ascertained more objectively using metrics such as willingness to pay, and a method for this is described. Both sets of methods provide quantitative measures of motivation and goal value that can be compared within and between individuals.
Behavior, Issue 85, Food reward, motivation, grip force, willingness to pay, subliminal motivation
Laboratory-determined Phosphorus Flux from Lake Sediments as a Measure of Internal Phosphorus Loading
Institutions: Grand Valley State University.
Eutrophication is a water quality issue in lakes worldwide, and there is a critical need to identify and control nutrient sources. Internal phosphorus (P) loading from lake sediments can account for a substantial portion of the total P load in eutrophic, and some mesotrophic, lakes. Laboratory determination of P release rates from sediment cores is one approach for determining the role of internal P loading and guiding management decisions. Two principal alternatives to experimental determination of sediment P release exist for estimating internal load: in situ measurements of changes in hypolimnetic P over time, and P mass balance. The experimental approach using laboratory-based sediment incubations to quantify internal P load is a direct method, making it a valuable tool for lake management and restoration.
Laboratory incubations of sediment cores can help determine the relative importance of internal vs. external P loads, as well as be used to answer a variety of lake management and research questions. We illustrate the use of sediment core incubations to assess the effectiveness of an aluminum sulfate (alum) treatment for reducing sediment P release. Other research questions that can be investigated using this approach include the effects of sediment resuspension and bioturbation on P release.
The approach also has limitations. Assumptions must be made with respect to: extrapolating results from sediment cores to the entire lake; deciding over what time periods to measure nutrient release; and addressing possible core tube artifacts. A comprehensive dissolved oxygen monitoring strategy to assess temporal and spatial redox status in the lake provides greater confidence in annual P loads estimated from sediment core incubations.
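The core incubation reduces to a simple flux calculation: the slope of P concentration in the overlying water over time, scaled by the water volume and sediment surface area. A minimal sketch (the values are illustrative, and real protocols also correct for the volume removed at each sampling):

```python
def p_release_rate(conc_mg_L, times_d, water_vol_L, core_area_m2):
    """Sediment P release rate (mg P m^-2 d^-1) from a core incubation.
    Fits an ordinary least-squares slope to soluble P concentration vs time
    in the overlying water, then scales by water volume and sediment area."""
    n = len(times_d)
    t_mean = sum(times_d) / n
    c_mean = sum(conc_mg_L) / n
    slope = (sum((t - t_mean) * (c - c_mean) for t, c in zip(times_d, conc_mg_L))
             / sum((t - t_mean) ** 2 for t in times_d))  # mg L^-1 d^-1
    return slope * water_vol_L / core_area_m2

# Example: P rising 0.01 mg/L per day in a 2 L water column over a 0.005 m^2 core
rate = p_release_rate([0.02, 0.03, 0.04, 0.05], [0, 1, 2, 3], 2.0, 0.005)
# ≈ 4.0 mg P m^-2 d^-1
```

A positive slope indicates net release from the sediment; a near-zero or negative slope after an alum treatment is the signature of effective P inactivation.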
Environmental Sciences, Issue 85, Limnology, internal loading, eutrophication, nutrient flux, sediment coring, phosphorus, lakes
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood [33]. To help improve this understanding, proton magnetic resonance spectroscopy (¹H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner [41]. In fact, a recent study demonstrated that ¹H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration [34]. This article describes the complete protocol for combining tDCS (NeuroConn MR-compatible stimulator) with ¹H-MRS at 3 T using a MEGA-PRESS sequence. We describe the impact of a protocol that has shown great promise for the treatment of motor dysfunction after stroke, which consists of bilateral stimulation of the primary motor cortices [27,30,31]. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
Tumor Treating Field Therapy in Combination with Bevacizumab for the Treatment of Recurrent Glioblastoma
Institutions: Southern Illinois University School of Medicine.
A novel device that employs TTF therapy has recently been developed and is currently in use for the treatment of recurrent glioblastoma (rGBM). It was FDA approved in April 2011 for the treatment of patients 22 years or older with rGBM. The device delivers alternating electric fields and is programmed to ensure maximal tumor cell kill [1]. Glioblastoma is the most common type of glioma and has an estimated incidence of approximately 10,000 new cases per year in the United States alone [2]. This tumor is particularly resistant to treatment and is uniformly fatal, especially in the recurrent setting [3-5]. Prior to the approval of the TTF System, the only FDA-approved treatment for rGBM was bevacizumab [6]. Bevacizumab is a humanized monoclonal antibody targeted against the vascular endothelial growth factor (VEGF) protein that drives tumor angiogenesis [7]. By blocking the VEGF pathway, bevacizumab can produce a significant radiographic response (pseudoresponse), improve progression-free survival, and reduce corticosteroid requirements in rGBM patients [8,9]. Bevacizumab, however, failed to prolong overall survival in a recent phase III trial [26]. A pivotal phase III trial (EF-11) demonstrated comparable overall survival between physicians' choice chemotherapy and TTF Therapy, but better quality of life was observed in the TTF arm [10].
There is currently an unmet need for novel approaches designed to prolong overall survival and/or improve quality of life in this unfortunate patient population. One appealing approach is to combine the two currently approved treatment modalities, namely bevacizumab and TTF Therapy. These two treatments are currently approved as monotherapy [11,12], but their combination has never been evaluated in a clinical trial. We have developed an approach for combining these two treatment modalities and have treated 2 rGBM patients. Here we describe a detailed methodology outlining this novel treatment protocol and present representative data from one of the treated patients.
Medicine, Issue 92, Tumor Treating Fields, TTF System, TTF Therapy, Recurrent Glioblastoma, Bevacizumab, Brain Tumor
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles, in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
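As a toy illustration of the semi-automated category (3), the sketch below binarizes a 2D slice by an intensity threshold and labels connected components by flood fill. Real EM segmentation operates on 3D volumes with dedicated software; the data and threshold here are hypothetical:

```python
def threshold_and_label(image, thresh):
    """Minimal segmentation sketch: binarize a 2D slice by intensity
    threshold, then label 4-connected foreground components via flood fill.
    Returns the label image and the number of components found."""
    h, w = len(image), len(image[0])
    labels = [[0] * w for _ in range(h)]
    n_components = 0
    for i in range(h):
        for j in range(w):
            if image[i][j] >= thresh and labels[i][j] == 0:
                n_components += 1
                stack = [(i, j)]
                while stack:  # iterative flood fill of one component
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and image[y][x] >= thresh \
                            and labels[y][x] == 0:
                        labels[y][x] = n_components
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return labels, n_components

slice_ = [[0, 9, 0],
          [0, 9, 0],
          [8, 0, 8]]
labels, n = threshold_and_label(slice_, 5)  # n == 3 separate features
```

Whether such simple thresholding suffices depends on exactly the data-set characteristics listed above, especially signal-to-noise ratio and feature crowdedness.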
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues changes dramatically over development [3].
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
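Source reconstruction of the kind described often relies on a regularized minimum-norm inverse, s_hat = Lᵀ(LLᵀ + λI)⁻¹y, where L is the leadfield from the head model. The toy sketch below solves this for a two-sensor leadfield in pure Python; the leadfield values and regularization are hypothetical, and real pipelines use individual head models with thousands of sources:

```python
def minimum_norm_estimate(L, y, lam):
    """Toy minimum-norm inverse for 2 sensors x N sources:
    s_hat = L^T (L L^T + lam*I)^-1 y, with the 2x2 Gram system
    solved by hand (Cramer's rule)."""
    g00 = sum(a * a for a in L[0]) + lam          # Gram matrix G = L L^T + lam*I
    g01 = sum(a * b for a, b in zip(L[0], L[1]))
    g11 = sum(b * b for b in L[1]) + lam
    det = g00 * g11 - g01 * g01
    w0 = (g11 * y[0] - g01 * y[1]) / det          # (G^-1 y)[0]
    w1 = (g00 * y[1] - g01 * y[0]) / det          # (G^-1 y)[1]
    return [L[0][j] * w0 + L[1][j] * w1 for j in range(len(L[0]))]

# Hypothetical leadfield mapping 3 cortical sources to 2 scalp sensors
L_ = [[1.0, 0.0, 0.5],
      [0.0, 1.0, 0.5]]
s = minimum_norm_estimate(L_, [1.0, 0.0], lam=0.1)  # strongest at source 0
```

The quality of L, and hence of s_hat, is exactly what the individual or age-specific head models discussed above are meant to improve.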
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Propagation of Homalodisca coagulata virus-01 via Homalodisca vitripennis Cell Culture
Institutions: University of Texas at Tyler, USDA ARS.
The glassy-winged sharpshooter (Homalodisca vitripennis) is a highly vagile and polyphagous insect found throughout the southwestern United States. These insects are the predominant vectors of Xylella fastidiosa (X. fastidiosa), a xylem-limited bacterium that is the causal agent of Pierce's disease (PD) of grapevine. Pierce's disease is economically damaging; thus, H. vitripennis has become a target for pathogen management strategies. A dicistrovirus identified as Homalodisca coagulata virus-01 (HoCV-01) has been associated with increased mortality in H. vitripennis populations. Because a host cell is required for HoCV-01 replication, cell culture provides a uniform environment for targeted replication that is logistically and economically valuable for biopesticide production. In this study, a system for large-scale propagation of H. vitripennis cells via tissue culture was developed, providing a viral replication mechanism. HoCV-01 was extracted from whole-body insects and used to inoculate cultured H. vitripennis cells at varying levels. The culture medium was removed every 24 hr for 168 hr, and RNA was extracted and analyzed with qRT-PCR. Cells were stained with trypan blue and counted to quantify cell survivability using light microscopy. Whole virus particles were extracted up to 96 hr after infection, the time point determined to precede total cell culture collapse. Cells were also subjected to fluorescent staining and viewed using confocal microscopy to investigate viral activity on F-actin attachment and nuclei integrity. The conclusion of this study is that H. vitripennis cells are capable of being cultured and used for mass production of HoCV-01 at a level suitable for production of a biopesticide.
Infection, Issue 91, Homalodisca vitripennis, Homalodisca coagulata virus-01, cell culture, Pierce’s disease of grapevine, Xylella fastidiosa, Dicistroviridae
The Goeckerman Regimen for the Treatment of Moderate to Severe Psoriasis
Institutions: University of Southern California, University of California, San Francisco , University of California Irvine School of Medicine, University of Arizona College of Medicine, Chicago College of Osteopathic Medicine.
Psoriasis is a chronic, immune-mediated inflammatory skin disease affecting approximately 2-3% of the population. The Goeckerman regimen consists of exposure to ultraviolet B (UVB) light and application of crude coal tar (CCT). Goeckerman therapy is extremely effective and relatively safe for the treatment of psoriasis and for improving a patient's quality of life. In the following article, we present our protocol for the Goeckerman therapy that is utilized specifically at the University of California, San Francisco. This protocol details the preparation of supplies, administration of phototherapy and application of topical tar. This protocol also describes how to assess the patient daily, monitor for adverse effects (including pruritus and burning), and adjust the treatment based on the patient's response. Though it is one of the oldest therapies available for psoriasis, there is an absence of any published videos demonstrating the process in detail. The video is beneficial for healthcare providers who want to administer the therapy, for trainees who want to learn more about the process, and for prospective patients who want to undergo treatment for their cutaneous disease.
Medicine, Issue 77, Infection, Biomedical Engineering, Anatomy, Physiology, Immunology, Dermatology, Skin, Dermis, Epidermis, Skin Diseases, Skin Diseases, Eczematous, Goeckerman, Crude Coal Tar, phototherapy, psoriasis, Eczema, Goeckerman regimen, clinical techniques
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls.
DTI data analysis is performed in a variate fashion, i.e. voxelwise comparison of regional diffusion direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level in order to identify differences in FA along WM structures, aiming at the definition of regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA maps after stereotaxic alignment, in a longitudinal analysis on an individual subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by application of a controlled elimination of gradient directions with high noise levels.
In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
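Fractional anisotropy, the central voxelwise metric above, is computed from the three eigenvalues of the diffusion tensor. A minimal sketch of the standard formula:

```python
import math

def fractional_anisotropy(l1, l2, l3):
    """Fractional anisotropy from the diffusion-tensor eigenvalues:
    FA = sqrt(3/2) * ||lambda - mean(lambda)|| / ||lambda||,
    ranging from 0 (isotropic) to 1 (fully anisotropic)."""
    m = (l1 + l2 + l3) / 3.0
    num = math.sqrt((l1 - m) ** 2 + (l2 - m) ** 2 + (l3 - m) ** 2)
    den = math.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return math.sqrt(1.5) * num / den

fractional_anisotropy(1.0, 1.0, 1.0)  # isotropic diffusion → 0.0
fractional_anisotropy(1.7, 0.3, 0.3)  # elongated, fiber-like profile → ~0.80
```

High FA along a coherent WM tract and reduced FA in degenerating tissue is what the voxelwise and tractwise (TFAS) comparisons described above are designed to detect.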
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
High-density EEG Recordings of the Freely Moving Mice using Polyimide-based Microelectrode
Institutions: Korea Institute of Science and Technology (KIST), University of Science and Technology, Korea Advanced Nano Fab Center.
Electroencephalogram (EEG) indicates the averaged electrical activity of neuronal populations on a large-scale level. It is widely utilized as a noninvasive brain monitoring tool in cognitive neuroscience, as well as a diagnostic tool for epilepsy and sleep disorders in neurology. However, the underlying mechanism of EEG rhythm generation remains poorly understood. The recently introduced polyimide-based microelectrode array (PBM-array) for high-resolution mouse EEG [1] is one attempt to answer neurophysiological questions about EEG signals, building on the rich genetic resources that the mouse model offers for analyzing the complex process of EEG generation. Application of the nanofabricated PBM-array to the mouse skull is an efficient means of collecting large-scale brain activity from transgenic mice and helps identify the neural correlates of particular EEG rhythms in conjunction with behavior. However, its ultra-thin profile and bifurcated structure make handling and implantation of the PBM-array difficult. In the presented video, the preparation and surgery steps for the implantation of a PBM-array on a mouse skull are described step by step. Handling and surgery tips to help researchers succeed in implantation are also provided.
Neuroscience, Issue 47, Electroencephalography (EEG), Mouse, Microelectrode, Brain Imaging
A Protocol for Computer-Based Protein Structure and Function Prediction
Institutions: University of Michigan , University of Kansas.
Genome sequencing projects have deciphered millions of protein sequences, which require knowledge of their structure and function to improve the understanding of their biological role. Although experimental methods can provide detailed information for a small fraction of these proteins, computational modeling is needed for the majority of protein molecules that are experimentally uncharacterized. The I-TASSER server is an on-line workbench for high-resolution modeling of protein structure and function. Given a protein sequence, a typical output from the I-TASSER server includes secondary structure prediction, predicted solvent accessibility of each residue, homologous template proteins detected by threading and structure alignments, up to five full-length tertiary structural models, and structure-based functional annotations for enzyme classification, Gene Ontology terms, and protein-ligand binding sites. All the predictions are tagged with a confidence score that indicates how accurate the predictions are without knowing the experimental data. To accommodate special requests from end users, the server provides channels to accept user-specified inter-residue distance and contact maps to interactively guide I-TASSER modeling; it also allows users to specify any protein as a template, or to exclude any template proteins during the structure assembly simulations. This structural information can be supplied by users based on experimental evidence or biological insight with the purpose of improving the quality of I-TASSER predictions. The server was evaluated as one of the best programs for protein structure and function prediction in the recent community-wide CASP experiments. There are currently >20,000 registered scientists from over 100 countries using the on-line I-TASSER server.
Biochemistry, Issue 57, On-line server, I-TASSER, protein structure prediction, function prediction
The CYP2D6 Animal Model: How to Induce Autoimmune Hepatitis in Mice
Institutions: Goethe University Hospital Frankfurt.
Autoimmune hepatitis is a rare but life-threatening autoimmune disease of the liver of unknown etiology1,2
. In the past many attempts have been made to generate an animal model that reflects the characteristics of the human disease3-5
. However, in various models the induction of disease was rather complex and often hepatitis was only transient3-5
. Therefore, we have developed a straightforward mouse model that uses the major human autoantigen in type 2 autoimmune hepatitis (AIH-2), namely hCYP2D6, as a trigger6
. Type 1 liver-kidney microsomal (LKM-1) antibodies recognizing hCYP2D6 are the hallmark of AIH-27,8
. hCYP2D6 was delivered into wild-type FVB or C57BL/6 mice with an adenovirus construct (Ad-2D6) that ensures direct delivery of the triggering antigen to the liver. The ensuing local inflammation thus generates a fertile field9
for the subsequent development of autoimmunity. A combination of intravenous and intraperitoneal injection of Ad-2D6 is the most effective route to induce long-lasting autoimmune damage to the liver (section 1). Here we provide a detailed protocol on how autoimmune liver disease is induced in the CYP2D6 model and how the different aspects of liver damage can be assessed. First, the serum levels of markers indicating hepatocyte destruction, such as aminotransferases, as well as the titers of hCYP2D6 antibodies, are determined by sampling blood retroorbitally (section 2). Second, the hCYP2D6-specific T cell response is characterized by collecting lymphocytes from the spleen and the liver. In order to obtain pure liver lymphocytes, the livers are perfused with PBS via the portal vein (section 3), digested with collagenase and purified over a Percoll gradient (section 4). The frequency of hCYP2D6-specific T cells is analyzed by stimulation with hCYP2D6 peptides and identification of IFNγ-producing cells by flow cytometry (section 5). Third, cellular infiltration and fibrosis are determined by immunohistochemistry of liver sections (section 6). Such an analysis regimen has to be conducted at several time points after initiation of the disease in order to prove the chronic nature of the model. The magnitude of the immune response, characterized by the frequency and activity of hCYP2D6-specific T and/or B cells, and the degree of liver damage and fibrosis have to be assessed for a subsequent evaluation of possible treatments to prevent, delay or abrogate the autodestructive process of the liver.
Medicine, Issue 60, autoimmunity, liver, autoantigen, fibrosis, perfusion
Measuring the Subjective Value of Risky and Ambiguous Options using Experimental Economics and Functional MRI Methods
Institutions: Yale School of Medicine, Yale School of Medicine, New York University , New York University , New York University .
Most of the choices we make have uncertain consequences. In some cases the probabilities for different possible outcomes are precisely known, a condition termed "risky". In other cases probabilities cannot be estimated, a condition termed "ambiguous". While most people are averse to both risk and ambiguity1,2
, the degree of those aversions varies substantially across individuals, such that the subjective value
of the same risky or ambiguous option can be very different for different individuals. We combine functional MRI (fMRI) with an experimental economics-based method3
to assess the neural representation of the subjective values of risky and ambiguous options4
. This technique can now be used to study these neural representations in different populations, such as different age groups and different patient populations.
In our experiment, subjects make consequential choices between two alternatives while their neural activation is tracked using fMRI. On each trial subjects choose between lotteries that vary in their monetary amount and in either the probability of winning that amount or the ambiguity level associated with winning. Our parametric design allows us to use each individual's choice behavior to estimate their attitudes towards risk and ambiguity, and thus to estimate the subjective values that each option held for them. Another important feature of the design is that the outcome of the chosen lottery is not revealed during the experiment, so that no learning can take place, and thus the ambiguous options remain ambiguous and risk attitudes are stable. Instead, at the end of the scanning session one or a few trials are randomly selected and played for real money. Since subjects do not know beforehand which trials will be selected, they must treat each and every trial as if it and it alone were the one trial on which they will be paid. This design ensures that we can estimate the true subjective value of each option to each subject. We then look for areas in the brain whose activation is correlated with the subjective value of risky options and for areas whose activation is correlated with the subjective value of ambiguous options.
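The parametric estimation described above can be illustrated with one commonly used specification for valuing risky and ambiguous lotteries: power utility for risk attitude plus a linear penalty on the ambiguity level. The functional form and parameter names below are assumptions for illustration; the model actually fitted in the study may differ:

```python
def subjective_value(amount, p, ambiguity, alpha, beta):
    """Subjective value of a lottery under an assumed specification.

    amount    : monetary payoff of the lottery
    p         : objective winning probability (center of the range)
    ambiguity : fraction of the probability range that is unknown
                (0 = purely risky option)
    alpha     : risk attitude (alpha < 1 -> risk averse)
    beta      : ambiguity attitude (beta > 0 -> ambiguity averse)
    """
    # Ambiguity shrinks the effective winning probability; the payoff
    # is valued through a power utility function.
    return (p - beta * ambiguity / 2.0) * amount ** alpha

# A risk- and ambiguity-neutral subject (alpha=1, beta=0) values a
# 50% chance of $20 at its expected value:
print(subjective_value(20, 0.5, 0, 1, 0))  # 10.0
```

Fitting alpha and beta to each subject's observed choices is what lets each option's subjective value enter the fMRI analysis as a trial-by-trial regressor.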
Neuroscience, Issue 67, Medicine, Molecular Biology, fMRI, magnetic resonance imaging, decision-making, value, uncertainty, risk, ambiguity
Visualization and Analysis of Blood Flow and Oxygen Consumption in Hepatic Microcirculation: Application to an Acute Hepatitis Model
Institutions: Keio University, Keio University, Japan Science and Technology Agency (JST).
There is a considerable discrepancy between oxygen supply and demand in the liver because hepatic oxygen consumption is relatively high but about 70% of the hepatic blood supply is poorly oxygenated portal vein blood derived from the gastrointestinal tract and spleen. Oxygen is delivered to hepatocytes by blood flowing from a terminal branch of the portal vein to a central venule via sinusoids, which creates an oxygen gradient in hepatic lobules. The oxygen gradient is an important physical parameter that affects the expression of enzymes upstream and downstream in the hepatic microcirculation, but the lack of techniques for measuring oxygen consumption in the hepatic microcirculation has delayed the elucidation of mechanisms relating to oxygen metabolism in the liver. We therefore used FITC-labeled erythrocytes to visualize the hepatic microcirculation and used laser-assisted phosphorimetry to measure the partial pressure of oxygen in the microvessels there. Noncontact and continuous optical measurement can quantify blood flow velocities, vessel diameters, and oxygen gradients related to oxygen consumption in the liver. In an acute hepatitis model, induced by administering acetaminophen to mice, we observed increased oxygen pressure in both portal and central venules but a decreased oxygen gradient in the sinusoids, indicating that hepatocyte necrosis in the pericentral zone could shift the oxygen pressure up and affect enzyme expression in the periportal zone. In conclusion, our optical methods for measuring hepatic hemodynamics and oxygen consumption can reveal mechanisms related to hepatic disease.
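Phosphorimetry recovers oxygen partial pressure from how strongly oxygen quenches a probe's phosphorescence lifetime, via the standard Stern-Volmer relation 1/τ = 1/τ0 + kq·pO2. A minimal sketch of that conversion is shown below; the probe constants in the usage example are illustrative placeholders, not the calibration values used in this study:

```python
def po2_from_lifetime(tau, tau0, kq):
    """Oxygen partial pressure from phosphorescence lifetime via the
    Stern-Volmer relation: 1/tau = 1/tau0 + kq * pO2.

    tau  : measured phosphorescence lifetime (s)
    tau0 : lifetime at zero oxygen (s), probe-specific calibration
    kq   : quenching constant (1 / (s * mmHg)), probe-specific
    """
    return (1.0 / tau - 1.0 / tau0) / kq

# Illustrative numbers only: a lifetime half of tau0 implies a
# quenching rate equal to the zero-oxygen decay rate.
print(po2_from_lifetime(300e-6, 600e-6, 300.0))  # ~5.56 (mmHg)
```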
Medicine, Issue 66, Physics, Biochemistry, Immunology, Physiology, microcirculation, liver, blood flow, oxygen consumption, phosphorescence, hepatitis
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2
proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3,4,5,6
. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7
. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Use of the Operant Orofacial Pain Assessment Device (OPAD) to Measure Changes in Nociceptive Behavior
Institutions: University of Florida College of Dentistry, University of Florida College of Medicine , Stoelting Co., University of Florida .
We present an operant system for the detection of pain in awake, conscious rodents. The Orofacial Pain Assessment Device (OPAD) assesses pain behaviors in a more clinically relevant way by not relying on reflex-based measures of nociception. Food-fasted, hairless (or shaved) rodents are placed into a Plexiglas chamber that has two Peltier-based thermodes, which can be programmed to any temperature between 7 °C and 60 °C. The rodent is trained to make contact with these in order to access a reward bottle. During a session, a number of behavioral pain outcomes are automatically recorded and saved. These measures include the number of reward bottle activations (licks) and facial contact stimuli (face contacts), but custom measures like the lick/face ratio (total number of licks per session/total number of contacts) can also be created. The stimulus temperature can be set to a single temperature or multiple temperatures within a session. The OPAD is a high-throughput, easy-to-use operant assay that will lead to better translation of pain research in the future, as it includes cortical input instead of relying on spinal reflex-based nociceptive assays.
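The lick/face ratio defined above is a simple derived measure; a minimal sketch of its computation (with a guard for sessions containing no facial contacts, a choice of ours rather than the OPAD software's documented behavior) might look like:

```python
def lick_face_ratio(licks, contacts):
    """Lick/face ratio as defined in the text: total reward-bottle
    licks per session divided by total facial contacts."""
    if contacts == 0:
        return float("nan")  # no contacts -> ratio undefined
    return licks / contacts

print(lick_face_ratio(120, 40))  # 3.0
```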
Behavior, Issue 76, Neuroscience, Neurobiology, Anatomy, Physiology, Medicine, Biomedical Engineering, Surgery, Neurologic Manifestations, Pain, Chronic Pain, Nociceptive Pain, Acute Pain, Pain Perception, Operant, mouse, rat, analgesia, nociception, thermal, hyperalgesia, animal model
Using Visual and Narrative Methods to Achieve Fair Process in Clinical Care
Institutions: Brandeis University, Brandeis University.
The Institute of Medicine has targeted patient-centeredness as an important area of quality improvement. A major dimension of patient-centeredness is respect for patients' values, preferences, and expressed needs. Yet specific approaches to gaining this understanding and translating it to quality care in the clinical setting are lacking. From a patient perspective quality is not a simple concept but is best understood in terms of five dimensions: technical outcomes; decision-making efficiency; amenities and convenience; information and emotional support; and overall patient satisfaction. Failure to consider quality from this five-pronged perspective results in a focus on medical outcomes, without considering the processes central to quality from the patient's perspective and vital to achieving good outcomes. In this paper, we argue for applying the concept of fair process in clinical settings. Fair process involves using a collaborative approach to exploring diagnostic issues and treatments with patients, explaining the rationale for decisions, setting expectations about roles and responsibilities, and implementing a core plan and ongoing evaluation. Fair process opens the door to bringing patient expertise into the clinical setting and the work of developing health care goals and strategies. This paper provides a step-by-step illustration of an innovative visual approach, called photovoice or photo-elicitation, to achieve fair process in clinical work with acquired brain injury survivors and others living with chronic health conditions. Applying this visual tool and methodology in the clinical setting will enhance patient-provider communication; engage patients as partners in identifying challenges, strengths, goals, and strategies; and support evaluation of progress over time.
Asking patients to bring visuals of their lives into the clinical interaction can help to illuminate gaps in clinical knowledge, forge better therapeutic relationships with patients living with chronic conditions such as brain injury, and identify patient-centered goals and possibilities for healing. The process illustrated here can be used by clinicians (primary care physicians, rehabilitation therapists, neurologists, neuropsychologists, psychologists, and others) working with people living with chronic conditions such as acquired brain injury, mental illness, physical disabilities, HIV/AIDS, substance abuse, or post-traumatic stress, and by leaders of support groups for the types of patients described above and their family members or caregivers.
Medicine, Issue 48, person-centered care, participatory visual methods, photovoice, photo-elicitation, narrative medicine, acquired brain injury, disability, rehabilitation, palliative care
Combining Behavioral Endocrinology and Experimental Economics: Testosterone and Social Decision Making
Institutions: University of Zurich, Royal Holloway, University of London.
Behavioral endocrinological research in humans as well as in animals suggests that testosterone plays a key role in social interactions. Studies in rodents have shown a direct link between testosterone and aggressive behavior1
and folk wisdom adapts these findings to humans, suggesting that testosterone induces antisocial, egoistic or even aggressive behavior2
. However, many researchers doubt a direct testosterone-aggression link in humans, arguing instead that testosterone is primarily involved in status-related behavior3,4
. As high status can also be achieved by aggressive and antisocial means, it can be difficult to distinguish between antisocial and status-seeking behavior.
We therefore set up an experimental environment in which status can only be achieved by prosocial means. In a double-blind and placebo-controlled experiment, we administered a single sublingual dose of 0.5 mg of testosterone (with a hydroxypropyl-β-cyclodextrin carrier) to 121 women and investigated their social interaction behavior in an economic bargaining paradigm. Real monetary incentives are at stake in this paradigm; every player A receives a certain amount of money and has to make an offer to another player B on how to share the money. If B accepts, she gets what was offered and player A keeps the rest. If B refuses the offer, nobody gets anything. A status-seeking player A is expected to avoid being rejected by behaving in a prosocial way, i.e. by making higher offers.
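The payoff rule of the bargaining paradigm described above can be sketched directly; the function name is ours, but the logic follows the text (B accepts and receives the offer while A keeps the rest, or B rejects and both players get nothing):

```python
def ultimatum_payoffs(endowment, offer, accepted):
    """Return (A's payoff, B's payoff) under the bargaining rule
    described in the text."""
    if accepted:
        return endowment - offer, offer
    return 0, 0  # rejection leaves both players with nothing

print(ultimatum_payoffs(10, 4, True))   # (6, 4)
print(ultimatum_payoffs(10, 4, False))  # (0, 0)
```

Because rejection costs A her entire share, higher offers reduce rejection risk, which is why prosocial offers can serve status-seeking motives in this design.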
The results show that if expectations about the hormone are controlled for, testosterone administration leads to a significant increase in fair bargaining offers compared to placebo. The role of expectations is reflected in the fact that subjects who believe they received testosterone make lower offers than those who believe they received a placebo. These findings suggest that the experimental economics approach is sensitive enough to detect neurobiological effects as subtle as those achieved by administration of hormones. Moreover, the findings point towards the importance of both psychosocial and neuroendocrine factors in determining the influence of testosterone on human social behavior.
Neuroscience, Issue 49, behavioral endocrinology, testosterone, social status, decision making
Using Learning Outcome Measures to assess Doctoral Nursing Education
Institutions: Harris College of Nursing and Health Sciences, Texas Christian University.
Education programs at all levels must be able to demonstrate successful program outcomes. Grades alone do not represent a comprehensive measurement methodology for assessing student learning outcomes at either the course or program level. The development and application of assessment rubrics provides an unequivocal measurement methodology to ensure a quality learning experience by providing a foundation for improvement based on qualitatively and quantitatively measurable aggregate course and program outcomes. Learning outcomes are the embodiment of the total learning experience and should incorporate assessment of both qualitative and quantitative program outcomes. The assessment of qualitative measures represents a challenge for educators at any level of a learning program. Nursing provides a unique challenge and opportunity as it is the application of science through the art of caring. Quantification of desired student learning outcomes may be enhanced through the development of assessment rubrics designed to measure quantitative and qualitative aspects of the nursing education and learning process. They provide a mechanism for uniform assessment by nursing faculty of concepts and constructs that are otherwise difficult to describe and measure. A protocol is presented and applied to a doctoral nursing education program with recommendations for application and transformation of the assessment rubric to other education programs. Through application of these specially designed rubrics, all aspects of an education program can be adequately assessed to provide information for program assessment that facilitates the closure of the gap between desired and actual student learning outcomes for any desired educational competency.
Medicine, Issue 40, learning, outcomes, measurement, program, assessment, rubric
Laser-Induced Chronic Ocular Hypertension Model on SD Rats
Institutions: The University of Hong Kong - HKU.
Glaucoma is one of the major causes of blindness in the world, and elevated intraocular pressure is a major risk factor. Laser photocoagulation-induced ocular hypertension is a well-established animal model. This video demonstrates how to induce ocular hypertension by argon laser photocoagulation in rats.
Neuroscience, Issue 10, glaucoma, ocular hypertension, rat
BioMEMS and Cellular Biology: Perspectives and Applications
Institutions: University of Washington.
The ability to culture cells has revolutionized hypothesis testing in basic cell and molecular biology research. It has become a standard methodology in drug screening, toxicology, and clinical assays, and is increasingly used in regenerative medicine. However, the traditional cell culture methodology, which essentially consists of immersing a large population of cells in a homogeneous fluid medium on a homogeneous flat substrate, has become increasingly limiting from both a fundamental and a practical perspective. Microfabrication technologies have enabled researchers to design, with micrometer control, the biochemical composition and topology of the substrate and the medium composition, as well as the neighboring cell types in the surrounding cellular microenvironment. Additionally, microtechnology is conceptually well-suited for the development of fast, low-cost in vitro systems that allow for high-throughput culturing and analysis of cells under large numbers of conditions. In this interview, Albert Folch explains these limitations, how they can be overcome with soft lithography and microfluidics, and describes some relevant examples of research in his lab and future directions.
Biomedical Engineering, Issue 8, BioMEMS, Soft Lithography, Microfluidics, Agrin, Axon Guidance, Olfaction, Interview