A novel device that employs tumor treating fields (TTF) therapy has recently been developed and is currently in use for the treatment of recurrent glioblastoma (rGBM). It was FDA approved in April 2011 for the treatment of patients 22 years or older with rGBM. The device delivers alternating electric fields and is programmed to ensure maximal tumor cell kill [1].
Glioblastoma is the most common type of glioma, with an estimated incidence of approximately 10,000 new cases per year in the United States alone [2]. This tumor is particularly resistant to treatment and is uniformly fatal, especially in the recurrent setting [3-5]. Prior to the approval of the TTF System, the only FDA-approved treatment for rGBM was bevacizumab [6]. Bevacizumab is a humanized monoclonal antibody targeted against the vascular endothelial growth factor (VEGF) protein that drives tumor angiogenesis [7]. By blocking the VEGF pathway, bevacizumab can produce a significant radiographic response (pseudoresponse), improve progression-free survival, and reduce corticosteroid requirements in rGBM patients [8,9]. Bevacizumab, however, failed to prolong overall survival in a recent phase III trial [26]. A pivotal phase III trial (EF-11) demonstrated comparable overall survival between physicians' choice chemotherapy and TTF therapy, but better quality of life was observed in the TTF arm [10].
There is currently an unmet need for novel approaches designed to prolong overall survival and/or improve quality of life in this unfortunate patient population. One appealing approach would be to combine the two currently approved treatment modalities, namely bevacizumab and TTF therapy. These two treatments are currently approved as monotherapy [11,12], but their combination has never been evaluated in a clinical trial. We have developed an approach for combining these two treatment modalities and have treated two rGBM patients. Here we describe a detailed methodology outlining this novel treatment protocol and present representative data from one of the treated patients.
Measuring the Subjective Value of Risky and Ambiguous Options using Experimental Economics and Functional MRI Methods
Institutions: Yale School of Medicine, New York University.
Most of the choices we make have uncertain consequences. In some cases the probabilities of the different possible outcomes are precisely known, a condition termed "risky". In other cases, when probabilities cannot be estimated, the condition is described as "ambiguous". While most people are averse to both risk and ambiguity [1,2], the degree of those aversions varies substantially across individuals, such that the subjective value of the same risky or ambiguous option can be very different for different individuals. We combine functional MRI (fMRI) with a method based on experimental economics [3] to assess the neural representation of the subjective values of risky and ambiguous options [4]. This technique can now be used to study these neural representations in different populations, such as different age groups and different patient populations.
In our experiment, subjects make consequential choices between two alternatives while their neural activation is tracked using fMRI. On each trial subjects choose between lotteries that vary in their monetary amount and in either the probability of winning that amount or the ambiguity level associated with winning. Our parametric design allows us to use each individual's choice behavior to estimate their attitudes towards risk and ambiguity, and thus to estimate the subjective values that each option held for them. Another important feature of the design is that the outcome of the chosen lottery is not revealed during the experiment, so that no learning can take place; thus the ambiguous options remain ambiguous and risk attitudes are stable. Instead, at the end of the scanning session one or a few trials are randomly selected and played for real money. Since subjects do not know beforehand which trials will be selected, they must treat each and every trial as if it, and it alone, were the one trial on which they will be paid. This design ensures that we can estimate the true subjective value of each option to each subject. We then look for areas in the brain whose activation is correlated with the subjective value of risky options and for areas whose activation is correlated with the subjective value of ambiguous options.
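The attitude-estimation step described above can be sketched computationally. The following is a minimal, self-contained illustration assuming a power-utility model with a linear ambiguity penalty (one common parameterization in this literature); the parameter grids, the certain reference option, and the logistic choice rule are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

# Assumed model: SV = (p - beta * A/2) * v**alpha, where p is the win
# probability, A the ambiguity level (0 for purely risky options), and v
# the dollar amount. alpha < 1 indicates risk aversion; beta > 0 indicates
# ambiguity aversion.

def subjective_value(v, p, amb, alpha, beta):
    return (p - beta * amb / 2.0) * v ** alpha

def choice_prob(sv_lottery, sv_reference, slope=2.0):
    """Logistic probability of choosing the lottery over the reference."""
    return 1.0 / (1.0 + np.exp(-slope * (sv_lottery - sv_reference)))

def fit_attitudes(trials, choices, ref_value=5.0):
    """Grid-search maximum likelihood over (alpha, beta)."""
    v, p, a = (np.array(t) for t in zip(*trials))
    best = (None, None, -np.inf)
    for alpha in np.linspace(0.2, 1.5, 40):
        sv_ref = ref_value ** alpha          # certain reference amount
        u = v ** alpha
        for beta in np.linspace(-1.0, 1.0, 40):
            sv = (p - beta * a / 2.0) * u
            pr = np.clip(choice_prob(sv, sv_ref), 1e-9, 1 - 1e-9)
            ll = np.sum(np.where(choices, np.log(pr), np.log(1 - pr)))
            if ll > best[2]:
                best = (alpha, beta, ll)
    return best[0], best[1]

# Simulate a risk-averse, ambiguity-averse subject and recover the parameters.
rng = np.random.default_rng(0)
trials = []
for _ in range(400):
    v = rng.uniform(5, 50)
    if rng.random() < 0.5:
        trials.append((v, rng.choice([0.25, 0.5, 0.75]), 0.0))   # risky trial
    else:
        trials.append((v, 0.5, rng.choice([0.24, 0.5, 0.74])))   # ambiguous trial
true_alpha, true_beta = 0.7, 0.4
svs = np.array([subjective_value(v, p, a, true_alpha, true_beta)
                for v, p, a in trials])
choices = rng.random(400) < choice_prob(svs, 5.0 ** true_alpha)
alpha_hat, beta_hat = fit_attitudes(trials, choices)
```

The fitted per-trial subjective values can then serve as parametric regressors in the fMRI analysis.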
Neuroscience, Issue 67, Medicine, Molecular Biology, fMRI, magnetic resonance imaging, decision-making, value, uncertainty, risk, ambiguity
Voluntary Breath-hold Technique for Reducing Heart Dose in Left Breast Radiotherapy
Institutions: Royal Marsden NHS Foundation Trust, University of Surrey, Institute of Cancer Research, Sutton, UK.
Breath-holding techniques reduce the amount of radiation received by cardiac structures during tangential-field left breast radiotherapy. With these techniques, patients hold their breath while radiotherapy is delivered, pushing the heart down and away from the radiotherapy field. Despite clear dosimetric benefits, these techniques are not yet in widespread use. One reason for this is that commercially available solutions require specialist equipment, necessitating not only significant capital investment, but often also incurring ongoing costs such as a need for daily disposable mouthpieces. The voluntary breath-hold technique described here does not require any additional specialist equipment. All breath-holding techniques require a surrogate to monitor breath-hold consistency and whether breath-hold is maintained. Voluntary breath-hold uses the distance moved by the anterior and lateral reference marks (tattoos) away from the treatment room lasers in breath-hold to monitor consistency at CT-planning and treatment setup. Light fields are then used to monitor breath-hold consistency prior to and during radiotherapy delivery.
Medicine, Issue 89, breast, radiotherapy, heart, cardiac dose, breath-hold
Bioluminescent Orthotopic Model of Pancreatic Cancer Progression
Institutions: Monash University, University of Bern, University of California Los Angeles.
Pancreatic cancer has an extremely poor five-year survival rate of 4-6%. New therapeutic options are critically needed and depend on improved understanding of pancreatic cancer biology. To better understand the interaction of cancer cells with the pancreatic microenvironment, we demonstrate an orthotopic model of pancreatic cancer that permits non-invasive monitoring of cancer progression. Luciferase-tagged pancreatic cancer cells are resuspended in Matrigel and delivered into the pancreatic tail during laparotomy. Matrigel solidifies at body temperature to prevent leakage of cancer cells during injection. Primary tumor growth and metastasis to distant organs are monitored following injection of the luciferase substrate luciferin, using in vivo imaging of bioluminescence emission from the cancer cells. In vivo imaging may also be used to track primary tumor recurrence after resection. This orthotopic model is suited to both syngeneic and xenograft models and may be used in pre-clinical trials to investigate the impact of novel anti-cancer therapeutics on the growth of the primary pancreatic tumor and metastasis.
Cancer Biology, Issue 76, Medicine, Molecular Biology, Cellular Biology, Genetics, Biomedical Engineering, Surgery, Neoplasms, Pancreatic Cancer, Cancer, Orthotopic Model, Bioluminescence, In Vivo Imaging, Matrigel, Metastasis, pancreas, tumor, cancer, cell culture, laparotomy, animal model, imaging
Handling of the Cotton Rat in Studies for the Pre-clinical Evaluation of Oncolytic Viruses
Institutions: McMaster University.
Oncolytic viruses are a novel anticancer therapy with the ability to target tumor cells, while leaving healthy cells intact. For this strategy to be successful, recent studies have shown that involvement of the host immune system is essential. Therefore, oncolytic virotherapy should be evaluated within the context of an immunocompetent model. Furthermore, the study of antitumor therapies in tolerized animal models may better recapitulate results seen in clinical trials. Cotton rats, commonly used to study respiratory viruses, are an attractive model to study oncolytic virotherapy as syngeneic models of mammary carcinoma and osteosarcoma are well established. However, there is a lack of published information on the proper handling procedure for these highly excitable rodents. The handling and capture approach outlined minimizes animal stress to facilitate experimentation. This technique hinges upon the ability of the researcher to keep calm during handling and perform procedures in a timely fashion. Finally, we describe how to prepare cotton rat mammary tumor cells for consistent subcutaneous tumor formation, and how to perform intratumoral and intraperitoneal injections. These methods can be applied to a wide range of studies furthering the development of the cotton rat as a relevant pre-clinical model to study antitumor therapy.
Virology, Issue 93, cotton rat, oncolytic virus, animal handling, bovine herpesvirus type 1
Polymalic Acid-based Nano Biopolymers for Targeting of Multiple Tumor Markers: An Opportunity for Personalized Medicine?
Institutions: Cedars-Sinai Medical Center.
Tumors with similar grade and morphology often respond differently to the same treatment because of variations in molecular profiling. To account for this diversity, personalized medicine is being developed for silencing malignancy-associated genes. Nano drugs fit these needs by targeting the tumor and delivering antisense oligonucleotides for silencing of genes. As drugs for treatment are often administered repeatedly, absence of toxicity and negligible immune response are desirable. In the example presented here, a nano medicine is synthesized from the biodegradable, non-toxic, and non-immunogenic platform polymalic acid by controlled chemical ligation of antisense oligonucleotides and tumor-targeting molecules. The synthesis and treatment are exemplified for human Her2-positive breast cancer using an experimental mouse model. The case can be translated towards synthesis and treatment of other tumors.
Chemistry, Issue 88, Cancer treatment, personalized medicine, polymalic acid, nanodrug, biopolymer, targeting, host compatibility, biodegradability
Enhancement of Apoptotic and Autophagic Induction by a Novel Synthetic C-1 Analogue of 7-deoxypancratistatin in Human Breast Adenocarcinoma and Neuroblastoma Cells with Tamoxifen
Institutions: University of Windsor, Brock University.
Breast cancer is one of the most common cancers amongst women in North America. Many current anti-cancer treatments, including ionizing radiation, induce apoptosis via DNA damage. Unfortunately, such treatments are non-selective to cancer cells and produce similar toxicity in normal cells. We have reported selective induction of apoptosis in cancer cells by the natural compound pancratistatin (PST). Recently, a novel PST analogue, a C-1 acetoxymethyl derivative of 7-deoxypancratistatin (JCTH-4), was produced by de novo synthesis; it exhibits comparable selective apoptosis-inducing activity in several cancer cell lines. Autophagy has been implicated in malignancies as both a pro-survival and a pro-death mechanism in response to chemotherapy. Tamoxifen (TAM) has consistently demonstrated induction of pro-survival autophagy in numerous cancers. In this study, the efficacy of JCTH-4 alone and in combination with TAM to induce cell death in human breast cancer (MCF7) and neuroblastoma (SH-SY5Y) cells was evaluated. TAM alone induced autophagy but insignificant cell death, whereas JCTH-4 alone caused significant induction of apoptosis with some induction of autophagy. Interestingly, the combinatorial treatment yielded a drastic increase in apoptotic and autophagic induction. We monitored time-dependent morphological changes in MCF7 cells undergoing TAM-induced autophagy, JCTH-4-induced apoptosis and autophagy, and accelerated cell death with combinatorial treatment using time-lapse microscopy. We have demonstrated these compounds to induce apoptosis/autophagy by mitochondrial targeting in these cancer cells. Importantly, these treatments did not affect the survival of noncancerous human fibroblasts. Thus, these results indicate that JCTH-4 in combination with TAM could be used as a safe and very potent anti-cancer therapy against breast cancer and neuroblastoma cells.
Cancer Biology, Issue 63, Medicine, Biochemistry, Breast adenocarcinoma, neuroblastoma, tamoxifen, combination therapy, apoptosis, autophagy
Pre-clinical Evaluation of Tyrosine Kinase Inhibitors for Treatment of Acute Leukemia
Institutions: University of Colorado Anschutz Medical Campus, University Hospital of Essen.
Receptor tyrosine kinases have been implicated in the development and progression of many cancers, including both leukemia and solid tumors, and are attractive druggable therapeutic targets. Here we describe an efficient four-step strategy for pre-clinical evaluation of tyrosine kinase inhibitors (TKIs) in the treatment of acute leukemia. Initially, western blot analysis is used to confirm target inhibition in cultured leukemia cells. Functional activity is then evaluated using clonogenic assays in methylcellulose or soft agar cultures. Experimental compounds that demonstrate activity in cell culture assays are evaluated in vivo using NOD-SCID-gamma (NSG) mice transplanted orthotopically with human leukemia cell lines. Initial in vivo pharmacodynamic studies evaluate target inhibition in leukemic blasts isolated from the bone marrow. This approach is used to determine the dose and schedule of administration required for effective target inhibition. Subsequent studies evaluate the efficacy of the TKIs in vivo using luciferase-expressing leukemia cells, thereby allowing for non-invasive bioluminescent monitoring of leukemia burden and assessment of therapeutic response using an in vivo bioluminescence imaging system. This strategy has been effective for evaluation of TKIs in vitro and in vivo and can be applied for identification of molecularly-targeted agents with therapeutic potential or for direct comparison and prioritization of multiple compounds.
Medicine, Issue 79, Leukemia, Receptor Protein-Tyrosine Kinases, Molecular Targeted Therapy, Therapeutics, novel small molecule inhibitor, receptor tyrosine kinase, leukemia
Assessing Functional Performance in the Mdx Mouse Model
Institutions: Leiden University Medical Center.
Duchenne muscular dystrophy (DMD) is a severe and progressive muscle wasting disorder for which no cure is available. Nevertheless, several potential pharmaceutical compounds and gene therapy approaches have progressed into clinical trials. With improvement in muscle function being the most important end point in these trials, a lot of emphasis has been placed on setting up reliable, reproducible, and easy-to-perform functional tests to preclinically assess muscle function, strength, condition, and coordination in the mdx mouse model for DMD. Both invasive and noninvasive tests are available. Tests that do not exacerbate the disease can be used to determine the natural history of the disease and the effects of therapeutic interventions (e.g., the forelimb grip strength test, two different hanging tests using either a wire or a grid, and rotarod running). Alternatively, forced treadmill running can be used to enhance disease progression and/or assess protective effects of therapeutic interventions on disease pathology. We here describe how to perform these most commonly used functional tests in a reliable and reproducible manner. Using these protocols based on standard operating procedures enables comparison of data between different laboratories.
Behavior, Issue 85, Duchenne muscular dystrophy, neuromuscular disorders, outcome measures, functional testing, mouse model, grip strength, hanging test wire, hanging test grid, rotarod running, treadmill running
Electrochemotherapy of Tumours
Institutions: Institute of Oncology Ljubljana, University of Ljubljana.
Electrochemotherapy is the combined use of certain chemotherapeutic drugs and electric pulses applied to the treated tumour nodule. Local application of electric pulses to the tumour increases drug delivery into cells, specifically at the site of electric pulse application. Drug uptake by delivery of electric pulses is increased only for those chemotherapeutic drugs whose transport through the plasma membrane is impeded. Among the many drugs that have been tested so far, bleomycin and cisplatin have found their way from preclinical testing to clinical use. Clinical data collected within a number of clinical studies indicate that approximately 80% of the treated cutaneous and subcutaneous tumour nodules of different malignancies show an objective response; of these, approximately 70% show a complete response after a single application of electrochemotherapy. Usually only one treatment is needed; however, electrochemotherapy can be repeated several times every few weeks with equal effectiveness each time. The treatment results in effective eradication of the treated nodules, with a good cosmetic effect and without tissue scarring.
Medicine, Issue 22, electrochemotherapy, electroporation, cisplatin, bleomycin, malignant tumours, cutaneous lesions
Stereotactic Radiosurgery for Gynecologic Cancer
Institutions: University Hospitals Case Medical Center and Case Western Reserve University School of Medicine.
Stereotactic body radiotherapy (SBRT) distinguishes itself by necessitating more rigid patient immobilization, accounting for respiratory motion, intricate treatment planning, on-board imaging, and a reduced number of ablative radiation doses to cancer targets usually refractory to chemotherapy and conventional radiation. The steep SBRT radiation dose drop-off permits narrow 'pencil beam' treatment fields to be used for ablative radiation treatment condensed into 1 to 3 treatments.
Treating physicians must appreciate that SBRT carries a greater risk of normal tissue injury and of geographic tumor miss. Both must be addressed by immobilization of cancer targets and by high-precision treatment delivery. Cancer target immobilization has been achieved through the use of indexed customized Styrofoam casts, evacuated bean bags, or body-fix molds with patient-independent abdominal compression [1-3]. Intrafraction motion of cancer targets due to breathing can now be reduced by patient-responsive breath-hold techniques [4], patient mouthpiece active breathing coordination [5], respiration-correlated computed tomography [6], or image-guided tracking of fiducials implanted within and around a moving tumor [7-9]. The CyberKnife system (Accuray, Sunnyvale, CA) utilizes a radiation linear accelerator mounted on an industrial robotic arm that accurately follows patient respiratory motion via a camera-tracked set of light-emitting diodes (LEDs) on a vest fitted to the patient [10]. Substantial reductions in radiation therapy margins can be achieved by motion tracking, ultimately rendering smaller planning target volumes that are irradiated with submillimeter accuracy [11-13].
Cancer targets treated by SBRT are irradiated by converging, tightly collimated beams. The resultant dose-volume histograms for the cancer target have a more pronounced radiation "shoulder," indicating high percentage target coverage, and a small high-dose radiation "tail." Thus, increased target conformality comes at the expense of decreased dose uniformity in the SBRT cancer target. This may have implications both for subsequent tumor control in the SBRT target and for normal tissue tolerance of organs at risk. Due to the sharp dose falloff in SBRT, occult disease may escape the ablative radiation dose when cancer targets are not fully recognized and inadequate SBRT dose margins are applied. Clinical target volume (CTV) expansion by 0.5 cm, resulting in a larger planning target volume (PTV), is associated with increased target control without undue normal tissue injury [7,8]. Further reduction in the probability of geographic miss may be achieved by incorporation of 2-[18F]-FDG positron emission tomography (PET) [8]. Use of 18F-FDG PET/CT in SBRT treatment planning is only the beginning of attempts to discover new molecular imaging signatures for gynecologic cancers.
Medicine, Issue 62, radiosurgery, Cyberknife stereotactic radiosurgery, radiation, ovarian cancer, cervix cancer
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
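The core DoE idea of covering a factor space with a reduced, balanced set of runs can be illustrated with a toy two-level fractional factorial. The factor names and levels below are hypothetical placeholders, not those of the tobacco expression study:

```python
from itertools import product

# Hypothetical factors for illustration only: a promoter choice, the
# incubation temperature, and plant age at infiltration, each at two levels
# coded -1/+1 as is conventional in DoE.
factors = {
    "promoter":    {-1: "35S", +1: "35SS"},
    "temp_C":      {-1: 22,    +1: 25},
    "plant_age_d": {-1: 35,    +1: 45},
}

# Full 2^3 factorial: every combination of the three coded factors (8 runs).
full = list(product([-1, 1], repeat=3))

# 2^(3-1) half fraction via the defining relation I = ABC: keep only runs
# where the product of the coded levels is +1, halving the experimental
# effort while keeping main effects estimable.
half = [run for run in full if run[0] * run[1] * run[2] == 1]

def decode(run):
    """Translate a coded run back into concrete factor settings."""
    names = list(factors)
    return {name: factors[name][level] for name, level in zip(names, run)}

runs = [decode(r) for r in half]
```

In practice, DoE software generates such designs (including augmentation steps) automatically, but the balanced-subset principle is the same.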
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Institutions: University of Calgary.
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion.
Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases, using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
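The oriented-filtering step can be illustrated with a minimal real Gabor kernel: an oriented filter responds most strongly to structures aligned with its tuning. All parameter values below (kernel size, wavelength, sigma) are illustrative and not those of the published method:

```python
import numpy as np

def gabor_kernel(size=21, wavelength=8.0, sigma=4.0, theta=0.0):
    """Real Gabor kernel: a Gaussian envelope times a cosine carrier
    oriented at angle theta (radians)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    xr = x * np.cos(theta) + y * np.sin(theta)    # rotated coordinates
    envelope = np.exp(-(x**2 + y**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * xr / wavelength)
    k = envelope * carrier
    return k - k.mean()                           # remove DC response

def filter_response(image, kernel):
    """Mean magnitude of the valid 2-D correlation (explicit loops for
    clarity; real pipelines use FFT-based filtering)."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return np.abs(out).mean()

# A vertical line responds most to a kernel whose carrier varies along x
# (theta = 0) and weakly to one whose carrier varies along y (theta = 90 deg).
img = np.zeros((40, 40))
img[:, 20] = 1.0
resp_0 = filter_response(img, gabor_kernel(theta=0.0))
resp_90 = filter_response(img, gabor_kernel(theta=np.pi / 2))
```

In the published method, a bank of such filters at many orientations yields, at each pixel, the dominant local orientation that the phase-portrait analysis then operates on.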
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
A Proboscis Extension Response Protocol for Investigating Behavioral Plasticity in Insects: Application to Basic, Biomedical, and Agricultural Research
Institutions: Arizona State University.
Insects modify their responses to stimuli through experience of associating those stimuli with events important for survival (e.g., food, mates, threats). There are several behavioral mechanisms through which an insect learns salient associations and relates them to these events. It is important to understand this behavioral plasticity for programs aimed toward assisting insects that are beneficial for agriculture. This understanding can also be used for discovering solutions to biomedical and agricultural problems created by insects that act as disease vectors and pests. The Proboscis Extension Response (PER) conditioning protocol was developed for honey bees (Apis mellifera) over 50 years ago to study how they perceive and learn about floral odors, which signal the nectar and pollen resources a colony needs for survival. The PER procedure provides a robust and easy-to-employ framework for studying several different ecologically relevant mechanisms of behavioral plasticity. It is easily adaptable for use with several other insect species and other behavioral reflexes. These protocols can be readily employed in conjunction with various means for monitoring neural activity in the CNS via electrophysiology or bioimaging, or for manipulating targeted neuromodulatory pathways. It is a robust assay for rapidly detecting sub-lethal effects on behavior caused by environmental stressors, toxins, or pesticides.
We show how the PER protocol is straightforward to implement using two procedures. One is suitable as a laboratory exercise for students or for quick assays of the effect of an experimental treatment. The other provides more thorough control of variables, which is important for studies of behavioral conditioning. We show how several measures of the behavioral response, ranging from a binary yes/no to more continuous variables such as the latency and duration of proboscis extension, can be used to test hypotheses. Finally, we discuss some pitfalls that researchers commonly encounter when using the procedure for the first time.
Neuroscience, Issue 91, PER, conditioning, honey bee, olfaction, olfactory processing, learning, memory, toxin assay
Modeling Astrocytoma Pathogenesis In Vitro and In Vivo Using Cortical Astrocytes or Neural Stem Cells from Conditional, Genetically Engineered Mice
Institutions: University of North Carolina School of Medicine, Emory University School of Medicine.
Current astrocytoma models are limited in their ability to define the roles of oncogenic mutations in specific brain cell types during disease pathogenesis and their utility for preclinical drug development. In order to design a better model system for these applications, phenotypically wild-type cortical astrocytes and neural stem cells (NSC) from conditional, genetically engineered mice (GEM) that harbor various combinations of floxed oncogenic alleles were harvested and grown in culture. Genetic recombination was induced in vitro using adenoviral Cre-mediated recombination, resulting in expression of mutated oncogenes and deletion of tumor suppressor genes. The phenotypic consequences of these mutations were defined by measuring proliferation, transformation, and drug response in vitro. Orthotopic allograft models, whereby transformed cells are stereotactically injected into the brains of immune-competent, syngeneic littermates, were developed to define the role of oncogenic mutations and cell type on tumorigenesis in vivo. Unlike most established human glioblastoma cell line xenografts, injection of transformed GEM-derived cortical astrocytes into the brains of immune-competent littermates produced astrocytomas, including the most aggressive subtype, glioblastoma, that recapitulated the histopathological hallmarks of human astrocytomas, including diffuse invasion of normal brain parenchyma. Bioluminescence imaging of orthotopic allografts from transformed astrocytes engineered to express luciferase was utilized to monitor in vivo tumor growth over time. Thus, astrocytoma models using astrocytes and NSC harvested from GEM with conditional oncogenic alleles provide an integrated system to study the genetics and cell biology of astrocytoma pathogenesis in vitro and in vivo, and may be useful in preclinical drug development for these devastating diseases.
Neuroscience, Issue 90, astrocytoma, cortical astrocytes, genetically engineered mice, glioblastoma, neural stem cells, orthotopic allograft
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
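The harvesting of time-stamped event records described above can be sketched as follows. The event names and tuple layout are hypothetical, not the system's actual data structure; the example computes the response and reinforcement proportions that a matching protocol compares:

```python
from collections import Counter

# Hypothetical event record: (time_in_seconds, event_name). In the real
# system such records are harvested several times a day and analyzed
# automatically; this toy log stands in for one session.
events = [
    (1.2, "poke_left"), (2.0, "feed_left"), (3.1, "poke_left"),
    (4.5, "poke_right"), (5.0, "poke_left"), (6.2, "feed_left"),
    (7.7, "poke_right"), (8.1, "feed_right"), (9.3, "poke_left"),
]

def response_and_income_fractions(events):
    """Fraction of responses (pokes) and of reinforcements (feeds) on the
    left side -- the two quantities compared in a matching protocol."""
    counts = Counter(name for _, name in events)
    pokes = counts["poke_left"] + counts["poke_right"]
    feeds = counts["feed_left"] + counts["feed_right"]
    return counts["poke_left"] / pokes, counts["feed_left"] / feeds

poke_frac, feed_frac = response_and_income_fractions(events)
```

Matching behavior is indicated when the two fractions track each other; in this toy log both are 2/3.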
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
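The keyword list names minimum-norm estimation as the source-analysis method. As a toy illustration of that step, the estimate can be computed from a leadfield matrix and sensor data; the random leadfield below merely stands in for one derived from an individual or age-appropriate head model:

```python
import numpy as np

# Illustrative only: a random leadfield; real leadfields come from a
# biophysical head model built from the (age-appropriate) structural MRI.
rng = np.random.default_rng(0)
n_channels, n_sources = 8, 20
L = rng.standard_normal((n_channels, n_sources))  # leadfield matrix
j_true = np.zeros(n_sources)
j_true[5] = 1.0                                   # one active source
d = L @ j_true                                    # noiseless sensor data

# Minimum-norm estimate: J = L^T (L L^T + lambda^2 I)^{-1} d
lam = 0.1  # regularization; in practice set from the data's noise level
J = L.T @ np.linalg.solve(L @ L.T + lam**2 * np.eye(n_channels), d)

print(J.shape)                              # one amplitude per source
print(float(np.linalg.norm(L @ J - d)))     # small data residual
```

Because there are far more sources than channels, the problem is underdetermined; the minimum-norm solution is the smallest-amplitude source distribution that still explains the recorded data.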
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2
proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness
) (Figure 1
). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6
. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7
. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
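A standard way to quantify categorical perception along a morph continuum is to locate the category boundary: the morph level at which categorization responses cross 50%. The sketch below (hypothetical response data, simple linear interpolation rather than a fitted psychometric function) illustrates the idea:

```python
def category_boundary(levels, p_human):
    """Estimate the category boundary on a morph continuum as the point
    where the proportion of 'human' responses crosses 0.5, by linear
    interpolation between the two bracketing morph levels."""
    pts = list(zip(levels, p_human))
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if (y0 - 0.5) * (y1 - 0.5) <= 0 and y0 != y1:
            return x0 + (0.5 - y0) * (x1 - x0) / (y1 - y0)
    return None  # responses never cross 50%

# Hypothetical data: proportion of 'human' categorizations at 6 morph steps
levels = [0, 20, 40, 60, 80, 100]        # % physical human likeness
p_human = [0.02, 0.05, 0.20, 0.85, 0.97, 1.00]
print(category_boundary(levels, p_human))  # boundary near the steep step
```

A sharp, sigmoidal crossing of this kind — rather than a gradual linear change in responses — is the behavioral signature that perception along the DHL is categorical.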
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (https://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
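The staged filter-and-rank flow described above (sequence selection by energy, then fold specificity, then binding affinity) can be sketched as follows. The sequences and scores are entirely hypothetical stand-ins for the real structural calculations:

```python
# Hypothetical candidates with made-up scores; in Protein WISDOM each score
# comes from a dedicated structural-modeling stage.
candidates = {
    "MKTAYIA": {"energy": -120.5, "fold_spec": 0.91, "affinity": -8.2},
    "MKSAYIA": {"energy": -118.0, "fold_spec": 0.95, "affinity": -9.1},
    "MKTAYLA": {"energy": -121.3, "fold_spec": 0.78, "affinity": -7.5},
    "MATAYIA": {"energy": -110.2, "fold_spec": 0.60, "affinity": -6.0},
}

def staged_ranking(cands, keep1=3, keep2=2):
    # Stage 1: sequence selection — keep the lowest-potential-energy sequences.
    stage1 = sorted(cands, key=lambda s: cands[s]["energy"])[:keep1]
    # Stage 2: fold specificity — keep sequences most specific to the fold.
    stage2 = sorted(stage1, key=lambda s: -cands[s]["fold_spec"])[:keep2]
    # Stage 3: final rank-ordered list by predicted binding affinity
    # (more negative = tighter predicted binding).
    return sorted(stage2, key=lambda s: cands[s]["affinity"])

print(staged_ranking(candidates))  # ['MKSAYIA', 'MKTAYIA']
```

Each stage prunes the candidate pool before the next, more expensive, calculation, which is why the final output is a short rank-ordered list rather than the full sequence space.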
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Institutions: University of California, Los Angeles.
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, consequently the name Taq polymerase.
PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that complicate the reaction, producing spurious results. When PCR fails it can lead to many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction. By following this PCR guide, students should be able to:
● Set up reactions and thermal cycling conditions for a conventional PCR experiment
● Understand the function of various reaction components and their overall effect on a PCR experiment
● Design and optimize a PCR experiment for any DNA template
● Troubleshoot failed PCR experiments
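Primer design hinges on the melting temperature (Tm) named in the keywords. For short primers, a common rule of thumb is the Wallace rule, Tm = 2(A+T) + 4(G+C) °C; the sketch below computes it along with GC content (the primer sequence is hypothetical, and the rule is only an approximation for oligos shorter than roughly 14 nt):

```python
def wallace_tm(primer):
    """Rule-of-thumb melting temperature (Wallace rule) for short
    oligonucleotide primers: Tm = 2(A+T) + 4(G+C), in deg C."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

def gc_content(primer):
    """GC content of a primer, as a percentage."""
    p = primer.upper()
    return 100.0 * (p.count("G") + p.count("C")) / len(p)

primer = "ACGTGCTAGCTA"  # hypothetical 12-mer
print(wallace_tm(primer))              # 36
print(round(gc_content(primer), 1))    # 50.0
```

Matching the Tm of the forward and reverse primers (typically within a few degrees) is one of the optimization levers this protocol discusses for avoiding non-specific products.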
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) contrast the effects of the proposed technique and that of using intact limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring inertial properties of known geometrical solids. When direct measurements of inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity compared with inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia property estimates on study outcomes. For stance, either one of the two inertial models investigated in our study would likely lead to similar outcomes with an inverse dynamics assessment.
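The oscillation technique rests on standard physical-pendulum mechanics: swinging the prosthesis about a pivot and timing its small-angle period gives its moment of inertia. The sketch below shows that calculation only (the study's full apparatus, including the reaction board for locating the center of mass, is more involved, and the prosthesis parameters here are hypothetical):

```python
import math

def pendulum_inertia(mass_kg, period_s, dist_pivot_to_com_m, g=9.81):
    """Moment of inertia of a segment swung as a physical pendulum.

    From the small-angle period T = 2*pi*sqrt(I_pivot / (m*g*d)):
        I_pivot = m*g*d*T^2 / (4*pi^2)
    and the parallel-axis theorem gives inertia about the center of mass:
        I_cm = I_pivot - m*d^2
    """
    i_pivot = mass_kg * g * dist_pivot_to_com_m * period_s**2 / (4 * math.pi**2)
    return i_pivot, i_pivot - mass_kg * dist_pivot_to_com_m**2

# Hypothetical below-knee prosthesis: 1.5 kg, COM 0.20 m below pivot, T = 1.0 s
i_pivot, i_cm = pendulum_inertia(1.5, 1.0, 0.20)
print(round(i_pivot, 4), round(i_cm, 4))  # 0.0745 0.0145
```

Values obtained this way feed directly into the inverse-dynamics model, which is why the inertia estimate matters most during swing, when segment accelerations dominate the joint kinetics.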
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
Deep Neuromuscular Blockade Leads to a Larger Intraabdominal Volume During Laparoscopy
Institutions: Aleris-Hamlet Hospitals, Soeborg, Denmark.
Shoulder pain is a commonly reported symptom following laparoscopic procedures such as myomectomy or hysterectomy, and recent studies have shown that lowering the insufflation pressure during surgery may reduce the risk of post-operative pain. In this pilot study, a method is presented for measuring the intra-abdominal space available to the surgeon during laparoscopy, in order to examine whether the relaxation produced by deep neuromuscular blockade can increase the working surgical space sufficiently to permit a reduction in the CO2 insufflation pressure. Using the laparoscopic grasper, the distance from the promontory to the skin is measured at two different insufflation pressures: 8 mm Hg and 12 mm Hg. After the initial measurements, a neuromuscular blocking agent (rocuronium) is administered to the patient and the intra-abdominal volume is measured again. Pilot data collected from 15 patients show that the intra-abdominal space at 8 mm Hg with blockade is comparable to the intra-abdominal space measured at 12 mm Hg without blockade. The impact of neuromuscular blockade was not correlated with patient height, weight, BMI, or age. Thus, using neuromuscular blockade to maintain a steady volume while reducing insufflation pressure may produce improved patient outcomes.
Medicine, Issue 76, Anatomy, Physiology, Neurobiology, Surgery, gynecology, laparoscopy, deep neuromuscular blockade, reversal, rocuronium, sugammadex, laparoscopic surgery, clinical techniques, surgical techniques
Ex Vivo Culture of Patient Tissue & Examination of Gene Delivery
Institutions: University College Cork.
This video describes the use of patient tissue as an ex vivo model for the study of gene delivery. Fresh patient tissue obtained at the time of surgery is sliced and maintained in culture. The ex vivo model system allows for the physical delivery of genes into intact patient tissue, and gene expression is analysed by bioluminescence imaging using the IVIS detection system. The bioluminescent detection system demonstrates rapid and accurate quantification of gene expression within individual slices without the need for tissue sacrifice. This slice tissue culture system may be used in a variety of tissue types, including normal and malignant tissue, and allows us to study the effects of the heterogeneous nature of intact tissue and the high degree of variability between individual patients. This model system could be used in certain situations as an alternative to animal models and as a complementary preclinical model prior to entering clinical trials.
Medicine, Issue 46, Bioluminescent imaging, Ex vivo tissue model, Preclinical research, Gene delivery
Manufacturing Devices and Instruments for Easier Rat Liver Transplantation
Institutions: University of Geneva Hospitals, University of Pavia , University of Geneva, University of Geneva Hospitals.
Orthotopic rat liver transplantation is a popular model, which has been shown in a recent JoVE paper with the use of the "quick-linker" device. This technique allows for easier venous cuff-anastomoses after a reasonable learning curve. The device is composed of two handles, which are carved out from scalpel blades, one approximator, which is obtained by modifying Kocher's forceps, and cuffs designed from fine-bore polyethylene tubing. The whole process can be performed at low cost using common laboratory material. The present report provides a step-by-step protocol for the design of the required pieces and includes stencils.
Medicine, Issue 75, Biomedical Engineering, Bioengineering, Mechanical Engineering, Anatomy, Physiology, Surgery, Tissue Engineering, Liver Transplantation, Liver, transplantation, rat, quick-linker, orthotopic, graft, cuff, clinical techniques, animal model
Improving IV Insulin Administration in a Community Hospital
Institutions: Wyoming Medical Center.
Diabetes mellitus is a major independent risk factor for increased morbidity and mortality in the hospitalized patient, and elevated blood glucose concentrations, even in non-diabetic patients, predicts poor outcomes.1-4
The 2008 consensus statement by the American Association of Clinical Endocrinologists (AACE) and the American Diabetes Association (ADA) states that "hyperglycemia in hospitalized patients, irrespective of its cause, is unequivocally associated with adverse outcomes."5
It is important to recognize that hyperglycemia occurs in patients with known or undiagnosed diabetes as well as during acute illness in those with previously normal glucose tolerance.
The Normoglycemia in Intensive Care Evaluation-Survival Using Glucose Algorithm Regulation (NICE-SUGAR) study involved over six thousand adult intensive care unit (ICU) patients who were randomized to intensive glucose control or conventional glucose control.6
Surprisingly, this trial found that intensive glucose control increased the risk of mortality by 14% (odds ratio, 1.14; p=0.02). In addition, there was an increased prevalence of severe hypoglycemia in the intensive control group compared with the conventional control group (6.8% vs. 0.5%, respectively; p<0.001). From this pivotal trial and two others,7,8 Wyoming Medical Center (WMC) realized the importance of controlling hyperglycemia in the hospitalized patient while avoiding the negative impact of resultant hypoglycemia.
Despite multiple revisions of an IV insulin paper protocol, analysis of data from usage of the paper protocol at WMC showed that, in terms of achieving normoglycemia while minimizing hypoglycemia, results were suboptimal. Therefore, through a systematic implementation plan, monitoring of patient blood glucose levels was switched from a paper IV insulin protocol to a computerized glucose management system. By comparing blood glucose levels under the paper protocol with those under the computerized system, it was determined that, overall, the computerized glucose management system resulted in more rapid and tighter glucose control than the traditional paper protocol. Specifically, a substantial increase in the time spent within the target blood glucose concentration range, as well as a decrease in the prevalence of severe hypoglycemia (BG < 40 mg/dL), clinical hypoglycemia (BG < 70 mg/dL), and hyperglycemia (BG > 180 mg/dL), was observed in the first five months after implementation of the computerized glucose management system. The computerized system achieved target concentrations in greater than 75% of all readings while minimizing the risk of hypoglycemia. The prevalence of hypoglycemia (BG < 70 mg/dL) with the use of the computerized glucose management system was well under 1%.
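The outcome metrics reported above (percent of readings in target, and the hypo/hyperglycemia prevalences) are straightforward to compute from a series of BG readings. A minimal sketch, assuming a target range of 70-180 mg/dL (the text gives the hypo- and hyperglycemia thresholds but not the exact target range) and hypothetical readings:

```python
def glucose_metrics(readings_mg_dl):
    """Summarize glucose control using the thresholds from the text:
    clinical hypoglycemia < 70, severe hypoglycemia < 40, hyperglycemia
    > 180; target range assumed to be 70-180 mg/dL. Returns percentages."""
    n = len(readings_mg_dl)
    pct = lambda xs: 100.0 * len(xs) / n
    return {
        "in_target":   pct([g for g in readings_mg_dl if 70 <= g <= 180]),
        "hypo":        pct([g for g in readings_mg_dl if g < 70]),
        "severe_hypo": pct([g for g in readings_mg_dl if g < 40]),
        "hyper":       pct([g for g in readings_mg_dl if g > 180]),
    }

# Hypothetical hourly readings (mg/dL) from one patient-day
readings = [95, 110, 150, 185, 210, 160, 140, 65, 80, 120]
print(glucose_metrics(readings))
```

Tracking these percentages over time is how a switch like WMC's (paper protocol to computerized system) can be evaluated against the >75%-in-target and <1%-hypoglycemia figures quoted above.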
Medicine, Issue 64, Physiology, Computerized glucose management, Endotool, hypoglycemia, hyperglycemia, diabetes, IV insulin, paper protocol, glucose control