JoVE Visualize
Pubmed Article
Sample Size Estimation for Non-Inferiority Trials: Frequentist Approach versus Decision Theory Approach.
PLoS ONE
PUBLISHED: 06-16-2015
Non-inferiority trials are performed when the main therapeutic effect of the new therapy is expected to be not unacceptably worse than that of the standard therapy, and the new therapy is expected to have advantages over the standard therapy in costs or other (health) consequences. These advantages however are not included in the classic frequentist approach of sample size calculation for non-inferiority trials. In contrast, the decision theory approach of sample size calculation does include these factors. The objective of this study is to compare the conceptual and practical aspects of the frequentist approach and decision theory approach of sample size calculation for non-inferiority trials, thereby demonstrating that the decision theory approach is more appropriate for sample size calculation of non-inferiority trials.
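As a rough illustration of the classic frequentist calculation that the abstract contrasts with the decision theory approach, the textbook per-group sample size formula for a continuous endpoint can be sketched as follows (function name and default values are illustrative, not taken from the paper):

```python
import math
from statistics import NormalDist

def noninferiority_n(sigma, margin, true_diff=0.0, alpha=0.025, power=0.80):
    """Per-group sample size for a non-inferiority comparison of means.

    sigma     -- common standard deviation of the outcome
    margin    -- non-inferiority margin (delta > 0)
    true_diff -- assumed true difference between new and standard therapy
    alpha     -- one-sided type I error rate
    power     -- desired power (1 - beta)
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha)
    z_beta = NormalDist().inv_cdf(power)
    n = 2 * sigma**2 * (z_alpha + z_beta) ** 2 / (margin - true_diff) ** 2
    return math.ceil(n)  # round up to a whole number of subjects
```

Note that nothing in this formula reflects costs or other consequences of the new therapy, which is exactly the gap the decision theory approach addresses.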
Authors: Ifat Levy, Lior Rosenberg Belmaker, Kirk Manson, Agnieszka Tymula, Paul W. Glimcher.
Published: 09-19-2012
ABSTRACT
Most of the choices we make have uncertain consequences. In some cases the probabilities for different possible outcomes are precisely known, a condition termed "risky". In other cases, probabilities cannot be estimated, a condition described as "ambiguous". While most people are averse to both risk and ambiguity1,2, the degree of those aversions varies substantially across individuals, such that the subjective value of the same risky or ambiguous option can be very different for different individuals. We combine functional MRI (fMRI) with an experimental economics-based method3 to assess the neural representation of the subjective values of risky and ambiguous options4. This technique can now be used to study these neural representations in different populations, such as different age groups and different patient populations. In our experiment, subjects make consequential choices between two alternatives while their neural activation is tracked using fMRI. On each trial subjects choose between lotteries that vary in their monetary amount and in either the probability of winning that amount or the ambiguity level associated with winning. Our parametric design allows us to use each individual's choice behavior to estimate their attitudes towards risk and ambiguity, and thus to estimate the subjective values that each option held for them. Another important feature of the design is that the outcome of the chosen lottery is not revealed during the experiment, so that no learning can take place; thus the ambiguous options remain ambiguous and risk attitudes stay stable. Instead, at the end of the scanning session one or a few trials are randomly selected and played for real money. Since subjects do not know beforehand which trials will be selected, they must treat each and every trial as if it and it alone were the one trial on which they will be paid. This design ensures that we can estimate the true subjective value of each option to each subject. 
We then look for areas in the brain whose activation is correlated with the subjective value of risky options and for areas whose activation is correlated with the subjective value of ambiguous options.
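Subjective values estimated from choice behavior in this literature are often modeled with a power utility function and a linear ambiguity discount. A minimal sketch of one such parameterization (not necessarily the authors' exact model; parameter values are hypothetical per-subject estimates) is:

```python
def subjective_value(amount, prob, ambiguity=0.0, alpha=1.0, beta=0.0):
    """Subjective value of a lottery under a common parameterization.

    amount    -- monetary payoff
    prob      -- objective winning probability (center of the range if ambiguous)
    ambiguity -- fraction of the probability range that is unknown (0 to 1)
    alpha     -- risk attitude (alpha < 1: risk averse)
    beta      -- ambiguity attitude (beta > 0: ambiguity averse)
    """
    return (prob - beta * ambiguity / 2.0) * amount ** alpha
```

Fitting alpha and beta to each subject's choices then yields the per-option subjective values that are regressed against neural activation.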
22 Related JoVE Articles!
Interview: Glycolipid Antigen Presentation by CD1d and the Therapeutic Potential of NKT cell Activation
Authors: Mitchell Kronenberg.
Institutions: La Jolla Institute for Allergy and Immunology.
Natural Killer T cells (NKT) are critical determinants of the immune response to cancer, regulation of autoimmune disease, clearance of infectious agents, and the development of atherosclerotic plaques. In this interview, Mitch Kronenberg discusses his laboratory's efforts to understand the mechanism through which NKT cells are activated by glycolipid antigens. Central to these studies is CD1d - the antigen-presenting molecule that presents glycolipids to NKT cells. The advent of CD1d tetramer technology, a technique developed by the Kronenberg lab, is critical for the sorting and identification of subsets of specific glycolipid-reactive T cells. Mitch explains how glycolipid agonists are being used as therapeutic agents to activate NKT cells in cancer patients and how CD1d tetramers can be used to assess the state of the NKT cell population in vivo following glycolipid agonist therapy. The current status of ongoing clinical trials using these agonists is discussed, as are Mitch's predictions for areas in the field of immunology that will gain importance in the near future.
Immunology, Issue 10, Natural Killer T cells, NKT cells, CD1 Tetramers, antigen presentation, glycolipid antigens, CD1d, Mucosal Immunity, Translational Research
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
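The abstract describes the event-record analysis only at a high level. As a hypothetical sketch (the event codes and record format are invented for illustration, not the system's actual encoding), a harvested time-stamped record could be reduced to a behavioral measure such as the choice proportion tracked in a matching protocol:

```python
# Hypothetical event codes; the actual system's codes and record
# format are not specified in the abstract.
LEFT_ENTRY, RIGHT_ENTRY = 1, 2

def matching_proportion(events):
    """events: list of (timestamp_s, event_code) tuples.

    Returns the proportion of head entries made at the left hopper,
    the kind of measure tracked daily in a matching protocol."""
    left = sum(1 for _, code in events if code == LEFT_ENTRY)
    right = sum(1 for _, code in events if code == RIGHT_ENTRY)
    total = left + right
    return left / total if total else None
```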
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
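The DoE workflow described above starts from a candidate set of factor combinations, from which software then selects an optimal subset of runs. A minimal sketch of enumerating such a candidate set, with factor names and levels invented for illustration, is:

```python
from itertools import product

# Hypothetical factors and levels, loosely inspired by the abstract
# (regulatory elements, plant development, incubation conditions).
factors = {
    "promoter": ["35S", "double 35S"],
    "plant_age_d": [35, 42, 49],
    "incubation_temp_C": [22, 25],
}

def full_factorial(factors):
    """Enumerate every combination of factor levels: the candidate set
    from which DoE software would pick an optimal design."""
    names = list(factors)
    return [dict(zip(names, levels))
            for levels in product(*factors.values())]

runs = full_factorial(factors)
```

In practice the study's approach avoids running the full factorial by splitting the problem into modules and augmenting the design step-wise.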
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Authors: Richard A. Rudick, Deborah Miller, Francois Bethoux, Stephen M. Rao, Jar-Chi Lee, Darlene Stough, Christine Reece, David Schindler, Bernadett Mamone, Jay Alberts.
Institutions: Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending, the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested on 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient-reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out-of-clinic settings, like the patient’s home, thereby providing more meaningful real world data. The MSPT represents a new paradigm for neuroperformance testing. 
This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
Irrelevant Stimuli and Action Control: Analyzing the Influence of Ignored Stimuli via the Distractor-Response Binding Paradigm
Authors: Birte Moeller, Hartmut Schächinger, Christian Frings.
Institutions: Trier University.
Selection tasks in which simple stimuli (e.g. letters) are presented and a target stimulus has to be selected against one or more distractor stimuli are frequently used in research on human action control. One important question in these settings is how distractor stimuli, competing with the target stimulus for a response, influence actions. The distractor-response binding paradigm can be used to investigate this influence. It is particularly useful for separately analyzing response retrieval and distractor inhibition effects. Computer-based experiments are used to collect the data (reaction times and error rates). In a number of sequentially presented pairs of stimulus arrays (prime-probe design), participants respond to targets while ignoring distractor stimuli. Importantly, the factors response relation in the arrays of each pair (repetition vs. change) and distractor relation (repetition vs. change) are varied orthogonally. The repetition of the same distractor then has a different effect depending on the response relation (repetition vs. change) between arrays. This result pattern can be explained by response retrieval due to distractor repetition. In addition, distractor inhibition effects are indicated by a general advantage due to distractor repetition. The described paradigm has proven useful for determining relevant parameters for response retrieval effects on human action.
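The orthogonal variation of response relation and distractor relation yields a 2x2 design whose cell means can be scored for the two effects just described. One common way to compute them from mean reaction times (sign conventions may differ across papers) is:

```python
def binding_effects(rt):
    """Score the 2x2 prime-probe design from mean reaction times.

    rt -- dict keyed by (response_relation, distractor_relation),
          each element "rep" or "change", values in ms.
    """
    # Response retrieval: distractor repetition helps when the response
    # repeats but hurts when it changes -> the interaction term.
    retrieval = ((rt[("change", "rep")] - rt[("change", "change")])
                 - (rt[("rep", "rep")] - rt[("rep", "change")]))
    # Distractor inhibition: overall benefit of repeating the distractor.
    inhibition = ((rt[("rep", "change")] + rt[("change", "change")]) / 2
                  - (rt[("rep", "rep")] + rt[("change", "rep")]) / 2)
    return retrieval, inhibition
```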
Behavior, Issue 87, stimulus-response binding, distractor-response binding, response retrieval, distractor inhibition, event file, action control, selection task
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. 
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Combining Magnetic Sorting of Mother Cells and Fluctuation Tests to Analyze Genome Instability During Mitotic Cell Aging in Saccharomyces cerevisiae
Authors: Melissa N. Patterson, Patrick H. Maxwell.
Institutions: Rensselaer Polytechnic Institute.
Saccharomyces cerevisiae has been an excellent model system for examining mechanisms and consequences of genome instability. Information gained from this yeast model is relevant to many organisms, including humans, since DNA repair and DNA damage response factors are well conserved across diverse species. However, S. cerevisiae has not yet been used to fully address whether the rate of accumulating mutations changes with increasing replicative (mitotic) age due to technical constraints. For instance, measurements of yeast replicative lifespan through micromanipulation involve very small populations of cells, which prohibit detection of rare mutations. Genetic methods to enrich for mother cells in populations by inducing death of daughter cells have been developed, but population sizes are still limited by the frequency with which random mutations that compromise the selection systems occur. The current protocol takes advantage of magnetic sorting of surface-labeled yeast mother cells to obtain large enough populations of aging mother cells to quantify rare mutations through phenotypic selections. Mutation rates, measured through fluctuation tests, and mutation frequencies are first established for young cells and used to predict the frequency of mutations in mother cells of various replicative ages. Mutation frequencies are then determined for sorted mother cells, and the age of the mother cells is determined using flow cytometry by staining with a fluorescent reagent that detects bud scars formed on their cell surfaces during cell division. Comparison of predicted mutation frequencies based on the number of cell divisions to the frequencies experimentally observed for mother cells of a given replicative age can then identify whether there are age-related changes in the rate of accumulating mutations. 
Variations of this basic protocol provide the means to investigate the influence of alterations in specific gene functions or specific environmental conditions on mutation accumulation to address mechanisms underlying genome instability during replicative aging.
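The fluctuation-test step can be sketched with the classic Luria-Delbrueck p0 estimator (the exact estimator used in the protocol may differ), along with the age-based prediction the protocol compares against:

```python
import math

def mutation_rate_p0(cultures_without, total_cultures, final_cells):
    """Luria-Delbrueck p0 method: estimate the mutation rate per cell
    division from the fraction of parallel cultures showing no mutants
    on the selective plates."""
    p0 = cultures_without / total_cultures
    m = -math.log(p0)            # expected mutation events per culture
    return m / final_cells       # rate per cell per division

def predicted_frequency(rate_per_division, divisions):
    """Expected mutant frequency after a given number of divisions,
    assuming the per-division rate stays constant with replicative age.
    Observed frequencies in sorted mother cells exceeding this value
    would indicate an age-related increase in the rate."""
    return rate_per_division * divisions
```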
Microbiology, Issue 92, Aging, mutations, genome instability, Saccharomyces cerevisiae, fluctuation test, magnetic sorting, mother cell, replicative aging
A New Technique for Quantitative Analysis of Hair Loss in Mice Using Grayscale Analysis
Authors: Tulasi Ponnapakkam, Ranjitha Katikaneni, Rohan Gulati, Robert Gensure.
Institutions: Children's Hospital at Montefiore.
Alopecia is a common form of hair loss which can occur in many different conditions, including male-pattern hair loss, polycystic ovarian syndrome, and alopecia areata. Alopecia can also occur as a side effect of chemotherapy in cancer patients. In this study, our goal was to develop a consistent and reliable method to quantify hair loss in mice, which will allow investigators to accurately assess and compare new therapeutic approaches for these various forms of alopecia. The method utilizes a standard gel imager to obtain and process images of mice, measuring the light absorption, which occurs in rough proportion to the amount of black (or gray) hair on the mouse. Data quantified in this fashion can then be analyzed using standard statistical techniques (e.g., ANOVA, t-test). This methodology was tested in mouse models of chemotherapy-induced alopecia, alopecia areata and alopecia from waxing. In this report, the detailed protocol is presented for performing these measurements, including validation data from C57BL/6 and C3H/HeJ strains of mice. This new technique offers a number of advantages, including relative simplicity of application, reliance on equipment readily available in most research laboratories, and an objective, quantitative assessment that is more robust than subjective evaluations. Improvements in the quantification of hair growth in mice will improve the study of alopecia models and facilitate the evaluation of promising new therapies in preclinical studies.
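A minimal sketch of the grayscale quantification idea, assuming hypothetical calibration values from images of a fully depilated and a fully furred mouse (the protocol's actual gel-imager processing is more involved), is:

```python
def hair_coverage(image, bare_skin, full_coat):
    """Estimate fractional hair coverage from mean grayscale intensity.

    image     -- 2D list of pixel intensities (0 = black, 255 = white)
    bare_skin -- calibration mean intensity for a fully depilated mouse
    full_coat -- calibration mean intensity for a fully furred mouse
    Both calibration values are hypothetical and strain-specific
    (black or gray coats absorb more light than bare skin)."""
    pixels = [p for row in image for p in row]
    mean = sum(pixels) / len(pixels)
    frac = (bare_skin - mean) / (bare_skin - full_coat)
    return min(1.0, max(0.0, frac))  # clamp to [0, 1]
```

Coverage values computed this way for treatment and control groups can then feed into the standard statistical comparisons mentioned above.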
Structural Biology, Issue 97, Alopecia, Mice, Grayscale, Hair, Chemotherapy-Induced Alopecia, Alopecia Areata
Methods to Test Visual Attention Online
Authors: Amanda Yung, Pedro Cardoso-Leite, Gillian Dale, Daphne Bavelier, C. Shawn Green.
Institutions: University of Rochester, University of Geneva, University of Wisconsin-Madison.
Online data collection methods have particular appeal to behavioral scientists because they offer the promise of much larger and much more representative data samples than can typically be collected on college campuses. However, before such methods can be widely adopted, a number of technological challenges must be overcome – in particular in experiments where tight control over stimulus properties is necessary. Here we present methods for collecting performance data on two tests of visual attention. Both tests require control over the visual angle of the stimuli (which in turn requires knowledge of the viewing distance, monitor size, screen resolution, etc.) and the timing of the stimuli (as the tests involve either briefly flashed stimuli or stimuli that move at specific rates). Data collected on these tests from over 1,700 online participants were consistent with data collected in laboratory-based versions of the exact same tests. These results suggest that with proper care, timing/stimulus size dependent tasks can be deployed in web-based settings.
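Controlling visual angle online requires converting degrees to on-screen pixels from the participant's reported viewing distance and monitor geometry. The standard trigonometric conversion is:

```python
import math

def degrees_to_pixels(deg, viewing_cm, screen_w_cm, screen_w_px):
    """Convert a desired visual angle to an on-screen size in pixels.

    deg         -- desired visual angle in degrees
    viewing_cm  -- viewing distance
    screen_w_cm -- physical monitor width
    screen_w_px -- horizontal screen resolution
    (Parameter names are illustrative; online, distance and monitor
    size must be obtained or estimated from the participant.)"""
    size_cm = 2 * viewing_cm * math.tan(math.radians(deg) / 2)
    return size_cm * screen_w_px / screen_w_cm
```

At a viewing distance of about 57.3 cm, 1 degree of visual angle subtends roughly 1 cm on the screen, which makes that distance a convenient sanity check.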
Behavior, Issue 96, Behavior, visual attention, web-based assessment, computer-based assessment, visual search, multiple object tracking
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Authors: Jeremy D. Smith, Abbie E. Ferris, Gary D. Heise, Richard N. Hinrichs, Philip E. Martin.
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) contrast the effects of the proposed technique and that of using intact limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring inertial properties of known geometrical solids. When direct measurements of inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity compared with inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia property estimates on study outcomes. For stance, either one of the two inertial models investigated in our study would likely lead to similar outcomes with an inverse dynamics assessment.
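The oscillation technique rests on the physical pendulum relation T = 2*pi*sqrt(I/(m*g*d)). A sketch of inverting it, with the parallel-axis step to refer the result to the prosthesis center of mass (variable names are illustrative), is:

```python
import math

def inertia_from_period(period_s, mass_kg, pivot_to_com_m, g=9.81):
    """Moment of inertia of a prosthesis swung as a physical pendulum.

    period_s       -- measured oscillation period
    mass_kg        -- prosthesis mass
    pivot_to_com_m -- distance from pivot to center of mass
                      (e.g. located with a reaction board)
    Rearranges T = 2*pi*sqrt(I_pivot/(m*g*d)) for I_pivot, then applies
    the parallel-axis theorem to get I about the center of mass."""
    i_pivot = period_s**2 * mass_kg * g * pivot_to_com_m / (4 * math.pi**2)
    return i_pivot - mass_kg * pivot_to_com_m**2
```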
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
High-throughput Fluorometric Measurement of Potential Soil Extracellular Enzyme Activities
Authors: Colin W. Bell, Barbara E. Fricks, Jennifer D. Rocca, Jessica M. Steinweg, Shawna K. McMahon, Matthew D. Wallenstein.
Institutions: Colorado State University, Oak Ridge National Laboratory, University of Colorado.
Microbes in soils and other environments produce extracellular enzymes to depolymerize and hydrolyze organic macromolecules so that they can be assimilated for energy and nutrients. Measuring soil microbial enzyme activity is crucial in understanding soil ecosystem functional dynamics. The general concept of the fluorescence enzyme assay is that synthetic C-, N-, or P-rich substrates bound with a fluorescent dye are added to soil samples. When intact, the labeled substrates do not fluoresce. Enzyme activity is measured as the increase in fluorescence as the fluorescent dyes are cleaved from their substrates, which allows them to fluoresce. Enzyme measurements can be expressed in units of molarity or activity. To perform this assay, soil slurries are prepared by combining soil with a pH buffer. The pH buffer (typically a 50 mM sodium acetate or 50 mM Tris buffer), is chosen for the buffer's particular acid dissociation constant (pKa) to best match the soil sample pH. The soil slurries are inoculated with a nonlimiting amount of fluorescently labeled (i.e. C-, N-, or P-rich) substrate. Using soil slurries in the assay serves to minimize limitations on enzyme and substrate diffusion. Therefore, this assay controls for differences in substrate limitation, diffusion rates, and soil pH conditions; thus detecting potential enzyme activity rates as a function of the difference in enzyme concentrations (per sample). Fluorescence enzyme assays are typically more sensitive than spectrophotometric (i.e. colorimetric) assays, but can suffer from interference caused by impurities and the instability of many fluorescent compounds when exposed to light; so caution is required when handling fluorescent substrates. Likewise, this method only assesses potential enzyme activities under laboratory conditions when substrates are not limiting. 
Caution should be used when interpreting the data representing cross-site comparisons with differing temperatures or soil types, as in situ soil type and temperature can influence enzyme kinetics.
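The conversion from measured fluorescence to potential activity can be sketched with the usual fluorometric bookkeeping (a simplified form; real assays also subtract substrate and homogenate controls and apply a quench correction, and the volumes shown are hypothetical):

```python
def potential_activity(net_fluor, emission_coef, buffer_vol_ml,
                       well_vol_ml, soil_g, hours):
    """Potential enzyme activity in nmol per g dry soil per hour.

    net_fluor     -- background-corrected fluorescence of the assay well
    emission_coef -- fluorescence units per nmol from a standard curve
    buffer_vol_ml -- total volume of the soil slurry
    well_vol_ml   -- slurry volume dispensed into the assay well
    soil_g        -- dry mass of soil in the slurry
    hours         -- incubation time
    """
    return (net_fluor * buffer_vol_ml) / (emission_coef * well_vol_ml
                                          * soil_g * hours)
```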
Environmental Sciences, Issue 81, Ecological and Environmental Phenomena, Environment, Biochemistry, Environmental Microbiology, Soil Microbiology, Ecology, Eukaryota, Archaea, Bacteria, Soil extracellular enzyme activities (EEAs), fluorometric enzyme assays, substrate degradation, 4-methylumbelliferone (MUB), 7-amino-4-methylcoumarin (MUC), enzyme temperature kinetics, soil
Comprehensive & Cost Effective Laboratory Monitoring of HIV/AIDS: an African Role Model
Authors: Denise Lawrie, George Janossy, Maarten Roos, Deborah K. Glencross.
Institutions: National Health Laboratory Services (NHLS-SA), University of Witwatersrand, Lightcurve Films.
We present a video about supporting anti-retroviral therapy (ART) with an apt laboratory service, representing a South African role model for economical large-scale diagnostic testing. In low-income countries, inexpensive ART has transformed the prospects for the survival of HIV-seropositive patients, but there are doubts about whether laboratory monitoring of ART is needed, and at what cost, in situations where the overall quality of pathology services can still be very low. The appropriate answer is to establish economically sound services with better coordination and stricter internal quality assessment than seen in western countries. This video, photographed on location in the National Health Laboratory Services (NHLS-SA) at the Witwatersrand University, Johannesburg, South Africa, presents such a coordinated scheme expanding the original 2-color CD4-CD45 PanLeucoGating strategy (PLG). Thus the six modules of the video presentation reveal the simplicity of a 4-color flow cytometric assay that combines haematological, immunological and virology-related tests in a single tube. These video modules are: (i) the set-up of instruments; (ii) sample preparation; (iii) testing absolute counts and monitoring quality for each sample by bead count rate; (iv) the haematological CD45 test for white cell counts and differentials; (v) the CD4 counts; and (vi) the activation of CD8+ T cells measured by CD38 display, a viral-load-related parameter. The potential cost savings are remarkable. This arrangement is a prime example of the feasibility of performing >800-1,000 tests per day with stricter quality control than that applied in western laboratories, and also with a transfer of technology to other laboratories within the NHLS-SA network. 
Expert advisors, laboratory managers and policy makers who carry the duty of making decisions about introducing modern medical technology are frequently not in a position to see the latest technical details as carried out in the large regional laboratories with huge burdens of workload. Hence this video shows details of these new developments.
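The single-platform absolute counting in module (iii) rests on a known number of counting beads spiked into a known sample volume, which calibrates the cytometer's sampling. The core arithmetic is:

```python
def absolute_count(cell_events, bead_events, beads_added, sample_ul):
    """Absolute cell concentration (cells per microliter) from a
    bead-based single-platform flow cytometry acquisition.

    cell_events -- gated cell events acquired (e.g. CD4+ lymphocytes)
    bead_events -- bead events acquired in the same run
    beads_added -- known number of beads spiked into the tube
    sample_ul   -- known blood volume stained, in microliters
    (Example values below are illustrative, not from the video.)"""
    return (cell_events / bead_events) * (beads_added / sample_ul)
```

Monitoring the bead count rate during acquisition, as the video shows, doubles as a per-sample quality check on this calibration.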
Immunology, Issue 44, Human Immunodeficiency virus (HIV); CD4 lymphocyte count, white cell count, CD45, panleucogating, lymphocyte activation, CD38, HIV viral load, antiretroviral therapy (ART), internal quality control
Thermal Ablation for the Treatment of Abdominal Tumors
Authors: Christopher L. Brace, J. Louis Hinshaw, Meghan G. Lubner.
Institutions: University of Wisconsin-Madison.
Percutaneous thermal ablation is an emerging treatment option for many tumors of the abdomen not amenable to conventional treatments. During a thermal ablation procedure, a thin applicator is guided into the target tumor under imaging guidance. Energy is then applied to the tissue until temperatures rise to cytotoxic levels (50-60 °C). Various energy sources are available to heat biological tissues, including radiofrequency (RF) electrical current, microwaves, laser light and ultrasonic waves. Of these, RF and microwave ablation are most commonly used worldwide. During RF ablation, alternating electrical current (~500 kHz) produces resistive heating around the interstitial electrode. Skin surface electrodes (ground pads) are used to complete the electrical circuit. RF ablation has been in use for nearly 20 years, with good results for local tumor control, extended survival and low complication rates1,2. Recent studies suggest RF ablation may be a first-line treatment option for small hepatocellular carcinoma and renal-cell carcinoma3-5. However, RF heating is hampered by local blood flow and high electrical impedance tissues (e.g., lung, bone, desiccated or charred tissue)6,7. Microwaves may alleviate some of these problems by producing faster, volumetric heating8-10. To create larger or conformal ablations, multiple microwave antennas can be used simultaneously, while RF electrodes require sequential operation, which limits their efficiency. Early experiences with microwave systems suggest efficacy and safety similar to, or better than, RF devices11-13. Alternatively, cryoablation freezes the target tissues to lethal levels (-20 to -40 °C). Percutaneous cryoablation has been shown to be effective against RCC and many metastatic tumors, particularly colorectal cancer, in the liver14-16. Cryoablation may also be associated with less post-procedure pain and faster recovery for some indications17. 
Cryoablation is often contraindicated for primary liver cancer due to the underlying coagulopathy and associated bleeding risks frequently seen in cirrhotic patients. In addition, the sudden release of tumor cellular contents when the frozen tissue thaws can lead to a potentially serious condition known as cryoshock16. Thermal tumor ablation can be performed during open surgery, laparoscopically, or percutaneously. When performed percutaneously, the ablation procedure relies on imaging for diagnosis, planning, applicator guidance, treatment monitoring, and follow-up. Ultrasound is the most popular modality for guidance and treatment monitoring worldwide, but computed tomography (CT) and magnetic resonance imaging (MRI) are commonly used as well. Contrast-enhanced CT or MRI is typically employed for diagnosis and follow-up imaging.
Medicine, Issue 49, Thermal ablation, interventional oncology, image-guided therapy, radiology, cancer
Absolute Quantum Yield Measurement of Powder Samples
Authors: Luis A. Moreno.
Institutions: Hitachi High Technologies America.
Measurement of fluorescence quantum yield has become an important tool in the development, evaluation, quality control, and research of illumination, AV equipment, organic EL materials, films, filters, and fluorescent probes for the bio-industry. Quantum yield is calculated as the ratio of the number of photons emitted by a material to the number of photons it absorbs. The higher the quantum yield, the better the efficiency of the fluorescent material. For the measurements featured in this video, we will use the Hitachi F-7000 fluorescence spectrophotometer equipped with the Quantum Yield measuring accessory and Report Generator program. All the information provided applies to this system. Measurement of quantum yield in powder samples is performed following these steps: Generation of instrument correction factors for the excitation and emission monochromators. This is an important requirement for the correct measurement of quantum yield. It has been performed in advance for the full measurement range of the instrument and will not be shown in this video due to time limitations. Measurement of integrating-sphere correction factors. The purpose of this step is to account for the reflectivity characteristics of the integrating sphere used for the measurements. Reference and sample measurement using direct excitation and indirect excitation. Quantum yield calculation using direct and indirect excitation. Direct excitation is when the sample directly faces the excitation beam, which is the normal measurement setup. However, because we use an integrating sphere, a portion of the photons emitted by the fluorescing sample is reflected by the integrating sphere and re-excites the sample, so indirect excitation must also be taken into consideration.
This is accomplished by measuring the sample placed in the port facing the emission monochromator, calculating the indirect quantum yield, and correcting the direct quantum yield calculation. Corrected quantum yield calculation. Chromaticity coordinate calculation using the Report Generator program. The Hitachi F-7000 Quantum Yield Measurement System offers the following advantages for this application: High sensitivity (S/N ratio 800 RMS or better; the signal is the Raman band of water measured at an excitation wavelength of 350 nm with a 5 nm band pass for both excitation and emission and a 2 sec response, and the noise is measured at the maximum of the Raman peak). High sensitivity allows measurement of samples even with low quantum yield; using this system we have measured quantum yields as low as 0.1 for a sample of salicylic acid and as high as 0.8 for a sample of magnesium tungstate. Highly accurate measurement with a dynamic range of 6 orders of magnitude allows measurement of both sharp, high-intensity scattering peaks and broad, low-intensity fluorescence peaks under the same conditions. High measurement throughput and reduced light exposure of the sample, due to a scanning speed of up to 60,000 nm/min and an automatic shutter function. Measurement of quantum yield over a wide wavelength range, from 240 to 800 nm. Accurate quantum yield measurements are the result of collecting the instrument spectral response and integrating-sphere correction factors before measuring the sample. A large selection of calculated parameters is provided by dedicated and easy-to-use software. During this video we will measure sodium salicylate in powder form, which is known to have a quantum yield value of 0.4 to 0.5.
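The corrected ratio is computed by the instrument's software, but the core definition is simple to express. The sketch below uses hypothetical photon counts and is not the F-7000's actual correction algorithm:

```python
def quantum_yield(photons_emitted, photons_absorbed):
    """Fluorescence quantum yield: photons emitted per photon absorbed.

    Both arguments are photon counts derived from corrected spectra; the
    result ranges from 0 (no fluorescence) to 1 (every absorbed photon
    is re-emitted).
    """
    if photons_absorbed <= 0:
        raise ValueError("sample must absorb some photons")
    return photons_emitted / photons_absorbed

# Illustration with made-up counts: sodium salicylate powder is quoted
# above at 0.4-0.5, so 450 photons emitted per 1,000 absorbed would
# give a quantum yield of 0.45.
```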
Molecular Biology, Issue 63, Powders, Quantum, Yield, F-7000, Quantum Yield, phosphor, chromaticity, Photo-luminescence
The WATCHMAN Left Atrial Appendage Closure Device for Atrial Fibrillation
Authors: Sven Möbius-Winkler, Marcus Sandri, Norman Mangner, Phillip Lurz, Ingo Dähnert, Gerhard Schuler.
Institutions: University of Leipzig Heart Center.
Atrial fibrillation (AF) is the most common cardiac arrhythmia, affecting an estimated 6 million people in the United States1. Since AF affects primarily elderly people, its prevalence increases in parallel with age; it is expected that 15.9 million Americans will be affected by the year 20502. Ischemic stroke occurs in 5% of non-anticoagulated AF patients each year. Current treatments for AF include rate control, rhythm control, and prevention of stroke3. The American College of Cardiology, American Heart Association, and European Society of Cardiology currently recommend rate control as the first course of therapy for AF3. Rate control is achieved by administration of pharmacological agents, such as β-blockers, that lower the heart rate until it reaches a less symptomatic state3. Rhythm control aims to return the heart to its normal sinus rhythm and is typically achieved through administration of antiarrhythmic drugs such as amiodarone, electrical cardioversion, or ablation therapy. Rhythm-control methods, however, have not been demonstrated to be superior to rate-control methods4-6. In fact, certain antiarrhythmic drugs have been shown to be associated with higher hospitalization rates, serious adverse effects3, or even increases in mortality in patients with structural heart defects7. Thus, treatment with antiarrhythmics is more often used when rate-control drugs are ineffective or contraindicated. Rate-control and antiarrhythmic agents relieve the symptoms of AF, including palpitations, shortness of breath, and fatigue8, but do not reliably prevent thromboembolic events6. Treatment with the anticoagulant drug warfarin significantly reduces the rate of stroke or embolism9,10. However, because of problems associated with its use, fewer than 50% of patients are treated with it. The therapeutic dose is affected by drug, dietary, and metabolic interactions, and thus requires detailed monitoring.
In addition, warfarin has the potential to cause severe, sometimes lethal, bleeding2. As an alternative, aspirin is commonly prescribed. While aspirin is typically well tolerated, it is far less effective at preventing stroke10. Other alternatives to warfarin, such as dabigatran11 or rivaroxaban12, demonstrate non-inferiority to warfarin with respect to thromboembolic events (in fact, dabigatran given at a high dose of 150 mg twice a day has shown superiority). While these drugs have the advantage of eliminating dietary concerns and the need for regular blood monitoring, major bleeding and associated complications, though somewhat less frequent than with warfarin, remain an issue13-15. Since 90% of AF-associated strokes result from emboli that arise from the left atrial appendage (LAA)2, one alternative approach to warfarin therapy has been to exclude the LAA using an implanted device that traps blood clots before they exit. Here, we demonstrate a procedure for implanting the WATCHMAN Left Atrial Appendage Closure Device. A transseptal cannula is inserted through the femoral vein, and under fluoroscopic guidance the inter-atrial septum is crossed. Once access to the left atrium has been achieved, a guidewire is placed in the upper pulmonary vein, and the WATCHMAN Access Sheath and dilator are advanced over the wire into the left atrium. The guidewire is removed, and the access sheath is carefully advanced into the distal portion of the LAA over a pigtail catheter. The WATCHMAN Delivery System is prepped, inserted into the access sheath, and slowly advanced. The WATCHMAN device is then deployed into the LAA. The device release criteria are confirmed via fluoroscopy and transesophageal echocardiography (TEE), and the device is released.
Medicine, Issue 60, atrial fibrillation, cardiology, cardiac, interventional cardiology, medical procedures, medicine, WATCHMAN, medical device, left atrial appendage
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings 3, 4, 5, 6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) 7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
The Dig Task: A Simple Scent Discrimination Reveals Deficits Following Frontal Brain Damage
Authors: Kris M. Martens, Cole Vonder Haar, Blake A. Hutsell, Michael R. Hoane.
Institutions: Southern Illinois University at Carbondale.
Cognitive impairment is the most frequent cause of disability in humans following brain damage, yet the behavioral tasks used to assess cognition in rodent models of brain injury are lacking. Borrowing from the operant literature, our laboratory utilized a basic scent discrimination paradigm1-4 in order to assess deficits in frontally injured rats. Previously, we briefly described the Dig task and demonstrated that rats with frontal brain damage show severe deficits across multiple tests within the task5. Here we present a more detailed protocol for this task. Rats are placed into a chamber and allowed to discriminate between two scented sands, one of which contains a reinforcer. The trial ends after the rat either correctly discriminates (defined as digging in the correctly scented sand), incorrectly discriminates, or 30 sec elapses. Rats that discriminate correctly are allowed to recover and consume the reinforcer. Rats that discriminate incorrectly are immediately removed from the chamber. This can continue through a variety of reversals and novel scents. The primary analysis is the accuracy for each scent pairing (the cumulative proportion correct for each scent). The general findings from the Dig task suggest that it is a simple experimental preparation that can assess deficits in rats with bilateral frontal cortical damage compared to rats with unilateral parietal damage. The Dig task can also be easily incorporated into an existing cognitive test battery. The use of more tasks such as this one can lead to more accurate testing of frontal function following injury, which may in turn lead to therapeutic options for treatment. All animal use was conducted in accordance with protocols approved by the Institutional Animal Care and Use Committee.
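The primary analysis (cumulative proportion correct per scent pairing) can be sketched in a few lines. This is a hypothetical data layout, not the authors' analysis code; trials that time out at 30 sec would be scored as incorrect here:

```python
def pairing_accuracy(trials):
    """Cumulative proportion correct for each scent pairing.

    `trials` is a list of (pairing, correct) tuples, one per trial,
    where `correct` is True for a correct discrimination and False
    for an incorrect discrimination or a 30 sec timeout.
    """
    totals, hits = {}, {}
    for pairing, correct in trials:
        totals[pairing] = totals.get(pairing, 0) + 1
        hits[pairing] = hits.get(pairing, 0) + (1 if correct else 0)
    return {p: hits[p] / totals[p] for p in totals}

# Hypothetical session: three trials on one scent pairing.
scores = pairing_accuracy([
    ("scentA/scentB", True),
    ("scentA/scentB", False),
    ("scentA/scentB", True),
])
```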
Neuroscience, Issue 71, Medicine, Neurobiology, Anatomy, Physiology, Psychology, Behavior, cognitive assessment, dig task, scent discrimination, olfactory, brain injury, traumatic brain injury, TBI, brain damage, rats, animal model
Rapid Colorimetric Assays to Qualitatively Distinguish RNA and DNA in Biomolecular Samples
Authors: Jennifer Patterson, Cameron Mura.
Institutions: University of Virginia.
Biochemical experimentation generally requires accurate knowledge, at an early stage, of the nucleic acid, protein, and other biomolecular components in potentially heterogeneous specimens. Nucleic acids can be detected via several established approaches, including analytical methods that are spectrophotometric (e.g., A260), fluorometric (e.g., binding of fluorescent dyes), or colorimetric (nucleoside-specific chromogenic chemical reactions).1 Though it cannot readily distinguish RNA from DNA, the A260/A280 ratio is commonly employed, as it offers a simple and rapid2 assessment of the relative content of nucleic acid, which absorbs predominantly near 260 nm, and protein, which absorbs primarily near 280 nm. Ratios < 0.8 are taken as indicative of 'pure' protein specimens, while pure nucleic acid (NA) is characterized by ratios > 1.5.3 However, there are scenarios in which the protein/NA content cannot be as clearly or reliably inferred from simple UV-Vis spectrophotometric measurements. For instance, (i) samples may contain one or more proteins that are relatively devoid of the aromatic amino acids responsible for absorption at ≈280 nm (Trp, Tyr, Phe), as is the case with some small RNA-binding proteins, and (ii) samples can exhibit intermediate A260/A280 ratios (between ~0.8 and ~1.5), where the protein/NA content is far less clear and may even reflect some high-affinity association between the protein and NA components. For such scenarios, we describe herein a suite of colorimetric assays to rapidly distinguish RNA, DNA, and reducing sugars in a potentially mixed sample of biomolecules. The methods rely on the differential sensitivity of pentoses and other carbohydrates to Benedict's, Bial's (orcinol), and Dische's (diphenylamine) reagents; the streamlined protocols can be completed in a matter of minutes, without any additional steps of having to isolate the components.
The assays can be performed in parallel to differentiate between RNA and DNA, as well as indicate the presence of free reducing sugars such as glucose, fructose, and ribose (Figure 1).
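The A260/A280 screening logic described above reduces to a simple threshold rule. A minimal sketch, using only the cutoffs quoted in this abstract (a heuristic triage, not a definitive assay):

```python
def classify_a260_a280(a260, a280):
    """Rough interpretation of the A260/A280 ratio.

    Uses the thresholds quoted above: < 0.8 suggests pure protein,
    > 1.5 suggests pure nucleic acid, and intermediate ratios call
    for follow-up (e.g., the colorimetric assays described here).
    """
    ratio = a260 / a280
    if ratio < 0.8:
        return "likely pure protein"
    if ratio > 1.5:
        return "likely pure nucleic acid"
    return "mixed or ambiguous; confirm with colorimetric assays"
```

Note that the intermediate branch is exactly the scenario motivating the colorimetric assays: the ratio alone cannot resolve the protein/NA content.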
Chemistry, Issue 72, Biochemistry, Chemical Biology, Genetics, Molecular Biology, Cellular Biology, Nucleic Acids, DNA, RNA, Proteins, analytical chemistry, Benedict's assay, Bial's orcinol assay, Dische's diphenylamine assay, colorimetric assay, reducing sugar, purification, transcription, reaction, assay
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and of protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and of complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold-specificity stage and a binding-affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
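The rank-ordered output of each stage can be sketched generically. The scoring function below is a hypothetical stand-in (Protein WISDOM's actual potential-energy, fold-specificity, and binding-affinity models are far more involved):

```python
def rank_designs(sequences, score):
    """Rank candidate sequences by a scoring function, lower is better,
    mirroring the rank-ordered list each design stage produces.

    `score` maps a sequence string to a number; here it stands in for
    a potential-energy, fold-specificity, or binding-affinity score.
    """
    return sorted(sequences, key=score)

# Toy usage: a made-up score that counts a disfavored residue ("G"),
# purely to illustrate the ranking mechanics.
ranked = rank_designs(["GAVL", "GGGG", "GAGA"], lambda s: s.count("G"))
```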
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Training Synesthetic Letter-color Associations by Reading in Color
Authors: Olympia Colizoli, Jaap M. J. Murre, Romke Rouw.
Institutions: University of Amsterdam.
Synesthesia is a rare condition in which a stimulus from one modality automatically and consistently triggers unusual sensations in the same and/or other modalities. A relatively common and well-studied type is grapheme-color synesthesia, defined as the consistent experience of color when viewing, hearing and thinking about letters, words and numbers. We describe our method for investigating to what extent synesthetic associations between letters and colors can be learned by reading in color in nonsynesthetes. Reading in color is a special method for training associations in the sense that the associations are learned implicitly while the reader reads text as he or she normally would and it does not require explicit computer-directed training methods. In this protocol, participants are given specially prepared books to read in which four high-frequency letters are paired with four high-frequency colors. Participants receive unique sets of letter-color pairs based on their pre-existing preferences for colored letters. A modified Stroop task is administered before and after reading in order to test for learned letter-color associations and changes in brain activation. In addition to objective testing, a reading experience questionnaire is administered that is designed to probe for differences in subjective experience. A subset of questions may predict how well an individual learned the associations from reading in color. Importantly, we are not claiming that this method will cause each individual to develop grapheme-color synesthesia, only that it is possible for certain individuals to form letter-color associations by reading in color and these associations are similar in some aspects to those seen in developmental grapheme-color synesthetes. The method is quite flexible and can be used to investigate different aspects and outcomes of training synesthetic associations, including learning-induced changes in brain function and structure.
Behavior, Issue 84, synesthesia, training, learning, reading, vision, memory, cognition
Investigating the Function of Deep Cortical and Subcortical Structures Using Stereotactic Electroencephalography: Lessons from the Anterior Cingulate Cortex
Authors: Robert A. McGovern, Tarini Ratneswaren, Elliot H. Smith, Jennifer F. Russo, Amy C. Jongeling, Lisa M. Bateman, Catherine A. Schevon, Neil A. Feldstein, Guy M. McKhann, II, Sameer Sheth.
Institutions: Columbia University Medical Center, New York Presbyterian Hospital, King's College London.
Stereotactic Electroencephalography (SEEG) is a technique used to localize seizure foci in patients with medically intractable epilepsy. This procedure involves the chronic placement of multiple depth electrodes into regions of the brain typically inaccessible via subdural grid electrode placement. SEEG thus provides a unique opportunity to investigate brain function. In this paper we demonstrate how SEEG can be used to investigate the role of the dorsal anterior cingulate cortex (dACC) in cognitive control. We include a description of the SEEG procedure, demonstrating the surgical placement of the electrodes. We describe the components and process required to record local field potential (LFP) data from consenting subjects while they are engaged in a behavioral task. In the example provided, subjects play a cognitive interference task, and we demonstrate how signals are recorded and analyzed from electrodes in the dorsal anterior cingulate cortex, an area intimately involved in decision-making. We conclude with further suggestions of ways in which this method can be used for investigating human cognitive processes.
Neuroscience, Issue 98, epilepsy, stereotactic electroencephalography, anterior cingulate cortex, local field potential, electrode placement
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.
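JoVE has not published the details of its matching algorithm, but abstract-to-video matching of this kind is commonly built on text similarity. A toy sketch of one such approach (bag-of-words cosine similarity over hypothetical inputs), not JoVE's actual system:

```python
import math
from collections import Counter

def cosine_similarity(text_a, text_b):
    """Cosine similarity between bag-of-words vectors of two texts."""
    va = Counter(text_a.lower().split())
    vb = Counter(text_b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

def top_matches(abstract, video_texts, k=10):
    """Rank video descriptions by similarity to an abstract, keep top k."""
    ranked = sorted(video_texts,
                    key=lambda v: cosine_similarity(abstract, v),
                    reverse=True)
    return ranked[:k]
```

A production system would add weighting (e.g., TF-IDF) so that rare domain terms dominate common words; with plain counts, a weakly related video can still surface when no strongly related one exists, which is the behavior described above.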