Cognitive impairments affect the majority of patients with schizophrenia, and these impairments predict poor long-term psychosocial outcomes. Treatment studies aimed at cognitive impairment in schizophrenia must demonstrate not only improvements on cognitive tests but also evidence that any cognitive changes lead to clinically meaningful improvements. Measures of “functional capacity” index the extent to which individuals have the potential to perform skills required for real-world functioning. Current data do not support the recommendation of any single instrument for measuring functional capacity. The Virtual Reality Functional Capacity Assessment Tool (VRFCAT) is a novel, interactive, gaming-based measure of functional capacity that uses a realistic simulated environment to recreate routine activities of daily living. Studies are currently underway to evaluate and establish the VRFCAT’s sensitivity, reliability, validity, and practicality. This new measure of functional capacity is practical, relevant, easy to use, and has several features that improve the validity and sensitivity of measuring function in clinical trials of patients with CNS disorders.
Making Sense of Listening: The IMAP Test Battery
Institutions: MRC Institute of Hearing Research, National Biomedical Research Unit in Hearing.
The ability to hear is only the first step towards making sense of the range of information contained in an auditory signal. Of equal importance are the abilities to extract and use the information encoded in the auditory signal. We refer to these as listening skills (or auditory processing, AP). Deficits in these skills are associated with delayed language and literacy development, though the nature of the relevant deficits and their causal connection with these delays are hotly debated.
When a child is referred to a health professional with normal hearing and unexplained difficulties in listening, or associated delays in language or literacy development, they should ideally be assessed with a combination of psychoacoustic (AP) tests, suitable for children and for use in a clinic, together with cognitive tests to measure attention, working memory, IQ, and language skills. Such a detailed examination needs to be relatively short and within the technical capability of any suitably qualified professional. Current tests for the presence of AP deficits tend to be poorly constructed and inadequately validated within the normal population. They have little or no reference to the presenting symptoms of the child, and typically include a linguistic component. Poor performance may thus reflect problems with language rather than with AP. To assist in the assessment of children with listening difficulties, pediatric audiologists need a single, standardized child-appropriate test battery based on the use of language-free stimuli.
We present the IMAP test battery, which was developed at the MRC Institute of Hearing Research to supplement tests currently used to investigate cases of suspected AP deficits. IMAP assesses a range of relevant auditory and cognitive skills and takes about one hour to complete. It has been standardized in 1500 normally-hearing children from across the UK, aged 6-11 years. Since its development, it has been successfully used in a number of large-scale studies both in the UK and the USA. IMAP provides measures for separating out sensory from cognitive contributions to hearing. It further limits confounds due to procedural effects by presenting tests in a child-friendly game format. Stimulus generation, management of test protocols, and control of test presentation are mediated by the IHR-STAR software platform. This provides a standardized methodology for a range of applications and ensures replicable procedures across testers. IHR-STAR provides a flexible, user-programmable environment that currently has additional applications for hearing screening, mapping cochlear implant electrodes, and academic research or teaching.
Neuroscience, Issue 44, Listening skills, auditory processing, auditory psychophysics, clinical assessment, child-friendly testing
Measuring Frailty in HIV-infected Individuals: Identification of Frail Patients is the First Step to Amelioration and Reversal of Frailty
Institutions: University of Arizona, University of Arizona.
A simple, validated protocol consisting of a battery of tests is available to identify elderly patients with frailty syndrome. This syndrome of decreased reserve and resistance to stressors increases in incidence with increasing age. In the elderly, frailty may follow a step-wise loss of function from non-frail to pre-frail to frail. We studied frailty in HIV-infected patients and found that ~20% are frail according to the Fried phenotype, using stringent criteria developed for the elderly [1,2]. In HIV infection the syndrome occurs at a younger age.
HIV patients were checked for 1) unintentional weight loss; 2) slowness, as determined by walking speed; 3) weakness, as measured by a grip dynamometer; 4) exhaustion, by responses to a depression scale; and 5) low physical activity, as determined by assessing kilocalories expended in a week's time. Pre-frailty was present if any two of the five criteria were abnormal, and frailty was present if any three of the five criteria were abnormal.
The tests take approximately 10-15 min to complete and they can be performed by medical assistants during routine clinic visits. Test results are scored by referring to standard tables. Understanding which of the five components contribute to frailty in an individual patient can allow the clinician to address relevant underlying problems, many of which are not evident in routine HIV clinic visits.
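The two-of-five / three-of-five scoring rule described above can be sketched as a small function. This is an illustrative sketch only: the criterion names are placeholders, and whether each component is "abnormal" must first be determined from the standard tables mentioned in the protocol.

```python
def classify_frailty(criteria):
    """Classify a patient from the five Fried components.

    `criteria` maps each component name to True if that component is
    abnormal (per the protocol's standard tables), else False.
    """
    expected = {"weight_loss", "slowness", "weakness", "exhaustion", "low_activity"}
    if set(criteria) != expected:
        raise ValueError("expected exactly the five Fried criteria")
    abnormal = sum(bool(v) for v in criteria.values())
    if abnormal >= 3:          # any three of five abnormal -> frail
        return "frail"
    if abnormal == 2:          # any two of five abnormal -> pre-frail
        return "pre-frail"
    return "non-frail"
```

For example, a patient with abnormal slowness, weakness, and exhaustion would be classified as frail; a patient with only slowness and exhaustion as pre-frail.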
Medicine, Issue 77, Infection, Virology, Infectious Diseases, Anatomy, Physiology, Molecular Biology, Biomedical Engineering, Retroviridae Infections, Body Weight Changes, Diagnostic Techniques and Procedures, Physical Examination, Muscle Strength, Behavior, Virus Diseases, Pathological Conditions, Signs and Symptoms, Diagnosis, Musculoskeletal and Neural Physiological Phenomena, HIV, HIV-1, AIDS, Frailty, Depression, Weight Loss, Weakness, Slowness, Exhaustion, Aging, clinical techniques
Generation of Comprehensive Thoracic Oncology Database - Tool for Translational Research
Institutions: University of Chicago, University of Chicago, Northshore University Health Systems, University of Chicago, University of Chicago, University of Chicago.
The Thoracic Oncology Program Database Project was created to serve as a comprehensive, verified, and accessible repository for well-annotated cancer specimens and clinical data to be available to researchers within the Thoracic Oncology Research Program. This database also captures a large volume of genomic and proteomic data obtained from various tumor tissue studies. A team of clinical and basic science researchers, a biostatistician, and a bioinformatics expert was convened to design the database. Variables of interest were clearly defined and their descriptions were written within a standard operating manual to ensure consistency of data annotation. Using a protocol for prospective tissue banking and another protocol for retrospective banking, tumor and normal tissue samples from patients consented to these protocols were collected. Clinical information such as demographics, cancer characterization, and treatment plans for these patients were abstracted and entered into an Access database. Proteomic and genomic data have been included in the database and have been linked to clinical information for patients described within the database. The data from each table were linked using the relationships function in Microsoft Access to allow the database manager to connect clinical and laboratory information during a query. The queried data can then be exported for statistical analysis and hypothesis generation.
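The relational linking described above (clinical tables joined to laboratory tables through patient identifiers) can be sketched in miniature. This uses SQLite in place of Microsoft Access, and all table and column names are hypothetical, but the join mirrors what the Access relationships function does when a query connects clinical and laboratory information.

```python
import sqlite3

# Minimal stand-in for the database schema: one clinical table and one
# laboratory (proteomics) table, linked by patient_id.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE patients (patient_id INTEGER PRIMARY KEY, age INTEGER, stage TEXT);
CREATE TABLE proteomics (sample_id INTEGER PRIMARY KEY,
                         patient_id INTEGER REFERENCES patients(patient_id),
                         marker TEXT, level REAL);
""")
con.execute("INSERT INTO patients VALUES (1, 63, 'IIIA')")
con.execute("INSERT INTO proteomics VALUES (10, 1, 'EGFR', 2.4)")

# A query joining clinical and laboratory data; the result set is what
# would be exported for statistical analysis and hypothesis generation.
rows = con.execute("""
    SELECT p.patient_id, p.stage, m.marker, m.level
    FROM patients p JOIN proteomics m ON p.patient_id = m.patient_id
""").fetchall()
```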
Medicine, Issue 47, Database, Thoracic oncology, Bioinformatics, Biorepository, Microsoft Access, Proteomics, Genomics
Intra-Operative Behavioral Tasks in Awake Humans Undergoing Deep Brain Stimulation Surgery
Institutions: Harvard Medical School, Massachusetts General Hospital.
Deep brain stimulation (DBS) is a surgical procedure that directs chronic, high-frequency electrical stimulation to specific targets in the brain through implanted electrodes. Deep brain stimulation was first implemented as a therapeutic modality by Benabid et al. in the late 1980s, when they used this technique to stimulate the ventral intermediate nucleus of the thalamus for the treatment of tremor [1]. Currently, the procedure is used to treat patients who fail to respond adequately to medical management for diseases such as Parkinson's disease, dystonia, and essential tremor. The efficacy of this procedure for the treatment of Parkinson's disease has been demonstrated in well-powered, randomized controlled trials [2]. Presently, the U.S. Food and Drug Administration has approved DBS as a treatment for patients with medically refractory essential tremor, Parkinson's disease, and dystonia. Additionally, DBS is currently being evaluated for the treatment of other psychiatric and neurological disorders, such as obsessive-compulsive disorder, major depressive disorder, and epilepsy.
DBS has not only been shown to help people by improving their quality of life; it also provides researchers with a unique opportunity to study and understand the human brain. Microelectrode recordings are routinely performed during DBS surgery in order to enhance the precision of anatomical targeting. Firing patterns of individual neurons can therefore be recorded while the subject performs a behavioral task. Early studies using these data focused on descriptive aspects, including firing and burst rates and frequency modulation [3]. More recent studies have focused on cognitive aspects of behavior in relation to neuronal activity [4,5]. This article will provide a description of the intra-operative methods used to perform behavioral tasks and record neuronal data from awake patients during DBS cases. Our exposition of the process of acquiring electrophysiological data will illuminate the current scope and limitations of intra-operative human experiments.
Medicine, Issue 47, Intra-Operative Physiology, Cognitive Neuroscience, Behavioral Neuroscience, Subthalamic Nucleus, Single-Unit Activity, Parkinson Disease, Deep Brain Stimulation
Eye Movement Monitoring of Memory
Institutions: Rotman Research Institute, University of Toronto, University of Toronto.
Explicit (often verbal) reports are typically used to investigate memory (e.g., "Tell me what you remember about the person you saw at the bank yesterday."); however, such reports can often be unreliable or sensitive to response bias [1], and may be unobtainable in some participant populations. Furthermore, explicit reports only reveal when information has reached consciousness and cannot comment on when memories were accessed during processing, regardless of whether the information is subsequently accessed in a conscious manner. Eye movement monitoring (eye tracking) provides a tool by which memory can be probed without asking participants to comment on the contents of their memories, and access of such memories can be revealed on-line [2,3]. Video-based eye trackers (either head-mounted or remote) use a system of cameras and infrared markers to examine the pupil and corneal reflection in each eye as the participant views a display monitor. For head-mounted eye trackers, infrared markers are also used to determine head position to allow for head movement and more precise localization of eye position. Here, we demonstrate the use of a head-mounted eye tracking system to investigate memory performance in neurologically-intact and neurologically-impaired adults. Eye movement monitoring procedures begin with the placement of the eye tracker on the participant and setup of the head and eye cameras. Calibration and validation procedures are conducted to ensure accuracy of eye position recording. Real-time recordings of X,Y-coordinate positions on the display monitor are then converted and used to describe periods of time in which the eye is static (i.e., fixations) versus in motion (i.e., saccades). Fixations and saccades are time-locked with respect to the onset/offset of a visual display or another external event (e.g., button press). Experimental manipulations are constructed to examine how and when patterns of fixations and saccades are altered through different types of prior experience. The influence of memory is revealed in the extent to which scanning patterns to new images differ from scanning patterns to images that have been previously studied [2,4-5]. Memory can also be interrogated for its specificity; for instance, eye movement patterns that differ between an identical and an altered version of a previously studied image reveal the storage of the altered detail in memory [2-3,6-8]. These indices of memory can be compared across participant populations, thereby providing a powerful tool by which to examine the organization of memory in healthy individuals, and the specific changes that occur to memory with neurological insult or decline [2-3,8-10].
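The conversion of raw X,Y gaze samples into fixations versus saccades is commonly done with a velocity-threshold (I-VT) rule, sketched below. This is an illustrative sketch, not the authors' pipeline: the sampling rate, degrees-per-pixel factor, and 30 deg/s threshold are assumed example values.

```python
import math

def classify_samples(xy, hz=500, threshold_deg_s=30.0, deg_per_px=0.03):
    """Label each inter-sample interval 'fixation' or 'saccade'.

    xy: list of (x, y) gaze positions in pixels, sampled at `hz` Hz.
    A sample-to-sample angular velocity above the threshold marks a
    saccade; below it, a fixation (simple velocity-threshold rule).
    """
    labels = []
    for i in range(1, len(xy)):
        dx = xy[i][0] - xy[i - 1][0]
        dy = xy[i][1] - xy[i - 1][1]
        vel = math.hypot(dx, dy) * deg_per_px * hz  # deg/sec
        labels.append("saccade" if vel > threshold_deg_s else "fixation")
    return labels
```

Consecutive "fixation" labels would then be merged into fixation periods and time-locked to display onsets or button presses, as described above.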
Neuroscience, Issue 42, eye movement monitoring, eye tracking, memory, aging, amnesia, visual processing
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) contrast the effects of the proposed technique and that of using intact limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring inertial properties of known geometrical solids. When direct measurements of inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity compared with inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia property estimates on study outcomes. For stance, either one of the two inertial models investigated in our study would likely lead to similar outcomes with an inverse dynamics assessment.
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
Optimized Negative Staining: a High-throughput Protocol for Examining Small and Asymmetric Protein Structure by Electron Microscopy
Institutions: The Molecular Foundry.
Structural determination is rather challenging for proteins with molecular masses between 40 - 200 kDa. Considering that more than half of natural proteins have a molecular mass in this range [1,2], a robust and high-throughput method with nanometer-resolution capability is needed. Negative staining (NS) electron microscopy (EM) is an easy, rapid, and qualitative approach that has frequently been used in research laboratories to examine protein structure and protein-protein interactions. Unfortunately, conventional NS protocols often generate structural artifacts on proteins, especially with lipoproteins, which usually present rouleaux artifacts. By using images of lipoproteins from cryo-electron microscopy (cryo-EM) as a standard, the key parameters in NS specimen preparation conditions were recently screened and reported as the optimized NS protocol (OpNS), a modified conventional NS protocol [3]. Artifacts like rouleaux can be greatly limited by OpNS, which additionally provides high contrast along with reasonably high-resolution (near 1 nm) images of small and asymmetric proteins. These high-resolution, high-contrast images are even favorable for 3D reconstruction of an individual protein (a single object, no averaging), such as a 160 kDa antibody, through the method of electron tomography [4,5]. Moreover, OpNS can be a high-throughput tool to examine hundreds of samples of small proteins. For example, the previously published mechanism of the 53 kDa cholesteryl ester transfer protein (CETP) involved the screening and imaging of hundreds of samples [6]. Considering that cryo-EM rarely succeeds in imaging proteins less than 200 kDa, and that no cryo-EM study has yet involved screening over one hundred sample conditions, it is fair to call OpNS a high-throughput method for studying small proteins. We hope the OpNS protocol presented here can be a useful tool to push the boundaries of EM and accelerate EM studies into small protein structure, dynamics, and mechanisms.
Environmental Sciences, Issue 90, small and asymmetric protein structure, electron microscopy, optimized negative staining
Laboratory Drop Towers for the Experimental Simulation of Dust-aggregate Collisions in the Early Solar System
Institutions: Technische Universität Braunschweig.
For the purpose of investigating the evolution of dust aggregates in the early Solar System, we developed two vacuum drop towers in which fragile dust aggregates with sizes up to ~10 cm and porosities up to 70% can be collided. One of the drop towers is primarily used for very low impact speeds down to below 0.01 m/sec and makes use of a double release mechanism. Collisions are recorded in stereo-view by two high-speed cameras, which fall along the glass vacuum tube in the center-of-mass frame of the two dust aggregates. The other free-fall tower makes use of an electromagnetic accelerator that is capable of gently accelerating dust aggregates to up to 5 m/sec. In combination with the release of another dust aggregate to free fall, collision speeds up to ~10 m/sec can be achieved. Here, two fixed high-speed cameras record the collision events. In both drop towers, the dust aggregates are in free fall during the collision so that they are weightless and match the conditions in the early Solar System.
Physics, Issue 88, astrophysics, planet formation, collisions, granular matter, high-speed imaging, microgravity drop tower
A Standardized Obstacle Course for Assessment of Visual Function in Ultra Low Vision and Artificial Vision
Institutions: University of Pittsburgh, University of Pittsburgh.
We describe an indoor, portable, standardized course that can be used to evaluate obstacle avoidance in persons who have ultra-low vision. Six sighted controls and 36 completely blind but otherwise healthy adults (29 male, 13 female; age range 19-85 years) were enrolled in one of three studies involving testing of the BrainPort sensory substitution device. Subjects were asked to navigate the course prior to, and after, BrainPort training. They completed a total of 837 course runs in two different locations. Means and standard deviations were calculated across control types, courses, lights, and visits. We used a linear mixed-effects model to compare different categories in the PPWS (percent preferred walking speed) and percent-error data to show that the course iterations were properly designed. The course is relatively inexpensive, simple to administer, and has been shown to be a feasible way to test mobility function. Data analysis demonstrates, for both percent error and percentage preferred walking speed, that each of the three courses is different and that, within each level, the three iterations are equivalent. This allows for randomization of the courses during administration.
Abbreviations: preferred walking speed (PWS); course speed (CS); percentage preferred walking speed (PPWS).
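PPWS relates the speed achieved on the obstacle course to the subject's unobstructed preferred walking speed. A minimal sketch, assuming the common definition (course speed as a percentage of preferred speed; the study's exact computation may differ):

```python
def ppws(course_speed, preferred_walking_speed):
    """Percentage preferred walking speed (PPWS).

    course_speed: speed achieved while navigating the course (m/s).
    preferred_walking_speed: unobstructed preferred speed (m/s).
    Returns course speed as a percentage of preferred speed.
    """
    if preferred_walking_speed <= 0:
        raise ValueError("preferred walking speed must be positive")
    return 100.0 * course_speed / preferred_walking_speed
```

A subject walking the course at half their preferred speed would score a PPWS of 50.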
Medicine, Issue 84, Obstacle course, navigation assessment, BrainPort, wayfinding, low vision
Modeling Biological Membranes with Circuit Boards and Measuring Electrical Signals in Axons: Student Laboratory Exercises
Institutions: University of Kentucky, University of Toronto.
This is a demonstration of how electrical models can be used to characterize biological membranes. This exercise also introduces biophysical terminology used in electrophysiology. The same equipment is used in the membrane model as on live preparations. Some properties of an isolated nerve cord are investigated: nerve action potentials, recruitment of neurons, and responsiveness of the nerve cord to environmental factors.
Basic Protocols, Issue 47, Invertebrate, Crayfish, Modeling, Student laboratory, Nerve cord
Synthesis and Characterization of Functionalized Metal-organic Frameworks
Institutions: Northwestern University, Warsaw University of Technology, King Abdulaziz University.
Metal-organic frameworks have attracted extraordinary amounts of research attention, as they are attractive candidates for numerous industrial and technological applications. Their signature property is their ultrahigh porosity, which however imparts a series of challenges when it comes to both constructing them and working with them. Securing desired MOF chemical and physical functionality by linker/node assembly into a highly porous framework of choice can pose difficulties, as less porous and more thermodynamically stable congeners (e.g., other crystalline polymorphs, catenated analogues) are often preferentially obtained by conventional synthesis methods. Once the desired product is obtained, its characterization often requires specialized techniques that address complications potentially arising from, for example, guest-molecule loss or preferential orientation of microcrystallites. Finally, accessing the large voids inside the MOFs for use in applications that involve gases can be problematic, as frameworks may be subject to collapse during removal of solvent molecules (remnants of solvothermal synthesis). In this paper, we describe synthesis and characterization methods routinely utilized in our lab either to solve or circumvent these issues. The methods include solvent-assisted linker exchange, powder X-ray diffraction in capillaries, and materials activation (cavity evacuation) by supercritical CO2 drying. Finally, we provide a protocol for determining a suitable pressure region for applying the Brunauer-Emmett-Teller analysis to nitrogen isotherms, so as to estimate the surface area of MOFs with good accuracy.
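Once a suitable relative-pressure region has been chosen, the BET analysis itself is a linear fit of the transformed isotherm y = (P/P0) / (v (1 - P/P0)) against x = P/P0, whose slope and intercept give the monolayer capacity v_m and BET constant C. The sketch below shows this standard transform and fit; it assumes the pressure points supplied already lie in a region selected for linearity, as the protocol describes.

```python
def bet_monolayer(p_rel, v_ads):
    """Fit the linearized BET equation and return (v_m, C).

    p_rel: relative pressures P/P0 within the chosen linear region.
    v_ads: adsorbed N2 quantities at those pressures (same units as v_m).
    y = x / (v (1 - x)) is linear in x with slope (C-1)/(v_m C)
    and intercept 1/(v_m C).
    """
    xs = list(p_rel)
    ys = [x / (v * (1.0 - x)) for x, v in zip(xs, v_ads)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    v_m = 1.0 / (slope + intercept)
    C = slope / intercept + 1.0
    return v_m, C
```

The surface area then follows from v_m via the molecular cross-sectional area of N2.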
Chemistry, Issue 91, Metal-organic frameworks, porous coordination polymers, supercritical CO2 activation, crystallography, solvothermal, sorption, solvent-assisted linker exchange
Gibberella zeae Ascospore Production and Collection for Microarray Experiments
Institutions: USDA, University of Minnesota/ Agroinnova, University of Torino, University of Minnesota.
Fusarium graminearum Schwabe (teleomorph Gibberella zeae) is a plant pathogen causing scab disease on wheat and barley that reduces crop yield and grain quality. F. graminearum also causes stalk and ear rots of maize and is a producer of mycotoxins such as the trichothecenes that contaminate grain and are harmful to humans and livestock (Goswami and Kistler, 2004).
The fungus produces two types of spores. Ascospores, the propagules resulting from sexual reproduction, are the main source of primary infection. These spores are forcibly discharged from mature perithecia and dispersed by wind (Francl et al., 1999). Secondary infections are mainly caused by macroconidia, which are produced by asexual means on the plant surface. To study the developmental processes of ascospores in this fungus, a procedure for their collection in large quantity under sterile conditions was required. Our protocol was filmed in order to convey the most complete information for understanding and reproducibility, both crucial when full-genome gene expression profiles are generated and interpreted. In particular, the variability of ascospore germination and biological activity depends on the prior manipulation of the material. The use of video for documenting every step in ascospore production is proposed in order to increase standardization, complying with the increasingly stringent requirements for microarray analysis. The procedure requires only standard laboratory equipment. Steps are shown to prevent contamination and favor time synchronization of ascospores.
Plant Biology, Issue 1, sexual cross, spore separation, MIAME standards
Electroantennographic Bioassay as a Screening Tool for Host Plant Volatiles
Institutions: Agricultural Research Service.
Plant volatiles play an important role in plant-insect interactions. Herbivorous insects use plant volatiles, known as kairomones, to locate their host plant [1,2]. When a host plant is an important agronomic commodity, feeding damage by insect pests can inflict serious economic losses on growers. Accordingly, kairomones can be used as attractants to lure or confuse these insects and thus offer an environmentally friendly alternative to pesticides for insect control [3]. Unfortunately, plants can emit a vast number of volatiles, with varying compositions and ratios of emission depending on the phenology of the commodity or the time of day. This makes identification of biologically active components or blends of volatile components an arduous process. To help identify the bioactive components of host plant volatile emissions we employ the laboratory-based screening bioassay electroantennography (EAG). EAG is an effective tool to evaluate and record electrophysiologically the olfactory responses of an insect via its antennal receptors. The EAG screening process can help reduce the number of volatiles tested to identify promising bioactive components. However, EAG bioassays only provide information about activation of receptors. They do not reveal what type of behavior the compound elicits, which could be attraction, repellence, or another behavioral response. Volatiles eliciting a significant response by EAG, relative to an appropriate positive control, are typically taken on to further testing of the behavioral responses of the insect pest. The experimental design presented here details the methodology employed to screen almond-based host plant volatiles [4,5] by measuring the electrophysiological antennal responses of an adult insect pest, the navel orangeworm (Amyelois transitella), to single components and simple blends of components via EAG bioassay. The method utilizes two excised antennae placed across a "fork" electrode holder. The protocol demonstrated here presents a rapid, high-throughput, standardized method for screening volatiles. Each volatile is presented at a set, constant amount to standardize the stimulus level and thus allow antennal responses to be indicative of relative chemoreceptivity. The negative control helps eliminate the electrophysiological response to both residual solvent and the mechanical force of the puff. The positive control (in this instance acetophenone) is a single compound that has elicited a consistent response from male and female navel orangeworm (NOW) moths. An additional semiochemical standard that provides a consistent response and is used for bioassay studies with the male NOW moth is (Z,Z)-11,13-hexadecadienal, an aldehyde component of the female-produced sex pheromone [6-8].
Plant Biology, Issue 63, bioassay, chemoreception, electroantennography, electrophysiological response, high-throughput, host-plant volatiles, navel orangeworm, screening tool
Community-based Adapted Tango Dancing for Individuals with Parkinson's Disease and Older Adults
Institutions: Emory University School of Medicine, Brigham and Women's Hospital and Massachusetts General Hospital.
Adapted tango dancing improves mobility and balance in older adults and additional populations with balance impairments. It is composed of very simple step elements. Adapted tango involves movement initiation and cessation, multi-directional perturbations, varied speeds and rhythms. Focus on foot placement, whole body coordination, and attention to partner, path of movement, and aesthetics likely underlie adapted tango’s demonstrated efficacy for improving mobility and balance. In this paper, we describe the methodology to disseminate the adapted tango teaching methods to dance instructor trainees and to implement the adapted tango by the trainees in the community for older adults and individuals with Parkinson’s Disease (PD). Efficacy in improving mobility (measured with the Timed Up and Go, Tandem stance, Berg Balance Scale, Gait Speed and 30 sec chair stand), safety and fidelity of the program is maximized through targeted instructor and volunteer training and a structured detailed syllabus outlining class practices and progression.
Behavior, Issue 94, Dance, tango, balance, pedagogy, dissemination, exercise, older adults, Parkinson's Disease, mobility impairments, falls
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource-Limited Settings
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance, to track transmission of viral strains already resistant to ART, is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource-limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open-access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open-source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the ViroSeq genotyping method. Limitations of the method described here include the fact that it is not automated and that it failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
Analysis of Volatile and Oxidation Sensitive Compounds Using a Cold Inlet System and Electron Impact Mass Spectrometry
Institutions: Bielefeld University.
This video presents a protocol for the mass spectrometric analysis of volatile and oxidation-sensitive compounds using electron impact ionization. The analysis of volatile and oxidation-sensitive compounds by mass spectrometry is not easily achieved, as all state-of-the-art mass spectrometric methods require at least one sample preparation step, e.g., dissolution and dilution of the analyte (electrospray ionization), co-crystallization of the analyte with a matrix compound (matrix-assisted laser desorption/ionization), or transfer of the prepared samples into the ionization source of the mass spectrometer, to be conducted under atmospheric conditions. Here, the use of a sample inlet system is described which enables the analysis of volatile metal organyls, silanes, and phosphanes using a sector field mass spectrometer equipped with an electron impact ionization source. All sample preparation steps and the sample introduction into the ion source of the mass spectrometer take place either under air-free conditions or under vacuum, enabling the analysis of compounds highly susceptible to oxidation. The presented technique is especially of interest for inorganic chemists working with metal organyls, silanes, or phosphanes, which have to be handled under inert conditions, such as with the Schlenk technique. The principle of operation is presented in this video.
Chemistry, Issue 91, mass spectrometry, electron impact, inlet system, volatile, air sensitive
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
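The software-guided setup of experiment combinations described above starts from a factorial design over the selected parameters. A minimal sketch of a two-level full factorial design is shown below; the factor names and levels are hypothetical stand-ins for the construct, growth, and incubation parameters studied, not the actual values used.

```python
from itertools import product

def full_factorial(levels):
    """Return every combination of factor levels as a list of run dictionaries."""
    names = list(levels)
    return [dict(zip(names, combo)) for combo in product(*levels.values())]

# Hypothetical two-level factors (k = 3 gives 2^3 = 8 runs).
factors = {
    "promoter": ["35S", "nos"],      # regulatory element in the expression construct
    "temperature_C": [22, 25],       # incubation condition during expression
    "plant_age_weeks": [6, 7],       # plant growth/development parameter
}
runs = full_factorial(factors)
```

DoE software typically augments such a grid with center points and replicates; the point here is only that the run list grows as 2^k, which is why the paper splits the initial problem into smaller modules.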
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, like structural MRI, that provide high spatial resolution to overcome this constraint [1]. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited [2], because the composition and spatial configuration of head tissues changes dramatically over development [3].
In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
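The minimum-norm estimation named in the keywords reduces, at its core, to a regularized linear inverse: given a lead-field matrix L (sensors × sources) derived from the head model and sensor data y, the estimate is x = Lᵀ(LLᵀ + λI)⁻¹y. A minimal numerical sketch is below; the random lead field and dimensions are placeholders, not a real pediatric head model.

```python
import numpy as np

def minimum_norm_estimate(L, y, lam=1e-2):
    """Return the regularized minimum-norm source estimate for sensor data y."""
    n_sensors = L.shape[0]
    gram = L @ L.T + lam * np.eye(n_sensors)   # regularized sensor-space Gram matrix
    return L.T @ np.linalg.solve(gram, y)      # x = L^T (L L^T + lam I)^{-1} y

# Toy forward model: 32 sensors, 100 cortical sources, one active source.
rng = np.random.default_rng(0)
L = rng.standard_normal((32, 100))
x_true = np.zeros(100)
x_true[7] = 1.0
y = L @ x_true
x_hat = minimum_norm_estimate(L, y)
```

In practice, packages compute L from the (individual or age-specific) head model via a boundary- or finite-element forward solution; the inverse step itself is the few lines above.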
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation.
The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin-embedded stained electron tomography, and focused ion beam- and serial block face-scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. For a given combination of data set characteristics, typically one of these four categorical approaches outperforms the others, although depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
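The simplest entry point among the semi-automated approaches above is intensity thresholding followed by a volume-fraction measurement. The sketch below uses a synthetic toy volume; real EM data would additionally need noise filtering and connected-component analysis before quantification.

```python
import numpy as np

def threshold_segment(volume, threshold):
    """Return a boolean voxel mask above threshold and the volume fraction it occupies."""
    mask = volume > threshold
    return mask, mask.mean()   # mean of booleans = fraction of segmented voxels

# Toy data: a bright 5x5x5 "organelle" inside a dark 20x20x20 volume.
volume = np.zeros((20, 20, 20))
volume[5:10, 5:10, 5:10] = 1.0
mask, fraction = threshold_segment(volume, 0.5)
```

The occupied-volume fraction computed this way is exactly the "percentage of the entire volume that a specific region of interest occupies" used as a triage criterion above.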
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity.
To disseminate these methods for broader use we present Protein WISDOM (https://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
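The flavor of the sequence-selection stage, minimizing a potential energy over sequence space, can be sketched as a greedy single-site search. The energy function below is a toy per-position preference, not the actual Protein WISDOM potential, and the greedy search stands in for the optimization formulations used by the workbench.

```python
AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def greedy_sequence_selection(seq, energy):
    """Accept the single-site mutation that lowers energy most, until none does."""
    seq = list(seq)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq)):
            best = min(AMINO_ACIDS,
                       key=lambda aa: energy(seq[:i] + [aa] + seq[i + 1:]))
            if energy(seq[:i] + [best] + seq[i + 1:]) < energy(seq):
                seq[i] = best
                improved = True
    return "".join(seq)

# Toy energy: count of mismatches against a hypothetical optimal sequence "ACDE".
target = "ACDE"
energy = lambda s: sum(a != b for a, b in zip(s, target))
designed = greedy_sequence_selection("YYYY", energy)
```

A real design potential couples positions through pairwise interaction terms, which is why the actual stage requires global optimization rather than the independent-site greedy pass shown here.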
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple.
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
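The kind of time-stamped event record described above lends itself to simple latency analyses, e.g., the delay from hopper illumination to the next head entry. The sketch below uses hypothetical event names and a made-up record in milliseconds; the published system performs such analyses in its MATLAB-based language.

```python
def latencies(events, cue="hopper_light_on", response="head_entry"):
    """Return the latency from each cue to the next response, in the record's time units.

    `events` is a time-sorted list of (timestamp, event_name) pairs; responses with
    no preceding unanswered cue are ignored.
    """
    out, cue_time = [], None
    for t, name in events:
        if name == cue:
            cue_time = t
        elif name == response and cue_time is not None:
            out.append(t - cue_time)
            cue_time = None
    return out

# Hypothetical record, timestamps in milliseconds.
record = [(0, "hopper_light_on"), (1200, "head_entry"),
          (10000, "hopper_light_on"), (10400, "head_entry"),
          (20000, "head_entry")]   # spontaneous entry, no preceding cue
```

Keeping the analysis as a pure function of the raw event list is what preserves the full data trail from raw records to published statistics.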
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Flexible Colonoscopy in Mice to Evaluate the Severity of Colitis and Colorectal Tumors Using a Validated Endoscopic Scoring System
Institutions: Case Western Reserve University School of Medicine, Cleveland, Case Western Reserve University School of Medicine, Cleveland, Case Western Reserve University School of Medicine, Cleveland.
The use of modern endoscopy for research purposes has greatly facilitated our understanding of gastrointestinal pathologies. In particular, experimental endoscopy has been highly useful for studies that require repeated assessments in a single laboratory animal, such as those evaluating mechanisms of chronic inflammatory bowel disease and the progression of colorectal cancer. However, the methods used across studies are highly variable. At least three endoscopic scoring systems have been published for murine colitis and published protocols for the assessment of colorectal tumors fail to address the presence of concomitant colonic inflammation. This study develops and validates a reproducible endoscopic scoring system that integrates evaluation of both inflammation and tumors simultaneously. This novel scoring system has three major components: 1) assessment of the extent and severity of colorectal inflammation (based on perianal findings, transparency of the wall, mucosal bleeding, and focal lesions), 2) quantitative recording of tumor lesions (grid map and bar graph), and 3) numerical sorting of clinical cases by their pathological and research relevance based on decimal units with assigned categories of observed lesions and endoscopic complications (decimal identifiers). The video and manuscript presented herein were prepared, following IACUC-approved protocols, to allow investigators to score their own experimental mice using a well-validated and highly reproducible endoscopic methodology, with the system option to differentiate distal from proximal endoscopic colitis (D-PECS).
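A scoring record integrating the three components above might be assembled as follows. The sub-score range (0-3), item names, and decimal-identifier convention in this sketch are illustrative assumptions, not the validated values from the published system.

```python
def composite_score(inflammation_items, tumor_count, decimal_identifier):
    """Combine inflammation sub-scores, a tumor count, and a decimal case tag.

    Each inflammation item (e.g., perianal findings, wall transparency) is
    assumed to carry a 0-3 sub-score; the decimal identifier sorts the case
    into a pathological/research-relevance category.
    """
    for item, value in inflammation_items.items():
        if not 0 <= value <= 3:
            raise ValueError(f"{item} sub-score out of range: {value}")
    total = sum(inflammation_items.values())
    return {"inflammation": total,
            "tumors": tumor_count,
            "score_id": total + decimal_identifier}

score = composite_score(
    {"perianal": 1, "wall_transparency": 2, "mucosal_bleeding": 0, "focal_lesions": 1},
    tumor_count=3,
    decimal_identifier=0.5)   # hypothetical category tag
```

The point of the decimal identifier is that a single sortable number still lets the inflammation total and the case category be read off separately.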
Medicine, Issue 80, Crohn's disease, ulcerative colitis, colon cancer, Clostridium difficile, SAMP mice, DSS/AOM-colitis, decimal scoring identifier
Viability Assays for Cells in Culture
Institutions: Duquesne University.
Manual cell counts on a microscope are a sensitive means of assessing cellular viability but are time-consuming and therefore expensive. Computerized viability assays are expensive in terms of equipment but can be faster and more objective than manual cell counts. The present report describes the use of three such viability assays. Two of these assays are infrared and one is luminescent. Both infrared assays rely on a 16 bit Odyssey Imager. One infrared assay uses the DRAQ5 stain for nuclei combined with the Sapphire stain for cytosol and is visualized in the 700 nm channel. The other infrared assay, an In-Cell Western, uses antibodies against cytoskeletal proteins (α-tubulin or microtubule associated protein 2) and labels them in the 800 nm channel. The third viability assay is a commonly used luminescent assay for ATP, but we use a quarter of the recommended volume to save on cost. These measurements are all linear and correlate with the number of cells plated, but vary in sensitivity. All three assays circumvent time-consuming microscopy and sample the entire well, thereby reducing sampling error. Finally, all of the assays can easily be completed within one day of the end of the experiment, allowing greater numbers of experiments to be performed within short timeframes. However, they all rely on the assumption that cell numbers remain in proportion to signal strength after treatments, an assumption that is sometimes not met, especially for cellular ATP. Furthermore, if cells increase or decrease in size after treatment, this might affect signal strength without affecting cell number. We conclude that all viability assays, including manual counts, suffer from a number of caveats, but that computerized viability assays are well worth the initial investment. Using all three assays together yields a comprehensive view of cellular structure and function.
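The linearity claim above, signal proportional to the number of cells plated, is typically verified with a standard-curve fit. A minimal sketch follows; the gain, background, and readings are synthetic stand-ins on a perfect line, whereas real assay data would show noise and possible saturation at high densities.

```python
import numpy as np

# Hypothetical standard curve: cells plated per well vs. assay signal.
cells_plated = np.array([0, 5000, 10000, 20000, 40000], dtype=float)
signal = 0.002 * cells_plated + 1.5   # assumed gain (per cell) and background

# Least-squares line and correlation coefficient as a linearity check.
slope, intercept = np.polyfit(cells_plated, signal, 1)
r = np.corrcoef(cells_plated, signal)[0, 1]
```

A slope that matches across treatment groups is what justifies the assumption, flagged above as sometimes violated, that signal strength stays proportional to cell number after treatment.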
Cellular Biology, Issue 83, In-cell Western, DRAQ5, Sapphire, Cell Titer Glo, ATP, primary cortical neurons, toxicity, protection, N-acetyl cysteine, hormesis
Using Learning Outcome Measures to assess Doctoral Nursing Education
Institutions: Harris College of Nursing and Health Sciences, Texas Christian University.
Education programs at all levels must be able to demonstrate successful program outcomes. Grades alone do not represent a comprehensive measurement methodology for assessing student learning outcomes at either the course or program level. The development and application of assessment rubrics provide an unequivocal measurement methodology to ensure a quality learning experience by providing a foundation for improvement based on qualitatively and quantitatively measurable, aggregate course and program outcomes. Learning outcomes are the embodiment of the total learning experience and should incorporate assessment of both qualitative and quantitative program outcomes. The assessment of qualitative measures represents a challenge for educators at any level of a learning program. Nursing provides a unique challenge and opportunity as it is the application of science through the art of caring. Quantification of desired student learning outcomes may be enhanced through the development of assessment rubrics designed to measure quantitative and qualitative aspects of the nursing education and learning process. They provide a mechanism for uniform assessment by nursing faculty of concepts and constructs that are otherwise difficult to describe and measure. A protocol is presented and applied to a doctoral nursing education program with recommendations for application and transformation of the assessment rubric to other education programs. Through application of these specially designed rubrics, all aspects of an education program can be adequately assessed to provide information for program assessment that facilitates the closure of the gap between desired and actual student learning outcomes for any desired educational competency.
Medicine, Issue 40, learning, outcomes, measurement, program, assessment, rubric
Using Visual and Narrative Methods to Achieve Fair Process in Clinical Care
Institutions: Brandeis University, Brandeis University.
The Institute of Medicine has targeted patient-centeredness as an important area of quality improvement. A major dimension of patient-centeredness is respect for patient's values, preferences, and expressed needs. Yet specific approaches to gaining this understanding and translating it to quality care in the clinical setting are lacking. From a patient perspective quality is not a simple concept but is best understood in terms of five dimensions: technical outcomes; decision-making efficiency; amenities and convenience; information and emotional support; and overall patient satisfaction. Failure to consider quality from this five-pronged perspective results in a focus on medical outcomes, without considering the processes central to quality from the patient's perspective and vital to achieving good outcomes. In this paper, we argue for applying the concept of fair process in clinical settings. Fair process involves using a collaborative approach to exploring diagnostic issues and treatments with patients, explaining the rationale for decisions, setting expectations about roles and responsibilities, and implementing a core plan and ongoing evaluation. Fair process opens the door to bringing patient expertise into the clinical setting and the work of developing health care goals and strategies. This paper provides a step by step illustration of an innovative visual approach, called photovoice or photo-elicitation, to achieve fair process in clinical work with acquired brain injury survivors and others living with chronic health conditions. Applying this visual tool and methodology in the clinical setting will enhance patient-provider communication; engage patients as partners in identifying challenges, strengths, goals, and strategies; and support evaluation of progress over time. 
Asking patients to bring visuals of their lives into the clinical interaction can help to illuminate gaps in clinical knowledge, forge better therapeutic relationships with patients living with chronic conditions such as brain injury, and identify patient-centered goals and possibilities for healing. The process illustrated here can be used by clinicians, (primary care physicians, rehabilitation therapists, neurologists, neuropsychologists, psychologists, and others) working with people living with chronic conditions such as acquired brain injury, mental illness, physical disabilities, HIV/AIDS, substance abuse, or post-traumatic stress, and by leaders of support groups for the types of patients described above and their family members or caregivers.
Medicine, Issue 48, person-centered care, participatory visual methods, photovoice, photo-elicitation, narrative medicine, acquired brain injury, disability, rehabilitation, palliative care