The split hand phenomenon refers to predominant wasting of the thenar muscles and is an early and specific feature of amyotrophic lateral sclerosis (ALS). A novel split hand index (SI) was developed to quantify the split hand phenomenon, and its diagnostic utility was assessed in ALS patients. The split hand index was derived by dividing the product of the compound muscle action potential (CMAP) amplitudes recorded over the abductor pollicis brevis and first dorsal interosseous muscles by the CMAP amplitude recorded over the abductor digiti minimi muscle. In order to assess the diagnostic utility of the split hand index, ALS patients were prospectively assessed and their results were compared to those of patients with other neuromuscular disorders. The split hand index was significantly reduced in ALS patients when compared to patients with other neuromuscular disorders (P<0.0001). Limb-onset ALS patients exhibited the greatest reduction in the split hand index, and a value of 5.2 or less reliably differentiated ALS from other neuromuscular disorders. Consequently, the split hand index appears to be a novel diagnostic biomarker for ALS, perhaps facilitating an earlier diagnosis.
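The index described above reduces to a single formula. A minimal sketch in Python, with hypothetical CMAP amplitudes (in mV); the 5.2 cutoff is the one reported in the abstract:

```python
def split_hand_index(cmap_apb, cmap_fdi, cmap_adm):
    """SI = (APB CMAP amplitude x FDI CMAP amplitude) / ADM CMAP amplitude."""
    return (cmap_apb * cmap_fdi) / cmap_adm

def suggests_als(si, cutoff=5.2):
    """Per the abstract, an SI of 5.2 or less differentiated ALS
    from other neuromuscular disorders."""
    return si <= cutoff

# Hypothetical CMAP amplitudes (mV); values are illustrative only.
si = split_hand_index(cmap_apb=2.0, cmap_fdi=3.0, cmap_adm=6.0)
```

A reduced SI reflects the disproportionate loss of APB and FDI amplitude relative to the spared ADM.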
Detection and Genogrouping of Noroviruses from Children's Stools by TaqMan One-step RT-PCR
Institutions: Universidad Peruana Cayetano Heredia, Johns Hopkins University, University of Concepción, Chile, University of California San Diego School of Medicine.
Noroviruses (NoVs) are the leading cause of outbreaks and sporadic cases of acute gastroenteritis worldwide in humans of all ages. They are an important cause of hospitalizations in children, with a public health impact similar to that of rotavirus. NoVs are RNA viruses of great genetic diversity, and new strains appear continuously. Five genogroups are recognized, with GI and GII, with their many genotypes and subtypes, being the most important for human infection. However, the diagnosis of these two genogroups remains problematic, delaying diagnosis and treatment1,2,3.
For RNA extraction from stool specimens, the most commonly used method is the commercial QIAamp Viral RNA kit from Qiagen. This method combines the binding properties of a silica gel membrane with buffers that control RNases and provide optimal binding of the RNA to the column, together with the speed of microspin technology. The method is simple, fast, and reliable, and is carried out in the few steps detailed in the manufacturer's instructions.
Norovirus is second only to rotavirus as the most common cause of diarrhea. Norovirus diagnosis should be available in all studies on the pathogenesis of diarrhea, as well as in outbreaks or individual diarrhea cases. At present, however, norovirus diagnosis is restricted to only a few centers due to the lack of simple diagnostic methods, which delays diagnosis and treatment1,2,3. In addition, the costs and the regulated transportation of corrosive buffers within and between countries make the use of these manufactured kits logistically problematic. In this protocol we therefore describe an alternative, economical, in-house method based on the original procedure of Boom et al., which uses the nucleic acid binding properties of silica particles together with the anti-nuclease properties of guanidinium thiocyanate.
For the detection and genogrouping (GI and GII) of NoV isolates from stool specimens, several RT-PCR protocols utilizing different targets have been developed. The consensus is that an RT-PCR using TaqMan chemistry is the best molecular technique for diagnosis, because it combines high sensitivity, specificity, and reproducibility with high throughput and ease of use. Here we describe an assay targeting the open reading frame 1 (ORF1)-ORF2 junction region, the most conserved region of the NoV genome and hence the most suitable for diagnosis. For further genetic analysis, a conventional RT-PCR targeting the highly variable N-terminal/shell region of the major capsid protein (Region C), using primers originally described by Kojima et al.5, is also detailed. Sequencing of the product of the conventional PCR enables differentiation of genotypes belonging to the GI and GII genogroups.
Virology, Issue 65, Medicine, Genetics, norovirus, gastroenteritis, RNA extraction, diarrhea, stool samples, PCR, RT-PCR, TaqMan, silica
Quantitative Visualization and Detection of Skin Cancer Using Dynamic Thermal Imaging
Institutions: The Johns Hopkins University.
In 2010, approximately 68,720 melanomas will be diagnosed in the US alone, with around 8,650 resulting in death1. To date, the only effective treatment for melanoma remains surgical excision; therefore, the key to extended survival is early detection2,3. Considering the large numbers of patients diagnosed every year and the limitations in accessing specialized care quickly, the development of objective in vivo diagnostic instruments to aid diagnosis is essential. New techniques to detect skin cancer, especially non-invasive diagnostic tools, are being explored in numerous laboratories. Alongside surgical methods, techniques such as digital photography, dermoscopy, multispectral imaging systems (MelaFind), laser-based systems (confocal scanning laser microscopy, laser Doppler perfusion imaging, optical coherence tomography), ultrasound, and magnetic resonance imaging are being tested. Each technique offers unique advantages and disadvantages, many of which pose a compromise between effectiveness and accuracy versus ease of use and cost. Details about these techniques and comparisons are available in the literature4.
Infrared (IR) imaging has been shown to be a useful method for diagnosing the signs of certain diseases by measuring local skin temperature. A large body of evidence shows that disease, or deviation from normal functioning, is accompanied by changes in body temperature, which in turn affect skin temperature5,6. Accurate data about the temperature of the human body and skin can provide a wealth of information on the processes responsible for heat generation and thermoregulation, in particular deviations from normal conditions, often caused by disease. However, IR imaging was not widely accepted in medicine due to the premature use of the technology7,8 several decades ago, when temperature measurement accuracy and spatial resolution were inadequate and sophisticated image processing tools were unavailable. This situation changed dramatically in the late 1990s-2000s. Advances in IR instrumentation, the implementation of digital image processing algorithms, and dynamic IR imaging, which enables scientists to analyze not only the spatial but also the temporal thermal behavior of the skin9, allowed breakthroughs in the field.
In our research, we explore the feasibility of IR imaging, combined with theoretical and experimental studies, as a cost-effective, non-invasive, in vivo optical measurement technique for tumor detection, with emphasis on the screening and early detection of melanoma10-13. In this study, we present data from a patient study in which patients with a pigmented lesion and a clinical indication for biopsy were selected for imaging. We compared the difference in thermal responses between healthy and malignant tissue and checked our data against biopsy results. We conclude that the increased metabolic activity of a melanoma lesion can be detected by dynamic infrared imaging.
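As an illustration of the kind of analysis dynamic thermal imaging implies (a sketch, not the authors' actual pipeline), one could compare the rewarming rate of a lesion region of interest against healthy skin after a cooling stress; a lesion with higher metabolic activity should rewarm faster. A minimal numpy sketch with synthetic frames:

```python
import numpy as np

def recovery_rate(frames, roi, t):
    """Fit a line to the mean ROI temperature over time; the slope is the
    rewarming rate (deg C per second) after a cooling stress."""
    temps = np.array([frame[roi].mean() for frame in frames])
    slope, _ = np.polyfit(t, temps, 1)
    return slope

# Synthetic frame sequences: rewarming rates here are made up for illustration.
t = np.linspace(0.0, 60.0, 61)
healthy_frames = [np.full((8, 8), 30.0 + 0.01 * ti) for ti in t]
lesion_frames = [np.full((8, 8), 30.0 + 0.03 * ti) for ti in t]
roi = (slice(2, 6), slice(2, 6))
```

Comparing the two slopes over matched ROIs is one simple way to quantify the transient thermal response the abstract refers to.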
Medicine, Issue 51, Infrared imaging, quantitative thermal analysis, image processing, skin cancer, melanoma, transient thermal response, skin thermal models, skin phantom experiment, patient study
Performing and Processing FNA of Anterior Fat Pad for Amyloid
Institutions: Medical College of Wisconsin, Wayne State University School of Medicine Detroit Medical Center, Medical College of Wisconsin, Medical College of Wisconsin, Medical College of Wisconsin.
Historically, heart, liver, and kidney biopsies were performed to demonstrate amyloid deposits in amyloidosis. Since the clinical presentation of this disease is so variable and non-specific, the risks associated with these biopsies are too great for the diagnostic yield. Other sites with lower biopsy risk, such as skin or gingiva, are also relatively invasive and expensive to sample. In addition, these biopsies may not always contain sufficient amyloid deposits to establish a diagnosis. Fat pad aspiration has demonstrated good clinical correlation with low cost and minimal morbidity. However, there are no standardized protocols for performing this procedure or processing the aspirated specimen, which leads to variable and nonreproducible results. The most frequently utilized modality for detecting amyloid in tissue is apple-green birefringence on Congo red-stained sections viewed with a polarizing microscope. This technique requires cell block preparation of the aspirated material. Unfortunately, patients presenting in the early stage of amyloidosis have minimal amounts of amyloid, which greatly reduces the sensitivity of Congo red-stained cell block sections of fat pad aspirates. Therefore, ultrastructural evaluation of fat pad aspirates by electron microscopy should be utilized, given its increased sensitivity for amyloid detection. This article demonstrates a simple and reproducible procedure for performing anterior fat pad aspiration for the detection of amyloid, utilizing both Congo red staining of cell block sections and electron microscopy for ultrastructural identification.
Medicine, Issue 44, AL amyloidosis, Congo Red, abdominal fat pad biopsy, electron microscopy, ultrastructural evaluation
Flying Insect Detection and Classification with Inexpensive Sensors
Institutions: University of California, Riverside, University of California, Riverside, University of São Paulo - USP, ISCA Technologies.
An inexpensive, noninvasive system that could accurately classify flying insects would have important implications for entomological research, and would allow the development of many useful applications in vector and pest control for both medical and agricultural entomology. Given this, the last sixty years have seen many research efforts devoted to this task. To date, however, none of this research has had a lasting impact. In this work, we show that pseudo-acoustic optical sensors can produce superior data; that additional features, both intrinsic and extrinsic to the insect's flight behavior, can be exploited to improve insect classification; that a Bayesian classification approach allows us to efficiently learn classification models that are very robust to over-fitting; and that a general classification framework allows an arbitrary number of features to be incorporated easily. We demonstrate these findings with large-scale experiments that dwarf all previous works combined, as measured by the number of insects and the number of species considered.
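The Bayesian approach described above can be sketched as a naive Bayes classifier over one intrinsic feature (wingbeat frequency, from the pseudo-acoustic sensor) and one extrinsic feature (time of capture, reflecting circadian rhythm). All distribution parameters below are illustrative assumptions, not measured values from the study:

```python
import numpy as np

def gaussian_logpdf(x, mu, sigma):
    """Log-density of a univariate Gaussian."""
    return -0.5 * np.log(2 * np.pi * sigma**2) - (x - mu)**2 / (2 * sigma**2)

def classify(wingbeat_hz, hour, species):
    """Naive Bayes: pick the species maximizing prior x likelihood over
    wingbeat frequency and (circadian) hour of capture."""
    best, best_lp = None, -np.inf
    for name, p in species.items():
        lp = (np.log(p["prior"])
              + gaussian_logpdf(wingbeat_hz, *p["wingbeat"])
              + gaussian_logpdf(hour, *p["hour"]))
        if lp > best_lp:
            best, best_lp = name, lp
    return best

# Illustrative (not measured) parameters for two species: (mean, std dev).
species = {
    "Aedes aegypti": {"prior": 0.5, "wingbeat": (465, 40), "hour": (10, 3)},
    "Culex quinquefasciatus": {"prior": 0.5, "wingbeat": (380, 40), "hour": (22, 3)},
}
```

Adding further features under this framework only adds terms to the log-likelihood sum, which is what makes incorporating an arbitrary number of features straightforward.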
Bioengineering, Issue 92, flying insect detection, automatic insect classification, pseudo-acoustic optical sensors, Bayesian classification framework, flight sound, circadian rhythm
Cortical Source Analysis of High-Density EEG Recordings in Children
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, such as structural MRI, that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, because EEG recordings are easy to apply and inexpensive, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues change dramatically over development3.

In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans, or age-specific head models, to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
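The source analysis step here relies on minimum-norm estimation (see keywords). In its simplest L2 form, given a lead field L from the head model (channels x sources) and channel data y, the source estimate is x = Lᵀ(LLᵀ + λI)⁻¹y. A toy numpy sketch; the dimensions and data are hypothetical, and a real pipeline would use a lead field computed from the (individual or age-specific) head model:

```python
import numpy as np

def minimum_norm_estimate(L, y, lam=1e-2):
    """L2 minimum-norm inverse: x = L^T (L L^T + lam*I)^-1 y, where L is the
    lead field from the head model and lam regularizes against noise."""
    n_chan = L.shape[0]
    gram = L @ L.T + lam * np.eye(n_chan)
    return L.T @ np.linalg.solve(gram, y)

# Toy example: 4 channels, 6 sources, random (hypothetical) lead field.
rng = np.random.default_rng(0)
L = rng.standard_normal((4, 6))
x_true = np.zeros(6)
x_true[2] = 1.0
y = L @ x_true          # simulated channel data from one active source
x_hat = minimum_norm_estimate(L, y, lam=1e-6)
```

Because the problem is underdetermined (fewer channels than sources), the estimate reproduces the measured data but distributes current across sources with minimum overall norm.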
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials
Measuring Attentional Biases for Threat in Children and Adults
Institutions: Rutgers University.
Investigators have long been interested in the human propensity for the rapid detection of threatening stimuli. However, until recently, research in this domain has focused almost exclusively on adult participants, completely ignoring the topic of threat detection over the course of development. One of the biggest reasons for the lack of developmental work in this area is likely the absence of a reliable paradigm that can measure perceptual biases for threat in children. To address this issue, we recently designed a modified visual search paradigm, similar to the standard adult paradigm, that is appropriate for studying threat detection in preschool-aged participants. Here we describe this new procedure. In the general paradigm, we present participants with matrices of color photographs and ask them to find and touch a target on the screen. Latency to touch the target is recorded. Using a touch-screen monitor makes the procedure simple and easy, allowing us to collect data from participants ranging from 3 years of age to adults. Thus far, the paradigm has consistently shown that both adults and children detect threatening stimuli (e.g., snakes, spiders, angry/fearful faces) more quickly than neutral stimuli (e.g., flowers, mushrooms, happy/neutral faces). Altogether, this procedure provides an important new tool for researchers interested in studying the development of attentional biases for threat.
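The bias measure implied by the paradigm (how much faster threatening targets are touched than neutral ones) can be sketched as below; the trial records and latencies are hypothetical:

```python
def mean_latency(trials, category):
    """Mean latency (ms) to touch the target over trials of one category."""
    vals = [t["latency_ms"] for t in trials if t["target"] == category]
    return sum(vals) / len(vals)

def threat_bias(trials):
    """Positive values: threatening targets were found faster than neutral."""
    return mean_latency(trials, "neutral") - mean_latency(trials, "threat")

# Hypothetical single-participant trial records.
trials = [
    {"target": "threat", "latency_ms": 1200},
    {"target": "threat", "latency_ms": 1400},
    {"target": "neutral", "latency_ms": 1700},
    {"target": "neutral", "latency_ms": 1500},
]
```

Computing this difference per participant allows the bias to be compared across age groups.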
Behavior, Issue 92, Detection, threat, attention, attentional bias, anxiety, visual search
Adjustable Stiffness, External Fixator for the Rat Femur Osteotomy and Segmental Bone Defect Models
Institutions: Queensland University of Technology, RISystem AG.
The mechanical environment around a healing bone fracture is very important, as it determines the way the fracture will heal. Over the past decade there has been great clinical interest in improving bone healing by altering the mechanical environment through the fixation stability around the lesion. One constraint of preclinical animal research in this area is the lack of experimental control over the local mechanical environment within a large segmental defect, as well as in osteotomies, as they heal. In this paper we report on the design and use of an external fixator to study the healing of large segmental bone defects or osteotomies. This device not only allows for controlled axial stiffness on the bone lesion as it heals, but also enables the stiffness to be changed during the healing process in vivo.
The experiments conducted showed that the fixators were able to maintain a 5 mm femoral defect gap in rats in vivo during unrestricted cage activity for at least 8 weeks. Likewise, we observed no distortion or infection, including pin infection, during the entire healing period. These results demonstrate that our newly developed external fixator achieves reproducible and standardized stabilization, and permits alteration of the mechanical environment of large bone defects and various-sized osteotomies in vivo in the rat. This confirms that the external fixation device is well suited for preclinical research in the field of bone regeneration and repair using a rat model.
Medicine, Issue 92, external fixator, bone healing, small animal model, large bone defect and osteotomy model, rat model, mechanical environment, mechanobiology.
Creating Rigidly Stabilized Fractures for Assessing Intramembranous Ossification, Distraction Osteogenesis, or Healing of Critical Sized Defects
Institutions: University of California, San Francisco .
Assessing modes of skeletal repair is essential for developing therapies to treat fractures clinically. Mechanical stability plays a large role in the healing of bone injuries. In the worst case, mechanical instability can lead to delayed union or non-union in humans. However, motion can also stimulate the healing process. In fractures that experience motion, cartilage forms to stabilize the fractured bone ends, and this cartilage is gradually replaced by bone through recapitulation of the developmental process of endochondral ossification. In contrast, if a bone fracture is rigidly stabilized, bone forms directly via intramembranous ossification. Clinically, endochondral and intramembranous ossification occur simultaneously. To replicate this process experimentally, investigators insert a pin into the medullary canal of the fractured bone, as described by Bonnarens4. This experimental method provides excellent lateral stability while allowing rotational instability to persist. However, our understanding of the mechanisms that regulate these two distinct processes can also be enhanced by experimentally isolating each of them. We have developed a stabilization protocol that provides both rotational and lateral stabilization. In this model, intramembranous ossification is the only mode of healing observed, and healing parameters can be compared among different strains of genetically modified mice5-7, after application of bioactive molecules8,9, after altering physiological parameters of healing10, after modifying the amount or duration of stabilization11, after distraction osteogenesis12, after creation of a non-union13, or after creation of a critical sized defect. Here, we illustrate how to apply the modified Ilizarov fixators for studying tibial fracture healing and distraction osteogenesis in mice.
Medicine, Issue 62, Bone fracture, intramembranous ossification, distraction osteogenesis, bone healing
Cell-based Assay Protocol for the Prognostic Prediction of Idiopathic Scoliosis Using Cellular Dielectric Spectroscopy
Institutions: Sainte-Justine University Hospital Research Center, Université de Montréal.
This protocol details the experimental and analytical procedure for a cell-based assay developed in our laboratory as a functional test to predict the prognosis of idiopathic scoliosis in asymptomatic and affected children. The assay consists of evaluating the functional status of Gi and Gs proteins in peripheral blood mononuclear cells (PBMCs) by cellular dielectric spectroscopy (CDS), using an automated CDS-based instrument, and classifying children into three functional groups (FG1, FG2, FG3) with respect to the profile of imbalance between the degrees of response to Gi and Gs protein stimulation. The classification is further confirmed by the differential effect of osteopontin (OPN) on the response to Gi stimulation among groups, with severe disease progression referenced by FG2. A volume of approximately 10 ml of blood is required to extract PBMCs by Ficoll gradient, and the cells are then stored in liquid nitrogen. An adequate number of PBMCs to perform the assay is obtained after two days of cell culture. Essentially, cells are first incubated with phytohemagglutinin (PHA). After 24 hr of incubation, the medium is replaced by a PHA-free culture medium for an additional 24 hr prior to cell seeding and OPN treatment. Cells are then spectroscopically screened for their responses to somatostatin and isoproterenol, which activate Gi and Gs proteins, respectively, through their cognate receptors. Both somatostatin and isoproterenol are simultaneously injected with an integrated fluidics system, and the cells' responses are monitored for 15 min. The assay can be performed with fresh or frozen PBMCs, and the procedure is completed within 4 days.
Medicine, Issue 80, Blood Cells, Lymphocytes, Spinal Diseases, Diagnostic Techniques and Procedures, Clinical Laboratory Techniques, Dielectric Spectroscopy, Musculoskeletal Diseases, Idiopathic scoliosis, classification, prognosis, G proteins, cellular dielectric spectroscopy, PBMCs
Single Particle Electron Microscopy Reconstruction of the Exosome Complex Using the Random Conical Tilt Method
Institutions: Yale University.
Single particle electron microscopy (EM) reconstruction has recently become a popular tool to obtain the three-dimensional (3D) structure of large macromolecular complexes. It has some unique advantages compared to X-ray crystallography. First, single particle EM reconstruction does not require crystallization of the protein sample, which is the bottleneck in X-ray crystallography, especially for large macromolecular complexes. Second, it does not need large amounts of protein: compared with the milligrams of protein necessary for crystallization, single particle EM reconstruction needs only several microliters of protein solution at nanomolar concentrations, using the negative staining EM method. However, apart from a few macromolecular assemblies with high symmetry, single particle EM is limited to relatively low resolution (worse than 1 nm) for many specimens, especially those without symmetry. The technique is also limited by the size of the molecules under study, in general about 100 kDa at minimum for negatively stained specimens and 300 kDa for frozen-hydrated specimens.
For a new sample of unknown structure, we generally use a heavy metal solution to embed the molecules by negative staining. The specimen is then examined in a transmission electron microscope to take two-dimensional (2D) micrographs of the molecules. Ideally, the protein molecules have a homogeneous 3D structure but exhibit different orientations in the micrographs. These micrographs are digitized and processed in computers as "single particles". Using two-dimensional alignment and classification techniques, homogenous molecules in the same views are clustered into classes. Their averages enhance the signal of the molecule's 2D shapes. After we assign the particles with the proper relative orientation (Euler angles), we will be able to reconstruct the 2D particle images into a 3D virtual volume.
In single particle 3D reconstruction, an essential step is to correctly assign the proper orientation of each single particle. There are several methods to assign the view for each particle, including angular reconstitution1 and the random conical tilt (RCT) method2. In this protocol, we describe our procedure for obtaining the 3D reconstruction of the yeast exosome complex using negative staining EM and RCT. It should be noted that our electron microscopy and image processing protocol follows the basic principle of RCT but is not the only way to perform the method. We first describe how to embed the protein sample into a layer of uranyl formate with a thickness comparable to the protein size, using a holey carbon grid covered with a layer of continuous thin carbon film. The specimen is then inserted into a transmission electron microscope to collect untilted (0°) and tilted (55°) pairs of micrographs, which are used later for processing and obtaining an initial 3D model of the yeast exosome. To this end, we perform RCT and then refine the initial 3D model using the projection matching refinement method3.
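The geometric core of RCT angle assignment is that each tilted projection's direction is fixed by the known microscope tilt, while its azimuth comes from the in-plane rotation found when aligning the matching untilted particle. A schematic sketch; the Euler angle convention and names here are assumptions for illustration, not tied to any particular software package:

```python
def rct_euler_angles(inplane_rotations_deg, tilt_deg=55.0):
    """For each particle pair, return (azimuth, tilt, psi) for the tilted
    image: the azimuth is the in-plane rotation of the matching untilted
    particle from 2D alignment, and the tilt is the known stage tilt."""
    return [(phi % 360.0, tilt_deg, 0.0) for phi in inplane_rotations_deg]
```

Because all particles in one 2D class share the same view, their tilted partners cover a cone of directions, which is what makes the back-projected initial 3D model possible.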
Structural Biology, Issue 49, Electron microscopy, single particle three-dimensional reconstruction, exosome complex, negative staining
Probe-based Confocal Laser Endomicroscopy of the Urinary Tract: The Technique
Institutions: Stanford University School of Medicine , Veterans Affairs Palo Alto Health Care System.
Probe-based confocal laser endomicroscopy (CLE) is an emerging optical imaging technology that enables real-time in vivo microscopy of mucosal surfaces during standard endoscopy. With applications currently in the respiratory1 and gastrointestinal tracts2-6, CLE has also been explored in the urinary tract for bladder cancer diagnosis7-10. Cellular morphology and tissue microarchitecture can be resolved with micron-scale resolution in real time, in addition to dynamic imaging of the normal and pathological vasculature7.

The probe-based CLE system (Cellvizio, Mauna Kea Technologies, France) consists of a reusable fiberoptic imaging probe coupled to a 488 nm laser scanning unit. The imaging probe is inserted through the working channel of standard flexible and rigid endoscopes. An endoscope-based CLE system (Optiscan, Australia), in which the confocal endomicroscopy functionality is integrated into the endoscope, is also used in the gastrointestinal tract. Given its larger scope diameter, however, its application in the urinary tract is currently limited to ex vivo use.

Confocal image acquisition is performed through direct contact of the imaging probe with the target tissue and recorded as video sequences. As in the gastrointestinal tract, endomicroscopy of the urinary tract requires an exogenous contrast agent, most commonly fluorescein, which can be administered intravenously or intravesically. Intravesical administration, unique to the urinary tract, is a well-established method of introducing pharmacological agents locally with minimal systemic toxicity. Fluorescein rapidly stains the extracellular matrix and has an established safety profile12.

Imaging probes of various diameters enable compatibility with endoscopes of different calibers. To date, 1.4 and 2.6 mm probes have been evaluated with flexible and rigid cystoscopy10. The recent availability of a <1 mm imaging probe13 opens up the possibility of CLE in the upper urinary tract during ureteroscopy. Fluorescence cystoscopy (i.e., photodynamic diagnosis) and narrow band imaging are additional endoscope-based optical imaging modalities14 that can be combined with CLE to achieve multimodal imaging of the urinary tract. In the future, CLE may be coupled with molecular contrast agents, such as fluorescently labeled peptides15 and antibodies, for endoscopic imaging of disease processes with molecular specificity.
Medicine, Issue 71, Anatomy, Physiology, Cancer Biology, Surgery, Basic Protocols, Confocal laser endomicroscopy, microscopy, endoscopy, cystoscopy, human bladder, bladder cancer, urology, minimally invasive, cellular imaging
One Dimensional Turing-Like Handshake Test for Motor Intelligence
Institutions: Ben-Gurion University.
In the Turing test, a computer model is deemed to "think intelligently" if it can generate answers that are not distinguishable from those of a human. However, this test is limited to the linguistic aspects of machine intelligence. A salient function of the brain is the control of movement, and the movement of the human hand is a sophisticated demonstration of this function. Therefore, we propose a Turing-like handshake test for machine motor intelligence. We administer the test through a telerobotic system in which the interrogator holds a robotic stylus and interacts with another party (human or artificial). Instead of asking the interrogator whether the other party is a person or a computer program, we employ a two-alternative forced-choice method and ask which of two systems is more human-like. We extract a quantitative grade for each model according to its resemblance to the human handshake motion and name it the "Model Human-Likeness Grade" (MHLG). We present three methods to estimate the MHLG: (i) by calculating the proportion of subjects' answers indicating that the model is more human-like than the human; (ii) by comparing two weighted sums of human and model handshakes, fitting a psychometric curve, and extracting the point of subjective equality (PSE); (iii) by comparing a given model with a weighted sum of human and random signals, fitting a psychometric curve to the answers of the interrogator, and extracting the PSE for the weight of the human in the weighted sum. Altogether, we provide a protocol to test computational models of the human handshake. We believe that building a model is a necessary step in understanding any phenomenon and, in this case, in understanding the neural mechanisms responsible for the generation of the human handshake.
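Methods (ii) and (iii) both reduce to fitting a psychometric curve and reading off the PSE. A minimal numpy sketch that fits a logistic curve in log-odds space; the answer proportions below are hypothetical, not data from the study:

```python
import numpy as np

def estimate_pse(weights, p_human_like):
    """Fit a line to the log-odds of 'more human-like' answers as a function
    of the human weight in the stimulus; the PSE is the weight at which the
    fitted log-odds cross zero (the 50% point)."""
    logits = np.log(p_human_like / (1 - p_human_like))
    slope, intercept = np.polyfit(weights, logits, 1)
    return -intercept / slope

# Hypothetical proportions of 'more human-like' answers at each human weight.
weights = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])
p = np.array([0.05, 0.15, 0.40, 0.65, 0.90, 0.97])
```

A maximum-likelihood logistic fit would be preferable with raw trial counts; the logit-linear fit above is the simplest self-contained version.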
Neuroscience, Issue 46, Turing test, Human Machine Interface, Haptics, Teleoperation, Motor Control, Motor Behavior, Diagnostics, Perception, handshake, telepresence
Development of a Virtual Reality Assessment of Everyday Living Skills
Institutions: NeuroCog Trials, Inc., Duke-NUS Graduate Medical Center, Duke University Medical Center, Fox Evaluation and Consulting, PLLC, University of Miami Miller School of Medicine.
Cognitive impairments affect the majority of patients with schizophrenia, and these impairments predict poor long-term psychosocial outcomes. Treatment studies aimed at cognitive impairment in patients with schizophrenia not only require demonstration of improvements on cognitive tests, but also evidence that any cognitive changes lead to clinically meaningful improvements. Measures of "functional capacity" index the extent to which individuals have the potential to perform skills required for real-world functioning. Current data do not support the recommendation of any single instrument for the measurement of functional capacity. The Virtual Reality Functional Capacity Assessment Tool (VRFCAT) is a novel, interactive gaming-based measure of functional capacity that uses a realistic simulated environment to recreate routine activities of daily living. Studies are currently underway to evaluate and establish the VRFCAT's sensitivity, reliability, validity, and practicality. This new measure of functional capacity is practical, relevant, and easy to use, and has several features that improve the validity and sensitivity of measurement of function in clinical trials of patients with CNS disorders.
Behavior, Issue 86, Virtual Reality, Cognitive Assessment, Functional Capacity, Computer Based Assessment, Schizophrenia, Neuropsychology, Aging, Dementia
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Institutions: University of Calgary , University of Calgary .
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion.
Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases, using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient-out procedure. According to the results of free-response receiver operating characteristic analysis, our methods demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
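The first analysis step, probing oriented tissue patterns with Gabor filters, can be sketched as below; the kernel parameters are illustrative, not the ones used in the study:

```python
import numpy as np

def gabor_kernel(theta, size=15, wavelength=8.0, sigma=4.0):
    """Real Gabor kernel at orientation theta (radians): a cosine grating
    under a Gaussian envelope, used to probe oriented tissue patterns."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)
    return g - g.mean()   # remove the DC component

def dominant_orientation(patch, n_angles=8):
    """Orientation whose Gabor response magnitude is largest over the patch."""
    thetas = np.linspace(0, np.pi, n_angles, endpoint=False)
    scores = [abs(np.sum(patch * gabor_kernel(t, size=patch.shape[0])))
              for t in thetas]
    return thetas[int(np.argmax(scores))]
```

Applying this per pixel neighborhood yields the orientation field that the phase portrait analysis then searches for node-like radiating or intersecting patterns.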
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3-6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to greatly simplify the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple.
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
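The kind of time-stamped event-record analysis described above can be sketched very simply. The record format and event codes below are invented for illustration (the actual system uses MATLAB-based software); the point is only that counts and inter-entry intervals fall directly out of a list of (time, event) pairs.

```python
from collections import Counter

# hypothetical time-stamped event record: (seconds, event code)
events = [
    (10.0, "hopper1_entry"), (12.5, "pellet_delivered"),
    (13.1, "hopper1_entry"), (40.2, "hopper2_entry"),
    (41.0, "pellet_delivered"), (300.4, "hopper1_entry"),
]

def daily_summary(events):
    """Tally event types and compute intervals between head entries."""
    counts = Counter(code for _, code in events)
    entry_times = [t for t, code in events if code.endswith("_entry")]
    intervals = [b - a for a, b in zip(entry_times, entry_times[1:])]
    return counts, intervals

counts, intervals = daily_summary(events)
```

Daily summaries of this kind are what the system's automated harvesting code aggregates into per-mouse progress plots.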
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Identification of Disease-related Spatial Covariance Patterns using Neuroimaging Data
Institutions: The Feinstein Institute for Medical Research.
The scaled subprofile model (SSM)1-4 is a multivariate PCA-based algorithm that identifies major sources of variation in patient and control group brain image data while rejecting lesser components (Figure 1). Applied directly to voxel-by-voxel covariance data of steady-state multimodality images, an entire group image set can be reduced to a few significant linearly independent covariance patterns and corresponding subject scores. Each pattern, termed a group invariant subprofile (GIS), is an orthogonal principal component that represents a spatially distributed network of functionally interrelated brain regions. Large global mean scalar effects that can obscure smaller network-specific contributions are removed by the inherent logarithmic conversion and mean centering of the data2,5,6. Subjects express each of these patterns to a variable degree, represented by a simple scalar score that can correlate with independent clinical or psychometric descriptors7,8. Using logistic regression analysis of subject scores (i.e. pattern expression values), linear coefficients can be derived to combine multiple principal components into single disease-related spatial covariance patterns, i.e. composite networks with improved discrimination of patients from healthy control subjects5,6. Cross-validation within the derivation set can be performed using bootstrap resampling techniques9. Forward validation is easily confirmed by direct score evaluation of the derived patterns in prospective datasets10. Once validated, disease-related patterns can be used to score individual patients with respect to a fixed reference sample, often the set of healthy subjects that was used (with the disease group) in the original pattern derivation11. These standardized values can in turn be used to assist in differential diagnosis12,13 and to assess disease progression and treatment effects at the network level7,14-16. We present an example of the application of this methodology to FDG PET data of Parkinson's Disease patients and normal controls using our in-house software to derive a characteristic covariance pattern biomarker of disease.
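The core SSM computation (log transform, removal of subject and voxel means, then PCA of the residual profiles) can be sketched in a few lines. This is an illustrative reimplementation, not the authors' in-house software, and it omits the logistic-regression combination and bootstrap steps.

```python
import numpy as np

def ssm_patterns(data, n_patterns=2):
    """Scaled subprofile model sketch: log-transform, double-center, PCA.

    data: subjects x voxels array of strictly positive image values.
    Returns (patterns, scores): each row of `patterns` is a covariance
    topography (GIS); scores[i, k] is subject i's expression of pattern k.
    """
    logd = np.log(data)
    # subtract each subject's mean (removes global scalar effects) ...
    srp = logd - logd.mean(axis=1, keepdims=True)
    # ... and each voxel's group mean, leaving subject-residual profiles
    srp = srp - srp.mean(axis=0, keepdims=True)
    # PCA via SVD of the residual-profile matrix
    U, s, Vt = np.linalg.svd(srp, full_matrices=False)
    patterns = Vt[:n_patterns]
    scores = U[:, :n_patterns] * s[:n_patterns]
    return patterns, scores
```

On synthetic data built as exp(global offset + expression x topography), the first component's subject scores recover the planted per-subject expression values, mirroring how GIS scores index pattern expression in real image sets.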
Medicine, Issue 76, Neurobiology, Neuroscience, Anatomy, Physiology, Molecular Biology, Basal Ganglia Diseases, Parkinsonian Disorders, Parkinson Disease, Movement Disorders, Neurodegenerative Diseases, PCA, SSM, PET, imaging biomarkers, functional brain imaging, multivariate spatial covariance analysis, global normalization, differential diagnosis, PD, brain, imaging, clinical techniques
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
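As a minimal illustration of the DoE idea, a two-level full factorial screen enumerates every combination of candidate factors. The factor names and levels below are hypothetical stand-ins; the study's actual design was software-guided and augmented step-wise, which a plain full factorial does not capture.

```python
from itertools import product

def full_factorial(factors):
    """List every combination of factor levels as run dictionaries."""
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in product(*(factors[name] for name in names))]

# hypothetical two-level factors (not the study's actual design)
factors = {
    "promoter": ["35S", "nos"],
    "incubation_temp_C": [22, 25],
    "leaf_age": ["young", "old"],
}
runs = full_factorial(factors)   # 2**3 = 8 experimental runs
```

Splitting a large factor set into smaller modules, as the article describes, keeps the run count of each such screen manageable before design augmentation adds the informative follow-up combinations.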
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Training Synesthetic Letter-color Associations by Reading in Color
Institutions: University of Amsterdam.
Synesthesia is a rare condition in which a stimulus from one modality automatically and consistently triggers unusual sensations in the same and/or other modalities. A relatively common and well-studied type is grapheme-color synesthesia, defined as the consistent experience of color when viewing, hearing and thinking about letters, words and numbers. We describe our method for investigating to what extent synesthetic associations between letters and colors can be learned by reading in color in nonsynesthetes. Reading in color is a special method for training associations in the sense that the associations are learned implicitly while the reader reads text as he or she normally would, and it does not require explicit computer-directed training methods. In this protocol, participants are given specially prepared books to read in which four high-frequency letters are paired with four high-frequency colors. Participants receive unique sets of letter-color pairs based on their pre-existing preferences for colored letters. A modified Stroop task is administered before and after reading in order to test for learned letter-color associations and changes in brain activation. In addition to objective testing, a reading experience questionnaire is administered that is designed to probe for differences in subjective experience. A subset of questions may predict how well an individual learned the associations from reading in color. Importantly, we are not claiming that this method will cause each individual to develop grapheme-color synesthesia, only that it is possible for certain individuals to form letter-color associations by reading in color and these associations are similar in some aspects to those seen in developmental grapheme-color synesthetes. The method is quite flexible and can be used to investigate different aspects and outcomes of training synesthetic associations, including learning-induced changes in brain function and structure.
Behavior, Issue 84, synesthesia, training, learning, reading, vision, memory, cognition
Ultrasound Assessment of Endothelial-Dependent Flow-Mediated Vasodilation of the Brachial Artery in Clinical Research
Institutions: University of California, San Francisco, Veterans Affairs Medical Center, San Francisco.
The vascular endothelium is a monolayer of cells that lines the interior of blood vessels, serving both structural and functional roles. The endothelium acts as a barrier, preventing leukocyte adhesion and aggregation, as well as controlling permeability to plasma components. Functionally, the endothelium affects vessel tone.
Endothelial dysfunction is an imbalance between the chemical species that regulate vessel tone, thromboresistance, cellular proliferation and mitosis. It is the first step in atherosclerosis and is associated with coronary artery disease, peripheral artery disease, heart failure, hypertension, and hyperlipidemia.
The first demonstration of endothelial dysfunction involved direct infusion of acetylcholine and quantitative coronary angiography. Acetylcholine binds to muscarinic receptors on the endothelial cell surface, leading to an increase of intracellular calcium and increased nitric oxide (NO) production. In subjects with an intact endothelium, vasodilation was observed while subjects with endothelial damage experienced paradoxical vasoconstriction.
There exists a non-invasive, in vivo method for measuring endothelial function in peripheral arteries using high-resolution B-mode ultrasound. The endothelial function of peripheral arteries is closely related to coronary artery function. This technique measures the percent diameter change in the brachial artery during a period of reactive hyperemia following limb ischemia.
This technique, known as endothelium-dependent, flow-mediated vasodilation (FMD) has value in clinical research settings. However, a number of physiological and technical issues can affect the accuracy of the results and appropriate guidelines for the technique have been published. Despite the guidelines, FMD remains heavily operator dependent and presents a steep learning curve. This article presents a standardized method for measuring FMD in the brachial artery on the upper arm and offers suggestions to reduce intra-operator variability.
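The FMD measure itself is simply the percent change in arterial diameter from baseline, as sketched below; the diameters used are illustrative values, not study data.

```python
def fmd_percent(baseline_mm, peak_mm):
    """Flow-mediated dilation as percent change from baseline diameter."""
    return (peak_mm - baseline_mm) / baseline_mm * 100.0

# illustrative diameters: 4.0 mm at rest, 4.3 mm during reactive hyperemia
fmd = fmd_percent(4.0, 4.3)   # 7.5% dilation
```

Because the numerator is a sub-millimeter difference, small measurement errors in either diameter shift the result substantially, which is one reason the technique is so operator-dependent.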
Medicine, Issue 92, endothelial function, endothelial dysfunction, brachial artery, peripheral artery disease, ultrasound, vascular, endothelium, cardiovascular disease.
Tilt Testing with Combined Lower Body Negative Pressure: a "Gold Standard" for Measuring Orthostatic Tolerance
Institutions: Simon Fraser University.
Orthostatic tolerance (OT) refers to the ability to maintain cardiovascular stability when upright, against the hydrostatic effects of gravity, and hence to maintain cerebral perfusion and prevent syncope (fainting). Various techniques are available to assess OT and the effects of gravitational stress upon the circulation, typically by reproducing a presyncopal event (near-fainting episode) in a controlled laboratory environment. The time and/or degree of stress required to provoke this response provides the measure of OT. Any technique used to determine OT should: enable distinction between patients with orthostatic intolerance (of various causes) and asymptomatic control subjects; be highly reproducible, enabling evaluation of therapeutic interventions; and avoid invasive procedures, which are known to impair OT1.
In the late 1980s, head-upright tilt testing was first utilized for diagnosing syncope2. Since then it has been used to assess OT in patients with syncope of unknown cause, as well as in healthy subjects to study postural cardiovascular reflexes2-6. Tilting protocols comprise three categories: passive tilt; passive tilt accompanied by pharmacological provocation; and passive tilt with combined lower body negative pressure (LBNP). However, the effects of tilt testing (and other orthostatic stress testing modalities) are often poorly reproducible, with low sensitivity and specificity to diagnose orthostatic intolerance7.
Typically, a passive tilt includes 20-60 min of orthostatic stress continued until the onset of presyncope in patients2-6. However, the main drawback of this procedure is its inability to provoke presyncope in all individuals undergoing the test, and corresponding low sensitivity8,9. Thus, different methods were explored to increase the orthostatic stress and improve sensitivity.
Pharmacological provocation has been used to increase the orthostatic challenge, for example using isoprenaline4,7,10,11 or sublingual nitrate12,13. However, the main drawback of these approaches is an increase in sensitivity at the cost of an unacceptable decrease in specificity10,14, with a high positive response rate immediately after administration15. Furthermore, invasive procedures associated with some pharmacological provocations greatly increase the false positive rate1.
Another approach is to combine passive tilt testing with LBNP, providing a stronger orthostatic stress without invasive procedures or drug side-effects, using the technique pioneered by Professor Roger Hainsworth in the 1990s16-18. This approach provokes presyncope in almost all subjects (allowing for symptom recognition in patients with syncope), while discriminating between patients with syncope and healthy controls, with a specificity of 92%, sensitivity of 85%, and repeatability of 1.1±0.6 min16,17. This allows not only diagnosis and pathophysiological assessment19-22, but also the evaluation of treatments for orthostatic intolerance due to its high repeatability23-30. For these reasons, we argue this should be the "gold standard" for orthostatic stress testing, and accordingly this will be the method described in this paper.
Medicine, Issue 73, Anatomy, Physiology, Biomedical Engineering, Neurobiology, Kinesiology, Cardiology, tilt test, lower body negative pressure, orthostatic stress, syncope, orthostatic tolerance, fainting, gravitational stress, head upright, stroke, clinical techniques
Detection of Invasive Pulmonary Aspergillosis in Haematological Malignancy Patients by using Lateral-flow Technology
Institutions: University of Exeter, Queen Mary University of London, St. Bartholomew's Hospital and The London NHS Trust.
Invasive pulmonary aspergillosis (IPA) is a leading cause of morbidity and mortality in haematological malignancy patients and hematopoietic stem cell transplant recipients1. Detection of IPA represents a formidable diagnostic challenge and, in the absence of a 'gold standard', relies on a combination of clinical data and microbiology and histopathology where feasible. Diagnosis of IPA must conform to the European Organization for Research and Treatment of Cancer and the National Institute of Allergy and Infectious Diseases Mycology Study Group (EORTC/MSG) consensus defining "proven", "probable", and "possible" invasive fungal diseases2. Currently, no nucleic acid-based tests have been externally validated for IPA detection and so polymerase chain reaction (PCR) is not included in current EORTC/MSG diagnostic criteria.
Identification of Aspergillus in histological sections is problematic because of similarities in hyphal morphologies with other invasive fungal pathogens3, and proven identification requires isolation of the etiologic agent in pure culture. Culture-based approaches rely on the availability of biopsy samples, but these are not always accessible in sick patients, and do not always yield viable propagules for culture when obtained.
An important feature in the pathogenesis of Aspergillus is angio-invasion, a trait that provides opportunities to track the fungus immunologically using tests that detect characteristic antigenic signature molecules in serum and bronchoalveolar lavage (BAL) fluids. This has led to the development of the Platelia enzyme immunoassay (GM-EIA), which detects Aspergillus galactomannan, and a 'pan-fungal' assay (Fungitell test), which detects the conserved fungal cell wall component (1→3)-β-D-glucan, but not in the mucorales, which lack this component in their cell walls1,4. Issues surrounding the accuracy of these tests1,4-6 have led to the recent development of next-generation monoclonal antibody (MAb)-based assays that detect surrogate markers of infection1,5. We recently described the generation of an Aspergillus-specific MAb (JF5) using hybridoma technology and its use to develop an immuno-chromatographic lateral-flow device (LFD) for the point-of-care (POC) diagnosis of IPA. A major advantage of the LFD is its ability to detect activity, since MAb JF5 binds to an extracellular glycoprotein antigen that is secreted during active growth of the fungus only5. This is an important consideration when using fluids such as lung BAL for diagnosing IPA, since Aspergillus spores are a common component of inhaled air. The utility of the device in diagnosing IPA has been demonstrated using an animal model of infection, where the LFD displayed improved sensitivity and specificity compared to the Platelia GM and Fungitell (1→3)-β-D-glucan assays7.
Here, we present a simple LFD procedure to detect Aspergillus antigen in human serum and BAL fluids. Its speed and accuracy provide a novel adjunct point-of-care test for diagnosis of IPA in haematological malignancy patients.
Immunology, Issue 61, Invasive pulmonary aspergillosis, acute myeloid leukemia, bone marrow transplant, diagnosis, monoclonal antibody, lateral-flow technology
One-step Metabolomics: Carbohydrates, Organic and Amino Acids Quantified in a Single Procedure
Institutions: Saint Louis University School of Medicine.
Every infant born in the US is now screened for up to 42 rare genetic disorders called "inborn errors of metabolism". The screening method is based on tandem mass spectrometry and quantifies acylcarnitines as a screen for organic acidemias and also measures amino acids. All states also perform enzymatic testing for carbohydrate disorders such as galactosemia. Because the results can be non-specific, follow-up testing of positive results is required using a more definitive method. The present report describes the "urease" method of sample preparation for inborn error screening. Crystalline urease enzyme is used to remove urea from body fluids, which permits most other water-soluble metabolites to be dehydrated and derivatized for gas chromatography in a single procedure. Dehydration by evaporation in a nitrogen stream is facilitated by adding acetonitrile and methylene chloride. Then, trimethylsilylation takes place in the presence of a unique catalyst, triethylammonium trifluoroacetate. Automated injection and chromatography is followed by macro-driven custom quantification of 192 metabolites and semi-quantification of every major component using specialized libraries of mass spectra of TMS-derivatized biological compounds. The analysis may be performed on the widely-used Chemstation platform using the macros and libraries available from the author. In our laboratory, over 16,000 patient samples have been analyzed using the method with a diagnostic yield of about 17%; that is, 17% of the sample results reveal findings that should be acted upon by the ordering physician. Included in these are over 180 confirmed inborn errors, of which about 38% could not have been diagnosed using previous methods.
Biochemistry, Issue 40, metabolomics, gas chromatography/mass spectrometry, GC/MS, inborn errors, vitamin deficiency, BNA analyses, carbohydrate, amino acid, organic acid, urease
Basics of Multivariate Analysis in Neuroimaging Data
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise, techniques1,5-9. Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice. A conceptual introduction is followed with a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
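The statistical-power argument can be made concrete with a small simulation: a weak signal spread over many voxels gives each voxel only a tiny effect size, so no voxel survives an alpha/V multiple-comparison correction, while a single test on a pattern-expression score concentrates the same signal into one large effect. The setup below is illustrative only; the fixed uniform weight vector stands in for a pattern validated on independent data.

```python
import numpy as np

rng = np.random.default_rng(42)
n_subj, n_vox = 20, 2000
delta = 0.3   # weak per-voxel group difference, in noise-SD units
patients = rng.normal(delta, 1.0, size=(n_subj, n_vox))
controls = rng.normal(0.0, 1.0, size=(n_subj, n_vox))

def cohens_d(a, b):
    """Two-sample effect size with pooled standard deviation."""
    pooled = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2.0)
    return (a.mean() - b.mean()) / pooled

# univariate route: one small effect per voxel, each of which would
# then have to survive an alpha/n_vox multiple-comparison correction
voxel_d = np.array([cohens_d(patients[:, v], controls[:, v])
                    for v in range(n_vox)])

# multivariate route: one test on a single pattern-expression score
# (uniform weights stand in for a previously validated pattern)
weights = np.ones(n_vox) / np.sqrt(n_vox)
pattern_d = cohens_d(patients @ weights, controls @ weights)
```

Here the pattern-score effect size dwarfs every single-voxel effect size, which is the power advantage the text describes; in practice the pattern would of course be derived on one dataset and applied prospectively to another.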
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
Institutions: University of Utah.
A limitation of traditional full-field electroretinograms (ERG) for the diagnosis of retinopathy is lack of sensitivity. Generally, ERG results are normal unless more than approximately 20% of the retina is affected. In practical terms, a patient might be legally blind as a result of macular degeneration or other scotomas and still appear normal according to traditional full-field ERG. An important development in ERGs is the multifocal ERG (mfERG). Erich Sutter adapted the mathematical sequences called binary m-sequences, enabling the isolation, from a single electrical signal, of an electroretinogram representing less than each square millimeter of retina in response to a visual stimulus1.
Results generated by mfERG appear similar to those generated by flash ERG. In contrast to flash ERG, which best generates data appropriate for whole-eye disorders, the basic mfERG result is based on the calculated mathematical average of an approximation of the positive deflection component of the traditional ERG response, known as the b-wave1. Multifocal ERG programs measure electrical activity from more than a hundred retinal areas per eye, in a few minutes. The enhanced spatial resolution enables scotomas and retinal dysfunction to be mapped and quantified.
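The binary m-sequences Sutter adapted are maximal-length sequences produced by a linear-feedback shift register (LFSR): an n-bit register with a primitive feedback polynomial cycles through all 2^n - 1 nonzero states before repeating. A minimal generator is sketched below with a small 4-bit example; this is an illustration of the sequence family, not the stimulus code of any mfERG system.

```python
def m_sequence(taps, nbits, seed=1):
    """Maximal-length binary sequence from a Fibonacci LFSR.

    taps follow the common convention in which taps=[4, 3] with
    nbits=4 implements the primitive polynomial x^4 + x^3 + 1.
    A maximal-length n-bit register repeats every 2**n - 1 steps.
    """
    state = seed
    out = []
    for _ in range(2 ** nbits - 1):
        out.append(state & 1)                 # output the low bit
        fb = 0
        for t in taps:
            fb ^= (state >> (nbits - t)) & 1  # XOR the tapped bits
        state = (state >> 1) | (fb << (nbits - 1))
    return out

seq = m_sequence([4, 3], 4)   # one full period: 2**4 - 1 = 15 steps
```

Real mfERG stimulation uses much longer registers, and each retinal patch is driven by a shifted copy of the same sequence, which is what lets cross-correlation isolate each patch's local response from the single recorded signal.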
In the protocol below, we describe the recording of mfERGs using a bipolar speculum contact lens.
Components of mfERG systems vary between manufacturers. For the presentation of the visible stimulus, some suitable CRT monitors are available, but most systems have adopted the use of flat-panel liquid crystal displays (LCD). The visual stimuli depicted here were produced by an LCD microdisplay subtending 35-40 degrees horizontally and 30-35 degrees vertically of visual field, and calibrated to produce multifocal flash intensities of 2.7 cd·s·m-2. Amplification was 50K. Lower and upper bandpass limits were 10 and 300 Hz. The software packages used were VERIS versions 5 and 6.
Medicine, Issue 58, Multifocal electroretinogram, mfERG, electroretinogram, ERG