Testing adaptive hypotheses of convergence with functional landscapes: a case study of bone-cracking hypercarnivores.
PUBLISHED: 01-01-2013
Morphological convergence is a well-documented phenomenon in mammals, and adaptive explanations are commonly employed to infer similar functions for convergent characteristics. I present a study that adopts aspects of theoretical morphology and engineering optimization to test hypotheses about adaptive convergent evolution. Bone-cracking ecomorphologies in Carnivora were used as a case study. Previous research has shown that skull deepening and widening are major evolutionary patterns in convergent bone-cracking canids and hyaenids. A simple two-dimensional design space, with skull width-to-length and depth-to-length ratios as variables, was used to examine optimized shapes for two functional properties: mechanical advantage (MA) and strain energy (SE). The functionality of theoretical skull shapes was studied using finite element analysis (FEA) and visualized as functional landscapes. The distribution of actual skull shapes in the landscape showed a convergent trend of plesiomorphically low-MA, moderate-SE skulls evolving towards higher-MA, moderate-SE skulls; this is corroborated by FEA of 13 actual specimens. Nevertheless, regions exist in the landscape where high-MA, lower-SE shapes are not represented by existing species; their vacancy is observed even at higher taxonomic levels. The results highlight the interaction of biomechanical and non-biomechanical factors in constraining general skull dimensions to localized functional optima through evolution.
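The functional-landscape idea above can be sketched in a few lines: build a grid over the two shape ratios, evaluate a property at every grid point, and locate the optimum. The property below is a made-up stand-in, since the study computed MA and SE from finite element analysis of each theoretical skull shape:

```python
import numpy as np

# Toy "functional landscape" over the two shape ratios used in the study:
# skull width-to-length (w) and depth-to-length (d). The MA function here
# is a hypothetical stand-in; real values came from FEA of each shape.
w = np.linspace(0.3, 0.9, 25)   # illustrative width/length range
d = np.linspace(0.2, 0.8, 25)   # illustrative depth/length range
W, D = np.meshgrid(w, d)
MA = W * D / (W + D)            # stand-in property, monotone in both ratios
iy, ix = np.unravel_index(np.argmax(MA), MA.shape)
print(round(float(W[iy, ix]), 2), round(float(D[iy, ix]), 2))
```

Actual skull shapes can then be plotted onto such a landscape to see which regions of the optimum evolution actually reached.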
Authors: Tayyab Suratwala, Rusty Steele, Michael Feit, Rebecca Dylla-Spears, Richard Desjardin, Dan Mason, Lana Wong, Paul Geraghty, Phil Miller, Nan Shen.
Published: 12-01-2014
Convergent Polishing is a novel polishing system and method for finishing flat and spherical glass optics in which a workpiece, independent of its initial shape (i.e., surface figure), will converge to the final surface figure with excellent surface quality under a fixed, unchanging set of polishing parameters in a single polishing iteration. In contrast, conventional full-aperture polishing methods require multiple, often long, iterative cycles involving polishing, metrology and process changes to achieve the desired surface figure. The Convergent Polishing process is based on the concept of workpiece-lap height mismatch resulting in a pressure differential that decreases with removal, so that the workpiece converges to the shape of the lap. The successful implementation of the Convergent Polishing process results from combining a number of technologies to remove all sources of non-uniform spatial material removal (except for workpiece-lap mismatch) for surface figure convergence, and to reduce the number of rogue particles in the system for low scratch densities and low roughness. The Convergent Polishing process has been demonstrated for the fabrication of both flats and spheres of various shapes, sizes, and aspect ratios on various glass materials. The practical impact is that high quality optical components can be fabricated more rapidly, more repeatably, with less metrology, and with less labor, resulting in lower unit costs. In this study, the Convergent Polishing protocol is specifically described for fabricating 26.5 cm square fused silica flats from a fine-ground surface to a polished ~λ/2 surface figure after polishing 4 hr per surface on an 81 cm diameter polisher.
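The stated mechanism (removal rate proportional to pressure, pressure proportional to workpiece-lap height mismatch) implies first-order decay of the surface-figure error, which is why a single fixed-parameter run can converge. A minimal sketch with illustrative constants, not measurements from the study:

```python
import numpy as np

# Minimal convergence model: local removal rate proportional to local
# pressure (Preston's law), pressure proportional to workpiece-lap height
# mismatch h, hence dh/dt = -k*h and the figure error decays exponentially.
# k and h0 are assumed values for illustration only.
k = 0.8                        # assumed convergence rate constant (1/hr)
h0 = 2.0                       # assumed initial surface-figure error (waves)
t = np.linspace(0.0, 4.0, 5)   # polishing time points (hr)
h = h0 * np.exp(-k * t)
print(np.round(h, 3))          # monotonically decaying mismatch
```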
In situ Compressive Loading and Correlative Noninvasive Imaging of the Bone-periodontal Ligament-tooth Fibrous Joint
Authors: Andrew T. Jang, Jeremy D. Lin, Youngho Seo, Sergey Etchin, Arno Merkle, Kevin Fahey, Sunita P. Ho.
Institutions: University of California San Francisco, University of California San Francisco, Xradia Inc.
This study demonstrates a novel biomechanics testing protocol. The advantage of this protocol includes the use of an in situ loading device coupled to a high resolution X-ray microscope, thus enabling visualization of internal structural elements under simulated physiological loads and wet conditions. Experimental specimens will include intact bone-periodontal ligament (PDL)-tooth fibrous joints. Results will illustrate three important features of the protocol as they can be applied to organ-level biomechanics: 1) reactionary force vs. displacement: tooth displacement within the alveolar socket and its reactionary response to loading, 2) three-dimensional (3D) spatial configuration and morphometrics: geometric relationship of the tooth with the alveolar socket, and 3) changes in readouts 1 and 2 due to a change in loading axis, i.e. from concentric to eccentric loads. Efficacy of the proposed protocol will be evaluated by coupling mechanical testing readouts to 3D morphometrics and overall biomechanics of the joint. In addition, this technique emphasizes the need to equilibrate experimental conditions, specifically reactionary loads, prior to acquiring tomograms of fibrous joints. It should be noted that the proposed protocol is limited to testing specimens under ex vivo conditions, and that use of contrast agents to visualize soft tissue mechanical response could lead to erroneous conclusions about tissue- and organ-level biomechanics.
Bioengineering, Issue 85, biomechanics, bone-periodontal ligament-tooth complex, concentric loads, eccentric loads, contrast agent
Non-radioactive in situ Hybridization Protocol Applicable for Norway Spruce and a Range of Plant Species
Authors: Anna Karlgren, Jenny Carlsson, Niclas Gyllenstrand, Ulf Lagercrantz, Jens F. Sundström.
Institutions: Uppsala University, Swedish University of Agricultural Sciences.
The high-throughput expression analysis technologies available today give scientists an overflow of expression profiles, but their resolution in terms of tissue-specific expression is limited by the difficulty of dissecting individual tissues. Expression data need to be confirmed and complemented with expression patterns using e.g. in situ hybridization, a technique used to localize cell-specific mRNA expression. The in situ hybridization method is laborious, time-consuming and often requires extensive optimization depending on species and tissue. In situ experiments are more difficult to perform in woody species such as the conifer Norway spruce (Picea abies). Here we present a modified DIG in situ hybridization protocol, which is fast and applicable to a wide range of plant species including P. abies. With just a few adjustments, including altered RNase treatment and proteinase K concentration, we could use the protocol to study tissue-specific expression of homologous genes in male reproductive organs of one gymnosperm and two angiosperm species: P. abies, Arabidopsis thaliana and Brassica napus. The protocol worked equally well for the species and genes studied. AtAP3 and BnAP3 expression was observed in second and third whorl floral organs in A. thaliana and B. napus, and DAL13 in microsporophylls of male cones from P. abies. For P. abies, the proteinase K concentration used to permeabilize the tissues had to be increased from 1 µg/ml to 3 µg/ml, possibly due to more compact tissues and higher levels of phenolics and polysaccharides. For all species, the RNase treatment was omitted because it reduced signal strength without a corresponding increase in specificity. By comparing tissue-specific expression patterns of homologous genes from both flowering plants and a coniferous tree, we demonstrate that the DIG in situ protocol presented here, with only minute adjustments, can be applied to a wide range of plant species.
Hence, the protocol avoids both extensive species-specific optimization and the laborious use of radioactively labeled probes in favor of DIG-labeled probes. We have chosen to illustrate the technically demanding steps of the protocol in our film. Anna Karlgren and Jenny Carlsson contributed equally to this study. Corresponding authors: Anna Karlgren and Jens F. Sundström.
Plant Biology, Issue 26, RNA, expression analysis, Norway spruce, Arabidopsis, rapeseed, conifers
A Proboscis Extension Response Protocol for Investigating Behavioral Plasticity in Insects: Application to Basic, Biomedical, and Agricultural Research
Authors: Brian H. Smith, Christina M. Burden.
Institutions: Arizona State University.
Insects modify their responses to stimuli through experience of associating those stimuli with events important for survival (e.g., food, mates, threats). There are several behavioral mechanisms through which an insect learns salient associations and relates them to these events. Understanding this behavioral plasticity is important for programs aimed at assisting insects that are beneficial for agriculture. This understanding can also be used to discover solutions to biomedical and agricultural problems created by insects that act as disease vectors and pests. The Proboscis Extension Response (PER) conditioning protocol was developed for honey bees (Apis mellifera) over 50 years ago to study how they perceive and learn about floral odors, which signal the nectar and pollen resources a colony needs for survival. The PER procedure provides a robust and easy-to-employ framework for studying several different ecologically relevant mechanisms of behavioral plasticity. It is easily adaptable for use with several other insect species and other behavioral reflexes. These protocols can be readily employed in conjunction with various means for monitoring neural activity in the CNS via electrophysiology or bioimaging, or for manipulating targeted neuromodulatory pathways. It is a robust assay for rapidly detecting sub-lethal effects on behavior caused by environmental stressors, toxins or pesticides. We show how the PER protocol is straightforward to implement using two procedures. One is suitable as a laboratory exercise for students or for quick assays of the effect of an experimental treatment. The other provides more thorough control of variables, which is important for studies of behavioral conditioning. We show how several measures of the behavioral response, ranging from binary yes/no scores to more continuous variables such as the latency and duration of proboscis extension, can be used to test hypotheses. Finally, we discuss some pitfalls that researchers commonly encounter when they use the procedure for the first time.
Neuroscience, Issue 91, PER, conditioning, honey bee, olfaction, olfactory processing, learning, memory, toxin assay
Shrinkage of Dental Composite in Simulated Cavity Measured with Digital Image Correlation
Authors: Jianying Li, Preetanjali Thakur, Alex S. L. Fok.
Institutions: University of Minnesota.
Polymerization shrinkage of dental resin composites can lead to restoration debonding or cracked tooth tissues in composite-restored teeth. In order to understand where and how shrinkage strain and stress develop in such restored teeth, Digital Image Correlation (DIC) was used to provide a comprehensive view of the displacement and strain distributions within model restorations that had undergone polymerization shrinkage. Specimens with model cavities were made from cylindrical glass rods, 10 mm in both diameter and length. The mesial-occlusal-distal (MOD) cavity prepared in each specimen measured 3 mm in width and 2 mm in depth. After filling the cavity with resin composite, the surface under observation was sprayed first with a thin layer of white paint and then with fine black charcoal powder to create high-contrast speckles. Pictures of that surface were taken before curing and 5 min after. Finally, the two pictures were correlated using DIC software to calculate the displacement and strain distributions. The resin composite shrank vertically towards the bottom of the cavity, with the top center portion of the restoration having the largest downward displacement. At the same time, it shrank horizontally towards its vertical midline. Shrinkage of the composite stretched the material in the vicinity of the “tooth-restoration” interface, resulting in cuspal deflections and high tensile strains around the restoration. Material close to the cavity walls or floor had direct strains mostly in the directions perpendicular to the interfaces. The sum of the two direct strain components showed a relatively uniform distribution around the restoration, and its magnitude was approximately equal to the volumetric shrinkage strain of the material.
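The final step, summing the two in-plane direct strains derived from the displacement fields, can be illustrated with a synthetic example. The grid and displacement field below are invented, standing in for DIC output:

```python
import numpy as np

# Synthetic displacement field: a uniform 1% contraction in x and y,
# standing in for the DIC-measured fields (values are illustrative).
x = np.linspace(0.0, 3.0, 61)   # mm, across the cavity width
y = np.linspace(0.0, 2.0, 41)   # mm, through the cavity depth
X, Y = np.meshgrid(x, y)
u = -0.01 * X                    # horizontal displacement (mm)
v = -0.01 * Y                    # vertical displacement (mm)

# Direct (normal) strains are the spatial derivatives of displacement.
exx = np.gradient(u, x, axis=1)
eyy = np.gradient(v, y, axis=0)

# The sum of the two in-plane direct strains gives the in-plane
# dilatation, which the study found to track the volumetric shrinkage.
dilatation = exx + eyy
print(round(float(dilatation.mean()), 4))   # -0.02, i.e. 2% shrinkage
```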
Medicine, Issue 89, image processing, computer-assisted, polymer matrix composites, testing of materials (composite materials), dental composite restoration, polymerization shrinkage, digital image correlation, full-field strain measurement, interfacial debonding
An Improved Mechanical Testing Method to Assess Bone-implant Anchorage
Authors: Spencer Bell, Elnaz Ajami, John E. Davies.
Institutions: University of Toronto.
Recent advances in materials science have led to a substantial increase in the topographical complexity of implant surfaces, on both the micro- and nano-scale. As such, traditional methods of describing implant surfaces - namely numerical determinants of surface roughness - are inadequate for predicting in vivo performance. Biomechanical testing provides an accurate and comparative platform for analyzing the performance of biomaterial surfaces. An improved mechanical testing method to assess the anchorage of bone to candidate implant surfaces is presented. The method is applicable to both early and later stages of healing and can be employed for any range of chemically or mechanically modified surfaces - but not smooth surfaces. Custom rectangular implants are placed bilaterally in the distal femora of male Wistar rats and collected with the surrounding bone. Test specimens are prepared and potted using a novel breakaway mold, and the disruption test is conducted using a mechanical testing machine. This method allows the disruption force to be aligned exactly perpendicular, or parallel, to the plane of the implant surface, and provides an accurate and reproducible means of isolating an exact peri-implant region for testing.
Bioengineering, Issue 84, Mechanical test, bone anchorage, disruption test, surface topography, peri-implant bone, bone-implant interface, bone-bonding, microtopography, nanotopography
Design and Construction of an Urban Runoff Research Facility
Authors: Benjamin G. Wherley, Richard H. White, Kevin J. McInnes, Charles H. Fontanier, James C. Thomas, Jacqueline A. Aitkenhead-Peterson, Steven T. Kelly.
Institutions: Texas A&M University, The Scotts Miracle-Gro Company.
As the urban population increases, so does the area of irrigated urban landscape. Summer water use in urban areas can be 2-3× the winter baseline water use due to increased demand for landscape irrigation. Improper irrigation practices and large rainfall events can result in runoff from urban landscapes, which has the potential to carry nutrients and sediments into local streams and lakes where they may contribute to eutrophication. A 1,000 m² facility was constructed which consists of 24 individual 33.6 m² field plots, each equipped for measuring total runoff volumes with time and collecting runoff subsamples at selected intervals for quantification of chemical constituents in the runoff water from simulated urban landscapes. Runoff volumes from the first and second trials had coefficient of variability (CV) values of 38.2 and 28.7%, respectively. CV values for runoff pH, EC, and Na concentration for both trials were all under 10%. Concentrations of DOC, TDN, DON, PO₄-P, K⁺, Mg²⁺, and Ca²⁺ had CV values less than 50% in both trials. Overall, the results of testing performed after sod installation at the facility indicated good uniformity between plots for runoff volumes and chemical constituents. The large plot size is sufficient to include much of the natural variability and therefore provides a better simulation of urban landscape ecosystems.
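The plot-uniformity statistic used above, the coefficient of variability, is simply the sample standard deviation expressed as a percentage of the mean. A quick sketch with hypothetical plot volumes (the actual trial values are not given in the abstract):

```python
import numpy as np

def cv_percent(values):
    """Coefficient of variability: sample standard deviation as % of mean."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical runoff volumes (liters) from the 24 plots in one event.
runoff = [155.0, 210.0, 185.0, 260.0, 205.0, 190.0, 240.0, 170.0,
          150.0, 225.0, 195.0, 175.0, 230.0, 200.0, 165.0, 215.0,
          180.0, 250.0, 160.0, 220.0, 245.0, 205.0, 235.0, 175.0]
print(round(cv_percent(runoff), 1))
```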
Environmental Sciences, Issue 90, urban runoff, landscapes, home lawns, turfgrass, St. Augustinegrass, carbon, nitrogen, phosphorus, sodium
The Use of Magnetic Resonance Spectroscopy as a Tool for the Measurement of Bi-hemispheric Transcranial Electric Stimulation Effects on Primary Motor Cortex Metabolism
Authors: Sara Tremblay, Vincent Beaulé, Sébastien Proulx, Louis-Philippe Lafleur, Julien Doyon, Małgorzata Marjańska, Hugo Théoret.
Institutions: University of Montréal, McGill University, University of Minnesota.
Transcranial direct current stimulation (tDCS) is a neuromodulation technique that has been increasingly used over the past decade in the treatment of neurological and psychiatric disorders such as stroke and depression. Yet, the mechanisms underlying its ability to modulate brain excitability to improve clinical symptoms remain poorly understood33. To help improve this understanding, proton magnetic resonance spectroscopy (¹H-MRS) can be used, as it allows the in vivo quantification of brain metabolites such as γ-aminobutyric acid (GABA) and glutamate in a region-specific manner41. In fact, a recent study demonstrated that ¹H-MRS is indeed a powerful means to better understand the effects of tDCS on neurotransmitter concentration34. This article aims to describe the complete protocol for combining tDCS (NeuroConn MR compatible stimulator) with ¹H-MRS at 3 T using a MEGA-PRESS sequence. We will describe the impact of a protocol that has shown great promise for the treatment of motor dysfunction after stroke, which consists of bilateral stimulation of the primary motor cortices27,30,31. Methodological factors to consider and possible modifications to the protocol are also discussed.
Neuroscience, Issue 93, proton magnetic resonance spectroscopy, transcranial direct current stimulation, primary motor cortex, GABA, glutamate, stroke
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. 
Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
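The triage idea can be caricatured as a decision function over data set characteristics. The inputs and thresholds below are purely illustrative, not the criteria from the study:

```python
def suggest_approach(snr, has_characteristic_shapes, roi_fraction):
    """
    Toy triage over the four segmentation strategies named in the text.
    snr: signal-to-noise ratio of the volume (illustrative scale);
    has_characteristic_shapes: whether features are easy to identify;
    roi_fraction: fraction of the volume occupied by the region of
    interest (0-1). All thresholds are hypothetical.
    """
    if has_characteristic_shapes and snr > 10:
        return "automated custom algorithm + surface rendering"
    if snr > 5:
        return "semi-automated segmentation + surface rendering"
    if roi_fraction < 0.05:
        return "manual tracing + surface rendering"
    return "fully manual model building + visualization"

print(suggest_approach(snr=12, has_characteristic_shapes=True, roi_fraction=0.2))
```

As the abstract notes, more than one approach may succeed for a given data set; a real triage would also weigh crispness, heterogeneity, and crowdedness of features.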
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities, such as structural MRI, that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as the spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues change dramatically over development3. In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age-specific head models to reconstruct the cortical generators of high-density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel-level and source analysis.
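As a sketch of the source-analysis step, minimum-norm estimation (listed in the keywords) reduces to one regularized linear solve. The lead-field matrix below is random for illustration; in practice it comes from the individual or age-specific head model described above:

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_sources = 64, 200

# Random stand-in for the lead-field matrix; a real one is computed
# from the head model (tissue conductivities and geometry).
L = rng.standard_normal((n_channels, n_sources))

s_true = np.zeros(n_sources)
s_true[17] = 1.0                  # one active cortical source
x = L @ s_true + 0.01 * rng.standard_normal(n_channels)  # sensor data

# Regularized minimum-norm estimate: s = L^T (L L^T + lam*I)^(-1) x
lam = 1.0
s_hat = L.T @ np.linalg.solve(L @ L.T + lam * np.eye(n_channels), x)
print(int(np.argmax(np.abs(s_hat))))   # index of strongest estimated source
```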
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Controlled Cortical Impact Model for Traumatic Brain Injury
Authors: Jennifer Romine, Xiang Gao, Jinhui Chen.
Institutions: Indiana University School of Medicine.
Every year over a million Americans suffer a traumatic brain injury (TBI). Combined with the incidence of TBIs worldwide, the physical, emotional, social, and economic effects are staggering. Therefore, further research into the effects of TBI and effective treatments is necessary. The controlled cortical impact (CCI) model induces traumatic brain injuries ranging from mild to severe. This method uses a rigid impactor to deliver mechanical energy to an intact dura exposed following a craniectomy. Impact is made under precise parameters at a set velocity to achieve a pre-determined deformation depth. Although other TBI models, such as weight drop and fluid percussion, exist, CCI is more accurate, easier to control, and, most importantly, produces traumatic brain injuries similar to those seen in humans. However, no TBI model is currently able to reproduce pathological changes identical to those seen in human patients. The CCI model allows investigation into the short-term and long-term effects of TBI, such as neuronal death, memory deficits, and cerebral edema, as well as potential therapeutic treatments for TBI.
Medicine, Issue 90, controlled cortical impact, traumatic brain injury, cortical contusion
Building An Open-source Robotic Stereotaxic Instrument
Authors: Kevin R. Coffey, David J. Barker, Sisi Ma, Mark O. West.
Institutions: Rutgers, The State University of New Jersey.
This protocol includes the designs and software necessary to upgrade an existing stereotaxic instrument to a robotic (CNC) stereotaxic instrument for around $1,000 (excluding a drill), using industry-standard stepper motors and CNC controlling software. Each axis has variable speed control and may be operated simultaneously or independently. The robot's flexibility and open coding system (g-code) make it capable of performing custom tasks that are not supported by commercial systems. Its applications include, but are not limited to, drilling holes, sharp-edge craniotomies, skull thinning, and lowering electrodes or cannulae. In order to expedite the writing of g-code for simple surgeries, we have developed custom scripts that allow individuals to design a surgery with no knowledge of programming. However, for users to get the most out of the motorized stereotax, it would be beneficial to be knowledgeable in mathematical programming and g-code (simple programming for CNC machining). The recommended drill speed is greater than 40,000 rpm. The stepper motor resolution is 1.8°/step, geared to 0.346°/step, which on a standard stereotax gives a resolution of 2.88 μm/step. The maximum recommended cutting speed is 500 μm/sec. The maximum recommended jogging speed is 3,500 μm/sec. The maximum recommended drill bit size is HP 2.
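The quoted resolutions are mutually consistent if one assumes a leadscrew travel of about 3 mm per revolution, a value not given in the text and inferred here purely so the numbers can be checked:

```python
# Linear axis resolution implied by the quoted figures, assuming a
# leadscrew travel of ~3 mm per revolution (assumption, not from the text).
motor_step_deg = 1.8            # stepper motor resolution (deg/step)
geared_step_deg = 0.346         # effective resolution after gearing
pitch_um = 3000.0               # assumed leadscrew travel per rev (um)

gear_ratio = motor_step_deg / geared_step_deg
res_um_per_step = pitch_um * geared_step_deg / 360.0
print(round(gear_ratio, 2), round(res_um_per_step, 2))  # 5.2 2.88
```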
Neuroscience, Issue 80, Surgical Instruments, computer aided manufacturing (CAM), Engineering, Behavioral Sciences, Stereotactic Surgery, Robotic Surgery, Replicability, Open-Source, Computer Numerical Control, G-Code, CNC
Training Synesthetic Letter-color Associations by Reading in Color
Authors: Olympia Colizoli, Jaap M. J. Murre, Romke Rouw.
Institutions: University of Amsterdam.
Synesthesia is a rare condition in which a stimulus from one modality automatically and consistently triggers unusual sensations in the same and/or other modalities. A relatively common and well-studied type is grapheme-color synesthesia, defined as the consistent experience of color when viewing, hearing and thinking about letters, words and numbers. We describe our method for investigating to what extent synesthetic associations between letters and colors can be learned by reading in color in nonsynesthetes. Reading in color is a special method for training associations in the sense that the associations are learned implicitly while the reader reads text as he or she normally would and it does not require explicit computer-directed training methods. In this protocol, participants are given specially prepared books to read in which four high-frequency letters are paired with four high-frequency colors. Participants receive unique sets of letter-color pairs based on their pre-existing preferences for colored letters. A modified Stroop task is administered before and after reading in order to test for learned letter-color associations and changes in brain activation. In addition to objective testing, a reading experience questionnaire is administered that is designed to probe for differences in subjective experience. A subset of questions may predict how well an individual learned the associations from reading in color. Importantly, we are not claiming that this method will cause each individual to develop grapheme-color synesthesia, only that it is possible for certain individuals to form letter-color associations by reading in color and these associations are similar in some aspects to those seen in developmental grapheme-color synesthetes. The method is quite flexible and can be used to investigate different aspects and outcomes of training synesthetic associations, including learning-induced changes in brain function and structure.
Behavior, Issue 84, synesthesia, training, learning, reading, vision, memory, cognition
Dissecting the Non-human Primate Brain in Stereotaxic Space
Authors: Mark W. Burke, Shahin Zangenehpour, Denis Boire, Maurice Ptito.
Institutions: University of Montreal, University of Montreal, Université du Québec à Trois-Rivières.
The use of non-human primates provides an excellent translational model for our understanding of developmental and aging processes in humans1-6. In addition, the use of non-human primates has recently afforded the opportunity to naturally model complex psychiatric disorders such as alcohol abuse7. Here we describe a technique for blocking the brain in the coronal plane of the vervet monkey (Chlorocebus aethiops sabeus) in the intact skull in stereotaxic space. The method described here provides a standard plane of section between blocks and subjects and minimizes partial sections between blocks. Sectioning a block of tissue in the coronal plane also facilitates the delineation of an area of interest. This method provides manageable-sized blocks, since a single hemisphere of the vervet monkey yields more than 1,200 sections when slicing at 50 μm. Furthermore, blocking the brain into 1 cm blocks facilitates penetration of sucrose for cryoprotection and allows each block to be sliced on a standard cryostat.
Neuroscience, Issue 29, Non-human primate, brain bank, stereotaxic apparatus, cryostat, dissection
High-density EEG Recordings of Freely Moving Mice Using a Polyimide-based Microelectrode
Authors: Mina Lee, Dongwook Kim, Hee-Sup Shin, Ho-Geun Sung, Jee Hyun Choi.
Institutions: Korea Institute of Science and Technology (KIST), University of Science and Technology, Korea Advanced Nano Fab Center.
Electroencephalography (EEG) measures the averaged electrical activity of neuronal populations on a large-scale level. It is widely used as a noninvasive brain-monitoring tool in cognitive neuroscience, as well as a diagnostic tool for epilepsy and sleep disorders in neurology. However, the mechanism underlying EEG rhythm generation remains poorly understood. The recently introduced polyimide-based microelectrode array (PBM-array) for high-resolution mouse EEG1 is one approach to answering neurophysiological questions about EEG signals, drawing on the rich genetic resources the mouse model offers for analyzing the complex process of EEG generation. Applying the nanofabricated PBM-array to the mouse skull is an efficient way to collect large-scale brain activity from transgenic mice, and it helps identify the neural correlates of particular EEG rhythms in conjunction with behavior. However, its ultra-thin profile and bifurcated structure make handling and implantation difficult. In the presented video, the preparation and surgery steps for implanting a PBM-array on a mouse skull are described step by step. Handling and surgery tips to help researchers succeed in implantation are also provided.
Neuroscience, Issue 47, Electroencephalography (EEG), Mouse, Microelectrode, Brain Imaging
In Situ Neutron Powder Diffraction Using Custom-made Lithium-ion Batteries
Authors: William R. Brant, Siegbert Schmid, Guodong Du, Helen E. A. Brand, Wei Kong Pang, Vanessa K. Peterson, Zaiping Guo, Neeraj Sharma.
Institutions: University of Sydney, University of Wollongong, Australian Synchrotron, Australian Nuclear Science and Technology Organisation, University of Wollongong, University of New South Wales.
Li-ion batteries are widely used in portable electronic devices and are considered promising candidates for higher-energy applications such as electric vehicles.1,2 However, many challenges, such as energy density and battery lifetimes, need to be overcome before this particular battery technology can be widely implemented in such applications.3 This research is challenging, and we outline a method to address these challenges using in situ NPD to probe the crystal structure of electrodes undergoing electrochemical cycling (charge/discharge) in a battery. NPD data help determine the underlying structural mechanism responsible for a range of electrode properties, and this information can direct the development of better electrodes and batteries. We briefly review six types of battery designs custom-made for NPD experiments and detail the method to construct the ‘roll-over’ cell that we have successfully used on the high-intensity NPD instrument, WOMBAT, at the Australian Nuclear Science and Technology Organisation (ANSTO). The design considerations and materials used for cell construction are discussed in conjunction with aspects of the actual in situ NPD experiment, and initial directions are presented on how to analyze such complex in situ data.
Physics, Issue 93, In operando, structure-property relationships, electrochemical cycling, electrochemical cells, crystallography, battery performance
Play Button
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3-6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL and of the potential influence of the dimension's underlying category structure on affective experience are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical human-like similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Play Button
Environmentally-controlled Microtensile Testing of Mechanically-adaptive Polymer Nanocomposites for ex vivo Characterization
Authors: Allison E. Hess, Kelsey A. Potter, Dustin J. Tyler, Christian A. Zorman, Jeffrey R. Capadona.
Institutions: Louis Stokes Cleveland Department of Veterans Affairs Medical Center, Case Western Reserve University, Case Western Reserve University.
Implantable microdevices are gaining significant attention for several biomedical applications1-4. Such devices have been made from a range of materials, each offering its own advantages and shortcomings5,6. Most prominently, due to the microscale device dimensions, a high modulus is required to facilitate implantation into living tissue. Conversely, the stiffness of the device should match the surrounding tissue to minimize induced local strain7-9. Therefore, we recently developed a new class of bio-inspired materials to meet these requirements by responding to environmental stimuli with a change in mechanical properties10-14. Specifically, our poly(vinyl acetate)-based nanocomposite (PVAc-NC) displays a reduction in stiffness when exposed to water and elevated temperatures (e.g. body temperature). Unfortunately, few methods exist to quantify the stiffness of materials in vivo15, and mechanical testing outside of the physiological environment often requires large samples inappropriate for implantation. Further, stimuli-responsive materials may quickly recover their initial stiffness after explantation. Therefore, we have developed a method by which the mechanical properties of implanted microsamples can be measured ex vivo, with simulated physiological conditions maintained using moisture and temperature control13,16,17. To this end, a custom microtensile tester was designed to accommodate microscale samples13,17 with widely-varying Young's moduli (range of 10 MPa to 5 GPa). As our interests are in the application of PVAc-NC as a biologically-adaptable neural probe substrate, a tool capable of mechanical characterization of samples at the microscale was necessary. This tool was adapted to provide humidity and temperature control, which minimized sample drying and cooling17. As a result, the mechanical characteristics of the explanted sample closely reflect those of the sample just prior to explantation. 
The overall goal of this method is to quantitatively assess the in vivo mechanical properties, specifically the Young's modulus, of stimuli-responsive, mechanically-adaptive polymer-based materials. This is accomplished by first establishing the environmental conditions that will minimize a change in sample mechanical properties after explantation without contributing to a reduction in stiffness independent of that resulting from implantation. Samples are then prepared for implantation, handling, and testing (Figure 1A). Each sample is implanted into the cerebral cortex of rats, which is represented here as an explanted rat brain, for a specified duration (Figure 1B). At this point, the sample is explanted and immediately loaded into the microtensile tester, and then subjected to tensile testing (Figure 1C). Subsequent data analysis provides insight into the mechanical behavior of these innovative materials in the environment of the cerebral cortex.
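As a rough illustration of the tensile analysis underlying this method, the Young's modulus can be estimated as the slope of the stress-strain curve in the linear elastic region. The sketch below is a minimal example using made-up sample dimensions and synthetic force-displacement data, not values from the protocol itself.

```python
# Hypothetical sketch: estimating Young's modulus from microtensile
# force-displacement data. Sample dimensions and data are assumptions
# for illustration only.

def youngs_modulus(forces_n, displacements_m, area_m2, gauge_length_m):
    """Least-squares slope of the stress-strain curve, in Pa."""
    stresses = [f / area_m2 for f in forces_n]               # stress = F / A
    strains = [d / gauge_length_m for d in displacements_m]  # strain = dL / L0
    n = len(strains)
    mean_e = sum(strains) / n
    mean_s = sum(stresses) / n
    num = sum((e - mean_e) * (s - mean_s) for e, s in zip(strains, stresses))
    den = sum((e - mean_e) ** 2 for e in strains)
    return num / den

# Toy data: 100 um x 50 um cross-section, 2 mm gauge length, and a
# linear response corresponding to ~5 GPa (a dry, glassy nanocomposite).
area = 100e-6 * 50e-6   # m^2
L0 = 2e-3               # m
E_true = 5e9            # Pa
strains = [0.001 * i for i in range(1, 6)]
forces = [E_true * eps * area for eps in strains]
displacements = [eps * L0 for eps in strains]
E_est = youngs_modulus(forces, displacements, area, L0)
print(f"Estimated modulus: {E_est / 1e9:.2f} GPa")
```

In practice the fit would be restricted to the initial linear portion of the measured curve, since the softened (wet, warm) nanocomposite yields at low strain.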
Bioengineering, Issue 78, Biophysics, Biomedical Engineering, Molecular Biology, Cellular Biology, Electrical Engineering, Materials Science, Nanotechnology, Nanocomposites, Electrodes, Implanted, Neural Prostheses, Micro-Electrical-Mechanical Systems, Implants, Experimental, mechanical properties (composite materials), Dynamic materials, polymer nanocomposite, Young's modulus, modulus of elasticity, intracortical microelectrode, polymers, biomaterials
Play Button
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims to improve stability through minimization of potential energy in sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
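The sequence-selection stage described above can be pictured as a discrete optimization over sequence space. The toy sketch below only illustrates that idea: the energy function is entirely hypothetical (real design methods use physics-based pairwise potentials and integer optimization, not this surrogate), and a single-pass greedy search stands in for the actual solver.

```python
# Toy illustration of sequence selection as energy minimization.
# The "energy" here is a made-up surrogate, not a real potential.

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def toy_energy(seq):
    # Hypothetical surrogate: reward hydrophobic residues at even
    # positions (stands in for a physics-based pairwise potential).
    hydrophobic = set("AVILMFWY")
    return -sum(1 for i, aa in enumerate(seq)
                if i % 2 == 0 and aa in hydrophobic)

def greedy_redesign(seq, positions):
    """Single pass: at each allowed position, pick the residue that
    minimizes the toy energy, holding the rest of the sequence fixed."""
    seq = list(seq)
    for pos in positions:
        seq[pos] = min(AMINO_ACIDS,
                       key=lambda aa: toy_energy(seq[:pos] + [aa] + seq[pos + 1:]))
    return "".join(seq)

native = "GSGSGSGSGS"
designed = greedy_redesign(native, positions=range(len(native)))
print(native, toy_energy(native), "->", designed, toy_energy(designed))
```

The designed sequence never has higher toy energy than the native one; the real workflow then filters such candidates through the fold-specificity and binding-affinity stages.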
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Play Button
Isolation, Purification and Labeling of Mouse Bone Marrow Neutrophils for Functional Studies and Adoptive Transfer Experiments
Authors: Muthulekha Swamydas, Michail S. Lionakis.
Institutions: National Institute of Allergy and Infectious Diseases, NIH.
Neutrophils are critical effector cells of the innate immune system. They are rapidly recruited to sites of acute inflammation and exert protective or pathogenic effects depending on the inflammatory milieu. Nonetheless, despite the indispensable role of neutrophils in immunity, detailed understanding of the molecular factors that mediate neutrophils' effector and immunopathogenic effects in different infectious diseases and inflammatory conditions is still lacking, partly because of their short half-life, the difficulty of handling these cells, and the lack of reliable experimental protocols for obtaining sufficient numbers of neutrophils for downstream functional studies and adoptive transfer experiments. Therefore, simple, fast, economical and reliable methods are highly desirable for harvesting sufficient numbers of mouse neutrophils for assessing functions such as phagocytosis, killing, cytokine production, degranulation and trafficking. To that end, we present a reproducible density gradient centrifugation-based protocol, which can be adapted in any laboratory, to isolate large numbers of neutrophils from the bone marrow of mice with high purity and viability. Moreover, we present a simple protocol that uses CellTracker dyes to label the isolated neutrophils, which can then be adoptively transferred into recipient mice and tracked in several tissues for at least 4 hr post-transfer using flow cytometry. Using this approach, differential labeling of neutrophils from wild-type and gene-deficient mice with different CellTracker dyes can be successfully employed to perform competitive repopulation studies for evaluating the direct role of specific genes in trafficking of neutrophils from the blood into target tissues in vivo.
Immunology, Issue 77, Cellular Biology, Infection, Infectious Diseases, Molecular Biology, Medicine, Biomedical Engineering, Bioengineering, Neutrophils, Adoptive Transfer, immunology, Neutrophils, mouse, bone marrow, adoptive transfer, density gradient, labeling, CellTracker, cell, isolation, flow cytometry, animal model
Play Button
Microwave-assisted Functionalization of Poly(ethylene glycol) and On-resin Peptides for Use in Chain Polymerizations and Hydrogel Formation
Authors: Amy H. Van Hove, Brandon D. Wilson, Danielle S. W. Benoit.
Institutions: University of Rochester, University of Rochester, University of Rochester Medical Center.
One of the main benefits to using poly(ethylene glycol) (PEG) macromers in hydrogel formation is synthetic versatility. The ability to draw from a large variety of PEG molecular weights and configurations (arm number, arm length, and branching pattern) affords researchers tight control over resulting hydrogel structures and properties, including Young’s modulus and mesh size. This video will illustrate a rapid, efficient, solvent-free, microwave-assisted method to methacrylate PEG precursors into poly(ethylene glycol) dimethacrylate (PEGDM). This synthetic method provides much-needed starting materials for applications in drug delivery and regenerative medicine. The demonstrated method is superior to traditional methacrylation methods as it is significantly faster and simpler, as well as more economical and environmentally friendly, using smaller amounts of reagents and solvents. We will also demonstrate an adaptation of this technique for on-resin methacrylamide functionalization of peptides. This on-resin method allows the N-terminus of peptides to be functionalized with methacrylamide groups prior to deprotection and cleavage from resin. This allows for selective addition of methacrylamide groups to the N-termini of the peptides while amino acids with reactive side groups (e.g. primary amine of lysine, primary alcohol of serine, secondary alcohols of threonine, and phenol of tyrosine) remain protected, preventing functionalization at multiple sites. This article will detail common analytical methods (proton Nuclear Magnetic Resonance spectroscopy (1H-NMR) and Matrix Assisted Laser Desorption Ionization Time of Flight mass spectrometry (MALDI-ToF)) to assess the efficiency of the functionalizations. Common pitfalls and suggested troubleshooting methods will be addressed, as will modifications of the technique which can be used to further tune macromer functionality and resulting hydrogel physical and chemical properties.
Use of synthesized products for the formation of hydrogels for drug delivery and cell-material interaction studies will be demonstrated, with particular attention paid to modifying hydrogel composition to affect mesh size, controlling hydrogel stiffness and drug release.
Chemistry, Issue 80, Poly(ethylene glycol), peptides, polymerization, polymers, methacrylation, peptide functionalization, 1H-NMR, MALDI-ToF, hydrogels, macromer synthesis
Play Button
Cross-Modal Multivariate Pattern Analysis
Authors: Kaspar Meyer, Jonas T. Kaplan.
Institutions: University of Southern California.
Multivariate pattern analysis (MVPA) is an increasingly popular method of analyzing functional magnetic resonance imaging (fMRI) data1-4. Typically, the method is used to identify a subject's perceptual experience from neural activity in certain regions of the brain. For instance, it has been employed to predict the orientation of visual gratings a subject perceives from activity in early visual cortices5 or, analogously, the content of speech from activity in early auditory cortices6. Here, we present an extension of the classical MVPA paradigm, according to which perceptual stimuli are not predicted within, but across sensory systems. Specifically, the method we describe addresses the question of whether stimuli that evoke memory associations in modalities other than the one through which they are presented induce content-specific activity patterns in the sensory cortices of those other modalities. For instance, seeing a muted video clip of a glass vase shattering on the ground automatically triggers in most observers an auditory image of the associated sound; is the experience of this image in the "mind's ear" correlated with a specific neural activity pattern in early auditory cortices? Furthermore, is this activity pattern distinct from the pattern that could be observed if the subject were, instead, watching a video clip of a howling dog? In two previous studies7,8, we were able to predict sound- and touch-implying video clips based on neural activity in early auditory and somatosensory cortices, respectively. Our results are in line with a neuroarchitectural framework proposed by Damasio9,10, according to which the experience of mental images that are based on memories - such as hearing the shattering sound of a vase in the "mind's ear" upon seeing the corresponding video clip - is supported by the re-construction of content-specific neural activity patterns in early sensory cortices.
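The cross-modal prediction described above can be illustrated with a toy decoding sketch: a classifier is trained on content-specific voxel patterns from one condition and tested on patterns from another. Everything below is simulated data, and a simple nearest-centroid rule stands in for the actual classifier used in the studies.

```python
# Toy cross-modal MVPA sketch on synthetic "voxel" data.
import numpy as np

rng = np.random.default_rng(0)
n_voxels, n_trials, noise = 50, 20, 0.5

# Hypothetical content-specific patterns in early auditory cortex.
pattern_glass = rng.normal(size=n_voxels)  # "shattering vase"
pattern_dog = rng.normal(size=n_voxels)    # "howling dog"

def simulate(pattern, n):
    """Noisy single-trial activity patterns around a template."""
    return pattern + noise * rng.normal(size=(n, n_voxels))

# Train on (simulated) sound-evoked trials...
train = {"glass": simulate(pattern_glass, n_trials),
         "dog": simulate(pattern_dog, n_trials)}
centroids = {label: x.mean(axis=0) for label, x in train.items()}

def predict(trial):
    """Assign the label of the nearest class centroid."""
    return min(centroids, key=lambda c: np.linalg.norm(trial - centroids[c]))

# ...then test on (simulated) muted-video trials: the cross-modal step.
test_trials = ([("glass", t) for t in simulate(pattern_glass, n_trials)] +
               [("dog", t) for t in simulate(pattern_dog, n_trials)])
accuracy = np.mean([predict(t) == label for label, t in test_trials])
print(f"Cross-modal decoding accuracy: {accuracy:.2f}")
```

Above-chance accuracy on the held-out modality is what indicates that the imagery-related patterns carry content-specific information; real analyses additionally use proper cross-validation and permutation testing.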
Neuroscience, Issue 57, perception, sensory, cross-modal, top-down, mental imagery, fMRI, MRI, neuroimaging, multivariate pattern analysis, MVPA
Play Button
Functional Mapping with Simultaneous MEG and EEG
Authors: Hesheng Liu, Naoaki Tanaka, Steven Stufflebeam, Seppo Ahlfors, Matti Hämäläinen.
Institutions: MGH - Massachusetts General Hospital.
We use magnetoencephalography (MEG) and electroencephalography (EEG) to locate the brain areas involved in the processing of simple sensory stimuli and to determine their temporal evolution. We use somatosensory stimuli to locate the hand somatosensory areas, auditory stimuli to locate the auditory cortices, and visual stimuli in the four quadrants of the visual field to locate the early visual areas. These types of experiments are used for functional mapping in epilepsy and brain tumor patients to locate eloquent cortices. In basic neuroscience, similar experimental protocols are used to study the orchestration of cortical activity. The acquisition protocol includes quality assurance procedures, subject preparation for the combined MEG/EEG study, and acquisition of evoked-response data with somatosensory, auditory, and visual stimuli. We also demonstrate analysis of the data using the equivalent current dipole model and cortically-constrained minimum-norm estimates. Anatomical MRI data are employed in the analysis for visualization, for deriving tissue boundaries for forward modeling, and for cortical location and orientation constraints for the minimum-norm estimates.
JoVE neuroscience, Issue 40, neuroscience, brain, MEG, EEG, functional imaging
Play Button
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Authors: Wenan Chen, Ashwin Belle, Charles Cockrell, Kevin R. Ward, Kayvan Najarian.
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: a midline shift estimation and an intracranial pressure (ICP) pre-screening system. To estimate the midline shift, an estimation of the ideal midline is first performed based on the symmetry of the skull and anatomical features in the brain CT scan. Then, segmentation of the ventricles from the CT scan is performed and used as a guide for the identification of the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in evaluation. In the second component, additional features related to ICP are extracted, such as texture information and blood amount from the CT scans; other recorded features, such as age and injury severity score, are also incorporated to estimate the ICP. Machine learning techniques, including feature selection and classification with methods such as Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the prediction shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step to help physicians make decisions, such as recommending for or against invasive ICP monitoring.
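As a minimal illustration of the midline-shift measurement, the shift can be expressed as the pixel distance between the ideal (symmetry-derived) midline and the actual (ventricle-derived) midline, scaled by the in-plane resolution. The values below are assumptions for demonstration, not data from the paper.

```python
# Minimal sketch of the midline-shift idea. The midline positions and
# pixel spacing here are illustrative assumptions.

def midline_shift_mm(ideal_midline_px, actual_midline_px, pixel_spacing_mm):
    """Horizontal displacement of the actual midline from the ideal
    midline, converted from pixels to millimeters."""
    return abs(actual_midline_px - ideal_midline_px) * pixel_spacing_mm

# Example: ideal midline at image column 256, ventricle-derived midline
# at column 268, with an assumed 0.45 mm/pixel in-plane CT resolution.
shift = midline_shift_mm(256, 268, 0.45)
print(f"Midline shift: {shift:.1f} mm")
```

Shifts above roughly 5 mm are often treated as clinically significant, which is why an automated estimate of this quantity can serve as a pre-screening signal.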
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there is simply no content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.