JoVE Visualize

Pubmed Article
An integrated approach for platoon-based simulation and its feasibility assessment.
PLoS ONE
PUBLISHED: 03-19-2015
Research on developing mathematical and simulation models to evaluate the performance of signalized arterials is still ongoing. In this paper, an integrated model (IM) based on the Rakha vehicle dynamics model and the LWR model is proposed. The IM, which imitates actuated performance measurement on signalized arterials, is described using a continuous timed Petri net with variable speeds (VCPN). This enables a systematic, discretized description of platoon movement from an upstream signalized intersection towards a downstream signalized intersection. The integration is based on the notion that speed and travel time characteristics in a link can be provided by the Rakha model, which assists the LWR model in estimating arrival profiles of vehicles at the downstream intersection. One immediate benefit of the model is that the platoon arrival profile obtained from the IM can be directly manipulated to estimate queues and delays at the target intersection using input-output analysis, without considering the effect of shockwaves. This is less tedious than analysing the LWR model by tracing shockwave trajectories. In addition, the time parameters of a platoon can be estimated for a self-scheduling control approach on a cycle-by-cycle basis. The proposed IM is applied to a test intersection, where simulated queues and average delays from the IM are compared with the platoon dispersion model (PDM) implemented in TRANSYT, the cell transmission model (CTM), and HCM2000 for both under-saturated and oversaturated situations. The comparisons yielded acceptable and reasonable results, thus ascertaining the feasibility and validity of the model.
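As a concrete illustration of the input-output analysis mentioned above, the Python sketch below turns a platoon arrival profile into queue and delay estimates at a signalized approach. It is not the authors' integrated model; the arrival profile, saturation flow and signal timing are made-up values, and a real application would take the arrival profile from the IM.

# Minimal sketch of input-output queue and delay estimation at a signalized
# intersection (illustrative only; not the published integrated model).
def input_output_analysis(arrivals, saturation_flow, green, dt=1.0):
    """arrivals: vehicles arriving per time step (e.g., a platoon arrival profile)
    saturation_flow: maximum departures per time step during green
    green: sequence of booleans, True when the signal is green
    Returns (queue profile, total delay in vehicle-seconds)."""
    queue = 0.0
    queue_profile = []
    total_delay = 0.0
    for a, g in zip(arrivals, green):
        queue += a                                    # cumulative arrivals
        served = min(queue, saturation_flow) if g else 0.0
        queue -= served                               # cumulative departures
        queue_profile.append(queue)
        total_delay += queue * dt                     # vehicles waiting during this step
    return queue_profile, total_delay

# Example: a 60 s cycle (1 s steps), a platoon of 2 veh/s for 15 s,
# red for the first 30 s, then green with a saturation flow of 1.8 veh/s.
arrivals = [2.0] * 15 + [0.0] * 45
green = [False] * 30 + [True] * 30
profile, delay = input_output_analysis(arrivals, 1.8, green)
print(f"max queue: {max(profile):.1f} veh, total delay: {delay:.0f} veh-s")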
Related JoVE Video
Authors: Brenda M. Geiger, Lauren E. Frank, Angela D. Caldera-Siu, Emmanuel N. Pothos.
Published: 10-06-2008
ABSTRACT
The ability to measure extracellular basal levels of neurotransmitters in the brain of awake animals allows for the determination of effects of different systemic challenges (pharmacological or physiological) to the CNS. For example, one can directly measure how the animal's midbrain dopamine projections respond to dopamine-releasing drugs like d-amphetamine or natural stimuli like food. In this video, we show you how to implant guide cannulas targeting specific sites in the rat brain, how to insert and implant a microdialysis probe and how to use high performance liquid chromatography coupled with electrochemical detection (HPLC-EC) to measure extracellular levels of oxidizable neurotransmitters and metabolites. Precise local introduction of drugs through the microdialysis probe allows for refined work on site specificity in a compound's mechanism of action. This technique has excellent anatomical and chemical resolution but only modest time resolution, as microdialysis samples are usually processed every 20-30 minutes to ensure detectable neurotransmitter levels. Complementary ex vivo tools (i.e., slice and cell culture electrophysiology) can assist with monitoring real-time neurotransmission.
26 Related JoVE Articles!
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
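For readers unfamiliar with DoE terminology, the short Python sketch below generates a plain two-level full-factorial design over a few hypothetical factors of the kind named above (promoter, 5'UTR, incubation temperature). The factor names and levels are illustrative placeholders; the study itself relied on software-guided optimal designs and stepwise augmentation rather than a simple full factorial.

# Two-level full-factorial design matrix over hypothetical factors (illustration only).
from itertools import product

factors = {
    "promoter": ["35S", "nos"],             # hypothetical levels
    "utr5": ["omega", "native"],
    "incubation_temp_C": [22, 25],
}
design = [dict(zip(factors, combo)) for combo in product(*factors.values())]
for run_id, run in enumerate(design, start=1):
    print(run_id, run)                      # 2 x 2 x 2 = 8 experimental runs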
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Scalable Nanohelices for Predictive Studies and Enhanced 3D Visualization
Authors: Kwyn A. Meagher, Benjamin N. Doblack, Mercedes Ramirez, Lilian P. Davila.
Institutions: University of California Merced, University of California Merced.
Spring-like materials are ubiquitous in nature and of interest in nanotechnology for energy harvesting, hydrogen storage, and biological sensing applications.  For predictive simulations, it has become increasingly important to be able to model the structure of nanohelices accurately.  To study the effect of local structure on the properties of these complex geometries one must develop realistic models.  To date, software packages are rather limited in creating atomistic helical models.  This work focuses on producing atomistic models of silica glass (SiO2) nanoribbons and nanosprings for molecular dynamics (MD) simulations. Using an MD model of “bulk” silica glass, two computational procedures to precisely create the shape of nanoribbons and nanosprings are presented.  The first method employs the AWK programming language and open-source software to effectively carve various shapes of silica nanoribbons from the initial bulk model, using desired dimensions and parametric equations to define a helix.  With this method, accurate atomistic silica nanoribbons can be generated for a range of pitch values and dimensions.  The second method involves a more robust code which allows flexibility in modeling nanohelical structures.  This approach utilizes a C++ code particularly written to implement pre-screening methods as well as the mathematical equations for a helix, resulting in greater precision and efficiency when creating nanospring models.  Using these codes, well-defined and scalable nanoribbons and nanosprings suited for atomistic simulations can be effectively created.  An added value in both open-source codes is that they can be adapted to reproduce different helical structures, independent of material.  In addition, a MATLAB graphical user interface (GUI) is used to enhance learning through visualization and interaction for a general user with the atomistic helical structures.  One application of these methods is the recent study of nanohelices via MD simulations for mechanical energy harvesting purposes.
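The sketch below illustrates the carving idea in Python rather than the AWK and C++ codes described above: atoms of a bulk coordinate set are kept if they lie within a chosen wire radius of a parametric helix centerline. The radius, pitch, number of turns and the random "bulk" block are placeholder values, not the silica glass model used in the work.

# Carve a nanospring from a bulk atom block using a parametric helix (illustrative sketch).
import numpy as np
from scipy.spatial import cKDTree

def carve_nanospring(coords, R=20.0, pitch=15.0, turns=3, r_wire=5.0, samples=2000):
    # sample the helix centerline x = R cos t, y = R sin t, z = (pitch / 2*pi) * t
    t = np.linspace(0.0, 2 * np.pi * turns, samples)
    helix = np.stack([R * np.cos(t), R * np.sin(t), pitch * t / (2 * np.pi)], axis=1)
    # keep atoms whose distance to the nearest centerline point is within the wire radius
    d, _ = cKDTree(helix).query(coords)
    return coords[d <= r_wire]

# random block standing in for the bulk silica glass MD model
rng = np.random.default_rng(0)
bulk = rng.uniform([-30, -30, 0], [30, 30, 50], size=(20000, 3))
spring = carve_nanospring(bulk)
print(f"{len(spring)} of {len(bulk)} atoms kept")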
Physics, Issue 93, Helical atomistic models; open-source coding; graphical user interface; visualization software; molecular dynamics simulations; graphical processing unit accelerated simulations.
A Coupled Experiment-finite Element Modeling Methodology for Assessing High Strain Rate Mechanical Response of Soft Biomaterials
Authors: Rajkumar Prabhu, Wilburn R. Whittington, Sourav S. Patnaik, Yuxiong Mao, Mark T. Begonia, Lakiesha N. Williams, Jun Liao, M. F. Horstemeyer.
Institutions: Mississippi State University, Mississippi State University.
This study offers a combined experimental and finite element (FE) simulation approach for examining the mechanical behavior of soft biomaterials (e.g. brain, liver, tendon, fat, etc.) when exposed to high strain rates. This study utilized a Split-Hopkinson Pressure Bar (SHPB) to generate strain rates of 100-1,500 sec⁻¹. The SHPB employed a striker bar consisting of a viscoelastic material (polycarbonate). A sample of the biomaterial was obtained shortly postmortem and prepared for SHPB testing. The specimen was interposed between the incident and transmitted bars, and the pneumatic components of the SHPB were activated to drive the striker bar toward the incident bar. The resulting impact generated a compressive stress wave (i.e. incident wave) that traveled through the incident bar. When the compressive stress wave reached the end of the incident bar, a portion continued forward through the sample and transmitted bar (i.e. transmitted wave) while another portion reversed through the incident bar as a tensile wave (i.e. reflected wave). These waves were measured using strain gages mounted on the incident and transmitted bars. The true stress-strain behavior of the sample was determined from equations based on wave propagation and dynamic force equilibrium. The experimental stress-strain response was three-dimensional in nature because the specimen bulged. As such, the hydrostatic stress (first invariant) was used to generate the stress-strain response. In order to extract the uniaxial (one-dimensional) mechanical response of the tissue, an iterative coupled optimization was performed using experimental results and Finite Element Analysis (FEA), which contained an Internal State Variable (ISV) material model used for the tissue. The ISV material model used in the FE simulations of the experimental setup was iteratively calibrated (i.e. optimized) to the experimental data such that the experiment and FEA strain gage values and first invariant of stresses were in good agreement.
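For orientation, the conventional one-dimensional SHPB data-reduction equations can be written in a few lines. The sketch below assumes linearly elastic bars and dynamic force equilibrium; a viscoelastic bar system such as the one described above would additionally require dispersion corrections not shown here, and all material and geometry values are illustrative placeholders.

# Conventional 1D SHPB reduction of reflected/transmitted strain-gage signals (sketch).
import numpy as np

def shpb_reduce(t, eps_r, eps_t, E_bar, rho_bar, A_bar, A_spec, L_spec):
    c0 = np.sqrt(E_bar / rho_bar)                        # bar wave speed
    strain_rate = -2.0 * c0 / L_spec * eps_r             # specimen strain rate from reflected wave
    # specimen strain: trapezoidal time integral of the strain rate
    strain = np.concatenate(([0.0],
        np.cumsum(0.5 * (strain_rate[1:] + strain_rate[:-1]) * np.diff(t))))
    stress = E_bar * (A_bar / A_spec) * eps_t             # specimen stress from transmitted wave
    return strain, stress, strain_rate

# toy signals: 200 us window sampled at 1 MHz, rectangular reflected/transmitted pulses
t = np.linspace(0.0, 200e-6, 201)
eps_r = np.where((t > 20e-6) & (t < 120e-6), -0.001, 0.0)
eps_t = np.where((t > 20e-6) & (t < 120e-6), 0.0005, 0.0)
strain, stress, rate = shpb_reduce(t, eps_r, eps_t,
                                   E_bar=2.4e9, rho_bar=1200.0,   # polycarbonate-like bar (placeholder)
                                   A_bar=2.85e-4, A_spec=1.77e-4, L_spec=5e-3)
print(f"strain rate ~ {rate.max():.0f} 1/s, peak stress ~ {stress.max()/1e3:.0f} kPa")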
Bioengineering, Issue 99, Split-Hopkinson Pressure Bar, High Strain Rate, Finite Element Modeling, Soft Biomaterials, Dynamic Experiments, Internal State Variable Modeling, Brain, Liver, Tendon, Fat
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory, Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
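As a generic illustration of the source reconstruction step, the numpy sketch below applies a textbook minimum-norm inverse operator, W = L^T (L L^T + λ² C)^(-1), to simulated sensor data given a leadfield L from a head model. This is not the London Baby Lab pipeline; real analyses use dedicated EEG packages together with individual or age-appropriate head models, and the leadfield, noise covariance and SNR below are synthetic.

# Textbook minimum-norm inverse operator applied to synthetic data (sketch only).
import numpy as np

def minimum_norm_inverse(L, noise_cov, snr=3.0):
    lam2 = 1.0 / snr ** 2
    gram = L @ L.T
    # scale the regularization to the leadfield so lambda^2 stays dimensionless
    reg = lam2 * (np.trace(gram) / np.trace(noise_cov)) * noise_cov
    return L.T @ np.linalg.inv(gram + reg)

rng = np.random.default_rng(1)
L = rng.standard_normal((64, 5000))        # leadfield: 64 channels x 5000 cortical sources
C = np.eye(64)                             # assume whitened (identity) sensor noise covariance
W = minimum_norm_inverse(L, C)
eeg_sample = rng.standard_normal((64, 1))  # one time sample of channel-level data
source_estimate = W @ eeg_sample           # 5000 source current estimates
print(source_estimate.shape)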
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Tissue-simulating Phantoms for Assessing Potential Near-infrared Fluorescence Imaging Applications in Breast Cancer Surgery
Authors: Rick Pleijhuis, Arwin Timmermans, Johannes De Jong, Esther De Boer, Vasilis Ntziachristos, Gooitzen Van Dam.
Institutions: University Medical Center Groningen, Technical University of Munich.
Inaccuracies in intraoperative tumor localization and evaluation of surgical margin status result in suboptimal outcome of breast-conserving surgery (BCS). Optical imaging, in particular near-infrared fluorescence (NIRF) imaging, might reduce the frequency of positive surgical margins following BCS by providing the surgeon with a tool for pre- and intraoperative tumor localization in real-time. In the current study, the potential of NIRF-guided BCS is evaluated using tissue-simulating breast phantoms for reasons of standardization and training purposes. Breast phantoms with optical characteristics comparable to those of normal breast tissue were used to simulate breast conserving surgery. Tumor-simulating inclusions containing the fluorescent dye indocyanine green (ICG) were incorporated in the phantoms at predefined locations and imaged for pre- and intraoperative tumor localization, real-time NIRF-guided tumor resection, NIRF-guided evaluation on the extent of surgery, and postoperative assessment of surgical margins. A customized NIRF camera was used as a clinical prototype for imaging purposes. Breast phantoms containing tumor-simulating inclusions offer a simple, inexpensive, and versatile tool to simulate and evaluate intraoperative tumor imaging. The gelatinous phantoms have elastic properties similar to human tissue and can be cut using conventional surgical instruments. Moreover, the phantoms contain hemoglobin and intralipid for mimicking absorption and scattering of photons, respectively, creating uniform optical properties similar to human breast tissue. The main drawback of NIRF imaging is the limited penetration depth of photons when propagating through tissue, which hinders (noninvasive) imaging of deep-seated tumors with epi-illumination strategies.
Medicine, Issue 91, Breast cancer, tissue-simulating phantoms, NIRF imaging, tumor-simulating inclusions, fluorescence, intraoperative imaging
3D Orbital Tracking in a Modified Two-photon Microscope: An Application to the Tracking of Intracellular Vesicles
Authors: Andrea Anzalone, Paolo Annibale, Enrico Gratton.
Institutions: University of California, Irvine.
The objective of this video protocol is to discuss how to perform and analyze a three-dimensional fluorescent orbital particle tracking experiment using a modified two-photon microscope [1]. As opposed to conventional approaches (raster scan or wide field based on a stack of frames), 3D orbital tracking allows one to localize and follow, with high spatial (10 nm accuracy) and temporal (50 Hz frequency response) resolution, the 3D displacement of a moving fluorescent particle on length-scales of hundreds of microns [2]. The method is based on a feedback algorithm that controls the hardware of a two-photon laser scanning microscope in order to perform a circular orbit around the object to be tracked: the feedback mechanism maintains the fluorescent object in the center by controlling the displacement of the scanning beam [3-5]. To demonstrate the advantages of this technique, we followed a fast-moving organelle, the lysosome, within a living cell [6,7]. Cells were plated according to standard protocols and stained using a commercially available lysosome dye. We briefly discuss the hardware configuration and, in more detail, the control software used to perform a 3D orbital tracking experiment inside living cells. We discuss in detail the parameters required to control the scanning microscope and enable the motion of the beam in a closed orbit around the particle. We conclude by demonstrating how this method can be effectively used to track the fast motion of a labeled lysosome along microtubules in 3D within a live cell. Lysosomes can move with speeds in the range of 0.4-0.5 µm/sec, typically displaying directed motion along the microtubule network [8].
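The sketch below illustrates the feedback principle in two dimensions: a particle displaced from the orbit center modulates the fluorescence collected along the orbit at the orbit frequency, and the first Fourier harmonic of that modulation gives the direction and (approximately) the magnitude of the offset, which the loop uses to re-center the orbit. The beam waist, orbit radius and particle position are arbitrary demonstration values, not instrument parameters from the protocol.

# 2D toy model of the orbital-tracking feedback loop (illustrative sketch).
import numpy as np

def intensity_on_orbit(center, particle, radius=0.3, waist=0.3, n=64):
    # fluorescence collected while the beam moves around one circular orbit
    phi = np.linspace(0, 2 * np.pi, n, endpoint=False)
    beam = center + radius * np.stack([np.cos(phi), np.sin(phi)], axis=1)
    r2 = np.sum((beam - particle) ** 2, axis=1)
    return np.exp(-2 * r2 / waist ** 2), phi          # Gaussian excitation profile

def estimate_offset(intensity, phi, radius=0.3, waist=0.3):
    # first Fourier harmonic around the orbit: its phase points toward the particle,
    # its magnitude scales (for small offsets) with the offset length
    c = np.sum(intensity * np.exp(-1j * phi)) / len(phi)
    dc = np.mean(intensity)
    return (waist ** 2 / (2 * radius)) * np.array([c.real, -c.imag]) / dc

particle = np.array([0.05, -0.03])                    # true position (µm), unknown to the tracker
center = np.array([0.0, 0.0])                         # initial orbit center
for _ in range(5):                                    # closed feedback loop: re-center the orbit
    I, phi = intensity_on_orbit(center, particle)
    center = center + estimate_offset(I, phi)
print("orbit center after feedback:", np.round(center, 4))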
Bioengineering, Issue 92, fluorescence, single particle tracking, laser scanning microscope, two-photon, vesicle transport, live-cell imaging, optics
Getting to Compliance in Forced Exercise in Rodents: A Critical Standard to Evaluate Exercise Impact in Aging-related Disorders and Disease
Authors: Jennifer C. Arnold, Michael F. Salvatore.
Institutions: Louisiana State University Health Sciences Center.
There is a major increase in awareness of the positive impact of exercise on improving several disease states with a neurobiological basis; these include improvements in cognitive function and physical performance. As a result, there is an increase in the number of animal studies employing exercise. It is argued that one intrinsic value of forced exercise is that the investigator has control over the factors that can influence the impact of exercise on behavioral outcomes, notably the frequency, duration, and intensity of the exercise regimen. However, compliance in forced exercise regimens may be an issue, particularly if potential confounds of employing foot-shock are to be avoided. It is also important to consider that since most cognitive and locomotor impairments strike in the aged individual, determining the impact of exercise on these impairments should consider using aged rodents with the highest possible level of compliance, to minimize the number of test subjects needed. Here, the pertinent steps and considerations necessary to achieve nearly 100% compliance to treadmill exercise in an aged rodent model will be presented and discussed. Notwithstanding the particular exercise regimen being employed by the investigator, our protocol should be of use to investigators who are particularly interested in the potential impact of forced exercise on aging-related impairments, including aging-related Parkinsonism and Parkinson’s disease.
Behavior, Issue 90, Exercise, locomotor, Parkinson’s disease, aging, treadmill, bradykinesia, Parkinsonism
Workflow for High-content, Individual Cell Quantification of Fluorescent Markers from Universal Microscope Data, Supported by Open Source Software
Authors: Simon R. Stockwell, Sibylle Mittnacht.
Institutions: UCL Cancer Institute.
Advances in understanding the control mechanisms governing the behavior of cells in adherent mammalian tissue culture models are becoming increasingly dependent on modes of single-cell analysis. Methods which deliver composite data reflecting the mean values of biomarkers from cell populations risk losing subpopulation dynamics that reflect the heterogeneity of the studied biological system. In keeping with this, traditional approaches are being replaced by, or supported with, more sophisticated forms of cellular assay developed to allow assessment by high-content microscopy. These assays potentially generate large numbers of images of fluorescent biomarkers which, when processed by accompanying proprietary software packages, allow for multi-parametric measurements per cell. However, the relatively high capital costs and overspecialization of many of these devices have prevented their accessibility to many investigators. Described here is a universally applicable workflow for the quantification of multiple fluorescent marker intensities from specific subcellular regions of individual cells, suitable for use with images from most fluorescent microscopes. Key to this workflow is the implementation of the freely available CellProfiler software [1] to distinguish individual cells in these images, segment them into defined subcellular regions and deliver fluorescence marker intensity values specific to these regions. The extraction of individual cell intensity values from image data is the central purpose of this workflow and will be illustrated with the analysis of control data from a siRNA screen for G1 checkpoint regulators in adherent human cells. However, the workflow presented here can be applied to analysis of data from other means of cell perturbation (e.g., compound screens) and other forms of fluorescence-based cellular markers and thus should be useful for a wide range of laboratories.
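The published workflow is built around the CellProfiler GUI; purely as a compact scripted analogue of its core steps, the Python/scikit-image sketch below segments nuclei in one channel and reports a per-cell mean intensity of a second marker channel on synthetic images.

# Per-cell fluorescence quantification on synthetic two-channel data (illustrative sketch).
import numpy as np
from skimage import filters, measure, morphology

rng = np.random.default_rng(0)
dna = np.zeros((256, 256))
marker = rng.normal(0.05, 0.01, dna.shape)              # background in the marker channel
for cx, cy in rng.integers(20, 236, size=(12, 2)):      # 12 fake nuclei
    rr, cc = np.ogrid[:256, :256]
    disk = (rr - cx) ** 2 + (cc - cy) ** 2 < 8 ** 2
    dna[disk] = 1.0
    marker[disk] += rng.uniform(0.1, 1.0)               # variable marker level per cell

# segment nuclei, label them, and measure the marker intensity inside each object
mask = morphology.remove_small_objects(dna > filters.threshold_otsu(dna), 20)
labels = measure.label(mask)
for region in measure.regionprops(labels, intensity_image=marker):
    print(f"cell {region.label}: mean marker intensity {region.mean_intensity:.3f}")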
Cellular Biology, Issue 94, Image analysis, High-content analysis, Screening, Microscopy, Individual cell analysis, Multiplexed assays
Evaluation of a Novel Laser-assisted Coronary Anastomotic Connector - the Trinity Clip - in a Porcine Off-pump Bypass Model
Authors: David Stecher, Glenn Bronkers, Jappe O.T. Noest, Cornelis A.F. Tulleken, Imo E. Hoefer, Lex A. van Herwerden, Gerard Pasterkamp, Marc P. Buijsrogge.
Institutions: University Medical Center Utrecht, Vascular Connect b.v., University Medical Center Utrecht, University Medical Center Utrecht.
To simplify and facilitate beating heart (i.e., off-pump), minimally invasive coronary artery bypass surgery, a new coronary anastomotic connector, the Trinity Clip, is developed based on the excimer laser-assisted nonocclusive anastomosis technique. The Trinity Clip connector enables simplified, sutureless, and nonocclusive connection of the graft to the coronary artery, and an excimer laser catheter laser-punches the opening of the anastomosis. Consequently, owing to the complete nonocclusive anastomosis construction, coronary conditioning (i.e., occluding or shunting) is not necessary, in contrast to the conventional anastomotic technique, hence simplifying the off-pump bypass procedure. Prior to clinical application in coronary artery bypass grafting, the safety and quality of this novel connector will be evaluated in a long-term experimental porcine off-pump coronary artery bypass (OPCAB) study. In this paper, we describe how to evaluate the coronary anastomosis in the porcine OPCAB model using various techniques to assess its quality. Representative results are summarized and visually demonstrated.
Medicine, Issue 93, Anastomosis, coronary, anastomotic connector, anastomotic coupler, excimer laser-assisted nonocclusive anastomosis (ELANA), coronary artery bypass graft (CABG), off-pump coronary artery bypass (OPCAB), beating heart surgery, excimer laser, porcine model, experimental, medical device
Automated Quantification of Hematopoietic Cell – Stromal Cell Interactions in Histological Images of Undecalcified Bone
Authors: Sandra Zehentmeier, Zoltan Cseresnyes, Juan Escribano Navarro, Raluca A. Niesner, Anja E. Hauser.
Institutions: German Rheumatism Research Center, a Leibniz Institute, German Rheumatism Research Center, a Leibniz Institute, Max-Delbrück Center for Molecular Medicine, Wimasis GmbH, Charité - University of Medicine.
Confocal microscopy is the method of choice for the analysis of localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, as in many cases it relies on manual counting, thus bearing the risk of introducing a rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization between two cells results from random positioning, especially when cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells is presented. This also includes a neighborhood analysis, to obtain information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool which is suitable for testing this hypothesis in the case of hematopoietic as well as stromal cells, is used. This approach is not limited to the bone marrow, and can be extended to other tissues to permit reproducible, quantitative analysis of histological data.
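The logic of the random-positioning test can be summarized in a short Monte Carlo sketch: count the observed contacts between two cell types, then compare this count with a null distribution obtained by repeatedly repositioning one cell type at random. The point pattern below is synthetic; in the actual analysis the positions come from the segmented 2D bone marrow images.

# Monte Carlo test of co-localization against random cell positioning (illustrative sketch).
import numpy as np

def count_contacts(a, b, radius):
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return int(np.sum(np.any(d < radius, axis=1)))      # type-A cells with >= 1 type-B neighbor

rng = np.random.default_rng(0)
field, radius = 1000.0, 15.0                            # field size and contact distance (placeholder µm)
cells_a = rng.uniform(0, field, (200, 2))               # e.g., hematopoietic cells of interest
cells_b = rng.uniform(0, field, (600, 2))               # e.g., stromal cells
observed = count_contacts(cells_a, cells_b, radius)

# null distribution: reposition cell type A at random many times
null = np.array([count_contacts(rng.uniform(0, field, cells_a.shape), cells_b, radius)
                 for _ in range(1000)])
p = (np.sum(null >= observed) + 1) / (len(null) + 1)
print(f"observed contacts: {observed}, p-value vs random placement: {p:.3f}")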
Developmental Biology, Issue 98, Image analysis, neighborhood analysis, bone marrow, stromal cells, bone marrow niches, simulation, bone cryosectioning, bone histology
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Authors: Jeremy D. Smith, Abbie E. Ferris, Gary D. Heise, Richard N. Hinrichs, Philip E. Martin.
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) contrast the effects of the proposed technique and that of using intact limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring inertial properties of known geometrical solids. When direct measurements of inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity compared with inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia property estimates on study outcomes. For stance, either one of the two inertial models investigated in our study would likely lead to similar outcomes with an inverse dynamics assessment.
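For reference, the physical-pendulum relation underlying the oscillation technique reduces to a few lines of code: the period T of small oscillations about a fixed axis gives I_axis = T² m g d / (4π²), and the parallel-axis theorem then yields the moment of inertia about the center of mass, with d (the axis-to-center-of-mass distance) obtainable from the reaction board. The prosthesis mass, period and distance below are hypothetical numbers, not measurements from the study.

# Moment of inertia of a segment from its pendulum oscillation period (sketch).
import math

def inertia_from_oscillation(period_s, mass_kg, d_m, g=9.81):
    I_axis = (period_s ** 2) * mass_kg * g * d_m / (4 * math.pi ** 2)
    I_cm = I_axis - mass_kg * d_m ** 2            # parallel-axis theorem
    return I_axis, I_cm

# e.g., a 1.3 kg below-knee prosthesis, CM 0.18 m below the suspension axis,
# mean small-amplitude period of 0.95 s over several trials (hypothetical values)
print(inertia_from_oscillation(0.95, 1.3, 0.18))  # (I about axis, I about CM) in kg*m^2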
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
Designing Silk-silk Protein Alloy Materials for Biomedical Applications
Authors: Xiao Hu, Solomon Duki, Joseph Forys, Jeffrey Hettinger, Justin Buchicchio, Tabbetha Dobbins, Catherine Yang.
Institutions: Rowan University, Rowan University, Cooper Medical School of Rowan University, Rowan University.
Fibrous proteins display different sequences and structures that have been used for various applications in biomedical fields such as biosensors, nanomedicine, tissue regeneration, and drug delivery. Designing materials based on the molecular-scale interactions between these proteins will help generate new multifunctional protein alloy biomaterials with tunable properties. Such alloy material systems also provide advantages in comparison to traditional synthetic polymers due to the materials' biodegradability, biocompatibility, and tunability in the body. This article used the protein blends of wild tussah silk (Antheraea pernyi) and domestic mulberry silk (Bombyx mori) as an example to provide useful protocols regarding these topics, including how to predict protein-protein interactions by computational methods, how to produce protein alloy solutions, how to verify alloy systems by thermal analysis, and how to fabricate variable alloy materials including optical materials with diffraction gratings, electric materials with circuit coatings, and pharmaceutical materials for drug release and delivery. These methods can provide important information for designing the next generation of multifunctional biomaterials based on different protein alloys.
Bioengineering, Issue 90, protein alloys, biomaterials, biomedical, silk blends, computational simulation, implantable electronic devices
The Generation of Higher-order Laguerre-Gauss Optical Beams for High-precision Interferometry
Authors: Ludovico Carbone, Paul Fulda, Charlotte Bond, Frank Brueckner, Daniel Brown, Mengyao Wang, Deepali Lodhia, Rebecca Palmer, Andreas Freise.
Institutions: University of Birmingham.
Thermal noise in high-reflectivity mirrors is a major impediment for several types of high-precision interferometric experiments that aim to reach the standard quantum limit or to cool mechanical systems to their quantum ground state. This is for example the case of future gravitational wave observatories, whose sensitivity to gravitational wave signals is expected to be limited in the most sensitive frequency band, by atomic vibration of their mirror masses. One promising approach being pursued to overcome this limitation is to employ higher-order Laguerre-Gauss (LG) optical beams in place of the conventionally used fundamental mode. Owing to their more homogeneous light intensity distribution these beams average more effectively over the thermally driven fluctuations of the mirror surface, which in turn reduces the uncertainty in the mirror position sensed by the laser light. We demonstrate a promising method to generate higher-order LG beams by shaping a fundamental Gaussian beam with the help of diffractive optical elements. We show that with conventional sensing and control techniques that are known for stabilizing fundamental laser beams, higher-order LG modes can be purified and stabilized just as well at a comparably high level. A set of diagnostic tools allows us to control and tailor the properties of generated LG beams. This enabled us to produce an LG beam with the highest purity reported to date. The demonstrated compatibility of higher-order LG modes with standard interferometry techniques and with the use of standard spherical optics makes them an ideal candidate for application in a future generation of high-precision interferometry.
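To give a feel for why higher-order modes average more effectively over the mirror surface, the sketch below evaluates the textbook Laguerre-Gauss amplitude at the beam waist and compares a higher-order intensity pattern (LG33, a mode often discussed in this context) with the fundamental mode. This is the standard analytical expression, not code from the experiment, and the waist and grid sizes are arbitrary.

# Laguerre-Gauss intensity patterns at the beam waist (textbook formula, sketch only).
import numpy as np
from scipy.special import genlaguerre

def lg_intensity(p, l, w0=1.0, half_width=3.0, n=512):
    x = np.linspace(-half_width, half_width, n) * w0
    X, Y = np.meshgrid(x, x)
    r, phi = np.hypot(X, Y), np.arctan2(Y, X)
    # u(r, phi) ~ (sqrt(2) r / w0)^|l| * L_p^|l|(2 r^2 / w0^2) * exp(-r^2 / w0^2) * exp(i l phi)
    radial = (np.sqrt(2) * r / w0) ** abs(l) * genlaguerre(p, abs(l))(2 * r ** 2 / w0 ** 2)
    u = radial * np.exp(-r ** 2 / w0 ** 2) * np.exp(1j * l * phi)
    return np.abs(u) ** 2

I33 = lg_intensity(3, 3)          # multi-ring LG33 intensity distribution
I00 = lg_intensity(0, 0)          # fundamental Gaussian mode for comparison
print(I33.shape, I00.shape)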
Physics, Issue 78, Optics, Astronomy, Astrophysics, Gravitational waves, Laser interferometry, Metrology, Thermal noise, Laguerre-Gauss modes, interferometry
T-wave Ion Mobility-mass Spectrometry: Basic Experimental Procedures for Protein Complex Analysis
Authors: Izhak Michaelevski, Noam Kirshenbaum, Michal Sharon.
Institutions: Weizmann Institute of Science.
Ion mobility (IM) is a method that measures the time taken for an ion to travel through a pressurized cell under the influence of a weak electric field. The speed at which the ions traverse the drift region depends on their size: large ions will experience a greater number of collisions with the background inert gas (usually N2) and thus travel more slowly through the IM device than ions that comprise a smaller cross-section. In general, the time it takes for the ions to migrate through the dense gas phase separates them according to their collision cross-section (Ω). Recently, IM spectrometry was coupled with mass spectrometry and a traveling-wave (T-wave) Synapt ion mobility mass spectrometer (IM-MS) was released. Integrating mass spectrometry with ion mobility enables an extra dimension of sample separation and definition, yielding a three-dimensional spectrum (mass to charge, intensity, and drift time). This separation technique allows the spectral overlap to decrease, and enables resolution of heterogeneous complexes with very similar mass, or mass-to-charge ratios, but different drift times. Moreover, the drift time measurements provide an important layer of structural information, as Ω is related to the overall shape and topology of the ion. The correlation between the measured drift time values and Ω is calculated using a calibration curve generated from calibrant proteins with defined cross-sections [1]. The power of the IM-MS approach lies in its ability to define the subunit packing and overall shape of protein assemblies at micromolar concentrations and near-physiological conditions [1]. Several recent IM studies of both individual proteins [2,3] and non-covalent protein complexes [4-9] successfully demonstrated that protein quaternary structure is maintained in the gas phase, and highlighted the potential of this approach in the study of protein assemblies of unknown geometry. Here, we provide a detailed description of IM-MS analysis of protein complexes using the Synapt (Quadrupole-Ion Mobility-Time-of-Flight) HDMS instrument (Waters Ltd; the only commercial IM-MS instrument currently available) [10]. We describe the basic optimization steps, the calibration of collision cross-sections, and methods for data processing and interpretation. The final step of the protocol discusses methods for calculating theoretical Ω values. Overall, the protocol does not attempt to cover every aspect of IM-MS characterization of protein assemblies; rather, its goal is to introduce the practical aspects of the method to new researchers in the field.
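As a stripped-down illustration of the calibration step, the sketch below fits a power-law relation between corrected drift time and corrected collision cross-section from calibrant values and applies it to an unknown. The charge-state and reduced-mass corrections described in the published protocols are omitted, and the calibrant numbers are placeholders rather than real values.

# Power-law CCS calibration from calibrant drift times (simplified, illustrative values).
import numpy as np

t_cal = np.array([2.1, 3.4, 5.0, 7.2, 9.8])        # corrected drift times (ms), placeholders
ccs_cal = np.array([1500., 2100., 2800., 3700., 4600.])  # corrected CCS values, placeholders

# fit ln(CCS) = ln(A) + B * ln(t)  ->  CCS = A * t**B
B, lnA = np.polyfit(np.log(t_cal), np.log(ccs_cal), 1)
A = np.exp(lnA)

def drift_time_to_ccs(t_drift):
    return A * t_drift ** B

print(f"fit: CCS = {A:.1f} * t^{B:.3f}")
print("CCS of an unknown at 6.0 ms:", round(drift_time_to_ccs(6.0)))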
Cellular Biology, Issue 41, mass spectrometry, ion-mobility, protein complexes, non-covalent interactions, structural biology
Automated Midline Shift and Intracranial Pressure Estimation based on Brain CT Images
Authors: Wenan Chen, Ashwin Belle, Charles Cockrell, Kevin R. Ward, Kayvan Najarian.
Institutions: Virginia Commonwealth University, Virginia Commonwealth University Reanimation Engineering Science (VCURES) Center, Virginia Commonwealth University, Virginia Commonwealth University, Virginia Commonwealth University.
In this paper we present an automated system, based mainly on computed tomography (CT) images, consisting of two main components: midline shift estimation and an intracranial pressure (ICP) pre-screening system. To estimate the midline shift, an estimation of the ideal midline is first performed based on the symmetry of the skull and anatomical features in the brain CT scan. Then, segmentation of the ventricles from the CT scan is performed and used as a guide for the identification of the actual midline through shape matching. These processes mimic the measuring process used by physicians and have shown promising results in the evaluation. In the second component, further ICP-related features are extracted from the CT scans, such as texture information and blood amount; other recorded features, such as age and injury severity score, are also incorporated to estimate the ICP. Machine learning techniques, including feature selection and classification with methods such as Support Vector Machines (SVMs), are employed to build the prediction model using RapidMiner. The evaluation of the prediction shows the potential usefulness of the model. The estimated midline shift and predicted ICP levels may be used as a fast pre-screening step to help physicians decide for or against invasive ICP monitoring.
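The final machine-learning step can be illustrated with a generic scikit-learn pipeline (the study itself used RapidMiner): CT-derived features and recorded clinical variables are combined and fed to a Support Vector Machine, with cross-validation used to gauge the prediction. All feature values and labels below are synthetic.

# Generic SVM pre-screening classifier on synthetic features (illustrative sketch).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 120
X = np.column_stack([
    rng.normal(3.0, 2.0, n),      # estimated midline shift (mm)
    rng.normal(20.0, 10.0, n),    # blood amount proxy from CT
    rng.normal(0.5, 0.1, n),      # a texture feature
    rng.normal(40.0, 15.0, n),    # age (years)
    rng.integers(5, 40, n),       # injury severity score
])
# synthetic label: "elevated ICP" loosely driven by shift and blood amount
y = (X[:, 0] + 0.1 * X[:, 1] + rng.normal(0, 1.5, n) > 4.5).astype(int)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print("cross-validated accuracy:", scores.round(2))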
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Neurobiology, Biophysics, Physiology, Anatomy, Brain CT Image Processing, CT, Midline Shift, Intracranial Pressure Pre-screening, Gaussian Mixture Model, Shape Matching, Machine Learning, traumatic brain injury, TBI, imaging, clinical techniques
Functional Assessment of Intestinal Motility and Gut Wall Inflammation in Rodents: Analyses in a Standardized Model of Intestinal Manipulation
Authors: Tim O. Vilz, Marcus Overhaus, Burkhard Stoffels, Martin von Websky, Joerg C. Kalff, Sven Wehner.
Institutions: University of Bonn.
Inflammation of the gastrointestinal tract is a common cause of a variety of human diseases. Animal research models are critical for investigating the complex cellular and molecular mechanisms of intestinal pathology. Although the tunica mucosa is often the organ of interest in many inflammatory diseases, recent work has demonstrated that the muscularis externa (ME) is also a highly immunocompetent organ that harbours a dense network of resident immunocytes [1,2]. This work was performed within the standardized model of intestinal manipulation (IM), which leads to inflammation of the bowel wall that is mainly limited to the ME. Clinically this inflammation leads to prolonged intestinal dysmotility, known as postoperative ileus (POI), which is a frequent and unavoidable complication after abdominal surgery [3]. The inflammation is characterized by liberation of proinflammatory mediators such as IL-6 [4] or IL-1β, or inhibitory neurotransmitters like nitric oxide (NO) [5]. Subsequently, tremendous numbers of immunocytes, dominated by polymorphonuclear neutrophils (PMN) and monocytes, extravasate into the ME and finally maintain POI [2]. Lasting for days, this intestinal paralysis leads to an increased risk of aspiration, bacterial translocation and infectious complications up to sepsis and multi-organ failure, and causes a high economic burden [6]. In this manuscript we demonstrate the standardized model of IM and the in vivo assessment of gastrointestinal transit (GIT) and colonic transit. Furthermore, we demonstrate a method for separation of the ME from the tunica mucosa followed by immunological analysis, which is crucial to distinguish between the inflammatory responses in these two highly immunoactive bowel wall compartments. All analyses are easily transferable to other research models affecting gastrointestinal function.
Medicine, Issue 67, Immunology, Anatomy, Physiology, intestinal manipulation, muscularis externa, intestinal inflammation, postoperative ileus, gastrointestinal transit, gut wall
A Novel Bayesian Change-point Algorithm for Genome-wide Analysis of Diverse ChIPseq Data Types
Authors: Haipeng Xing, Willey Liao, Yifan Mo, Michael Q. Zhang.
Institutions: Stony Brook University, Cold Spring Harbor Laboratory, University of Texas at Dallas.
ChIPseq is a widely used technique for investigating protein-DNA interactions. Read density profiles are generated by next-generation sequencing of protein-bound DNA and aligning the short reads to a reference genome. Enriched regions are revealed as peaks, which often differ dramatically in shape, depending on the target protein [1]. For example, transcription factors often bind in a site- and sequence-specific manner and tend to produce punctate peaks, while histone modifications are more pervasive and are characterized by broad, diffuse islands of enrichment [2]. Reliably identifying these regions was the focus of our work. Algorithms for analyzing ChIPseq data have employed various methodologies, from heuristics [3-5] to more rigorous statistical models, e.g. Hidden Markov Models (HMMs) [6-8]. We sought a solution that minimized the necessity for difficult-to-define, ad hoc parameters that often compromise resolution and lessen the intuitive usability of the tool. With respect to HMM-based methods, we aimed to curtail the parameter estimation procedures and simple, finite-state classifications that are often utilized. Additionally, conventional ChIPseq data analysis involves categorization of the expected read density profiles as either punctate or diffuse, followed by subsequent application of the appropriate tool. We further aimed to replace the need for these two distinct models with a single, more versatile model, which can capably address the entire spectrum of data types. To meet these objectives, we first constructed a statistical framework that naturally modeled ChIPseq data structures using a cutting-edge advance in HMMs [9] that utilizes only explicit formulas, an innovation crucial to its performance advantages. More sophisticated than heuristic models, our HMM accommodates infinite hidden states through a Bayesian model. We applied it to identifying reasonable change points in read density, which further define segments of enrichment. Our analysis revealed that our Bayesian Change Point (BCP) algorithm had a reduced computational complexity, evidenced by an abridged run time and memory footprint. The BCP algorithm was successfully applied to both punctate peak and diffuse island identification with robust accuracy and limited user-defined parameters. This illustrated both its versatility and ease of use. Consequently, we believe it can be implemented readily across broad ranges of data types and end users in a manner that is easily compared and contrasted, making it a great tool for ChIPseq data analysis that can aid in collaboration and corroboration between research groups. Here, we demonstrate the application of BCP to existing transcription factor [10,11] and epigenetic data [12] to illustrate its usefulness.
Genetics, Issue 70, Bioinformatics, Genomics, Molecular Biology, Cellular Biology, Immunology, Chromatin immunoprecipitation, ChIP-Seq, histone modifications, segmentation, Bayesian, Hidden Markov Models, epigenetics
Driving Simulation in the Clinic: Testing Visual Exploratory Behavior in Daily Life Activities in Patients with Visual Field Defects
Authors: Johanna Hamel, Antje Kraft, Sven Ohl, Sophie De Beukelaer, Heinrich J. Audebert, Stephan A. Brandt.
Institutions: Universitätsmedizin Charité, Universitätsmedizin Charité, Humboldt Universität zu Berlin.
Patients suffering from homonymous hemianopia after infarction of the posterior cerebral artery (PCA) report different degrees of constraint in daily life, despite similar visual deficits. We assume this could be due to variable development of compensatory strategies such as altered visual scanning behavior. Scanning compensatory therapy (SCT) is studied as part of the visual training after infarction, alongside vision restoration therapy. SCT consists of learning to make larger eye movements into the blind field, enlarging the visual field of search, which has been proven to be the most useful strategy [1], not only in natural search tasks but also in mastering daily life activities [2]. Nevertheless, in clinical routine it is difficult to identify individual levels and training effects of compensatory behavior, since it requires measurement of eye movements in a head-unrestrained condition. Studies demonstrated that unrestrained head movements alter the visual exploratory behavior compared to a head-restrained laboratory condition [3]. Martin et al. [4] and Hayhoe et al. [5] showed that behavior demonstrated in a laboratory setting cannot be assigned easily to a natural condition. Hence, our goal was to develop a study set-up which uncovers different compensatory oculomotor strategies quickly in a realistic testing situation: Patients are tested in the clinical environment in a driving simulator. SILAB software (Wuerzburg Institute for Traffic Sciences GmbH (WIVW)) was used to program driving scenarios of varying complexity and to record the driver's performance. The software was combined with a head-mounted infrared video pupil tracker, recording head- and eye-movements (EyeSeeCam, University of Munich Hospital, Clinical Neurosciences). The positioning of the patient in the driving simulator and the positioning, adjustment and calibration of the camera are demonstrated. Typical performances of a patient with and without compensatory strategy and a healthy control are illustrated in this pilot study. Different oculomotor behaviors (frequency and amplitude of eye- and head-movements) are evaluated very quickly during the drive itself by dynamic overlay pictures indicating where the subject's gaze is located on the screen, and by analyzing the data. Compensatory gaze behavior in a patient leads to a driving performance comparable to a healthy control, while the performance of a patient without compensatory behavior is significantly worse. The eye- and head-movement data as well as driving performance are discussed with respect to different oculomotor strategies and in a broader context with respect to possible training effects throughout the testing session and implications on rehabilitation potential.
Medicine, Issue 67, Neuroscience, Physiology, Anatomy, Ophthalmology, compensatory oculomotor behavior, driving simulation, eye movements, homonymous hemianopia, stroke, visual field defects, visual field enlargement
Simulation, Fabrication and Characterization of THz Metamaterial Absorbers
Authors: James P. Grant, Iain J.H. McCrindle, David R.S. Cumming.
Institutions: University of Glasgow.
Metamaterials (MM), artificial materials engineered to have properties that may not be found in nature, have been widely explored since the first theoretical [1] and experimental [2] demonstrations of their unique properties. MMs can provide a highly controllable electromagnetic response, and to date have been demonstrated in every technologically relevant spectral range, including the optical [3], near-IR [4], mid-IR [5], THz [6], mm-wave [7], microwave [8] and radio [9] bands. Applications include perfect lenses [10], sensors [11], telecommunications [12], invisibility cloaks [13] and filters [14,15]. We have recently developed single-band [16], dual-band [17] and broadband [18] THz metamaterial absorber devices capable of greater than 80% absorption at the resonance peak. The concept of a MM absorber is especially important at THz frequencies, where it is difficult to find strong frequency-selective THz absorbers [19]. In our MM absorber the THz radiation is absorbed in a thickness of ~λ/20, overcoming the thickness limitation of traditional quarter-wavelength absorbers. MM absorbers naturally lend themselves to THz detection applications, such as thermal sensors, and if integrated with suitable THz sources (e.g. QCLs), could lead to compact, highly sensitive, low-cost, real-time THz imaging systems.
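For context, the figure of merit quoted above follows directly from the scattering parameters, A(ω) = 1 − |S11|² − |S21|², and with a metallic ground plane S21 ≈ 0, so the design problem reduces to minimizing reflection at resonance. The sketch below applies this relation to a toy Lorentzian reflection dip standing in for a full-wave simulation; the resonance frequency, linewidth and dip depth are arbitrary.

# Absorption from scattering parameters, with a toy reflection dip (illustrative sketch).
import numpy as np

f = np.linspace(1.0, 5.0, 401)                   # frequency in THz
f0, gamma = 2.1, 0.15                            # toy resonance frequency and linewidth
s11 = 1 - 0.7 * (gamma / 2) ** 2 / ((f - f0) ** 2 + (gamma / 2) ** 2)   # Lorentzian dip in |S11|
s21 = np.zeros_like(f)                           # metal ground plane: no transmission
absorption = 1 - np.abs(s11) ** 2 - np.abs(s21) ** 2
print(f"peak absorption {absorption.max():.2f} at {f[absorption.argmax()]:.2f} THz")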
Materials Science, Issue 70, Physics, Engineering, Metamaterial, terahertz, sensing, fabrication, clean room, simulation, FTIR, spectroscopy
Movement Retraining using Real-time Feedback of Performance
Authors: Michael Anthony Hunt.
Institutions: University of British Columbia .
Any modification of movement - especially movement patterns that have been honed over a number of years - requires re-organization of the neuromuscular patterns responsible for governing the movement performance. This motor learning can be enhanced through a number of methods that are utilized in research and clinical settings alike. In general, verbal feedback of performance in real-time or knowledge of results following movement is commonly used clinically as a preliminary means of instilling motor learning. Depending on patient preference and learning style, visual feedback (e.g. through use of a mirror or different types of video) or proprioceptive guidance utilizing therapist touch, are used to supplement verbal instructions from the therapist. Indeed, a combination of these forms of feedback is commonplace in the clinical setting to facilitate motor learning and optimize outcomes. Laboratory-based, quantitative motion analysis has been a mainstay in research settings to provide accurate and objective analysis of a variety of movements in healthy and injured populations. While the actual mechanisms of capturing the movements may differ, all current motion analysis systems rely on the ability to track the movement of body segments and joints and to use established equations of motion to quantify key movement patterns. Due to limitations in acquisition and processing speed, analysis and description of the movements has traditionally occurred offline after completion of a given testing session. This paper will highlight a new supplement to standard motion analysis techniques that relies on the near instantaneous assessment and quantification of movement patterns and the display of specific movement characteristics to the patient during a movement analysis session. As a result, this novel technique can provide a new method of feedback delivery that has advantages over currently used feedback methods.
Medicine, Issue 71, Biophysics, Anatomy, Physiology, Physics, Biomedical Engineering, Behavior, Psychology, Kinesiology, Physical Therapy, Musculoskeletal System, Biofeedback, biomechanics, gait, movement, walking, rehabilitation, clinical, training
Detection of Architectural Distortion in Prior Mammograms via Analysis of Oriented Patterns
Authors: Rangaraj M. Rangayyan, Shantanu Banik, J.E. Leo Desautels.
Institutions: University of Calgary , University of Calgary .
We demonstrate methods for the detection of architectural distortion in prior mammograms of interval-cancer cases based on analysis of the orientation of breast tissue patterns in mammograms. We hypothesize that architectural distortion modifies the normal orientation of breast tissue patterns in mammographic images before the formation of masses or tumors. In the initial steps of our methods, the oriented structures in a given mammogram are analyzed using Gabor filters and phase portraits to detect node-like sites of radiating or intersecting tissue patterns. Each detected site is then characterized using the node value, fractal dimension, and a measure of angular dispersion specifically designed to represent spiculating patterns associated with architectural distortion. Our methods were tested with a database of 106 prior mammograms of 56 interval-cancer cases and 52 mammograms of 13 normal cases using the features developed for the characterization of architectural distortion, pattern classification via quadratic discriminant analysis, and validation with the leave-one-patient out procedure. According to the results of free-response receiver operating characteristic analysis, our methods have demonstrated the capability to detect architectural distortion in prior mammograms, taken 15 months (on the average) before clinical diagnosis of breast cancer, with a sensitivity of 80% at about five false positives per patient.
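The first analysis step named above, estimating the local orientation of tissue patterns with a bank of Gabor filters, can be sketched with scikit-image as below; the node/phase-portrait detection, fractal analysis and classification stages of the published method are not reproduced. The test image is a synthetic oriented grating, not a mammogram.

# Dominant local orientation from a bank of Gabor filters (illustrative sketch).
import numpy as np
from skimage.filters import gabor

# synthetic oriented texture: a sinusoidal grating constructed at 30 degrees
x, y = np.meshgrid(np.arange(128), np.arange(128))
angle = np.deg2rad(30)
image = np.sin(2 * np.pi * 0.1 * (x * np.cos(angle) + y * np.sin(angle)))

thetas = np.deg2rad(np.arange(0, 180, 15))            # 12 candidate orientations
responses = []
for theta in thetas:
    real, imag = gabor(image, frequency=0.1, theta=theta)
    responses.append(np.hypot(real, imag))            # magnitude of the filter response
responses = np.stack(responses)

orientation_field = thetas[np.argmax(responses, axis=0)]   # strongest orientation per pixel
print("median detected orientation (deg):",
      round(float(np.degrees(np.median(orientation_field))), 1))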
Medicine, Issue 78, Anatomy, Physiology, Cancer Biology, angular spread, architectural distortion, breast cancer, Computer-Assisted Diagnosis, computer-aided diagnosis (CAD), entropy, fractional Brownian motion, fractal dimension, Gabor filters, Image Processing, Medical Informatics, node map, oriented texture, Pattern Recognition, phase portraits, prior mammograms, spectral analysis
Quasi-light Storage for Optical Data Packets
Authors: Thomas Schneider, Stefan Preußler.
Institutions: Hochschule für Telekommunikation, Leipzig.
Today's telecommunication is based on optical packets which transmit the information in optical fiber networks around the world. Currently, the processing of the signals is done in the electrical domain. Direct storage in the optical domain would avoid the transfer of the packets to the electrical and back to the optical domain in every network node and, therefore, increase the speed and possibly reduce the energy consumption of telecommunications. However, light consists of photons which propagate at the speed of light in vacuum. Thus, the storage of light is a big challenge. There exist some methods to slow down light, or to store it in excitations of a medium. However, these methods cannot be used for the storage of the optical data packets used in telecommunications networks. Here we show how time-frequency coherence, which holds for every signal and therefore for optical packets as well, can be exploited to build an optical memory. We will review the background and show in detail, and through examples, how a frequency comb can be used to copy an optical packet which enters the memory. One of these time-domain copies is then extracted from the memory by a time-domain switch. We will show this method for intensity- as well as for phase-modulated signals.
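The time-frequency coherence argument can be demonstrated numerically in a few lines: sampling the spectrum of a time-limited packet with an ideal frequency comb produces equally spaced copies of the packet in the time domain, one of which a time-domain switch can later extract. The sample rate, packet shape and comb spacing below are arbitrary demonstration values, not the experimental parameters.

# Frequency-comb sampling of a packet spectrum creates time-domain copies (illustrative sketch).
import numpy as np

n, dt = 4096, 1e-12                              # 4096 samples at 1 ps resolution
t = np.arange(n) * dt
packet = np.exp(-((t - 50e-12) / 10e-12) ** 2)   # envelope of a short optical data packet

spectrum = np.fft.fft(packet)
step = 16                                        # keep every 16th spectral line
comb = np.zeros(n)
comb[::step] = 1.0                               # ideal frequency comb, line spacing 16/(n*dt) ~ 3.9 GHz

copies = np.real(np.fft.ifft(spectrum * comb))   # packet replicated every n/step samples (256 ps)
block = n // step
peaks = [t[i * block + np.argmax(copies[i * block:(i + 1) * block])] for i in range(step)]
print("packet copies near t =", np.round(np.array(peaks) * 1e12), "ps")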
Physics, Issue 84, optical communications, Optical Light Storage, stimulated Brillouin scattering, Optical Signal Processing, optical data packets, telecommunications
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Designing and Implementing Nervous System Simulations on LEGO Robots
Authors: Daniel Blustein, Nikolai Rosenthal, Joseph Ayers.
Institutions: Northeastern University, Bremen University of Applied Sciences.
We present a method to use the commercially available LEGO Mindstorms NXT robotics platform to test systems level neuroscience hypotheses. The first step of the method is to develop a nervous system simulation of specific reflexive behaviors of an appropriate model organism; here we use the American Lobster. Exteroceptive reflexes mediated by decussating (crossing) neural connections can explain an animal's taxis towards or away from a stimulus as described by Braitenberg and are particularly well suited for investigation using the NXT platform.1 The nervous system simulation is programmed using LabVIEW software on the LEGO Mindstorms platform. Once the nervous system is tuned properly, behavioral experiments are run on the robot and on the animal under identical environmental conditions. By controlling the sensory milieu experienced by the specimens, differences in behavioral outputs can be observed. These differences may point to specific deficiencies in the nervous system model and serve to inform the iteration of the model for the particular behavior under study. This method allows for the experimental manipulation of electronic nervous systems and serves as a way to explore neuroscience hypotheses specifically regarding the neurophysiological basis of simple innate reflexive behaviors. The LEGO Mindstorms NXT kit provides an affordable and efficient platform on which to test preliminary biomimetic robot control schemes. The approach is also well suited for the high school classroom to serve as the foundation for a hands-on inquiry-based biorobotics curriculum.
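As a software-only sketch of the decussating wiring described above, the Python snippet below implements a Braitenberg-style crossed sensor-to-motor mapping with a toy point-source stimulus: the sensor closer to the source drives the contralateral motor harder, so the vehicle turns toward the stimulus. On the actual robots this logic runs in LabVIEW on the NXT brick; the stimulus field model and gain here are placeholders.

# Braitenberg-style crossed (decussating) sensor-to-motor wiring (illustrative sketch).
def sensor_reading(sensor_pos, source_pos):
    # toy stimulus intensity that falls off with squared distance from the source
    d2 = (sensor_pos[0] - source_pos[0]) ** 2 + (sensor_pos[1] - source_pos[1]) ** 2
    return 1.0 / (1.0 + d2)

def crossed_motor_commands(left_sensor, right_sensor, gain=1.0):
    # decussating wiring: left sensor drives the right motor, right sensor drives the left motor
    left_motor = gain * right_sensor
    right_motor = gain * left_sensor
    return left_motor, right_motor

# robot at the origin facing +y, sensors offset to each side; source off to the right
source = (2.0, 3.0)
left_s = sensor_reading((-0.1, 0.0), source)
right_s = sensor_reading((+0.1, 0.0), source)
lm, rm = crossed_motor_commands(left_s, right_s)
print(f"left motor {lm:.3f} > right motor {rm:.3f} -> robot turns right, toward the source")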
Neuroscience, Issue 75, Neurobiology, Bioengineering, Behavior, Mechanical Engineering, Computer Science, Marine Biology, Biomimetics, Marine Science, Neurosciences, Synthetic Biology, Robotics, robots, Modeling, models, Sensory Fusion, nervous system, Educational Tools, programming, software, lobster, Homarus americanus, animal model
Vision Training Methods for Sports Concussion Mitigation and Management
Authors: Joseph F. Clark, Angelo Colosimo, James K. Ellis, Robert Mangine, Benjamin Bixenmann, Kimberly Hasselfeld, Patricia Graman, Hagar Elgendy, Gregory Myer, Jon Divine.
Institutions: University of Cincinnati, University of Cincinnati, University of Cincinnati, University of Cincinnati, University of Cincinnati, Cincinnati Children's Hospital Medical Center.
There is emerging evidence supporting the use of vision training, including light board training tools, as a concussion baseline and neuro-diagnostic tool, and potentially as a supportive component of concussion prevention strategies. This paper is focused on providing detailed methods for select vision training tools and reporting normative data for comparison when vision training is a part of a sports management program. The overall program includes standard vision training methods including tachistoscope, Brock’s string, and strobe glasses, as well as specialized light board training algorithms. Stereopsis is measured as a means to monitor vision training effects. In addition, quantitative results for vision training methods as well as baseline and post-testing *A and Reaction Test measures with progressive scores are reported. Collegiate athletes consistently improve after six weeks of training in their stereopsis, *A and Reaction Test scores. When vision training is initiated as a team-wide exercise, the incidence of concussion decreases in players who participate in training compared to players who do not receive the vision training. Vision training produces functional and performance changes that, when monitored, can be used to assess the success of the vision training and can be initiated as part of a sports medical intervention for concussion prevention.
Behavior, Issue 99, Vision training, peripheral vision, functional peripheral vision, concussion, concussion management, diagnosis, rehabilitation, eyes, sight, seeing, sight

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.