JoVE Visualize

PubMed Article
Microinverter Thermal Performance in the Real-World: Measurements and Modeling.
PLoS ONE
PUBLISHED: 07-07-2015
Real-world performance, durability and reliability of microinverters are critical concerns for microinverter-equipped photovoltaic systems. We conducted a data-driven study of the thermal performance of 24 new microinverters (Enphase M215) connected to 8 different brands of PV modules on dual-axis trackers at the Solar Durability and Lifetime Extension (SDLE) SunFarm at Case Western Reserve University, based on minute-by-minute power and thermal data from the microinverters and PV modules along with insolation and environmental data from July through October 2013. The analysis shows the strengths of the associations of microinverter temperature with ambient temperature, PV module temperature, irradiance and AC power of the PV systems. The covariates are rank-ordered by importance. A multiple regression model was developed and tested based on stable solar noon-time data, which gives both an overall function that predicts the temperature of microinverters under typical local conditions, and coefficient adjustments reflecting refined predictions of the temperature of microinverters connected to the 8 brands of PV modules in the study. The model allows for prediction of internal temperature for the Enphase M215 under similar climatic conditions and can be expanded to predict microinverter temperature in fixed-rack and roof-top PV systems. This study is foundational in that similar models built on later-stage data in the life of a device could reveal potential influencing factors in performance degradation.
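The abstract above does not reproduce the regression equation itself, so the following is only an illustrative sketch of a model of the same general form: microinverter temperature as a linear function of ambient temperature, module temperature, irradiance, and AC power. The variable names and numbers are invented for demonstration, not taken from the study.

import numpy as np

# Hypothetical noon-time observations (columns: ambient temp [C], module temp [C],
# irradiance [W/m^2], AC power [W]); values are made up for illustration.
X_raw = np.array([
    [28.0, 45.0, 820.0, 190.0],
    [31.0, 52.0, 950.0, 205.0],
    [25.0, 40.0, 700.0, 160.0],
    [33.0, 55.0, 1000.0, 210.0],
    [29.0, 48.0, 880.0, 198.0],
    [27.0, 43.0, 780.0, 185.0],
])
T_inverter = np.array([41.0, 47.5, 36.0, 50.0, 44.0, 39.5])  # measured microinverter temp [C]

# Design matrix with an intercept column, then ordinary least squares.
X = np.column_stack([np.ones(len(X_raw)), X_raw])
coeffs, *_ = np.linalg.lstsq(X, T_inverter, rcond=None)

# Predict microinverter temperature for a new condition (same column order).
new_condition = np.array([1.0, 30.0, 50.0, 900.0, 200.0])
print("fitted coefficients:", coeffs)
print("predicted microinverter temperature:", new_condition @ coeffs)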
Authors: Rafael Jaramillo, Vera Steinmann, Chuanxi Yang, Katy Hartman, Rupak Chakraborty, Jeremy R. Poindexter, Mariela Lizet Castillo, Roy Gordon, Tonio Buonassisi.
Published: 05-22-2015
ABSTRACT
Tin sulfide (SnS) is a candidate absorber material for Earth-abundant, non-toxic solar cells. SnS offers easy phase control and rapid growth by congruent thermal evaporation, and it absorbs visible light strongly. However, for a long time the record power conversion efficiency of SnS solar cells remained below 2%. Recently we demonstrated new certified record efficiencies of 4.36% using SnS deposited by atomic layer deposition, and 3.88% using thermal evaporation. Here the fabrication procedure for these record solar cells is described, and the statistical distribution of the fabrication process is reported. The standard deviation of efficiency measured on a single substrate is typically over 0.5%. All steps including substrate selection and cleaning, Mo sputtering for the rear contact (cathode), SnS deposition, annealing, surface passivation, Zn(O,S) buffer layer selection and deposition, transparent conductor (anode) deposition, and metallization are described. On each substrate we fabricate 11 individual devices, each with active area 0.25 cm2. Further, a system for high throughput measurements of current-voltage curves under simulated solar light, and external quantum efficiency measurement with variable light bias is described. With this system we are able to measure full data sets on all 11 devices in an automated manner and in minimal time. These results illustrate the value of studying large sample sets, rather than focusing narrowly on the highest performing devices. Large data sets help us to distinguish and remedy individual loss mechanisms affecting our devices.
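As a hedged illustration of the kind of automated J-V analysis described above (not the authors' actual code or data), the sketch below extracts efficiency and fill factor from a synthetic J-V sweep, assuming 100 mW/cm2 illumination, and summarizes an invented set of 11 devices.

import numpy as np

P_IN = 100.0  # assumed incident power density (AM1.5G), mW/cm^2

def efficiency_and_ff(v, j):
    # v in volts, j in mA/cm^2, with the power-generating quadrant positive
    p = v * j
    p_max = p.max()                      # maximum power density, mW/cm^2
    v_oc = v[np.argmin(np.abs(j))]       # crude open-circuit voltage estimate
    j_sc = j[np.argmin(np.abs(v))]       # crude short-circuit current estimate
    ff = p_max / (v_oc * j_sc)
    return p_max / P_IN, ff

# Synthetic, diode-like J-V curve standing in for one of the 11 devices on a substrate.
v = np.linspace(0.0, 0.40, 200)
j = 20.0 - 1e-3 * (np.exp(v / 0.04) - 1.0)

eta, ff = efficiency_and_ff(v, j)
print(f"efficiency = {eta:.2%}, fill factor = {ff:.2f}")

# Across the 11 devices on a substrate one would then report summary statistics, e.g.:
efficiencies = np.array([3.6, 3.8, 3.5, 3.9, 3.7, 3.4, 3.8, 3.6, 3.7, 3.9, 3.5])  # percent, invented
print(f"mean = {efficiencies.mean():.2f}%, std = {efficiencies.std(ddof=1):.2f}%")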
24 Related JoVE Articles!
Assay for Pathogen-Associated Molecular Pattern (PAMP)-Triggered Immunity (PTI) in Plants
Authors: Suma Chakravarthy, André C. Velásquez, Gregory B. Martin.
Institutions: Boyce Thompson Institute for Plant Research, Cornell University.
To perceive potential pathogens in their environment, plants use pattern recognition receptors (PRRs) present on their plasma membranes. PRRs recognize conserved microbial features called pathogen-associated molecular patterns (PAMPs) and this detection leads to PAMP-triggered immunity (PTI), which effectively prevents colonization of plant tissues by non-pathogens [1,2]. The best-studied system in PTI is the FLS2-dependent pathway [3]. FLS2 recognizes the PAMP flg22, a component of bacterial flagellin. Successful pathogens possess virulence factors or effectors that can suppress PTI and allow the pathogen to cause disease [1]. Some plants in turn possess resistance genes that detect effectors or their activity, which leads to effector-triggered immunity (ETI) [2]. We describe a cell death-based assay for PTI modified from Oh and Collmer [4]. The assay was standardized in N. benthamiana, which is being used increasingly as a model system for the study of plant-pathogen interactions [5]. PTI is induced by infiltration of a non-pathogenic bacterial strain into leaves. Seven hours later, a bacterial strain that either causes disease or activates ETI is infiltrated into an area overlapping the original infiltration zone. PTI induced by the first infiltration is able to delay or prevent the appearance of cell death due to the second challenge infiltration. Conversely, the appearance of cell death in the overlapping area of inoculation indicates a breakdown of PTI. Four different combinations of inducers of PTI and challenge inoculations were standardized (Table 1). The assay was tested on non-silenced N. benthamiana plants that served as the control and plants silenced for FLS2 that were predicted to be compromised in their ability to develop PTI.
Jove Infectious Diseases, Plant Biology, Issue 31, plant immunity, pathogen-associated molecular pattern (PAMP), PAMP-triggered immunity (PTI), effector-triggered immunity (ETI), Nicotiana benthamiana
Magnetic Tweezers for the Measurement of Twist and Torque
Authors: Jan Lipfert, Mina Lee, Orkide Ordu, Jacob W. J. Kerssemakers, Nynke H. Dekker.
Institutions: Delft University of Technology.
Single-molecule techniques make it possible to investigate the behavior of individual biological molecules in solution in real time. These techniques include so-called force spectroscopy approaches such as atomic force microscopy, optical tweezers, flow stretching, and magnetic tweezers. Amongst these approaches, magnetic tweezers have distinguished themselves by their ability to apply torque while maintaining a constant stretching force. Here, it is illustrated how such a “conventional” magnetic tweezers experimental configuration can, through a straightforward modification of its field configuration to minimize the magnitude of the transverse field, be adapted to measure the degree of twist in a biological molecule. The resulting configuration is termed the freely-orbiting magnetic tweezers. Additionally, it is shown how further modification of the field configuration can yield a transverse field with a magnitude intermediate between that of the “conventional” magnetic tweezers and the freely-orbiting magnetic tweezers, which makes it possible to directly measure the torque stored in a biological molecule. This configuration is termed the magnetic torque tweezers. The accompanying video explains in detail how the conversion of conventional magnetic tweezers into freely-orbiting magnetic tweezers and magnetic torque tweezers can be accomplished, and demonstrates the use of these techniques. These adaptations maintain all the strengths of conventional magnetic tweezers while greatly expanding the versatility of this powerful instrument.
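As a compact statement of the measurement principle described above (standard for magnetic torque tweezers, although calibration details vary between setups), the angular trap stiffness follows from equipartition applied to the measured angular fluctuations of the bead, and the molecular torque from the shift of its mean angle:

k_\theta = \frac{k_B T}{\langle \delta\theta^2 \rangle}, \qquad \tau_{\mathrm{molecule}} = -k_\theta \left( \langle\theta\rangle - \theta_0 \right),

where θ0 is the equilibrium angle in the absence of applied turns, k_B is Boltzmann's constant, and T is the temperature.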
Bioengineering, Issue 87, magnetic tweezers, magnetic torque tweezers, freely-orbiting magnetic tweezers, twist, torque, DNA, single-molecule techniques
Transplantation of Pulmonary Valve Using a Mouse Model of Heterotopic Heart Transplantation
Authors: Yong-Ung Lee, Tai Yi, Iyore James, Shuhei Tara, Alexander J. Stuber, Kejal V. Shah, Avione Y. Lee, Tadahisa Sugiura, Narutoshi Hibino, Toshiharu Shinoka, Christopher K. Breuer.
Institutions: Nationwide Children's Hospital, Nationwide Children's Hospital, Nationwide Children's Hospital.
Tissue engineered heart valves, especially decellularized valves, are gaining momentum in clinical reconstructive surgery, with mixed results. However, the cellular and molecular mechanisms of neotissue development, valve thickening, and stenosis have not been researched extensively. To address these questions, we developed a murine heterotopic heart valve transplantation model. A heart valve was harvested from a valve donor mouse and transplanted to a heart donor mouse. The heart with a new valve was transplanted heterotopically to a recipient mouse. The transplanted heart showed its own heartbeat, independent of the recipient’s heartbeat. The blood flow was quantified using a high frequency ultrasound system with a pulsed wave Doppler. Flow through the implanted pulmonary valve was forward with minimal regurgitation, and the peak flow was close to 100 mm/sec. This murine model of heart valve transplantation is highly versatile, so it can be modified and adapted to provide different hemodynamic environments and/or can be used with various transgenic mice to study neotissue development in a tissue engineered heart valve.
Medicine, Issue 89, tissue engineering, pulmonary valve, congenital heart defect, decellularized heart valve, transgenic mouse model, heterotopic heart transplantation
Determination of Protein-ligand Interactions Using Differential Scanning Fluorimetry
Authors: Mirella Vivoli, Halina R. Novak, Jennifer A. Littlechild, Nicholas J. Harmer.
Institutions: University of Exeter.
A wide range of methods are currently available for determining the dissociation constant between a protein and interacting small molecules. However, most of these require access to specialist equipment, and often require a degree of expertise to effectively establish reliable experiments and analyze data. Differential scanning fluorimetry (DSF) is being increasingly used as a robust method for initial screening of proteins for interacting small molecules, either for identifying physiological partners or for hit discovery. This technique has the advantage that it requires only a PCR machine suitable for quantitative PCR, and so suitable instrumentation is available in most institutions; an excellent range of protocols are already available; and there are strong precedents in the literature for multiple uses of the method. Past work has proposed several means of calculating dissociation constants from DSF data, but these are mathematically demanding. Here, we demonstrate a method for estimating dissociation constants from a moderate amount of DSF experimental data. These data can typically be collected and analyzed within a single day. We demonstrate how different models can be used to fit data collected from simple binding events, and where cooperative binding or independent binding sites are present. Finally, we present an example of data analysis in a case where standard models do not apply. These methods are illustrated with data collected on commercially available control proteins, and two proteins from our research program. Overall, our method provides a straightforward way for researchers to rapidly gain further insight into protein-ligand interactions using DSF.
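The article describes its own fitting models; as a simplified, hedged illustration of the general idea, the sketch below fits melting temperatures to a single-site isotherm in which Tm rises hyperbolically with ligand concentration. The functional form, concentrations, and temperatures are assumptions for demonstration only.

import numpy as np
from scipy.optimize import curve_fit

def tm_single_site(ligand, tm0, dtm_max, kd):
    # Simplified single-site model: the Tm shift saturates as ligand occupies the protein.
    return tm0 + dtm_max * ligand / (kd + ligand)

# Hypothetical DSF data: ligand concentrations (uM) and measured melting temperatures (C).
ligand = np.array([0, 5, 10, 25, 50, 100, 250, 500], dtype=float)
tm = np.array([52.1, 53.0, 53.8, 55.2, 56.3, 57.1, 57.8, 58.0])

popt, pcov = curve_fit(tm_single_site, ligand, tm, p0=[52.0, 6.0, 30.0])
tm0, dtm_max, kd = popt
kd_err = np.sqrt(np.diag(pcov))[2]
print(f"apparent Kd = {kd:.1f} +/- {kd_err:.1f} uM")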
Biophysics, Issue 91, differential scanning fluorimetry, dissociation constant, protein-ligand interactions, StepOne, cooperativity, WcbI.
A Cognitive Paradigm to Investigate Interference in Working Memory by Distractions and Interruptions
Authors: Jacki Janowich, Jyoti Mishra, Adam Gazzaley.
Institutions: University of New Mexico, University of California, San Francisco, University of California, San Francisco, University of California, San Francisco.
Goal-directed behavior is often impaired by interference from the external environment, either in the form of distraction by irrelevant information that one attempts to ignore, or by interrupting information that demands attention as part of another (secondary) task goal. Both forms of external interference have been shown to detrimentally impact the ability to maintain information in working memory (WM). Emerging evidence suggests that these different types of external interference exert different effects on behavior and may be mediated by distinct neural mechanisms. Better characterizing the distinct neuro-behavioral impact of irrelevant distractions versus attended interruptions is essential for advancing an understanding of top-down attention, resolution of external interference, and how these abilities become degraded in healthy aging and in neuropsychiatric conditions. This manuscript describes a novel cognitive paradigm developed in the Gazzaley lab that has now been modified into several distinct versions used to elucidate behavioral and neural correlates of interference by to-be-ignored distractors versus to-be-attended interruptors. Details are provided on variants of this paradigm for investigating interference in visual and auditory modalities, at multiple levels of stimulus complexity, and with experimental timing optimized for electroencephalography (EEG) or functional magnetic resonance imaging (fMRI) studies. In addition, data from younger and older adult participants obtained using this paradigm are reviewed and discussed in relation to the broader literatures on external interference and age-related neuro-behavioral changes in resolving interference in working memory.
Behavior, Issue 101, Attention, interference, distraction, interruption, working memory, aging, multi-tasking, top-down attention, EEG, fMRI
Measurement of the Pressure-volume Curve in Mouse Lungs
Authors: Nathachit Limjunyawong, Jonathan Fallica, Maureen R. Horton, Wayne Mitzner.
Institutions: Johns Hopkins University.
In recent decades the mouse has become the primary animal model of a variety of lung diseases. In models of emphysema or fibrosis, the essential phenotypic changes are best assessed by measurement of the changes in lung elasticity. To best understand specific mechanisms underlying such pathologies in mice, it is essential to make functional measurements that can reflect the developing pathology. Although there are many ways to measure elasticity, the classical method is that of the total lung pressure-volume (PV) curve done over the whole range of lung volumes. This measurement has been made on adult lungs from nearly all mammalian species dating back almost 100 years, and such PV curves also played a major role in the discovery and understanding of the function of pulmonary surfactant in fetal lung development. Unfortunately, such total PV curves have not been widely reported in the mouse, despite the fact that they can provide useful information on the macroscopic effects of structural changes in the lung. Although partial PV curves measuring just the changes in lung volume are sometimes reported, without a measure of absolute volume, the nonlinear nature of the total PV curve makes these partial ones very difficult to interpret. In the present study, we describe a standardized way to measure the total PV curve. We have then tested the ability of these curves to detect changes in mouse lung structure in two common lung pathologies, emphysema and fibrosis. Results showed significant changes in several variables consistent with expected structural changes with these pathologies. This measurement of the lung PV curve in mice thus provides a straightforward means to monitor the progression of the pathophysiologic changes over time and the potential effect of therapeutic procedures.
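As an illustrative sketch only (the article defines its own set of fitted variables), the code below fits the deflation limb of a total PV curve with the commonly used Salazar-Knowles exponential, V(P) = Vmax - A·exp(-K·P); the data points are invented.

import numpy as np
from scipy.optimize import curve_fit

def salazar_knowles(p, v_max, a, k):
    # Empirical exponential model of the deflation pressure-volume limb.
    return v_max - a * np.exp(-k * p)

# Hypothetical deflation-limb data: airway pressure (cmH2O) and absolute lung volume (ml).
pressure = np.array([0, 2, 5, 10, 15, 20, 25, 30], dtype=float)
volume = np.array([0.25, 0.42, 0.68, 0.93, 1.05, 1.12, 1.16, 1.18])

popt, _ = curve_fit(salazar_knowles, pressure, volume, p0=[1.2, 1.0, 0.15])
v_max, a, k = popt
print(f"Vmax = {v_max:.2f} ml, A = {a:.2f} ml, K = {k:.3f} /cmH2O")

# Changes in Vmax, A, and K can then be compared across emphysema and fibrosis models,
# which is the kind of structural change the total PV curve is sensitive to.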
Medicine, Issue 95, Lung compliance, Lung hysteresis, Pulmonary surfactant, Lung elasticity, Quasistatic compliance, Fibrosis, Emphysema
Forward Genetics Screens Using Macrophages to Identify Toxoplasma gondii Genes Important for Resistance to IFN-γ-Dependent Cell Autonomous Immunity
Authors: Odaelys Walwyn, Sini Skariah, Brian Lynch, Nathaniel Kim, Yukari Ueda, Neal Vohora, Josh Choe, Dana G. Mordue.
Institutions: New York Medical College.
Toxoplasma gondii, the causative agent of toxoplasmosis, is an obligate intracellular protozoan pathogen. The parasite invades and replicates within virtually any warm-blooded vertebrate cell type. During invasion of a host cell, the parasite creates a parasitophorous vacuole (PV), derived from the host cell membrane independently of phagocytosis, within which it replicates. While IFN-dependent innate and cell-mediated immunity are important for eventual control of infection, innate immune cells, including neutrophils, monocytes and dendritic cells, can also serve as vehicles for systemic dissemination of the parasite early in infection. An approach is described that utilizes the host innate immune response, in this case macrophages, in a forward genetic screen to identify parasite mutants with a fitness defect in infected macrophages following activation but normal invasion and replication in naïve macrophages. Thus, the screen isolates parasite mutants that have a specific defect in their ability to resist the effects of macrophage activation. The paper describes two broad phenotypes of mutant parasites following activation of infected macrophages: parasite stasis versus parasite degradation, often in amorphous vacuoles. The parasite mutants are then analyzed to identify the responsible parasite genes specifically important for resistance to induced mediators of cell-autonomous immunity. The paper presents a general approach for the forward genetics screen that, in theory, can be modified to target parasite genes important for resistance to specific antimicrobial mediators. It also describes an approach to evaluate the specific macrophage antimicrobial mediators to which the parasite mutant is susceptible. Activation of infected macrophages can also promote parasite differentiation from the tachyzoite stage to the bradyzoite stage, which maintains chronic infection. Therefore, methodology is presented to evaluate the importance of the identified parasite gene to establishment of chronic infection.
Immunology, Issue 97, Toxoplasma, macrophages, innate immunity, intracellular pathogen, immune evasion, infectious disease, forward genetics, parasite
Cardiac Catheterization in Mice to Measure the Pressure Volume Relationship: Investigating the Bowditch Effect
Authors: Bo Zhang, Jonathan P. Davis, Mark T. Ziolo.
Institutions: The Ohio State University, Huazhong University of Science and Technology.
Animal models that mimic human cardiac disorders have been created to test potential therapeutic strategies. A key component of evaluating these strategies is to examine their effects on heart function. There are several techniques to measure in vivo cardiac mechanics (e.g., echocardiography, pressure/volume relations, etc.). Compared to echocardiography, real-time left ventricular (LV) pressure/volume analysis via catheterization is more precise and insightful in assessing LV function. Additionally, LV pressure/volume analysis provides the ability to instantaneously record changes during manipulations of contractility (e.g., β-adrenergic stimulation) and pathological insults (e.g., ischemia/reperfusion injury). In addition to the maximum (+dP/dt) and minimum (-dP/dt) rates of pressure change in the LV, an accurate assessment of LV function via several load-independent indices (e.g., end systolic pressure volume relationship and preload recruitable stroke work) can be attained. Heart rate has a significant effect on LV contractility, such that an increase in the heart rate is the primary mechanism to increase cardiac output (i.e., the Bowditch effect). Thus, when comparing hemodynamics between experimental groups, it is necessary to have similar heart rates. Furthermore, a hallmark of many cardiomyopathy models is a decrease in contractile reserve (i.e., a decreased Bowditch effect). Consequently, vital information can be obtained by determining the effects of increasing heart rate on contractility. Data from our group and others have demonstrated that the neuronal nitric oxide synthase (NOS1) knockout mouse has decreased contractility. Here we describe the procedure for measuring LV pressure/volume with increasing heart rates using the NOS1 knockout mouse model.
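As a minimal, hedged illustration of one index mentioned above (not the authors' analysis code), the sketch below computes +dP/dt and -dP/dt from a digitized LV pressure trace; the synthetic waveform and sampling rate are assumptions.

import numpy as np

fs = 1000.0                       # assumed sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)   # one second of data
heart_rate_hz = 8.0               # roughly 480 bpm, a plausible murine rate

# Toy LV pressure waveform (mmHg): baseline plus rectified-sine "beats".
lv_pressure = 5.0 + 95.0 * np.clip(np.sin(2 * np.pi * heart_rate_hz * t), 0, None) ** 2

dpdt = np.gradient(lv_pressure, 1.0 / fs)   # numerical derivative, mmHg/s
print(f"+dP/dt max = {dpdt.max():.0f} mmHg/s, -dP/dt min = {dpdt.min():.0f} mmHg/s")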
Medicine, Issue 100, murine, catheterization, contractility, PV loops, end-systolic pressure volume relationship, preload recruitable stroke work, NOS1
Development of a Quantitative Recombinase Polymerase Amplification Assay with an Internal Positive Control
Authors: Zachary A. Crannell, Brittany Rohrman, Rebecca Richards-Kortum.
Institutions: Rice University.
It was recently demonstrated that recombinase polymerase amplification (RPA), an isothermal amplification platform for pathogen detection, may be used to quantify DNA sample concentration using a standard curve. In this manuscript, a detailed protocol for developing and implementing a real-time quantitative recombinase polymerase amplification assay (qRPA assay) is provided. Using HIV-1 DNA quantification as an example, the assembly of real-time RPA reactions, the design of an internal positive control (IPC) sequence, and co-amplification of the IPC and target of interest are all described. Instructions and data processing scripts for the construction of a standard curve using data from multiple experiments are provided, which may be used to predict the concentration of unknown samples or assess the performance of the assay. Finally, an alternative method for collecting real-time fluorescence data with a microscope and a stage heater as a step towards developing a point-of-care qRPA assay is described. The protocol and scripts provided may be used for the development of a qRPA assay for any DNA target of interest.
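To make the standard-curve idea concrete, here is a hedged sketch (not the published scripts accompanying the article) that fits threshold time against log10 of target concentration and inverts the fit to estimate an unknown sample; all numbers are invented.

import numpy as np

# Hypothetical calibration data: known HIV-1 DNA input (copies/reaction) and the time
# (minutes) at which real-time RPA fluorescence crossed a fixed threshold.
copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
threshold_time = np.array([14.5, 12.1, 9.8, 7.6, 5.4])

# Linear standard curve: threshold time vs. log10(concentration).
slope, intercept = np.polyfit(np.log10(copies), threshold_time, 1)

def estimate_copies(t_threshold):
    # Invert the standard curve to predict the concentration of an unknown sample.
    return 10 ** ((t_threshold - intercept) / slope)

print(f"unknown with threshold time 8.5 min ~ {estimate_copies(8.5):.2e} copies/reaction")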
Genetics, Issue 97, recombinase polymerase amplification, isothermal amplification, quantitative, diagnostic, HIV-1, viral load
The Double-H Maze: A Robust Behavioral Test for Learning and Memory in Rodents
Authors: Robert D. Kirch, Richard C. Pinnell, Ulrich G. Hofmann, Jean-Christophe Cassel.
Institutions: University Hospital Freiburg, UMR 7364 Université de Strasbourg, CNRS, Neuropôle de Strasbourg.
Spatial cognition research in rodents typically employs the use of maze tasks, whose attributes vary from one maze to the next. These tasks vary by their behavioral flexibility and required memory duration, the number of goals and pathways, and also the overall task complexity. A confounding feature in many of these tasks is the lack of control over the strategy employed by the rodents to reach the goal, e.g., allocentric (declarative-like) or egocentric (procedural) based strategies. The double-H maze is a novel water-escape memory task that addresses this issue, by allowing the experimenter to direct the type of strategy learned during the training period. The double-H maze is a transparent device, which consists of a central alleyway with three arms protruding on both sides, along with an escape platform submerged at the extremity of one of these arms. Rats can be trained using an allocentric strategy by alternating the start position in the maze in an unpredictable manner (see protocol 1; §4.7), thus requiring them to learn the location of the platform based on the available allothetic cues. Alternatively, an egocentric learning strategy (protocol 2; §4.8) can be employed by releasing the rats from the same position during each trial, until they learn the procedural pattern required to reach the goal. This task has been proven to allow for the formation of stable memory traces. Memory can be probed following the training period in a misleading probe trial, in which the starting position for the rats alternates. Following an egocentric learning paradigm, rats typically resort to an allocentric-based strategy, but only when their initial view on the extra-maze cues differs markedly from their original position. This task is ideally suited to explore the effects of drugs/perturbations on allocentric/egocentric memory performance, as well as the interactions between these two memory systems.
Behavior, Issue 101, Double-H maze, spatial memory, procedural memory, consolidation, allocentric, egocentric, habits, rodents, video tracking system
Novel 3D/VR Interactive Environment for MD Simulations, Visualization and Analysis
Authors: Benjamin N. Doblack, Tim Allis, Lilian P. Dávila.
Institutions: University of California Merced.
The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced.
Physics, Issue 94, Computational systems, visualization and immersive environments, interactive learning, graphical processing unit accelerated simulations, molecular dynamics simulations, nanostructures.
Demonstrating a Multi-drug Resistant Mycobacterium tuberculosis Amplification Microarray
Authors: Yvonne Linger, Alexander Kukhtin, Julia Golova, Alexander Perov, Peter Qu, Christopher Knickerbocker, Christopher G. Cooney, Darrell P. Chandler.
Institutions: Akonni Biosystems, Inc..
Simplifying microarray workflow is a necessary first step for creating MDR-TB microarray-based diagnostics that can be routinely used in lower-resource environments. An amplification microarray combines asymmetric PCR amplification, target size selection, target labeling, and microarray hybridization within a single solution and into a single microfluidic chamber. A batch processing method is demonstrated with a 9-plex asymmetric master mix and low-density gel element microarray for genotyping multi-drug resistant Mycobacterium tuberculosis (MDR-TB). The protocol described here can be completed in 6 hr and provide correct genotyping with at least 1,000 cell equivalents of genomic DNA. Incorporating on-chip wash steps is feasible, which will result in an entirely closed amplicon method and system. The extent of multiplexing with an amplification microarray is ultimately constrained by the number of primer pairs that can be combined into a single master mix and still achieve desired sensitivity and specificity performance metrics, rather than the number of probes that are immobilized on the array. Likewise, the total analysis time can be shortened or lengthened depending on the specific intended use, research question, and desired limits of detection. Nevertheless, the general approach significantly streamlines microarray workflow for the end user by reducing the number of manually intensive and time-consuming processing steps, and provides a simplified biochemical and microfluidic path for translating microarray-based diagnostics into routine clinical practice.
Immunology, Issue 86, MDR-TB, gel element microarray, closed amplicon, drug resistance, rifampin, isoniazid, streptomycin, ethambutol
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
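The article relies on dedicated DoE software for design generation and augmentation; purely as an illustration of the underlying idea, the sketch below builds a two-level full factorial for three assumed factors and fits a main-effects linear model to synthetic responses. Factor names, levels, and responses are invented.

import itertools
import numpy as np

# Three assumed factors at coded levels -1/+1: incubation temperature,
# plant age at infiltration, and incubation time.
factors = ["temperature", "plant_age", "incubation_time"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))), dtype=float)

# Synthetic responses (e.g., DsRed fluorescence, arbitrary units) for the 8 runs.
response = np.array([10.2, 12.8, 9.1, 11.5, 14.0, 17.1, 12.9, 15.8])

# Main-effects model: y = b0 + b1*x1 + b2*x2 + b3*x3 (interactions omitted for brevity).
X = np.column_stack([np.ones(len(design)), design])
coeffs, *_ = np.linalg.lstsq(X, response, rcond=None)

for name, effect in zip(["intercept"] + factors, coeffs):
    print(f"{name:16s} {effect:+.2f}")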
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Particle Agglutination Method for Poliovirus Identification
Authors: Minetaro Arita, Souji Masujima, Takaji Wakita, Hiroyuki Shimizu.
Institutions: National Institute of Infectious Diseases, Fujirebio Inc..
In the Global Polio Eradication Initiative, laboratory diagnosis plays a critical role by isolating and identifying poliovirus (PV) from the stool samples of acute flaccid paralysis (AFP) cases. In the World Health Organization (WHO) Global Polio Laboratory Network, PV isolation and identification are currently performed using a cell culture system and real-time RT-PCR, respectively. In the post-eradication era of PV, simple and rapid identification procedures would be helpful for rapid confirmation of polio cases at the national laboratories. In the present study, we show the procedure of a novel particle agglutination (PA) assay developed for PV identification. This PA assay exploits the interaction between the PV receptor (PVR) molecule and the virion, which is specific and has uniform affinity for all serotypes of PV. The procedure is simple (a one-step reaction in plates) and rapid (results can be obtained within 2 h of reaction), and the result is read visually (agglutination of gelatin particles).
Immunology, Issue 50, Poliovirus, identification, particle agglutination, virus receptor
Development of Obliterative Bronchiolitis in a Murine Model of Orthotopic Lung Transplantation
Authors: Hidemi Suzuki, Lin Fan, David S. Wilkes.
Institutions: Indiana University School of Medicine, Indiana University School of Medicine.
Orthotopic lung transplantation in rats was first reported by Asimacopoulos and colleagues in 1971 [1]. Currently, this method is well accepted and standardized not only for the study of allo-rejection but also between syngeneic strains for examining mechanisms of ischemia-reperfusion injury after lung transplantation. Although the application of the rat and other large animal models [2] contributed significantly to these studies, the scope of those investigations is limited by the scarcity of knockout and transgenic rats. Because there are no effective therapies for obliterative bronchiolitis, the leading cause of death in lung transplant patients, there has been an intensive search for pre-clinical models that replicate obliterative bronchiolitis. The tracheal allograft model is the most widely used and may reproduce some of the histopathologic features of obliterative bronchiolitis [3]. However, the lack of an intact vasculature, the absence of a connection to the recipient's conducting airways, and incomplete pathologic features of obliterative bronchiolitis limit the utility of this model [4]. Unlike transplantation of other solid organs, vascularized mouse lung transplantation was reported only recently, by Okazaki and colleagues in 2007 [5]. Applying the basic principles of the rat lung transplant, our lab initiated the obliterative bronchiolitis model using minor-histoincompatibility-antigen murine orthotopic single left-lung transplants, which allows further study of obliterative bronchiolitis immunopathogenesis [6].
Medicine, Issue 65, Immunology, Microbiology, Physiology, lung, transplantation, mouse, obliterative bronchiolitis, vascularized lung transplants
Direct Pressure Monitoring Accurately Predicts Pulmonary Vein Occlusion During Cryoballoon Ablation
Authors: Ioanna Kosmidou, Shannon Wooden, Brian Jones, Thomas Deering, Andrew Wickliffe, Dan Dan.
Institutions: Piedmont Heart Institute, Medtronic Inc..
Cryoballoon ablation (CBA) is an established therapy for atrial fibrillation (AF). Pulmonary vein (PV) occlusion is essential for achieving antral contact and PV isolation and is typically assessed by contrast injection. We present a novel method of direct pressure monitoring for assessment of PV occlusion. Transcatheter pressure is monitored during balloon advancement to the PV antrum. Pressure is recorded via a single pressure transducer connected to the inner lumen of the cryoballoon. Pressure curve characteristics are used to assess occlusion in conjunction with fluoroscopic or intracardiac echocardiography (ICE) guidance. PV occlusion is confirmed when loss of typical left atrial (LA) pressure waveform is observed with recordings of PA pressure characteristics (no A wave and rapid V wave upstroke). Complete pulmonary vein occlusion as assessed with this technique has been confirmed with concurrent contrast utilization during the initial testing of the technique and has been shown to be highly accurate and readily reproducible. We evaluated the efficacy of this novel technique in 35 patients. A total of 128 veins were assessed for occlusion with the cryoballoon utilizing the pressure monitoring technique; occlusive pressure was demonstrated in 113 veins with resultant successful pulmonary vein isolation in 111 veins (98.2%). Occlusion was confirmed with subsequent contrast injection during the initial ten procedures, after which contrast utilization was rapidly reduced or eliminated given the highly accurate identification of occlusive pressure waveform with limited initial training. Verification of PV occlusive pressure during CBA is a novel approach to assessing effective PV occlusion and it accurately predicts electrical isolation. Utilization of this method results in significant decrease in fluoroscopy time and volume of contrast.
Medicine, Issue 72, Anatomy, Physiology, Cardiology, Biomedical Engineering, Surgery, Cardiovascular System, Cardiovascular Diseases, Surgical Procedures, Operative, Investigative Techniques, Atrial fibrillation, Cryoballoon Ablation, Pulmonary Vein Occlusion, Pulmonary Vein Isolation, electrophysiology, catheterization, heart, vein, clinical, surgical device, surgical techniques
Concurrent Quantitative Conductivity and Mechanical Properties Measurements of Organic Photovoltaic Materials using AFM
Authors: Maxim P. Nikiforov, Seth B. Darling.
Institutions: Argonne National Laboratory, University of Chicago.
Organic photovoltaic (OPV) materials are inherently inhomogeneous at the nanometer scale. Nanoscale inhomogeneity of OPV materials affects performance of photovoltaic devices. Thus, understanding of spatial variations in composition as well as electrical properties of OPV materials is of paramount importance for moving PV technology forward.1,2 In this paper, we describe a protocol for quantitative measurements of electrical and mechanical properties of OPV materials with sub-100 nm resolution. Currently, materials properties measurements performed using commercially available AFM-based techniques (PeakForce, conductive AFM) generally provide only qualitative information. The values for resistance as well as Young's modulus measured using our method on the prototypical ITO/PEDOT:PSS/P3HT:PC61BM system correspond well with literature data. The P3HT:PC61BM blend separates onto PC61BM-rich and P3HT-rich domains. Mechanical properties of PC61BM-rich and P3HT-rich domains are different, which allows for domain attribution on the surface of the film. Importantly, combining mechanical and electrical data allows for correlation of the domain structure on the surface of the film with electrical properties variation measured through the thickness of the film.
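For background on how a Young's modulus is extracted from AFM force curves, a commonly used contact-mechanics relation for a spherical tip (the DMT model, given here as general context rather than the authors' exact analysis) is

F - F_{\mathrm{adh}} = \frac{4}{3} E^{*} \sqrt{R}\, \delta^{3/2}, \qquad \frac{1}{E^{*}} = \frac{1 - \nu_s^2}{E_s} + \frac{1 - \nu_t^2}{E_t},

where R is the tip radius, δ the indentation depth, F_adh the adhesion force, and E_s, ν_s and E_t, ν_t the Young's moduli and Poisson ratios of the sample and tip, respectively.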
Materials Science, Issue 71, Nanotechnology, Mechanical Engineering, Electrical Engineering, Computer Science, Physics, electrical transport properties in solids, condensed matter physics, thin films (theory, deposition and growth), conductivity (solid state), AFM, atomic force microscopy, electrical properties, mechanical properties, organic photovoltaics, microengineering, photovoltaics
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM (http://www.proteinwisdom.org), a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is an optimization-based sequence selection stage that aims to improve stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
The Generation of Higher-order Laguerre-Gauss Optical Beams for High-precision Interferometry
Authors: Ludovico Carbone, Paul Fulda, Charlotte Bond, Frank Brueckner, Daniel Brown, Mengyao Wang, Deepali Lodhia, Rebecca Palmer, Andreas Freise.
Institutions: University of Birmingham.
Thermal noise in high-reflectivity mirrors is a major impediment for several types of high-precision interferometric experiments that aim to reach the standard quantum limit or to cool mechanical systems to their quantum ground state. This is, for example, the case for future gravitational wave observatories, whose sensitivity to gravitational wave signals is expected to be limited, in the most sensitive frequency band, by atomic vibrations of their mirror masses. One promising approach being pursued to overcome this limitation is to employ higher-order Laguerre-Gauss (LG) optical beams in place of the conventionally used fundamental mode. Owing to their more homogeneous light intensity distribution, these beams average more effectively over the thermally driven fluctuations of the mirror surface, which in turn reduces the uncertainty in the mirror position sensed by the laser light. We demonstrate a promising method to generate higher-order LG beams by shaping a fundamental Gaussian beam with the help of diffractive optical elements. We show that, with conventional sensing and control techniques known for stabilizing fundamental laser beams, higher-order LG modes can be purified and stabilized just as well, at a comparably high level. A set of diagnostic tools allows us to control and tailor the properties of generated LG beams. This enabled us to produce an LG beam with the highest purity reported to date. The demonstrated compatibility of higher-order LG modes with standard interferometry techniques and with the use of standard spherical optics makes them an ideal candidate for application in a future generation of high-precision interferometry.
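For reference, the transverse amplitude of a Laguerre-Gauss mode with radial index p and azimuthal index l at the beam waist (omitting normalization, wavefront curvature and the Gouy phase) is

u_{p,l}(r,\phi) \propto \left( \frac{\sqrt{2}\, r}{w} \right)^{|l|} L_p^{|l|}\!\left( \frac{2 r^2}{w^2} \right) \exp\!\left( -\frac{r^2}{w^2} \right) e^{i l \phi},

where w is the beam radius and L_p^{|l|} an associated Laguerre polynomial; the broader, more uniform intensity distribution of the higher-order modes is what improves the averaging over thermally driven mirror surface fluctuations.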
Physics, Issue 78, Optics, Astronomy, Astrophysics, Gravitational waves, Laser interferometry, Metrology, Thermal noise, Laguerre-Gauss modes, interferometry
Test Samples for Optimizing STORM Super-Resolution Microscopy
Authors: Daniel J. Metcalf, Rebecca Edwards, Neelam Kumarswami, Alex E. Knight.
Institutions: National Physical Laboratory.
STORM is a recently developed super-resolution microscopy technique with up to 10 times better resolution than standard fluorescence microscopy techniques. However, as the image is acquired very differently from a conventional image, by building it up molecule-by-molecule, there are some significant challenges for users in trying to optimize their image acquisition. In order to aid this process and gain more insight into how STORM works, we present the preparation of 3 test samples and the methodology for acquiring and processing STORM super-resolution images with typical resolutions of 30-50 nm. By combining the test samples with the freely available rainSTORM processing software, it is possible to obtain a great deal of information about image quality and resolution. Using these metrics, it is then possible to optimize the imaging procedure, from the optics to sample preparation, dye choice, buffer conditions, and image acquisition settings. We also show examples of some common problems that result in poor image quality, such as lateral drift, where the sample moves during image acquisition, and density-related problems that result in the 'mislocalization' phenomenon.
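A useful rule of thumb when judging STORM image quality (a standard approximation, not specific to the rainSTORM software) is that the localization precision for a molecule fitted from N detected photons with a point-spread-function width s is roughly

\sigma_{\mathrm{loc}} \approx \frac{s}{\sqrt{N}},

with additional correction terms for finite pixel size and camera background.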
Molecular Biology, Issue 79, Genetics, Bioengineering, Biomedical Engineering, Biophysics, Basic Protocols, HeLa Cells, Actin Cytoskeleton, Coated Vesicles, Receptor, Epidermal Growth Factor, Actins, Fluorescence, Endocytosis, Microscopy, STORM, super-resolution microscopy, nanoscopy, cell biology, fluorescence microscopy, test samples, resolution, actin filaments, fiducial markers, epidermal growth factor, cell, imaging
Oscillation and Reaction Board Techniques for Estimating Inertial Properties of a Below-knee Prosthesis
Authors: Jeremy D. Smith, Abbie E. Ferris, Gary D. Heise, Richard N. Hinrichs, Philip E. Martin.
Institutions: University of Northern Colorado, Arizona State University, Iowa State University.
The purpose of this study was two-fold: 1) demonstrate a technique that can be used to directly estimate the inertial properties of a below-knee prosthesis, and 2) contrast the effects of the proposed technique and that of using intact limb inertial properties on joint kinetic estimates during walking in unilateral, transtibial amputees. An oscillation and reaction board system was validated and shown to be reliable when measuring inertial properties of known geometrical solids. When direct measurements of inertial properties of the prosthesis were used in inverse dynamics modeling of the lower extremity compared with inertial estimates based on an intact shank and foot, joint kinetics at the hip and knee were significantly lower during the swing phase of walking. Differences in joint kinetics during stance, however, were smaller than those observed during swing. Therefore, researchers focusing on the swing phase of walking should consider the impact of prosthesis inertia property estimates on study outcomes. For stance, either one of the two inertial models investigated in our study would likely lead to similar outcomes with an inverse dynamics assessment.
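The oscillation part of the technique rests on the standard physical-pendulum relations (stated here in their simplest small-angle form; the article details the apparatus-specific procedure): if the prosthesis of mass m swings about a fixed pivot located a distance d from its center of mass with period T, then

I_{\mathrm{pivot}} = \frac{m g d\, T^2}{4 \pi^2}, \qquad I_{\mathrm{cm}} = I_{\mathrm{pivot}} - m d^2,

where g is gravitational acceleration; the reaction board supplies the center-of-mass location needed for d.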
Bioengineering, Issue 87, prosthesis inertia, amputee locomotion, below-knee prosthesis, transtibial amputee
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Characterization of Thermal Transport in One-dimensional Solid Materials
Authors: Guoqing Liu, Huan Lin, Xiaoduan Tang, Kevin Bergler, Xinwei Wang.
Institutions: Iowa State University.
The TET (transient electro-thermal) technique is an effective approach developed to measure the thermal diffusivity of solid materials, including conductive, semi-conductive or nonconductive one-dimensional structures. This technique broadens the measurement scope of materials (conductive and nonconductive) and improves the accuracy and stability. If the sample (especially biomaterials, such as human head hair, spider silk, and silkworm silk) is not conductive, it will be coated with a gold layer to make it electronically conductive. The effect of parasitic conduction and radiative losses on the thermal diffusivity can be subtracted during data processing. Then the real thermal conductivity can be calculated with the given value of volume-based specific heat (ρcp), which can be obtained from calibration, noncontact photo-thermal technique or measuring the density and specific heat separately. In this work, human head hair samples are used to show how to set up the experiment, process the experimental data, and subtract the effect of parasitic conduction and radiative losses.
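As noted above, once the thermal diffusivity α has been corrected for parasitic conduction and radiative losses, the thermal conductivity follows directly from the volume-based specific heat obtained by calibration or separate measurement:

k = \rho c_p\, \alpha,

so the accuracy of k is ultimately limited by how well ρc_p is known.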
Physics, Issue 83, thermal transport, thermal diffusivity, thermal conductivity, transient electro-thermal technique, volume-based specific heat, human head hair
Fabrication of High Contrast Gratings for the Spectrum Splitting Dispersive Element in a Concentrated Photovoltaic System
Authors: Yuhan Yao, He Liu, Wei Wu.
Institutions: University of Southern California.
High contrast gratings are designed and fabricated, and their application is proposed in a parallel spectrum-splitting dispersive element that can improve the solar conversion efficiency of a concentrated photovoltaic system. The proposed system would also lower the solar cell cost of a concentrated photovoltaic system by replacing expensive tandem solar cells with cost-effective single-junction solar cells. The structures and the parameters of high contrast gratings for the dispersive elements were numerically optimized. The large-area fabrication of high contrast gratings was experimentally demonstrated using nanoimprint lithography and dry etching. The quality of the grating material and the performance of the fabricated device were both experimentally characterized. By analyzing the measurement results, possible side effects of the fabrication processes are discussed, and several methods with the potential to improve the fabrication processes are proposed, which can help to increase the optical efficiency of the fabricated devices.
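The spectrum-splitting function of the dispersive element follows the usual grating relation between wavelength and diffraction angle (written in one common sign convention; the optimized high-contrast-grating parameters are given in the article itself):

d \left( \sin\theta_i + \sin\theta_m \right) = m \lambda,

where d is the grating period, θ_i and θ_m the incidence and m-th-order diffraction angles, and λ the wavelength, so different wavelengths are steered to different angles and can be routed to single-junction cells with matched absorption ranges.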
Engineering, Issue 101, Parallel spectrum splitting, dispersive element, high contrast grating, concentrated photovoltaic system, nanoimprint lithography, reactive ion etching

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, there happens not to be any content in our video library that is relevant to the topic of a given abstract. In these cases, our algorithms are trying their best to display videos with relevant content, which can sometimes result in matched videos with only a slight relation.