JoVE Visualize
Pubmed Article
Why are product prices in online markets not converging?
PUBLISHED: 01-01-2013
Why are product prices in online markets dispersed in spite of very small search costs? To address this question, we construct a unique dataset from a Japanese price comparison site, which records price quotes offered by e-retailers as well as customers' clicks on those quotes, which occur when customers proceed to purchase the product. The novelty of our approach is that we seek to extract useful information on the source of price dispersion from the shape of price distributions rather than focusing merely on the standard deviation or the coefficient of variation of prices, as previous studies have done. We find that the distribution of prices retailers quote for a particular product at a particular point in time (divided by the lowest price) follows an exponential distribution, showing the presence of substantial price dispersion. For example, 20 percent of all retailers quote prices that are more than 50 percent higher than the lowest price. Next, comparing the probability that customers click on a retailer with a particular price rank and the probability that retailers post prices at a particular rank, we show that both decline exponentially with price rank and that the exponents associated with the two probabilities are quite close. This suggests that the reason why some retailers set prices substantially higher than the lowest price is that they know some customers will choose them even at that high price. Based on these findings, we hypothesize that price dispersion in online markets stems from heterogeneity in customers' preferences over retailers; that is, customers choose a set of candidate retailers based on their preferences, which are heterogeneous across customers, and then pick a particular retailer among the candidates based on the price ranking.
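A minimal sketch of the shape-based analysis described above, assuming hypothetical price quotes (not data from the paper): if markups over the lowest price are exponentially distributed, the rate parameter can be estimated directly from their mean.

import numpy as np

# Hypothetical price quotes for one product at one point in time.
prices = np.array([9800, 9850, 9980, 10200, 10500, 11200, 12400, 14800, 15600])

# Normalize by the lowest quote, as in the paper, and look at the markup over it.
markup = prices / prices.min() - 1.0

# If markups follow an exponential distribution, the maximum-likelihood
# estimate of the rate parameter is simply 1 / mean(markup).
rate = 1.0 / markup.mean()
print(f"estimated exponential rate: {rate:.2f}")

# Share of retailers quoting more than 50 percent above the lowest price.
print(f"share quoting >50% above the minimum: {(markup > 0.5).mean():.2%}")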
Related JoVE Video
Authors: Thomas Bodmer, Angelika Ströhle.
Published: 04-09-2012
Tuberculosis (TB) due to Mycobacterium tuberculosis (MTB) remains a major public health issue: the infection affects up to one third of the world population1, and almost two million people are killed by TB each year.2 Universal access to high-quality, patient-centered treatment for all TB patients is emphasized by WHO's Stop TB Strategy.3 The rapid detection of MTB in respiratory specimens and drug therapy based on reliable drug resistance testing results are prerequisites for the successful implementation of this strategy. However, in many areas of the world, TB diagnosis still relies on insensitive, poorly standardized sputum microscopy methods. Ineffective TB detection and the emergence and transmission of drug-resistant MTB strains increasingly jeopardize global TB control activities.2 Effective diagnosis of pulmonary TB requires the availability - on a global scale - of standardized, easy-to-use, and robust diagnostic tools that allow the direct detection of both the MTB complex and resistance to key antibiotics, such as rifampicin (RIF). The latter result can serve as a marker for multidrug-resistant MTB (MDR TB), as RIF resistance has been reported in >95% of MDR-TB isolates.4,5 The rapid availability of reliable test results is likely to translate directly into sound patient management decisions that, ultimately, will cure the individual patient and break the chain of TB transmission in the community.2 Cepheid's (Sunnyvale, CA, U.S.A.) Xpert MTB/RIF assay6,7 meets the demands outlined above in a remarkable manner. It is a nucleic acid amplification test for 1) the detection of MTB complex DNA in sputum or concentrated sputum sediments; and 2) the detection of RIF resistance-associated mutations of the rpoB gene.8 It is designed for use with Cepheid's GeneXpert Dx System, which integrates and automates sample processing, nucleic acid amplification, and detection of the target sequences using real-time PCR and reverse transcriptase PCR. The system consists of an instrument, personal computer, barcode scanner, and preloaded software for running tests and viewing the results.9 It employs single-use disposable Xpert MTB/RIF cartridges that hold the PCR reagents and host the PCR process. Because the cartridges are self-contained, cross-contamination between samples is eliminated.6 Current nucleic acid amplification methods used to detect MTB are complex, labor-intensive, and technically demanding. The Xpert MTB/RIF assay has the potential to bring standardized, sensitive, and very specific diagnostic testing for both TB and drug resistance to universal-access point-of-care settings3, provided that such settings can afford it. In order to facilitate access, the Foundation for Innovative New Diagnostics (FIND) has negotiated significant price reductions. Current FIND-negotiated prices, along with the list of countries eligible for the discounts, are available on the web.10
26 Related JoVE Articles!
Contrast Ultrasound Targeted Treatment of Gliomas in Mice via Drug-Bearing Nanoparticle Delivery and Microvascular Ablation
Authors: Caitlin W. Burke, Richard J. Price.
Institutions: University of Virginia.
We are developing minimally invasive, contrast agent microbubble-based therapeutic approaches in which the permeabilization and/or ablation of the microvasculature are controlled by varying ultrasound pulsing parameters. Specifically, we are testing whether such approaches may be used to treat malignant brain tumors through drug delivery and microvascular ablation. Preliminary studies have been performed to determine whether targeted drug-bearing nanoparticle delivery can be facilitated by the ultrasound-mediated destruction of "composite" delivery agents comprised of 100 nm poly(lactide-co-glycolide) (PLAGA) nanoparticles adhered to albumin-shelled microbubbles. We denote these agents as microbubble-nanoparticle composite agents (MNCAs). When targeted to subcutaneous C6 gliomas with ultrasound, we observed an immediate 4.6-fold increase in nanoparticle delivery in MNCA-treated tumors over tumors treated with microbubbles co-administered with nanoparticles, and an 8.5-fold increase over non-treated tumors. Furthermore, in many cancer applications, we believe it may be desirable to perform targeted drug delivery in conjunction with ablation of the tumor microcirculation, which will lead to tumor hypoxia and apoptosis. To this end, we have tested the efficacy of non-thermal, cavitation-induced microvascular ablation, showing that this approach elicits tumor perfusion reduction, apoptosis, significant growth inhibition, and necrosis. Taken together, these results indicate that our ultrasound-targeted approach has the potential to increase therapeutic efficiency by creating tumor necrosis through microvascular ablation and/or simultaneously enhancing the drug payload in gliomas.
Medicine, Issue 46, microbubbles, targeted drug delivery, nanoparticles, ultrasound
Chicken Embryo Spinal Cord Slice Culture Protocol
Authors: Kristina C. Tubby, Dee Norval, Stephen R. Price.
Institutions: University College London.
Slice cultures can facilitate the manipulation of embryo development both pharmacologically and through gene manipulations. In this reduced system, potential lethal side effects due to systemic drug applications can be overcome. However, culture conditions must ensure that normal development proceeds within the reduced environment of the slice. We have focused on the development of the spinal cord, particularly that of spinal motor neurons. We systematically varied culture conditions of chicken embryo slices from the point at which most spinal motor neurons had been born. We assayed the number and type of motor neurons that survived during the culture period and the position of those motor neurons compared to that in vivo. We found that serum type and neurotrophic factors were required during the culture period and were able to keep motor neurons alive for at least 24 hr and allow those motor neurons to migrate to appropriate positions in the spinal cord. We present these culture conditions and the methodology of preparing the embryo slice cultures using eviscerated chicken embryos embedded in agarose and sliced using a vibratome.
Developmental Biology, Issue 73, Neurobiology, Neuroscience, Medicine, Cellular Biology, Molecular Biology, Anatomy, Physiology, Biomedical Engineering, Genetics, Surgery, Cells, Animal Structures, Embryonic Structures, Nervous System, spinal cord, embryo, development, Slice-Culture, motor neuron, neurons, immunostaining, chick, imaging, animal model
Murine Corneal Transplantation: A Model to Study the Most Common Form of Solid Organ Transplantation
Authors: Xiao-Tang Yin, Deena A. Tajfirouz, Patrick M. Stuart.
Institutions: Saint Louis University.
Corneal transplantation is the most common form of organ transplantation in the United States, with between 45,000 and 55,000 procedures performed each year. While several animal models exist for this procedure, mice are the species most commonly used. The reasons for using mice are the relatively low cost of working with this species, the existence of many genetically defined strains that allow for the study of immune responses, and the availability of an extensive array of reagents that can be used to further define responses in this species. This model has been used to define factors in the cornea that are responsible for the relative immune privilege status of this tissue, which enables corneal allografts to survive acute rejection in the absence of immunosuppressive therapy. It has also been used to define those factors that are most important in rejection of such allografts. Consequently, much of what we know concerning mechanisms of both corneal allograft acceptance and rejection is due to studies using a murine model of corneal transplantation. In addition to describing a model for acute corneal allograft rejection, we also present for the first time a model of late-term corneal allograft rejection.
Immunology, Issue 93, Transplantation, Allograft Responses, Immune Privilege, Cornea, Inflammatory cells, T cells, Macrophages
Molecular Entanglement and Electrospinnability of Biopolymers
Authors: Lingyan Kong, Gregory R. Ziegler.
Institutions: Pennsylvania State University.
Electrospinning is a fascinating technique to fabricate micro- to nano-scale fibers from a wide variety of materials. For biopolymers, molecular entanglement of the constituent polymers in the spinning dope was found to be an essential prerequisite for successful electrospinning. Rheology is a powerful tool to probe the molecular conformation and interaction of biopolymers. In this report, we demonstrate the protocol for utilizing rheology to evaluate the electrospinnability of two biopolymers, starch and pullulan, from their dimethyl sulfoxide (DMSO)/water dispersions. Well-formed starch and pullulan fibers with average diameters in the submicron to micron range were obtained. Electrospinnability was evaluated by visual and microscopic observation of the fibers formed. By correlating the rheological properties of the dispersions to their electrospinnability, we demonstrate that molecular conformation, molecular entanglement, and shear viscosity all affect electrospinning. Rheology is not only useful in solvent system selection and process optimization, but also in understanding the mechanism of fiber formation on a molecular level.
Bioengineering, Issue 91, electrospinning, rheology, molecular entanglement, fiber, nanofiber, biopolymer, polysaccharides, starch, pullulan
Real-Time DC-dynamic Biasing Method for Switching Time Improvement in Severely Underdamped Fringing-field Electrostatic MEMS Actuators
Authors: Joshua Small, Adam Fruehling, Anurag Garg, Xiaoguang Liu, Dimitrios Peroulis.
Institutions: University of California, Davis, Texas Instruments, Purdue University.
Mechanically underdamped electrostatic fringing-field MEMS actuators are well known for their fast switching operation in response to a unit step input bias voltage. However, the tradeoff for the improved switching performance is a relatively long settling time to reach each gap height in response to various applied voltages. Transient applied bias waveforms are employed to facilitate reduced switching times for electrostatic fringing-field MEMS actuators with high mechanical quality factors. Removing the underlying substrate of the fringing-field actuator creates the low mechanical damping environment necessary to effectively test the concept. The removal of the underlying substrate also substantially improves the reliability of the device with regard to failure due to stiction. Although DC-dynamic biasing is useful in improving settling time, the required slew rates for typical MEMS devices may place aggressive requirements on the charge pumps for fully integrated on-chip designs. Additionally, there may be challenges integrating the substrate removal step into back-end-of-line commercial CMOS processing. Experimental validation of fabricated actuators demonstrates a 50x improvement in switching time compared to conventional step biasing. The experimental results are in good agreement with theoretical calculations.
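The long settling time of a severely underdamped actuator can be illustrated with a generic second-order model responding to a unit step input. The natural frequency and quality factor below are illustrative assumptions, not parameters taken from the article.

import numpy as np

# Illustrative second-order model of an underdamped actuator:
#   x'' + (w0/Q) x' + w0^2 x = w0^2 * u(t),  with u(t) a unit step.
w0 = 2 * np.pi * 50e3   # 50 kHz natural frequency (illustrative)
Q = 100.0               # high mechanical quality factor -> severe ringing
dt = 1.0 / (w0 * 100)   # time step much shorter than the oscillation period
t = np.arange(0.0, 5e-3, dt)

x, v = 0.0, 0.0
trace = np.empty_like(t)
for i in range(t.size):
    a = w0**2 * (1.0 - x) - (w0 / Q) * v   # acceleration for u(t) = 1
    v += a * dt                            # semi-implicit Euler integration
    x += v * dt
    trace[i] = x

# Settling time: last time the response leaves a +/-2% band around the target.
outside = np.abs(trace - 1.0) > 0.02
settle = t[outside][-1] if outside.any() else 0.0
print(f"approximate 2% settling time: {settle * 1e3:.2f} ms")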
Physics, Issue 90, microelectromechanical systems, actuators, switching time, settling time, electrostatic devices, micromachining, thin film devices
Mouse Genome Engineering Using Designer Nucleases
Authors: Mario Hermann, Tomas Cermak, Daniel F. Voytas, Pawel Pelczar.
Institutions: University of Zurich, University of Minnesota.
Transgenic mice carrying site-specific genome modifications (knockout, knock-in) are of vital importance for dissecting complex biological systems as well as for modeling human diseases and testing therapeutic strategies. Recent advances in the use of designer nucleases such as zinc finger nucleases (ZFNs), transcription activator-like effector nucleases (TALENs), and the clustered regularly interspaced short palindromic repeats (CRISPR)/CRISPR-associated (Cas) 9 system for site-specific genome engineering open up the possibility of performing rapid targeted genome modification in virtually any laboratory species without the need to rely on embryonic stem (ES) cell technology. A genome editing experiment typically starts with identification of designer nuclease target sites within a gene of interest, followed by construction of custom DNA-binding domains to direct nuclease activity to the investigator-defined genomic locus. Designer nuclease plasmids are transcribed in vitro to generate mRNA for microinjection into fertilized mouse oocytes. Here, we provide a protocol for achieving targeted genome modification by direct injection of TALEN mRNA into fertilized mouse oocytes.
Genetics, Issue 86, Oocyte microinjection, Designer nucleases, ZFN, TALEN, Genome Engineering
Tracking Microbial Contamination in Retail Environments Using Fluorescent Powder - A Retail Delicatessen Environment Example
Authors: Sujata A. Sirsat, Kawon Kim, Kristen E. Gibson, Phillip G. Crandall, Steven C. Ricke, Jack A. Neal.
Institutions: University of Houston, University of Arkansas.
Cross contamination of foodborne pathogens in the retail environment is a significant public health issue contributing to an increased risk for foodborne illness. Ready-to-eat (RTE) processed foods such as deli meats, cheese, and in some cases fresh produce, have been involved in foodborne disease outbreaks due to contamination with pathogens such as Listeria monocytogenes. With respect to L. monocytogenes, deli slicers are often the main source of cross contamination. The goal of this study was to use a fluorescent compound to simulate bacterial contamination and track this contamination in a retail setting. A mock deli kitchen was designed to simulate the retail environment. Deli meat was inoculated with the fluorescent compound and volunteers were recruited to complete a set of tasks similar to those expected of a food retail employee. The volunteers were instructed to slice, package, and store the meat in a deli refrigerator. The potential cross contamination was tracked in the mock retail environment by swabbing specific areas and measuring the optical density of the swabbed area with a spectrophotometer. The results indicated that the refrigerator (i.e. deli case) grip and various areas on the slicer had the highest risk for cross contamination. The results of this study may be used to develop more focused training material for retail employees. In addition, similar methodologies could also be used to track microbial contamination in food production environments (e.g. small farms), hospitals, nursing homes, cruise ships, and hotels.
Environmental Sciences, Issue 85, cross contamination, retail deli, fluorescent powder, Listeria monocytogenes, foodborne pathogens
DNA Fingerprinting of Mycobacterium leprae Strains Using Variable Number Tandem Repeat (VNTR) - Fragment Length Analysis (FLA)
Authors: Ronald W. Jensen, Jason Rivest, Wei Li, Varalakshmi Vissa.
Institutions: Colorado State University.
The study of the transmission of leprosy is particularly difficult since the causative agent, Mycobacterium leprae, cannot be cultured in the laboratory. The only sources of the bacteria are leprosy patients, and experimentally infected armadillos and nude mice. Thus, many of the methods used in modern epidemiology are not available for the study of leprosy. Despite an extensive global drug treatment program for leprosy implemented by the WHO1, leprosy remains endemic in many countries with approximately 250,000 new cases each year.2 The entire M. leprae genome has been mapped3,4 and many loci have been identified that have repeated segments of 2 or more base pairs (called micro- and minisatellites).5 Clinical strains of M. leprae may vary in the number of tandem repeated segments (short tandem repeats, STR) at many of these loci.5,6,7 Variable number tandem repeat (VNTR)5 analysis has been used to distinguish different strains of the leprosy bacilli. Some of the loci appear to be more stable than others, showing less variation in repeat numbers, while others seem to change more rapidly, sometimes in the same patient. While the variability of certain VNTRs has brought up questions regarding their suitability for strain typing7,8,9, the emerging data suggest that analyzing multiple loci, which are diverse in their stability, can be used as a valuable epidemiological tool. Multiple locus VNTR analysis (MLVA)10 has been used to study leprosy evolution and transmission in several countries including China11,12, Malawi8, the Philippines10,13, and Brazil14. MLVA involves multiple steps. First, bacterial DNA is extracted along with host tissue DNA from clinical biopsies or slit skin smears (SSS).10 The desired loci are then amplified from the extracted DNA via polymerase chain reaction (PCR). Fluorescently-labeled primers for 4-5 different loci are used per reaction, with 18 loci being amplified in a total of four reactions.10 The PCR products may be subjected to agarose gel electrophoresis to verify the presence of the desired DNA segments, and then submitted for fluorescent fragment length analysis (FLA) using capillary electrophoresis. DNA from armadillo passaged bacteria with a known number of repeat copies for each locus is used as a positive control. The FLA chromatograms are then examined using Peak Scanner software and fragment length is converted to number of VNTR copies (allele). Finally, the VNTR haplotypes are analyzed for patterns, and when combined with patient clinical data can be used to track distribution of strain types.
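The final conversion step mentioned above, from FLA fragment length to VNTR copy number, reduces to simple arithmetic once the flanking (non-repeat) length and the repeat unit size of a locus are known. The sketch below uses a hypothetical locus, not actual M. leprae locus values.

def fragment_length_to_copies(fragment_bp, flanking_bp, repeat_unit_bp):
    """Convert an observed PCR fragment length into a VNTR copy number.

    fragment_bp    -- fragment size from capillary electrophoresis (FLA)
    flanking_bp    -- combined length of the primer-binding/flanking sequence
    repeat_unit_bp -- size of one tandem-repeat unit at this locus
    """
    copies = (fragment_bp - flanking_bp) / repeat_unit_bp
    return round(copies)

# Hypothetical locus: 21-bp repeat unit with 84 bp of flanking sequence.
print(fragment_length_to_copies(fragment_bp=210, flanking_bp=84, repeat_unit_bp=21))  # -> 6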
Immunology, Issue 53, Mycobacterium leprae, leprosy, biopsy, STR, VNTR, PCR, fragment length analysis
Polymerase Chain Reaction: Basic Protocol Plus Troubleshooting and Optimization Strategies
Authors: Todd C. Lorenz.
Institutions: University of California, Los Angeles.
In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq DNA polymerase. PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that can complicate the reaction and produce spurious results. When PCR fails it can lead to many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction. By following this PCR guide, students should be able to:
● Set up reactions and thermal cycling conditions for a conventional PCR experiment
● Understand the function of various reaction components and their overall effect on a PCR experiment
● Design and optimize a PCR experiment for any DNA template
● Troubleshoot failed PCR experiments
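Primer melting temperature (Tm) is one of the quantities involved in choosing annealing conditions. As a hedged illustration, the sketch below applies the textbook Wallace (2 + 4) rule; it is a rough approximation for short primers, not necessarily the formula recommended in the article, and the primer sequence is only an example.

def wallace_tm(primer: str) -> int:
    """Estimate the melting temperature (deg C) of a short primer
    with the Wallace rule: Tm = 2*(A+T) + 4*(G+C)."""
    p = primer.upper()
    at = p.count("A") + p.count("T")
    gc = p.count("G") + p.count("C")
    return 2 * at + 4 * gc

primer = "AGCGGATAACAATTTCACACAGG"   # example sequence only
tm = wallace_tm(primer)
# A common starting point for the annealing temperature is a few degrees below Tm.
print(f"Tm ~ {tm} C, try annealing near {tm - 5} C")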
Basic Protocols, Issue 63, PCR, optimization, primer design, melting temperature, Tm, troubleshooting, additives, enhancers, template DNA quantification, thermal cycler, molecular biology, genetics
Engineering Platform and Experimental Protocol for Design and Evaluation of a Neurally-controlled Powered Transfemoral Prosthesis
Authors: Fan Zhang, Ming Liu, Stephen Harper, Michael Lee, He Huang.
Institutions: North Carolina State University & University of North Carolina at Chapel Hill, University of North Carolina School of Medicine, Atlantic Prosthetics & Orthotics, LLC.
To enable intuitive operation of powered artificial legs, an interface between user and prosthesis that can recognize the user's movement intent is desired. A novel neural-machine interface (NMI) based on neuromuscular-mechanical fusion developed in our previous study has demonstrated great potential to accurately identify the intended movement of transfemoral amputees. However, this interface has not yet been integrated with a powered prosthetic leg for true neural control. This study aimed to report (1) a flexible platform to implement and optimize neural control of a powered lower limb prosthesis and (2) an experimental setup and protocol to evaluate neural prosthesis control in patients with lower limb amputations. First, a platform based on a PC and a visual programming environment was developed to implement the prosthesis control algorithms, including the NMI training algorithm, the NMI online testing algorithm, and the intrinsic control algorithm. To demonstrate the function of this platform, in this study the NMI based on neuromuscular-mechanical fusion was hierarchically integrated with intrinsic control of a prototypical transfemoral prosthesis. One patient with a unilateral transfemoral amputation was recruited to evaluate our implemented neural controller while performing activities such as standing, level-ground walking, ramp ascent, and ramp descent continuously in the laboratory. A novel experimental setup and protocol were developed in order to test the new prosthesis control safely and efficiently. The presented proof-of-concept platform and experimental setup and protocol could aid the future development and application of neurally controlled powered artificial legs.
Biomedical Engineering, Issue 89, neural control, powered transfemoral prosthesis, electromyography (EMG), neural-machine interface, experimental setup and protocol
Diffusion Tensor Magnetic Resonance Imaging in the Analysis of Neurodegenerative Diseases
Authors: Hans-Peter Müller, Jan Kassubek.
Institutions: University of Ulm.
Diffusion tensor imaging (DTI) techniques provide information on the microstructural processes of the cerebral white matter (WM) in vivo. The present applications are designed to investigate differences of WM involvement patterns in different brain diseases, especially neurodegenerative disorders, by use of different DTI analyses in comparison with matched controls. DTI data analysis is performed in several complementary ways: voxelwise comparison of regional diffusion-direction-based metrics such as fractional anisotropy (FA), together with fiber tracking (FT) accompanied by tractwise fractional anisotropy statistics (TFAS) at the group level, in order to identify differences in FA along WM structures and to define regional patterns of WM alterations at the group level. Transformation into a stereotaxic standard space is a prerequisite for group studies and requires thorough data processing to preserve directional inter-dependencies. The present applications show optimized technical approaches for this preservation of quantitative and directional information during spatial normalization in data analyses at the group level. On this basis, FT techniques can be applied to group-averaged data in order to quantify metrics information as defined by FT. Additionally, application of DTI methods, i.e. differences in FA-maps after stereotaxic alignment, in a longitudinal analysis at an individual subject basis reveals information about the progression of neurological disorders. Further quality improvement of DTI-based results can be obtained during preprocessing by application of a controlled elimination of gradient directions with high noise levels. In summary, DTI is used to define a distinct WM pathoanatomy of different brain diseases by the combination of whole brain-based and tract-based DTI analysis.
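Fractional anisotropy (FA), the voxelwise metric referred to above, is computed from the three eigenvalues of the diffusion tensor using a standard formula. The sketch below uses illustrative eigenvalues for a strongly anisotropic, white-matter-like voxel.

import numpy as np

def fractional_anisotropy(eigenvalues):
    """Standard FA formula from the three diffusion-tensor eigenvalues."""
    l = np.asarray(eigenvalues, dtype=float)
    md = l.mean()                        # mean diffusivity
    num = np.sqrt(((l - md) ** 2).sum())
    den = np.sqrt((l ** 2).sum())
    return np.sqrt(1.5) * num / den if den > 0 else 0.0

# Illustrative eigenvalues (mm^2/s) for an anisotropic white-matter voxel.
print(f"FA = {fractional_anisotropy([1.7e-3, 0.3e-3, 0.2e-3]):.2f}")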
Medicine, Issue 77, Neuroscience, Neurobiology, Molecular Biology, Biomedical Engineering, Anatomy, Physiology, Neurodegenerative Diseases, nuclear magnetic resonance, NMR, MR, MRI, diffusion tensor imaging, fiber tracking, group level comparison, neurodegenerative diseases, brain, imaging, clinical techniques
Measuring the Subjective Value of Risky and Ambiguous Options using Experimental Economics and Functional MRI Methods
Authors: Ifat Levy, Lior Rosenberg Belmaker, Kirk Manson, Agnieszka Tymula, Paul W. Glimcher.
Institutions: Yale School of Medicine, New York University.
Most of the choices we make have uncertain consequences. In some cases the probabilities for different possible outcomes are precisely known, a condition termed "risky". In other cases, the probabilities cannot be estimated, a condition termed "ambiguous". While most people are averse to both risk and ambiguity1,2, the degree of those aversions varies substantially across individuals, such that the subjective value of the same risky or ambiguous option can be very different for different individuals. We combine functional MRI (fMRI) with an experimental economics-based method3 to assess the neural representation of the subjective values of risky and ambiguous options4. This technique can now be used to study these neural representations in different populations, such as different age groups and different patient populations. In our experiment, subjects make consequential choices between two alternatives while their neural activation is tracked using fMRI. On each trial subjects choose between lotteries that vary in their monetary amount and in either the probability of winning that amount or the ambiguity level associated with winning. Our parametric design allows us to use each individual's choice behavior to estimate their attitudes towards risk and ambiguity, and thus to estimate the subjective values that each option held for them. Another important feature of the design is that the outcome of the chosen lottery is not revealed during the experiment, so that no learning can take place, and thus the ambiguous options remain ambiguous and risk attitudes remain stable. Instead, at the end of the scanning session one or a few trials are randomly selected and played for real money. Since subjects do not know beforehand which trials will be selected, they must treat each and every trial as if it and it alone were the one trial on which they will be paid. This design ensures that we can estimate the true subjective value of each option to each subject. We then look for areas in the brain whose activation is correlated with the subjective value of risky options and for areas whose activation is correlated with the subjective value of ambiguous options.
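One functional form commonly used in this literature, and compatible with the parametric design described above, models the subjective value of a lottery as SV = (p - beta*A/2) * v^alpha, where v is the amount, p the winning probability, A the ambiguity level, alpha the risk attitude, and beta the ambiguity attitude. The sketch below assumes this form with illustrative parameter values; it is not necessarily the exact model fitted in the article.

def subjective_value(amount, win_prob, ambiguity, alpha, beta):
    """Subjective value under a common risk/ambiguity parameterization:
    SV = (p - beta * A / 2) * v**alpha.
    alpha < 1 -> risk averse; beta > 0 -> ambiguity averse."""
    return (win_prob - beta * ambiguity / 2.0) * amount ** alpha

# A risky lottery (known 50% chance) vs. a fully ambiguous one, $20 payoff,
# for a moderately risk- and ambiguity-averse subject (illustrative parameters).
# For the ambiguous lottery, the win probability is taken as the central 0.5.
print(subjective_value(20, 0.5, 0.0, alpha=0.8, beta=0.6))   # risky option
print(subjective_value(20, 0.5, 1.0, alpha=0.8, beta=0.6))   # ambiguous option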
Neuroscience, Issue 67, Medicine, Molecular Biology, fMRI, magnetic resonance imaging, decision-making, value, uncertainty, risk, ambiguity
Mizoroki-Heck Cross-coupling Reactions Catalyzed by Dichloro{bis[1,1',1''-(phosphinetriyl)tripiperidine]}palladium Under Mild Reaction Conditions
Authors: Miriam Oberholzer, Christian M. Frech.
Institutions: University of Zürich, Zürich University of Applied Sciences.
Dichloro-bis(aminophosphine) complexes of palladium with the general formula [(P{(NC5H10)3-n(C6H11)n})2Pd(Cl)2] (where n = 0-2) belong to a new family of easily accessible, very cheap, and air-stable, yet highly active and universally applicable C-C cross-coupling catalysts with an excellent functional group tolerance. Dichloro{bis[1,1',1''-(phosphinetriyl)tripiperidine]}palladium [(P(NC5H10)3)2Pd(Cl)2] (1), the complex in this series least stable towards protons (e.g., in the form of water), allows facile nanoparticle formation and hence proved to be the most active Heck catalyst within this series at 100 °C; it is a very rare example of an effective and versatile catalyst system that operates efficiently under mild reaction conditions. Rapid and complete catalyst degradation under work-up conditions into phosphonates, piperidinium salts, and other palladium-containing decomposition products ensures easy separation of the coupling products from catalyst and ligands. The facile, cheap, and rapid synthesis of 1,1',1''-(phosphinetriyl)tripiperidine and 1, respectively, the simple and convenient use, as well as the excellent catalytic performance in the Heck reaction at 100 °C make 1 one of the most attractive and greenest Heck catalysts available. We provide here the visualized protocols for the ligand and catalyst syntheses as well as the reaction protocol for Heck reactions performed at 10 mmol scale at 100 °C, and show that this catalyst is suitable for use in organic synthesis.
Chemistry, Issue 85, Heck reaction, C-C cross-coupling, Catalysis, Catalysts, green chemistry, Palladium, Aminophosphines, Palladium nanoparticles, Reaction mechanism, water-induced ligand degradation
Corneal Donor Tissue Preparation for Endothelial Keratoplasty
Authors: Maria A. Woodward, Michael Titus, Kyle Mavin, Roni M. Shtein.
Institutions: University of Michigan , MidWest Eye Banks.
Over the past ten years, corneal transplantation surgical techniques have undergone revolutionary changes1,2. Since its inception, traditional full-thickness corneal transplantation has been the treatment to restore sight in those limited by corneal disease. Disadvantages of this approach include a high degree of post-operative astigmatism, lack of predictable refractive outcome, and disturbance to the ocular surface. The development of Descemet's stripping endothelial keratoplasty (DSEK), transplanting only the posterior corneal stroma, Descemet's membrane, and endothelium, has dramatically changed treatment of corneal endothelial disease. DSEK is performed through a smaller incision; this technique avoids 'open sky' surgery with its risk of hemorrhage or expulsion, decreases the incidence of postoperative wound dehiscence, reduces unpredictable refractive outcomes, and may decrease the rate of transplant rejection3-6. Initially, corneal donor posterior lamellar dissection for DSEK was performed manually1, resulting in variable graft thickness and damage to the delicate corneal endothelial tissue during tissue processing. Automated lamellar dissection (Descemet's stripping automated endothelial keratoplasty, DSAEK) was developed to address these issues. Automated dissection utilizes the same technology as LASIK corneal flap creation, with a mechanical microkeratome blade that helps to create uniform and thin tissue grafts for DSAEK surgery with minimal corneal endothelial cell loss during tissue processing. Eye banks have been providing full-thickness corneas for surgical transplantation for many years. In 2006, eye banks began to develop methodologies for supplying precut corneal tissue for endothelial keratoplasty. With the input of corneal surgeons, eye banks have developed thorough protocols to safely and effectively prepare posterior lamellar tissue for DSAEK surgery. This can be performed preoperatively at the eye bank. Research shows no significant difference in terms of the quality of the tissue7 or patient outcomes8,9 using eye bank precut tissue versus surgeon-prepared tissue for DSAEK surgery. For most corneal surgeons, the availability of precut DSAEK corneal tissue saves time and money10, and reduces the stress of performing the donor corneal dissection in the operating room. In part because of the ability of the eye banks to provide high-quality posterior lamellar corneal tissue in a timely manner, DSAEK has become the standard of care for surgical management of corneal endothelial disease. The procedure that we describe is the preparation of the posterior lamellar cornea at the eye bank for transplantation in DSAEK surgery (Figure 1).
Medicine, Issue 64, Physiology, Cornea, transplantation, DSAEK, DSEK, endothelial keratoplasty, lamellar, graft, Moria, microkeratome, precut, Fuchs dystrophy
Optimized Negative Staining: a High-throughput Protocol for Examining Small and Asymmetric Protein Structure by Electron Microscopy
Authors: Matthew Rames, Yadong Yu, Gang Ren.
Institutions: The Molecular Foundry.
Structural determination of proteins is rather challenging for proteins with molecular masses between 40 and 200 kDa. Considering that more than half of natural proteins have a molecular mass between 40 and 200 kDa1,2, a robust and high-throughput method with nanometer resolution capability is needed. Negative staining (NS) electron microscopy (EM) is an easy, rapid, and qualitative approach which has frequently been used in research laboratories to examine protein structure and protein-protein interactions. Unfortunately, conventional NS protocols often generate structural artifacts on proteins, especially with lipoproteins, which usually present rouleaux artifacts. By using images of lipoproteins from cryo-electron microscopy (cryo-EM) as a standard, the key parameters in NS specimen preparation conditions were recently screened and reported as the optimized NS protocol (OpNS), a modified conventional NS protocol3. Artifacts like rouleaux can be greatly limited by OpNS, which additionally provides high contrast along with reasonably high-resolution (near 1 nm) images of small and asymmetric proteins. These high-resolution and high-contrast images are even suitable for 3D reconstruction of an individual protein (a single object, no averaging), such as a 160 kDa antibody, through the method of electron tomography4,5. Moreover, OpNS can be a high-throughput tool to examine hundreds of samples of small proteins. For example, the previously published mechanism of the 53 kDa cholesteryl ester transfer protein (CETP) involved the screening and imaging of hundreds of samples6. Considering that cryo-EM rarely succeeds in imaging proteins less than 200 kDa and has yet to yield a published study involving the screening of over one hundred sample conditions, it is fair to call OpNS a high-throughput method for studying small proteins. Hopefully the OpNS protocol presented here can be a useful tool to push the boundaries of EM and accelerate EM studies into small protein structure, dynamics, and mechanisms.
Environmental Sciences, Issue 90, small and asymmetric protein structure, electron microscopy, optimized negative staining
Protein WISDOM: A Workbench for In silico De novo Design of BioMolecules
Authors: James Smadbeck, Meghan B. Peterson, George A. Khoury, Martin S. Taylor, Christodoulos A. Floudas.
Institutions: Princeton University.
The aim of de novo protein design is to find the amino acid sequences that will fold into a desired 3-dimensional structure with improvements in specific properties, such as binding affinity, agonist or antagonist behavior, or stability, relative to the native sequence. Protein design lies at the center of current advances in drug design and discovery. Not only does protein design provide predictions for potentially useful drug targets, but it also enhances our understanding of the protein folding process and protein-protein interactions. Experimental methods such as directed evolution have shown success in protein design. However, such methods are restricted by the limited sequence space that can be searched tractably. In contrast, computational design strategies allow for the screening of a much larger set of sequences covering a wide variety of properties and functionality. We have developed a range of computational de novo protein design methods capable of tackling several important areas of protein design. These include the design of monomeric proteins for increased stability and complexes for increased binding affinity. To disseminate these methods for broader use, we present Protein WISDOM, a tool that provides automated methods for a variety of protein design problems. Structural templates are submitted to initialize the design process. The first stage of design is a sequence selection stage that aims at improving stability through minimization of potential energy in the sequence space. Selected sequences are then run through a fold specificity stage and a binding affinity stage. A rank-ordered list of the sequences for each step of the process, along with relevant designed structures, provides the user with a comprehensive quantitative assessment of the design. Here we provide the details of each design method, as well as several notable experimental successes attained through the use of the methods.
Genetics, Issue 77, Molecular Biology, Bioengineering, Biochemistry, Biomedical Engineering, Chemical Engineering, Computational Biology, Genomics, Proteomics, Protein, Protein Binding, Computational Biology, Drug Design, optimization (mathematics), Amino Acids, Peptides, and Proteins, De novo protein and peptide design, Drug design, In silico sequence selection, Optimization, Fold specificity, Binding affinity, sequencing
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
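The time-stamped event records described above lend themselves to simple programmatic summaries. The sketch below is a minimal illustration in Python (the actual system uses MATLAB-based analysis code), with hypothetical event names and timestamps.

from collections import Counter

# Hypothetical time-stamped event record: (seconds since session start, event code).
events = [
    (12.5, "hopper1_head_entry"),
    (13.1, "pellet_delivered"),
    (47.9, "hopper2_head_entry"),
    (48.3, "pellet_delivered"),
    (50.2, "hopper2_head_entry"),
]

# Count each event type, as a daily "harvest" summary might.
print(Counter(code for _, code in events))

# Inter-event intervals for one event type, e.g. head entries at hopper 2.
times = [t for t, code in events if code == "hopper2_head_entry"]
intervals = [b - a for a, b in zip(times, times[1:])]
print(intervals)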
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches thus giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations and step-wise design augmentation. Therefore, the methodology is not only useful for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
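As a hedged illustration of software-guided setup of experiment combinations, the sketch below enumerates a simple full-factorial design over three hypothetical factors; the real study used knowledge-based factor selection and step-wise design augmentation rather than a plain full factorial, and the factor levels here are illustrative only.

from itertools import product

# Hypothetical factors influencing transient expression (levels are illustrative).
factors = {
    "incubation_temp_C": [22, 25, 28],
    "plant_age_days": [35, 42, 49],
    "promoter": ["CaMV35S", "nos"],
}

# Full-factorial design: every combination of factor levels (3 * 3 * 2 = 18 runs).
runs = [dict(zip(factors, levels)) for levels in product(*factors.values())]
print(len(runs))
print(runs[0])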
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
Measuring Oral Fatty Acid Thresholds, Fat Perception, Fatty Food Liking, and Papillae Density in Humans
Authors: Rivkeh Y. Haryono, Madeline A. Sprajcer, Russell S. J. Keast.
Institutions: Deakin University.
Emerging evidence from a number of laboratories indicates that humans have the ability to identify fatty acids in the oral cavity, presumably via fatty acid receptors housed on taste cells. Previous research has shown that an individual's oral sensitivity to fatty acid, specifically oleic acid (C18:1), is associated with body mass index (BMI), dietary fat consumption, and the ability to identify fat in foods. We have developed a reliable and reproducible method to assess oral chemoreception of fatty acids, using a milk and C18:1 emulsion, together with an ascending forced-choice triangle procedure. In parallel, a food matrix has been developed to assess an individual's ability to perceive fat, in addition to a simple method to assess fatty food liking. As an added measure, tongue photography is used to assess papillae density, with higher density often being associated with increased taste sensitivity.
Neuroscience, Issue 88, taste, overweight and obesity, dietary fat, fatty acid, diet, fatty food liking, detection threshold
A Restriction Enzyme Based Cloning Method to Assess the In vitro Replication Capacity of HIV-1 Subtype C Gag-MJ4 Chimeric Viruses
Authors: Daniel T. Claiborne, Jessica L. Prince, Eric Hunter.
Institutions: Emory University.
The protective effect of many HLA class I alleles on HIV-1 pathogenesis and disease progression is, in part, attributed to their ability to target conserved portions of the HIV-1 genome that escape with difficulty. Sequence changes attributed to cellular immune pressure arise across the genome during infection, and if found within conserved regions of the genome such as Gag, can affect the ability of the virus to replicate in vitro. Transmission of HLA-linked polymorphisms in Gag to HLA-mismatched recipients has been associated with reduced set point viral loads. We hypothesized that this may be due to a reduced replication capacity of the virus. Here we present a novel method for assessing the in vitro replication of HIV-1 as influenced by the gag gene isolated from acute time points from subtype C infected Zambians. This method uses restriction enzyme based cloning to insert the gag gene into a common subtype C HIV-1 proviral backbone, MJ4. This makes it more appropriate for the study of subtype C sequences than previous recombination-based methods that have assessed the in vitro replication of chronically derived gag-pro sequences. Nevertheless, the protocol could be readily modified for studies of viruses from other subtypes. Moreover, this protocol details a robust and reproducible method for assessing the replication capacity of the Gag-MJ4 chimeric viruses on a CEM-based T cell line. This method was utilized for the study of Gag-MJ4 chimeric viruses derived from 149 subtype C acutely infected Zambians, and has allowed for the identification of residues in Gag that affect replication. More importantly, the implementation of this technique has facilitated a deeper understanding of how viral replication defines parameters of early HIV-1 pathogenesis such as set point viral load and longitudinal CD4+ T cell decline.
Infectious Diseases, Issue 90, HIV-1, Gag, viral replication, replication capacity, viral fitness, MJ4, CEM, GXR25
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. Extension of the technique to living cells is also described.
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
An Affordable HIV-1 Drug Resistance Monitoring Method for Resource Limited Settings
Authors: Justen Manasa, Siva Danaviah, Sureshnee Pillay, Prevashinee Padayachee, Hloniphile Mthiyane, Charity Mkhize, Richard John Lessells, Christopher Seebregts, Tobias F. Rinke de Wit, Johannes Viljoen, David Katzenstein, Tulio De Oliveira.
Institutions: University of KwaZulu-Natal, Durban, South Africa, Jembi Health Systems, University of Amsterdam, Stanford Medical School.
HIV-1 drug resistance has the potential to seriously compromise the effectiveness and impact of antiretroviral therapy (ART). As ART programs in sub-Saharan Africa continue to expand, individuals on ART should be closely monitored for the emergence of drug resistance. Surveillance of transmitted drug resistance to track transmission of viral strains already resistant to ART is also critical. Unfortunately, drug resistance testing is still not readily accessible in resource limited settings, because genotyping is expensive and requires sophisticated laboratory and data management infrastructure. An open access genotypic drug resistance monitoring method to manage individuals and assess transmitted drug resistance is described. The method uses free open source software for the interpretation of drug resistance patterns and the generation of individual patient reports. The genotyping protocol has an amplification rate of greater than 95% for plasma samples with a viral load >1,000 HIV-1 RNA copies/ml. The sensitivity decreases significantly for viral loads <1,000 HIV-1 RNA copies/ml. The method described here was validated against a method of HIV-1 drug resistance testing approved by the United States Food and Drug Administration (FDA), the Viroseq genotyping method. Limitations of the method described here include the fact that it is not automated and that it also failed to amplify the circulating recombinant form CRF02_AG from a validation panel of samples, although it amplified subtypes A and B from the same panel.
Medicine, Issue 85, Biomedical Technology, HIV-1, HIV Infections, Viremia, Nucleic Acids, genetics, antiretroviral therapy, drug resistance, genotyping, affordable
From Voxels to Knowledge: A Practical Guide to the Segmentation of Complex Electron Microscopy 3D-Data
Authors: Wen-Ting Tsai, Ahmed Hassan, Purbasha Sarkar, Joaquin Correa, Zoltan Metlagel, Danielle M. Jorgens, Manfred Auer.
Institutions: Lawrence Berkeley National Laboratory.
Modern 3D electron microscopy approaches have recently allowed unprecedented insight into the 3D ultrastructural organization of cells and tissues, enabling the visualization of large macromolecular machines, such as adhesion complexes, as well as higher-order structures, such as the cytoskeleton and cellular organelles in their respective cell and tissue context. Given the inherent complexity of cellular volumes, it is essential to first extract the features of interest in order to allow visualization, quantification, and therefore comprehension of their 3D organization. Each data set is defined by distinct characteristics, e.g., signal-to-noise ratio, crispness (sharpness) of the data, heterogeneity of its features, crowdedness of features, presence or absence of characteristic shapes that allow for easy identification, and the percentage of the entire volume that a specific region of interest occupies. All these characteristics need to be considered when deciding on which approach to take for segmentation. The six different 3D ultrastructural data sets presented were obtained by three different imaging approaches: resin embedded stained electron tomography, focused ion beam- and serial block face- scanning electron microscopy (FIB-SEM, SBF-SEM) of mildly stained and heavily stained samples, respectively. For these data sets, four different segmentation approaches have been applied: (1) fully manual model building followed solely by visualization of the model, (2) manual tracing segmentation of the data followed by surface rendering, (3) semi-automated approaches followed by surface rendering, or (4) automated custom-designed segmentation algorithms followed by surface rendering and quantitative analysis. Depending on the combination of data set characteristics, it was found that typically one of these four categorical approaches outperforms the others, but depending on the exact sequence of criteria, more than one approach may be successful. Based on these data, we propose a triage scheme that categorizes both objective data set characteristics and subjective personal criteria for the analysis of the different data sets.
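The simplest semi-automated route mentioned above, global intensity thresholding followed by connected-component labeling, can be sketched with scikit-image on a synthetic volume. Real EM volumes require far more preprocessing; the data and threshold choice here are purely illustrative.

import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label

# Synthetic 3D volume: noisy background with two brighter "features of interest".
rng = np.random.default_rng(0)
vol = rng.normal(0.2, 0.05, size=(64, 64, 64))
vol[10:20, 10:20, 10:20] += 0.5
vol[40:55, 40:50, 30:45] += 0.5

# Global Otsu threshold, then label connected components in the binary mask.
mask = vol > threshold_otsu(vol)
labels = label(mask)
print(f"segmented {labels.max()} connected feature(s)")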
Bioengineering, Issue 90, 3D electron microscopy, feature extraction, segmentation, image analysis, reconstruction, manual tracing, thresholding
Determining Cell Number During Cell Culture using the Scepter Cell Counter
Authors: Kathleen Ongena, Chandreyee Das, Janet L. Smith, Sónia Gil, Grace Johnston.
Institutions: Millipore Inc.
Counting cells is often a necessary but tedious step for in vitro cell culture. Consistent cell concentrations ensure experimental reproducibility and accuracy. Cell counts are important for monitoring cell health and proliferation rate, assessing immortalization or transformation, seeding cells for subsequent experiments, transfection or infection, and preparing for cell-based assays. It is important that cell counts be accurate, consistent, and fast, particularly for quantitative measurements of cellular responses. Despite this need for speed and accuracy in cell counting, 71% of 400 researchers surveyed1 still count cells using a hemocytometer. While hemocytometry is inexpensive, it is laborious and subject to user bias and misuse, which results in inaccurate counts. Hemocytometers are made of special optical glass on which cell suspensions are loaded in specified volumes and counted under a microscope. Sources of error in hemocytometry include: uneven cell distribution in the sample, too many or too few cells in the sample, subjective decisions as to whether a given cell falls within the defined counting area, contamination of the hemocytometer, user-to-user variation, and variation in hemocytometer filling rate2. To alleviate the tedium associated with manual counting, 29% of researchers count cells using automated cell counting devices; these include vision-based counters, systems that detect cells using the Coulter principle, or flow cytometry1. For most researchers, the main barrier to using an automated system is the price associated with these large benchtop instruments1. The Scepter cell counter is an automated handheld device that offers the automation and accuracy of Coulter counting at a relatively low cost. The system employs the Coulter principle of impedance-based particle detection3 in a miniaturized format, using a combination of analog and digital hardware for sensing, signal processing, data storage, and graphical display. The disposable tip is engineered with a microfabricated cell-sensing zone that enables discrimination by cell size and cell volume at sub-micron and sub-picoliter resolution. Enhanced with precision liquid-handling channels and electronics, the Scepter cell counter reports cell population statistics graphically displayed as a histogram.
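The core arithmetic behind any impedance-based count, converting events detected in a known sampled volume into a concentration, is sketched below. The numbers are illustrative and are not Scepter specifications.

def cells_per_ml(events_counted, sampled_volume_ul, dilution_factor=1.0):
    """Concentration (cells/ml) from events counted in a known sampled volume (µl)."""
    cells_per_ul = events_counted / sampled_volume_ul
    return cells_per_ul * 1000.0 * dilution_factor

# Illustrative: 25,000 events detected in 50 µl of a 1:10 diluted suspension.
print(f"{cells_per_ml(25000, 50, dilution_factor=10):.2e} cells/ml")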
Cellular Biology, Issue 45, Scepter, cell counting, cell culture, hemocytometer, Coulter, Impedance-based particle detection
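For reference, the arithmetic behind the manual hemocytometer counts discussed above is straightforward; the sketch below shows the standard calculation (mean count per 0.1 µL square × dilution factor × 10^4 = cells/mL) with invented counts, an invented dilution, and an invented seeding target.

```python
# Standard hemocytometer concentration calculation; all numbers here are
# illustrative placeholders, not data from the survey or the instrument.

def cells_per_ml(counts_per_square, dilution_factor=1.0):
    """Each large hemocytometer square covers 0.1 uL (1e-4 mL), so
    mean count per square x dilution factor x 1e4 gives cells/mL."""
    mean_count = sum(counts_per_square) / len(counts_per_square)
    return mean_count * dilution_factor * 1e4

# Example: four corner squares counted after a 1:2 dilution in trypan blue.
counts = [42, 38, 45, 40]
conc = cells_per_ml(counts, dilution_factor=2.0)
print(f"Concentration: {conc:.2e} cells/mL")

# Volume needed to seed 2e5 cells per well at that concentration.
cells_needed = 2e5
print(f"Seed volume: {cells_needed / conc * 1000:.1f} uL per well")
```

The user-to-user variation described in the abstract enters through the raw counts themselves; the downstream arithmetic is fixed, which is why systematic counting errors propagate directly into seeding density.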
Corneal Donor Tissue Preparation for Descemet's Membrane Endothelial Keratoplasty
Authors: Hassan N. Tausif, Lauren Johnson, Michael Titus, Kyle Mavin, Navasuja Chandrasekaran, Maria A. Woodward, Roni M. Shtein, Shahzad I. Mian.
Institutions: University of Michigan, MidWest Eye Banks.
Descemet’s Membrane Endothelial Keratoplasty (DMEK) is a form of corneal transplantation in which only a single cell layer, the corneal endothelium, along with its basement membrane (Descemet's membrane), is introduced onto the recipient's posterior stroma3. Unlike Descemet’s Stripping Automated Endothelial Keratoplasty (DSAEK), where additional donor stroma is introduced, no unnatural stroma-to-stroma interface is created. As a result, the natural anatomy of the cornea is preserved as much as possible, allowing for improved recovery time and visual acuity4. Endothelial Keratoplasty (EK) is the procedure of choice for treatment of endothelial dysfunction. The advantages of EK include rapid recovery of vision, preservation of ocular integrity, and minimal refractive change due to use of a small, peripheral incision1. DSAEK utilizes donor tissue prepared with partial-thickness stroma and endothelium. The rapid success and utilization of this procedure can be attributed to the availability of eye-bank-prepared precut tissue. The benefits of eye-bank preparation of donor tissue include elimination of the need for specialized equipment in the operating room and the availability of backup donor tissue in case of tissue perforation during preparation. In addition, high-volume preparation of donor tissue by eye-bank technicians may provide improved quality of donor tissue. DSAEK may limit best-corrected visual acuity due to the creation of a stromal interface between the donor and recipient cornea. Elimination of this interface by transplanting only donor Descemet's membrane and endothelium in DMEK may improve visual outcomes and reduce complications after EK5. As with DSAEK, the long-term success and acceptance of DMEK depend on the ready availability of precut, eye-bank-prepared donor tissue. Here we present a stepwise approach to donor tissue preparation that may reduce some of the barriers eye banks face in providing DMEK grafts.
Medicine, Issue 91, DMEK, EK, endothelial keratoplasty, Descemet’s membrane endothelial keratoplasty, corneal transplantation, eye bank, donor tissue preparation
Spatial Multiobjective Optimization of Agricultural Conservation Practices using a SWAT Model and an Evolutionary Algorithm
Authors: Sergey Rabotyagov, Todd Campbell, Adriana Valcu, Philip Gassman, Manoj Jha, Keith Schilling, Calvin Wolter, Catherine Kling.
Institutions: University of Washington, Iowa State University, North Carolina A&T University, Iowa Geological and Water Survey.
Finding the cost-efficient (i.e., lowest-cost) ways of targeting conservation practice investments for the achievement of specific water quality goals across the landscape is of primary importance in watershed management. Traditional economic methods of finding the lowest-cost solution in the watershed context (e.g.,5,12,20) assume that off-site impacts can be accurately described as a proportion of on-site pollution generated. Such approaches are unlikely to be representative of the actual pollution process in a watershed, where the impacts of polluting sources are often determined by complex biophysical processes. The use of modern physically based, spatially distributed hydrologic simulation models allows for a greater degree of realism in terms of process representation but requires the development of a simulation-optimization framework in which the model becomes an integral part of optimization. Evolutionary algorithms appear to be a particularly useful optimization tool, able to deal with the combinatorial nature of a watershed simulation-optimization problem and allowing the use of the full water quality model. Evolutionary algorithms treat a particular spatial allocation of conservation practices in a watershed as a candidate solution and utilize sets (populations) of candidate solutions, iteratively applying stochastic operators of selection, recombination, and mutation to find improvements with respect to the optimization objectives. The optimization objectives in this case are to minimize nonpoint-source pollution in the watershed while simultaneously minimizing the cost of conservation practices. A recent and expanding body of research uses similar methods, integrating water quality models with broadly defined evolutionary optimization methods3,4,9,10,13-15,17-19,22,23,25. In this application, we demonstrate a program that follows Rabotyagov et al.'s approach and integrates the modern and commonly used SWAT water quality model7, the multiobjective evolutionary algorithm SPEA226, and a user-specified set of conservation practices and their costs to search for the complete tradeoff frontiers between costs of conservation practices and user-specified water quality objectives. The frontiers quantify the tradeoffs faced by watershed managers by presenting the full range of costs associated with various water quality improvement goals. The program allows for the selection of watershed configurations that achieve specified water quality improvement goals and the production of maps showing the optimized placement of conservation practices.
Environmental Sciences, Issue 70, Plant Biology, Civil Engineering, Forest Sciences, Water quality, multiobjective optimization, evolutionary algorithms, cost efficiency, agriculture, development
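To make the simulation-optimization loop described above concrete, the following is a heavily simplified sketch of the idea: a binary vector marks which fields receive a conservation practice, and a multiobjective evolutionary algorithm evolves a Pareto set over total cost and remaining pollutant load. The pollution evaluator is a toy stand-in for a SWAT run, SPEA2's density-based archive truncation is reduced to plain nondominated filtering with an arbitrary size cap, and all costs and abatement values are invented.

```python
# Toy multiobjective evolutionary search over conservation practice placement.
# evaluate() is a placeholder for calling a watershed model such as SWAT;
# costs, abatement values, and the baseline load are made-up numbers.
import random

N_FIELDS, POP, GENS = 30, 60, 100
cost = [random.uniform(5, 20) for _ in range(N_FIELDS)]          # $/field (hypothetical)
abatement = [random.uniform(0.5, 3.0) for _ in range(N_FIELDS)]  # load reduced per field (hypothetical)
BASELINE = 60.0                                                  # baseline pollutant load (hypothetical)

def evaluate(x):
    """Return (total cost, remaining load); a real run would invoke SWAT here."""
    total_cost = sum(c for c, on in zip(cost, x) if on)
    load = BASELINE - sum(a for a, on in zip(abatement, x) if on)
    return total_cost, max(load, 0.0)

def dominates(a, b):
    # Both objectives are minimized.
    return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

def nondominated(pop):
    objs = [evaluate(x) for x in pop]
    return [x for x, o in zip(pop, objs)
            if not any(dominates(o2, o) for o2 in objs if o2 != o)]

def mutate(x, rate=1.0 / N_FIELDS):
    return [bit ^ (random.random() < rate) for bit in x]

def crossover(a, b):
    cut = random.randrange(1, N_FIELDS)
    return a[:cut] + b[cut:]

population = [[random.random() < 0.5 for _ in range(N_FIELDS)] for _ in range(POP)]
for _ in range(GENS):
    archive = nondominated(population)[:POP]   # crude stand-in for SPEA2 truncation
    offspring = [mutate(crossover(random.choice(archive), random.choice(archive)))
                 for _ in range(POP)]
    population = archive + offspring

# Print the approximate cost/load tradeoff frontier, cheapest solutions first.
for x in sorted(nondominated(population), key=lambda s: evaluate(s)[0]):
    c, load = evaluate(x)
    print(f"cost={c:7.1f}  load={load:6.1f}  practices on {sum(x)} fields")
```

The printed frontier is the toy analogue of the cost/water-quality tradeoff curves the program produces; mapping each frontier solution back to its binary vector corresponds to the maps of optimized practice placement mentioned in the abstract.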

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in a PubMed abstract makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In those cases, our algorithm still displays the most relevant videos it can find, which can sometimes result in matches that are only loosely related.
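
For readers curious how this kind of matching can work in principle, the sketch below ranks a few invented video descriptions against an invented abstract using TF-IDF vectors and cosine similarity. It illustrates the general approach only; it is not JoVE's actual matching algorithm, and all texts are placeholders.

```python
# Illustrative abstract-to-video matching via TF-IDF and cosine similarity.
# Not JoVE's algorithm; the descriptions and abstract are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

video_descriptions = [
    "Automated cell counting with an impedance-based handheld counter",
    "Segmentation of 3D electron microscopy volumes",
    "Donor tissue preparation for endothelial keratoplasty",
]
pubmed_abstract = "We quantified proliferation by counting cells after transfection."

vectorizer = TfidfVectorizer(stop_words="english")
video_matrix = vectorizer.fit_transform(video_descriptions)
abstract_vec = vectorizer.transform([pubmed_abstract])

# Rank videos by similarity to the abstract; low scores correspond to the
# loosely related matches described above.
scores = cosine_similarity(abstract_vec, video_matrix).ravel()
for score, title in sorted(zip(scores, video_descriptions), reverse=True):
    print(f"{score:.3f}  {title}")
```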