An experimental study is performed to measure the terminal settling velocities of spherical particles in surfactant-based shear-thinning viscoelastic (VES) fluids. The measurements are made for particles settling in unbounded fluids and in fluids between parallel walls. VES fluids spanning a wide range of rheological properties are prepared and rheologically characterized. The rheological characterization involves steady-shear viscosity and dynamic oscillatory-shear measurements to quantify the viscous and elastic properties, respectively. The settling velocities under unbounded conditions are measured in beakers with diameters at least 25 times the particle diameter. For measuring settling velocities between parallel walls, two experimental cells with different wall spacings are constructed. Spherical particles of varying sizes are gently dropped into the fluids and allowed to settle. The process is recorded with a high-resolution video camera and the trajectory of each particle is extracted using image analysis software. Terminal settling velocities are calculated from these data.
The impact of elasticity on settling velocity in unbounded fluids is quantified by comparing the experimental settling velocity to the settling velocity calculated from the inelastic drag predictions of Renaud et al.1 Results show that fluid elasticity can either increase or decrease the settling velocity. The magnitude of the reduction or increase depends on the rheological properties of the fluid and the properties of the particles. Confining walls are observed to retard settling, and the retardation is measured in terms of wall factors.
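The comparison described above can be sketched numerically. As a hedged illustration only, a Newtonian Stokes-law baseline stands in below for the inelastic drag prediction of Renaud et al. (whose correlation is not reproduced here), and all parameter values are hypothetical:

```python
# Illustrative sketch: Stokes' law as a stand-in for the inelastic drag
# prediction; parameter values below are hypothetical, not measured data.
G = 9.81  # gravitational acceleration, m/s^2

def stokes_velocity(d, rho_p, rho_f, mu):
    """Terminal velocity (m/s) of a sphere in creeping flow, Stokes' law:
    v = g*d^2*(rho_p - rho_f) / (18*mu)."""
    return G * d**2 * (rho_p - rho_f) / (18.0 * mu)

def velocity_ratio(v_measured, v_inelastic):
    """Elasticity effect: >1 means settling faster than the inelastic prediction,
    <1 means elasticity retards settling."""
    return v_measured / v_inelastic

def wall_factor(v_bounded, v_unbounded):
    """Wall retardation: f < 1 when confining walls slow the particle."""
    return v_bounded / v_unbounded
```

For example, a 2-mm sphere of density 2500 kg/m3 in a fluid of density 1000 kg/m3 and apparent viscosity 0.5 Pa·s settles at about 6.5 mm/s by this baseline.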
From Fast Fluorescence Imaging to Molecular Diffusion Law on Live Cell Membranes in a Commercial Microscope
Institutions: Scuola Normale Superiore, Istituto Italiano di Tecnologia, University of California, Irvine.
It has become increasingly evident that the spatial distribution and the motion of membrane components like lipids and proteins are key factors in the regulation of many cellular functions. However, due to the fast dynamics and the tiny structures involved, a very high spatio-temporal resolution is required to capture the real behavior of molecules. Here we present the experimental protocol for studying the dynamics of fluorescently-labeled plasma-membrane proteins and lipids in live cells with high spatio-temporal resolution. Notably, this approach does not need to track each molecule; instead, it computes population behavior using all molecules in a given region of the membrane. The starting point is fast imaging of a given region on the membrane. Afterwards, a complete spatio-temporal autocorrelation function is calculated by correlating acquired images at increasing time delays, for example every 2, 3, ..., n repetitions. It can be shown that the width of the peak of the spatial autocorrelation function increases with increasing time delay as a function of particle movement due to diffusion. Therefore, fitting the series of autocorrelation functions makes it possible to extract the actual protein mean square displacement from imaging (iMSD), presented here in the form of apparent diffusivity vs. average displacement. This yields a quantitative view of the average dynamics of single molecules with nanometer accuracy. By using a GFP-tagged variant of the Transferrin Receptor (TfR) and an ATTO488-labeled 1-palmitoyl-2-hydroxy-sn-glycero-3-phosphoethanolamine (PPE), it is possible to observe the spatiotemporal regulation of protein and lipid diffusion on µm-sized membrane regions in the micro-to-millisecond time range.
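The image-correlation step described above can be sketched as follows. This is a minimal numpy illustration; the Gaussian fitting of the peak widths that ultimately yields the iMSD curve is omitted:

```python
import numpy as np

def spatiotemporal_corr(stack, tau):
    """Average spatial cross-correlation of all frame pairs separated by lag tau.
    stack: (T, N, N) image series. Returns an (N, N) correlation map whose
    central peak broadens with increasing tau as molecules diffuse -- the
    observable underlying the iMSD analysis."""
    T = stack.shape[0]
    acc = np.zeros(stack.shape[1:])
    for t in range(T - tau):
        # FFT-based cross-correlation of mean-subtracted frames
        a = np.fft.fft2(stack[t] - stack[t].mean())
        b = np.fft.fft2(stack[t + tau] - stack[t + tau].mean())
        acc += np.real(np.fft.ifft2(a * np.conj(b)))
    return np.fft.fftshift(acc / (T - tau))
```

Fitting a Gaussian to each map's central peak and plotting the squared width against tau gives the iMSD; its local slope yields the apparent diffusivity.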
Bioengineering, Issue 92, fluorescence, protein dynamics, lipid dynamics, membrane heterogeneity, transient confinement, single molecule, GFP
Procedure for the Development of Multi-depth Circular Cross-sectional Endothelialized Microchannels-on-a-chip
Institutions: West Virginia University, University of California at Riverside.
Efforts have been focused on developing in vitro assays for the study of microvessels because in vivo animal studies are more time-consuming and expensive, and observation and quantification are very challenging. However, conventional in vitro microvessel assays have limitations in representing in vivo microvessels with respect to three-dimensional (3D) geometry and continuous fluid flow. Using a combination of a photolithographic reflowable-photoresist technique, soft lithography, and microfluidics, we have developed multi-depth circular cross-sectional endothelialized microchannels-on-a-chip, which mimic the 3D geometry of in vivo microvessels and run under controlled continuous perfusion flow. A positive reflowable photoresist was used to fabricate a master mold with a semicircular cross-sectional microchannel network. By aligning and bonding the two polydimethylsiloxane (PDMS) microchannel layers replicated from the master mold, a cylindrical microchannel network was created. The diameters of the microchannels can be well controlled. In addition, primary human umbilical vein endothelial cells (HUVECs) seeded inside the chip lined the inner surface of the microchannels under controlled perfusion lasting from 4 days to 2 weeks.
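For circular channels such as these, the perfusion flow produced by a given pressure drop can be estimated from the Hagen-Poiseuille relation. The sketch below is an idealized estimate with hypothetical dimensions, not the chip's actual operating parameters:

```python
import math

def poiseuille_resistance(mu, length, radius):
    """Hydraulic resistance (Pa*s/m^3) of a circular channel:
    R = 8*mu*L / (pi*r^4)."""
    return 8.0 * mu * length / (math.pi * radius**4)

def flow_rate(delta_p, resistance):
    """Volumetric flow rate (m^3/s) driven by a pressure drop delta_p (Pa)."""
    return delta_p / resistance

# Hypothetical example: water-like medium in a 100-um-diameter, 1-cm-long channel.
R = poiseuille_resistance(1.0e-3, 0.01, 50e-6)
Q = flow_rate(1000.0, R)  # flow at a 1 kPa pressure drop
```

Note the r^4 dependence: halving the channel diameter raises the resistance 16-fold, which is why tight diameter control matters for setting the perfusion shear the endothelial cells experience.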
Bioengineering, Issue 80, Bioengineering, Tissue Engineering, Miniaturization, Microtechnology, Microfluidics, Reflow photoresist, PDMS, Perfusion flow, Primary endothelial cells
An Inverse Analysis Approach to the Characterization of Chemical Transport in Paints
Institutions: U.S. Army Edgewood Chemical Biological Center, OptiMetrics, Inc., a DCS Company.
The ability to directly characterize chemical transport and interactions that occur within a material (i.e., subsurface dynamics) is a vital component in understanding contaminant mass transport and the ability to decontaminate materials. If a material is contaminated, over time the transport of highly toxic chemicals (such as chemical warfare agent species) out of the material can result in vapor exposure or transfer to the skin, which can cause percutaneous exposure to personnel who interact with the material. Due to the high toxicity of chemical warfare agents, the release of even trace chemical quantities is of significant concern. Mapping the subsurface concentration distribution and transport characteristics of absorbed agents enables exposure hazards to be assessed in untested conditions. Furthermore, these tools can be used to characterize subsurface reaction dynamics to ultimately design improved decontaminants or decontamination procedures. To achieve this goal, an inverse-analysis mass transport modeling approach was developed that utilizes time-resolved mass spectrometry measurements of vapor emission from contaminated paint coatings as the input parameter for calculation of subsurface concentration profiles. Details are provided on sample preparation, including contaminant and material handling, the application of mass spectrometry for the measurement of emitted contaminant vapor, and the implementation of inverse analysis using a physics-based diffusion model to determine transport properties of live chemical warfare agents, including distilled mustard (HD) and the nerve agent VX.
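The physics-based forward model at the heart of such an inverse analysis can be sketched as an explicit 1D Fickian diffusion solve. This is a minimal illustration; the boundary conditions and parameter values below are assumptions for demonstration, not those of the actual study:

```python
def diffuse_1d(c, D, dx, dt, steps):
    """Explicit FTCS update of a 1D subsurface concentration profile.
    Zero-flux at the bottom node (coating/substrate interface), perfect
    sink (c = 0) at the emitting surface where vapor is carried away."""
    r = D * dt / dx**2
    assert r <= 0.5, "explicit scheme unstable"
    c = list(c)
    for _ in range(steps):
        new = c[:]
        for i in range(1, len(c) - 1):
            new[i] = c[i] + r * (c[i + 1] - 2.0 * c[i] + c[i - 1])
        new[0] = new[1]   # zero-flux boundary
        new[-1] = 0.0     # sink at the emitting surface
        c = new
    return c

def surface_flux(c, D, dx):
    """Instantaneous emission flux at the surface -- the quantity the
    time-resolved mass spectrometry measurement corresponds to."""
    return D * (c[-2] - c[-1]) / dx
```

In the inverse analysis, the diffusivity (and related parameters) would be iterated until the modeled surface flux reproduces the measured vapor-emission history.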
Chemistry, Issue 90, Vacuum, vapor emission, chemical warfare agent, contamination, mass transport, inverse analysis, volatile organic compound, paint, coating
Quantification of Global Diastolic Function by Kinematic Modeling-based Analysis of Transmitral Flow via the Parametrized Diastolic Filling Formalism
Institutions: Washington University in St. Louis.
Quantitative cardiac function assessment remains a challenge for physiologists and clinicians. Although historically invasive methods have comprised the only means available, the development of noninvasive imaging modalities (echocardiography, MRI, CT) having high temporal and spatial resolution provides a new window for quantitative diastolic function assessment. Echocardiography is the agreed-upon standard for diastolic function assessment, but indexes in current clinical use merely utilize selected features of chamber dimension (M-mode) or blood/tissue motion (Doppler) waveforms without incorporating the physiologic causal determinants of the motion itself. The recognition that all left ventricles (LV) initiate filling by serving as mechanical suction pumps allows global diastolic function to be assessed based on laws of motion that apply to all chambers. What differentiates one heart from another are the parameters of the equation of motion that governs filling. Accordingly, development of the Parametrized Diastolic Filling (PDF) formalism has shown that the entire range of clinically observed early transmitral flow (Doppler E-wave) patterns is extremely well fit by the laws of damped oscillatory motion. This permits analysis of individual E-waves in accordance with a causal mechanism (recoil-initiated suction) that yields three (numerically) unique lumped parameters whose physiologic analogues are chamber stiffness (k), viscoelasticity/relaxation (c), and load (xo). The recording of transmitral flow (Doppler E-waves) is standard practice in clinical cardiology and, therefore, the echocardiographic recording method is only briefly reviewed. Our focus is on determination of the PDF parameters from routinely recorded E-wave data. As the highlighted results indicate, once the PDF parameters have been obtained from a suitable number of load-varying E-waves, the investigator is free to use the parameters or construct indexes from the parameters (such as stored energy 1/2·k·xo², maximum A-V pressure gradient k·xo, the load-independent index of diastolic function, etc.) and select the aspect of physiology or pathophysiology to be quantified.
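The damped-oscillator kinematics behind the PDF formalism can be sketched as follows, using per-unit-mass parameters; the numerical values in the test are hypothetical, not patient data:

```python
import math

def pdf_e_wave(t, k, c, xo):
    """PDF-model E-wave velocity contour: the velocity x'(t) of the damped
    oscillator x'' + c*x' + k*x = 0 with x(0) = xo, x'(0) = 0, in the
    underdamped regime (c^2 < 4k). Parameters are per unit mass."""
    w = math.sqrt(k - c**2 / 4.0)  # underdamped angular frequency
    return -xo * (k / w) * math.exp(-c * t / 2.0) * math.sin(w * t)

def stored_energy(k, xo):
    """Index: elastic recoil energy stored at end-systole, (1/2)*k*xo^2."""
    return 0.5 * k * xo**2

def peak_av_gradient(k, xo):
    """Index: maximum atrioventricular pressure-gradient analogue, k*xo."""
    return k * xo
```

Fitting this velocity contour to a clinically recorded E-wave envelope is what yields the (k, c, xo) triplet from which the indexes above are built.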
Bioengineering, Issue 91, cardiovascular physiology, ventricular mechanics, diastolic function, mathematical modeling, Doppler echocardiography, hemodynamics, biomechanics
Characterization of Thermal Transport in One-dimensional Solid Materials
Institutions: Iowa State University.
The TET (transient electro-thermal) technique is an effective approach developed to measure the thermal diffusivity of solid materials, including conductive, semiconductive, or nonconductive one-dimensional structures. This technique broadens the measurement scope of materials (conductive and nonconductive) and improves accuracy and stability. If the sample (especially a biomaterial, such as human head hair, spider silk, or silkworm silk) is not conductive, it is coated with a gold layer to make it electrically conductive. The effect of parasitic conduction and radiative losses on the thermal diffusivity can be subtracted during data processing. The real thermal conductivity can then be calculated from the given value of volume-based specific heat (ρcp), which can be obtained from calibration, from a noncontact photo-thermal technique, or by measuring the density and specific heat separately. In this work, human head hair samples are used to show how to set up the experiment, process the experimental data, and subtract the effect of parasitic conduction and radiative losses.
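The data-reduction step can be sketched as below. The series expression is the standard one-dimensional transient solution commonly fitted in TET analysis (a simplified sketch; the radiation/parasitic-conduction corrections mentioned above are omitted), and the final step is simply k = α·ρcp:

```python
import math

def tet_normalized_rise(t, alpha, L, terms=50):
    """Normalized average temperature rise of a 1D sample under step Joule
    heating (sum over odd modes). Rises from 0 toward 1 on a time scale
    set by L^2/alpha, which is how fitting the transient yields alpha."""
    s = sum(math.exp(-m**2 * math.pi**2 * alpha * t / L**2) / m**4
            for m in range(1, 2 * terms, 2))
    return 1.0 - (96.0 / math.pi**4) * s

def thermal_conductivity(alpha, rho_cp):
    """Real thermal conductivity k = alpha * (volume-based specific heat)."""
    return alpha * rho_cp
```

Fitting the measured voltage transient to this normalized rise gives the effective diffusivity; after subtracting the radiative/parasitic contribution, multiplying by ρcp yields the conductivity.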
Physics, Issue 83, thermal transport, thermal diffusivity, thermal conductivity, transient electro-thermal technique, volume-based specific heat, human head hair
Reduction in Left Ventricular Wall Stress and Improvement in Function in Failing Hearts using Algisyl-LVR
Institutions: UCSF/VA Medical Center, LoneStar Heart, Inc..
Injection of Algisyl-LVR, a treatment under clinical development, is intended to treat patients with dilated cardiomyopathy. This treatment was recently used for the first time in patients who had symptomatic heart failure. In all patients, cardiac function of the left ventricle (LV) improved significantly, as manifested by consistent reduction of the LV volume and wall stress. Here we describe this novel treatment procedure and the methods used to quantify its effects on LV wall stress and function.
Algisyl-LVR is a biopolymer gel consisting of Na+-alginate and Ca2+-alginate. The treatment procedure was carried out by mixing these two components and then combining them into one syringe for intramyocardial injections. This mixture was injected at 10 to 19 locations mid-way between the base and apex of the LV free wall in patients.
Magnetic resonance imaging (MRI), together with mathematical modeling, was used to quantify the effects of this treatment in patients before treatment and at various time points during recovery. The epicardial and endocardial surfaces were first digitized from the MR images to reconstruct the LV geometry at end-systole and at end-diastole. Left ventricular cavity volumes were then measured from these reconstructed surfaces.
Mathematical models of the LV were created from these MRI-reconstructed surfaces to calculate regional myofiber stress. Each LV model was constructed so that 1) it deforms according to a previously validated stress-strain relationship of the myocardium, and 2) the predicted LV cavity volume from these models matches the corresponding MRI-measured volume at end-diastole and end-systole. Diastolic filling was simulated by loading the LV endocardial surface with a prescribed end-diastolic pressure. Systolic contraction was simulated by concurrently loading the endocardial surface with a prescribed end-systolic pressure and adding active contraction in the myofiber direction. Regional myofiber stress at end-diastole and end-systole was computed from the deformed LV based on the stress-strain relationship.
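The validated finite-element workflow above cannot be captured in a short snippet, but the order of magnitude of the wall stress it computes can be sketched with the classic thin-walled (Laplace) estimate. The values below are hypothetical and this is a first-order check, not the regional myofiber stress from the model:

```python
def laplace_wall_stress(pressure_kpa, radius_mm, thickness_mm):
    """Thin-walled sphere estimate of mid-wall stress: sigma = P*r/(2*h).
    A rough analytic check, not the FE-computed regional myofiber stress."""
    return pressure_kpa * radius_mm / (2.0 * thickness_mm)

# Hypothetical dilated LV vs. post-treatment geometry: reducing the cavity
# radius and thickening the wall both lower the estimated stress.
before = laplace_wall_stress(12.0, 35.0, 8.0)
after = laplace_wall_stress(12.0, 30.0, 10.0)
```

This captures the qualitative mechanism of the treatment: gel injections that thicken the wall and reduce cavity radius lower wall stress at the same pressure.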
Medicine, Issue 74, Biomedical Engineering, Anatomy, Physiology, Biophysics, Molecular Biology, Surgery, Cardiology, Cardiovascular Diseases, bioinjection, ventricular wall stress, mathematical model, heart failure, cardiac function, myocardium, left ventricle, LV, MRI, imaging, clinical techniques
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
Longitudinal Measurement of Extracellular Matrix Rigidity in 3D Tumor Models Using Particle-tracking Microrheology
Institutions: University of Massachusetts Boston.
The mechanical microenvironment has been shown to act as a crucial regulator of tumor growth behavior and signaling, which is itself remodeled and modified as part of a set of complex, two-way mechanosensitive interactions. While the development of biologically-relevant 3D tumor models has facilitated mechanistic studies on the impact of matrix rheology on tumor growth, the inverse problem of mapping changes in the mechanical environment induced by tumors remains challenging. Here, we describe the implementation of particle-tracking microrheology (PTM) in conjunction with 3D models of pancreatic cancer as part of a robust and viable approach for longitudinally monitoring physical changes in the tumor microenvironment, in situ. The methodology described here integrates a system of preparing in vitro 3D models embedded in a model extracellular matrix (ECM) scaffold of Type I collagen with fluorescently labeled probes uniformly distributed for position- and time-dependent microrheology measurements throughout the specimen. In vitro tumors are plated and probed in parallel conditions using multiwell imaging plates. Drawing on established methods, videos of tracer probe movements are transformed via the Generalized Stokes Einstein Relation (GSER) to report the complex frequency-dependent viscoelastic shear modulus, G*(ω). Because this approach is imaging-based, mechanical characterization is also mapped onto large transmitted-light spatial fields to simultaneously report qualitative changes in 3D tumor size and phenotype. Representative results showing contrasting mechanical responses in sub-regions associated with localized invasion-induced matrix degradation, as well as system calibration and validation data, are presented. Undesirable outcomes from common experimental errors and troubleshooting of these issues are also presented. The 96-well 3D culture plating format implemented in this protocol is conducive to correlation of microrheology measurements with therapeutic screening assays or molecular imaging to gain new insights into the impact of treatments or biochemical stimuli on the mechanical microenvironment.
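The PTM analysis chain (tracks → MSD → modulus) can be sketched as below, using Mason's power-law approximation to the GSER. This is a simplification of the full transform, and the probe radius and temperature in the test are assumed values:

```python
import math

def msd_1d(track, lag):
    """Time-averaged mean-squared displacement (m^2) of one probe coordinate
    at a given frame lag."""
    n = len(track) - lag
    return sum((track[i + lag] - track[i]) ** 2 for i in range(n)) / n

def gser_modulus(msd_at_tau, alpha, probe_radius, temp_k=298.0):
    """Mason's approximation to the GSER:
    |G*(omega)| ~ kB*T / (pi * a * <dr^2(1/omega)> * Gamma(1 + alpha)),
    where alpha is the local log-log slope of the MSD at tau = 1/omega
    (alpha = 1 for a purely viscous medium, 0 for an elastic solid)."""
    kB = 1.380649e-23  # Boltzmann constant, J/K
    return kB * temp_k / (math.pi * probe_radius * msd_at_tau
                          * math.gamma(1.0 + alpha))
```

Stiffening of the matrix appears as a smaller MSD at a given lag, and hence a larger |G*|; local MMP-driven degradation appears as the opposite trend.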
Bioengineering, Issue 88, viscoelasticity, mechanobiology, extracellular matrix (ECM), matrix remodeling, 3D tumor models, tumor microenvironment, stroma, matrix metalloprotease (MMP), epithelial-mesenchymal transition (EMT)
Linearization of the Bradford Protein Assay
Institutions: Tel Aviv University.
Determination of microgram quantities of protein in the Bradford Coomassie brilliant blue assay is accomplished by measurement of absorbance at 590 nm. This most common assay enables rapid and simple protein quantification in cell lysates, cellular fractions, or recombinant protein samples, for the purpose of normalization of biochemical measurements. However, an intrinsic nonlinearity compromises the sensitivity and accuracy of this method. It is shown that under standard assay conditions, the ratio of the absorbance measurements at 590 nm and 450 nm is strictly linear with protein concentration. This simple procedure increases the accuracy and improves the sensitivity of the assay about 10-fold, permitting quantification down to 50 ng of bovine serum albumin. Furthermore, the interference commonly introduced by detergents that are used to create the cell lysates is greatly reduced by the new protocol. A linear equation developed on the basis of mass action and Beer's law perfectly fits the experimental data.
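The linearization amounts to fitting a straight calibration line to the A590/A450 ratio and inverting it for unknowns. A minimal sketch follows; the absorbance values are synthetic, perfectly linear placeholders for illustration only:

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration line."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(a * b for a, b in zip(x, y))
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return slope, (sy - slope * sx) / n

# Hypothetical calibration: the A590/A450 ratio is linear in protein mass (ng).
mass_ng = [0.0, 250.0, 500.0, 1000.0]
ratio = [0.4 + 0.002 * m for m in mass_ng]  # synthetic, perfectly linear data
slope, intercept = linear_fit(mass_ng, ratio)

# Invert the calibration line for an unknown sample with ratio 0.9.
unknown_mass = (0.9 - intercept) / slope
```

With raw A590 alone this inversion would require a nonlinear standard curve; the ratio makes it a single linear fit across the whole working range.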
Cellular Biology, Issue 38, Bradford, protein assay, protein quantification, Coomassie brilliant blue
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis1,2 proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). But studies of affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings3-6. One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined following Mori's description as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP)7. Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
Waste Water Derived Electroactive Microbial Biofilms: Growth, Maintenance, and Basic Characterization
Institutions: UFZ - Helmholtz-Centre for Environmental Research.
The growth of anodic electroactive microbial biofilms from waste water inocula in a fed-batch reactor is demonstrated using a three-electrode setup controlled by a potentiostat. The use of potentiostats allows an exact adjustment of the electrode potential and ensures reproducible microbial culturing conditions. During growth, the current production is monitored using chronoamperometry (CA). Based on these data, the maximum current density (jmax) and the coulombic efficiency (CE) are discussed as measures for characterization of the bioelectrocatalytic performance. Cyclic voltammetry (CV), a nondestructive, i.e. noninvasive, method, is used to study the extracellular electron transfer (EET) of electroactive bacteria. CV measurements are performed on anodic biofilm electrodes in the presence of the microbial substrate, i.e. turnover conditions, and in the absence of the substrate, i.e. nonturnover conditions, using different scan rates. Subsequently, data analysis is exemplified and fundamental thermodynamic parameters of the microbial EET are derived and explained: peak potential (Ep), peak current density (jp), formal potential (Ef), and peak separation (ΔEp). Additionally, the limits of the method and the state-of-the-art data analysis are addressed. This video article thereby provides a guide to the basic experimental steps and the fundamental data analysis.
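The derived quantities reduce to simple formulas once the peaks are located; a minimal sketch (potentials in volts are hypothetical, and the CE example assumes acetate with 8 electrons transferred per molecule):

```python
def formal_potential(ep_anodic, ep_cathodic):
    """Formal potential Ef, taken as the midpoint of the anodic and
    cathodic peak potentials from the cyclic voltammogram."""
    return 0.5 * (ep_anodic + ep_cathodic)

def peak_separation(ep_anodic, ep_cathodic):
    """Peak separation Delta Ep between anodic and cathodic peaks."""
    return ep_anodic - ep_cathodic

def coulombic_efficiency(charge_c, mol_substrate, electrons_per_mol):
    """CE = harvested charge / theoretical charge of the dosed substrate,
    with the Faraday constant F = 96485 C/mol. For complete acetate
    oxidation, electrons_per_mol = 8."""
    F = 96485.0
    return charge_c / (F * mol_substrate * electrons_per_mol)
```

The harvested charge is the time integral of the chronoamperometric current over the feeding cycle.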
Environmental Sciences, Issue 82, Electrochemistry, Microbial fuel cell, microbial bioelectrochemical system, cyclic voltammetry, electroactive bacteria, microbial bioelectrochemistry, bioelectrocatalysis
Modeling Biological Membranes with Circuit Boards and Measuring Electrical Signals in Axons: Student Laboratory Exercises
Institutions: University of Kentucky, University of Toronto.
This is a demonstration of how electrical models can be used to characterize biological membranes. This exercise also introduces biophysical terminology used in electrophysiology. The same equipment is used in the membrane model as on live preparations. Some properties of an isolated nerve cord are investigated: nerve action potentials, recruitment of neurons, and responsiveness of the nerve cord to environmental factors.
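The circuit-board membrane model is essentially a parallel RC element, and its step response illustrates the biophysical terminology (membrane time constant) the exercise introduces. A minimal sketch with hypothetical component values:

```python
import math

def membrane_step_response(t, v_inf, R, C):
    """Voltage across an RC 'membrane' charging toward v_inf after a
    current step: V(t) = v_inf * (1 - exp(-t/tau)), tau = R*C."""
    tau = R * C
    return v_inf * (1.0 - math.exp(-t / tau))

# Hypothetical model values: a 1 Mohm 'membrane resistance' and 1 nF
# 'membrane capacitance' give tau = 1 ms; at t = tau the voltage has
# reached ~63% of its final value.
v_at_tau = membrane_step_response(1e-3, 10.0, 1e6, 1e-9)
```

The same exponential charging is what students later measure on the live nerve cord with identical recording equipment.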
Basic Protocols, Issue 47, Invertebrate, Crayfish, Modeling, Student laboratory, Nerve cord
Multi-electrode Array Recordings of Neuronal Avalanches in Organotypic Cultures
Institutions: National Institute of Mental Health.
The cortex is spontaneously active, even in the absence of any particular input or motor output. During development, this activity is important for the migration and differentiation of cortical cell types and the formation of neuronal connections1. In the mature animal, ongoing activity reflects the past and present state of an animal, into which sensory stimuli are seamlessly integrated to compute future actions. Thus, a clear understanding of the organization of ongoing, i.e. spontaneous, activity is a prerequisite to understanding cortex function.
Numerous recording techniques have revealed that ongoing activity in cortex comprises many neurons whose individual activities transiently sum to larger events that can be detected in the local field potential (LFP) with extracellular microelectrodes, or in the electroencephalogram (EEG), the magnetoencephalogram (MEG), and the BOLD signal from functional magnetic resonance imaging (fMRI). The LFP is currently the method of choice when studying neuronal population activity with high temporal and spatial resolution at the mesoscopic scale (several thousands of neurons). At the extracellular microelectrode, locally synchronized activities of spatially neighboring neurons result in rapid deflections in the LFP of up to several hundreds of microvolts. When using an array of microelectrodes, the organization of such deflections can be conveniently monitored in space and time.
Neuronal avalanches describe the scale-invariant spatiotemporal organization of ongoing neuronal activity in the brain2,3. They are specific to the superficial layers of cortex, as established in vitro4,5, in vivo in the anesthetized rat6, and in the awake monkey7. Importantly, both theoretical and empirical studies2,8-10 suggest that neuronal avalanches indicate an exquisitely balanced critical state dynamics of cortex that optimizes information transfer and information processing.
In order to study the mechanisms of neuronal avalanche development, maintenance, and regulation, in vitro preparations are highly beneficial, as they allow for stable recordings of avalanche activity under precisely controlled conditions. The current protocol describes how to study neuronal avalanches in vitro by taking advantage of superficial layer development in organotypic cortex cultures, i.e. slice cultures, grown on planar, integrated microelectrode arrays (MEA; see also11-14).
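Once LFP deflections are detected on each electrode and binned in time, avalanche extraction reduces to finding runs of active bins; a minimal sketch:

```python
def avalanche_sizes(binned_events):
    """Segment a binned event train (deflection counts summed across all
    electrodes per time bin) into avalanches: runs of nonzero bins bounded
    by empty bins. Returns the size (total event count) of each avalanche."""
    sizes, current = [], 0
    for count in binned_events:
        if count > 0:
            current += count
        elif current:
            sizes.append(current)
            current = 0
    if current:           # close an avalanche running to the end of the record
        sizes.append(current)
    return sizes
```

In avalanche analysis, the distribution of these sizes is then tested against a power law with an exponent near -1.5, the hallmark of the critical state discussed above.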
Neuroscience, Issue 54, neuronal activity, neuronal avalanches, organotypic culture, slice culture, microelectrode array, electrophysiology, local field potential, extracellular spikes
A Simple Stimulatory Device for Evoking Point-like Tactile Stimuli: A Searchlight for LFP to Spike Transitions
Institutions: National Research Council, National Research Council, University of Manchester.
Current neurophysiological research aims to develop methodologies to investigate the signal route from neuron to neuron, namely in the transitions from spikes to Local Field Potentials (LFPs) and from LFPs to spikes.
LFPs have a complex dependence on spike activity and their relation is still poorly understood1. The elucidation of these signal relations would be helpful both for clinical diagnostics (e.g. stimulation paradigms for Deep Brain Stimulation) and for a deeper comprehension of neural coding strategies in normal and pathological conditions (e.g. epilepsy, Parkinson's disease, chronic pain). To this aim, one has to solve technical issues related to stimulation devices, stimulation paradigms, and computational analyses. Therefore, a custom-made stimulation device was developed to deliver stimuli well regulated in space and time that does not incur mechanical resonance. Subsequently, as an exemplification, a set of reliable LFP-spike relationships was extracted.
The performance of the device was investigated by extracellular recordings, jointly spikes and LFP responses to the applied stimuli, from the rat Primary Somatosensory cortex. Then, by means of a multi-objective optimization strategy, a predictive model for spike occurrence based on LFPs was estimated.
The application of this paradigm shows that the device is adequately suited to deliver high frequency tactile stimulation, outperforming common piezoelectric actuators. As a proof of the efficacy of the device, the following results were presented: 1) the timing and reliability of LFP responses well match the spike responses, 2) LFPs are sensitive to the stimulation history and capture not only the average response but also the trial-to-trial fluctuations in the spike activity and, finally, 3) by using the LFP signal it is possible to estimate a range of predictive models that capture different aspects of the spike activity.
Neuroscience, Issue 85, LFP, spike, tactile stimulus, Multiobjective function, Neuron, somatosensory cortex
Barnes Maze Testing Strategies with Small and Large Rodent Models
Institutions: University of Missouri, Food and Drug Administration.
Spatial learning and memory of laboratory rodents is often assessed via navigational ability in mazes, the most popular of which are the water and dry-land (Barnes) mazes. Improved performance over sessions or trials is thought to reflect learning and memory of the escape cage/platform location. Considered less stressful than water mazes, the Barnes maze is a relatively simple design of a circular platform top with several holes equally spaced around the perimeter edge. All but one of the holes are false-bottomed or blind-ending, while one leads to an escape cage. Mildly aversive stimuli (e.g. bright overhead lights) provide motivation to locate the escape cage. Latency to locate the escape cage can be measured during the session; however, additional endpoints typically require video recording. From those video recordings, automated tracking software can generate a variety of endpoints that are similar to those produced in water mazes (e.g. distance traveled, velocity/speed, time spent in the correct quadrant, time spent moving/resting, and confirmation of latency). The type of search strategy (i.e. random, serial, or direct) can be categorized as well. Barnes maze construction and testing methodologies can differ for small rodents, such as mice, and large rodents, such as rats. For example, while extra-maze cues are effective for rats, smaller wild rodents may require intra-maze cues with a visual barrier around the maze. Appropriate stimuli must be identified which motivate the rodent to locate the escape cage. Both Barnes and water mazes can be time-consuming, as 4-7 test trials are typically required to detect improved learning and memory performance (e.g. shorter latencies or path lengths to locate the escape platform or cage) and/or differences between experimental groups. Even so, the Barnes maze is a widely employed behavioral assessment measuring spatial navigational abilities and their potential disruption by genetic or neurobehavioral manipulations, or drug/toxicant exposure.
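The tracking-derived endpoints reduce to simple geometry on the (x, y) trajectory exported by the tracking software; a minimal sketch (the coordinates and frame rate in the test are hypothetical):

```python
import math

def path_length(xy):
    """Total distance traveled along a tracked (x, y) trajectory
    (same units as the coordinates)."""
    return sum(math.dist(xy[i], xy[i + 1]) for i in range(len(xy) - 1))

def mean_speed(xy, fps):
    """Average speed = path length / elapsed time, for a trajectory
    sampled at fps frames per second."""
    elapsed = (len(xy) - 1) / fps
    return path_length(xy) / elapsed
```

Latency is simply the frame index at which the animal first reaches the escape hole divided by the frame rate; shorter latencies and path lengths across trials indicate learning.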
Behavior, Issue 84, spatial navigation, rats, Peromyscus, mice, intra- and extra-maze cues, learning, memory, latency, search strategy, escape motivation
Bringing the Visible Universe into Focus with Robo-AO
Institutions: California Institute of Technology, California Institute of Technology, University of Toronto, Inter-University Centre for Astronomy & Astrophysics, Observatories of the Carnegie Institution for Science, Weizmann Institute of Science.
The angular resolution of ground-based optical telescopes is limited by the degrading effects of the turbulent atmosphere. In the absence of an atmosphere, the angular resolution of a typical telescope is limited only by diffraction, i.e., the wavelength of interest, λ, divided by the size of its primary mirror's aperture, D. For example, the Hubble Space Telescope (HST), with a 2.4-m primary mirror, has an angular resolution at visible wavelengths of ~0.04 arc seconds. The atmosphere is composed of air at slightly different temperatures, and therefore different indices of refraction, constantly mixing. Light waves are bent as they pass through the inhomogeneous atmosphere. When a telescope on the ground focuses these light waves, instantaneous images appear fragmented, changing as a function of time. As a result, long-exposure images acquired using ground-based telescopes - even telescopes with four times the diameter of HST - appear blurry and have an angular resolution of roughly 0.5 to 1.5 arc seconds at best.
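The λ/D limit quoted above is easy to verify numerically:

```python
RAD_TO_ARCSEC = 206264.806  # arc seconds per radian

def diffraction_limit_arcsec(wavelength_m, aperture_m):
    """Diffraction-limited angular resolution ~ lambda / D, converted
    from radians to arc seconds."""
    return (wavelength_m / aperture_m) * RAD_TO_ARCSEC

# HST: 550 nm visible light, 2.4-m mirror -> ~0.047 arcsec,
# consistent with the ~0.04 arc second figure quoted above.
hst = diffraction_limit_arcsec(550e-9, 2.4)
```

The same formula gives roughly 0.08 arc seconds for a 1.5-m aperture at 550 nm, which is the scale an adaptive optics system on such a telescope aims to recover.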
Astronomical adaptive-optics systems compensate for the effects of atmospheric turbulence. First, the shape of the incoming non-planar wave is determined using measurements of a nearby bright star by a wavefront sensor. Next, an element in the optical system, such as a deformable mirror, is commanded to correct the shape of the incoming light wave. Additional corrections are made at a rate sufficient to keep up with the dynamically changing atmosphere through which the telescope looks, ultimately producing diffraction-limited images.
The fidelity of the wavefront sensor measurement is based upon how well the incoming light is spatially and temporally sampled1. Finer sampling requires brighter reference objects. While the brightest stars can serve as reference objects for imaging targets from several to tens of arc seconds away in the best conditions, most interesting astronomical targets do not have sufficiently bright stars nearby. One solution is to focus a high-power laser beam in the direction of the astronomical target to create an artificial reference of known shape, also known as a 'laser guide star'. The Robo-AO laser adaptive optics system2,3 employs a 10-W ultraviolet laser focused at a distance of 10 km to generate a laser guide star. Wavefront sensor measurements of the laser guide star drive the adaptive optics correction, resulting in diffraction-limited images that have an angular resolution of ~0.1 arc seconds on a 1.5-m telescope.
Physics, Issue 72, Astronomy, Mechanical Engineering, Astrophysics, Optics, Adaptive optics, lasers, wavefront sensing, robotics, stars, galaxies, imaging, supernova, telescopes