Improved glomerular filtration rate estimation by an artificial neural network.
PUBLISHED: 02-01-2013
Accurate evaluation of glomerular filtration rates (GFRs) is of critical importance in clinical practice. A previous study showed that models based on artificial neural networks (ANNs) could achieve a better performance than traditional equations. However, large-sample cross-sectional surveys have not resolved questions about ANN performance.
To enable intuitive operation of powered artificial legs, an interface between user and prosthesis that can recognize the user's movement intent is desired. A novel neural-machine interface (NMI) based on neuromuscular-mechanical fusion, developed in our previous study, has demonstrated great potential to accurately identify the intended movement of transfemoral amputees. However, this interface has not yet been integrated with a powered prosthetic leg for true neural control. This study aimed to report (1) a flexible platform to implement and optimize neural control of a powered lower limb prosthesis and (2) an experimental setup and protocol to evaluate neural prosthesis control in patients with lower limb amputations. First, a platform based on a PC and a visual programming environment was developed to implement the prosthesis control algorithms, including the NMI training algorithm, the NMI online testing algorithm, and the intrinsic control algorithm. To demonstrate the function of this platform, the NMI based on neuromuscular-mechanical fusion was hierarchically integrated with intrinsic control of a prototypical transfemoral prosthesis. One patient with a unilateral transfemoral amputation was recruited to evaluate the implemented neural controller while continuously performing activities such as standing, level-ground walking, ramp ascent, and ramp descent in the laboratory. A novel experimental setup and protocol were developed to test the new prosthesis control safely and efficiently. The presented proof-of-concept platform, experimental setup, and protocol could aid the future development and application of neurally controlled powered artificial legs.
25 Related JoVE Articles
Swimming Performance Assessment in Fishes
Authors: Keith B. Tierney.
Institutions: University of Alberta.
Swimming performance tests of fish have been integral to studies of muscle energetics, swimming mechanics, gas exchange, cardiac physiology, disease, pollution, hypoxia and temperature. This paper describes a flexible protocol to assess fish swimming performance using equipment in which water velocity can be controlled. The protocol involves one to several stepped increases in flow speed that are intended to cause fish to fatigue. Step speeds and their duration can be set to capture swimming abilities of different physiological and ecological relevance. Most frequently step size is set to determine critical swimming velocity (Ucrit), which is intended to capture maximum sustained swimming ability. Traditionally this test has consisted of approximately ten steps each of 20 min duration. However, steps of shorter duration (e.g. 1 min) are increasingly being utilized to capture acceleration ability or burst swimming performance. Regardless of step size, swimming tests can be repeated over time to gauge individual variation and recovery ability. Endpoints related to swimming such as measures of metabolic rate, fin use, ventilation rate, and of behavior, such as the distance between schooling fish, are often included before, during and after swimming tests. Given the diversity of fish species, the number of unexplored research questions, and the importance of many species to global ecology and economic health, studies of fish swimming performance will remain popular and invaluable for the foreseeable future.
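The stepped-velocity protocol described above yields the critical swimming velocity via Brett's (1964) formula: the last fully completed velocity plus the fraction of the final step endured, times the velocity increment. A minimal sketch (function name and units are illustrative, not part of the protocol):

```python
def u_crit(u_last, u_step, t_fatigue, t_step):
    """Critical swimming velocity, Ucrit (Brett, 1964).

    u_last    -- highest velocity fully completed (e.g. cm/s)
    u_step    -- velocity increment between steps
    t_fatigue -- time swum in the final, uncompleted step
    t_step    -- prescribed duration of each step (same units as t_fatigue)
    """
    return u_last + (t_fatigue / t_step) * u_step

# A fish completing the 50 cm/s step, then fatiguing 8 min into a
# 20 min step with 10 cm/s increments:
print(u_crit(50, 10, 8, 20))  # 54.0 cm/s
```

The same function applies to short (e.g. 1 min) steps used for burst performance; only t_step changes.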
Physiology, Issue 51, fish, swimming, Ucrit, burst, sustained, prolonged, schooling performance
Correlating Behavioral Responses to fMRI Signals from Human Prefrontal Cortex: Examining Cognitive Processes Using Task Analysis
Authors: Joseph F.X. DeSouza, Shima Ovaysikia, Laura K. Pynn.
Institutions: Centre for Vision Research, York University.
The aim of this methods paper is to describe how to implement a neuroimaging technique to examine complementary brain processes engaged by two similar tasks. Participants' behavior during task performance in an fMRI scanner can then be correlated with brain activity using the blood-oxygen-level-dependent (BOLD) signal. We measure behavior so that correct trials, in which the subject performed the task correctly, can be sorted and the brain signals related to correct performance examined. Conversely, if error trials are included in the same analysis as correct trials, the resulting signal no longer reflects correct performance alone. In many cases these error trials can themselves be correlated with brain activity. We describe two complementary tasks used in our lab to examine the brain during suppression of an automatic response: the Stroop1 and anti-saccade tasks. The emotional Stroop paradigm instructs participants to report either the emotional 'word' superimposed across the affective faces or the facial 'expressions' of the face stimuli1,2. When the word and the facial expression refer to different emotions, a conflict arises between what must be said and what is automatically read. The participant has to resolve the conflict between two simultaneously competing processes: word reading and facial expression recognition. Our urge to read out a word leads to strong stimulus-response (SR) associations; inhibiting these strong SRs is therefore difficult, and participants are prone to making errors. Overcoming this conflict and directing attention away from the face or the word requires the subject to inhibit bottom-up processes, which typically direct attention to the more salient stimulus. Similarly, in the anti-saccade task3,4,5,6, an instruction cue directs attention to a peripheral stimulus location, but the eye movement must be made to the mirror-opposite position.
Yet again we measure behavior by recording the eye movements of participants, which allows the behavioral responses to be sorted into correct and error trials7 that can then be correlated with brain activity. Neuroimaging now allows researchers to measure different behaviors on correct and error trials that are indicative of different cognitive processes and to pinpoint the different neural networks involved.
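The trial-sorting step described above can be sketched in a few lines. In an anti-saccade trial, the first saccade is correct when it goes opposite the cue; the dictionary keys below are illustrative assumptions, not the authors' actual data format:

```python
def sort_antisaccade_trials(trials):
    """Split anti-saccade trials into correct and error lists.

    Each trial is a dict with (hypothetical) keys 'cue_side' and
    'saccade_side'; a trial is correct when the first saccade went
    to the side opposite the cue.
    """
    correct, errors = [], []
    for trial in trials:
        if trial["saccade_side"] != trial["cue_side"]:
            correct.append(trial)
        else:
            errors.append(trial)
    return correct, errors
```

The two lists can then be used to build separate regressors for the fMRI analysis, so that correct and error trials are modeled independently.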
Neuroscience, Issue 64, fMRI, eyetracking, BOLD, attention, inhibition, Magnetic Resonance Imaging, MRI
Lensfree On-chip Tomographic Microscopy Employing Multi-angle Illumination and Pixel Super-resolution
Authors: Serhan O. Isikman, Waheb Bishara, Aydogan Ozcan.
Institutions: University of California, Los Angeles.
Tomographic imaging has been a widely used tool in medicine as it can provide three-dimensional (3D) structural information regarding objects of different size scales. At micrometer and millimeter scales, optical microscopy modalities find increasing use owing to the non-ionizing nature of visible light and the availability of a rich set of illumination sources (such as lasers and light-emitting diodes) and detection elements (such as large-format CCD and CMOS detector arrays). Recently developed optical tomographic microscopy modalities include optical coherence tomography, optical diffraction tomography, optical projection tomography, and light-sheet microscopy1-6. These platforms provide sectional imaging of cells, microorganisms, and model animals such as C. elegans, zebrafish, and mouse embryos. Existing 3D optical imagers generally have relatively bulky and complex architectures, limiting the availability of this equipment to advanced laboratories and impeding its integration with lab-on-a-chip platforms and microfluidic chips. To provide an alternative tomographic microscope, we recently developed lensfree optical tomography (LOT) as a high-throughput, compact, and cost-effective optical tomography modality7. LOT discards lenses and bulky optical components, and instead relies on multi-angle illumination and digital computation to achieve depth-resolved imaging of micro-objects over a large imaging volume. LOT can image biological specimens at a spatial resolution of <1 μm x <1 μm x <3 μm in the x, y, and z dimensions, respectively, over a large imaging volume of 15-100 mm3, and can be particularly useful for lab-on-a-chip platforms.
Bioengineering, Issue 66, Electrical Engineering, Mechanical Engineering, lensfree imaging, lensless imaging, on-chip microscopy, lensfree tomography, 3D microscopy, pixel super-resolution, C. elegans, optical sectioning, lab-on-a-chip
Synthesis of an Intein-mediated Artificial Protein Hydrogel
Authors: Miguel A. Ramirez, Zhilei Chen.
Institutions: Texas A&M University, College Station.
We present the synthesis of a highly stable protein hydrogel mediated by a split-intein-catalyzed protein trans-splicing reaction. The building blocks of this hydrogel are two protein block-copolymers each containing a subunit of a trimeric protein that serves as a crosslinker and one half of a split intein. A highly hydrophilic random coil is inserted into one of the block-copolymers for water retention. Mixing of the two protein block copolymers triggers an intein trans-splicing reaction, yielding a polypeptide unit with crosslinkers at either end that rapidly self-assembles into a hydrogel. This hydrogel is very stable under both acidic and basic conditions, at temperatures up to 50 °C, and in organic solvents. The hydrogel rapidly reforms after shear-induced rupture. Incorporation of a "docking station peptide" into the hydrogel building block enables convenient incorporation of "docking protein"-tagged target proteins. The hydrogel is compatible with tissue culture growth media, supports the diffusion of 20 kDa molecules, and enables the immobilization of bioactive globular proteins. The application of the intein-mediated protein hydrogel as an organic-solvent-compatible biocatalyst was demonstrated by encapsulating the horseradish peroxidase enzyme and corroborating its activity.
Bioengineering, Issue 83, split-intein, self-assembly, shear-thinning, enzyme, immobilization, organic synthesis
Microwave-assisted Functionalization of Poly(ethylene glycol) and On-resin Peptides for Use in Chain Polymerizations and Hydrogel Formation
Authors: Amy H. Van Hove, Brandon D. Wilson, Danielle S. W. Benoit.
Institutions: University of Rochester, University of Rochester Medical Center.
One of the main benefits to using poly(ethylene glycol) (PEG) macromers in hydrogel formation is synthetic versatility. The ability to draw from a large variety of PEG molecular weights and configurations (arm number, arm length, and branching pattern) affords researchers tight control over resulting hydrogel structures and properties, including Young’s modulus and mesh size. This video will illustrate a rapid, efficient, solvent-free, microwave-assisted method to methacrylate PEG precursors into poly(ethylene glycol) dimethacrylate (PEGDM). This synthetic method provides much-needed starting materials for applications in drug delivery and regenerative medicine. The demonstrated method is superior to traditional methacrylation methods as it is significantly faster and simpler, as well as more economical and environmentally friendly, using smaller amounts of reagents and solvents. We will also demonstrate an adaptation of this technique for on-resin methacrylamide functionalization of peptides. This on-resin method allows the N-terminus of peptides to be functionalized with methacrylamide groups prior to deprotection and cleavage from resin. This allows for selective addition of methacrylamide groups to the N-termini of the peptides while amino acids with reactive side groups (e.g. primary amine of lysine, primary alcohol of serine, secondary alcohols of threonine, and phenol of tyrosine) remain protected, preventing functionalization at multiple sites. This article will detail common analytical methods (proton Nuclear Magnetic Resonance spectroscopy (1H-NMR) and Matrix Assisted Laser Desorption Ionization Time of Flight mass spectrometry (MALDI-ToF)) to assess the efficiency of the functionalizations. Common pitfalls and suggested troubleshooting methods will be addressed, as will modifications of the technique which can be used to further tune macromer functionality and resulting hydrogel physical and chemical properties.
Use of synthesized products for the formation of hydrogels for drug delivery and cell-material interaction studies will be demonstrated, with particular attention paid to modifying hydrogel composition to affect mesh size, controlling hydrogel stiffness and drug release.
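Functionalization efficiency from 1H-NMR is typically estimated by comparing proton-normalized peak integrals. A sketch of that arithmetic follows; the default proton counts are illustrative assumptions, since the correct values depend on the macromer and the reference peak chosen:

```python
def percent_functionalization(vinyl_integral, reference_integral,
                              vinyl_protons=2, reference_protons=3):
    """Estimate % methacrylation from 1H-NMR peak integrals.

    Each integral is divided by the number of protons it represents;
    the ratio of the normalized integrals gives the fraction of end
    groups carrying a methacrylate, expressed as a percentage. The
    defaults (2 vinyl protons vs. a 3-proton reference peak) are
    placeholders, not values from the protocol.
    """
    vinyl_per_proton = vinyl_integral / vinyl_protons
    reference_per_proton = reference_integral / reference_protons
    return 100.0 * vinyl_per_proton / reference_per_proton
```

For example, a vinyl integral of 1.8 against a reference integral of 3.0 with these proton counts corresponds to 90% functionalization.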
Chemistry, Issue 80, Poly(ethylene glycol), peptides, polymerization, polymers, methacrylation, peptide functionalization, 1H-NMR, MALDI-ToF, hydrogels, macromer synthesis
Automated, Quantitative Cognitive/Behavioral Screening of Mice: For Genetics, Pharmacology, Animal Cognition and Undergraduate Instruction
Authors: C. R. Gallistel, Fuat Balci, David Freestone, Aaron Kheifets, Adam King.
Institutions: Rutgers University, Koç University, New York University, Fairfield University.
We describe a high-throughput, high-volume, fully automated, live-in 24/7 behavioral testing system for assessing the effects of genetic and pharmacological manipulations on basic mechanisms of cognition and learning in mice. A standard polypropylene mouse housing tub is connected through an acrylic tube to a standard commercial mouse test box. The test box has 3 hoppers, 2 of which are connected to pellet feeders. All are internally illuminable with an LED and monitored for head entries by infrared (IR) beams. Mice live in the environment, which eliminates handling during screening. They obtain their food during two or more daily feeding periods by performing in operant (instrumental) and Pavlovian (classical) protocols, for which we have written protocol-control software and quasi-real-time data analysis and graphing software. The data analysis and graphing routines are written in a MATLAB-based language created to simplify greatly the analysis of large time-stamped behavioral and physiological event records and to preserve a full data trail from raw data through all intermediate analyses to the published graphs and statistics within a single data structure. The data-analysis code harvests the data several times a day and subjects it to statistical and graphical analyses, which are automatically stored in the "cloud" and on in-lab computers. Thus, the progress of individual mice is visualized and quantified daily. The data-analysis code talks to the protocol-control code, permitting the automated advance from protocol to protocol of individual subjects. The behavioral protocols implemented are matching, autoshaping, timed hopper-switching, risk assessment in timed hopper-switching, impulsivity measurement, and the circadian anticipation of food availability. Open-source protocol-control and data-analysis code makes the addition of new protocols simple. 
Eight test environments fit in a 48 in x 24 in x 78 in cabinet; two such cabinets (16 environments) may be controlled by one computer.
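The harvesting of time-stamped behavioral event records described above can be illustrated with a small standard-library sketch. This is not the authors' MATLAB-based system; the record layout (time, subject, event) is an assumption made for the example:

```python
from collections import defaultdict

def summarize_event_log(records):
    """Summarize time-stamped (time_s, subject, event) tuples.

    For each subject and event type, report the number of occurrences
    and the mean interval between successive occurrences -- the kind of
    quasi-real-time summary the protocol's analysis code produces daily.
    """
    times = defaultdict(list)
    for t, subject, event in sorted(records):
        times[(subject, event)].append(t)
    summary = {}
    for key, ts in times.items():
        gaps = [b - a for a, b in zip(ts, ts[1:])]
        summary[key] = {
            "count": len(ts),
            "mean_interval_s": sum(gaps) / len(gaps) if gaps else None,
        }
    return summary
```

Keeping raw events, intermediate summaries, and plots in one structure, as the abstract describes, preserves a full data trail from the IR-beam head entries to the published statistics.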
Behavior, Issue 84, genetics, cognitive mechanisms, behavioral screening, learning, memory, timing
5/6th Nephrectomy in Combination with High Salt Diet and Nitric Oxide Synthase Inhibition to Induce Chronic Kidney Disease in the Lewis Rat
Authors: Arianne van Koppen, Marianne C. Verhaar, Lennart G. Bongartz, Jaap A. Joles.
Institutions: University Medical Center Utrecht.
Chronic kidney disease (CKD) is a global problem. Slowing CKD progression is a major health priority. Since CKD is characterized by complex derangements of homeostasis, integrative animal models are necessary to study its development and progression. To study the development of CKD and novel therapeutic interventions in CKD, we use the 5/6th nephrectomy ablation model, a well-known experimental model of progressive renal disease resembling several aspects of human CKD. The gross reduction in renal mass causes progressive glomerular and tubulo-interstitial injury, loss of remnant nephrons, and development of systemic and glomerular hypertension. It is also associated with progressive intrarenal capillary loss, inflammation, and glomerulosclerosis. Risk factors for CKD invariably impact endothelial function. To mimic this, we combine removal of 5/6th of renal mass with nitric oxide (NO) depletion and a high salt diet. After arrival and acclimatization, animals receive the NO synthase inhibitor NG-nitro-L-arginine (L-NNA) supplemented in drinking water (20 mg/L) for a period of 4 weeks, followed by right-sided uninephrectomy. One week later, a subtotal nephrectomy (SNX) is performed on the left side. After SNX, animals are allowed to recover for two days, followed by L-NNA in drinking water (20 mg/L) for a further period of 4 weeks. A high salt diet (6%), supplemented in ground chow (see time line Figure 1), is continued throughout the experiment. Progression of renal failure is followed over time by measuring plasma urea, systolic blood pressure, and proteinuria. By six weeks after SNX, renal failure has developed. Renal function is measured using 'gold standard' inulin and para-amino hippuric acid (PAH) clearance technology. This model of CKD is characterized by a reduction in glomerular filtration rate (GFR) and effective renal plasma flow (ERPF), hypertension (systolic blood pressure >150 mmHg), proteinuria (>50 mg/24 hr), and mild uremia (plasma urea >10 mM).
Histological features include tubulo-interstitial damage, reflected by inflammation, tubular atrophy, and fibrosis, and focal glomerulosclerosis leading to a massive reduction of healthy glomeruli within the remnant population (<10%). Follow-up until 12 weeks after SNX shows further progression of CKD.
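Both 'gold standard' measurements mentioned above rest on the same clearance formula, C = (U x V) / P: with inulin as the marker the clearance equals GFR, and with PAH it approximates effective renal plasma flow (ERPF). A minimal sketch, with purely illustrative numbers:

```python
def renal_clearance(urine_conc, urine_flow, plasma_conc):
    """Classic renal clearance, C = (U x V) / P.

    urine_conc and plasma_conc must share units; the result takes
    the units of urine_flow (e.g. ml/min). With inulin this is GFR;
    with para-amino hippuric acid (PAH) it approximates ERPF.
    """
    return urine_conc * urine_flow / plasma_conc

# Illustrative example: urine inulin 30 mg/ml, urine flow 0.01 ml/min,
# plasma inulin 0.3 mg/ml -> GFR of 1.0 ml/min.
```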
Medicine, Issue 77, Anatomy, Physiology, Biomedical Engineering, Surgery, Nephrology, Kidney Diseases, Glomerular Filtration Rate, Hemodynamics, Surgical Procedures, Operative, Chronic kidney disease, remnant kidney, chronic renal diseases, kidney, Nitric Oxide depletion, NO depletion, high salt diet, proteinuria, uremia, glomerulosclerosis, transgenic rat, animal model
Simultaneous Long-term Recordings at Two Neuronal Processing Stages in Behaving Honeybees
Authors: Martin Fritz Brill, Maren Reuter, Wolfgang Rössler, Martin Fritz Strube-Bloss.
Institutions: University of Würzburg.
In both mammals and insects, neuronal information is processed in different higher and lower order brain centers. These centers are coupled via convergent and divergent anatomical connections, including feed-forward and feedback wiring. Furthermore, information of the same origin is partially sent via parallel pathways to different, and sometimes into the same, brain areas. To understand the evolutionary benefits as well as the computational advantages of these wiring strategies, and especially their temporal dependencies on each other, it is necessary to have simultaneous access to single neurons of different tracts or neuropiles in the same preparation at high temporal resolution. Here we concentrate on honeybees by demonstrating unique extracellular long-term access to record multi-unit activity at two subsequent neuropiles1, the antennal lobe (AL), the first olfactory processing stage, and the mushroom body (MB), a higher order integration center involved in learning and memory formation, or at two parallel neuronal tracts2 connecting the AL with the MB. The latter was chosen as an example and will be described in full. In the supporting video, the construction and permanent insertion of flexible multi-channel wire electrodes is demonstrated. Pairwise differential amplification of the microwire electrode channels drastically reduces the noise and verifies that the source of the signal is closely related to the position of the electrode tip. The mechanical flexibility of the wire electrodes allows stable invasive long-term recordings over many hours up to days, which is a clear advantage over conventional extra- and intracellular in vivo recording techniques.
Neuroscience, Issue 89, honeybee brain, olfaction, extracellular long term recordings, double recordings, differential wire electrodes, single unit, multi-unit recordings
Quantifying Glomerular Permeability of Fluorescent Macromolecules Using 2-Photon Microscopy in Munich Wistar Rats
Authors: Ruben M. Sandoval, Bruce A. Molitoris.
Institutions: Indiana University School of Medicine.
Kidney diseases involving urinary loss of large essential macromolecules, such as serum albumin, have long been thought to be caused by alterations in the permeability barrier, which is composed of podocytes, vascular endothelial cells, and a basement membrane working in unison. Data from our laboratory using intravital 2-photon microscopy revealed a more permeable glomerular filtration barrier (GFB) than previously thought under physiologic conditions, with retrieval of filtered albumin occurring in an early subset of cells called proximal tubule cells (PTC)1,2,3. Previous techniques used to study renal filtration and establish the characteristics of the filtration barrier involved micropuncture of the lumen of these early tubular segments, with sampling and analysis of the fluid content4. These studies determined albumin concentration in the luminal fluid to be virtually non-existent, corresponding closely to what is normally detected in the urine. However, characterization of dextran polymers of defined sizes by this technique revealed that those of a size similar to serum albumin had higher levels in the tubular lumen and urine, suggesting increased permeability5. Herein is a detailed outline of the technique used to directly visualize and quantify glomerular fluorescent albumin permeability in vivo. This method allows for detection of filtered albumin across the filtration barrier into Bowman's space (the initial chamber of urinary filtration), and also allows quantification of albumin reabsorption by proximal tubules and visualization of subsequent albumin transcytosis6. The absence of fluorescent albumin along later tubular segments en route to the bladder highlights the efficiency of the retrieval pathway in the earlier proximal tubule segments. Moreover, when this technique was applied to determine the permeability of dextrans of a size similar to albumin, virtually identical permeability values were reported2.
These observations directly support the need to expand the focus of many proteinuric renal diseases to include alterations in proximal tubule cell reclamation.
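The glomerular sieving coefficient (GSC) quantification mentioned in the keywords reduces to a ratio of background-corrected fluorescence intensities: Bowman's space over glomerular capillary plasma. A sketch of that arithmetic, with illustrative variable names (in practice both intensities are averaged over several regions of interest):

```python
def glomerular_sieving_coefficient(bowman_intensity, plasma_intensity,
                                   background_intensity):
    """Glomerular sieving coefficient (GSC) from 2-photon image data.

    Ratio of background-corrected fluorescence in Bowman's space to
    that in the glomerular capillary plasma; a GSC of 1.0 would mean
    the macromolecule passes the filtration barrier freely.
    """
    numerator = bowman_intensity - background_intensity
    denominator = plasma_intensity - background_intensity
    return numerator / denominator
```

For example, a Bowman's-space intensity of 30, a plasma intensity of 1010, and a background of 10 give a GSC of 0.02.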
Medicine, Issue 74, Biomedical Engineering, Molecular Biology, Cellular Biology, Anatomy, Physiology, Surgery, Nephrology, Kidney Diseases, Two-photon microscopy, Kidney, Glomerulus, Glomerular Sieving Coefficient (GSC), Permeability, Proximal Tubule, Proteinuria, macromolecules, 2 Photon, microscopy, intravital imaging, munich wistar rat, animal model
Movement Retraining using Real-time Feedback of Performance
Authors: Michael Anthony Hunt.
Institutions: University of British Columbia .
Any modification of movement - especially movement patterns that have been honed over a number of years - requires re-organization of the neuromuscular patterns responsible for governing the movement performance. This motor learning can be enhanced through a number of methods that are utilized in research and clinical settings alike. In general, verbal feedback of performance in real-time or knowledge of results following movement is commonly used clinically as a preliminary means of instilling motor learning. Depending on patient preference and learning style, visual feedback (e.g. through use of a mirror or different types of video) or proprioceptive guidance utilizing therapist touch, are used to supplement verbal instructions from the therapist. Indeed, a combination of these forms of feedback is commonplace in the clinical setting to facilitate motor learning and optimize outcomes. Laboratory-based, quantitative motion analysis has been a mainstay in research settings to provide accurate and objective analysis of a variety of movements in healthy and injured populations. While the actual mechanisms of capturing the movements may differ, all current motion analysis systems rely on the ability to track the movement of body segments and joints and to use established equations of motion to quantify key movement patterns. Due to limitations in acquisition and processing speed, analysis and description of the movements has traditionally occurred offline after completion of a given testing session. This paper will highlight a new supplement to standard motion analysis techniques that relies on the near instantaneous assessment and quantification of movement patterns and the display of specific movement characteristics to the patient during a movement analysis session. As a result, this novel technique can provide a new method of feedback delivery that has advantages over currently used feedback methods.
Medicine, Issue 71, Biophysics, Anatomy, Physiology, Physics, Biomedical Engineering, Behavior, Psychology, Kinesiology, Physical Therapy, Musculoskeletal System, Biofeedback, biomechanics, gait, movement, walking, rehabilitation, clinical, training
Tissue Triage and Freezing for Models of Skeletal Muscle Disease
Authors: Hui Meng, Paul M.L. Janssen, Robert W. Grange, Lin Yang, Alan H. Beggs, Lindsay C. Swanson, Stacy A. Cossette, Alison Frase, Martin K. Childers, Henk Granzier, Emanuela Gussoni, Michael W. Lawlor.
Institutions: Medical College of Wisconsin, The Ohio State University, Virginia Tech, University of Kentucky, Boston Children's Hospital, Harvard Medical School, Cure Congenital Muscular Dystrophy, Joshua Frase Foundation, University of Washington, University of Arizona.
Skeletal muscle is a unique tissue because of its structure and function, which requires specific protocols for tissue collection to obtain optimal results from functional, cellular, molecular, and pathological evaluations. Due to the subtlety of some pathological abnormalities seen in congenital muscle disorders and the potential for fixation to interfere with the recognition of these features, pathological evaluation of frozen muscle is preferable to fixed muscle when evaluating skeletal muscle for congenital muscle disease. Additionally, the potential to produce severe freezing artifacts in muscle requires specific precautions when freezing skeletal muscle for histological examination that are not commonly used when freezing other tissues. This manuscript describes a protocol for rapid freezing of skeletal muscle using isopentane (2-methylbutane) cooled with liquid nitrogen to preserve optimal skeletal muscle morphology. This procedure is also effective for freezing tissue intended for genetic or protein expression studies. Furthermore, we have integrated our freezing protocol into a broader procedure that also describes preferred methods for the short term triage of tissue for (1) single fiber functional studies and (2) myoblast cell culture, with a focus on the minimum effort necessary to collect tissue and transport it to specialized research or reference labs to complete these studies. Overall, this manuscript provides an outline of how fresh tissue can be effectively distributed for a variety of phenotypic studies and thereby provides standard operating procedures (SOPs) for pathological studies related to congenital muscle disease.
Basic Protocol, Issue 89, Tissue, Freezing, Muscle, Isopentane, Pathology, Functional Testing, Cell Culture
Cortical Source Analysis of High-Density EEG Recordings in Children
Authors: Joe Bathelt, Helen O'Reilly, Michelle de Haan.
Institutions: UCL Institute of Child Health, University College London.
EEG is traditionally described as a neuroimaging technique with high temporal and low spatial resolution. Recent advances in biophysical modelling and signal processing make it possible to exploit information from other imaging modalities like structural MRI that provide high spatial resolution to overcome this constraint1. This is especially useful for investigations that require high resolution in the temporal as well as spatial domain. In addition, due to the easy application and low cost of EEG recordings, EEG is often the method of choice when working with populations, such as young children, that do not tolerate functional MRI scans well. However, in order to investigate which neural substrates are involved, anatomical information from structural MRI is still needed. Most EEG analysis packages work with standard head models that are based on adult anatomy. The accuracy of these models when used for children is limited2, because the composition and spatial configuration of head tissues changes dramatically over development3.  In the present paper, we provide an overview of our recent work in utilizing head models based on individual structural MRI scans or age specific head models to reconstruct the cortical generators of high density EEG. This article describes how EEG recordings are acquired, processed, and analyzed with pediatric populations at the London Baby Lab, including laboratory setup, task design, EEG preprocessing, MRI processing, and EEG channel level and source analysis. 
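The source analysis referred to above (minimum-norm estimation, per the keywords) inverts a leadfield matrix L derived from the head model to recover cortical generators from sensor data. A textbook sketch of the regularized minimum-norm solution, not the London Baby Lab's actual pipeline; the regularization constant is chosen purely for illustration:

```python
import numpy as np

def minimum_norm_estimate(leadfield, data, lam=0.1):
    """Regularized minimum-norm source estimate:

        x_hat = L^T (L L^T + lam * I)^-1 y

    leadfield -- L, shape (n_sensors, n_sources), from the head model
    data      -- y, sensor measurements at one time point
    lam       -- unscaled regularization constant (illustrative)
    """
    n_sensors = leadfield.shape[0]
    gram = leadfield @ leadfield.T + lam * np.eye(n_sensors)
    return leadfield.T @ np.linalg.solve(gram, data)
```

Because L depends on tissue conductivities and geometry, using individual or age-specific head models, as the paper advocates, changes L and hence the reconstructed sources.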
Behavior, Issue 88, EEG, electroencephalogram, development, source analysis, pediatric, minimum-norm estimation, cognitive neuroscience, event-related potentials 
Unraveling the Unseen Players in the Ocean - A Field Guide to Water Chemistry and Marine Microbiology
Authors: Andreas Florian Haas, Ben Knowles, Yan Wei Lim, Tracey McDole Somera, Linda Wegley Kelly, Mark Hatay, Forest Rohwer.
Institutions: San Diego State University, University of California San Diego.
Here we introduce a series of thoroughly tested and well standardized research protocols adapted for use in remote marine environments. The sampling protocols include the assessment of resources available to the microbial community (dissolved organic carbon, particulate organic matter, inorganic nutrients) and a comprehensive description of the viral and bacterial communities (via direct viral and microbial counts, enumeration of autofluorescent microbes, and construction of viral and microbial metagenomes). We use a combination of methods drawn from a dispersed field of scientific disciplines, comprising both already established protocols and some of the most recently developed techniques. Metagenomic sequencing techniques for viral and bacterial community characterization, in particular, have been established only in recent years and are thus still subject to constant improvement. This has led to a variety of sampling and sample processing procedures currently in use. The set of methods presented here provides an up-to-date approach to collecting and processing environmental samples. The parameters addressed with these protocols yield the minimum of information essential to characterize and understand the underlying mechanisms of viral and microbial community dynamics. The paper gives easy-to-follow guidelines to conduct comprehensive surveys and discusses critical steps and potential caveats pertinent to each technique.
Environmental Sciences, Issue 93, dissolved organic carbon, particulate organic matter, nutrients, DAPI, SYBR, microbial metagenomics, viral metagenomics, marine environment
A High-throughput Method for Measurement of Glomerular Filtration Rate in Conscious Mice
Authors: Timo Rieg.
Institutions: University of California, San Diego, San Diego VA Healthcare System.
The measurement of glomerular filtration rate (GFR) is the gold standard in kidney function assessment. Currently, investigators determine GFR by measuring the level of the endogenous biomarker creatinine or of exogenously applied radioactively labeled inulin (3H or 14C). Creatinine has the substantial drawback that proximal tubular secretion accounts for ~50% of total renal creatinine excretion, and therefore creatinine is not a reliable GFR marker. Depending on the experiment performed, inulin clearance can be determined by an intravenous single bolus injection or by continuous infusion (intravenous or osmotic minipump); these approaches require the collection of plasma, or of plasma and urine, respectively. Other drawbacks of radioactively labeled inulin include the use of isotopes, time-consuming surgical preparation of the animals, and the requirement of a terminal experiment. Here we describe a method that uses a single bolus injection of fluorescein isothiocyanate (FITC)-labeled inulin and the measurement of its fluorescence in 1-2 μl of diluted plasma. By applying a two-compartment model, with 8 blood collections per mouse, it is possible to measure GFR in up to 24 mice per day using a special workflow protocol. This method requires only brief isoflurane anesthesia, with all blood samples collected from a non-restrained, awake mouse. Another advantage is that mice can be followed over a period of several months and across treatments (i.e., paired experiments with dietary changes or drug applications). We hope that this technique of measuring GFR will be useful to other investigators studying mouse kidney function and will replace less accurate methods of estimating kidney function, such as plasma creatinine and blood urea nitrogen.
Medicine, Issue 75, Anatomy, Physiology, Biomedical Engineering, Molecular Biology, Nephrology, Kidney Function Tests, Glomerular filtration rate, rats, mice, conscious, creatinine, inulin, Jaffe, hypertension, HPLC, animal model
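Once the eight plasma samples have been fitted to a bi-exponential (two-compartment) decay, GFR follows from a simple dose-over-AUC calculation. The sketch below is illustrative only: the fitting step is omitted and the parameter values are hypothetical, not from the study.

```python
def gfr_two_compartment(dose, A, alpha, B, beta):
    """GFR from a fitted two-compartment plasma decay curve.

    The fitted curve C(t) = A*exp(-alpha*t) + B*exp(-beta*t) has an area
    under the curve (0..infinity) of A/alpha + B/beta, and marker
    clearance gives GFR = dose / AUC.
    """
    auc = A / alpha + B / beta
    return dose / auc

# Hypothetical fitted parameters for one mouse (units must be consistent,
# e.g. dose in fluorescence-equivalent units, rates per minute):
gfr = gfr_two_compartment(dose=3.74, A=40.0, alpha=0.5, B=10.0, beta=0.05)
```

Because the calculation reduces to one closed-form expression per animal, it scales easily to the 24-mice-per-day workflow described above.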
Perceptual and Category Processing of the Uncanny Valley Hypothesis' Dimension of Human Likeness: Some Methodological Issues
Authors: Marcus Cheetham, Lutz Jancke.
Institutions: University of Zurich.
Mori's Uncanny Valley Hypothesis (1, 2) proposes that the perception of humanlike characters such as robots and, by extension, avatars (computer-generated characters) can evoke negative or positive affect (valence) depending on the object's degree of visual and behavioral realism along a dimension of human likeness (DHL) (Figure 1). However, studies of the affective valence of subjective responses to variously realistic non-human characters have produced inconsistent findings (3-6). One of a number of reasons for this is that human likeness is not perceived as the hypothesis assumes. While the DHL can be defined, following Mori's description, as a smooth linear change in the degree of physical humanlike similarity, subjective perception of objects along the DHL can be understood in terms of the psychological effects of categorical perception (CP) (7). Further behavioral and neuroimaging investigations of category processing and CP along the DHL, and of the potential influence of the dimension's underlying category structure on affective experience, are needed. This protocol therefore focuses on the DHL and allows examination of CP. Based on the protocol presented in the video as an example, issues surrounding the methodology in the protocol, and the use in "uncanny" research of stimuli drawn from morph continua to represent the DHL, are discussed in the article that accompanies the video. The use of neuroimaging and morph stimuli to represent the DHL, in order to disentangle brain regions neurally responsive to physical humanlike similarity from those responsive to category change and category processing, is briefly illustrated.
Behavior, Issue 76, Neuroscience, Neurobiology, Molecular Biology, Psychology, Neuropsychology, uncanny valley, functional magnetic resonance imaging, fMRI, categorical perception, virtual reality, avatar, human likeness, Mori, uncanny valley hypothesis, perception, magnetic resonance imaging, MRI, imaging, clinical techniques
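Categorical perception along a morph continuum is commonly quantified by fitting a sigmoid identification function to forced-choice responses and locating the category boundary where either response is equally likely. The sketch below illustrates that generic analysis; the logistic form, parameter names, and values are assumptions for illustration, not the authors' actual procedure.

```python
import math

def identification_prob(morph_level, boundary=0.5, slope=12.0):
    """Logistic identification curve along a morph continuum
    (0 = fully avatar-like, 1 = fully human-like): probability of a
    'human' response at a given morph level."""
    return 1.0 / (1.0 + math.exp(-slope * (morph_level - boundary)))

def category_boundary(levels, probs):
    """Estimate the category boundary as the first 0.5 crossing of the
    identification curve, by linear interpolation between sample points."""
    for (x0, p0), (x1, p1) in zip(zip(levels, probs), zip(levels[1:], probs[1:])):
        if p0 < 0.5 <= p1:
            return x0 + (0.5 - p0) * (x1 - x0) / (p1 - p0)
    return None
```

A steep slope and a sharp boundary in such a fit are the behavioral signatures of CP that the protocol is designed to test for.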
The Multiple Sclerosis Performance Test (MSPT): An iPad-Based Disability Assessment Tool
Authors: Richard A. Rudick, Deborah Miller, Francois Bethoux, Stephen M. Rao, Jar-Chi Lee, Darlene Stough, Christine Reece, David Schindler, Bernadett Mamone, Jay Alberts.
Institutions: Cleveland Clinic Foundation.
Precise measurement of neurological and neuropsychological impairment and disability in multiple sclerosis is challenging. We report a new test, the Multiple Sclerosis Performance Test (MSPT), which represents a new approach to quantifying MS-related disability. The MSPT takes advantage of advances in computer technology, information technology, biomechanics, and clinical measurement science. The resulting MSPT represents a computer-based platform for precise, valid measurement of MS severity. Based on, but extending, the Multiple Sclerosis Functional Composite (MSFC), the MSPT provides precise, quantitative data on walking speed, balance, manual dexterity, visual function, and cognitive processing speed. The MSPT was tested by 51 MS patients and 49 healthy controls (HC). MSPT scores were highly reproducible, correlated strongly with technician-administered test scores, discriminated MS from HC and severe from mild MS, and correlated with patient-reported outcomes. Measures of reliability, sensitivity, and clinical meaning for MSPT scores were favorable compared with technician-based testing. The MSPT is a potentially transformative approach for collecting MS disability outcome data for patient care and research. Because the testing is computer-based, test performance can be analyzed in traditional or novel ways and data can be directly entered into research or clinical databases. The MSPT could be widely disseminated to clinicians in practice settings who are not connected to clinical trial performance sites or who are practicing in rural settings, drastically improving access to clinical trials for clinicians and patients. The MSPT could be adapted to out-of-clinic settings, like the patient's home, thereby providing more meaningful real-world data. The MSPT represents a new paradigm for neuroperformance testing.
This method could have the same transformative effect on clinical care and research in MS as standardized computer-adapted testing has had in the education field, with clear potential to accelerate progress in clinical care and research.
Medicine, Issue 88, Multiple Sclerosis, Multiple Sclerosis Functional Composite, computer-based testing, 25-foot walk test, 9-hole peg test, Symbol Digit Modalities Test, Low Contrast Visual Acuity, Clinical Outcome Measure
Analysis of Tubular Membrane Networks in Cardiac Myocytes from Atria and Ventricles
Authors: Eva Wagner, Sören Brandenburg, Tobias Kohl, Stephan E. Lehnart.
Institutions: Heart Research Center Goettingen, University Medical Center Goettingen, German Center for Cardiovascular Research (DZHK) partner site Goettingen, University of Maryland School of Medicine.
In cardiac myocytes, a complex network of membrane tubules - the transverse-axial tubule system (TATS) - controls deep intracellular signaling functions. While the outer surface membrane and associated TATS membrane components appear to be continuous, there are substantial differences in lipid and protein content. In ventricular myocytes (VMs), certain TATS components are highly abundant, contributing to rectilinear tubule networks and regular branching 3D architectures. It is thought that peripheral TATS components propagate action potentials from the cell surface to thousands of remote intracellular sarcoendoplasmic reticulum (SER) membrane contact domains, thereby activating intracellular Ca2+ release units (CRUs). In contrast to VMs, the organization and functional role of TATS membranes in atrial myocytes (AMs) is significantly different and much less understood. Taken together, these differences make quantitative structural characterization of TATS membrane networks in healthy and diseased myocytes an essential prerequisite for a better understanding of functional plasticity and pathophysiological reorganization. Here, we present a strategic combination of protocols for direct quantitative analysis of TATS membrane networks in living VMs and AMs. For this, we accompany primary cell isolations of mouse VMs and/or AMs with critical quality control steps and direct membrane staining protocols for fluorescence imaging of TATS membranes. Using an optimized workflow for confocal or superresolution TATS image processing, binarized and skeletonized data are generated for quantitative analysis of the TATS network and its components. Unlike previously published indirect regional aggregate image analysis strategies, our protocols enable direct characterization of specific components and derive complex physiological properties of TATS membrane networks in living myocytes with high throughput and open-access software tools.
In summary, the combined protocol strategy can be readily applied for quantitative TATS network studies during physiological myocyte adaptation or disease changes, comparison of different cardiac or skeletal muscle cell types, phenotyping of transgenic models, and pharmacological or therapeutic interventions.
Bioengineering, Issue 92, cardiac myocyte, atria, ventricle, heart, primary cell isolation, fluorescence microscopy, membrane tubule, transverse-axial tubule system, image analysis, image processing, T-tubule, collagenase
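The binarization step in the image-processing workflow above can be automated with a global threshold such as Otsu's method, which picks the intensity cutoff that maximizes the between-class variance of foreground (membrane signal) and background pixels. This is a minimal pure-Python sketch of that generic technique; the authors' published workflow may use a different thresholding scheme.

```python
def otsu_threshold(pixels, levels=256):
    """Otsu's global threshold for 8-bit intensities: choose the cutoff t
    that maximizes the between-class variance of the two pixel classes."""
    hist = [0] * levels
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    w0 = cum0 = 0
    best_t, best_var = 0, -1.0
    for t in range(levels):
        w0 += hist[t]                  # pixels with intensity <= t
        if w0 == 0:
            continue
        w1 = total - w0
        if w1 == 0:
            break
        cum0 += t * hist[t]
        mu0 = cum0 / w0                # mean of the lower class
        mu1 = (total_sum - cum0) / w1  # mean of the upper class
        var_between = w0 * w1 * (mu0 - mu1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def binarize(pixels):
    """Pixels above the Otsu threshold become foreground (1)."""
    t = otsu_threshold(pixels)
    return [1 if p > t else 0 for p in pixels]
```

The resulting binary mask is what a subsequent skeletonization step (e.g. in open-access tools such as ImageJ/Fiji) reduces to the tubule centerlines used for network quantification.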
Flying Insect Detection and Classification with Inexpensive Sensors
Authors: Yanping Chen, Adena Why, Gustavo Batista, Agenor Mafra-Neto, Eamonn Keogh.
Institutions: University of California, Riverside, University of California, Riverside, University of São Paulo - USP, ISCA Technologies.
An inexpensive, noninvasive system that could accurately classify flying insects would have important implications for entomological research, and would allow the development of many useful applications in vector and pest control for both medical and agricultural entomology. Given this, the last sixty years have seen many research efforts devoted to this task. To date, however, none of this research has had a lasting impact. In this work, we show that pseudo-acoustic optical sensors can produce superior data; that additional features, both intrinsic and extrinsic to the insect's flight behavior, can be exploited to improve insect classification; that a Bayesian classification approach enables the efficient learning of classification models that are very robust to over-fitting; and that a general classification framework allows an arbitrary number of features to be incorporated easily. We demonstrate these findings with large-scale experiments that dwarf all previous works combined, as measured by the number of insects and the number of species considered.
Bioengineering, Issue 92, flying insect detection, automatic insect classification, pseudo-acoustic optical sensors, Bayesian classification framework, flight sound, circadian rhythm
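The Bayesian combination of an intrinsic feature (wingbeat frequency from the sensor signal) with an extrinsic one (circadian activity, i.e. time of day) can be sketched as a naive Bayes argmax over species. All parameter values and species labels below are hypothetical illustrations, not data from the study.

```python
import math

def log_gauss(x, mu, sigma):
    """Log-density of a normal distribution."""
    return -0.5 * math.log(2 * math.pi) - math.log(sigma) - (x - mu) ** 2 / (2 * sigma ** 2)

def classify(freq_hz, hour, models):
    """Naive-Bayes classification: wingbeat-frequency likelihood times an
    hourly circadian activity term and a class prior, features treated as
    independent; returns the species with the highest posterior."""
    best_name, best_lp = None, -math.inf
    for name, m in models.items():
        lp = (math.log(m["prior"])
              + log_gauss(freq_hz, m["freq_mu"], m["freq_sigma"])
              + math.log(m["activity"][hour]))
        if lp > best_lp:
            best_name, best_lp = name, lp
    return best_name

# Illustrative parameters only -- not values from the study:
models = {
    "mosquito_A": {"prior": 0.5, "freq_mu": 400.0, "freq_sigma": 30.0,
                   "activity": [0.02] * 18 + [0.64 / 6] * 6},  # evening-active
    "mosquito_B": {"prior": 0.5, "freq_mu": 600.0, "freq_sigma": 40.0,
                   "activity": [1 / 24] * 24},                 # no circadian bias
}
```

Because each feature contributes an additive log term, extending the framework to an arbitrary number of features, as the abstract describes, amounts to adding more terms to the sum.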
Utilization of Microscale Silicon Cantilevers to Assess Cellular Contractile Function In Vitro
Authors: Alec S.T. Smith, Christopher J. Long, Christopher McAleer, Nathaniel Bobbitt, Balaji Srinivasan, James J. Hickman.
Institutions: University of Central Florida.
The development of more predictive and biologically relevant in vitro assays is predicated on the advancement of versatile cell culture systems which facilitate the functional assessment of the seeded cells. To that end, microscale cantilever technology offers a platform with which to measure the contractile functionality of a range of cell types, including skeletal, cardiac, and smooth muscle cells, through assessment of contraction induced substrate bending. Application of multiplexed cantilever arrays provides the means to develop moderate to high-throughput protocols for assessing drug efficacy and toxicity, disease phenotype and progression, as well as neuromuscular and other cell-cell interactions. This manuscript provides the details for fabricating reliable cantilever arrays for this purpose, and the methods required to successfully culture cells on these surfaces. Further description is provided on the steps necessary to perform functional analysis of contractile cell types maintained on such arrays using a novel laser and photo-detector system. The representative data provided highlights the precision and reproducible nature of the analysis of contractile function possible using this system, as well as the wide range of studies to which such technology can be applied. Successful widespread adoption of this system could provide investigators with the means to perform rapid, low cost functional studies in vitro, leading to more accurate predictions of tissue performance, disease development and response to novel therapeutic treatment.
Bioengineering, Issue 92, cantilever, in vitro, contraction, skeletal muscle, NMJ, cardiomyocytes, functional
Characterization of Complex Systems Using the Design of Experiments Approach: Transient Protein Expression in Tobacco as a Case Study
Authors: Johannes Felix Buyel, Rainer Fischer.
Institutions: RWTH Aachen University, Fraunhofer Gesellschaft.
Plants provide multiple benefits for the production of biopharmaceuticals, including low costs, scalability, and safety. Transient expression offers the additional advantage of short development and production times, but expression levels can vary significantly between batches, giving rise to regulatory concerns in the context of good manufacturing practice. We used a design of experiments (DoE) approach to determine the impact of major factors, such as regulatory elements in the expression construct, plant growth and development parameters, and the incubation conditions during expression, on the variability of expression between batches. We tested plants expressing a model anti-HIV monoclonal antibody (2G12) and a fluorescent marker protein (DsRed). We discuss the rationale for selecting certain properties of the model and identify its potential limitations. The general approach can easily be transferred to other problems because the principles of the model are broadly applicable: knowledge-based parameter selection, complexity reduction by splitting the initial problem into smaller modules, software-guided setup of optimal experiment combinations, and step-wise design augmentation. The methodology is therefore useful not only for characterizing protein expression in plants but also for the investigation of other complex systems lacking a mechanistic description. The predictive equations describing the interconnectivity between parameters can be used to establish mechanistic models for other complex systems.
Bioengineering, Issue 83, design of experiments (DoE), transient protein expression, plant-derived biopharmaceuticals, promoter, 5'UTR, fluorescent reporter protein, model building, incubation conditions, monoclonal antibody
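The "software-guided setup of optimal experiment combinations" mentioned above starts from the space of all factor-level combinations, which a full-factorial enumeration makes explicit (DoE software then typically prunes this to a fractional or optimal design). The factor names and levels below are hypothetical illustrations in the spirit of the abstract, not the study's actual design.

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every combination of factor levels: the full-factorial
    design from which fractional or optimal designs are selected."""
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

design = full_factorial({
    "promoter": ["35S", "nos"],        # regulatory element in the construct
    "plant_age_days": [35, 42, 49],    # growth/development parameter
    "incubation_temp_C": [22, 25],     # condition during expression
})
```

Even this tiny example yields 2 x 3 x 2 = 12 runs, which is why complexity reduction and design augmentation matter once realistic numbers of factors are considered.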
Simultaneous Multicolor Imaging of Biological Structures with Fluorescence Photoactivation Localization Microscopy
Authors: Nikki M. Curthoys, Michael J. Mlodzianoski, Dahan Kim, Samuel T. Hess.
Institutions: University of Maine.
Localization-based super resolution microscopy can be applied to obtain a spatial map (image) of the distribution of individual fluorescently labeled single molecules within a sample with a spatial resolution of tens of nanometers. Using either photoactivatable (PAFP) or photoswitchable (PSFP) fluorescent proteins fused to proteins of interest, or organic dyes conjugated to antibodies or other molecules of interest, fluorescence photoactivation localization microscopy (FPALM) can simultaneously image multiple species of molecules within single cells. By using the following approach, populations of large numbers (thousands to hundreds of thousands) of individual molecules are imaged in single cells and localized with a precision of ~10-30 nm. Data obtained can be applied to understanding the nanoscale spatial distributions of multiple protein types within a cell. One primary advantage of this technique is the dramatic increase in spatial resolution: while diffraction limits resolution to ~200-250 nm in conventional light microscopy, FPALM can image length scales more than an order of magnitude smaller. As many biological hypotheses concern the spatial relationships among different biomolecules, the improved resolution of FPALM can provide insight into questions of cellular organization which have previously been inaccessible to conventional fluorescence microscopy. In addition to detailing the methods for sample preparation and data acquisition, we here describe the optical setup for FPALM. One additional consideration for researchers wishing to do super-resolution microscopy is cost: in-house setups are significantly cheaper than most commercially available imaging machines. Limitations of this technique include the need for optimizing the labeling of molecules of interest within cell samples, and the need for post-processing software to visualize results. We here describe the use of PAFP and PSFP expression to image two protein species in fixed cells. 
Extension of the technique to living cells is also described.
Basic Protocol, Issue 82, Microscopy, Super-resolution imaging, Multicolor, single molecule, FPALM, Localization microscopy, fluorescent proteins
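The localization step underlying FPALM can be illustrated with an intensity-weighted centroid computed on a small background-subtracted camera patch. This is a deliberate simplification: actual FPALM analysis fits a 2-D Gaussian to each single-molecule spot, which is what yields the ~10-30 nm precision quoted above, and the pixel size here is an assumed value.

```python
def localize(patch, pixel_size_nm=100.0):
    """Estimate a fluorophore position (in nm) from a background-subtracted
    camera patch via the intensity-weighted centroid of the spot."""
    total = sum(v for row in patch for v in row)
    x = sum(v * j for row in patch for j, v in enumerate(row)) / total
    y = sum(v * i for i, row in enumerate(patch) for v in row) / total
    return x * pixel_size_nm, y * pixel_size_nm
```

Repeating this for thousands to hundreds of thousands of photoactivated molecules, one sparse subset at a time, is what builds up the super-resolution spatial map described in the abstract.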
Assessment of Motor Balance and Coordination in Mice using the Balance Beam
Authors: Tinh N. Luong, Holly J. Carlisle, Amber Southwell, Paul H. Patterson.
Institutions: California Institute of Technology.
Brain injury, genetic manipulations, and pharmacological treatments can result in alterations of motor skills in mice. Fine motor coordination and balance can be assessed by the beam walking assay. The goal of this test is for the mouse to stay upright and walk across an elevated narrow beam to a safe platform. This test takes place over 3 consecutive days: 2 days of training and 1 day of testing. Performance on the beam is quantified by measuring the time it takes for the mouse to traverse the beam and the number of paw slips that occur in the process. Here we report the protocol used in our laboratory, and representative results from a cohort of C57BL/6 mice. This task is particularly useful for detecting subtle deficits in motor skills and balance that may not be detected by other motor tests, such as the Rotarod.
Neuroscience, Issue 49, motor skills, coordination, balance beam test, mouse behavior
Basics of Multivariate Analysis in Neuroimaging Data
Authors: Christian Georg Habeck.
Institutions: Columbia University.
Multivariate analysis techniques for neuroimaging data have recently received increasing attention as they have many attractive features that cannot be easily realized by the more commonly used univariate, voxel-wise techniques (1, 5-9). Multivariate approaches evaluate correlation/covariance of activation across brain regions, rather than proceeding on a voxel-by-voxel basis. Thus, their results can be more easily interpreted as a signature of neural networks. Univariate approaches, on the other hand, cannot directly address interregional correlation in the brain. Multivariate approaches can also result in greater statistical power when compared with univariate techniques, which are forced to employ very stringent corrections for voxel-wise multiple comparisons. Further, multivariate techniques also lend themselves much better to prospective application of results from the analysis of one dataset to entirely new datasets. Multivariate techniques are thus well placed to provide information about mean differences and correlations with behavior, similarly to univariate approaches, with potentially greater statistical power and better reproducibility checks. In contrast to these advantages is the high barrier of entry to the use of multivariate approaches, preventing more widespread application in the community. To the neuroscientist becoming familiar with multivariate analysis techniques, an initial survey of the field might present a bewildering variety of approaches that, although algorithmically similar, are presented with different emphases, typically by people with mathematics backgrounds. We believe that multivariate analysis techniques have sufficient potential to warrant better dissemination. Researchers should be able to employ them in an informed and accessible manner. The current article is an attempt at a didactic introduction of multivariate techniques for the novice.
A conceptual introduction is followed by a very simple application to a diagnostic data set from the Alzheimer's Disease Neuroimaging Initiative (ADNI), clearly demonstrating the superior performance of the multivariate approach.
JoVE Neuroscience, Issue 41, fMRI, PET, multivariate analysis, cognitive neuroscience, clinical neuroscience
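The core idea of covariance-based multivariate analysis, extracting a spatial pattern whose expression varies maximally across the data rather than testing each voxel independently, can be shown in closed form for two "regions". This is a didactic two-dimensional sketch (high-dimensional neuroimaging tools perform the same eigendecomposition on much larger covariance matrices):

```python
import math

def pca_2d(xs, ys):
    """Leading principal component of 2-D data from its 2x2 sample
    covariance matrix, solved in closed form: returns the largest
    eigenvalue and its unit eigenvector (the covariance 'pattern')."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(max(tr * tr / 4 - det, 0.0))
    # Corresponding eigenvector (direction of maximal shared variance)
    vx, vy = (sxy, lam - sxx) if sxy else (1.0, 0.0)
    norm = math.hypot(vx, vy)
    return lam, (vx / norm, vy / norm)
```

Projecting each subject's data onto the returned eigenvector yields a single pattern-expression score per subject, which is the quantity that multivariate analyses then relate to diagnosis or behavior.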
Streamlined Purification of Plasmid DNA From Prokaryotic Cultures
Authors: Laura Pueschel, Hongshan Li, Matthew Hymes.
Institutions: Pall Life Sciences.
We describe the complete process of using AcroPrep Advance filter plates for 96 parallel plasmid preparations, starting from prokaryotic culture and ending with high-purity DNA. Based on multi-well filtration for bacterial lysate clearance and DNA purification, this method creates a streamlined process for plasmid preparation. Filter plates containing silica-based media can easily be processed by vacuum filtration or centrifugation to yield appreciable quantities of plasmid DNA. Quantitative analyses show that the purified plasmid DNA is consistently of high quality, with an average OD260/280 ratio of 1.97. Overall, the plasmid yields provide highly pure DNA for downstream applications such as sequencing and cloning. This streamlined method using AcroPrep Advance filter plates allows for manual, semi-automated, or fully automated processing.
Molecular Biology, Issue 47, Plasmid purification, High-throughput, miniprep, filter plates
Using Learning Outcome Measures to Assess Doctoral Nursing Education
Authors: Glenn H. Raup, Jeff King, Romana J. Hughes, Natasha Faidley.
Institutions: Harris College of Nursing and Health Sciences, Texas Christian University.
Education programs at all levels must be able to demonstrate successful program outcomes. Grades alone do not represent a comprehensive measurement methodology for assessing student learning outcomes at either the course or the program level. The development and application of assessment rubrics provides an unequivocal measurement methodology, ensuring a quality learning experience by providing a foundation for improvement based on qualitatively and quantitatively measurable aggregate course and program outcomes. Learning outcomes are the embodiment of the total learning experience and should incorporate assessment of both qualitative and quantitative program outcomes. The assessment of qualitative measures represents a challenge for educators at any level of a learning program. Nursing provides a unique challenge and opportunity, as it is the application of science through the art of caring. Quantification of desired student learning outcomes may be enhanced through the development of assessment rubrics designed to measure quantitative and qualitative aspects of the nursing education and learning process. They provide a mechanism for uniform assessment by nursing faculty of concepts and constructs that are otherwise difficult to describe and measure. A protocol is presented and applied to a doctoral nursing education program, with recommendations for application and transformation of the assessment rubric to other education programs. Through application of these specially designed rubrics, all aspects of an education program can be adequately assessed, providing information for program assessment that facilitates closing the gap between desired and actual student learning outcomes for any desired educational competency.
Medicine, Issue 40, learning, outcomes, measurement, program, assessment, rubric
Copyright © JoVE 2006-2015. All Rights Reserved.
Policies | License Agreement | ISSN 1940-087X

What is Visualize?

JoVE Visualize is a tool created to match the last 5 years of PubMed publications to methods in JoVE's video library.

How does it work?

We use abstracts found on PubMed and match them to JoVE videos to create a list of 10 to 30 related methods videos.

Video X seems to be unrelated to Abstract Y...

In developing our video relationships, we compare around 5 million PubMed articles to our library of over 4,500 methods videos. In some cases, the language used in the PubMed abstracts makes matching that content to a JoVE video difficult. In other cases, our video library simply contains no content relevant to the topic of a given abstract. In these cases, our algorithms do their best to display videos with relevant content, which can sometimes result in matches that are only loosely related.
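Abstract-to-video matching of this kind can be sketched with a standard bag-of-words approach: TF-IDF weighting of terms followed by cosine similarity between documents. JoVE has not published its actual matching algorithm, so this is only a generic illustration of the technique.

```python
import math
from collections import Counter

def tokenize(text):
    """Crude tokenizer: lowercase alphabetic words only."""
    return [w for w in text.lower().split() if w.isalpha()]

def tfidf(docs):
    """Sparse TF-IDF vectors (dicts) for a small corpus: terms common to
    all documents get zero weight, rare terms get high weight."""
    counts = [Counter(tokenize(d)) for d in docs]
    n = len(docs)
    df = Counter()
    for tf in counts:
        df.update(tf.keys())
    idf = {w: math.log(n / df[w]) for w in df}
    return [{w: tf[w] * idf[w] for w in tf} for tf in counts]

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(v * b.get(w, 0.0) for w, v in a.items())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0
```

Ranking a video library by cosine similarity to a given abstract and keeping the top 10-30 hits would produce exactly the kind of related-methods list described above, including the occasional loosely related match when no truly relevant video exists.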