The future of the scientific process is now open for debate. On May 10th, Daniel Sarewitz published a column in Nature (Nature 485, 149; 2012; doi:10.1038/485149a) decrying the recent trend toward positive publication bias in the biomedical sciences. This means that, more often than not, studies that fail to find positive results go unreported, while any hint of a statistically significant positive result is published. This positive publication bias is substantiated by the now-famous Amgen study (C. G. Begley & L. M. Ellis, Nature 483, 531–533; 2012), which reported an inability to reproduce 47 of 53 landmark cancer studies.
Calle et al. (2011), 'A Procedure for Lung Engineering': a novel technique published in a visualized format
One possible explanation for this lack of reproducibility is that the positive results are actually false positives (i.e., not actually true). Another explanation is supported by a 2005 paper by John Ioannidis (also cited by Sarewitz), titled 'Why Most Published Research Findings Are False' (PLoS Med. 2, e124; 2005). Here, Ioannidis asserts that research results are less likely to be true when the studies have greater 'flexibility in designs, definitions, outcomes, and analytical modes'. Simply put, your method of study is extremely important in producing unbiased, true research. However, in the current state of most scientific publishing, transparency of methods is embarrassingly poor. Open any of the highest-impact journals and you will find methods sections relegated to the end of the paper, often in trimmed-down and difficult-to-decipher versions. We exist in a scientific culture that places exceeding importance on positive results without fully describing, or showing, how those results were obtained.
If Ioannidis (2005) is correct, then reducing variability among methods for similar experiments by increasing transparency is one key solution for reducing bias and providing faster verification and falsification of findings. This self-correction is at the heart of the scientific method. By keeping methodology in a diminished form and failing to highlight verified, accurately executed techniques, we fail to control for one highly significant variable in producing unbiased research. The scientific process is at stake here, so we must ask ourselves: how are we going to move forward? We can begin by publishing highly transparent and visible methodology, both novel and 'gold standard'.
Lee et al. (2012), showing proper technique for loading and running agarose gels
Here, it’s important for novel methods to garner high visibility so that the scientific process can hone, correct, and apply them in the appropriate circumstances. For example, in March 2011, Dr. Laura Niklason published her laboratory’s tissue engineering technique (Calle, E. A., Petersen, T. H., Niklason, L. E. J. Vis. Exp. (49), e2651, DOI: 10.3791/2651 (2011)), which is difficult to reproduce without visualization of the apparatus and methodology. Presented in the multimedia format of a JoVE publication, this complicated technique is now more easily reproduced. At the same time, it’s critical that time-honored, ‘gold standard’ techniques continue to be executed correctly. For example, a recent JoVE publication, Agarose Gel Electrophoresis for the Separation of DNA Fragments (Lee, P. Y., Costumbrado, J., Hsu, C., Kim, Y. H. J. Vis. Exp. (62), e3923, DOI: 10.3791/3923 (2012)), shows best practices for a standard biological technique. In both of these cases, visualizing techniques allows for better execution of experiments, which will improve reproducibility and, hopefully, decrease publication bias.