Scientists are taking remarkable steps to address the reproducibility crisis in scientific research. Read about four scientists who launched unique initiatives to lend their voices to this conversation.
In 2016, the reproducibility crisis in science drew wide-scale attention. The scientific community understands the critical need to restore the public’s trust in the credibility of scientific research. Here, we highlight four research scientists who started conversations and launched initiatives to raise awareness and help solve the reproducibility crisis. All four initiatives required multiple collaborations, innovative ideas, and communication with peers worldwide.
Brian Nosek, a professor of psychology at the University of Virginia, cofounded the Center for Open Science (COS) with the mission to “foster openness, integrity, and reproducibility of scientific research.” Under his leadership, COS initiated the Reproducibility Project: Psychology in 2011, for which Nosek recruited 270 psychology researchers to replicate 100 previously published experiments from three reputable psychology journals. Results from the project were published in 2015, with Nosek and his collaborators concluding: “A large portion of replications produced weaker evidence for the original findings despite using materials provided by the original authors, review in advance for methodological fidelity, and high statistical power to detect the original effect sizes.”
Elizabeth Iorns is a cancer biologist and the founder and CEO of Science Exchange, a Silicon Valley-based technology company that runs a platform for “scientists to order experiments from the world’s best service providers.” In her TEDMED talk, “Prioritizing Reproducibility for Scientific Advancement,” Iorns explains why she left academia and began looking for ways to identify and reward reproducible research. Along with COS, Iorns launched the Reproducibility Project: Cancer Biology in 2013 to replicate breakthrough cancer biology studies published between 2010 and 2012. On January 19, 2017, the results from the first five completed replication studies were published in eLife, offering no clear answers: out of the five, two were substantially reproduced, two were uninterpretable, and one could not be reproduced. “If people had deposited raw data and full protocols at the time of publication, we wouldn’t have to go back to the original authors,” said Iorns in a recent interview with The Atlantic.
Moshe Pritsker founded JoVE, the world’s first peer-reviewed science video journal, in 2006 while he was a postdoctoral researcher at Harvard Medical School. JoVE is an innovative platform that allows authors to publish their methods in both text and video format; the video component helps highlight the nuances and minute details of experiments. Pritsker has emphasized that scientific methods play a critical role in research reproducibility and that it is essential to keep scientists from getting caught in a cycle of learning and relearning techniques. Most experimental methods published in journals are constrained by limited word counts, and authors are often unable to explain their methods in detail under those circumstances. Pritsker’s systematic solution for enhancing method replicability has been adopted by researchers worldwide.
John Ioannidis, a professor of Medicine and of Health Research and Policy at Stanford University School of Medicine, heads the Meta-Research Innovation Center at Stanford (METRICS). METRICS aims to optimize the reproducibility and efficiency of scientific investigations through the evaluation of research practices. In one of his highly cited PLOS Medicine articles, “Why Most Published Research Findings Are False,” Ioannidis highlighted various types of bias in scientific research and publishing. He has suggested that academic institutions and journals can play a key role as ‘gatekeepers’ in defending quality and high standards in research reporting. He also urges researchers to adopt meta-analysis approaches during data evaluation. Ioannidis and his collaborators continually evaluate and disseminate relevant information and potential solutions to help deliver more reproducible science.
As these four leading scientists have shown us, science is facing a reproducibility crisis, and we need to focus on innovative, collaborative, and transparent approaches to help solve it. Irreproducible findings can have a far-reaching ripple effect in science, affecting future research studies and scientific decision-making. We urge scientists to be aware of this crisis, implement best practices in publishing research methods and data, and, most importantly, continue being a part of the dialogue to resolve this critical challenge.