Assessment of transparency indicators across the biomedical literature: How open is open?

Authors : Stylianos Serghiou, Despina G. Contopoulos-Ioannidis, Kevin W. Boyack, Nico Riedel, Joshua D. Wallach, John P. A. Ioannidis

Recent concerns about the reproducibility of science have led to several calls for more open and transparent research practices and for the monitoring of potential improvements over time. However, with tens of thousands of new biomedical articles published per week, manually mapping and monitoring changes in transparency is unrealistic.

We present an open-source, automated approach to identify 5 indicators of transparency (data sharing, code sharing, conflicts of interest disclosures, funding disclosures, and protocol registration) and apply it across the entire open access biomedical literature of 2.75 million articles on PubMed Central (PMC).
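The indicator-detection step can be illustrated with a minimal sketch. These regular expressions are invented for the example and are far cruder than the patterns the authors' actual tool uses; they show only the general approach of flagging transparency indicators by matching phrases in an article's full text.

```python
import re

# Illustrative patterns (NOT the paper's actual ones) for two of the
# five transparency indicators.
INDICATOR_PATTERNS = {
    "data_sharing": re.compile(
        r"data (are|is) (publicly )?available|deposited (in|at)",
        re.IGNORECASE),
    "coi_disclosure": re.compile(
        r"conflicts? of interest|competing interests?",
        re.IGNORECASE),
}

def detect_indicators(text: str) -> dict:
    """Return, per indicator, whether the text matches its pattern."""
    return {name: bool(p.search(text))
            for name, p in INDICATOR_PATTERNS.items()}

article = ("The authors declare no conflict of interest. "
           "All data are publicly available on Zenodo.")
print(detect_indicators(article))
# {'data_sharing': True, 'coi_disclosure': True}
```

Run over millions of PMC full texts, per-article flags like these can be aggregated by year, field, country, journal, or publisher.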

Our results indicate marked improvements over time in some areas of transparency (e.g., conflict of interest [COI] disclosures and funding disclosures) but not in others (e.g., protocol registration and code sharing), and map transparency across fields of science, countries, journals, and publishers.

This work has enabled the creation of a large, integrated, and openly available database to expedite further efforts to monitor, understand, and promote transparency and reproducibility in science.

DOI : https://doi.org/10.1371/journal.pbio.3001107

Analytic reproducibility in articles receiving open data badges at the journal Psychological Science: an observational study

Authors : Tom E. Hardwicke, Manuel Bohn, Kyle MacDonald, Emily Hembacher, Michèle B. Nuijten, Benjamin N. Peloquin, Benjamin E. deMayo, Bria Long, Erica J. Yoon, Michael C. Frank

For any scientific report, repeating the original analyses upon the original data should yield the original outcomes. We evaluated analytic reproducibility in 25 Psychological Science articles awarded open data badges between 2014 and 2015.

Initially, 16 (64%, 95% confidence interval [43,81]) articles contained at least one ‘major numerical discrepancy’ (>10% difference) prompting us to request input from original authors.

Ultimately, target values were reproducible without author involvement for 9 (36% [20,59]) articles; reproducible with author involvement for 6 (24% [8,47]) articles; not fully reproducible with no substantive author response for 3 (12% [0,35]) articles; and not fully reproducible despite author involvement for 7 (28% [12,51]) articles.

Overall, 37 major numerical discrepancies remained out of 789 checked values (5% [3,6]), but original conclusions did not appear affected.
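The abstract's ">10% difference" criterion for a major numerical discrepancy can be sketched as follows. This is a hypothetical helper; the study's exact comparison rules (e.g., rounding, handling of signs) may differ.

```python
def is_major_discrepancy(reported: float, reproduced: float,
                         threshold: float = 0.10) -> bool:
    """Flag a 'major numerical discrepancy': the reproduced value
    differs from the reported value by more than the threshold
    (10% in the study), relative to the reported value."""
    if reported == 0:
        return reproduced != 0
    return abs(reproduced - reported) / abs(reported) > threshold

print(is_major_discrepancy(0.45, 0.52))  # ~15.6% off -> True
print(is_major_discrepancy(0.45, 0.47))  # ~4.4% off -> False
```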

Non-reproducibility was primarily caused by unclear reporting of analytic procedures. These results highlight that open data alone is not sufficient to ensure analytic reproducibility.

DOI : https://doi.org/10.1098/rsos.201494

The Rigor and Transparency Index Quality Metric for Assessing Biological and Medical Science Methods

Authors : Joe Menke, Martijn Roelandse, Burak Ozyurt, Maryann Martone, Anita Bandrowski

The reproducibility crisis is a multifaceted problem involving ingrained practices within the scientific community. Fortunately, some causes can be addressed through authors' adherence to rigor and reproducibility criteria, implemented via checklists at various journals.

We developed an automated tool (SciScore) that evaluates research articles based on their adherence to key rigor criteria, including NIH criteria and RRIDs, at an unprecedented scale. We show that despite steady improvements, less than half of the scoring criteria, such as blinding or power analysis, are routinely addressed by authors. Digging deeper, we examined the influence of specific checklists on average scores.

The average score for a journal in a given year defines the Rigor and Transparency Index (RTI), a new journal quality metric. We compared the RTI with the Journal Impact Factor and found no correlation. The RTI can potentially serve as a proxy for methodological quality.
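The RTI computation described above, averaging per-article scores within each (journal, year) group, can be sketched in a few lines. Journal names, years, and scores here are fabricated for illustration; SciScore's own scoring is more involved.

```python
from collections import defaultdict

# Toy per-article rigor scores (made-up data).
articles = [
    {"journal": "J. Example Bio.", "year": 2019, "score": 4.0},
    {"journal": "J. Example Bio.", "year": 2019, "score": 6.0},
    {"journal": "J. Example Bio.", "year": 2020, "score": 7.0},
]

def rigor_transparency_index(records):
    """Mean per-article score for each (journal, year) pair."""
    sums = defaultdict(lambda: [0.0, 0])
    for r in records:
        key = (r["journal"], r["year"])
        sums[key][0] += r["score"]
        sums[key][1] += 1
    return {k: total / n for k, (total, n) in sums.items()}

print(rigor_transparency_index(articles))
# {('J. Example Bio.', 2019): 5.0, ('J. Example Bio.', 2020): 7.0}
```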

DOI : https://doi.org/10.1016/j.isci.2020.101698

Questionable Research Practices and Open Science in Quantitative Criminology

Authors : Jason Chin, Justin Pickett, Simine Vazire, Alex Holcombe

Objectives

Questionable research practices (QRPs) lead to incorrect research results and contribute to irreproducibility in science. Researchers and institutions have proposed open science practices (OSPs) to improve the detectability of QRPs and the credibility of science. We examine the prevalence of QRPs and OSPs in criminology, and researchers’ opinions of those practices.

Methods

We administered an anonymous survey to authors of articles published in criminology journals. Respondents self-reported their own use of 10 QRPs and 5 OSPs. They also estimated the prevalence of use by others, and reported their attitudes toward the practices.

Results

QRPs and OSPs are both common in quantitative criminology, about as common as they are in other fields. Criminologists who responded to our survey support using QRPs in some circumstances, but are even more supportive of using OSPs.

We did not detect a significant relationship between methodological training and either QRP or OSP use.

Support for QRPs is negatively and significantly associated with support for OSPs. Perceived prevalence estimates for some practices resembled a uniform distribution, suggesting criminologists have little knowledge of the proportion of researchers that engage in certain questionable practices.

Conclusions

Most quantitative criminologists in our sample use QRPs, and many use multiple QRPs. The substantial prevalence of QRPs raises questions about the validity and reproducibility of published criminological research.

We found promising levels of OSP use, albeit at levels lagging what researchers endorse. The findings thus suggest that additional reforms are needed to decrease QRP use and increase the use of OSPs.

DOI : https://doi.org/10.31235/osf.io/bwm7s

Science Communication in the Context of Reproducibility and Replicability: How Nonscientists Navigate Scientific Uncertainty

Author : Emily L. Howell

Scientists stand to gain in obvious ways from recent efforts to develop robust standards for and mechanisms of reproducibility and replicability. Demonstrations of reproducibility and replicability may provide clarity with respect to areas of uncertainty in scientific findings and translate into greater impact for the research.

But when it comes to public perceptions of science, it is less clear what gains might come from recent efforts to improve reproducibility and replicability. For example, could such efforts improve public understandings of scientific uncertainty?

To gain insight into this issue, we would need to know how public views of science are shaped by media coverage of reproducibility and replicability, but none of the emergent research on public views of reproducibility and replicability in science considers that question.

We do, however, have the recent report on Reproducibility and Replicability in Science issued by the National Academies of Sciences, Engineering, and Medicine, which provides an overview of public perceptions of uncertainty in science.

Here, I adapt that report to begin a conversation between researchers and practitioners, with the aim of expanding research on public perceptions of scientific uncertainty. This overview draws upon research on risk perception and science communication to describe how the media influences the communication and perception of uncertainty in science.

It ends by presenting recommendations for communicating scientific uncertainty as it pertains to issues of reproducibility and replicability.

Original location : https://hdsr.mitpress.mit.edu/pub/3g7u601s/release/2

Reading the fine print: A review and analysis of business journals’ data sharing policies

Authors : Brianne Dosch, Tyler Martindale

Business librarians offer many data services to their researchers. These services are often focused more on discovery, visualization, and analysis than general data management. But, with the replication crisis facing many business disciplines, there is a need for business librarians to offer more data sharing and general data management support to their researchers.

To find evidence of this data need, 146 business journals' data sharing policies were reviewed and analyzed to uncover meaningful trends in business research. Results of the study indicate that data sharing is not mandated by business journals.

However, data sharing is often encouraged and recommended. This journal policy content analysis provides evidence that business researchers have opportunities to share their research data, and with the right data management support, business librarians can play a significant role in improving the data sharing behaviors of business researchers.

DOI : https://doi.org/10.1080/08963568.2020.1847549

Metascience as a scientific social movement

Authors : David Peterson, Aaron Panofsky

Emerging out of the “reproducibility crisis” in science, metascientists have become central players in debates about research integrity, scholarly communication, and science policy. The goal of this article is to introduce metascience to STS scholars, detail the scientific ideology that is apparent in its articles, strategy statements, and research projects, and discuss its institutional and intellectual future.

Put simply, metascience is a scientific social movement that seeks to use the tools of science (especially quantification and experimentation) to diagnose problems in research practice and improve efficiency.

It draws together data scientists, experimental and statistical methodologists, and open science activists into a project with both intellectual and policy dimensions. Metascientists have been remarkably successful at winning grants, motivating news coverage, and changing policies at science agencies, journals, and universities.

Moreover, metascience represents the apotheosis of several trends in research practice, scientific communication, and science governance including increased attention to methodological and statistical criticism of scientific practice, the promotion of “open science” by science funders and journals, the growing importance of both preprint and data repositories for scientific communication, and the new prominence of data scientists as research makes a turn toward Big Science.

DOI : https://doi.org/10.31235/osf.io/4dsqa