No raw data, no science: another possible source of the reproducibility crisis

Author : Tsuyoshi Miyakawa

A reproducibility crisis is a situation in which many scientific studies cannot be reproduced. Inappropriate scientific practices, such as HARKing, p-hacking, and selective reporting of positive results, have been suggested as causes of irreproducibility. In this editorial, I propose that a lack of raw data, or data fabrication, is another possible cause of irreproducibility.

As Editor-in-Chief of Molecular Brain, I have handled 180 manuscripts since early 2017 and have made 41 editorial decisions categorized as “Revise before review,” requesting that the authors provide raw data.

Surprisingly, 21 of those 41 manuscripts were withdrawn without the raw data being provided, indicating that the request for raw data drove away more than half of the submissions. I rejected 19 of the remaining 20 manuscripts because the raw data provided were insufficient.

Thus, more than 97% of the 41 manuscripts did not present the raw data supporting their results when requested by an editor, suggesting the possibility that the raw data did not exist from the beginning, at least in some of these cases.

Any scientific study should be based on raw data, and data storage space should no longer be a challenge. Journals should therefore, in principle, require their authors to publicize raw data in a public database or on the journal site upon publication of the paper, both to increase the reproducibility of published results and to increase public trust in science.

URL : No raw data, no science: another possible source of the reproducibility crisis

DOI : https://doi.org/10.1186/s13041-020-0552-2

Reproducible and transparent research practices in published neurology research

Authors : Trevor Torgerson, Austin L. Johnson, Jonathan Pollard, Daniel Tritz, Matt Vassar

Background

The objective of this study was to evaluate the nature and extent of reproducible and transparent research practices in neurology publications.

Methods

The NLM catalog was used to identify MEDLINE-indexed neurology journals. A PubMed search of these journals was conducted to retrieve publications over a 5-year period from 2014 to 2018.

A random sample of publications was extracted. Two authors conducted data extraction in a blinded, duplicate fashion using a pilot-tested Google form. This form prompted data extractors to determine whether publications provided access to items such as study materials, raw data, analysis scripts, and protocols.

In addition, we determined if the publication was included in a replication study or systematic review, was preregistered, had a conflict of interest declaration, specified funding sources, and was open access.

Results

Our search identified 223,932 publications meeting the inclusion criteria, from which 400 were randomly sampled. Of these, 389 were accessible, yielding 271 publications with empirical data for analysis.

Our results indicate that 9.4% provided access to materials, 9.2% provided access to raw data, 0.7% provided access to the analysis scripts, 0.7% linked the protocol, and 3.7% were preregistered.

A third of sampled publications lacked funding or conflict of interest statements. No publications from our sample were included in replication studies, but a fifth were cited in a systematic review or meta-analysis.

Conclusions

Currently, published neurology research does not consistently provide information needed for reproducibility. The implications of poor research reporting can both affect patient care and increase research waste. Collaborative intervention by authors, peer reviewers, journals, and funding sources is needed to mitigate this problem.

URL : Reproducible and transparent research practices in published neurology research

DOI : https://doi.org/10.1186/s41073-020-0091-5


An empirical assessment of transparency and reproducibility-related research practices in the social sciences (2014–2017)

Authors : Tom E. Hardwicke, Joshua D. Wallach, Mallory C. Kidwell, Theiss Bendixen, Sophia Crüwell, John P. A. Ioannidis

Serious concerns about research quality have catalysed a number of reform initiatives intended to improve transparency and reproducibility and thus facilitate self-correction, increase efficiency and enhance research credibility.

Meta-research has evaluated the merits of some individual initiatives; however, this may not capture broader trends reflecting the cumulative contribution of these efforts.

In this study, we manually examined a random sample of 250 articles in order to estimate the prevalence of a range of transparency and reproducibility-related indicators in the social sciences literature published between 2014 and 2017.

Few articles indicated availability of materials (16/151, 11% [95% confidence interval, 7% to 16%]), protocols (0/156, 0% [0% to 1%]), raw data (11/156, 7% [2% to 13%]) or analysis scripts (2/156, 1% [0% to 3%]), and no studies were pre-registered (0/156, 0% [0% to 1%]).

Some articles explicitly disclosed funding sources (or lack of; 74/236, 31% [25% to 37%]) and some declared no conflicts of interest (36/236, 15% [11% to 20%]). Replication studies were rare (2/156, 1% [0% to 3%]).

Few studies were included in evidence synthesis via systematic review (17/151, 11% [7% to 16%]) or meta-analysis (2/151, 1% [0% to 3%]). Less than half the articles were publicly available (101/250, 40% [34% to 47%]).
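The bracketed ranges above are 95% confidence intervals for binomial proportions. The abstract does not state which interval method was used; the Wilson score interval is one common choice, sketched below with the 16/151 materials-availability count from the results (the method itself is an assumption, not taken from the paper).

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    if n <= 0:
        raise ValueError("n must be positive")
    p = successes / n
    denom = 1 + z**2 / n                       # common denominator of the score interval
    centre = (p + z**2 / (2 * n)) / denom      # shrinks the estimate towards 0.5
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Materials availability: 16 of 151 articles (point estimate ~11%)
lo, hi = wilson_ci(16, 151)
```

For these counts the interval comes out close to the 7%–16% reported in the abstract; small differences are expected if the authors used a different interval method (e.g., exact Clopper–Pearson).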

Minimal adoption of transparency and reproducibility-related research practices could be undermining the credibility and efficiency of social science research. The present study establishes a baseline that can be revisited in the future to assess progress.

URL : An empirical assessment of transparency and reproducibility-related research practices in the social sciences (2014–2017)

DOI : https://doi.org/10.1098/rsos.190806

Reproducible research practices, openness and transparency in health economic evaluations: study protocol for a cross-sectional comparative analysis

Authors : Ferrán Catalá-López, Lisa Caulley, Manuel Ridao, Brian Hutton, Don Husereau, Michael F Drummond, Adolfo Alonso-Arroyo, Manuel Pardo-Fernández, Enrique Bernal-Delgado, Ricard Meneu, Rafael Tabarés-Seisdedos, José Ramón Repullo, David Moher

Introduction

There has been growing awareness of the need for rigorous and transparent reporting of health research, to ensure that studies can be reproduced by future researchers.

Health economic evaluations, the comparative analysis of alternative interventions in terms of their costs and consequences, have been promoted as an important tool to inform decision-making.

The objective of this study will be to investigate the extent to which articles of economic evaluations of healthcare interventions indexed in MEDLINE incorporate research practices that promote transparency, openness and reproducibility.

Methods and analysis

This is the study protocol for a cross-sectional comparative analysis. We registered the study protocol within the Open Science Framework (osf.io/gzaxr). We will evaluate a random sample of 600 cost-effectiveness analysis publications, a specific form of health economic evaluations, indexed in MEDLINE during 2012 (n=200), 2019 (n=200) and 2022 (n=200).

We will include published papers written in English reporting an incremental cost-effectiveness ratio in terms of costs per life years gained, quality-adjusted life years and/or disability-adjusted life years. Screening and selection of articles will be conducted by at least two researchers.

Reproducible research practices, openness and transparency in each article will be extracted using a standardised data extraction form by multiple researchers, with a 33% random sample (n=200) extracted in duplicate.

Information on general, methodological and reproducibility items will be reported, stratified by year, citation of the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement and journal. Risk ratios with 95% CIs will be calculated to represent changes in reporting between 2012–2019 and 2019–2022.
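A risk ratio with a 95% CI, as planned above, is conventionally computed on the log scale. A minimal sketch follows; the protocol does not give a formula, so the log-scale method is an assumption, and the 40/200 vs 20/200 counts are purely illustrative, not study data.

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio of two proportions with a 95% CI computed on the log scale."""
    rr = (a / n1) / (b / n2)
    # standard error of log(RR) for two independent binomial samples
    se = math.sqrt(1/a - 1/n1 + 1/b - 1/n2)
    log_rr = math.log(rr)
    return rr, math.exp(log_rr - z * se), math.exp(log_rr + z * se)

# Hypothetical counts: 40/200 articles sharing data in 2019 vs 20/200 in 2012
rr, lo, hi = risk_ratio_ci(40, 200, 20, 200)
```

A CI whose lower bound exceeds 1 (as here) would indicate a statistically significant increase in reporting between the two years.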

Ethics and dissemination

Due to the nature of the proposed study, no ethical approval will be required. All data will be deposited in a cross-disciplinary public repository.

It is anticipated the study findings could be relevant to a variety of audiences. Study findings will be disseminated at scientific conferences and published in peer-reviewed journals.

URL : Reproducible research practices, openness and transparency in health economic evaluations: study protocol for a cross-sectional comparative analysis

DOI : http://dx.doi.org/10.1136/bmjopen-2019-034463

Publishing computational research — A review of infrastructures for reproducible and transparent scholarly communication

Authors : Markus Konkol, Daniel Nüst, Laura Goulier

Funding agencies increasingly ask applicants to include data and software management plans into proposals. In addition, the author guidelines of scientific journals and conferences more often include a statement on data availability, and some reviewers reject unreproducible submissions.

This trend towards open science increases the pressure on authors to provide access to the source code and data underlying the computational results in their scientific papers.

Still, publishing reproducible articles is a demanding task that is not achieved simply by providing access to code scripts and data files. Consequently, several projects are developing solutions to support the publication of executable analyses alongside articles, considering the needs of the aforementioned stakeholders.

The key contribution of this paper is a review of applications addressing the issue of publishing executable computational research results. We compare the approaches across properties relevant for the involved stakeholders, e.g., provided features and deployment options, and also critically discuss trends and limitations.

The review can help publishers decide which system to integrate into their submission process, editors recommend tools to researchers, and authors of scientific papers adhere to reproducibility principles.

URL : https://arxiv.org/abs/2001.00484

Transparent, Reproducible, and Open Science Practices of Published Literature in Dermatology Journals: Cross-Sectional Analysis

Authors : J Michael Anderson, Andrew Niemann, Austin L Johnson, Courtney Cook, Daniel Tritz, Matt Vassar

Background

Reproducible research is a foundational component for scientific advancements, yet little is known regarding the extent of reproducible research within the dermatology literature.

Objective

This study aimed to determine the quality and transparency of the literature in dermatology journals by evaluating for the presence of 8 indicators of reproducible and transparent research practices.

Methods

By implementing a cross-sectional study design, we conducted an advanced search of publications in dermatology journals from the National Library of Medicine catalog. Our search included articles published between January 1, 2014, and December 31, 2018.

After generating a list of eligible dermatology publications, we then searched for full text PDF versions by using Open Access Button, Google Scholar, and PubMed. Publications were analyzed for 8 indicators of reproducibility and transparency—availability of materials, data, analysis scripts, protocol, preregistration, conflict of interest statement, funding statement, and open access—using a pilot-tested Google Form.

Results

After exclusion, 127 studies with empirical data were included in our analysis. Certain indicators were more poorly reported than others. We found that most publications (113, 88.9%) did not provide unmodified, raw data used to make computations, 124 (97.6%) failed to make the complete protocol available, and 126 (99.2%) did not include step-by-step analysis scripts.

Conclusions

The studies in our sample of dermatology journals do not appear to include sufficient detail to be accurately and successfully reproduced in their entirety. Solutions to increase the quality, reproducibility, and transparency of dermatology research are warranted.

More robust reporting of key methodological details, open data sharing, and stricter journal standards for authors regarding disclosure of study materials might help improve the climate of reproducible research in dermatology.

URL : Transparent, Reproducible, and Open Science Practices of Published Literature in Dermatology Journals: Cross-Sectional Analysis

DOI : https://doi.org/10.2196/16078

Publishers’ Responsibilities in Promoting Data Quality and Reproducibility

Author : Iain Hrynaszkiewicz

Scholarly publishers can help to increase data quality and reproducible research by promoting transparency and openness.

Increasing transparency can be achieved by publishers in six key areas:

(1) understanding researchers’ problems and motivations, by conducting and responding to the findings of surveys;

(2) raising awareness of issues and encouraging behavioural and cultural change, by introducing consistent journal policies on sharing research data, code and materials;

(3) improving the quality and objectivity of the peer-review process, by implementing reporting guidelines and checklists and using technology to identify misconduct;

(4) improving scholarly communication infrastructure, with journals that publish all scientifically sound research, promoting study registration, partnering with data repositories and providing services that improve data sharing and curation;

(5) increasing incentives for practising open research, with data journals and software journals, and by implementing data citation and badges for transparency;

(6) making research communication more open and accessible, with open-access publishing options, permitting text and data mining, and sharing publisher data and metadata through industry and community collaboration.

This chapter describes practical approaches being taken by publishers, in these six areas, their progress and effectiveness and the implications for researchers publishing their work.

URL : Publishers’ Responsibilities in Promoting Data Quality and Reproducibility

Alternative location : https://link.springer.com/chapter/10.1007%2F164_2019_290