Transparency and open science principles in reporting guidelines in sleep research and chronobiology journals

Authors : Manuel Spitschan, Marlene H. Schmidt, Christine Blume

Background

“Open science” is an umbrella term describing various aspects of transparent and open science practices. The adoption of these practices at different levels of the scientific process (e.g., individual researchers, laboratories, institutions) has been rapidly changing the scientific research landscape in recent years, but their uptake differs from discipline to discipline.

Here, we asked to what extent journals in the field of sleep research and chronobiology encourage, or even require, adherence to transparent and open science principles in their author guidelines.

Methods

We scored the author guidelines of a comprehensive set of 28 sleep and chronobiology journals, including the major outlets in the field, using the standardised Transparency and Openness Promotion (TOP) Factor.

This instrument rates the extent to which journals encourage or require following various aspects of open science, including data citation, data transparency, analysis code transparency, materials transparency, design and analysis guidelines, study pre-registration, analysis plan pre-registration, replication, registered reports, and the use of open science badges.

Results

Across the 28 journals, we found low TOP Factor scores (median 2.5 [25th, 75th percentile: 1, 3]; min. 0, max. 9; out of a maximum possible score of 28).
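
For readers who want to reproduce this style of summary, the reported values are simple order statistics over the per-journal scores. A minimal sketch in Python, using invented scores rather than the paper’s actual data:

    # Order statistics of the kind reported above, computed over
    # hypothetical TOP Factor scores (illustrative values only; the
    # actual per-journal scores are in the paper's dataset).
    import statistics

    scores = [0, 1, 1, 2, 2, 3, 3, 4, 9]  # invented scores, not the study data
    q1, med, q3 = statistics.quantiles(scores, n=4, method="inclusive")
    print(f"median {med} [{q1}, {q3}], min. {min(scores)}, max. {max(scores)}")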

Conclusions

Our findings suggest an opportunity for sleep research and chronobiology journals to further support the recent developments in transparent and open science by implementing transparency and openness principles in their guidelines and making adherence to them mandatory.

DOI : https://doi.org/10.12688/wellcomeopenres.16111.1

Open peer review: promoting transparency in open science

Authors : Dietmar Wolfram, Peiling Wang, Adam Hembree, Hyoungjoo Park

Open peer review (OPR), where review reports and reviewers’ identities are published alongside the articles, represents one of the last aspects of the open science movement to be widely embraced, although its adoption has been growing since the turn of the century.

This study provides the first comprehensive investigation of OPR adoption, its early adopters and the implementation approaches used. Current bibliographic databases do not systematically index OPR journals, nor do the OPR journals clearly state their policies on open identities and open reports.

Using various methods, we identified 617 OPR journals that published at least one article with open identities or open reports as of 2019 and analyzed their wide-ranging implementations to derive emerging OPR practices.

The findings suggest that: (1) there has been a steady growth in OPR adoption since 2001, when 38 journals initially adopted OPR, with more rapid growth since 2017; (2) OPR adoption is most prevalent in medical and scientific disciplines (79.9%); (3) five publishers are responsible for 81% of the identified OPR journals; (4) early adopter publishers have implemented OPR in different ways, resulting in different levels of transparency.

Across the variations in OPR implementations, two important factors define the degree of transparency: open identities and open reports. Open identities may include reviewers’ names, affiliations and credentials; open reports may include timestamped review histories consisting of referee reports and author rebuttals, or a letter from the editor integrating the reviewers’ comments.

When and where open reports can be accessed are further important factors indicating a journal’s level of OPR transparency. Publishers of journals with optional OPR should include metrics on its uptake in their annual status reports.
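
To make this taxonomy concrete, the two dimensions can be modelled as a small record type. This is our own illustration; the field names are hypothetical, not the authors’ coding scheme:

    # Illustrative model of the two OPR transparency dimensions
    # discussed above (open identities and open reports). Field names
    # are invented for this sketch, not taken from the paper.
    from dataclasses import dataclass

    @dataclass
    class OPRPolicy:
        # open identities: which reviewer details are published
        reviewer_names: bool = False
        reviewer_affiliations: bool = False
        reviewer_credentials: bool = False
        # open reports: which parts of the review history are published
        referee_reports: bool = False
        author_rebuttals: bool = False
        editor_summary_letter: bool = False

        def transparency_signals(self) -> int:
            """Naive count of enabled signals; a real analysis would weight them."""
            return sum(vars(self).values())

    # e.g., a journal publishing signed reports together with author rebuttals
    policy = OPRPolicy(reviewer_names=True, referee_reports=True, author_rebuttals=True)
    print(policy.transparency_signals())  # -> 3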

DOI : https://doi.org/10.1007/s11192-020-03488-4

Reproducible and transparent research practices in published neurology research

Authors : Trevor Torgerson, Austin L. Johnson, Jonathan Pollard, Daniel Tritz, Matt Vassar

Background

The objective of this study was to evaluate the nature and extent of reproducible and transparent research practices in neurology publications.

Methods

The NLM catalog was used to identify MEDLINE-indexed neurology journals. A PubMed search of these journals was conducted to retrieve publications over a 5-year period from 2014 to 2018.

A random sample of publications was extracted. Two authors conducted data extraction in a blinded, duplicate fashion using a pilot-tested Google form. This form prompted data extractors to determine whether publications provided access to items such as study materials, raw data, analysis scripts, and protocols.

In addition, we determined if the publication was included in a replication study or systematic review, was preregistered, had a conflict of interest declaration, specified funding sources, and was open access.

Results

Our search identified 223,932 publications meeting the inclusion criteria, from which 400 were randomly sampled. Of these, 389 articles were accessible, yielding 271 publications with empirical data for analysis.

Our results indicate that 9.4% provided access to materials, 9.2% provided access to raw data, 0.7% provided access to the analysis scripts, 0.7% linked the protocol, and 3.7% were preregistered.

A third of sampled publications lacked funding or conflict of interest statements. No publications from our sample were included in replication studies, but a fifth were cited in a systematic review or meta-analysis.

Conclusions

Currently, published neurology research does not consistently provide the information needed for reproducibility. Poor research reporting can both compromise patient care and increase research waste. Collaborative intervention by authors, peer reviewers, journals, and funding sources is needed to mitigate this problem.

DOI : https://doi.org/10.1186/s41073-020-0091-5

An empirical assessment of transparency and reproducibility-related research practices in the social sciences (2014–2017)

Authors : Tom E. Hardwicke, Joshua D. Wallach, Mallory C. Kidwell, Theiss Bendixen, Sophia Crüwell, John P. A. Ioannidis

Serious concerns about research quality have catalysed a number of reform initiatives intended to improve transparency and reproducibility and thus facilitate self-correction, increase efficiency and enhance research credibility.

Meta-research has evaluated the merits of some individual initiatives; however, this may not capture broader trends reflecting the cumulative contribution of these efforts.

In this study, we manually examined a random sample of 250 articles to estimate the prevalence of a range of transparency and reproducibility-related indicators in the social sciences literature published between 2014 and 2017.

Few articles indicated availability of materials (16/151, 11% [95% confidence interval, 7% to 16%]), protocols (0/156, 0% [0% to 1%]), raw data (11/156, 7% [2% to 13%]) or analysis scripts (2/156, 1% [0% to 3%]), and no studies were pre-registered (0/156, 0% [0% to 1%]).

Some articles explicitly disclosed funding sources (or the lack thereof; 74/236, 31% [25% to 37%]) and some declared no conflicts of interest (36/236, 15% [11% to 20%]). Replication studies were rare (2/156, 1% [0% to 3%]).

Few studies were included in evidence synthesis via systematic review (17/151, 11% [7% to 16%]) or meta-analysis (2/151, 1% [0% to 3%]). Less than half the articles were publicly available (101/250, 40% [34% to 47%]).
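
The abstract does not state which interval method produced these CIs, but a Wilson score interval, a common choice for proportions of this kind, closely reproduces the quoted bounds (e.g., for materials availability, 16/151). A minimal sketch:

    # Wilson score 95% interval for a proportion -- one standard way to
    # obtain CIs like those quoted above. Shown as an illustration; the
    # paper's exact method is not stated in the abstract.
    from math import sqrt

    def wilson_ci(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
        p = k / n
        centre = p + z**2 / (2 * n)
        margin = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        denom = 1 + z**2 / n
        return (centre - margin) / denom, (centre + margin) / denom

    lo, hi = wilson_ci(16, 151)  # materials availability: 16/151, 11%
    print(f"[{lo:.1%} to {hi:.1%}]")  # approximately [6.6% to 16.5%]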

Minimal adoption of transparency and reproducibility-related research practices could be undermining the credibility and efficiency of social science research. The present study establishes a baseline that can be revisited in the future to assess progress.

DOI : https://doi.org/10.1098/rsos.190806

Reproducible research practices, openness and transparency in health economic evaluations: study protocol for a cross-sectional comparative analysis

Authors : Ferrán Catalá-López, Lisa Caulley, Manuel Ridao, Brian Hutton, Don Husereau, Michael F Drummond, Adolfo Alonso-Arroyo, Manuel Pardo-Fernández, Enrique Bernal-Delgado, Ricard Meneu, Rafael Tabarés-Seisdedos, José Ramón Repullo, David Moher

Introduction

There has been growing awareness of the need for rigorous and transparently reported health research, to ensure the reproducibility of studies by future researchers.

Health economic evaluations, the comparative analysis of alternative interventions in terms of their costs and consequences, have been promoted as an important tool to inform decision-making.

The objective of this study will be to investigate the extent to which articles of economic evaluations of healthcare interventions indexed in MEDLINE incorporate research practices that promote transparency, openness and reproducibility.

Methods and analysis

This is the study protocol for a cross-sectional comparative analysis, registered with the Open Science Framework (osf.io/gzaxr). We will evaluate a random sample of 600 cost-effectiveness analysis publications, a specific form of health economic evaluation, indexed in MEDLINE during 2012 (n=200), 2019 (n=200) and 2022 (n=200).
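
This sampling design amounts to stratified random sampling with equal allocation across the three publication years. A minimal sketch, with placeholder record identifiers rather than real MEDLINE entries:

    # Stratified random sample: 200 records per publication-year
    # stratum, as described above. The IDs are placeholders, not real
    # MEDLINE identifiers.
    import random

    records_by_year = {
        year: [f"pmid-{year}-{i}" for i in range(5000)]  # hypothetical pools
        for year in (2012, 2019, 2022)
    }

    rng = random.Random(42)  # fixed seed makes the draw itself reproducible
    sample = {year: rng.sample(pool, 200) for year, pool in records_by_year.items()}
    assert sum(len(s) for s in sample.values()) == 600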

We will include published papers written in English reporting an incremental cost-effectiveness ratio in terms of costs per life years gained, quality-adjusted life years and/or disability-adjusted life years. Screening and selection of articles will be conducted by at least two researchers.

Reproducible research practices, openness and transparency in each article will be extracted using a standardised data extraction form by multiple researchers, with a 33% random sample (n=200) extracted in duplicate.

Information on general, methodological and reproducibility items will be reported, stratified by year, by citation of the Consolidated Health Economic Evaluation Reporting Standards (CHEERS) statement and by journal. Risk ratios with 95% CIs will be calculated to represent changes in reporting between 2012 and 2019 and between 2019 and 2022.
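
For the risk ratios, the standard log-transformed interval would suffice. A sketch with invented counts, since the protocol prescribes the estimand but not the code:

    # Risk ratio with a log-normal 95% CI (standard textbook formula).
    # Counts are invented for illustration.
    from math import exp, log, sqrt

    def risk_ratio_ci(a: int, n1: int, b: int, n2: int, z: float = 1.96):
        """RR comparing event risk a/n1 (later period) with b/n2 (earlier period)."""
        rr = (a / n1) / (b / n2)
        se = sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
        return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

    # e.g., data sharing reported in 40/200 articles in 2019 vs 20/200 in 2012
    rr, lo, hi = risk_ratio_ci(40, 200, 20, 200)
    print(f"RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")  # RR = 2.00 (1.21 to 3.30)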

Ethics and dissemination

Due to the nature of the proposed study, no ethical approval will be required. All data will be deposited in a cross-disciplinary public repository.

It is anticipated that the study findings will be relevant to a variety of audiences; they will be disseminated at scientific conferences and published in peer-reviewed journals.

DOI : http://dx.doi.org/10.1136/bmjopen-2019-034463

Publishing computational research — A review of infrastructures for reproducible and transparent scholarly communication

Authors : Markus Konkol, Daniel Nüst, Laura Goulier

Funding agencies increasingly ask applicants to include data and software management plans in their proposals. In addition, the author guidelines of scientific journals and conferences now more often include a statement on data availability, and some reviewers reject unreproducible submissions.

This trend towards open science increases the pressure on authors to provide access to the source code and data underlying the computational results in their scientific papers.

Still, publishing reproducible articles is a demanding task, and it is not achieved simply by providing access to code scripts and data files. Consequently, several projects are developing solutions to support the publication of executable analyses alongside articles, taking the needs of the aforementioned stakeholders into account.

The key contribution of this paper is a review of applications addressing the issue of publishing executable computational research results. We compare the approaches across properties relevant to the involved stakeholders, e.g., provided features and deployment options, and critically discuss trends and limitations.

The review can help publishers decide which system to integrate into their submission process, editors recommend tools to researchers, and authors of scientific papers adhere to reproducibility principles.

URL : https://arxiv.org/abs/2001.00484

Transparent, Reproducible, and Open Science Practices of Published Literature in Dermatology Journals: Cross-Sectional Analysis

Authors : J Michael Anderson, Andrew Niemann, Austin L Johnson, Courtney Cook, Daniel Tritz, Matt Vassar

Background

Reproducible research is a foundational component of scientific advancement, yet little is known about the extent of reproducible research within the dermatology literature.

Objective

This study aimed to determine the quality and transparency of the literature in dermatology journals by evaluating the presence of 8 indicators of reproducible and transparent research practices.

Methods

Using a cross-sectional study design, we conducted an advanced search of publications in dermatology journals from the National Library of Medicine catalog. Our search included articles published between January 1, 2014, and December 31, 2018.

After generating a list of eligible dermatology publications, we searched for full-text PDF versions using the Open Access Button, Google Scholar, and PubMed. Publications were analyzed for 8 indicators of reproducibility and transparency (availability of materials, data, analysis scripts, and protocol; preregistration; conflict of interest statement; funding statement; and open access) using a pilot-tested Google Form.

Results

After exclusion, 127 studies with empirical data were included in our analysis. Certain indicators were reported more poorly than others: most publications (113/127, 88.9%) did not provide the unmodified, raw data used for their computations, 124 (97.6%) failed to make the complete protocol available, and 126 (99.2%) did not include step-by-step analysis scripts.

Conclusions

The studies published in dermatology journals that we sampled do not appear to include sufficient detail to be accurately and successfully reproduced in their entirety. Solutions to increase the quality, reproducibility, and transparency of dermatology research are warranted.

More robust reporting of key methodological details, open data sharing, and stricter journal standards regarding the disclosure of study materials might help improve the climate of reproducible research in dermatology.

DOI : https://doi.org/10.2196/16078