Comparison of Clinical Study Results Reported in medRxiv Preprints vs Peer-reviewed Journal Articles

Authors : Guneet Janda, Vishal Khetpal, Xiaoting Shi, Joseph S. Ross, Joshua D. Wallach

Question

What is the concordance among sample size, primary end points, results for primary end points, and interpretations described in preprints of clinical studies posted on medRxiv that are subsequently published in peer-reviewed journals (preprint-journal article pairs)?

Findings

In this cross-sectional study of 547 clinical studies that were initially posted to medRxiv and later published in peer-reviewed journals, 86.4% of preprint-journal article pairs were concordant in terms of sample size, 97.6% in terms of primary end points, 81.1% in terms of results of primary end points, and 96.2% in terms of study interpretations.

Meaning

This study suggests that most clinical studies posted as preprints on medRxiv and subsequently published in peer-reviewed journals had concordant study characteristics, results, and final interpretations.

URL : Comparison of Clinical Study Results Reported in medRxiv Preprints vs Peer-reviewed Journal Articles

Original location : https://jamanetwork.com/journals/jamanetworkopen/fullarticle/2799350

Assessment of transparency indicators across the biomedical literature: How open is open?

Authors : Stylianos Serghiou, Despina G. Contopoulos-Ioannidis, Kevin W. Boyack, Nico Riedel, Joshua D. Wallach, John P. A. Ioannidis

Recent concerns about the reproducibility of science have led to several calls for more open and transparent research practices and for the monitoring of potential improvements over time. However, with tens of thousands of new biomedical articles published per week, manually mapping and monitoring changes in transparency is unrealistic.

We present an open-source, automated approach to identify 5 indicators of transparency (data sharing, code sharing, conflicts of interest disclosures, funding disclosures, and protocol registration) and apply it across the entire open access biomedical literature of 2.75 million articles on PubMed Central (PMC).

Our results indicate remarkable improvements over time in some areas of transparency (e.g., conflict of interest [COI] disclosures and funding disclosures) but not others (e.g., protocol registration and code sharing), and map transparency across fields of science, countries, journals, and publishers.

This work has enabled the creation of a large, integrated, and openly available database to expedite further efforts to monitor, understand, and promote transparency and reproducibility in science.

URL : Assessment of transparency indicators across the biomedical literature: How open is open?

DOI : https://doi.org/10.1371/journal.pbio.3001107

Publishing at any cost: a cross-sectional study of the amount that medical researchers spend on open access publishing each year

Authors : Mallory K. Ellingson, Xiaoting Shi, Joshua J. Skydel, Kate Nyhan, Richard Lehman, Joseph S. Ross, Joshua D. Wallach

Objective

To estimate the financial costs paid by individual medical researchers to meet the article processing charges (APCs) levied by open access journals in 2019.

Design

Cross-sectional analysis.

Data sources

Scopus was used to generate two random samples of researchers, the first with a senior author article indexed in the ‘Medicine’ subject area (general researchers) and the second with an article published in the ten highest-impact factor general clinical medicine journals (high-impact researchers) in 2019.

For each researcher, Scopus was used to identify all first and senior author original research or review articles published in 2019. Data were obtained from Scopus, institutional profiles, Journal Citation Reports, publisher databases, the Directory of Open Access Journals, and individual journal websites.

Main outcome measures

Median APCs paid by general and high-impact researchers for all first and senior author research and review articles published in 2019.

Results

There were 241 general and 246 high-impact researchers identified as eligible for our study. In 2019, the general and high-impact researchers published a total of 914 (median 2, IQR 1–5) and 1471 (4, 2–8) first or senior author research or review articles, respectively. Overall, 42% (384/914) of the articles from the general researchers and 29% (428/1471) of the articles from the high-impact medical researchers were published in fully open access journals.

The median total APCs paid by general researchers in 2019 was US$191 (IQR US$0–US$2500) and the median total paid by high-impact researchers was US$2900 (IQR US$0–US$5465); the maximum paid by a single researcher in total APCs was US$30115 and US$34676, respectively.

Conclusions

Medical researchers in 2019 were found to have paid between US$0 and US$34676 in total APCs. As journals with APCs become more common, it is important to continue to evaluate the potential cost to researchers, especially for individuals who may not have the funding or institutional resources to cover these costs.

URL : Publishing at any cost: a cross-sectional study of the amount that medical researchers spend on open access publishing each year

DOI : http://dx.doi.org/10.1136/bmjopen-2020-047107

An empirical assessment of transparency and reproducibility-related research practices in the social sciences (2014–2017)

Authors : Tom E. Hardwicke, Joshua D. Wallach, Mallory C. Kidwell, Theiss Bendixen, Sophia Crüwell, John P. A. Ioannidis

Serious concerns about research quality have catalysed a number of reform initiatives intended to improve transparency and reproducibility and thus facilitate self-correction, increase efficiency and enhance research credibility.

Meta-research has evaluated the merits of some individual initiatives; however, this may not capture broader trends reflecting the cumulative contribution of these efforts.

In this study, we manually examined a random sample of 250 articles in order to estimate the prevalence of a range of transparency and reproducibility-related indicators in the social sciences literature published between 2014 and 2017.

Few articles indicated availability of materials (16/151, 11% [95% confidence interval, 7% to 16%]), protocols (0/156, 0% [0% to 1%]), raw data (11/156, 7% [2% to 13%]) or analysis scripts (2/156, 1% [0% to 3%]), and no studies were pre-registered (0/156, 0% [0% to 1%]).

Some articles explicitly disclosed funding sources (or lack thereof; 74/236, 31% [25% to 37%]) and some declared no conflicts of interest (36/236, 15% [11% to 20%]). Replication studies were rare (2/156, 1% [0% to 3%]).

Few studies were included in evidence synthesis via systematic review (17/151, 11% [7% to 16%]) or meta-analysis (2/151, 1% [0% to 3%]). Less than half the articles were publicly available (101/250, 40% [34% to 47%]).

Minimal adoption of transparency and reproducibility-related research practices could be undermining the credibility and efficiency of social science research. The present study establishes a baseline that can be revisited in the future to assess progress.

URL : An empirical assessment of transparency and reproducibility-related research practices in the social sciences (2014–2017)

DOI : https://doi.org/10.1098/rsos.190806

Reproducible research practices, transparency, and open access data in the biomedical literature, 2015–2017

Authors : Joshua D. Wallach, Kevin W. Boyack, John P. A. Ioannidis

Currently, there is a growing interest in ensuring the transparency and reproducibility of the published scientific literature. According to a previous evaluation of 441 biomedical journal articles published in 2000–2014, the biomedical literature largely lacked transparency in important dimensions.

Here, we surveyed a random sample of 149 biomedical articles published between 2015 and 2017 and determined the proportion reporting sources of public and/or private funding and conflicts of interests, sharing protocols and raw data, and undergoing rigorous independent replication and reproducibility checks.

We also investigated what can be learned about reproducibility and transparency indicators from open access data provided on PubMed. The majority of the 149 studies disclosed some information regarding funding (103, 69.1% [95% confidence interval, 61.0% to 76.3%]) or conflicts of interest (97, 65.1% [56.8% to 72.6%]).
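The abstracts above report binomial proportions with 95% confidence intervals but do not state which interval method was used. As an illustration only, the following sketch computes the Wilson score interval for the funding-disclosure proportion (103 of 149 articles); the `wilson_ci` helper is hypothetical, and its output is close to, but not identical with, the reported 69.1% [61.0% to 76.3%], suggesting the authors may have used a different method such as Clopper-Pearson.

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# 103 of 149 articles disclosed some funding information.
lo, hi = wilson_ci(103, 149)
print(f"{103/149:.1%} [{lo:.1%} to {hi:.1%}]")  # 69.1% [61.3% to 76.0%]
```

The Wilson interval is a common default for proportions of this size because, unlike the naive normal approximation, it stays within [0, 1] and behaves well for proportions near 0 or 1 (e.g., the 1/104 protocol-link finding below).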

Among the 104 articles with empirical data in which protocols or data sharing would be pertinent, 19 (18.3% [11.6% to 27.3%]) discussed publicly available data; only one (1.0% [0.1% to 6.0%]) included a link to a full study protocol. Among the 97 articles in which replication in studies with different data would be pertinent, there were five replication efforts (5.2% [1.9% to 12.2%]).

Although clinical trial identification numbers and funding details were often provided on PubMed, only two of the articles without a full-text article in PubMed Central that discussed publicly available data at the full-text level also contained information related to data sharing on PubMed; none had a conflicts of interest statement on PubMed.

Our evaluation suggests that although there have been improvements over the last few years in certain key indicators of reproducibility and transparency, opportunities exist to improve reproducible research practices across the biomedical literature and to make features related to reproducibility more readily visible in PubMed.

URL : Reproducible research practices, transparency, and open access data in the biomedical literature, 2015–2017

DOI : https://doi.org/10.1371/journal.pbio.2006930