Two years into the Brazilian Reproducibility Initiative: reflections on conducting a large-scale replication of Brazilian biomedical science

Authors : Kleber Neves, Clarissa FD Carneiro, Ana Paula Wasilewska-Sampaio, Mariana Abreu, Bruna Valério-Gomes, Pedro B Tan, Olavo B Amaral

Scientists have increasingly recognised that low methodological and analytical rigour combined with publish-or-perish incentives can make the published scientific literature unreliable.

As a response to this, large-scale systematic replications of the literature have emerged as a way to assess the problem empirically. The Brazilian Reproducibility Initiative is one such effort, aimed at estimating the reproducibility of Brazilian biomedical research.

Its goal is to perform multicentre replications of a quasi-random sample of at least 60 experiments from Brazilian articles published over a 20-year period, using a set of common laboratory methods.

In this article, we describe the challenges of managing a multicentre project with collaborating teams across the country, as well as its successes and failures over the first two years.

We end with a brief discussion of the Initiative’s current status and its possible future contributions after the project is concluded in 2021.

DOI : https://doi.org/10.1590/0074-02760200328

What senior academics can do to support reproducible and open research: a short, three-step guide

Authors : Olivia Kowalczyk, Alexandra Lautarescu, Elisabet Blok, Lorenza Dall’Aglio, Samuel Westwood

Increasingly, policies are being introduced to reward and recognise open research practices, while the adoption of such practices into research routines is being facilitated by many grassroots initiatives.

However, despite this widespread endorsement and support, open research has yet to be widely adopted, with early career researchers being the notable exception. For open research to become the norm, initiatives should engage academics at all career stages, particularly senior academics (namely, senior lecturers, readers, and professors), given their routine involvement in determining the quality of research.

Senior academics, however, face unique challenges in implementing policy change and supporting grassroots initiatives. Given that – like all researchers – senior academics are in part motivated by self-interest, this paper lays out three feasible steps that senior academics can take to improve the quality and productivity of their research, that also serve to engender open research.

These steps include a) changing hiring criteria, b) changing how scholarly outputs are credited, and c) changing funding and publishing practices to support open research. The guidance we provide is accompanied by live, crowd-sourced material for further reading.

Original location : https://psyarxiv.com/jyfr7

Low availability of code in ecology: A call for urgent action

Authors : Antica Culina, Ilona van den Berg, Simon Evans, Alfredo Sánchez-Tójar

Access to analytical code is essential for transparent and reproducible research. We review the state of code availability in ecology using a random sample of 346 nonmolecular articles published between 2015 and 2019 under mandatory or encouraged code-sharing policies.

Our results call for urgent action to increase code availability: only 27% of eligible articles were accompanied by code. In contrast, data were available for 79% of eligible articles, highlighting that code availability is an important limiting factor for computational reproducibility in ecology.

Although the percentage of ecological journals with mandatory or encouraged code-sharing policies has increased considerably, from 15% in 2015 to 75% in 2020, our results show that code-sharing policies are not adhered to by most authors.

We hope these results will encourage journals, institutions, funding agencies, and researchers to address this alarming situation.

DOI : https://doi.org/10.1371/journal.pbio.3000763

No raw data, no science: another possible source of the reproducibility crisis

Author : Tsuyoshi Miyakawa

A reproducibility crisis is a situation where many scientific studies cannot be reproduced. Inappropriate practices of science, such as HARKing, p-hacking, and selective reporting of positive results, have been suggested as causes of irreproducibility. In this editorial, I propose that a lack of raw data or data fabrication is another possible cause of irreproducibility.

As an Editor-in-Chief of Molecular Brain, I have handled 180 manuscripts since early 2017 and have made 41 editorial decisions categorized as “Revise before review,” requesting that the authors provide raw data.

Surprisingly, among those 41 manuscripts, 21 were withdrawn without providing raw data, indicating that requiring raw data drove away more than half of the manuscripts. I rejected 19 out of the remaining 20 manuscripts because of insufficient raw data.

Thus, more than 97% of the 41 manuscripts did not present the raw data supporting their results when requested by an editor, suggesting the possibility that the raw data did not exist from the beginning in at least some of these cases.

Considering that any scientific study should be based on raw data, and that data storage space should no longer be a challenge, journals should, in principle, have their authors make raw data public in a database or on the journal site upon publication of the paper. This would increase the reproducibility of published results and strengthen public trust in science.

DOI : https://doi.org/10.1186/s13041-020-0552-2

Reproducible and transparent research practices in published neurology research

Authors : Trevor Torgerson, Austin L. Johnson, Jonathan Pollard, Daniel Tritz, Matt Vassar

Background

The objective of this study was to evaluate the nature and extent of reproducible and transparent research practices in neurology publications.

Methods

The NLM catalog was used to identify MEDLINE-indexed neurology journals. A PubMed search of these journals was conducted to retrieve publications over a 5-year period from 2014 to 2018.

A random sample of publications was extracted. Two authors conducted data extraction in a blinded, duplicate fashion using a pilot-tested Google form. This form prompted data extractors to determine whether publications provided access to items such as study materials, raw data, analysis scripts, and protocols.

In addition, we determined if the publication was included in a replication study or systematic review, was preregistered, had a conflict of interest declaration, specified funding sources, and was open access.

Results

Our search identified 223,932 publications meeting the inclusion criteria, from which 400 were randomly sampled. Of these, 389 articles were accessible, yielding 271 publications with empirical data for analysis.

Our results indicate that 9.4% provided access to materials, 9.2% provided access to raw data, 0.7% provided access to the analysis scripts, 0.7% linked the protocol, and 3.7% were preregistered.

A third of sampled publications lacked funding or conflict of interest statements. No publications from our sample were included in replication studies, but a fifth were cited in a systematic review or meta-analysis.

Conclusions

Currently, published neurology research does not consistently provide information needed for reproducibility. The implications of poor research reporting can both affect patient care and increase research waste. Collaborative intervention by authors, peer reviewers, journals, and funding sources is needed to mitigate this problem.

DOI : https://doi.org/10.1186/s41073-020-0091-5

An empirical assessment of transparency and reproducibility-related research practices in the social sciences (2014–2017)

Authors : Tom E. Hardwicke, Joshua D. Wallach, Mallory C. Kidwell, Theiss Bendixen, Sophia Crüwell, John P. A. Ioannidis

Serious concerns about research quality have catalysed a number of reform initiatives intended to improve transparency and reproducibility and thus facilitate self-correction, increase efficiency and enhance research credibility.

Meta-research has evaluated the merits of some individual initiatives; however, this may not capture broader trends reflecting the cumulative contribution of these efforts.

In this study, we manually examined a random sample of 250 articles in order to estimate the prevalence of a range of transparency and reproducibility-related indicators in the social sciences literature published between 2014 and 2017.

Few articles indicated availability of materials (16/151, 11% [95% confidence interval, 7% to 16%]), protocols (0/156, 0% [0% to 1%]), raw data (11/156, 7% [2% to 13%]) or analysis scripts (2/156, 1% [0% to 3%]), and no studies were pre-registered (0/156, 0% [0% to 1%]).

Some articles explicitly disclosed funding sources (or the lack thereof; 74/236, 31% [25% to 37%]) and some declared no conflicts of interest (36/236, 15% [11% to 20%]). Replication studies were rare (2/156, 1% [0% to 3%]).

Few studies were included in evidence synthesis via systematic review (17/151, 11% [7% to 16%]) or meta-analysis (2/151, 1% [0% to 3%]). Less than half the articles were publicly available (101/250, 40% [34% to 47%]).
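The bracketed ranges above are 95% confidence intervals for binomial proportions. As an illustration only, such intervals can be computed with the Wilson score method; the sketch below applies it to the materials-availability estimate (16/151). Note that the abstract does not state which interval method the authors used, so the bounds here may differ from the published figures by a point of rounding.

```python
from math import sqrt

def wilson_interval(k: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score confidence interval for a binomial proportion k/n."""
    p = k / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half_width = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half_width, centre + half_width

# Materials availability was reported as 16/151, i.e. about 11% [7% to 16%]
lo, hi = wilson_interval(16, 151)
print(f"{16/151:.0%} [{lo:.0%} to {hi:.0%}]")
```

The Wilson interval is preferred over the simpler normal ("Wald") approximation for the small proportions reported here, because it never produces bounds below 0% and remains accurate when few events are observed.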

Minimal adoption of transparency and reproducibility-related research practices could be undermining the credibility and efficiency of social science research. The present study establishes a baseline that can be revisited in the future to assess progress.

DOI : https://doi.org/10.1098/rsos.190806