Publishing computational research — A review of infrastructures for reproducible and transparent scholarly communication

Authors : Markus Konkol, Daniel Nüst, Laura Goulier

Funding agencies increasingly ask applicants to include data and software management plans in their proposals. In addition, the author guidelines of scientific journals and conferences more often include a statement on data availability, and some reviewers reject unreproducible submissions.

This trend towards open science increases the pressure on authors to provide access to the source code and data underlying the computational results in their scientific papers.

Still, publishing reproducible articles is a demanding task that is not achieved simply by providing access to code scripts and data files. Consequently, several projects are developing solutions to support the publication of executable analyses alongside articles, taking into account the needs of the aforementioned stakeholders.

The key contribution of this paper is a review of applications addressing the issue of publishing executable computational research results. We compare the approaches across properties relevant to the stakeholders involved, e.g., provided features and deployment options, and critically discuss trends and limitations.

The review can help publishers decide which system to integrate into their submission process, help editors recommend tools to researchers, and help authors of scientific papers adhere to reproducibility principles.

URL : https://arxiv.org/abs/2001.00484

Transparent, Reproducible, and Open Science Practices of Published Literature in Dermatology Journals: Cross-Sectional Analysis

Authors : J Michael Anderson, Andrew Niemann, Austin L Johnson, Courtney Cook, Daniel Tritz, Matt Vassar

Background

Reproducible research is a foundational component for scientific advancements, yet little is known regarding the extent of reproducible research within the dermatology literature.

Objective

This study aimed to determine the quality and transparency of the literature in dermatology journals by evaluating it for the presence of 8 indicators of reproducible and transparent research practices.

Methods

By implementing a cross-sectional study design, we conducted an advanced search of publications in dermatology journals from the National Library of Medicine catalog. Our search included articles published between January 1, 2014, and December 31, 2018.

After generating a list of eligible dermatology publications, we then searched for full text PDF versions by using Open Access Button, Google Scholar, and PubMed. Publications were analyzed for 8 indicators of reproducibility and transparency—availability of materials, data, analysis scripts, protocol, preregistration, conflict of interest statement, funding statement, and open access—using a pilot-tested Google Form.

Results

After exclusion, 127 studies with empirical data were included in our analysis. Certain indicators were more poorly reported than others. We found that most publications (113, 88.9%) did not provide unmodified, raw data used to make computations, 124 (97.6%) failed to make the complete protocol available, and 126 (99.2%) did not include step-by-step analysis scripts.

Conclusions

Our sample of studies published in dermatology journals does not appear to include sufficient detail for the studies to be accurately and successfully reproduced in their entirety. Solutions to increase the quality, reproducibility, and transparency of dermatology research are warranted.

More robust reporting of key methodological details, open data sharing, and stricter standards imposed by journals on authors regarding disclosure of study materials might help improve the climate of reproducible research in dermatology.


DOI : https://doi.org/10.2196/16078

Open Up – the Mission Statement of the Control of Impulsive Action (Ctrl-ImpAct) Lab on Open Science

Authors : Christina B. Reimer, Zhang Chen, Carsten Bundt, Charlotte Eben, Raquel E. London, Sirarpi Vardanian

The present paper is the mission statement of the Control of Impulsive Action (Ctrl-ImpAct) Lab regarding Open Science. As early-career researchers (ECRs) in the lab, we first state our personal motivation to conduct research based on the principles of Open Science.

We then describe how we incorporate four specific Open Science practices (i.e., Open Methodology, Open Data, Open Source, and Open Access) into our scientific workflow. In more detail, we explain how Open Science practices are embedded into the so-called ‘co-pilot’ system in our lab.

The ‘co-pilot’ researcher is involved in all tasks of the ‘pilot’ researcher, that is, designing the study, double-checking experimental and data analysis scripts, and writing the manuscript.

The lab has set up this co-pilot system to increase transparency, to reduce potential errors that could occur throughout the workflow, and to intensify collaboration between lab members.

Finally, we discuss potential solutions for general problems that could arise when practicing Open Science.


DOI : http://doi.org/10.5334/pb.494

Survey on Scientific Shared Resource Rigor and Reproducibility

Authors : Kevin L. Knudtson, Robert H. Carnahan, Rebecca L. Hegstad-Davies, Nancy C. Fisher, Belynda Hicks, Peter A. Lopez, Susan M. Meyn, Sheenah M. Mische, Frances Weis-Garcia, Lisa D. White, Katia Sol-Church

Shared scientific resources, also known as core facilities, support a significant portion of the research conducted at biomolecular research institutions.

The Association of Biomolecular Resource Facilities (ABRF) established the Committee on Core Rigor and Reproducibility (CCoRRe) to further its mission of integrating advanced technologies, education, and communication in the operations of shared scientific resources in support of reproducible research.

In order to first assess the needs of the scientific shared resource community, the CCoRRe solicited feedback from ABRF members via a survey. The purpose of the survey was to gain information on how U.S. National Institutes of Health (NIH) initiatives on advancing scientific rigor and reproducibility influenced current services and new technology development.

In addition, the survey aimed to identify the challenges and opportunities related to implementation of new reporting requirements and to identify new practices and resources needed to ensure rigorous research.

The results revealed a surprising unfamiliarity with the NIH guidelines. Many of the perceived challenges to the effective implementation of best practices (i.e., those designed to ensure rigor and reproducibility) were similarly noted as challenges to the effective provision of support services in a core setting. Further, most cores routinely use best practices and offer services that support rigor and reproducibility.

These services include access to well-maintained instrumentation and training on experimental design and data analysis as well as data management. Feedback from this survey will enable the ABRF to build better educational resources and share critical best-practice guidelines.

These resources will become important tools for the core community and the researchers it serves, helping to improve rigor and transparency across the range of science and technology.

DOI : https://doi.org/10.7171/jbt.19-3003-001

The effect of publishing peer review reports on referee behavior in five scholarly journals

Authors : Giangiacomo Bravo, Francisco Grimaldo, Emilia López-Iñesta, Bahar Mehmani, Flaminio Squazzoni

To increase transparency in science, some scholarly journals are publishing peer review reports. But it is unclear how this practice affects the peer review process. Here, we examine the effect of publishing peer review reports on referee behavior in five scholarly journals involved in a pilot study at Elsevier.

By considering 9,220 submissions and 18,525 reviews from 2010 to 2017, we measured changes both before and during the pilot and found that publishing reports did not significantly compromise referees’ willingness to review, recommendations, or turn-around times.

Younger and non-academic scholars were more willing to accept review invitations and provided more positive and objective recommendations. Male referees tended to write more constructive reports during the pilot.

Only 8.1% of referees agreed to reveal their identity in the published report. These findings suggest that open peer review does not compromise the process, at least when referees are able to protect their anonymity.


DOI : https://doi.org/10.1038/s41467-018-08250-2

Reproducible research practices, transparency, and open access data in the biomedical literature, 2015–2017

Authors : Joshua D. Wallach, Kevin W. Boyack, John P. A. Ioannidis

Currently, there is a growing interest in ensuring the transparency and reproducibility of the published scientific literature. According to a previous evaluation of 441 biomedical journal articles published in 2000–2014, the biomedical literature largely lacked transparency in important dimensions.

Here, we surveyed a random sample of 149 biomedical articles published between 2015 and 2017 and determined the proportion reporting sources of public and/or private funding and conflicts of interests, sharing protocols and raw data, and undergoing rigorous independent replication and reproducibility checks.

We also investigated what can be learned about reproducibility and transparency indicators from open access data provided on PubMed. The majority of the 149 studies disclosed some information regarding funding (103, 69.1% [95% confidence interval, 61.0% to 76.3%]) or conflicts of interest (97, 65.1% [56.8% to 72.6%]).

Among the 104 articles with empirical data in which protocols or data sharing would be pertinent, 19 (18.3% [11.6% to 27.3%]) discussed publicly available data; only one (1.0% [0.1% to 6.0%]) included a link to a full study protocol. Among the 97 articles in which replication in studies with different data would be pertinent, there were five replication efforts (5.2% [1.9% to 12.2%]).
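The bracketed ranges above are 95% confidence intervals for the reported proportions. The abstract does not state which interval method the authors used; the following is a minimal, hedged sketch of how an exact (Clopper-Pearson) 95% interval for a proportion such as 103/149 could be computed. Other methods (e.g., the Wilson score interval) would give slightly different bounds, so small deviations from the published figures are expected.

```python
# Illustration only: exact (Clopper-Pearson) 95% confidence interval for a
# proportion, e.g. 103 of 149 articles disclosing funding information.
# The interval method actually used by the authors is not stated in the abstract.
from scipy.stats import beta

def clopper_pearson(successes: int, trials: int, alpha: float = 0.05):
    """Two-sided exact confidence interval for a binomial proportion."""
    lower = 0.0 if successes == 0 else beta.ppf(alpha / 2, successes, trials - successes + 1)
    upper = 1.0 if successes == trials else beta.ppf(1 - alpha / 2, successes + 1, trials - successes)
    return lower, upper

low, high = clopper_pearson(103, 149)
print(f"103/149 = {103 / 149:.1%}, 95% CI [{low:.1%}, {high:.1%}]")
```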

Although clinical trial identification numbers and funding details were often provided on PubMed, only two of the articles that lacked a full text in PubMed Central but discussed publicly available data at the full-text level also contained information related to data sharing on PubMed; none had a conflicts of interest statement on PubMed.
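As a purely illustrative sketch (not the authors' pipeline; the PMID below is a placeholder), PubMed-level metadata of this kind can be inspected programmatically through the NCBI E-utilities, for example by fetching a record and checking whether elements related to funding, databank accession numbers, or conflict of interest statements are present; which elements appear varies by record.

```python
# Illustrative sketch (not the authors' method): fetch one PubMed record via the
# NCBI E-utilities and check for metadata fields related to transparency
# indicators. The PMID below is a placeholder.
import urllib.request
import urllib.parse
import xml.etree.ElementTree as ET

EFETCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/efetch.fcgi"

def fetch_pubmed_record(pmid: str) -> ET.Element:
    """Download one PubMed record as XML and return the parsed root element."""
    params = urllib.parse.urlencode({"db": "pubmed", "id": pmid, "retmode": "xml"})
    with urllib.request.urlopen(f"{EFETCH}?{params}") as response:
        return ET.fromstring(response.read())

def transparency_flags(root: ET.Element) -> dict:
    """Report whether selected transparency-related elements are present."""
    return {
        "funding (GrantList)": root.find(".//GrantList") is not None,
        "databank accessions (DataBankList)": root.find(".//DataBankList") is not None,
        "conflicts of interest (CoiStatement)": root.find(".//CoiStatement") is not None,
    }

record = fetch_pubmed_record("12345678")  # placeholder PMID, for illustration only
print(transparency_flags(record))
```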

Our evaluation suggests that although there have been improvements over the last few years in certain key indicators of reproducibility and transparency, opportunities exist to improve reproducible research practices across the biomedical literature and to make features related to reproducibility more readily visible in PubMed.


DOI : https://doi.org/10.1371/journal.pbio.2006930

Dimensions of open research: critical reflections on openness in the ROER4D project

Authors : Thomas William King, Cheryl-Ann Hodgkinson-Williams, Michelle Willmers, Sukaina Walji

Open Research has the potential to advance the scientific process by improving the transparency, rigour, scope and reach of research, but choosing to experiment with Open Research carries with it a set of ideological, legal, technical and operational considerations.

Researchers, especially those in resource-constrained situations, may not be aware of the complex interrelations between these different domains of open practice, the additional resources required, or how Open Research can support traditional research practices.

Using the Research on Open Educational Resources for Development (ROER4D) project as an example, this paper attempts to demonstrate the interrelation between ideological, legal, technical and operational openness; the resources that conducting Open Research requires; and the benefits of an iterative, strategic approach to one’s own Open Research practice.

In this paper we discuss the value of a critical approach towards Open Research to ensure better coherence between ‘open’ ideology (embodied in strategic intention) and ‘open’ practice (the everyday operationalisation of open principles).


Alternative location : https://www.openpraxis.org/index.php/OpenPraxis/article/view/285