Journal Open Access and Plan S: Solving Problems or Shifting Burdens?

Authors : Shina Caroline Lynn Kamerlin, David J. Allen, Bas de Bruin, Etienne Derat, Henrik Urdal

This academic thought piece provides an overview of the history of, and current trends in, publishing practices in the scientific fields known to the authors (chemical sciences, social sciences and humanities), as well as a discussion of how open access mandates such as Plan S from cOAlition S will affect these practices.

It begins by summarizing the evolution of scientific publishing, in particular how it was shaped by the learned societies, and highlights how important quality assurance and scientific management mechanisms are being challenged by the recent introduction of ever more stringent open access mandates.

The authors then discuss the various reactions of the researcher community to the introduction of Plan S, and elucidate a number of concerns: that it will push researchers towards a pay‐to‐publish system which will inevitably create new divisions between those who can afford to get their research published and those who cannot; that it will disrupt collaboration between researchers on the different sides of cOAlition S funding; and that it will have an impact on academic freedom of research and publishing.

The authors analyse the dissemination of, and responses to, an open letter distributed and signed in reaction to the introduction of Plan S, before concluding with some thoughts on the potential for evolution of open access in scientific publishing.

URL : Journal Open Access and Plan S: Solving Problems or Shifting Burdens?

DOI : https://doi.org/10.1111/dech.12635

Survey study of research integrity officers’ perceptions of research practices associated with instances of research misconduct

Author : Michael Kalichman

Background

Research on research integrity has tended to focus on frequency of research misconduct and factors that might induce someone to commit research misconduct.

A definitive answer to the first question has been elusive, but it remains clear that any research misconduct is too much. Answers to the second question are so diverse that it might be productive to ask a different question: What about how research is done allows research misconduct to occur?

Methods

With that question in mind, research integrity officers (RIOs) of the 62 members of the Association of American Universities were invited to complete a brief survey about their most recent instance of a finding of research misconduct.

Respondents were asked whether one or more good practices of research (e.g., openness and transparency, keeping good research records) were present in their case of research misconduct.

Results

Twenty-four of the invited RIOs (a 39% response rate) indicated they had dealt with at least one finding of research misconduct and answered the survey questions. Over half of these RIOs reported that their case of research misconduct had occurred in an environment in which at least nine of the ten listed good practices of research were deficient.

Conclusions

These results are not evidence for a causal effect of poor practices, but it is arguable that committing research misconduct would be more difficult if not impossible in research environments adhering to good practices of research.

URL : Survey study of research integrity officers’ perceptions of research practices associated with instances of research misconduct

DOI : https://doi.org/10.1186/s41073-020-00103-1

Analytic reproducibility in articles receiving open data badges at the journal Psychological Science: an observational study

Authors : Tom E. Hardwicke, Manuel Bohn, Kyle MacDonald, Emily Hembacher, Michèle B. Nuijten, Benjamin N. Peloquin, Benjamin E. deMayo, Bria Long, Erica J. Yoon, Michael C. Frank

For any scientific report, repeating the original analyses upon the original data should yield the original outcomes. We evaluated analytic reproducibility in 25 Psychological Science articles awarded open data badges between 2014 and 2015.

Initially, 16 (64%, 95% confidence interval [43,81]) articles contained at least one ‘major numerical discrepancy’ (>10% difference) prompting us to request input from original authors.

Ultimately, target values were reproducible without author involvement for 9 (36% [20,59]) articles; reproducible with author involvement for 6 (24% [8,47]) articles; not fully reproducible with no substantive author response for 3 (12% [0,35]) articles; and not fully reproducible despite author involvement for 7 (28% [12,51]) articles.

Overall, 37 major numerical discrepancies remained out of 789 checked values (5% [3,6]), but original conclusions did not appear affected.
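The bracketed figures above are 95% confidence intervals for binomial proportions. The paper does not state which interval method was used; as an illustration only, a Wilson score interval (one common choice) for the headline 16-of-25 figure can be sketched in Python. Note that it yields [45%, 80%], close to but not identical to the reported [43, 81], since different interval methods can differ by a few percentage points at this sample size.

```python
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score interval for a binomial proportion (z=1.96 -> ~95%)."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# 16 of the 25 badged articles contained at least one major discrepancy
lo, hi = wilson_ci(16, 25)
print(f"64% [{lo:.0%}, {hi:.0%}]")  # prints: 64% [45%, 80%]
```

The Wilson interval is preferred over the naive normal approximation at small n because it never extends below 0 or above 1 and remains reasonable for extreme proportions.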

Non-reproducibility was primarily caused by unclear reporting of analytic procedures. These results highlight that open data alone is not sufficient to ensure analytic reproducibility.

URL : Analytic reproducibility in articles receiving open data badges at the journal Psychological Science: an observational study

DOI : https://doi.org/10.1098/rsos.201494

Improving Opportunities for New Value of Open Data: Assessing and Certifying Research Data Repositories

Author : Robert R. Downs

Investments in research that produce scientific and scholarly data can be leveraged by enabling the resulting research data products and services to be used by broader communities and for new purposes, extending reuse beyond the initial users and purposes for which the data were originally collected.

Submitting research data to a data repository offers opportunities for the data to be used in the future, providing ways for new benefits to be realized from data reuse. Improvements to data repositories that facilitate new uses of data increase the potential for data reuse and for gains in the value of open data products and services that are associated with such reuse.

Assessing and certifying the capabilities and services offered by data repositories provides opportunities for improving the repositories and for realizing the value to be attained from new uses of data.

The evolution of data repository certification instruments is described and discussed in terms of the implications for the curation and continuing use of research data.

URL : Improving Opportunities for New Value of Open Data: Assessing and Certifying Research Data Repositories

DOI : https://doi.org/10.5334/dsj-2021-001

Open Science and the Hype Cycle

Author : George Strawn

The introduction of a new technology or innovation is often accompanied by “ups and downs” in its fortunes. Gartner Inc. defined a so-called hype cycle to describe a general pattern that many innovations experience: technology trigger, peak of inflated expectations, trough of disillusionment, slope of enlightenment, and plateau of productivity.

This article will compare the ongoing introduction of Open Science (OS) with the hype cycle model and speculate on the relevance of that model to OS. Lest the title of this article mislead the reader, be assured that the author believes that OS should happen and that it will happen.

However, I also believe that the path to OS will be longer than many of us had hoped. I will give a brief history of today’s “semi-open” science, define what I mean by OS, define the hype cycle and where OS is now on that cycle, and finally speculate what it will take to traverse the cycle and rise to its plateau of productivity (as described by Gartner).

URL : Open Science and the Hype Cycle

DOI : https://doi.org/10.1162/dint_a_00081

The Rigor and Transparency Index Quality Metric for Assessing Biological and Medical Science Methods

Authors : Joe Menke, Martijn Roelandse, Burak Ozyurt, Maryann Martone, Anita Bandrowski

The reproducibility crisis is a multifaceted problem involving ingrained practices within the scientific community. Fortunately, some causes are addressed by authors’ adherence to rigor and reproducibility criteria, implemented via checklists at various journals.

We developed an automated tool (SciScore) that evaluates research articles based on their adherence to key rigor criteria, including NIH criteria and RRIDs, at an unprecedented scale. We show that despite steady improvements, fewer than half of the scoring criteria, such as blinding or power analysis, are routinely addressed by authors. Digging deeper, we examined the influence of specific checklists on average scores.

The average score for a journal in a given year was named the Rigor and Transparency Index (RTI), a new journal quality metric. We compared the RTI with the Journal Impact Factor and found there was no correlation. The RTI can potentially serve as a proxy for methodological quality.

URL : The Rigor and Transparency Index Quality Metric for Assessing Biological and Medical Science Methods

DOI : https://doi.org/10.1016/j.isci.2020.101698

Where Do Early Career Researchers Stand on Open Science Practices? A Survey Within the Max Planck Society

Authors : Daniel Toribio-Flórez, Lukas Anneser, Felipe Nathan deOliveira-Lopes, Martijn Pallandt, Isabell Tunn, Hendrik Windel

Open science (OS) is of paramount importance for the improvement of science worldwide and across research fields. Recent years have witnessed a transition toward open and transparent scientific practices, but there is still a long way to go.

Early career researchers (ECRs) are of crucial relevance in steering toward the standardization of OS practices, as they will become the decision makers in the institutional change that must accompany this transition. Thus, it is imperative to gain insight into where ECRs stand on OS practices.

Under this premise, the Open Science group of the Max Planck PhDnet designed and conducted an online survey to assess the stance toward OS practices of doctoral candidates from the Max Planck Society.

As one of the leading scientific institutions for basic research worldwide, the Max Planck Society provides a considerable population of researchers from multiple scientific fields, grouped into three sections: biomedical sciences; chemistry, physics and technology; and human and social sciences.

From an approximate total population of 5,100 doctoral candidates affiliated with the Max Planck Society, the survey collected responses from 568 doctoral candidates. The survey assessed self-reported knowledge, attitudes, and implementation of different OS practices, namely, open access publications, open data, preregistrations, registered reports, and replication studies.

ECRs seemed to hold a generally positive view toward these different practices and to be interested in learning more about them. Furthermore, we found that ECRs’ knowledge and positive attitudes predicted the extent to which they implemented these OS practices, although levels of implementation were rather low in the past. We observed differences and similarities between scientific sections.

We discuss these differences in terms of need and feasibility to apply these OS practices in specific scientific fields, but additionally in relation to the incentive systems that shape scientific communities. Lastly, we discuss the implications that these results can have for the training and career advancement of ECRs, and ultimately, for the consolidation of OS practices.

URL : Where Do Early Career Researchers Stand on Open Science Practices? A Survey Within the Max Planck Society

DOI : https://doi.org/10.3389/frma.2020.586992