Systematizing Confidence in Open Research and Evidence (SCORE)

Authors : Nazanin Alipourfard, Beatrix Arendt, Daniel M. Benjamin, Noam Benkler, Michael Bishop, Mark Burstein, Martin Bush, James Caverlee, Yiling Chen, Chae Clark, Anna Dreber Almenberg, Tim Errington, Fiona Fidler, Nicholas Fox, Aaron Frank, Hannah Fraser, Scott Friedman, Ben Gelman, James Gentile, C. Lee Giles, Michael B. Gordon, Reed Gordon-Sarney, Christopher Griffin, Timothy Gulden et al.

Assessing the credibility of research claims is a central, continuous, and laborious part of the scientific process. Credibility assessment strategies range from expert judgment to aggregating existing evidence to systematic replication efforts.

Such assessments can require substantial time and effort. Research progress could be accelerated if there were rapid, scalable, accurate credibility indicators to guide attention and resource allocation for further assessment.

The SCORE program is creating and validating algorithms to provide confidence scores for research claims at scale. To investigate the viability of scalable tools, teams are creating: a database of claims from papers in the social and behavioral sciences; expert and machine-generated estimates of credibility; and evidence of reproducibility, robustness, and replicability to validate the estimates.
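
To make the shape of such a database concrete, here is a minimal Python sketch of a single claim record combining an expert estimate, a machine-generated score, and a replication outcome used for validation. All field names, scales, and the scoring rule are our illustrative assumptions, not SCORE's actual schema.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ClaimRecord:
    """One research claim plus the evidence used to validate its credibility scores.

    Field names and scales are illustrative assumptions, not SCORE's actual schema.
    """
    claim_id: str                               # hypothetical identifier
    paper_doi: str                              # source paper (social/behavioral sciences)
    claim_text: str                             # the focal claim extracted from the paper
    expert_score: Optional[float] = None        # elicited replication probability, 0..1
    machine_score: Optional[float] = None       # algorithmic confidence score, 0..1
    replication_outcome: Optional[bool] = None  # did a replication support the claim?

def brier_error(score: float, outcome: bool) -> float:
    """Squared error of a confidence score against a binary replication outcome --
    one standard way such estimates could be validated."""
    return (score - float(outcome)) ** 2

record = ClaimRecord("c-001", "10.0000/example", "Effect X replicates in population Y",
                     expert_score=0.62, machine_score=0.58, replication_outcome=True)
print(brier_error(record.machine_score, record.replication_outcome))  # 0.1764
```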

Beyond the primary research objective, the data and artifacts generated from this program will be openly shared and provide an unprecedented opportunity to examine research credibility and evidence.

URL : Systematizing Confidence in Open Research and Evidence (SCORE)

DOI : https://doi.org/10.31235/osf.io/46mnb

Questionable and open research practices: attitudes and perceptions among quantitative communication researchers

Authors : Bert Bakker, Kokil Jaidka, Timothy Dörr, Neil Fasching, Yphtach Lelkes

Recent contributions have questioned the credibility of quantitative communication research. While questionable research practices (QRPs) are believed to be widespread, evidence for this claim is primarily derived from other disciplines.

Before change in communication research can happen, it is important to document the extent to which QRPs are used and whether researchers are open to the changes proposed by the so-called open science agenda.

We conducted a large survey among authors of papers published in the top-20 journals in communication science in the last ten years (N=1039). A non-trivial percentage of researchers report using one or more QRPs. While QRPs are generally considered unacceptable, researchers perceive them to be common among their colleagues.

At the same time, we find optimism about the use of open science practices in communication research. We end with a series of recommendations outlining what journals, institutions and researchers can do moving forward.

DOI : https://doi.org/10.31234/osf.io/7uyn5

ODDPub – a Text-Mining Algorithm to Detect Data Sharing in Biomedical Publications

Authors : Nico Riedel, Miriam Kip, Evgeny Bobrov

Open research data are increasingly recognized as a quality indicator and an important resource to increase transparency, robustness and collaboration in science. However, no standardized way of reporting Open Data in publications exists, making it difficult to find shared datasets and assess the prevalence of Open Data in an automated fashion.

We developed ODDPub (Open Data Detection in Publications), a text-mining algorithm that screens biomedical publications and detects cases of Open Data. Using English-language original research publications from a single biomedical research institution (n = 8689) and publications randomly selected from PubMed (n = 1500), we iteratively developed a set of derived keyword categories.

ODDPub can detect data sharing through field-specific repositories, general-purpose repositories or the supplement. Additionally, it can detect shared analysis code (Open Code).
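
ODDPub itself is implemented in R, and its actual keyword categories are detailed in the paper; the Python sketch below only illustrates the general mechanism of combining derived keyword categories into a detection rule. The category names and regular expressions are invented stand-ins, not ODDPub's real patterns.

```python
import re

# Illustrative keyword categories -- invented stand-ins, not ODDPub's actual rules.
KEYWORD_CATEGORIES = {
    "field_specific_repository": r"\b(GEO|GenBank|ArrayExpress|PRIDE)\b",
    "general_purpose_repository": r"\b(Zenodo|Dryad|figshare|OSF)\b",
    "availability_phrase": r"(data (are|is) available|available (at|from|in))",
    "supplement": r"supplementary (data|material|file)",
}

def detect_open_data(text: str) -> dict:
    """Flag a publication as likely Open Data if an availability phrase
    co-occurs with a repository mention or a supplement reference."""
    hits = {name: bool(re.search(pattern, text, flags=re.IGNORECASE))
            for name, pattern in KEYWORD_CATEGORIES.items()}
    open_data = hits["availability_phrase"] and (
        hits["field_specific_repository"]
        or hits["general_purpose_repository"]
        or hits["supplement"]
    )
    return {"open_data": open_data, "category_hits": hits}

# Example: a typical data-availability statement.
statement = "The RNA-seq data are available in GEO under accession GSE00000."
print(detect_open_data(statement))  # open_data: True
```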

To validate ODDPub, we manually screened 792 publications randomly selected from PubMed. On this validation dataset, our algorithm detected Open Data publications with a sensitivity of 0.73 and specificity of 0.97.

Open Data was detected for 11.5% (n = 91) of publications. Open Code was detected for 1.4% (n = 11) of publications with a sensitivity of 0.73 and specificity of 1.00. We compared our results to the linked datasets found in the databases PubMed and Web of Science.
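
For reference, sensitivity is TP / (TP + FN) and specificity is TN / (TN + FP). The snippet below checks the reported rates against a hypothetical confusion matrix; the cell counts are our reconstruction, chosen only to be consistent with the published figures (792 publications, 91 detections), not numbers taken from the paper.

```python
def sensitivity(tp: int, fn: int) -> float:
    """Share of truly Open Data publications that the algorithm flags: TP / (TP + FN)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Share of non-Open-Data publications correctly passed over: TN / (TN + FP)."""
    return tn / (tn + fp)

# Hypothetical confusion matrix for the 792-publication validation set, chosen to
# match the reported figures; the paper's actual cell counts may differ.
tp, fn, fp, tn = 70, 26, 21, 675  # 70 + 21 = 91 detections; totals sum to 792
print(f"sensitivity: {sensitivity(tp, fn):.2f}")  # 0.73
print(f"specificity: {specificity(tn, fp):.2f}")  # 0.97
```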

Our algorithm can automatically screen large numbers of publications for Open Data. It can thus be used to assess Open Data sharing rates on the level of subject areas, journals, or institutions. It can also identify individual Open Data publications in a larger publication corpus. ODDPub is published as an R package on GitHub.

URL : ODDPub – a Text-Mining Algorithm to Detect Data Sharing in Biomedical Publications

DOI : https://doi.org/10.5334/dsj-2020-042

What senior academics can do to support reproducible and open research: a short, three-step guide

Authors : Olivia Kowalczyk, Alexandra Lautarescu, Elisabet Blok, Lorenza Dall’Aglio, Samuel Westwood

Increasingly, policies are being introduced to reward and recognise open research practices, while the adoption of such practices into research routines is being facilitated by many grassroots initiatives.

However, despite this widespread endorsement and support, open research has yet to be widely adopted, with early career researchers being the notable exception. For open research to become the norm, initiatives should engage academics from all career stages, particularly senior academics (namely senior lecturers, readers, and professors), given their routine involvement in determining the quality of research.

Senior academics, however, face unique challenges in implementing policy change and supporting grassroots initiatives. Given that – like all researchers – senior academics are in part motivated by self-interest, this paper lays out three feasible steps that senior academics can take to improve the quality and productivity of their research, steps that also serve to engender open research.

These steps are to a) change hiring criteria, b) change how scholarly outputs are credited, and c) change funding and publishing practices toward open research. The guidance we provide is accompanied by live, crowd-sourced material for further reading.

URL : What senior academics can do to support reproducible and open research: a short, three-step guide

Original location : https://psyarxiv.com/jyfr7

Ten Hot Topics around Scholarly Publishing

Authors : Jonathan P. Tennant, Harry Crane, Tom Crick, Jacinto Davila, Asura Enkhbayar, Johanna Havemann, Bianca Kramer, Ryan Martin, Paola Masuzzo, Andy Nobes, Curt Rice, Bárbara Rivera-López, Tony Ross-Hellauer, Susanne Sattler, Paul D. Thacker, Marc Vanholsbeeck

The changing world of scholarly communication and the emerging new wave of ‘Open Science’ or ‘Open Research’ has brought to light a number of controversial and hotly debated topics.

Evidence-based rational debate is regularly drowned out by misinformed or exaggerated rhetoric, which does not benefit the evolving system of scholarly communication.

This article aims to provide a baseline evidence framework for ten of the most contested topics, in order to help frame and move forward discussions, practices, and policies.

We address issues around preprints and scooping, the practice of copyright transfer, the function of peer review, predatory publishers, and the legitimacy of ‘global’ databases.

These arguments and data will be a powerful tool against misinformation across wider academic research, policy and practice, and will inform changes within the rapidly evolving scholarly publishing system.

URL : Ten Hot Topics around Scholarly Publishing

DOI : https://doi.org/10.3390/publications7020034

Implementing publisher policies that inform, support and encourage authors to share data: two case studies

Authors : Leila Jones, Rebecca Grant, Iain Hrynaszkiewicz

Open research data is one of the key areas in the expanding open scholarship movement. Scholarly journals and publishers find themselves at the heart of the shift towards openness, with recent years seeing an increase in the number of scholarly journals with data-sharing policies aiming to increase transparency and reproducibility of research.

In this article we present two case studies which examine the experiences that two leading academic publishers, Taylor & Francis and Springer Nature, have had in rolling out data-sharing policies.

We illustrate some of the considerations involved in providing consistent policies across journals of many disciplines, reflecting on successes and challenges.

URL : Implementing publisher policies that inform, support and encourage authors to share data: two case studies

DOI : https://doi.org/10.1629/uksg.463

Ten myths around open scholarly publishing

Authors : Jonathan P. Tennant, Harry Crane, Tom Crick, Jacinto Davila, Asura Enkhbayar, Johanna Havemann, Bianca Kramer, Ryan Martin, Paola Masuzzo, Andy Nobes, Curt Rice, Bárbara R. López, Tony Ross-Hellauer, Susanne Sattler, Paul Thacker, Marc Vanholsbeeck

The changing world of scholarly communication and the emergence of ‘Open Science’ or ‘Open Research’ has brought to light a number of controversial and hotly debated topics.

Yet, evidence-based rational debate is regularly drowned out by misinformed or exaggerated rhetoric, which does not benefit the evolving system of scholarly communication.

The aim of this article is to provide a baseline evidence framework for ten of the most contested topics, in order to help frame and move forward discussions, practices and policies. We address preprints and scooping, the practice of copyright transfer, the function of peer review, and the legitimacy of ‘global’ databases.

The presented facts and data will be a powerful tool against misinformation across wider academic research, policy and practice, and may be used to inform changes within the rapidly evolving scholarly publishing system.

URL : Ten myths around open scholarly publishing

DOI : https://doi.org/10.7287/peerj.preprints.27580v1