Exploring enablers and barriers to implementing the Transparency and Openness Promotion Guidelines: a theory-based survey of journal editors

Authors : Kevin Naaman, Sean Grant, Sina Kianersi, Lauren Supplee, Beate Henschel, Evan Mayo-Wilson

The Transparency and Openness Promotion (TOP) Guidelines provide a framework to help journals develop open science policies. Theories of behaviour change can guide understanding of why journals do (not) implement open science policies and the development of interventions to improve these policies.

In this study, we used the Theoretical Domains Framework to survey 88 journal editors on their capability, opportunity and motivation to implement TOP. Likert-scale questions assessed editor support for TOP, and enablers and barriers to implementing TOP.

A qualitative question asked editors to provide reflections on their ratings. Most participating editors supported adopting TOP at their journal (71%) and perceived other editors in their discipline to support adopting TOP (57%). Most editors (93%) agreed their roles include maintaining policies that reflect current best practices.

However, most editors (74%) did not see implementing TOP as a high priority compared with other editorial responsibilities. Qualitative responses described structural barriers to implementing TOP (e.g. lack of time, resources and authority to implement changes) and varying support for TOP depending on study type, open science standard, and level of implementation.

We discuss how these findings could inform the development of theoretically guided interventions to increase open science policies, procedures and practices.

URL : Exploring enablers and barriers to implementing the Transparency and Openness Promotion Guidelines: a theory-based survey of journal editors

DOI : https://doi.org/10.1098/rsos.221093

Are papers published in predatory journals worthless? A geopolitical dimension revealed by content-based analysis of citations

Authors : Zehra Taşkın, Franciszek Krawczyk, Emanuel Kulczycki

This study uses content-based citation analysis to move beyond the simplified classification of predatory journals. We show that analyzing papers not only in terms of the number of citations they receive but also in terms of the content of those citations reveals the various roles played by papers published in journals accused of being predatory.

To accomplish this, we analyzed the content of 9,995 citances (i.e., citation sentences) from 6,706 papers indexed in the Web of Science Core Collection that cite papers published in so-called “predatory” (or questionable) journals. The analysis revealed that the vast majority of these citances are neutral (97.3%) and that negative citations of articles published in the analyzed journals are almost nonexistent (0.8%).

Moreover, the analysis revealed that the most frequently mentioned countries in the citances are India, Pakistan, and Iran, with mentions of Western countries being rare. This highlights a geopolitical bias and shows the usefulness of looking at such journals as mislocated centers of scholarly communication.

The analyzed journals provide regional data relevant to mainstream scholarly discussions, and the idea of predatory publishing obscures geopolitical inequalities in global scholarly publishing. Our findings also contribute to the further development of content-based citation analysis.

URL : Are papers published in predatory journals worthless? A geopolitical dimension revealed by content-based analysis of citations

DOI : https://doi.org/10.1162/qss_a_00242

Interdisciplinary Analysis of Science Communication on Social Media during the COVID-19 Crisis

Authors : Thomas Mandl, Sylvia Jaki, Hannah Mitera, Franziska Schmidt

In times of crisis, science communication needs to be accessible and convincing. In order to understand whether these two criteria apply to concrete science communication formats, it is not enough to merely study the communication product. Instead, the recipient’s perspective also needs to be taken into account.

What do recipients value in popular science communication formats concerning COVID-19? What do they criticize? What elements in the formats do they pay attention to? These questions can be answered by reception studies, for example, by analyzing the reactions and comments of social media users.

This is particularly relevant since scientific information was increasingly disseminated over social media channels during the COVID-19 crisis. This interdisciplinary study, therefore, focuses both on science communication strategies in media formats and the related comments on social media.

First, we selected science communication channels on YouTube and performed a qualitative multi-modal analysis. Second, the comments responding to science communication content online were analyzed by identifying Twitter users who are doctors, researchers, science communicators or representatives of research institutes, and then performing topic modeling on the textual data.
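As a rough illustration of what such a topic-modeling step can look like (this is not the authors' pipeline: the example tweets, stop-word list and parameters below are invented, and the real corpus was German-language), a minimal sketch in Python with scikit-learn:

```python
# Minimal, illustrative topic-modeling sketch on a handful of invented
# reply texts; not the study's actual pipeline or data.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

tweets = [
    "Great explanation of how mRNA vaccines work",
    "Thanks for being transparent about the uncertainty in the data",
    "The video was too long but the graphics helped a lot",
]

# Bag-of-words representation; English stop words are used only because
# the toy examples are in English.
vectorizer = CountVectorizer(stop_words="english", max_features=1000)
X = vectorizer.fit_transform(tweets)

# Fit an LDA model with a small, arbitrary number of topics.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(X)

# Print the top words per topic.
terms = vectorizer.get_feature_names_out()
for idx, topic in enumerate(lda.components_):
    top_words = [terms[i] for i in topic.argsort()[-5:][::-1]]
    print(f"Topic {idx}: {', '.join(top_words)}")
```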

The main goal was to find topics directly related to science communication strategies. The qualitative video analysis revealed, for example, a range of strategies for making content accessible and for maintaining transparency about scientific uncertainties.

The quantitative Twitter analysis showed that few tweets commented on aspects of the communication strategies; these tweets were mainly positive, whereas sentiment in the overall collection was less positive.

We downloaded and processed replies for 20 months, starting at the beginning of the pandemic, which resulted in a collection of approximately one million tweets from the German science communication landscape.

URL : Interdisciplinary Analysis of Science Communication on Social Media during the COVID-19 Crisis

DOI : https://doi.org/10.3390/knowledge3010008

Researchers and their data: A study based on the use of the word data in scholarly articles

Authors : Frédérique Bordignon, Marion Maisonobe

Data is one of the most used terms in scientific vocabulary. This article focuses on the relationship between data and research by analyzing the contexts of occurrence of the word data in a corpus of 72,471 research articles (1980–2012) from two distinct fields (Social sciences, Physical sciences).

The aim is to shed light on the issues raised by research on data, namely the difficulty of defining what is considered as data, the transformations that data undergo during the research process, and how they gain value for researchers who hold them.

Relying on the distribution of occurrences throughout the texts and over time, the analysis demonstrates that the word data mostly occurs at the beginning and end of research articles. The adjectives and verbs accompanying the noun data turn out to be even more important than the word itself in specifying what counts as data.

The increase in the use of possessive pronouns at the end of the articles reveals that authors tend to claim ownership of their data at the very end of the research process. Our research demonstrates that even if data-handling operations are increasingly frequent, they are still described with imprecise verbs that do not reflect the complexity of these transformations.
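As a rough illustration of the positional analysis described above (not the authors' implementation; the example text and tokenization are deliberately simplistic), the relative position of each occurrence of the word data in a text could be computed as follows:

```python
# Toy sketch: relative position (0 = start, 1 = end) of each occurrence of
# the word "data" in a text. Illustrative only; the study's corpus
# processing is more involved.
import re

def data_positions(text: str) -> list[float]:
    tokens = re.findall(r"\w+", text.lower())
    if not tokens:
        return []
    n = len(tokens)
    return [i / (n - 1) if n > 1 else 0.0
            for i, tok in enumerate(tokens) if tok == "data"]

article = ("We collected data from two fields. "
           "The results are discussed below. "
           "Our data are available on request.")
print(data_positions(article))  # [0.125, 0.75] -> one early, one late occurrence
```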

URL : Researchers and their data: A study based on the use of the word data in scholarly articles

DOI : https://doi.org/10.1162/qss_a_00220

How do journals deal with problematic articles? Editorial response of journals to articles commented in PubPeer

Authors : José-Luis Ortega, Lorena Delgado-Quirós

The aim of this article is to explore the editorial response of journals to research articles that may contain methodological errors or misconduct. A total of 17,244 articles commented on in PubPeer, a post-publication peer review site, were processed and classified according to several error and fraud categories.

Then, the editorial responses (i.e., editorial notices) to these papers were retrieved from PubPeer, Retraction Watch, and PubMed to obtain the most comprehensive picture. The results show that only 21.5% of the articles that deserve an editorial notice (i.e., those flagged for honest errors, methodological flaws, publishing fraud, or manipulation) were corrected by the journal. This percentage rises to 34% for 2019 publications.
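Purely to illustrate the kind of record linkage such a comparison involves (the DOIs, categories and column names below are hypothetical, not the study's data), matching commented articles to editorial notices could be sketched in Python as:

```python
# Hypothetical sketch of linking PubPeer-commented articles to editorial
# notices by DOI and computing the share that received a notice.
import pandas as pd

commented = pd.DataFrame({
    "doi": ["10.1000/a", "10.1000/b", "10.1000/c"],
    "category": ["image manipulation", "honest error", "plagiarism"],
})
notices = pd.DataFrame({
    "doi": ["10.1000/a"],
    "notice_type": ["retraction"],
})

# Left-join commented articles with any editorial notice found for them.
merged = commented.merge(notices, on="doi", how="left")
share_corrected = merged["notice_type"].notna().mean()
print(f"Share of commented articles with an editorial notice: {share_corrected:.1%}")
```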

This response differs between journals but cuts across all disciplines. Another interesting result is that high-impact journals suffer more from image manipulation, while plagiarism is more frequent in low-impact journals.

The study concludes that journals need to improve their response to problematic articles.

URL : How do journals deal with problematic articles? Editorial response of journals to articles commented in PubPeer

DOI : https://doi.org/10.3145/epi.2023.ene.18

The Issues with Journal Issues: Let Journals Be Digital Libraries

Author : C. Sean Burns

Science depends on a communication system, and today that system is largely provided by digital technologies such as the internet and the web. Although digital technologies provide the infrastructure for this communication system, peer-reviewed journals continue to mimic workflows and processes from the print era.

This paper focuses on one artifact from the print era, the journal issue, and describes how this artifact has been detrimental to the communication of science, and therefore, to science itself.

To replace the journal issue, this paper argues that scholarly publishing and journals could more fully embrace digital technologies by creating digital libraries to present and organize scholarly output.

URL : The Issues with Journal Issues: Let Journals Be Digital Libraries

DOI : https://doi.org/10.3390/publications11010007

Analysis of U.S. Federal Funding Agency Data Sharing Policies 2020: Highlights and Key Observations

Authors : Reid I. Boehm, Hannah Calkins, Patricia B. Condon, Jonathan Petters, Rachel Woodbrook

Federal funding agencies in the United States (U.S.) continue to work towards implementing their plans to increase public access to funded research and comply with the 2013 Office of Science and Technology Policy memo Increasing Access to the Results of Federally Funded Scientific Research.

In this article we report on an analysis of research data sharing policy documents from 17 U.S. federal funding agencies as of February 2021. Our analysis is guided by two questions: (1) What do the findings suggest about the current state of and trends in U.S. federal funding agency data sharing requirements? (2) In what ways are universities, institutions, associations, and researchers affected by and responding to these policies?

Over the past five years, policy updates were common among these agencies and several themes have been thoroughly developed in that time; however, uncertainty remains around how funded researchers are expected to satisfy these policy requirements.

URL : Analysis of U.S. Federal Funding Agency Data Sharing Policies 2020: Highlights and Key Observations

DOI : https://doi.org/10.2218/ijdc.v17i1.791