Reproducibility in Management Science

Authors : Miloš Fišar, Ben Greiner, Christoph Huber, Elena Katok, Ali I. Ozkes

With the help of more than 700 reviewers, we assess the reproducibility of nearly 500 articles published in the journal Management Science before and after the introduction of a new Data and Code Disclosure policy in 2019.

When considering only articles for which data accessibility and hardware and software requirements were not obstacles for reviewers, the results of more than 95% of articles under the new disclosure policy could be fully or largely computationally reproduced. However, for 29% of articles, at least part of the data set was not accessible to the reviewer. Considering all articles in our sample reduces the share of reproduced articles to 68%.

These figures represent a significant increase compared with the period before the introduction of the disclosure policy, when only 12% of articles voluntarily provided replication materials, of which 55% could be (largely) reproduced. Substantial heterogeneity in reproducibility rates across different fields is mainly driven by differences in data set accessibility.

Other reasons for unsuccessful reproduction attempts include missing code, unresolvable code errors, weak or missing documentation, software and hardware requirements, and code complexity. Our findings highlight the importance of journal code and data disclosure policies and suggest potential avenues for enhancing their effectiveness.

DOI : https://doi.org/10.1287/mnsc.2023.03556

The Future of Data in Research Publishing: From Nice to Have to Need to Have?

Authors : Christine L. Borgman, Amy Brand

Science policy promotes open access to research data for purposes of transparency and reuse of data in the public interest. We expect demands for open data in scholarly publishing to accelerate, at least partly in response to the opacity of artificial intelligence algorithms.

Open data should be findable, accessible, interoperable, and reusable (FAIR), and also trustworthy and verifiable. The current state of open data in scholarly publishing is in transition from ‘nice to have’ to ‘need to have.’

Research data are valuable, interpretable, and verifiable only in the context of their origin, and with sufficient infrastructure to facilitate reuse. Making research data useful is expensive; benefits and costs are distributed unevenly.

Open data also pose risks around provenance, intellectual property, misuse, and misappropriation in an era of trolls and hallucinating AI algorithms. Scholars and scholarly publishers must make evidentiary data more widely available to promote public trust in research.

To make research processes more trustworthy, transparent, and verifiable, stakeholders need to make greater investments in data stewardship and knowledge infrastructures.

DOI : https://doi.org/10.1162/99608f92.b73aae77

Applying Librarian-Created Evaluation Tools to Determine Quality and Credibility of Open Access Library Science Journals

Authors : Maggie Albro, Jessica L. Serrao, Christopher D. Vidas, Jenessa M. McElfresh, K. Megan Sheffield, Megan Palmer

This article explores the application of journal quality and credibility evaluation tools to library science publications. The researchers investigate quality and credibility attributes of forty-eight peer-reviewed library science journals with open access components using two evaluative tools developed and published by librarians.

The results identify common positive and negative attributes of library science journals, compare the results of the two evaluation tools, and discuss their ease of use and limitations. Overall, the results show that while library science journals do not exhibit the same concerning characteristics that librarians warn other researchers about, there are several areas in which publishers can improve the quality and credibility of their journals.

URL : https://preprint.press.jhu.edu/portal/sites/default/files/06_24.1albro.pdf

Gender differences in submission behavior exacerbate publication disparities in elite journals

Authors : Isabel Basson, Chaoqun Ni, Giovanna Badia, Nathalie Tufenkji, Cassidy R. Sugimoto, Vincent Larivière

Women are particularly underrepresented in journals of the highest scientific impact, with substantial consequences for their careers. While a large body of research has focused on the process and outcomes of peer review, fewer articles have explicitly examined gendered differences in submission behavior and the explanations for them.

In our study of nearly five thousand active authors, we find that women are less likely than men to report having submitted papers and, when they have, report submitting fewer manuscripts on average. Women were more likely to indicate that they did not submit their papers (in general, and their subsequently most cited papers) to Science, Nature, or PNAS because they were advised not to.

In the aggregate, no statistically significant difference was observed between men and women in how they rated the quality of their work. Nevertheless, regardless of discipline, women were more likely than men to indicate that their “work was not ground-breaking or sufficiently novel” as a rationale for not submitting to one of the listed prestigious journals. Men were more likely than women to indicate that the “work would fit better in a more specialized journal.”

We discuss the implications of these findings and interventions that can serve to mitigate the disparities caused by gendered differences in submission behavior.

DOI : https://doi.org/10.1101/2023.08.21.554192

The Role of Academic Libraries in Scientific Production Evaluation – the Experience of University of Zagreb, Croatia

Authors : Branka Marijanović, Tatijana Petrić, Zrinka Udiljak Bugarinovski, Višnja Novosel

Since internationally visible scientific productivity is a criterion in the state evaluation of Croatian academic and scientific institutions and their scientists, Croatian academic libraries play a key role in the quantitative evaluation of scientific productivity, using methods such as bibliometrics and scientometrics.

The aim of this case study is to identify and illustrate the current state of library services for evaluating scientific production at the University of Zagreb, Croatia, and to make recommendations for the further development of such services. These recommendations could serve as a framework for the systematic implementation of this type of service in all libraries at the University of Zagreb and beyond.

More specifically, the purpose of this paper was to identify the existence of bibliometric services in the libraries of the University of Zagreb (UNIZG), to examine the status and involvement of university librarians in academic advancement procedures, and to identify the competences required of bibliometric experts in Croatia.

The research was conducted using content analysis, a survey, and a focus group. The results show that although UNIZG libraries are integrated into the system of academic promotion and their role is enshrined in Croatian regulations, the bibliometric service is not standardised at the University level.

The results also indicate that the service needs to be strengthened through the training of professional staff and greater investment in staff capacity and infrastructure.

The fact that the study was conducted at a single Croatian university is a possible limitation, as it may restrict the applicability of the guidelines for further action and the development of bibliometric services at the national level. It would therefore be desirable for future research to examine the situation at other Croatian universities as well.

Further research would also be needed to determine the open science and open access policies at UNIZG and, in this context, to establish guidelines for possible improvements in the processes of evaluating scientific productivity.

The results of this study make an important contribution to the possible future positioning of university libraries and UNIZG librarians in the process of evaluating scientific productivity. In addition, practical advice is offered so that this case study can serve as an introductory overview of the topic for the wider academic community.

DOI : https://doi.org/10.53377/lq.13523

Who Are Tweeting About Academic Publications? A Cochrane Systematic Review and Meta-Analysis of Altmetric Studies

Authors : Ashraf Maleki, Kim Holmberg

Previous studies have developed different categorizations of Twitter users who interact with scientific publications online, reflecting the difficulty of creating a unified approach. Using a Cochrane Review meta-analysis of earlier research (covering 79,014 Twitter users, over twenty million tweets, and over five million tweeted publications from 23 studies), we created a consolidated, robust categorization of 11 user categories across different dimensions, intended to cover most future needs for user categorization on Twitter and possibly on other social media platforms as well.

Across all of the different approaches employed in earlier studies, our findings showed with moderate certainty that the predominant Twitter group was individual users (66%), who were responsible for the majority of tweets (55%) and tweeted publications (50%), while organizations (22%, 27%, and 28%, respectively) and science communicators (16%, 13%, and 30%) contributed clearly smaller proportions.

The cumulative findings from prior investigations indicated statistically indistinguishable shares of academic individuals (33%) and other individuals (28%). Academic individuals shared more academic publications than other individuals (42% vs. 31%) but posted fewer tweets overall (22% vs. 30%), although these differences did not reach statistical significance.

Despite significant heterogeneity arising from variations in categorization methods, the findings consistently indicate the importance of academics in disseminating academic publications.

URL : https://arxiv.org/abs/2312.06399