Exploring the merits of research performance measures that comply with the San Francisco Declaration on Research Assessment and strategies to overcome barriers of adoption: qualitative interviews with administrators and researchers

Authors : Himani Boury, Mathieu Albert, Robert H. C. Chen, James C. L. Chow, Ralph DaCosta, Michael M. Hoffman, Behrang Keshavarz, Pia Kontos, Mary Pat McAndrews, Stephanie Protze, Anna R. Gagliardi

Background

In prior research, we identified and prioritized ten measures to assess research performance that comply with the San Francisco Declaration on Research Assessment (DORA), a declaration endorsed worldwide that discourages assessment based on journal metrics.

Given the shift away from assessment based on Journal Impact Factor, we explored potential barriers to implementing and adopting the prioritized measures.

Methods

We identified administrators and researchers across six research institutes, conducted telephone interviews with consenting participants, and used qualitative description and inductive content analysis to derive themes.

Results

We interviewed 18 participants: 6 administrators (research institute business managers and directors) and 12 researchers (7 on appointment committees) who varied by career stage (2 early, 5 mid, 5 late). Participants appreciated that the measures were similar to those currently in use, comprehensive, relevant across disciplines, and generated using a rigorous process.

They also said the reporting template was easy to understand and use. In contrast, a few administrators thought the measures were not relevant across disciplines. A few participants said it would be time-consuming and difficult to prepare narratives when reporting the measures, and several thought that it would be difficult to objectively evaluate researchers from a different discipline without considerable effort to read their work.

Strategies viewed as necessary to overcome barriers and support implementation of the measures included high-level endorsement of the measures, an official launch accompanied by a multi-pronged communication strategy, training for both researchers and evaluators, administrative support or automated reporting for researchers, guidance for evaluators, and sharing of approaches across research institutes.

Conclusions

While participants identified many strengths of the measures, they also identified a few limitations and offered corresponding strategies to address those barriers, which we will apply at our organization. Ongoing work is needed to develop a framework to help evaluators translate the measures into an overall assessment.

Given the scarcity of prior research identifying research assessment measures and strategies to support their adoption, this research may be of interest to other organizations that assess the quality and impact of research.

URL : Exploring the merits of research performance measures that comply with the San Francisco Declaration on Research Assessment and strategies to overcome barriers of adoption: qualitative interviews with administrators and researchers

DOI : https://doi.org/10.1186/s12961-023-01001-w

The Platformisation of Scholarly Information and How to Fight It

Author : Lai Ma

The commercial control of academic publishing and research infrastructure by a few oligopolistic companies has crippled the development of the open access movement and interfered with the ethical principles of information access and privacy.

In recent years, vertical integration of publishers and other service providers throughout the research cycle has led to platformisation, characterized by datafication and commodification similar to practices on social media platforms. Scholarly publications are treated as user-generated content for data tracking and surveillance, resulting in profitable data products and services for research assessment, benchmarking and reporting.

Meanwhile, bibliodiversity and equal open access are denied by the dominant gold open access model, and the privacy of researchers is compromised by spyware embedded in research infrastructure.

After a brief overview of the market for academic journals and research assessment, and of its implications for bibliodiversity, information access, and privacy, this article proposes four actions to fight the platformisation of scholarly information: (1) educate researchers about commercial publishers and article processing charges (APCs); (2) allocate library budgets to support scholar-led and library publishing; (3) engage in the development of public research infrastructures and copyright reform; and (4) advocate for research assessment reforms.

URL : The Platformisation of Scholarly Information and How to Fight It

DOI : https://doi.org/10.53377/lq.13561

« Les brevets sont à peine au rang d’une publication » : Projets de valorisation et cycle de crédibilité au CNRS ["Patents barely rank as publications": valorisation projects and the cycle of credibility at the CNRS]

Author : Victoria Brun

This article examines the place that valorisation (research commercialisation) activities occupy in the careers of public research staff, and the extent to which they work, or do not work, to internalise these activities within the academic cycle of credibility (Latour & Woolgar, 1979).

Drawing on a study of valorisation projects linked to the CNRS, the analysis shows that valorisation activities are conceived jointly with academic activities. While researchers most often fail to convert them into recognition without a detour through publication, they can reinject this investment, in the form of funding and equipment, into other work.

Others choose to externalise these activities, making valorisation a sideline to their careers. Doctoral students and engineers, even though they help feed researchers' cycle of credibility, pursue parallel professional paths. Finally, involvement in valorisation projects exposes researchers to risks of losing credibility, which they defuse by defending a conception of scientific disinterestedness compatible with an applied perspective.

The economy of credibility is therefore transformed only at the margins, despite the many incentive schemes put in place by research institutions.

DOI : https://doi.org/10.4000/rac.30214

Contours of a research ethics and integrity perspective on open science

Authors : Tom Lindemann, Lisa Häberlein

This article argues that adopting a research ethics and integrity perspective could support researchers in operationalizing the open science guiding principle “as open as possible, as closed as necessary” in a responsible and context-sensitive manner.

To that end, the article points out why the guiding principle as such provides only limited action-guidance and outlines the practical value of ethical reflection when it comes to translating open science into responsible research practice.

The article illustrates how research ethics and integrity considerations may help researchers understand the ethical rationale underpinning open science as well as recognize that limiting openness is necessary or at least normatively permissible in some situations.

Finally, the article briefly discusses possible consequences of integrating open science into a responsibility-centered framework and implications for research assessment.

URL : Contours of a research ethics and integrity perspective on open science

DOI : https://doi.org/10.3389/frma.2023.1052353

Rhetorical Features and Functions of Data References in Academic Articles

Authors : Sara Lafia, Andrea Thomer, Elizabeth Moss, David Bleckley, Libby Hemphill

Data reuse is a common practice in the social sciences. While published data play an essential role in the production of social science research, they are not consistently cited, which makes it difficult to assess their full scholarly impact and give credit to the original data producers.

Furthermore, it can be challenging to understand researchers’ motivations for referencing data. Like references to academic literature, data references perform various rhetorical functions, such as paying homage, signaling disagreement, or drawing comparisons. This paper studies how and why researchers reference social science data in their academic writing.

We develop a typology to model relationships between the entities that anchor data references, along with their features (access, actions, locations, styles, types) and functions (critique, describe, illustrate, interact, legitimize). We illustrate the use of the typology by coding multidisciplinary research articles (n = 30) referencing social science data archived at the Inter-university Consortium for Political and Social Research (ICPSR).

We show how our typology captures researchers’ interactions with data and purposes for referencing data. Our typology provides a systematic way to document and analyze researchers’ narratives about data use, extending our ability to give credit to data that support research.
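The typology is essentially a small coding scheme, so its structure is easy to make concrete in code. The Python sketch below is purely illustrative: only the feature and function categories come from the abstract above, and every class, field, and identifier name is a hypothetical stand-in, not the authors' actual instrument.

```python
# Illustrative encoding of the data-reference typology described above.
# Only the category values (features and functions) come from the abstract;
# all names here are hypothetical.
from dataclasses import dataclass, field
from enum import Enum


class Feature(Enum):
    ACCESS = "access"
    ACTIONS = "actions"
    LOCATIONS = "locations"
    STYLES = "styles"
    TYPES = "types"


class Function(Enum):
    CRITIQUE = "critique"
    DESCRIBE = "describe"
    ILLUSTRATE = "illustrate"
    INTERACT = "interact"
    LEGITIMIZE = "legitimize"


@dataclass
class DataReference:
    """One coded reference to a dataset within an article."""
    article_id: str                      # the citing article
    dataset_id: str                      # e.g. a study number or DOI (hypothetical)
    features: set[Feature] = field(default_factory=set)
    functions: set[Function] = field(default_factory=set)


# Coding a single reference that both describes and legitimizes a dataset.
ref = DataReference(
    article_id="article-001",
    dataset_id="dataset-042",
    features={Feature.TYPES, Feature.ACCESS},
    functions={Function.DESCRIBE, Function.LEGITIMIZE},
)
print(sorted(f.value for f in ref.functions))  # ['describe', 'legitimize']
```

Tallying such records across a corpus would yield exactly the kind of systematic documentation of data-use narratives the authors describe.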

URL : Rhetorical Features and Functions of Data References in Academic Articles

DOI : https://doi.org/10.5334/dsj-2023-010

Metrics and peer review agreement at the institutional level

Authors : Vincent A. Traag, Marco Malgarini, Scipione Sarlo

In the past decades, many countries have started to fund academic institutions based on the evaluation of their scientific performance. In this context, post-publication peer review is often used to assess scientific performance. Bibliometric indicators have been suggested as an alternative to peer review.

A recurrent question in this context is whether peer review and metrics tend to yield similar outcomes. In this paper, we study the agreement between bibliometric indicators and peer review based on a sample of publications submitted for evaluation to the national Italian research assessment exercise (2011–2014).

In particular, we study the agreement between bibliometric indicators and peer review at a higher aggregation level, namely the institutional level. We also quantify the internal agreement of peer review at the institutional level. We base our analysis on a hierarchical Bayesian model using cross-validation.
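As a rough illustration of the general idea only (not the authors' actual model, whose specification and cross-validation setup are in the paper), the following Python sketch uses PyMC and synthetic data to fit a hierarchical model in which bibliometric and peer-review scores are noisy observations of a latent publication value nested within institutions.

```python
# A minimal hierarchical Bayesian sketch with synthetic data (PyMC).
# An assumption-laden illustration, not the model from the paper.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_inst, n_pub = 5, 200
inst = rng.integers(0, n_inst, size=n_pub)        # institution of each publication
true_inst = rng.normal(0.0, 1.0, size=n_inst)     # latent institutional quality
true_pub = true_inst[inst] + rng.normal(0.0, 0.5, n_pub)
metric = true_pub + rng.normal(0.0, 0.7, n_pub)   # bibliometric indicator
review = true_pub + rng.normal(0.0, 0.7, n_pub)   # peer-review score

with pm.Model():
    mu_inst = pm.Normal("mu_inst", mu=0.0, sigma=1.0, shape=n_inst)
    sigma_pub = pm.HalfNormal("sigma_pub", sigma=1.0)
    value = pm.Normal("value", mu=mu_inst[inst], sigma=sigma_pub, shape=n_pub)
    sigma_m = pm.HalfNormal("sigma_m", sigma=1.0)
    sigma_r = pm.HalfNormal("sigma_r", sigma=1.0)
    pm.Normal("metric_obs", mu=value, sigma=sigma_m, observed=metric)
    pm.Normal("review_obs", mu=value, sigma=sigma_r, observed=review)
    idata = pm.sample(500, tune=500, chains=2, random_seed=0)

# Institution-level posterior means: averaging over publications cancels
# publication-level noise, one intuition for higher agreement at this level.
print(idata.posterior["mu_inst"].mean(dim=("chain", "draw")).values)
```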

We find that the level of agreement is generally higher at the institutional level than at the publication level. Overall, the agreement between metrics and peer review is on par with the internal agreement between two reviewers for certain fields of science in this particular context.

This suggests that for some fields, bibliometric indicators may possibly be considered as an alternative to peer review for the Italian national research assessment exercise. Although the results do not necessarily generalise to other contexts, they raise the question of whether similar findings would hold for other research assessment exercises, such as that of the United Kingdom.

URL : https://arxiv.org/abs/2006.14830

From Research Evaluation to Research Analytics. The digitization of academic performance measurement

Authors : Anne K. Krüger, Sabrina Petersohn

One might think that bibliometric measurement of academic performance has been digital ever since the computer-assisted creation of the Science Citation Index. Yet since the 2000s, the digitization of bibliometric infrastructure has accelerated at a rapid pace. Citation databases are indexing an increasing variety of publication types.

Altmetric data aggregators are producing data on the reception of research outcomes. Machine-readable persistent identifiers are created to unambiguously identify researchers, research organizations, and research objects; and evaluative software tools and current research information systems are constantly expanding their functionality to make use of these data and extract meaning from them.

In this article, we analyse how these developments in evaluative bibliometrics have contributed to an extension of indicator-based research evaluation towards data-driven research analytics.

Drawing on empirical material from blogs and websites as well as from research and policy papers, we discuss how interoperability, scalability, and flexibility, as material specificities of digital infrastructures, generate new ways of producing and assessing data, which in turn affect how academic performance can be understood and (e)valuated.
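One concrete face of these material specificities is that a persistent identifier resolves to machine-readable metadata that downstream evaluative tools can aggregate. As a small illustration (assuming the DOI is registered with Crossref), the Python sketch below fetches structured metadata, including the citation count that evaluative analytics build on, for one of the DOIs cited above via the public Crossref REST API.

```python
# Resolve a DOI to machine-readable metadata via the Crossref REST API.
# Assumes this DOI is registered with Crossref; fields follow the public API.
import requests

doi = "10.3384/VS.2001-5992.2022.9.1.11-46"  # DOI of the article summarised above
resp = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
resp.raise_for_status()
work = resp.json()["message"]

print(work.get("title", ["<no title>"])[0])
print([f"{a.get('given', '')} {a.get('family', '')}" for a in work.get("author", [])])
print(work.get("container-title"), work.get("is-referenced-by-count"))
```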

URL : From Research Evaluation to Research Analytics. The digitization of academic performance measurement

DOI : https://doi.org/10.3384/VS.2001-5992.2022.9.1.11-46