From Research Evaluation to Research Analytics. The digitization of academic performance measurement

Authors : Anne K. Krüger, Sabrina Petersohn

One might think that the bibliometric measurement of academic performance has been digital ever since the computer-assisted creation of the Science Citation Index. Yet, since the 2000s, the digitization of bibliometric infrastructure has accelerated rapidly. Citation databases are indexing an increasing variety of publication types.

Altmetric data aggregators are producing data on the reception of research outcomes. Machine-readable persistent identifiers are being created to unambiguously identify researchers, research organizations, and research objects; and evaluative software tools and current research information systems are constantly expanding their functionalities to make use of these data and extract meaning from them.
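As a concrete illustration of what machine readability means in practice, the following minimal sketch (our own, not from the article) resolves a DOI, one kind of persistent identifier, to structured metadata through the public Crossref REST API. The helper name is ours, and whether a given DOI resolves via Crossref depends on its registration agency.

```python
# Minimal sketch (not from the article): a machine-readable persistent
# identifier such as a DOI can be resolved to structured metadata with a
# single call to the public Crossref REST API.
import requests

def fetch_crossref_metadata(doi: str) -> dict:
    """Return Crossref's bibliographic record for a DOI."""
    response = requests.get(f"https://api.crossref.org/works/{doi}", timeout=30)
    response.raise_for_status()  # 404 if the DOI is not registered with Crossref
    return response.json()["message"]

# Example with the DOI of the first article in this digest.
record = fetch_crossref_metadata("10.3384/VS.2001-5992.2022.9.1.11-46")
print(record.get("title"), [a.get("family") for a in record.get("author", [])])
```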

In this article, we analyse how these developments in evaluative bibliometrics have contributed to an extension of indicator-based research evaluation towards data-driven research analytics.

Drawing on empirical material from blogs and websites as well as from research and policy papers, we discuss how interoperability, scalability, and flexibility, as material specificities of digital infrastructures, generate new ways of producing and assessing data, which affect how academic performance can be understood and (e)valuated.

DOI : https://doi.org/10.3384/VS.2001-5992.2022.9.1.11-46

Promoting Open Science through bibliometrics : a practical guide to build an open access monitor

Author : Laetitia Bracco

In order to assess the progress of Open Science in France, the French Ministry of Higher Education, Research and Innovation published the French Open Science Monitor in 2019. Although the tool has a bias, since only publications with a DOI can be considered, which favors article-dominant research communities, its indicators are trustworthy and reliable.

In 2020, the University of Lorraine became the first institution to reuse the national monitor to create a version at the scale of a single university. Since its release, the Lorraine Open Science Monitor has been reused by many other institutions.

In 2022, the French Open Science Monitor evolved further, enabling new insights into open science, and the Lorraine Open Science Monitor has evolved along with it. This paper details how the initial code for the Lorraine Open Science Monitor was developed and disseminated, and then outlines plans for its development over the next few years.
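The core computation behind such a monitor can be sketched in a few lines. The following is a minimal illustration under stated assumptions, not the Lorraine Monitor’s actual code: it estimates the open access rate of a list of DOIs with the public Unpaywall API, one of the data sources the French monitor builds on. The function names are ours, and YOUR_EMAIL must be replaced with a real address, which Unpaywall requires as a query parameter.

```python
# Minimal sketch of an open access monitor's core figure: the share of a
# DOI list for which Unpaywall records at least one open access location.
# Not the Lorraine Monitor's actual code.
import requests

UNPAYWALL = "https://api.unpaywall.org/v2/{doi}?email=YOUR_EMAIL"  # real address required

def is_open_access(doi: str) -> bool:
    """True if Unpaywall knows an open access copy of the publication."""
    response = requests.get(UNPAYWALL.format(doi=doi), timeout=30)
    response.raise_for_status()
    return bool(response.json().get("is_oa"))

def open_access_rate(dois: list[str]) -> float:
    """Share of open access publications, the headline indicator of a monitor."""
    return sum(is_open_access(doi) for doi in dois) / len(dois)

# Example with a DOI from this digest.
print(open_access_rate(["10.1371/journal.pone.0253847"]))
```

Note that such a sketch reproduces the bias mentioned above: a publication without a DOI simply cannot be looked up.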

DOI : https://doi.org/10.53377/lq.11545

Reshaping How Universities Can Evaluate the Research Impact of Open Humanities for Societal Benefit

Authors : Paul Longley Arthur, Lydia Hearn

In the twenty-first century, the volume of digital data has for the first time surpassed that of analog data. As academic practices increasingly become digital, opportunities arise to reshape the future of scholarly communication through more accessible, interactive, open, and transparent methods that engage a far broader and more diverse public.

Yet despite these advances, the research performance of universities and public research institutes remains largely evaluated through publication and citation analysis rather than by public engagement and societal impact.

This article reviews how shifting bibliometric evaluation toward greater use of altmetrics, including social media mentions, could enhance the uptake of open scholarship in the humanities.

In addition, the article highlights current challenges faced by the open scholarship movement, given the complexity of the humanities: its sources and outputs include monographs, book chapters, and journals in languages other than English, as well as popular media not considered scholarly publications; research staff often lack the time and energy to develop digital skills; questions of authority and trust surround the scholarly or non-academic nature of social media platforms; large academic publishing houses retain considerable prestige; and awareness of and familiarity with advanced digital applications remain limited.

While peer review will continue to be a primary method for evaluating research in the humanities, a combination of altmetrics and other assessments of research impact drawn from different data sources may provide a way forward to ensure the increased use, sustainability, and effectiveness of open scholarship in the humanities.

DOI : https://doi.org/10.3998/jep.788

Awareness Mentality and Strategic Behavior in Science

Author : Rafael Ball

Recognition of scientific achievement has been, and still is, conferred essentially through the citation of a publication. Increasingly, however, it is no longer just the publication itself that plays an important role, but also the degree of attention that a scientist attracts with this very publication.

Thus, strategic behavior in science is gaining importance and an awareness mentality is spreading. This paper discusses the causes and background of this development, identifying as main drivers the use of reductionist, quantitative systems in science management and research funding; the loss of critical judgment and technocratic dominance; quantitative assessments used for decision making; altmetrics and similar alternative views; the use of perception scores by reference databases and universities; and the ambitions of journals.

In addition, different forms of strategic behavior in science and their resulting consequences and impacts are highlighted.

DOI : https://doi.org/10.3389/frma.2021.703159

TeamTree analysis: A new approach to evaluate scientific production

Author : Frank W. Pfrieger

Advances in science and technology depend on the work of research teams and the publication of results in peer-reviewed articles, which represent a growing socio-economic resource. Current methods for mining the scientific literature of a field of interest focus on content, but the workforce credited through authorship remains largely unexplored.

Notably, what constitutes an appropriate measure of scientific production is a matter of debate. Here, a new bibliometric approach named TeamTree analysis is introduced, which visualizes the development and composition of the workforce driving a field.

A new citation-independent measure, which scales with the h-index, estimates impact based on publication record, genealogical ties, and collaborative connections.

This author-centered approach complements existing tools to mine the scientific literature and to evaluate research across disciplines.
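The article’s formula is not spelled out in this summary, so the following is a purely hypothetical sketch of how a citation-independent author score might combine the three ingredients the abstract lists; the weights, names, and data structure are invented and are not Pfrieger’s actual measure.

```python
# Hypothetical sketch only, not Pfrieger's TeamTree measure: a
# citation-independent score built from publication record, genealogical
# ties (trainees), and collaborative connections (coauthors).
from dataclasses import dataclass, field

@dataclass
class Author:
    papers: int                                              # publication record
    coauthors: int = 0                                       # collaborative connections
    trainees: list["Author"] = field(default_factory=list)   # genealogical ties

def toy_score(author: Author) -> float:
    """Invented weights: own papers count fully, descendants' scores half."""
    descendants = sum(toy_score(t) for t in author.trainees)
    return author.papers + 0.5 * descendants + 0.1 * author.coauthors

pi = Author(papers=40, coauthors=25, trainees=[Author(papers=12, coauthors=8)])
print(round(toy_score(pi), 1))  # 40 + 0.5 * 13.2... -> 48.9
```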

DOI : https://doi.org/10.1371/journal.pone.0253847

Which aspects of the Open Science agenda are most relevant to scientometric research and publishing? An opinion paper

Authors : Lutz Bornmann, Raf Guns, Michael Thelwall, Dietmar Wolfram

Open Science is an umbrella term that encompasses many recommendations for possible changes in research practices, management, and publishing, with the objective of increasing transparency and accessibility.

It has become an important science policy issue that all disciplines should consider. Many Open Science recommendations may be valuable for the further development of research and publishing, but not all are relevant to all fields.

This opinion paper considers the aspects of Open Science that are most relevant for scientometricians, discussing how they can be usefully applied.

DOI : https://doi.org/10.1162/qss_e_00121

Conjoint analysis of researchers’ hidden preferences for bibliometrics, altmetrics, and usage metrics

Authors : Steffen Lemke, Athanasios Mazarakis, Isabella Peters

The number of scholarly articles published annually is growing steadily, as is the number of indicators through which the impact of publications is measured. Little is known about how the increasing variety of available metrics affects researchers’ processes of selecting literature to read.

We conducted ranking experiments embedded in an online survey with 247 participating researchers, most of them from the social sciences. Participants completed a series of tasks in which they were asked to rank fictitious publications by their expected relevance, based on their scores on six prototypical metrics.

By applying logistic regression, cluster analysis, and manual coding of survey answers, we obtained detailed data on how prominent metrics of research impact influence our participants’ decisions about which scientific articles to read.
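As a toy illustration of the regression step (fabricated data and assumed metric names, not the authors’ code or materials), a logistic regression can relate the metric scores of fictitious publications to whether they were chosen:

```python
# Toy sketch of the regression step with fabricated data: which metric
# scores predict that a fictitious publication is picked as most relevant?
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
metrics = ["citations", "jif", "readers", "downloads", "tweets", "mentions"]  # assumed names
X = rng.normal(size=(n, len(metrics)))           # standardized metric scores
# Simulate choices driven mainly by citations and JIF, as the study found.
logits = 1.5 * X[:, 0] + 0.8 * X[:, 1] + 0.2 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logits))    # True = publication was chosen

model = LogisticRegression().fit(X, y)
# Larger coefficients indicate stronger influence on the choice.
print(dict(zip(metrics, model.coef_[0].round(2))))
```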

Survey answers revealed a combination of qualitative and quantitative characteristics that researchers consult when selecting literature, while regression analysis showed that among quantitative metrics, citation counts tend to be of highest concern, followed by Journal Impact Factors.

Our results suggest a comparatively favorable view of bibliometrics among many researchers and widespread skepticism toward altmetrics.

The findings underline the importance of equipping researchers with solid knowledge about specific metrics’ limitations, as they seem to play significant roles in researchers’ everyday relevance assessments.

DOI : https://doi.org/10.1002/asi.24445