How Can Science and Research Work Well? Toward a Critique of New Public Management Practices in Academia From a Socio-Philosophical Perspective

Author : Jan-Philipp Kruse

While New Public Management practices (NPM) have been adopted in academia and higher education over the past two decades, this paper investigates their role from a specifically socio-philosophical angle: the preeminent question is which organization of science is likely to make science and research work well in the context of a complex society.

The starting point is an obvious intuition: that NPM “economizes” academia (basically, that something coming from the outside disturbs the inside). Habermas provides a sophisticated theorization of this intuition.

In contrast, the thesis advanced here is that we should consider NPM potentially problematic, not because it descends from economics or administration outside academia, but because it often cannot help research and science function well.

In this consideration, which is essayistic rather than strictly deductive, I will therefore tentatively discuss an alternative approach that takes up these critical intuitions while transposing them into a different setting. If we understand science and research as a form of life, a different picture emerges: one that can still bring immanent standards to bear, but at the same time composes them more broadly.

This outlines a socio-philosophical critique of NPM. Accordingly, the decisive factor is not NPM’s provenance, but that it addresses some organizational problems while at the same time creating new ones.

At the end, an outlook is sketched on how the specific situation of NPM allows some hypotheses about the future organization of academia, understood here as the whole research community. Ex negativo, it seems likely that qualitative evaluation criteria and creative freedom will have to play a greater role.

URL : How Can Science and Research Work Well? Toward a Critique of New Public Management Practices in Academia From a Socio-Philosophical Perspective

Original location : https://www.frontiersin.org/articles/10.3389/frma.2022.791114/full

How should evaluation be? Is a good evaluation of research also just? Towards the implementation of good evaluation

Authors : Cinzia Daraio, Alessio Vaccari

In this paper we answer the question of how evaluation should be by proposing a good evaluation of research practices. A good evaluation of research practices, understood as social practices à la MacIntyre, should take into account the stable motivations and character traits (i.e. the virtues) of researchers.

We also show that a good evaluation is just, beyond the sense of fairness, since working on good research practices implies taking into account a broader sense of justice. We then propose the development of a knowledge base for the assessment of “good” evaluations of research practices, to be used to implement a questionnaire for assessing researchers’ virtues.

Although the latter is a challenging task, the use of ontologies and taxonomic knowledge, together with reasoning algorithms that can draw inferences from such knowledge, offers a way to test the consistency of the information reported in the questionnaire and to analyse correctly and coherently the data gathered through it.
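As a rough illustration of this idea, here is a minimal sketch, not taken from the paper, of how a small taxonomy of virtues and a few hand-written rules could flag questionable questionnaire answers; the taxonomy, the rule set, and all item names are hypothetical placeholders.

```python
# Minimal sketch (hypothetical taxonomy, rules, and item names; not from the paper):
# flag inconsistent answers in a virtue questionnaire using taxonomic knowledge
# and simple rule-based reasoning.

# Toy taxonomy: each virtue belongs to a parent category.
TAXONOMY = {
    "honesty": "intellectual_virtue",
    "open-mindedness": "intellectual_virtue",
    "fairness": "moral_virtue",
    "generosity": "moral_virtue",
}

# Hypothetical rules: a behavioural item should not score far above its related virtue.
RULES = [
    ("shares_data_openly", "honesty", 2),
    ("credits_collaborators", "fairness", 2),
]

def check_consistency(answers: dict[str, int]) -> list[str]:
    """Return human-readable flags for a set of 1-5 Likert answers."""
    flags = []
    for item, virtue, max_gap in RULES:
        if item in answers and virtue in answers and answers[item] - answers[virtue] > max_gap:
            flags.append(f"'{item}' ({answers[item]}) far exceeds '{virtue}' ({answers[virtue]})")
    # Taxonomic check: scores within one virtue category should not diverge wildly.
    by_category: dict[str, list[int]] = {}
    for virtue, category in TAXONOMY.items():
        if virtue in answers:
            by_category.setdefault(category, []).append(answers[virtue])
    for category, scores in by_category.items():
        if max(scores) - min(scores) >= 4:
            flags.append(f"scores within '{category}' diverge strongly: {scores}")
    return flags

print(check_consistency({
    "honesty": 1, "open-mindedness": 5, "fairness": 4, "shares_data_openly": 5,
}))
```

In a full implementation, the taxonomy and the rules would come from the knowledge base the authors propose rather than from hand-written constants.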

Finally, we describe the potential usefulness of our proposal for the reform of current research assessment systems.

URL : How should evaluation be? Is a good evaluation of research also just? Towards the implementation of good evaluation

DOI : https://doi.org/10.1007/s11192-022-04329-2

Reshaping How Universities Can Evaluate the Research Impact of Open Humanities for Societal Benefit

Authors : Paul Longley Arthur, Lydia Hearn

During the twenty-first century, for the first time, the volume of digital data has surpassed the amount of analog data. As academic practices increasingly become digital, opportunities arise to reshape the future of scholarly communication through more accessible, interactive, open, and transparent methods that engage a far broader and more diverse public.

Yet despite these advances, the research performance of universities and public research institutes remains largely evaluated through publication and citation analysis rather than through public engagement and societal impact.

This article reviews how shifting bibliometric evaluation toward greater use of altmetrics, including social media mentions, could enhance the uptake of open scholarship in the humanities.

In addition, the article highlights current challenges faced by the open scholarship movement, given the complexity of the humanities in terms of its sources and outputs, which include monographs, book chapters, and journals in languages other than English; the use of popular media not regarded as scholarly publications; the lack of time and energy among research staff to develop digital skills; problems of authority and trust regarding the scholarly or non-academic nature of social media platforms; the prestige of large academic publishing houses; and limited awareness of and familiarity with advanced digital applications.

While peer review will continue to be a primary method for evaluating research in the humanities, a combination of altmetrics and other assessments of research impact drawing on different data sources may provide a way forward to ensure the increased use, sustainability, and effectiveness of open scholarship in the humanities.

DOI : https://doi.org/10.3998/jep.788

What Is Wrong With the Current Evaluative Bibliometrics?

Author : Endel Põder

Bibliometric data are relatively simple and describe objective processes of publishing articles and citing others. It seems quite straightforward to define reasonable measures of a researcher’s productivity, research quality, or overall performance based on these data. Why do we still have no acceptable bibliometric measures of scientific performance?

Instead, there are hundreds of indicators, with nobody knowing how to use them. At the same time, an increasing number of researchers and some research fields have been excluded from standard bibliometric analysis to avoid manifestly contradictory conclusions.
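To make the term concrete, one of the best known of these indicators is the h-index, shown below in a minimal sketch; the example is purely illustrative and is not discussed in the article itself.

```python
# Minimal illustrative sketch (not from the article): the h-index, one of the
# many indicators computed from simple publication and citation data.

def h_index(citations: list[int]) -> int:
    """Largest h such that the author has h papers with at least h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([25, 8, 5, 3, 3, 1, 0]))  # -> 3
```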

I argue that the biggest current problem is the inadequate rule of credit allocation for multi-authored articles in mainstream bibliometrics. Clinging to this historical choice precludes any systematic and logically consistent bibliometrics-based evaluation of researchers, research groups, and institutions.
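As a rough sketch of what is at stake, and not a rendering of the author’s own proposal, the following contrasts the mainstream full-counting rule, under which every co-author receives one full credit per paper, with fractional counting, under which each paper distributes a single credit among its co-authors.

```python
# Illustrative sketch (not the author's proposal): full vs. fractional counting
# of publication credit for multi-authored articles.

papers = [
    {"title": "Paper A", "authors": ["Alice"]},
    {"title": "Paper B", "authors": ["Alice", "Bob", "Carol"]},
    {"title": "Paper C", "authors": ["Bob", "Carol", "Dave", "Eve"]},
]

def full_counting(papers):
    """Every co-author gets one full credit per paper (the mainstream rule)."""
    credit = {}
    for p in papers:
        for a in p["authors"]:
            credit[a] = credit.get(a, 0.0) + 1.0
    return credit

def fractional_counting(papers):
    """Each paper distributes one credit equally among its co-authors."""
    credit = {}
    for p in papers:
        share = 1.0 / len(p["authors"])
        for a in p["authors"]:
            credit[a] = credit.get(a, 0.0) + share
    return credit

print(full_counting(papers))        # credits sum to 8, more than the 3 papers
print(fractional_counting(papers))  # credits sum to 3, the number of papers
```

The point of the sketch is simply that the two rules yield systematically different totals, which is why the choice of counting rule matters for any evaluation built on top of it.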

During the last 50 years, several authors have called for a change. Apparently, there are no serious methodologically justified or evidence-based arguments in favor of the present system.

However, there are intractable social, psychological, and economic issues that make the adoption of a logically sound counting system almost impossible.

URL : What Is Wrong With the Current Evaluative Bibliometrics?

DOI : https://doi.org/10.3389/frma.2021.824518

Evaluation and Merit-Based Increase in Academia: A Case Study in the First Person

Author : Christine Musselin

This article provides a reflexive account of the process of defining and implementing a mechanism to evaluate a group of academics in a French higher education institution. This is a rather unusual case for France, as the academics assessed are not civil servants but are employed by their university, and the evaluation leads to merit-based salary increases.

Improving and implementing this strategy was one of the author’s tasks when she was vice-president for research at the institution in question.

The article looks at this experience retrospectively, emphasizing three issues of particular relevance in the context of discussions about valuation studies and management proposed in this symposium: (1) the decision to distinguish between different types of profiles, and thus to categorize, or to apply the same criteria to all; (2) the concrete forms of commensuration to be developed in order to evaluate and rank individuals from different disciplines; (3) the quantification of qualitative appreciations, i.e. their transformation into merit-based salary increases.
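Since the abstract does not disclose the institution’s actual formula, the following is a purely hypothetical sketch of what such a quantification step could look like: qualitative appreciations mapped onto a numeric scale and then onto a salary increase, with every scale value and parameter invented for illustration.

```python
# Purely hypothetical sketch: turning qualitative appreciations into
# merit-based salary increases. All values and parameters are invented.

# Committee appreciations mapped onto a numeric scale (commensuration step).
RATING_SCALE = {"outstanding": 3, "good": 2, "satisfactory": 1, "weak": 0}

def merit_increase(rating: str, percent_per_point: float = 0.5) -> float:
    """Map a qualitative rating to a salary increase in percent.

    `percent_per_point` is an invented parameter standing in for the
    institution's budget constraint.
    """
    return RATING_SCALE[rating] * percent_per_point

appreciations = {"researcher_1": "outstanding", "researcher_2": "satisfactory"}
for person, rating in appreciations.items():
    print(f"{person}: {rating} -> {merit_increase(rating):.1f}% increase")
```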

URL : Evaluation and Merit-Based Increase in Academia: A Case Study in the First Person

DOI : https://doi.org/10.3384/VS.2001-5992.2021.8.2.73-88

The production of scientific and societal value in research evaluation: a review of societal impact assessment methods

Authors : Jorrit P Smit, Laurens K Hessels

Over the past two decades, several methods have been developed to evaluate the societal impact of research. Compared to the practical development of the field, the conceptual development is relatively weak.

This review article contributes to the latter by elucidating the theoretical aspects of the dominant methods for evaluating the societal impact of research, in particular their presuppositions about the relationship between the scientific and societal value of research. We analyse 10 approaches to the assessment of the societal impact of research from a constructivist perspective.

The methods represent different understandings of knowledge exchange, which can be understood in terms of linear, cyclical, and co-production models. In addition, the evaluation methods use a variety of concepts for the societal value of research, which suggest different relationships with scientific value.

While some methods rely on a clear and explicit distinction between the two types of value, other methods, in particular Evaluative Inquiry, ASIRPA, Contribution Mapping, Public Value Mapping, and SIAMPI, consider the mechanisms for producing societal value integral to the research process.

We conclude that evaluation methods must strike a balance between demarcating societal value as a separate performance indicator for practical purposes and doing justice to the (constructivist) science studies’ findings about the integration of the scientific and societal value of research.

Our analytic comparison of assessment methods can assist research evaluators in consciously and responsibly selecting an approach that fits the object under evaluation. As evaluation actively shapes knowledge production, it is important not to use oversimplified concepts of societal value.

URL : The production of scientific and societal value in research evaluation: a review of societal impact assessment methods

DOI : https://doi.org/10.1093/reseval/rvab002