Do altmetrics assess societal impact in the same way as case studies? An empirical analysis testing the convergent validity of altmetrics based on data from the UK Research Excellence Framework (REF)

Authors: Lutz Bornmann, Robin Haunschild, Jonathan Adams

Altmetrics have been proposed as a way to assess the societal impact of research. Although altmetrics are already in use as impact or attention metrics in different contexts, it is still not clear whether they really capture or reflect societal impact.

This study is based on altmetrics, citation counts, research output and case study data from the UK Research Excellence Framework (REF), and peers’ REF assessments of research output and societal impact. We investigated the convergent validity of altmetrics by using two REF datasets: publications submitted as research output (PRO) to the REF and publications referenced in case studies (PCS).

Case studies, which are intended to demonstrate societal impact, should cite the most relevant research papers. We assessed impact with the MHq' indicator, which was introduced for count data with many zeros.
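The exact definition of MHq' is given in the paper. As a rough illustration of the general idea only, a Mantel-Haenszel-style quotient pools, across strata such as field and publication year, the share of papers with at least one mention in a focal set against a reference set. The sketch below uses hypothetical counts and is not the authors' exact formula:

```python
# Illustrative sketch only -- not the authors' exact MHq' formula. A
# Mantel-Haenszel-style quotient compares, per stratum (e.g. field and
# publication year), the share of papers with at least one mention in a
# focal set (a, b) with a reference set (c, d). Values above 1.0 mean
# the focal set is mentioned more often than the reference.

def mh_quotient(strata):
    """strata: iterable of dicts with per-stratum counts:
    a = focal papers with >= 1 mention,     b = focal papers with 0 mentions,
    c = reference papers with >= 1 mention, d = reference papers with 0 mentions.
    """
    num = den = 0.0
    for s in strata:
        n = s["a"] + s["b"] + s["c"] + s["d"]  # stratum size
        num += s["a"] * (s["c"] + s["d"]) / n
        den += s["c"] * (s["a"] + s["b"]) / n
    return num / den

# Hypothetical counts for two strata:
strata = [
    {"a": 30, "b": 70, "c": 15, "d": 85},
    {"a": 12, "b": 38, "c": 20, "d": 80},
]
print(f"MH quotient: {mh_quotient(strata):.2f}")  # ~1.62 here
```

Stratifying before pooling keeps the comparison fair when fields or years differ in how often papers are mentioned at all.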

The results of the first part of the analysis show that news media as well as mentions on Facebook, in blogs, in Wikipedia, and in policy-related documents have higher MHq’ values for PCS than for PRO.

Thus, the altmetric indicators seem to have convergent validity for these data. In the second part of the analysis, altmetrics were correlated with REF reviewers' average scores on PCS. The negative or near-zero correlations call the convergent validity of altmetrics in that context into question.
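For intuition on this second analysis (the study's exact procedure may differ): with heavily zero-inflated counts, a rank-based measure such as Spearman's rho is a common choice. A minimal sketch with hypothetical data:

```python
# Minimal sketch with hypothetical data (not the REF dataset): rank-
# correlating zero-inflated altmetric counts with average reviewer
# scores. Spearman's rho is a common choice for skewed count data; the
# study's exact procedure may differ.
from scipy.stats import spearmanr

mentions       = [0, 0, 3, 0, 12, 1, 0, 45, 0, 2]   # altmetric counts per paper
reviewer_score = [3.1, 2.4, 3.0, 3.8, 2.9, 3.5, 3.3, 2.7, 3.6, 3.2]

rho, p = spearmanr(mentions, reviewer_score)
print(f"Spearman rho = {rho:.2f} (p = {p:.2f})")
# A rho near zero (or negative), as the study reports, argues against
# convergent validity in this setting.
```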

We suggest that they may capture a different aspect of societal impact (which can be called unknown attention) from the one seen by reviewers (who are interested in the causal link between research and action in society).

URL: https://arxiv.org/abs/1807.03977

Allegation of scientific misconduct increases Twitter attention

Authors: Lutz Bornmann, Robin Haunschild

The web-based microblogging system Twitter is a very popular altmetrics source for measuring the broader impact of science. In this case study, we demonstrate how problematic the use of Twitter data for research evaluation can be, even when the goal is lowered from measuring impact to merely measuring attention.

We collected the Twitter data for the paper published by Yamamizu et al. (2017), for which an investigative committee found the main figures to be fraudulent. As the title indicates, the paper's Twitter attention rose after the misconduct allegation, suggesting that mention counts can reflect notoriety rather than research quality.

URL: https://arxiv.org/abs/1802.00606

To what extent does the Leiden Manifesto also apply to altmetrics? A discussion of the manifesto against the background of research into altmetrics

Authors: Lutz Bornmann, Robin Haunschild

Purpose

Hicks, Wouters, Waltman, de Rijcke, and Rafols (2015) formulated the Leiden Manifesto, in which they assembled ten principles for meaningful research evaluation on the basis of bibliometric data.

Approach

This work examines the relevance of the Leiden Manifesto for altmetrics.

Results

The discussion of the ten principles against the background of current knowledge of and research into altmetrics shows that the principles are also highly relevant for altmetrics and should be taken into account when altmetrics are applied.

Originality

Altmetrics are already frequently used in research evaluation. It is therefore important that users of altmetrics data understand that the Leiden Manifesto is relevant in this area as well.

URL: https://figshare.com/articles/To_what_extent_does_the_Leiden_Manifesto_also_apply_to_altmetrics_A_discussion_of_the_manifesto_against_the_background_of_research_into_altmetrics/1464981