Do altmetrics work for assessing research quality?

Authors : Andrea Giovanni Nuzzolese, Paolo Ciancarini, Aldo Gangemi, Silvio Peroni, Francesco Poggi, Valentina Presutti

Alternative metrics (aka altmetrics) are gaining increasing interest in the scientometrics community as they can capture both the volume and quality of attention that a research work receives online.

Nevertheless, there is limited knowledge about their effectiveness as a means of measuring research impact compared with traditional citation-based indicators.

This work rigorously investigates whether any correlation exists among indicators, either traditional (i.e. citation count and h-index) or alternative (i.e. altmetrics), and which of them may be effective for evaluating scholars.

The study is based on the analysis of real data coming from the National Scientific Qualification procedure held in Italy by committees of peers on behalf of the Italian Ministry of Education, Universities and Research.
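
The correlation analysis described above can be made concrete with a minimal sketch of a rank correlation between a citation-based indicator and an altmetric score. This is not the authors' code, and all numbers are invented for demonstration; the study itself works on the NSQ data:

    # Illustrative sketch, not the authors' code: rank correlation between
    # a citation-based indicator and an altmetric score for six hypothetical
    # scholars. All numbers are invented for demonstration.
    from scipy.stats import spearmanr

    h_index = [12, 4, 25, 9, 17, 6]           # hypothetical h-index values
    altmetric_score = [30, 8, 55, 20, 41, 5]  # hypothetical altmetric scores

    rho, p_value = spearmanr(h_index, altmetric_score)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")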

URL : https://arxiv.org/abs/1812.11813

The insoluble problems of books: What does Altmetric.com have to offer?

Authors : Daniel Torres-Salinas, Juan Gorraiz, Nicolas Robinson-Garcia

The purpose of this paper is to analyze the capabilities, functionalities and appropriateness of Altmetric.com as a data source for the bibliometric analysis of books in comparison to PlumX.

We perform an exploratory analysis of the metrics that the Altmetric Explorer for Institutions platform offers for books. We use two distinct datasets of books, the Book Collection included in Altmetric.com and Clarivate’s Master Book List, to analyze Altmetric.com’s capabilities to download and merge data with external databases.

Finally, we compare our findings with those obtained in a previous study of PlumX. Altmetric.com tracks an ordered set of data sources, linked by DOI identifiers, to retrieve book metadata, with Google Books as its main provider. It also retrieves information from commercial publishers and from some Open Access initiatives, including those led by university libraries, such as Harvard Library.

We find issues with the linkage between records and mentions, as well as ISBN discrepancies. Furthermore, we find that automated bots greatly affect Wikipedia mentions of books. Our comparison with PlumX suggests that neither tool provides a complete picture of the social attention books generate, and that the two are complementary rather than comparable.
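
The ISBN discrepancies the authors report typically surface when merging records across databases. Below is a minimal sketch of such a merge, assuming pandas; the column names and records are hypothetical, not the authors' pipeline:

    # Illustrative sketch, not the authors' pipeline: merging a master book
    # list with mention records after normalising ISBNs, the step where
    # discrepancies arise. Column names and records are hypothetical.
    import pandas as pd

    def normalise_isbn(isbn: str) -> str:
        # Strip hyphens and spaces so ISBN-13 strings compare reliably.
        return isbn.replace("-", "").replace(" ", "").strip()

    master = pd.DataFrame({"isbn": ["978-0-262-03384-8"],
                           "title": ["Example Book"]})
    mentions = pd.DataFrame({"isbn": ["9780262033848"],
                             "mentions": [12]})

    master["isbn_norm"] = master["isbn"].map(normalise_isbn)
    mentions["isbn_norm"] = mentions["isbn"].map(normalise_isbn)

    merged = master.merge(mentions, on="isbn_norm", how="left")
    print(merged[["title", "mentions"]])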

URL : https://arxiv.org/abs/1809.10128

What increases (social) media attention: Research impact, author prominence or title attractiveness?

Authors : Olga Zagovora, Katrin Weller, Milan Janosov, Claudia Wagner, Isabella Peters

Do only major scientific breakthroughs hit the news and social media, or does a ‘catchy’ title help to attract public attention? How strong is the connection between the importance of a scientific paper and the (social) media attention it receives?

In this study we investigate these questions by analysing the relationship between the observed attention and certain characteristics of scientific papers from two major multidisciplinary journals: Nature Communications (NC) and the Proceedings of the National Academy of Sciences (PNAS).

We describe papers by features based on the linguistic properties of their titles and on centrality measures of their authors in the co-authorship network.

We identify linguistic features and collaboration patterns that might be indicators of future attention and that are characteristic of different journals, research disciplines, and media sources.
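
As an illustration of the two feature families, here is a minimal sketch computing simple title features and author degree centrality on toy data. The feature definitions are simplified assumptions, not the paper's exact operationalisation:

    # Illustrative sketch, not the authors' code: simple title features and
    # author centrality on toy data. The features are simplified assumptions,
    # not the paper's exact operationalisation.
    import networkx as nx

    def title_features(title: str) -> dict:
        words = title.split()
        return {
            "length_words": len(words),
            "is_question": title.strip().endswith("?"),
            "has_colon": ":" in title,
        }

    print(title_features("What increases (social) media attention?"))

    # Co-authorship network: nodes are authors, edges connect co-authors.
    G = nx.Graph()
    G.add_edges_from([("A", "B"), ("A", "C"), ("B", "C"), ("C", "D")])
    print(nx.degree_centrality(G))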

URL : https://arxiv.org/abs/1809.06299

Do altmetrics assess societal impact in the same way as case studies? An empirical analysis testing the convergent validity of altmetrics based on data from the UK Research Excellence Framework (REF)

Authors : Lutz Bornmann, Robin Haunschild, Jonathan Adams

Altmetrics have been proposed as a way to assess the societal impact of research. Although altmetrics are already in use as impact or attention metrics in different contexts, it is still not clear whether they really capture or reflect societal impact.

This study is based on altmetrics, citation counts, research output and case study data from the UK Research Excellence Framework (REF), and peers’ REF assessments of research output and societal impact. We investigated the convergent validity of altmetrics by using two REF datasets: publications submitted as research output (PRO) to the REF and publications referenced in case studies (PCS).

Case studies, which are intended to demonstrate societal impact, should cite the most relevant research papers. We used the MHq’ indicator, which was introduced for count data with many zeros, to assess impact.

The results of the first part of the analysis show that mentions in news media, on Facebook, in blogs, in Wikipedia, and in policy-related documents have higher MHq’ values for PCS than for PRO.

Thus, the altmetric indicators seem to have convergent validity for these data. In the second part of the analysis, we correlated altmetrics with REF reviewers’ average scores on PCS. The negative or close-to-zero correlations call into question the convergent validity of altmetrics in that context.

We suggest that they may capture a different aspect of societal impact (which can be called unknown attention) to that seen by reviewers (who are interested in the causal link between research and action in society).
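
The MHq’ indicator itself is defined in the paper; as a simplified stand-in for comparing zero-inflated mention counts between the two publication sets, the sketch below compares the share of papers with at least one mention, using invented counts:

    # Simplified stand-in, not the MHq' indicator itself: compare the share
    # of papers with at least one mention in each set, a comparison that is
    # robust to count data with many zeros. All counts are invented.
    pcs_mentions = [0, 3, 0, 7, 1, 0, 2]  # hypothetical counts, case-study papers
    pro_mentions = [0, 0, 1, 0, 0, 2, 0]  # hypothetical counts, research output

    def share_mentioned(counts):
        # Fraction of papers with at least one mention.
        return sum(1 for c in counts if c > 0) / len(counts)

    print(f"PCS: {share_mentioned(pcs_mentions):.2f}, "
          f"PRO: {share_mentioned(pro_mentions):.2f}")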

URL : https://arxiv.org/abs/1807.03977

Social media metrics for new research evaluation

Authors : Paul Wouters, Zohreh Zahedi, Rodrigo Costas

This chapter approaches, from both a theoretical and a practical perspective, the most important principles and conceptual frameworks that can be considered in the application of social media metrics to scientific evaluation.

We propose conceptually valid uses for social media metrics in research evaluation. The chapter discusses frameworks and uses of these metrics, as well as principles and recommendations for applying current (and potentially new) metrics in research evaluation.

URL : https://arxiv.org/abs/1806.10541

Analysing researchers’ outreach efforts and the association with publication metrics: A case study of Kudos

Authors : Mojisola Erdt, Htet Htet Aung, Ashley Sara Aw, Charlie Rapple, Yin-Leng Theng

With the growth of scholarly collaboration networks and social communication platforms, members of the scholarly community are experimenting with their approach to disseminating research outputs, in an effort to increase their audience and outreach.

However, from a researcher’s point of view, it is difficult to determine whether efforts to make work more visible are worthwhile (in terms of the association with publication metrics) and within that, difficult to assess which platform or network is most effective for sharing work and connecting to a wider audience.

We undertook a case study of Kudos (https://www.growkudos.com), a web-based service that claims to help researchers increase the outreach of their publications, to examine the most effective tools for sharing publications online, and to investigate which actions are associated with improved metrics.

We extracted a dataset from Kudos of 830,565 unique publications claimed by authors, of which 20,775 had actions taken to explain or share them via Kudos; for 4,867 of these, full-text download data from publishers were available.

Findings show that researchers are most likely to share their work on Facebook, but links shared on Twitter are more likely to be clicked on. A Mann-Whitney U test revealed that a treatment group (publications with actions in Kudos) had a significantly higher median of 149 full-text downloads per publication (23.1% more) than a control group (with no actions in Kudos), which had a median of 121 full-text downloads per publication.

These findings suggest that performing actions on publications, such as sharing, explaining, or enriching them, could help to increase the number of full-text downloads of a publication.
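
A minimal sketch of the Mann-Whitney U test described above, with invented download counts (not the authors' data or code):

    # Illustrative sketch, not the authors' analysis: Mann-Whitney U test on
    # full-text downloads for a treatment and a control group. The download
    # counts below are invented for demonstration.
    from scipy.stats import mannwhitneyu

    treatment = [149, 200, 95, 310, 180, 142, 260]  # hypothetical downloads
    control = [121, 90, 130, 75, 118, 105, 160]     # hypothetical downloads

    stat, p_value = mannwhitneyu(treatment, control, alternative="two-sided")
    print(f"U = {stat}, p = {p_value:.3f}")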

DOI : https://doi.org/10.1371/journal.pone.0183217