Aligning Social Media Indicators with the Documents in an Open Access Repository

Authors: Luis Meneses, Alyssa Arbuckle, Hector Lopez, Belaid Moa, Richard Furuta, Ray Siemens

In this paper we describe our current efforts towards building a framework that extends the functionality of an Open Access Repository by implementing processes that incorporate ongoing social media trends into the context of a digital collection.

We refer to these processes collectively as the Social Media Engine. The purpose of our framework is twofold: first, we propose to challenge some preconceived notions of digital libraries by making repositories more dynamic; and second, through this challenge we aim to promote public engagement and open scholarship.

As a work in progress, we believe that a real challenge lies in investigating the implications that these two points introduce within the context of the humanities.



Do altmetrics work for assessing research quality?

Authors : Andrea Giovanni Nuzzolese, Paolo Ciancarini, Aldo Gangemi, Silvio Peroni, Francesco Poggi, Valentina Presutti

Alternative metrics (aka altmetrics) are gaining increasing interest in the scientometrics community as they can capture both the volume and quality of attention that a research work receives online.

Nevertheless, there is limited knowledge about their effectiveness as a means of measuring research impact compared to traditional citation-based indicators.

This work aims to rigorously investigate whether any correlation exists between traditional indicators (i.e., citation count and h-index) and alternative ones (i.e., altmetrics), and which of them may be effective for evaluating scholars.

The study is based on the analysis of real data coming from the National Scientific Qualification procedure held in Italy by committees of peers on behalf of the Italian Ministry of Education, Universities and Research.
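The correlation analysis described above is typically carried out with a rank-based statistic, since citation counts and altmetric scores are heavily skewed. The sketch below implements Spearman's rank correlation in plain Python; the scholar data is invented for illustration and is not from the paper's dataset.

```python
# Sketch: Spearman rank correlation between a traditional indicator
# (citation count) and an alternative one (aggregate altmetric score).
# The sample data below is hypothetical, for illustration only.

def rank(values):
    """Assign 1-based average ranks, handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # group positions whose values are tied with the group's first value
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of the tied positions, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation computed on the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical scholars: citation counts vs. aggregate altmetric scores.
citations = [120, 45, 300, 10, 87]
altmetric = [40, 60, 95, 3, 30]
rho = spearman(citations, altmetric)  # -> 0.7
```

A rho near 1 would indicate that the two indicators rank scholars similarly; values near zero would suggest they capture different aspects of impact.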


The insoluble problems of books: What does Altmetric.com have to offer?

Authors : Daniel Torres-Salinas, Juan Gorraiz, Nicolas Robinson-Garcia

The purpose of this paper is to analyze the capabilities, functionalities and appropriateness of Altmetric.com as a data source for the bibliometric analysis of books, in comparison to PlumX.

We perform an exploratory analysis of the metrics that the Altmetric Explorer for Institutions platform offers for books. We use two distinct datasets of books, the Book Collection included in Altmetric.com and Clarivate’s Master Book List, to analyze Altmetric.com’s capabilities to download and merge data with external databases.

Finally, we compare our findings with those obtained in a previous study of PlumX. Altmetric.com tracks a set of data sources, combined by DOI identifiers, to retrieve metadata for books, with Google Books as its main provider. It also retrieves information from commercial publishers and from some Open Access initiatives, including those led by university libraries such as Harvard Library.

We find issues with the linkages between records and mentions, as well as ISBN discrepancies. Furthermore, we find that automated bots greatly affect Wikipedia mentions of books. Our comparison with PlumX suggests that neither tool provides a complete picture of the social attention generated by books; they are complementary rather than comparable tools.
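The merging and linkage problems mentioned above come down to joining book records from different sources on a shared identifier. The sketch below shows one way such a merge might look, joining an altmetrics export with an external catalogue by ISBN and setting aside records whose ISBNs do not match; the field names, the `normalise_isbn` helper, and all records are assumptions for illustration, not the paper's actual pipeline.

```python
# Sketch: joining a book-level altmetrics export with an external catalogue
# by ISBN, and flagging records that fail to link. All data is hypothetical.

def normalise_isbn(isbn):
    """Strip hyphens and spaces so ISBN-13 strings compare reliably."""
    return isbn.replace("-", "").replace(" ", "").upper()

def merge_by_isbn(altmetrics_records, catalogue_records):
    """Return (matched, unmatched) after joining the two sources on ISBN."""
    catalogue = {normalise_isbn(r["isbn"]): r for r in catalogue_records}
    matched, unmatched = [], []
    for rec in altmetrics_records:
        key = normalise_isbn(rec["isbn"])
        if key in catalogue:
            # keep catalogue metadata, overlay the altmetrics fields
            matched.append({**catalogue[key], **rec})
        else:
            unmatched.append(rec)  # ISBN discrepancy or missing linkage
    return matched, unmatched

alt = [{"isbn": "978-0-13-468599-1", "mentions": 12},
       {"isbn": "978-1-4919-5702-3", "mentions": 4}]
cat = [{"isbn": "9780134685991", "title": "Some Book"}]
matched, unmatched = merge_by_isbn(alt, cat)
```

In practice the unmatched bucket is where the ISBN discrepancies the authors report would surface, since a single book often carries several ISBNs (hardcover, paperback, e-book) that no simple normalisation can reconcile.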


What increases (social) media attention: Research impact, author prominence or title attractiveness?

Authors : Olga Zagovora, Katrin Weller, Milan Janosov, Claudia Wagner, Isabella Peters

Do only major scientific breakthroughs hit the news and social media, or does a ‘catchy’ title help to attract public attention? How strong is the connection between the importance of a scientific paper and the (social) media attention it receives?

In this study we investigate these questions by analysing the relationship between the observed attention and certain characteristics of scientific papers from two major multidisciplinary journals: Nature Communications (NC) and the Proceedings of the National Academy of Sciences (PNAS).

We describe papers by features based on the linguistic properties of their titles and on centrality measures of their authors in the co-authorship network.

We identify linguistic features and collaboration patterns that might be indicators of future attention and that are characteristic of different journals, research disciplines, and media sources.
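One common way to quantify author prominence of the kind described above is degree centrality in the co-authorship network: the fraction of all other authors a given author has collaborated with. The sketch below computes it in plain Python; the toy paper list is invented for illustration, and degree centrality is only one of several centrality measures such a study might use.

```python
# Sketch: normalised degree centrality of authors in a co-authorship
# network, one plausible "author prominence" feature. Toy data only.

from collections import defaultdict
from itertools import combinations

def coauthor_degree_centrality(papers):
    """Map author -> (distinct co-authors) / (total authors - 1)."""
    neighbours = defaultdict(set)
    for authors in papers:
        # every pair of authors on a paper are linked in the network
        for a, b in combinations(set(authors), 2):
            neighbours[a].add(b)
            neighbours[b].add(a)
    n = len(neighbours)
    return {a: len(nbrs) / (n - 1) for a, nbrs in neighbours.items()}

# Each inner list is the author list of one (hypothetical) paper.
papers = [["A", "B", "C"], ["A", "D"], ["B", "C"]]
centrality = coauthor_degree_centrality(papers)  # A -> 1.0, D -> 1/3
```

An author with centrality 1.0 has co-authored with everyone else in the network; features like this can then be correlated with the media attention a paper receives.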


Do altmetrics assess societal impact in the same way as case studies? An empirical analysis testing the convergent validity of altmetrics based on data from the UK Research Excellence Framework (REF)

Authors : Lutz Bornmann, Robin Haunschild, Jonathan Adams

Altmetrics have been proposed as a way to assess the societal impact of research. Although altmetrics are already in use as impact or attention metrics in different contexts, it is still not clear whether they really capture or reflect societal impact.

This study is based on altmetrics, citation counts, research output and case study data from the UK Research Excellence Framework (REF), and peers’ REF assessments of research output and societal impact. We investigated the convergent validity of altmetrics by using two REF datasets: publications submitted as research output (PRO) to the REF and publications referenced in case studies (PCS).

Case studies, which are intended to demonstrate societal impact, should cite the most relevant research papers. To assess impact, we used the MHq’ indicator, an indicator introduced for count data with many zeros.

The results of the first part of the analysis show that news media as well as mentions on Facebook, in blogs, in Wikipedia, and in policy-related documents have higher MHq’ values for PCS than for PRO.

Thus, the altmetric indicators seem to have convergent validity for these data. In the second part of the analysis, altmetrics have been correlated with REF reviewers’ average scores on PCS. The negative or close to zero correlations question the convergent validity of altmetrics in that context.

We suggest that they may capture a different aspect of societal impact (which might be called unknown attention) from that seen by reviewers (who are interested in the causal link between research and action in society).


Social media metrics for new research evaluation

Authors : Paul Wouters, Zohreh Zahedi, Rodrigo Costas

This chapter approaches, from both a theoretical and a practical perspective, the most important principles and conceptual frameworks that can be considered in the application of social media metrics to scientific evaluation.

We propose conceptually valid uses for social media metrics in research evaluation. The chapter discusses frameworks and uses of these metrics as well as principles and recommendations for the consideration and application of current (and potentially new) metrics in research evaluation.
