Revisiting an open access monograph experiment: measuring citations and tweets 5 years later

Author : Ronald Snijder

An experiment run in 2009 could not assess whether making monographs available in open access enhanced scholarly impact. This paper revisits the experiment, drawing on additional citation data and tweets. It attempts to answer the following research question: does open access have a positive influence on the number of citations and tweets a monograph receives, taking into account the influence of scholarly field and language?

The correlation between monograph citations and tweets is also investigated. The numbers of citations and tweets measured in 2014 reveal a slight open access advantage, but the influence of language and subject must also be taken into account. However, Twitter usage and citation behaviour hardly overlap.

URL : Revisiting an open access monograph experiment

To what extent does the Leiden Manifesto also apply to altmetrics? A discussion of the manifesto against the background of research into altmetrics

Authors : Lutz Bornmann, Robin Haunschild


Hicks, Wouters, Waltman, de Rijcke, and Rafols (2015) formulated the so-called Leiden Manifesto, in which they assembled ten principles for the meaningful evaluation of research on the basis of bibliometric data.


This work attempts to show the relevance of the Leiden Manifesto for altmetrics.


As the discussion of the ten principles against the background of research into altmetrics shows, the principles are also highly relevant to altmetrics and should be taken into account when altmetrics are applied.


Altmetrics are already frequently used in research evaluation, so it is important that users of altmetrics data are aware of the Leiden Manifesto's relevance in this area as well.



Tracking the Digital Footprints to Scholarly Articles from Social Media

Authors : Xianwen Wang, Zhichao Fang, Xinhui Guo

Scholarly articles are discussed and shared on social media, which generates altmetrics. Conversely, what impact does social media have on the dissemination of scholarly articles, how can that impact be measured, and what are the visiting patterns?

To address these questions, this study explores the dynamic visiting patterns driven by social media and examines the effects of social buzz on article visits.

Using unique real referral data for 110 scholarly articles, updated daily over a 90-day period, this paper proposes a novel method of analysis. We find that visits from social media accumulate quickly but decay rapidly.

Twitter and Facebook are the two most important social referrers directing people to scholarly articles; they contribute roughly equally and together account for over 95% of all social-referral visits.

Tweets and the visits they generate are synchronized. Social media and open access play important roles in disseminating scholarly articles and promoting public understanding of science, which this study confirms quantitatively for the first time with real data.


Can we use altmetrics at the institutional level? A case study analysing the coverage by research areas of four Spanish universities

Authors : Daniel Torres-Salinas, Nicolas Robinson-Garcia, Evaristo Jiménez-Contreras

Social media-based indicators, or altmetrics, have been under scrutiny for the last seven years. Their promise as alternative metrics for measuring scholarly impact is still far from becoming a reality.

Up to now, most studies have focused on the understanding of the nature and relation of altmetric indicators with citation data. Few papers have analysed research profiles based on altmetric data.

Most of these have related to researcher profiles and the uptake of these tools among researchers. This paper aims to explore the coverage of the database and its potential use for showing universities' research profiles in relation to other databases.

We analyse a sample of four different Spanish universities. First, we observe a low coverage of altmetric indicators, with only 36 percent of all documents retrieved from the Web of Science having an ‘altmetric’ score. Second, we observe that, for the four universities analysed, the area of Science shows higher ‘altmetric’ scores than the rest of the research areas.

Finally, considering the low coverage of altmetric data at the institutional level, it could be useful for research policy makers to develop guidelines and best-practice guides to ensure that researchers adequately disseminate their research findings through social media.

URL : Can we use altmetrics at the institutional level? A case study analysing the coverage by research areas of four Spanish universities

Measuring Book Impact Based on the Multi-granularity Online Review Mining

As with articles and journals, the customary methods for measuring books’ academic impact mainly involve citations, which are easy to obtain but limited to traditional citation databases and scholarly book reviews. Researchers have therefore attempted to use other metrics, such as Google Books, libcitations, and publisher prestige.

However, these approaches lack content-level information and cannot determine the citation intentions of users. Meanwhile, the abundant online review resources concerning academic books can be used to mine deeper information and content utilizing altmetric perspectives.

In this study, we measure the impact of academic books by multi-granularity mining of online reviews, and we identify factors that affect a book’s impact. First, online reviews of a sample of academic books are crawled and processed.

Then, multi-granularity review mining is conducted to identify review sentiment polarities and aspect sentiment values. Lastly, the numbers of positive and negative reviews, aspect sentiment values, star values, and helpfulness information are integrated via the entropy method, yielding the final book impact scores.
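The entropy method referred to here is a standard objective-weighting scheme: indicators whose values vary more across books are treated as more informative and receive larger weights. A minimal sketch (with invented indicator values, not the authors' data) might look like:

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: indicator columns whose values vary more
    across rows (books) carry more information and get larger weights.
    X is a (n_books, n_indicators) matrix of positive values."""
    P = X / X.sum(axis=0, keepdims=True)      # per-indicator proportions
    logP = np.where(P > 0, np.log(P), 0.0)    # treat 0 * log(0) as 0
    entropy = -(P * logP).sum(axis=0) / np.log(X.shape[0])
    d = 1.0 - entropy                         # degree of diversification
    return d / d.sum()                        # normalized weights

# Hypothetical indicators for 4 books: positive reviews, negative
# reviews, average star value, helpfulness votes. (In practice a
# cost-type indicator such as negative reviews would be inverted
# before weighting.)
X = np.array([
    [120, 10, 4.5, 30],
    [ 40, 25, 3.8, 12],
    [200,  5, 4.9, 55],
    [ 15, 30, 3.1,  4],
], dtype=float)

w = entropy_weights(X)
impact = (X / X.max(axis=0)) @ w  # weighted aggregate as an impact score
```

The weighting is purely data-driven: no analyst judgement is needed, which is one common reason for choosing the entropy method when combining heterogeneous indicators.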

The results of a correlation analysis of book impact scores obtained via our method versus traditional book citations show that, although there are substantial differences between subject areas, online book reviews tend to reflect academic impact.

Thus, we infer that online reviews are a promising source for mining book impact from an altmetric perspective and at the multi-granularity content level. Moreover, the proposed method might also be used to measure books beyond academic publications.


Altmetrics of “altmetrics” using Google Scholar, Twitter, Mendeley, Facebook, Google-plus, CiteULike, Blogs and Wiki

We measure the impact of the “altmetrics” field by deploying altmetrics indicators using data from Google Scholar, Twitter, Mendeley, Facebook, Google-plus, CiteULike, blogs, and wikis during 2010–2014.

To capture the social impact of scientific publications, we propose an index called the alt-index, analogous to the h-index. Across the deployed indices, our results show a high correlation among the indicators that capture social impact.
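The abstract describes the alt-index only as an h-index analogue computed over social citations; a minimal sketch of that style of computation (the counts below are invented for illustration) could be:

```python
def h_style_index(counts):
    """Largest h such that at least h items each have at least h
    counts. Applied to social citation counts per publication, this
    yields an h-index-style "alt-index"."""
    h = 0
    # Walk the counts in descending order; stop once the i-th largest
    # count falls below its rank i.
    for i, c in enumerate(sorted(counts, reverse=True), start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Hypothetical social citation counts (e.g. tweets plus Mendeley
# readers) for seven publications:
h_style_index([25, 8, 5, 3, 3, 1, 0])  # -> 3
```

Here three publications each have at least three social citations, but not four with at least four, so the index is 3.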

While we observe a moderate Pearson correlation (ρ = .247) between the alt-index and the h-index, a relatively high correlation is observed between social citations and scholarly citations (ρ = .646). Interestingly, we find a high turnover of social citations in the field compared with traditional scholarly citations, i.e. there are 42.2% more social citations than traditional citations.

Social media such as Twitter and Mendeley appear to be the most effective channels of social impact, followed by Facebook and Google-plus. Overall, altmetrics appear to work well in the field of “altmetrics”.


Research data explored: an extended analysis of citations and altmetrics

In this study, we explore the citedness of research data, its distribution over time and its relation to the availability of a digital object identifier (DOI) in the Thomson Reuters database Data Citation Index (DCI).

We investigate if cited research data “impacts” the (social) web, reflected by altmetrics scores, and if there is any relationship between the number of citations and the sum of altmetrics scores from various social media platforms.
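A citation-versus-altmetrics correlation of the kind examined here can be sketched with a simple Pearson coefficient over paired counts (the vectors below are hypothetical, not the study's data):

```python
import numpy as np

# Hypothetical paired counts per data set: citations and the summed
# altmetrics scores across social media platforms.
citations = np.array([0, 0, 2, 5, 1, 0, 3, 0, 0, 4], dtype=float)
altmetrics = np.array([1, 0, 0, 2, 0, 3, 0, 0, 1, 0], dtype=float)

# Pearson correlation coefficient between the two count vectors
r = np.corrcoef(citations, altmetrics)[0, 1]
```

For heavily skewed count data such as citations, a rank-based coefficient (Spearman) is often preferred over Pearson, since a few highly cited items can dominate the linear correlation.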

Three tools are used to collect altmetrics scores, among them PlumX and ImpactStory, and the corresponding results are compared. We found that, of the three altmetrics tools, PlumX has the best coverage. Our experiments revealed that research data remain mostly uncited (about 85%), although there has been an increase in citations of data sets published since 2008.

The percentage of cited research data with a DOI in the DCI has decreased in recent years. Only nine repositories account for research data with DOIs and two or more citations. The number of cited research data with altmetrics “footprints” is even lower (4–9%) but shows a higher coverage of research data from the last decade. In our study, we also found no correlation between the number of citations and the total altmetrics scores.

Yet, certain data types (i.e. survey, aggregate data, and sequence data) are more often cited and also receive higher altmetrics scores. Additionally, we performed citation and altmetric analyses of all research data published between 2011 and 2013 in four different disciplines covered by the DCI.

In general, these results correspond very well with the ones obtained for research data cited at least twice and also show low numbers in citations and in altmetrics. Finally, we observed that there are disciplinary differences in the availability and extent of altmetrics scores.