A review of the literature on citation impact indicators

Citation impact indicators nowadays play an important role in research evaluation, and consequently these indicators have received a lot of attention in the bibliometric and scientometric literature. This paper provides an in-depth review of the literature on citation impact indicators. First, an overview is given of the literature on bibliographic databases that can be used to calculate citation impact indicators (Web of Science, Scopus, and Google Scholar).

Next, selected topics in the literature on citation impact indicators are reviewed in detail. The first topic is the selection of publications and citations to be included in the calculation of citation impact indicators. The second topic is the normalization of citation impact indicators, in particular normalization for field differences.

Counting methods for dealing with co-authored publications are the third topic, and citation impact indicators for journals are the last topic. The paper concludes by offering some recommendations for future research.

URL : http://arxiv.org/abs/1507.02099
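The normalization and counting-method topics surveyed above can be illustrated with a small sketch. This is illustrative only: the field labels, citation counts, and author counts are made up, and the scheme shown (a mean-normalized citation score with fractional author counting) is just one common approach, not the specific indicators the review discusses.

```python
from collections import defaultdict

# Hypothetical publication records: (field, citations, n_authors).
publications = [
    ("physics", 12, 3),
    ("physics", 4, 2),
    ("history", 3, 1),
    ("history", 1, 2),
]

# Step 1: mean citations per field (the normalization baseline).
field_totals = defaultdict(lambda: [0, 0])  # field -> [citation sum, paper count]
for field, cites, _ in publications:
    field_totals[field][0] += cites
    field_totals[field][1] += 1
field_means = {f: s / n for f, (s, n) in field_totals.items()}

# Step 2: field-normalized score per paper, plus fractional author credit
# (each of n co-authors receives 1/n of the paper's normalized score).
for field, cites, n_authors in publications:
    normalized = cites / field_means[field]  # 1.0 == field average
    per_author = normalized / n_authors      # fractional counting
    print(field, round(normalized, 2), round(per_author, 2))
```

The point of the normalization step is that a raw citation count of 3 can be above average in one field (history here) while 4 is below average in another (physics), which is why field differences must be corrected before comparison.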

Open Access Meets Discoverability: Citations to Articles Posted to Academia.edu

Using matching and regression analyses, we measure the difference in citations between articles posted to Academia.edu and other articles from similar journals, controlling for field, impact factor, and other variables. Based on a sample size of 31,216 papers, we find that a paper in a median impact factor journal uploaded to Academia.edu receives 16% more citations after one year than a similar article not available online, 51% more citations after three years, and 69% after five years. We also found that articles posted to Academia.edu had 58% more citations after five years than articles only posted to other online venues, such as personal and departmental home pages.


DOI : 10.1371/journal.pone.0148257

Meaningful Metrics: A 21st Century Librarian’s Guide to Bibliometrics, Altmetrics, and Research Impact

What does it mean to have meaningful metrics in today’s complex higher education landscape? With a foreword by Heather Piwowar and Jason Priem, this highly engaging and activity-laden book serves to introduce readers to the fast-paced world of research metrics from the unique perspective of academic librarians and LIS practitioners.

Starting with the essential histories of bibliometrics and altmetrics, and continuing with in-depth descriptions of the core tools and emerging issues at stake in the future of both fields, Meaningful Metrics is a convenient all-in-one resource that is designed to be used by a range of readers, from those with little to no background on the subject to those looking to become movers and shakers in the current scholarly metrics movement. Authors Borchardt and Roemer offer tips, tricks, and real-world examples that illustrate how librarians can support the successful adoption of research metrics, whether in their institutions or across academia as a whole.

URL : http://www.ala.org/acrl/sites/ala.org.acrl/files/content/publications/booksanddigitalresources/digital/9780838987568_metrics_OA.pdf

Tweets Do Measure Non-Citational Intellectual Impact

Purpose

The aim of the paper is to identify the motives behind social media indicators, focusing on tweets, and to determine what tweets measure or indicate based on these motives.

Design/methodology/approach

Documents with non-zero tweets were manually collected from five journals (Nature Biotechnology, Nature Nanotechnology, Nature Physics, Nature Chemistry, and Nature Communications) for the period January 2014 – October 2014, so as to depict the contemporary trend, as tweets tend to follow an L-shaped curve in their time-wise distribution.

Findings

Investigations suggest that the motives behind tweets are research reach, research acceptance, and research usage. Further analysis revealed that the motive behind self-tweets is research visibility, which is one of the attributes of social media; self-tweets may therefore not be as serious a problem as expected, given that documents are self-tweeted no more than once in most cases.

Furthermore, identifying and classifying tweets by user type (publishers, frequent tweeters who apparently tweet all documents of an issue, and authors) will increase the effectiveness of altmetrics in research evaluation. It was also found that associations between subjects can be identified by analyzing tweet patterns across subjects.

Originality/value

The study proposes an overall hierarchical structure of impact based on the change/advancement instigated. It confirms that tweets do measure non-academic intellectual impact that is not captured by traditional metrics.

URL : http://www.itlit.net/v2n2art2.pdf

Wikiometrics: A Wikipedia Based Ranking System

We present a new concept – Wikiometrics – the derivation of metrics and indicators from Wikipedia. Wikipedia provides an accurate representation of the real world due to its size, structure, editing policy, and popularity. We demonstrate an innovative mining methodology, where different elements of Wikipedia – content, structure, editorial actions, and reader reviews – are used to rank items in a manner that is by no means inferior to rankings produced by experts or other methods. We test our proposed method by applying it to two real-world ranking problems: top world universities and academic journals. Our proposed ranking methods were compared to leading and widely accepted benchmarks and were found to correlate strongly with them, with the advantage that the underlying data are publicly available.

URL : http://arxiv.org/abs/1601.01058

Evaluating the Impact of Open Access at Berkeley: Results from the 2015 Survey of Berkeley Research Impact Initiative (BRII) Funding Recipients

The Berkeley Research Impact Initiative (BRII) was one of the first campus-based open access (OA) funds to be established in North America and one of the most active, distributing more than $244,000 to support University of California (UC) Berkeley authors. In April 2015, we conducted a qualitative study of 138 individuals who had received BRII funding to survey their opinions about the benefits and funding of open access.

Most respondents believe their articles had a greater impact as open access, expect to tap multiple sources to fund open access fees, and support the UC Open Access Policy and its goal of making research public and accessible. Results of the survey and a discussion of their impact on the BRII program follow.

URL : http://crl.acrl.org/content/early/2015/11/05/crl15-824.short


Examining the Impact of the National Institutes of Health Public Access Policy on the Citation Rates of Journal Articles

Purpose

To examine whether National Institutes of Health (NIH) funded articles that were archived in PubMed Central (PMC) after the release of the 2008 NIH Public Access Policy show greater scholarly impact than comparable articles not archived in PMC.

Methods

A list of journals across several subject areas was developed from which to collect article citation data. Citation information and cited reference counts of the articles published in 2006 and 2009 from 122 journals were obtained from the Scopus database. The articles were separated into categories of NIH funded, non-NIH funded and whether they were deposited in PubMed Central. An analysis of citation data across a five-year timespan was performed on this set of articles.

Results

A total of 45,716 articles were examined, including 7,960 with NIH funding. An analysis of the number of times these articles were cited found that NIH-funded 2006 articles in PMC were not cited significantly more than NIH-funded non-PMC articles. However, NIH-funded 2009 articles in PMC were cited 26% more than NIH-funded 2009 articles not in PMC, five years after publication. This result is highly significant even after controlling for journal (as a proxy for article quality and topic).

Conclusion

Our analysis suggests that factors occurring between 2006 and 2009 produced a subsequent boost in the scholarly impact of articles deposited in PubMed Central. The 2008 Public Access Policy is likely to be one such factor, but others may have contributed as well (e.g., the growing size and visibility of PMC, the increasing availability of full-text linkouts from PubMed, and the indexing of PMC articles by Google Scholar).


DOI : 10.1371/journal.pone.0139951