A simple proposal for the publication of journal citation distributions

Authors : Vincent Larivière, Véronique Kiermer, Catriona J. MacCallum, Marcia McNutt, Mark Patterson, Bernd Pulverer, Sowmya Swaminathan, Stuart Taylor, Stephen Curry

Although the Journal Impact Factor (JIF) is widely acknowledged to be a poor indicator of the quality of individual papers, it is used routinely to evaluate research and researchers. Here, we present a simple method for generating the citation distributions that underlie JIFs.

Application of this straightforward protocol reveals the full extent of the skew of distributions and variation in citations received by published papers that is characteristic of all scientific journals.
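
The skew the protocol reveals is easy to demonstrate with synthetic data: a mean-based indicator such as the JIF sits well above the median of a heavy-tailed citation distribution. A minimal sketch, using log-normal citation counts purely as an illustrative stand-in (not the authors' protocol or data):

```python
import random

random.seed(42)

# Simulate citation counts for one journal's papers over a JIF-like window.
# A log-normal is a common stand-in for heavy-tailed citation data.
citations = [int(random.lognormvariate(1.0, 1.2)) for _ in range(1000)]

# The JIF is (approximately) the mean number of citations per citable item.
jif = sum(citations) / len(citations)
median = sorted(citations)[len(citations) // 2]

# A crude text summary of the distribution's shape: most papers sit far
# below the mean, while a few highly cited papers pull the mean upward.
for upper in (1, 2, 5, 10, 20, 50):
    share = sum(1 for c in citations if c <= upper) / len(citations)
    print(f"<= {upper:2d} citations: {share:5.1%}")

print(f"mean (JIF-like): {jif:.2f}, median: {median}")
```

Because the distribution is skewed, the mean exceeds the median, which is exactly why a single JIF value says little about any individual paper.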

Although there are differences among journals across the spectrum of JIFs, the citation distributions overlap extensively, demonstrating that the citation performance of individual papers cannot be inferred from the JIF.

We propose that this methodology be adopted by all journals as a move to greater transparency, one that should help to refocus attention on individual pieces of work and counter the inappropriate usage of JIFs during the process of research assessment.

URL : A simple proposal for the publication of journal citation distributions

Alternative location : http://www.biorxiv.org/content/early/2016/07/05/062109.abstract

Open access publishing trend analysis: statistics beyond the perception

Authors : Elisabetta Poltronieri, Elena Bravo, Moreno Curti, Maurizio Ferri, Cristina Mancini

The purpose of this analysis was twofold: to track the number of open access journals acquiring an impact factor, and to investigate the distribution of subject categories pertaining to these journals. As a case study, journals in which researchers of the Italian National Institute of Health (Istituto Superiore di Sanità) have published were surveyed.

Data were collected by searching open access journals listed in the Directory of Open Access Journals (DOAJ) and then comparing them with those having an impact factor as tracked by the Journal Citation Reports for the years 2010-2012. Journal Citation Reports subject categories were matched with Medical Subject Headings to provide a larger content classification.

A survey was performed to determine which Directory journals matched the Journal Citation Reports list, and their inclusion in a given subject area.

In the years 2010-2012, an increase in the number of journals was observed for the Journal Citation Reports (+4.93%) and for the Directory (+18.51%). The discipline showing the highest increment was medicine (315 occurrences, 26%).

From 2010 to 2012, the number of open access journals with an impact factor gradually rose, with a prevalence of journals relating to medicine and the biological sciences, suggesting that authors prefer to publish in open access journals more than before.

URL : http://www.informationr.net/ir/21-2/paper712.html

Can Scientific Impact Be Predicted?

Authors : Yuxiao Dong, Reid A. Johnson, Nitesh V. Chawla

A widely used measure of scientific impact is citations. However, due to their heavy-tailed distribution, citations are fundamentally difficult to predict.

Instead, to characterize scientific impact, we address two analogous questions asked by many scientific researchers: "How will my h-index evolve over time, and which of my previously or newly published papers will contribute to it?" To answer these questions, we perform two related tasks. First, we develop a model to predict authors' future h-indices based on their current scientific impact. Second, we examine the factors that drive papers, either previously or newly published, to increase their authors' predicted future h-indices.
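
The h-index that the model predicts is itself simple to compute from a citation record: it is the largest h such that h of an author's papers each have at least h citations. A small helper with hypothetical citation counts (this is the standard definition, not the authors' prediction model):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank  # the rank-th best paper still has >= rank citations
        else:
            break
    return h

# Example: an author with five papers cited 10, 8, 5, 4 and 3 times.
print(h_index([10, 8, 5, 4, 3]))  # four papers have at least 4 citations
```

Predicting the future h-index then amounts to forecasting how this quantity evolves as papers accumulate citations.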

By leveraging relevant factors, we can predict an author's h-index in five years with an R² value of 0.92, and whether a previously (newly) published paper will contribute to this future h-index with an F1 score of 0.99 (0.77).

We find that topical authority and publication venue are crucial to these effective predictions, while topic popularity is surprisingly inconsequential. Further, we develop an online tool that allows users to generate informed h-index predictions.

Our work demonstrates the predictability of scientific impact, and can help scholars to effectively leverage their position of "standing on the shoulders of giants."

URL : https://arxiv.org/abs/1606.05905

Are Wikipedia Citations Important Evidence of the Impact of Scholarly Articles and Books?

Authors : Kayvan Kousha, Mike Thelwall

Individual academics and research evaluators often need to assess the value of published research. Whilst citation counts are a recognised indicator of scholarly impact, alternative data is needed to provide evidence of other types of impact, including within education and wider society.

Wikipedia is a logical choice for both of these because the role of a general encyclopaedia is to be an understandable repository of facts about a diverse array of topics and hence it may cite research to support its claims.

To test whether Wikipedia could provide new evidence about the impact of scholarly research, this article counted citations to 302,328 articles and 18,735 monographs in English indexed by Scopus in the period 2005 to 2012.

The results show that citations from Wikipedia to articles are too rare for most research evaluation purposes, with only 5% of articles being cited in all fields. In contrast, a third of monographs have at least one citation from Wikipedia, with the most in the arts and humanities.

Hence, Wikipedia citations can provide extra impact evidence for academic monographs. Nevertheless, the results may be relatively easily manipulated and so Wikipedia is not recommended for evaluations affecting stakeholder interests.

URL : http://www.scit.wlv.ac.uk/~cm1993/papers/WikipediaCitations.pdf

Quantifying the changing role of past publications

Our current societies increasingly rely on electronic repositories of collective knowledge. An archetype of these databases is the Web of Science (WoS), which stores scientific publications. In contrast to several other forms of knowledge, such as Wikipedia articles, a scientific paper does not change after its "birth".

Nonetheless, from the moment a paper is published it exists within the evolving web of other papers, thus, its actual meaning to the reader changes.

To track how scientific ideas (represented by groups of scientific papers) appear and evolve, we apply a novel combination of algorithms explicitly allowing for papers to change their groups. We (i) identify the overlapping clusters of the undirected yearly co-citation networks of the WoS (1975-2008) and (ii) match these yearly clusters (groups) to form group timelines.
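
The co-citation networks used in step (i) link two references whenever the same later paper cites both of them. A minimal sketch of building such weighted links from toy reference lists (illustrative data only, not the WoS corpus or the authors' clustering algorithm):

```python
from itertools import combinations
from collections import Counter

# Each citing paper maps to the references it cites (toy data).
reference_lists = {
    "paperA": ["P1", "P2", "P3"],
    "paperB": ["P2", "P3"],
    "paperC": ["P1", "P3"],
}

# Edge weight between two references = how many papers cite both.
cocitation = Counter()
for refs in reference_lists.values():
    for pair in combinations(sorted(set(refs)), 2):
        cocitation[pair] += 1

print(dict(cocitation))
# P1-P3 and P2-P3 are each co-cited twice; P1-P2 only once.
```

Clustering this weighted, undirected graph year by year, and then matching the clusters across years, yields the group timelines the authors describe.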

After visualizing the longest-lived groups of the entire data set, we assign topic labels to the groups. We find that, across the entire Web of Science, multidisciplinarity is clearly over-represented among cutting-edge ideas. In addition, we provide detailed examples of papers that (i) change their topic labels and (ii) move between groups.

URL : http://arxiv.org/abs/1605.00509

The academic, economic and societal impacts of Open Access: an evidence-based review

Ongoing debates surrounding Open Access to the scholarly literature are multifaceted and complicated by disparate and often polarised viewpoints from engaged stakeholders. At the current stage, Open Access has become such a global issue that it is critical for all involved in scholarly publishing, including policymakers, publishers, research funders, governments, learned societies, librarians, and academic communities, to be well-informed on the history, benefits, and pitfalls of Open Access.

In spite of this, there is a general lack of consensus regarding the advantages or disadvantages of Open Access at multiple levels. This review aims to be a resource for current knowledge on the impacts of Open Access by synthesizing important research in three major areas of impact: academic, economic, and societal.

While there is clearly much scope for additional research, several key trends are identified, including a broad citation advantage for researchers who publish openly, as well as additional benefits to the non-academic dissemination of their work.

The economic case for Open Access is less well-understood, although it is clear that access to the research literature is key for innovative enterprises, and a range of governmental and non-governmental services.

Furthermore, Open Access has the potential to save publishers and research funders considerable amounts of financial resources. The social case for Open Access is strong, in particular for advancing citizen science initiatives, and leveling the playing field for researchers in developing countries.

Open Access supersedes all potential alternative modes of access to the scholarly literature by enabling unrestricted re-use and long-term stability, independent of the financial constraints of traditional publishers that impede knowledge sharing.

Open Access remains only one of the multiple challenges that the scholarly publishing system is currently facing. Yet, it provides one foundation for increasing engagement with researchers regarding ethical standards of publishing.

We recommend that Open Access supporters focus their efforts on working to establish viable new models and systems of scholarly communication, rather than trying to undermine the existing ones as part of the natural evolution of the scholarly ecosystem. Based on this, future research should investigate the wider impacts of an ecosystem-wide transformation to a system of Open Research.

URL : The academic, economic and societal impacts of Open Access: an evidence-based review

Alternative location : http://f1000research.com/articles/5-632/v1

A review of the literature on citation impact indicators

Citation impact indicators nowadays play an important role in research evaluation, and consequently these indicators have received a lot of attention in the bibliometric and scientometric literature. This paper provides an in-depth review of the literature on citation impact indicators. First, an overview is given of the literature on bibliographic databases that can be used to calculate citation impact indicators (Web of Science, Scopus, and Google Scholar).

Next, selected topics in the literature on citation impact indicators are reviewed in detail. The first topic is the selection of publications and citations to be included in the calculation of citation impact indicators. The second topic is the normalization of citation impact indicators, in particular normalization for field differences.
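
Normalization for field differences, the second topic above, is commonly done by dividing a paper's citation count by the average for papers in the same field (and publication year). A minimal sketch of such a mean-normalized score, in the style of the well-known MNCS indicator; the data and field baselines here are illustrative assumptions, not taken from the review:

```python
from collections import defaultdict

# (field, citation count) for a set of papers: toy data.
papers = [
    ("biology", 30), ("biology", 10), ("biology", 20),
    ("math", 3), ("math", 1), ("math", 2),
]

# Field baseline: mean citations per paper within each field.
by_field = defaultdict(list)
for field, cites in papers:
    by_field[field].append(cites)
baseline = {f: sum(cs) / len(cs) for f, cs in by_field.items()}

# Normalized score: citations relative to the field average. A biology
# paper with 30 citations and a math paper with 3 then score equally,
# because each is 1.5x its own field's mean.
normalized = [(f, c / baseline[f]) for f, c in papers]
print(baseline)
print(normalized)
```

This is why field-normalized indicators allow comparisons across disciplines with very different citation practices, which raw counts do not.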

Counting methods for dealing with co-authored publications are the third topic, and citation impact indicators for journals are the last topic. The paper concludes by offering some recommendations for future research.

URL : http://arxiv.org/abs/1507.02099