Use of the journal impact factor for assessing individual articles need not be statistically wrong

Authors : Ludo Waltman, Vincent A. Traag

Most scientometricians reject the use of the journal impact factor for assessing individual articles and their authors. The well-known San Francisco Declaration on Research Assessment also strongly objects to this way of using the impact factor.

Arguments against the use of the impact factor at the level of individual articles are often based on statistical considerations. The skewness of journal citation distributions typically plays a central role in these arguments.

We present a theoretical analysis of statistical arguments against the use of the impact factor at the level of individual articles. Our analysis shows that these arguments do not support the conclusion that the impact factor should not be used for assessing individual articles.

In fact, our computer simulations demonstrate the possibility that the impact factor is a more accurate indicator of the value of an article than the number of citations the article has received.
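To illustrate the intuition, here is a minimal simulation sketch in the spirit of that argument, not the authors' actual model: when articles within a journal are similar in underlying value and citation counts are very noisy, the journal's mean citation rate can track an article's value more closely than the article's own citation count. All distributions and parameters below are illustrative assumptions.

```python
# Illustrative sketch (not the authors' simulation): article values scatter
# narrowly around a journal quality level, while citation counts scatter widely
# around article values. Under these assumed parameters, the journal impact
# factor typically correlates more strongly with article value than the
# article's own citation count does.
import numpy as np

rng = np.random.default_rng(42)
n_journals, n_articles = 200, 50

journal_quality = rng.lognormal(mean=1.5, sigma=0.6, size=(n_journals, 1))
article_value = journal_quality * rng.lognormal(0, 0.2, size=(n_journals, n_articles))
citations = rng.poisson(article_value * rng.lognormal(0, 0.8, size=article_value.shape))

# Journal impact factor: the mean citation count of a journal's articles,
# assigned to every article in that journal.
jif = np.repeat(citations.mean(axis=1, keepdims=True), n_articles, axis=1)

v, c, j = article_value.ravel(), citations.ravel(), jif.ravel()
print(f"corr(article value, own citations): {np.corrcoef(v, c)[0, 1]:.3f}")
print(f"corr(article value, journal JIF):   {np.corrcoef(v, j)[0, 1]:.3f}")
```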

It is important to critically discuss the dominant role of the impact factor in research evaluations, but the discussion should not be based on misplaced statistical arguments. Instead, the primary focus should be on the socio-technical implications of the use of the impact factor.

DOI : https://doi.org/10.12688/f1000research.23418.1

Rethinking the Journal Impact Factor and Publishing in the Digital Age

Authors : Mark S. Nestor, Daniel Fischer, David Arnold, Brian Berman, James Q. Del Rosso

Clinical and experimental literature search has changed significantly over the past few decades, and with it, the way in which we value information. Today, our need for immediate access to relevant and specific literature, regardless of specialty, has led to a growing demand for open access to publications.

The Journal Impact Factor (JIF) has been a long-time standard for representing the quality or “prestige” of a journal, but it appears to be losing its relevance. Here, we define the JIF and deconstruct its validity as a modern measure of a journal’s quality, discuss the current models of academic publication, including their advantages and shortcomings, and discuss the benefits and shortcomings of a variety of open-access models, including costs to the author.

We have quantified a nonsubscribed physician’s access to full articles associated with dermatologic disease and aesthetics cited on PubMed. For some of the most common dermatology conditions, 23.1% of citations (ranging from 17.2% for melasma to 31.9% for malignant melanoma) were available as free full articles, and for aesthetic procedures, 18.9% of citations (ranging from 11.9% for laser hair removal to 27.9% for botulinum toxin) were available as free full articles.
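As a rough illustration of the kind of tally behind such percentages, and not the authors' actual workflow, the following sketch computes, for each condition, the share of PubMed citations flagged as having free full text. The file and column names are hypothetical.

```python
# Hypothetical citation table: one row per PubMed citation, with a boolean
# flag indicating whether a free full-text version was available.
import pandas as pd

citations = pd.read_csv("pubmed_citations.csv")  # hypothetical extract

free_share = (
    citations.groupby("condition")["free_full_text"]
    .mean()          # fraction of citations with free full text per condition
    .mul(100)
    .round(1)
    .sort_values()
)
print(free_share)  # e.g. melasma ~17.2, malignant melanoma ~31.9
```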

Finally, we discuss existing alternative metrics for measuring journal impact and propose the adoption of a superior publishing model, one that satisfies modern day standards of scholarly knowledge pursuit and dissemination of scholarly publications for dermatology and all of medical science.

URL : https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7028381/

Science through Wikipedia: A novel representation of open knowledge through co-citation networks

Authors : Wenceslao Arroyo-Machado, Daniel Torres-Salinas, Enrique Herrera-Viedma, Esteban Romero-Frías

This study provides an overview of science from the Wikipedia perspective. A methodology has been established for the analysis of how Wikipedia editors regard science through their references to scientific papers.

The co-citation method has been adapted to this context to generate Pathfinder networks (PFNET) that highlight the most relevant scientific journals and categories and their interactions, in order to find out how scientific literature is consumed through this open encyclopaedia.

In addition, their obsolescence has been studied through the Price index. A total of 1,433,457 references, available at this http URL, were initially taken into account. After pre-processing and linking them to data from Elsevier’s CiteScore Metrics, the sample was reduced to 847,512 references made by 193,802 Wikipedia articles to 598,746 scientific articles belonging to 14,149 journals indexed in Scopus.
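To make the two measures concrete, here is a small sketch of how journal co-citation counts and the Price index (the share of references at most five years old at the time of citation) could be computed from a reference table. It is not the authors' pipeline, it omits the Pathfinder (PFNET) pruning step, and the field names are hypothetical.

```python
# Sketch: co-citation counts between journals cited by the same Wikipedia
# article, plus the Price index over all references. Data are illustrative.
from collections import Counter
from itertools import combinations
import pandas as pd

refs = pd.DataFrame({
    "wiki_article": ["Cell", "Cell", "Cell", "DNA", "DNA"],
    "journal":      ["Nature", "Science", "Nature", "Cell", "Nature"],
    "pub_year":     [2018, 2002, 2016, 1999, 2017],
    "cite_year":    [2019, 2019, 2019, 2019, 2019],
})

# Price index: fraction of references published within 5 years of being cited.
price_index = ((refs["cite_year"] - refs["pub_year"]) <= 5).mean()

# Journal co-citation: count journal pairs cited by the same Wikipedia article.
cocitations = Counter()
for _, group in refs.groupby("wiki_article"):
    for a, b in combinations(sorted(set(group["journal"])), 2):
        cocitations[(a, b)] += 1

print(f"Price index: {price_index:.2f}")
print("Co-citation pairs:", dict(cocitations))
```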

Among the highlighted results, we found a significant presence of “Medicine” and “Biochemistry, Genetics and Molecular Biology” papers, and that the most important journals are multidisciplinary in nature, which also suggests that high-impact-factor journals are more likely to be cited. Furthermore, only 13.44% of Wikipedia citations are to Open Access journals.

URL : https://arxiv.org/abs/2002.04347

Envisioning the scientific paper of the future

Authors : Natalie M. Sopinka, Laura E. Coristine, Maria C. DeRosa, Chelsea M. Rochman, Brian L. Owens, Steven J. Cooke

Consider for a moment the rate of advancement in the scientific understanding of DNA. It is formidable: from Friedrich Miescher’s nuclein extraction in the 1860s to Rosalind Franklin’s double-helix X-ray in the 1950s to revolutionary next-generation sequencing in the late 2000s.

Now consider the scientific paper, the medium used to describe and publish these advances. How is the scientific paper advancing to meet the needs of those who generate and use scientific information?

We review four essential qualities for the scientific paper of the future: it (i) remains a robust, peer-reviewed source of trustworthy information, (ii) is communicated to diverse users in diverse ways, (iii) is open access, and (iv) has a measurable impact beyond the Impact Factor.

Since its inception, scientific literature has proliferated. We discuss the continuation and expansion of practices already in place, including freely accessible data and analytical code, living research and reviews, changes to peer review to improve representation of under-represented groups, plain language summaries, preprint servers, evidence-informed decision-making, and altmetrics.

DOI : https://doi.org/10.1139/facets-2019-0012

The Pricing of Open Access Journals: Diverse Niches and Sources of Value in Academic Publishing

Authors : Kyle Siler, Koen Frenken

Open Access (OA) publishing has created new academic and economic niches in contemporary science. OA journals offer numerous publication outlets with varying editorial philosophies and business models.

This article analyzes the Directory of Open Access Journals (DOAJ) (N=12,127) to identify characteristics of OA academic journals related to the adoption of Article Processing Charge (APC)-based business models, as well as price points of journals that charge APCs. Journal Impact Factor (JIF), language, publisher mission, DOAJ Seal, economic and geographic regions of publishers, peer review duration and journal discipline are all significantly related to the adoption and pricing of journal APCs.

Even after accounting for other journal characteristics (prestige, discipline, publisher country), journals published by for-profit publishers charge the highest APCs. Journals with status endowments (JIF, DOAJ Seal), articles written in English, published in wealthier regions, and in medical or science-based disciplines are also relatively costlier.
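The abstract suggests a two-part question: which journals charge an APC at all, and how high the charge is among those that do. A sketch of that kind of analysis, not the authors' exact models, might look as follows; the file and column names are hypothetical stand-ins for the DOAJ-derived variables.

```python
# Illustrative two-part analysis: a logit for APC adoption, then an OLS on
# log price among APC-charging journals. All variable names are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

journals = pd.read_csv("doaj_journals.csv")  # hypothetical DOAJ extract

# Part 1: which journal characteristics are associated with charging an APC?
adoption = smf.logit(
    "charges_apc ~ has_jif + doaj_seal + english + for_profit + C(discipline)",
    data=journals,
).fit()

# Part 2: among APC-charging journals, what is associated with the price level?
priced = journals[journals["charges_apc"] == 1].assign(
    log_apc=lambda d: np.log(d["apc_usd"])
)
pricing = smf.ols(
    "log_apc ~ has_jif + doaj_seal + english + for_profit + C(discipline)",
    data=priced,
).fit()

print(adoption.summary())
print(pricing.summary())
```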

The OA publishing market reveals insights into forces that create economic and academic value in contemporary science. Political and institutional inequalities manifest in the varying niches occupied by different OA journals and publishers.

DOI : https://doi.org/10.1162/qss_a_00016

Releasing a preprint is associated with more attention and citations for the peer-reviewed article

Authors : Darwin Y. Fu, Jacob J Hughey

Preprints in biology are becoming more popular, but only a small fraction of the articles published in peer-reviewed journals have previously been released as preprints.

To examine whether releasing a preprint on bioRxiv was associated with the attention and citations received by the corresponding peer-reviewed article, we assembled a dataset of 74,239 articles published in 39 journals, 5,405 of which had a preprint.

Using log-linear regression and random-effects meta-analysis, we found that articles with a preprint had, on average, a 49% higher Altmetric Attention Score and 36% more citations than articles without a preprint.

These associations were independent of several other article- and author-level variables (such as scientific subfield and number of authors), and were unrelated to journal-level variables such as access model and Impact Factor.
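A minimal sketch of the analysis described above, under simplifying assumptions: a per-journal log-linear regression of citations on a preprint indicator plus illustrative covariates, followed by a DerSimonian-Laird random-effects combination of the per-journal coefficients. The column names are hypothetical, and the covariates are only examples of the article- and author-level variables mentioned.

```python
# Sketch only: per-journal log-linear regression, then a random-effects
# (DerSimonian-Laird) pooling of the preprint coefficient across journals.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

articles = pd.read_csv("articles.csv")  # hypothetical: one row per article

effects, variances = [], []
for journal, group in articles.groupby("journal"):
    # has_preprint is assumed to be a 0/1 indicator present in every journal.
    fit = smf.ols(
        "np.log1p(citations) ~ has_preprint + n_authors + C(subfield)",
        data=group,
    ).fit()
    effects.append(fit.params["has_preprint"])
    variances.append(fit.bse["has_preprint"] ** 2)

effects, variances = np.array(effects), np.array(variances)

# DerSimonian-Laird estimate of between-journal variance (tau^2).
w = 1.0 / variances
fixed = np.sum(w * effects) / np.sum(w)
q = np.sum(w * (effects - fixed) ** 2)
tau2 = max(0.0, (q - (len(effects) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1.0 / (variances + tau2)
pooled = np.sum(w_re * effects) / np.sum(w_re)
# On the log scale, exp(pooled) - 1 approximates the relative citation difference.
print(f"Pooled log-effect: {pooled:.3f} (~{100 * np.expm1(pooled):.1f}% more citations)")
```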

This observational study can help researchers and publishers make informed decisions about how to incorporate preprints into their work.

URL : https://elifesciences.org/articles/52646

Does Monetary Support Increase Citation Impact of Scholarly Papers?

Authors : Yasar Tonta, Muge Akbulut

One of the main indicators of the scientific development of a given country is the number of papers published in high-impact scholarly journals. Many countries introduced performance-based research funding systems (PRFSs) to create a more competitive environment in which prolific researchers are rewarded with subsidies, so as to increase both the quantity and quality of papers.

Yet, subsidies do not always function as a leverage to improve the citation impact of scholarly papers. This paper investigates the effect of the publication support system of Turkey (TR) on the citation impact of papers authored by Turkish researchers.

Based on a stratified probabilistic sample of 4,521 TR-addressed papers, it compares citation counts to determine whether supported papers were cited more often than non-supported ones, and whether they were published in journals with relatively higher citation impact in terms of journal impact factors, article influence scores, and quartiles.

Both supported and non-supported papers received comparable numbers of citations per paper and were published in journals with similar citation impact values. The findings suggest that subsidies are not an effective incentive to improve the quality of scholarly papers. Such support programs should therefore be reconsidered.
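The abstract does not name a specific statistical test. One simple way to run such a comparison, assuming a table with one row per sampled paper and hypothetical column names, is a Mann-Whitney U test on citations per paper, since citation counts are typically skewed.

```python
# Illustrative comparison of citation counts for supported vs. non-supported
# papers; the test choice and column names are assumptions, not the study's.
import pandas as pd
from scipy.stats import mannwhitneyu

papers = pd.read_csv("tr_papers.csv")  # hypothetical sample of TR-addressed papers

supported = papers.loc[papers["supported"] == 1, "citations"]
unsupported = papers.loc[papers["supported"] == 0, "citations"]

stat, p = mannwhitneyu(supported, unsupported, alternative="two-sided")
print(f"Median citations, supported:   {supported.median():.1f}")
print(f"Median citations, unsupported: {unsupported.median():.1f}")
print(f"Mann-Whitney U = {stat:.0f}, p = {p:.3f}")
```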

URL : https://arxiv.org/abs/1909.10068