‘Nepotistic journals’: a survey of biomedical journals

Authors : Alexandre Scanff, Florian Naudet, Ioana Cristea, David Moher, Dorothy V M Bishop, Clara Locher

Context

Convergent analyses in different disciplines support the use of the Percentage of Papers by the Most Prolific author (PPMP) as a red flag to identify journals that can be suspected of questionable editorial practices. We examined whether this index, complemented by the Gini index, could be useful for identifying cases of potential editorial bias, using a large sample of biomedical journals.

Methods

We extracted metadata for all biomedical journals referenced in the National Library of Medicine that had any attributed Broad Subject Terms and at least 50 authored articles (i.e. articles with at least one author) between 2015 and 2019, and identified the most prolific author (i.e. the person who signed the most papers) in each journal.

We calculated the PPMP and the 2015-2019 Gini index of the distribution of articles across authors. When the relevant information was reported, we also computed the median publication lag (time between submission and acceptance) for articles authored by any of the most prolific authors and for articles not authored by them.
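
Both indices are straightforward to reproduce from per-journal author counts. Below is a minimal sketch in Python; the function names and toy data are purely illustrative and are not taken from the study's code.

```python
from collections import Counter

def ppmp(author_counts: Counter) -> float:
    """Percentage of Papers by the Most Prolific author: the share of a
    journal's papers signed by its single most prolific author."""
    total = sum(author_counts.values())
    return 100.0 * max(author_counts.values()) / total

def gini(author_counts: Counter) -> float:
    """Gini index of the distribution of papers across authors
    (0 = papers spread evenly, values close to 1 = highly concentrated)."""
    x = sorted(author_counts.values())
    n, s = len(x), sum(x)
    weighted = sum((i + 1) * v for i, v in enumerate(x))
    return 2.0 * weighted / (n * s) - (n + 1) / n

# Hypothetical journal: one author signs 30 of 100 papers, the rest are spread out.
papers = Counter({"prolific_author": 30, "author_b": 5, "author_c": 5,
                  **{f"author_{i}": 1 for i in range(60)}})
print(round(ppmp(papers), 1))  # 30.0 -- well above the 10.6% 95th-percentile cut-off
print(round(gini(papers), 3))  # concentration of the journal's output across authors
```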

For outlier journals, defined as those with a PPMP or Gini index above the 95th percentile of the respective distribution, a random sample of 100 journals was selected and described in relation to the most prolific author's status on the editorial board.

Results

5 468 journals that published 4 986 335 papers between 2015 and 2019 were analysed. The PPMP 95th percentile was 10.6% (median 2.9%). The Gini index 95th percentile was 0.355 (median 0.183). The correlation between the two indices was 0.35 (95% CI 0.33 to 0.37). Information on publication lag was available for 2 743 journals.

We found that 277 journals (10.2%) had a median submission-to-acceptance lag shorter than 3 weeks for articles by the most prolific author(s), versus 51 journals (1.9%) for articles not authored by these prolific author(s).

Among the random sample of outlier journals, 98 provided information about their editorial board. In 60 of these 98 journals (61%), the most prolific author was a member of the editorial board, including 25 (26% of the 98) who were editors-in-chief.

Discussion

In most journals, publications are distributed across a large number of authors. Our results reveal a subset of journals where a few authors, often members of the editorial board, were responsible for a disproportionate number of publications.

The papers by these authors were more likely to be accepted for publication within 3 weeks of their submission. To enhance trust in their practices, journals need to be transparent about their editorial and peer review practices.

URL : ‘Nepotistic journals’: a survey of biomedical journals

DOI : https://doi.org/10.1101/2021.02.03.429520

Hostage authorship and the problem of dirty hands

Authors : William Bülow, Gert Helgesson

This article discusses gift authorship, the practice where co-authorship is awarded to a person who has not contributed significantly to the study. From an ethical point of view, gift authorship raises concerns about desert, fairness, honesty and transparency, and its prevalence in research is rightly considered a serious ethical concern.

We argue that even though misuse of authorship is always bad, there are instances where accepting requests for gift authorship may nevertheless be the right thing to do. More specifically, we propose that researchers may find themselves in a situation much like the problem of dirty hands, which has been frequently discussed in political philosophy and applied ethics.

The problem of dirty hands is relevant to what we call hostage authorship, where researchers unwillingly include undeserving authors, and only because they find this unavoidable in order to accomplish a morally important research goal.

URL : Hostage authorship and the problem of dirty hands

DOI : https://doi.org/10.1177/1747016118764305

Improving the Measurement of Scientific Success by Reporting a Self-Citation Index

Authors : Justin W. Flatt, Alessandro Blasimme, Effy Vayena

Who among the many researchers is most likely to usher in a new era of scientific breakthroughs? This question is of critical importance to universities and funding agencies, as well as to the scientists who must compete under great pressure for limited amounts of research money.

Citations are the current primary means of evaluating one’s scientific productivity and impact, and while often helpful, there is growing concern over the use of excessive self-citations to help build sustainable careers in science.

Incorporating superfluous self-citations in one’s writings requires little effort, receives virtually no penalty, and can boost, albeit artificially, scholarly impact and visibility, which are both necessary for moving up the academic ladder.

Such behavior is likely to increase, given the recent explosive rise in popularity of web-based citation analysis tools (Web of Science, Google Scholar, Scopus, and Altmetric) that rank research performance.

Here, we argue for new metrics centered on transparency to help curb this form of self-promotion that, if left unchecked, can have a negative impact on the scientific workforce, the way that we publish new knowledge, and ultimately the course of scientific advance.
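
The article argues for a transparency-centred metric rather than spelling out a formula here; as a rough illustration of the kind of quantity at stake, the sketch below computes a simple self-citation share. This is an assumed, simplified measure for illustration only, not the specific self-citation index proposed by the authors.

```python
from typing import List, Set

def self_citation_share(researcher: str, citing_author_lists: List[Set[str]]) -> float:
    """Fraction of a researcher's incoming citations that come from papers the
    researcher co-authored. A crude self-citation rate for illustration only;
    not the specific index proposed in the article."""
    if not citing_author_lists:
        return 0.0
    self_cites = sum(1 for authors in citing_author_lists if researcher in authors)
    return self_cites / len(citing_author_lists)

# Hypothetical case: 40 of 200 incoming citations come from the researcher's own papers.
incoming = [{"J. Doe", "A. Smith"}] * 40 + [{"B. Lee"}] * 160
print(self_citation_share("J. Doe", incoming))  # 0.2
```

Reporting such a share alongside raw citation counts is one way a transparency-oriented metric could make inflated self-citation visible.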

URL : Improving the Measurement of Scientific Success by Reporting a Self-Citation Index

DOI : http://www.mdpi.com/2304-6775/5/3/20

Principles of the Self Journal of Science: bringing ethics and freedom to scientific publishing

I present the core principles of the “Self-Journal of Science” (SJS), an open repository as well as a new paradigm of scientific publication. Rooted in the ethics of science, SJS proposes a full and consistent solution to the many flaws of current systems. It implements an optimal peer review, which itself becomes a measurable process, and builds an objective and unfalsifiable evaluation system.

In addition, it can operate at very low cost. One of the essential features of SJS is to allow every scientist to play a full role as a member of the scientific community and to be credited for all contributions – whether as author, referee, or editor. The output is the responsibility of each scientist, and no subgroup can dictate scientific policy to all.

By fully opening up the process of publication, peer pressure becomes the force that drives output towards the highest quality in a virtuous self-regulating circle. SJS also provides a self-organizing and scalable solution to handle an ever-increasing number of articles.

URL : Principles of the Self Journal of Science: bringing ethics and freedom to scientific publishing

Alternative location : http://www.sjscience.org/article?id=46

Publishing Ethics and Predatory Practices: A Dilemma for All Stakeholders of Science Communication

Publishing scholarly articles in traditional and newly-launched journals is a responsible task, requiring diligence from authors, reviewers, editors, and publishers. The current generation of scientific authors has ample opportunities for publicizing their research. However, they have to selectively target journals and publish in compliance with the established norms of publishing ethics. Over the past few years, numerous illegitimate or predatory journals have emerged in most fields of science. By exploiting gold Open Access publishing, these journals paved the way for low-quality articles that threatened to change the landscape of evidence-based science.

Authors, reviewers, editors, established publishers, and learned associations should be informed about predatory publishing practices and contribute to the trustworthiness of scholarly publications. In line with this, there have been several attempts to distinguish legitimate from illegitimate journals: blacklisting unethical journals (Jeffrey Beall's list), issuing a statement on transparency and best publishing practices (a draft document by the Open Access Scholarly Publishers Association and other global organizations), and tightening the indexing criteria of the Directory of Open Access Journals. None of these measures alone has turned out to be sufficient. All stakeholders of science communication should be aware of the multiple facets of unethical practices and publish well-checked and evidence-based articles.