Perspectives From Authors and Editors in the Biomedical Disciplines on Predatory Journals: Survey Study

Authors : Andrew J Cohen, German Patino, Puneet Kamal, Medina Ndoye, Anas Tresh, Jorge Mena, Christi Butler, Samuel Washington, Benjamin N Breyer

Background

Predatory journals fail to fulfill the tenets of biomedical publication: peer review, circulation, and access in perpetuity. Despite increasing attention in the lay and scientific press, no studies have directly assessed the perceptions of the authors or editors involved.

Objective

Our objective was to understand the motivation of authors in sending their work to potentially predatory journals. Moreover, we aimed to understand the perspective of journal editors at journals cited as potentially predatory.

Methods

Potential online predatory journals were randomly selected among 350 publishers and their 2204 biomedical journals. Author and editor email information was valid for 2227 total potential participants.

A survey for authors and editors was created in an iterative fashion and distributed. Surveys assessed attitudes and knowledge about predatory publishing. Narrative comments were invited.

Results

A total of 249 complete survey responses were analyzed. A total of 40% of editors (17/43) surveyed were not aware that they were listed as an editor for the particular journal in question.

A total of 21.8% of authors (45/206) confirmed a lack of peer review. Whereas 77% (33/43) of all surveyed editors were at least somewhat familiar with predatory journals, only 33.0% of authors (68/206) were at least somewhat familiar with them (P<.001). Only 26.2% of authors (54/206) were aware of Beall’s list of predatory journals versus 49% (21/43) of editors (P<.001).

A total of 30.1% of authors (62/206) believed their publication was published in a predatory journal. After defining predatory publishing, 87.9% of authors (181/206) surveyed would not publish in the same journal in the future.

Conclusions

Authors publishing in suspected predatory journals are alarmingly uninformed about predatory journal quality and practices. Editors’ increased familiarity with predatory publishing did little to prevent their unwitting listing as editors.

Some suspected predatory journals did provide services akin to open access publication. Education, research mentorship, and a realignment of research incentives may decrease the impact of predatory publishing.

URL : Perspectives From Authors and Editors in the Biomedical Disciplines on Predatory Journals: Survey Study

DOI : https://doi.org/10.2196/13769

The limitations to our understanding of peer review

Authors : Jonathan P. Tennant, Tony Ross-Hellauer

Peer review is embedded in the core of our scholarly knowledge generation systems, conferring legitimacy on research while distributing academic capital and prestige among individuals.

Despite its critical importance, it curiously remains poorly understood in a number of dimensions. In order to address this, we have programmatically analysed peer review to assess where the major gaps in our theoretical and empirical understanding of it lie.

We distill this into core themes around editorial accountability, the subjectivity and bias of reviewers, the function and quality of peer review, the role of the reviewer, the social and epistemic implications of peer review, and innovations in open peer review platforms and services.

We use this to present a guide for the future of peer review, and the development of a new research discipline based on the study of peer review. Such a field requires sustained funding and commitment from publishers and research funders, both of whom have a duty to uphold the integrity of the published scholarly record.

This will require the design of a consensus for a minimal set of standards for what constitutes peer review, and the development of a shared data infrastructure to support this.

We recognise that many of the criticisms attributed to peer review might reflect wider issues within academia and society, and future work will need to carefully demarcate and address these.

DOI : https://doi.org/10.31235/osf.io/jq623

Workflows Allowing Creation of Journal Article Supporting Information and Findable, Accessible, Interoperable, and Reusable (FAIR)-Enabled Publication of Spectroscopic Data

Authors : Agustin Barba, Santiago Dominguez, Carlos Cobas, David P. Martinsen, Charles Romain, Henry S. Rzepa, Felipe Seoane

Academic institutions, funding agencies, and publishers, if not researchers themselves, are placing increasing emphasis on the preservation and sharing of research data. Motivations for sharing include research integrity, replicability, and reuse.

One of the barriers to publishing data is the extra work involved in preparing data for publication once a journal article and its supporting information have been completed.

In this work, a method is described to generate both human- and machine-readable supporting information directly from the primary instrumental data files and to generate the metadata needed to ensure it is published in accordance with findable, accessible, interoperable, and reusable (FAIR) guidelines.

Using this approach, both the human-readable supporting information and the primary (raw) data can be submitted simultaneously with little extra effort.

Although traditionally the data package would be sent to a journal publisher for publication alongside the article, the data package could also be published independently in an institutional FAIR data repository.

Workflows are described that store the data packages and generate metadata appropriate for such a repository. The methods both to generate and to publish the data packages have been implemented for NMR data, but the concept is extensible to other types of spectroscopic data as well.
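The abstract describes this workflow only at a high level. As a rough illustration of the packaging step, the Python sketch below zips a folder of raw instrument files and emits a minimal, DataCite-flavoured metadata record suitable for deposit in a data repository. The field names, file paths, and packaging choices are assumptions made for illustration, not the authors' implementation.

```python
"""Illustrative sketch only: the paper's actual workflow and tooling are not
reproduced here. This standalone example shows the general idea of bundling
primary data files with a machine-readable metadata record so the package
can be deposited in a FAIR data repository. All field names and file paths
are hypothetical."""

import hashlib
import json
import zipfile
from datetime import date
from pathlib import Path


def build_data_package(data_dir: Path, out_zip: Path, title: str, authors: list[str]) -> dict:
    """Zip the primary (raw) data files and return a minimal metadata record."""
    files = sorted(p for p in data_dir.rglob("*") if p.is_file())

    with zipfile.ZipFile(out_zip, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in files:
            zf.write(f, f.relative_to(data_dir).as_posix())

    checksum = hashlib.sha256(out_zip.read_bytes()).hexdigest()

    # Minimal, DataCite-flavoured metadata; a real deposit would add a DOI,
    # a licence, and related-identifier links to the journal article.
    return {
        "title": title,
        "creators": authors,
        "publicationYear": date.today().year,
        "resourceType": "Dataset",
        "formats": sorted({f.suffix.lstrip(".") or "unknown" for f in files}),
        "files": [f.relative_to(data_dir).as_posix() for f in files],
        "checksum_sha256": checksum,
    }


if __name__ == "__main__":
    record = build_data_package(
        Path("nmr_raw_data"),            # hypothetical folder of instrument files
        Path("nmr_data_package.zip"),
        title="1H and 13C NMR spectra supporting compound characterisation",
        authors=["Example, A.", "Example, B."],
    )
    Path("nmr_data_package.json").write_text(json.dumps(record, indent=2))
```

Publishing the JSON record alongside the zipped data is one simple way to keep the package findable and reusable independently of the journal article itself.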

URL : Workflows Allowing Creation of Journal Article Supporting Information and Findable, Accessible, Interoperable, and Reusable (FAIR)-Enabled Publication of Spectroscopic Data

DOI : https://doi.org/10.1021/acsomega.8b03005

Is Scholarly Publishing Like Rock and Roll?

Author : David W. Lewis

This article uses Alan B. Krueger’s analysis of the music industry in his book Rockonomics: A Backstage Tour of What the Music Industry Can Teach Us About Economics and Life as a lens to consider the structure of scholarly publishing and what could happen to scholarly publishing going forward.

Both the music industry and scholarly publishing are facing disruption as their products become digital. Digital content provides opportunities to create a better product at lower prices, and in the music industry this has happened. Scholarly publishing has not yet done so.

Similarities and differences between the music industry and scholarly publishing will be considered. Like music, scholarly publishing appears to be a superstar industry. Both music and scholarly publishing are subject to piracy, which threatens revenue, though Napster was a greater disrupter than Sci-Hub seems to be.

It also appears that for a variety of reasons market forces are not effective in driving changes in business models and practices in scholarly publishing, at least not at the rate we would expect given the changes in technology. After reviewing similarities and differences, the prospects for the future of scholarly publishing will be considered.

URL : Is Scholarly Publishing Like Rock and Roll?

URL : http://hdl.handle.net/1805/20430

A Crisis in “Open Access”: Should Communication Scholarly Outputs Take 77 Years to Become Open Access?

Authors : Abbas Ghanbari Baghestan, Hadi Khaniki, Abdolhosein Kalantari, Mehrnoosh Akhtari-Zavare, Elaheh Farahmand, Ezhar Tamam, Nader Ale Ebrahim, Havva Sabani, Mahmoud Danaee

This study diachronically investigates the trend of “open access” in the Web of Science (WoS) category of “communication.” To evaluate the trend, data were collected from 184 categories of WoS from 1980 to 2017.

A total of 87,997,893 documents were obtained, of which 95,304 (0.10%) were in the category of “communication.” On average, 4.24% of the documents across all 184 categories were open access; in communication the figure was 3.29%, ranking communication 116th out of 184.

An Open Access Index (OAI) was developed to predict the trend of open access in communication. Based on the OAI, communication needs 77 years to become fully open access, which can undeniably be considered a “crisis in scientific publishing” in this field.

Given these striking findings, it is time for a global call for “open access” by communication scholars across the world. Future research should investigate whether the current business models of publication in communication scholarship encourage open access or impose unnecessary restrictions on knowledge development.
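The abstract does not state how the Open Access Index is computed. As a purely illustrative sketch of the kind of projection involved, the snippet below fits a linear trend to a hypothetical yearly open-access share and extrapolates the year in which that share would reach 100%. The input figures are invented for demonstration and will not reproduce the study's 77-year estimate.

```python
"""Illustrative only: the study's Open Access Index (OAI) formula is not given
in the abstract, and the yearly figures below are invented. The sketch simply
fits a least-squares line to an open-access share and extrapolates the year
in which the share would reach 100%."""


def linear_fit(xs: list[float], ys: list[float]) -> tuple[float, float]:
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    return slope, mean_y - slope * mean_x


# Hypothetical share (%) of open-access documents in a field, by year.
years = [2013.0, 2014.0, 2015.0, 2016.0, 2017.0]
oa_share = [2.1, 2.4, 2.7, 3.0, 3.29]

slope, intercept = linear_fit(years, oa_share)
year_full_oa = (100.0 - intercept) / slope  # year at which the fitted line hits 100%

print(f"Trend: {slope:.2f} percentage points per year")
print(f"Projected year of 100% open access: {year_full_oa:.0f}")
print(f"Years remaining from 2017: {year_full_oa - 2017:.0f}")
```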

URL : A Crisis in “Open Access”: Should Communication Scholarly Outputs Take 77 Years to Become Open Access?

DOI : https://doi.org/10.1177/2158244019871044

Assessing the Quality of Scientific Papers

Authors : Roman Vainshtein, Gilad Katz, Bracha Shapira, Lior Rokach

A multitude of factors are responsible for the overall quality of scientific papers, including readability, linguistic quality, fluency, semantic complexity, and of course domain-specific technical factors.

These factors vary from one field of study to another. In this paper, we propose a measure and method for assessing the overall quality of scientific papers in a particular field of study.

We evaluate our method in the computer science domain, but it can be applied to other technical and scientific fields. Our method is based on a corpus linguistics technique, which enables the extraction of the required information and knowledge associated with a specific domain.

For this purpose, we have created a large corpus, consisting of papers from very high impact conferences. First, we analyze this corpus in order to extract rich domain-specific terminology and knowledge.

Then we use the acquired knowledge to estimate the quality of scientific papers by applying our proposed measure. We examine our measure on high and low scientific impact test corpora.

Our results show a significant difference in the measure scores of the high and low impact test corpora. Second, we develop a classifier based on our proposed measure and compare it to the baseline classifier.

Our results show that the classifier based on our measure outperformed the baseline classifier. Based on these results, the proposed measure and technique can be used for the automated assessment of scientific papers.
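The abstract names the ingredients of the approach (a reference corpus from high-impact venues, domain terminology extracted from it, and a score computed for a new paper) without giving the exact measure. The Python sketch below is a simplified stand-in, assuming a frequency-weighted vocabulary-overlap score; it is not the authors' formula, and the tokenisation and weighting are deliberately naive.

```python
"""Simplified stand-in for a corpus-based quality measure (the paper's actual
measure is not specified in the abstract). Here 'quality' is approximated as
how strongly a paper's vocabulary overlaps with term weights learned from a
reference corpus of high-impact papers."""

import math
import re
from collections import Counter

TOKEN = re.compile(r"[a-z]{3,}")  # crude tokenizer: lowercase words of 3+ letters


def term_weights(reference_corpus: list[str]) -> dict[str, float]:
    """Log-scaled term frequencies computed from the high-impact reference corpus."""
    counts = Counter(t for doc in reference_corpus for t in TOKEN.findall(doc.lower()))
    return {term: math.log1p(c) for term, c in counts.items()}


def coverage_score(paper_text: str, weights: dict[str, float]) -> float:
    """Average reference weight of the paper's tokens (unseen terms score 0)."""
    tokens = TOKEN.findall(paper_text.lower())
    if not tokens:
        return 0.0
    return sum(weights.get(t, 0.0) for t in tokens) / len(tokens)


if __name__ == "__main__":
    # Toy reference corpus standing in for papers from high-impact conferences.
    reference = [
        "We propose a convolutional architecture and evaluate it on benchmark datasets.",
        "Our experiments show statistically significant improvements over strong baselines.",
    ]
    weights = term_weights(reference)
    print(coverage_score("We evaluate the proposed architecture on benchmark datasets.", weights))
    print(coverage_score("Random unrelated text about cooking pasta recipes.", weights))
```

In this toy run the first, domain-aligned sentence scores higher than the unrelated one, which is the qualitative behaviour the abstract reports for high- versus low-impact test corpora.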

URL : https://arxiv.org/abs/1908.04200

Meta-Research: Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations

Authors : Erin C McKiernan, Lesley A Schimanski, Carol Muñoz Nieves, Lisa Matthias, Meredith T Niles, Juan P Alperin

We analyzed how often and in what ways the Journal Impact Factor (JIF) is currently used in review, promotion, and tenure (RPT) documents of a representative sample of universities from the United States and Canada. Overall, 40% of research-intensive institutions and 18% of master’s institutions mentioned the JIF or closely related terms.

Of the institutions that mentioned the JIF, 87% supported its use in at least one of their RPT documents, 13% expressed caution about its use, and none heavily criticized it or prohibited its use. Furthermore, 63% of institutions that mentioned the JIF associated the metric with quality, 40% with impact, importance, or significance, and 20% with prestige, reputation, or status.

We conclude that use of the JIF is encouraged in RPT evaluations, especially at research-intensive universities, and that there is work to be done to avoid the potential misuse of metrics like the JIF.

URL : Meta-Research: Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations

DOI : https://doi.org/10.7554/eLife.47338.001