We apply a novel mistake index to assess trends in the proportion of corrections published between 1993 and 2014 in Nature, Science and PNAS. The index revealed a progressive increase in the proportion of corrections published in these three high-quality journals.
The index appears to be independent of journal impact factor and of the number of items published, as suggested by a comparative analysis of 16 top scientific journals spanning different impact factors and disciplines. A more detailed analysis suggests that time-to-correction increased significantly over the study period and also differed among journals (Nature 233 days; Science 136 days; PNAS 232 days).
A detailed review of 1,428 errors showed that 60% of corrections were related to figures, authors, references or results. Under the three severity categories established, 34.7% of corrections were considered mild, 47.7% moderate and 17.6% severe, with proportions also differing among journals. Errors occurring during the printing process accounted for 5% of corrections in Nature, 3% in Science and 18% in PNAS.
Measuring temporal trends in the quality of scientific manuscripts can help editors and reviewers identify the most common mistakes, increasing the rigor of peer review and improving the quality of published scientific manuscripts.
Title : Improving the peer-review process and editorial quality: key errors escaping the review and editorial process in top scientific journals
DOI : https://doi.org/10.7717/peerj.1670
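The abstract above reports the mistake index as a proportion of corrections and a rising trend over 1993–2014. The exact formula is not reproduced here, so the sketch below simply assumes the index for a year is corrections divided by items published, with an ordinary least-squares slope quantifying the trend; all counts are illustrative placeholders, not the study's data.

```python
# Sketch of a corrections-proportion trend. ASSUMPTION: the "mistake index"
# for a year is corrections / items published; the paper's exact formula
# may differ. All numbers below are hypothetical.

def mistake_index(corrections: int, items: int) -> float:
    """Proportion of published items that later received a correction."""
    return corrections / items

def trend_slope(years, values):
    """Ordinary least-squares slope of values regressed on years."""
    n = len(years)
    my = sum(years) / n
    mv = sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

# Hypothetical counts: (year, corrections, items published)
data = [(1993, 10, 900), (2000, 18, 950), (2007, 30, 1000), (2014, 45, 1050)]
years = [y for y, _, _ in data]
index = [mistake_index(c, n) for _, c, n in data]

# A positive slope indicates an increasing proportion of corrections.
print(trend_slope(years, index) > 0)
```

A real analysis would also test the slope for significance, as the study does, rather than only checking its sign.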
Recent controversies highlighting substandard peer review in Open Access (OA) and traditional (subscription) journals have increased the need for authors, funders, publishers, and institutions to assure the quality of peer review in academic journals. I propose that transparency of the peer-review process may be seen as an indicator of the quality of peer review, and I develop and validate a tool enabling different stakeholders to assess the transparency of the peer-review process.
Methods and Findings
Based on editorial guidelines and best practices, I developed a 14-item tool to rate the transparency of the peer-review process on the basis of journals’ websites. In Study 1, a random sample of 231 authors of papers in 92 subscription journals in different fields rated the transparency of the journals that published their work. Authors’ transparency ratings were positively associated with the quality of the peer-review process but unrelated to the journals’ impact factors.
In Study 2, 20 experts on OA publishing assessed the transparency of established (non-OA) journals, OA journals categorized as being published by potential predatory publishers, and journals from the Directory of Open Access Journals (DOAJ). Results show high reliability across items (α = .91) and sufficient reliability across raters. Ratings differentiated the three types of journals well.
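The internal-consistency figure above (α = .91) is Cronbach’s alpha. A minimal sketch of the standard formula, applied to hypothetical rating data (not the study’s):

```python
def cronbach_alpha(items: list[list[float]]) -> float:
    """Cronbach's alpha for `items`, a list of item-score columns
    (each inner list holds one item's scores across all raters/cases):
    alpha = k/(k-1) * (1 - sum(item variances) / variance of totals).
    """
    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    return k / (k - 1) * (1 - sum(var(it) for it in items) / var(totals))

# Three items rated over four cases (hypothetical scores)
scores = [[3, 4, 5, 4], [2, 4, 5, 3], [3, 5, 5, 4]]
print(round(cronbach_alpha(scores), 2))  # high alpha: items move together
```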
In Study 3, academic librarians rated a random sample of 140 DOAJ journals and another 54 journals that had received a hoax paper written by Bohannon to test peer-review quality. Journals with higher transparency ratings were less likely to accept the flawed paper and showed higher impact as measured by the h5 index from Google Scholar.
The tool for assessing the transparency of the peer-review process at academic journals shows promising reliability and validity. Because the transparency of the peer-review process can serve as an indicator of peer-review quality, the tool may be used to predict academic quality at new journals.
Title : Peer Review Quality and Transparency of the Peer-Review Process in Open Access and Subscription Journals
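Study 3 measures journal impact via Google Scholar’s h5 index: the largest number h such that h articles published in the last five years each have at least h citations. A minimal sketch of that computation, on hypothetical citation counts:

```python
def h5_index(citations: list[int]) -> int:
    """Largest h such that h articles have at least h citations each.

    `citations` holds per-article citation counts for articles
    published in the last five years (hypothetical input).
    """
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank  # the top `rank` articles all have >= rank citations
        else:
            break
    return h

print(h5_index([10, 8, 5, 4, 3, 0]))  # 4: four articles with >= 4 citations
```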
Openness is one of the central values of science. Open scientific practices such as sharing data, materials and analysis scripts alongside published articles have many benefits, including easier replication and extension studies, increased availability of data for theory-building and meta-analysis, and increased possibility of review and collaboration even after a paper has been published. Although modern information technology makes sharing easier than ever before, uptake of open practices has been slow. We suggest this may be due in part to a social dilemma arising from misaligned incentives, and we propose a specific, concrete mechanism (reviewers withholding comprehensive review) to establish the expectation of open practices as a matter of scientific principle.
Title : The Peer Reviewers’ Openness Initiative: incentivizing open research practices through peer review
This paper is based on research commissioned by the Wellcome Trust in 2015 and catalogues current initiatives and trends in the systems and processes surrounding peer review. It considers issues such as open and interactive reviews, post-publication comments and ratings, and the platforms provided by both publishers and other organisations to support such activity; third-party peer review platforms; and measures from publishers and others to provide more recognition and rewards for peer reviewers. It also speculates on likely key trends in peer review for the future.
URL : http://onlinelibrary.wiley.com/doi/10.1002/leap.1008/abstract
A megajournal is an open-access journal that publishes any manuscript presenting scientifically trustworthy empirical results, without assessing its potential scientific contribution prior to publication. Megajournals have rapidly increased their output and currently publish around 50,000 articles per year. We report on a small pilot study comparing the citation distributions of articles in megajournals with those in journals using traditional peer review, which also evaluate articles for contribution and novelty. We found that elite journals with very low acceptance rates have far fewer articles with no or few citations, but that the long tail of articles with two citations or fewer was actually larger in a sample of selective traditional journals than in megajournals. This indicates the need for more systematic studies, because the results raise many questions about how effectively the current peer-review system actually fulfils its filtering function.
URL : http://onlinelibrary.wiley.com/doi/10.1002/leap.1007/abstract