Journal Data Sharing Policies and Statistical Reporting Inconsistencies in Psychology

Authors : Michele Nuijten, Jeroen Borghuis, Coosje Veldkamp, Linda Alvarez, Marcel van Assen, Jelte Wicherts

In this paper, we present three studies that investigate the relation between data sharing and statistical reporting inconsistencies. Previous research found that reluctance to share data was related to a higher prevalence of statistical errors, often in the direction of statistical significance (Wicherts, Bakker, & Molenaar, 2011).

We therefore hypothesized that journal policies about data sharing and data sharing itself would reduce these inconsistencies. In Study 1, we compared the prevalence of reporting inconsistencies in two similar journals on decision making with different data sharing policies.

In Study 2, we compared reporting inconsistencies in articles published in PLOS (with a data sharing policy) and Frontiers in Psychology (without a data sharing policy). In Study 3, we looked at papers published in the journal Psychological Science to check whether papers with or without an Open Practice Badge differed in the prevalence of reporting errors.

Overall, we found no relationship between data sharing and reporting inconsistencies. We did find that journal policies on data sharing are extremely effective in promoting data sharing.

We argue that open data is essential in improving the quality of psychological science, and we discuss ways to detect and reduce reporting inconsistencies in the literature.

DOI : https://dx.doi.org/10.17605/OSF.IO/SGBTA

A Proposed Currency System for Academic Peer Review Payments Using the BlockChain Technology

Author : Michael Spearpoint

Peer review of scholarly papers is seen as a critical step in the publication of high-quality outputs in reputable journals.

However, there appear to be few incentives for researchers to agree to conduct suitable reviews in a timely fashion, and in some cases unscrupulous practices are occurring as part of the production of academic research output. Innovations in internet-based technologies offer ways in which some of these challenges can be addressed.

In particular, this paper proposes a new blockchain-based currency system that provides a number of solutions. Potential benefits and problems of using the technology are discussed in the paper, and these will need further investigation should the idea develop further.

Ultimately, the currency could be used as an alternative publication metric for authors, institutions and journals.


Alternative location : http://www.mdpi.com/2304-6775/5/3/19

Sci-Hub provides access to nearly all scholarly literature

Authors : Daniel S. Himmelstein, Ariel R. Romero, Bastian Greshake Tzovaras, Casey S. Greene

The website Sci-Hub provides access to scholarly literature via full text PDF downloads. The site enables users to access articles that would otherwise be paywalled. Since its creation in 2011, Sci-Hub has grown rapidly in popularity.

However, until now, the extent of Sci-Hub’s coverage was unclear. As of March 2017, we find that Sci-Hub’s database contains 68.9% of all 81.6 million scholarly articles, which rises to 85.2% for those published in closed access journals.

Furthermore, Sci-Hub contains 77.0% of the 5.2 million articles published by inactive journals. Coverage varies by discipline, with 92.8% coverage of articles in chemistry journals compared to 76.3% for computer science.

Coverage also varies by publisher, with the coverage of the largest publisher, Elsevier, at 97.3%. Our interactive browser at greenelab.github.io/scihub allows users to explore these findings in more detail.

Finally, we estimate that over a six-month period in 2015–2016, Sci-Hub provided access to 99.3% of valid incoming requests. Hence, the scope of this resource suggests the subscription publishing model is becoming unsustainable. For the first time, the overwhelming majority of scholarly literature is available gratis to anyone.

URL : https://greenelab.github.io/scihub-manuscript/


Medical Theses and Derivative Articles: Dissemination Of Contents and Publication Patterns

Authors : Mercedes Echeverria, David Stuart, Tobias Blanke

Doctoral theses are an important source of publications in universities, although little research has been carried out on the publications resulting from theses, so-called derivative articles.

This study investigates how derivative articles can be identified through a text analysis based on the full text of a set of medical theses and the full text of the articles with which they shared authorship.

The text similarity analysis consisted of examining the full-text articles according to the organization of scientific discourse (IMRaD), using the Turnitin plagiarism tool.

The study found that the text similarity rate in the Discussion section can be used to discriminate derivative articles from non-derivative articles.

Additional findings were: the thesis's author was first author in 85% of derivative articles; supervisors participated as coauthors in 100% of derivative articles; the authorship credit retained by the thesis's author was 42% in derivative articles; derivative articles had 5 coauthors per article on average, versus 6.4 for non-derivative articles; and, regarding the year of thesis completion, 87.5% of derivative articles were published before or in the same year as the thesis was completed.

URL : https://arxiv.org/abs/1707.04439

Scientist impact factor (SIF): a new metric for improving scientists’ evaluation?

Authors : Giuseppe Lippi, Camilla Mattiuzzi

Background

The publication of scientific research is the mainstay of knowledge dissemination, but it is also an essential criterion in the evaluation of scientists for securing funding and career progression.

Although the most widespread approach for evaluating scientists is currently based on the H-index, the total impact factor (IF) and the overall number of citations, these metrics are plagued by some well-known drawbacks. Therefore, with the aim of improving the evaluation of scientists, we developed a new and potentially useful indicator of recent scientific output.

Methods

The new metric, the scientist impact factor (SIF), was calculated as the number of citations accrued, in the two years following the publication year, by the articles published in that year, divided by the overall number of articles published in that year. The metric was then tested by analyzing data on the 40 top scientists of the authors' local university.
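The definition above can be sketched in a few lines of Python. This is only an illustration of the formula as stated in the abstract; the function and argument names are assumptions, not taken from the paper:

```python
def scientist_impact_factor(citations_by_year, articles_by_year, year):
    """Sketch of the SIF as described in the abstract: citations accrued
    in the two years following `year` by the articles published in `year`,
    divided by the number of articles published in `year`.

    `citations_by_year[y]` counts citations received in year y by the
    articles published in `year`; `articles_by_year[y]` counts articles
    published in year y.
    """
    citations = sum(citations_by_year.get(y, 0) for y in (year + 1, year + 2))
    n_articles = articles_by_year.get(year, 0)
    return citations / n_articles if n_articles else 0.0

# Example: 10 articles published in 2015 that collect 30 citations in
# 2016 and 50 in 2017 give a SIF of (30 + 50) / 10 = 8.0.
sif = scientist_impact_factor({2016: 30, 2017: 50}, {2015: 10}, 2015)
```

Note that, by construction, the SIF only rewards citations to recent work, which is what makes it insensitive to an author's older, highly cited articles.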

Results

No correlation was found between the SIF and the H-index (r=0.15; P=0.367) or the 2-year H-index (r=−0.01; P=0.933), whereas the H-index and 2-year H-index values were found to be highly correlated (r=0.57; P<0.001). A highly significant correlation was also observed between the number of articles published in one year and the total number of citations to these articles in the two following years (r=0.62; P<0.001).

Conclusions

According to our data, the SIF may be a useful measure to complement current metrics for evaluating scientific output. Its use may be especially helpful for young scientists, since the SIF reflects scientific output over the past two years, thus increasing their chances of applying for and obtaining competitive funding.

URL : http://atm.amegroups.com/article/view/15375


The changing role of research publishing: a case study from Springer Nature

Author : Steven Inchcoombe

Using Springer Nature as a case study, this article explores the future of research publishing, with the guiding objective of identifying how such organizations can better serve the needs of researchers and of those who support researchers (particularly academic institutions, institutional libraries, research funding bodies and academic societies) as we work together to help advance discovery for the benefit of all.

Progress in four key areas is described: improving the publishing process, innovating across science communication, driving the growth and development of open research and adding value beyond publishing.

The aim of this article is thus to set out a clear vision of what research publishers can achieve if they especially focus on addressing researchers’ needs and apply their considerable resources and expertise accordingly.

If delivered with care, this vision should enable research publishers to help advance discovery, publish more robust and insightful research, support the development of new areas of knowledge and understanding, and make these ideas and this information accessible, usable and reusable by humans and machines alike.


DOI : http://doi.org/10.1629/uksg.355


Journal of Open Source Software (JOSS): design and first-year review

Authors : Arfon M Smith, Kyle E Niemeyer, Daniel S Katz, Lorena A Barba, George Githinji, Melissa Gymrek, Kathryn D Huff, Christopher R Madan, Abigail Cabunoc Mayes, Kevin M Moerman, Pjotr Prins, Karthik Ram, Ariel Rokem, Tracy K Teal, Roman Valls Guimera, Jacob T Vanderplas

This article describes the motivation, design, and progress of the Journal of Open Source Software (JOSS). JOSS is a free and open-access journal that publishes articles describing research software. It has the dual goals of improving the quality of the software submitted and providing a mechanism for research software developers to receive credit.

While designed to work within the current merit system of science, JOSS addresses the dearth of rewards for key contributions to science made in the form of software. JOSS publishes articles that encapsulate scholarship contained in the software itself, and its rigorous peer review targets the software components: functionality, documentation, tests, continuous integration, and the license.

A JOSS article contains an abstract describing the purpose and functionality of the software, references, and a link to the software archive. The article is the entry point of a JOSS submission, which encompasses the full set of software artifacts.

Submission and review proceed in the open, on GitHub. Editors, reviewers, and authors work collaboratively and openly. Unlike other journals, JOSS does not reject articles requiring major revision; while not yet accepted, articles remain visible and under review until the authors make adequate changes (or withdraw, if unable to meet requirements).

Once an article is accepted, JOSS gives it a DOI, deposits its metadata in Crossref, and the article can begin collecting citations on indexers like Google Scholar and other services. Authors retain copyright of their JOSS article, releasing it under a Creative Commons Attribution 4.0 International License.

In its first year, starting in May 2016, JOSS published 111 articles, with more than 40 additional articles currently under review. JOSS is a sponsored project of the nonprofit organization NumFOCUS and is an affiliate of the Open Source Initiative.

URL : https://arxiv.org/abs/1707.02264