Journal Data Sharing Policies and Statistical Reporting Inconsistencies in Psychology

Authors : Michele Nuijten, Jeroen Borghuis, Coosje Veldkamp, Linda Alvarez, Marcel van Assen, Jelte Wicherts

In this paper, we present three studies that investigate the relation between data sharing and statistical reporting inconsistencies. Previous research found that reluctance to share data was related to a higher prevalence of statistical errors, often in the direction of statistical significance (Wicherts, Bakker, & Molenaar, 2011).

We therefore hypothesized that journal policies about data sharing and data sharing itself would reduce these inconsistencies. In Study 1, we compared the prevalence of reporting inconsistencies in two similar journals on decision making with different data sharing policies.

In Study 2, we compared reporting inconsistencies in articles published in PLOS (with a data sharing policy) and Frontiers in Psychology (without a data sharing policy). In Study 3, we looked at papers published in the journal Psychological Science to check whether papers with or without an Open Practice Badge differed in the prevalence of reporting errors.

Overall, we found no relationship between data sharing and reporting inconsistencies. We did find that journal policies on data sharing are extremely effective in promoting data sharing.

We argue that open data is essential in improving the quality of psychological science, and we discuss ways to detect and reduce reporting inconsistencies in the literature.

DOI : https://dx.doi.org/10.17605/OSF.IO/SGBTA

Replicability and Reproducibility in Comparative Psychology

Author : Jeffrey R. Stevens

Psychology faces a replication crisis. The Reproducibility Project: Psychology sought to replicate the effects of 100 psychology studies. Though 97% of the original studies produced statistically significant results, only 36% of the replication studies did so (Open Science Collaboration, 2015).

This inability to replicate previously published results, however, is not limited to psychology (Ioannidis, 2005). Replication projects in medicine (Prinz et al., 2011) and behavioral economics (Camerer et al., 2016) resulted in replication rates of 25 and 61%, respectively, and analyses in genetics (Munafò, 2009) and neuroscience (Button et al., 2013) question the validity of studies in those fields. Science, in general, is reckoning with challenges in one of its basic tenets: replication.

Comparative psychology also faces the grand challenge of producing replicable research. Though social psychology has borne the brunt of the criticism regarding failed replications, comparative psychology suffers from some of the same problems faced by social psychology (e.g., small sample sizes).

Yet, comparative psychology follows the methods of cognitive psychology by often using within-subjects designs, which may buffer it from replicability problems (Open Science Collaboration, 2015). In this Grand Challenge article, I explore the shared and unique challenges of and potential solutions for replication and reproducibility in comparative psychology.

URL : http://journal.frontiersin.org/article/10.3389/fpsyg.2017.00862/full

Intellectual Property’s Great Fallacy

Intellectual property law has long been justified by the belief that external incentives are necessary to get people to produce artistic works and technological innovations that are easily copied.

This Essay argues that this foundational premise of the economic theory of intellectual property is wrong. Drawing on recent advances in behavioral economics, psychology, and business-management studies, it shows that natural and intrinsic motivations will cause technology and the arts to flourish even in the absence of externally supplied rewards, such as copyrights and patents.

URL : http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1746343