Open Up – the Mission Statement of the Control of Impulsive Action (Ctrl-ImpAct) Lab on Open Science

Authors : Christina B. Reimer, Zhang Chen, Carsten Bundt, Charlotte Eben, Raquel E. London, Sirarpi Vardanian

The present paper is the mission statement of the Control of Impulsive Action (Ctrl-ImpAct) Lab regarding Open Science. As early-career researchers (ECRs) in the lab, we first state our personal motivation to conduct research based on the principles of Open Science.

We then describe how we incorporate four specific Open Science practices (i.e., Open Methodology, Open Data, Open Source, and Open Access) into our scientific workflow. In more detail, we explain how Open Science practices are embedded into the so-called ‘co-pilot’ system in our lab.

The ‘co-pilot’ researcher is involved in all tasks of the ‘pilot’ researcher, that is, designing a study, double-checking experimental and data analysis scripts, and writing the manuscript.

The lab has set up this co-pilot system to increase transparency, to reduce potential errors that could occur during the entire workflow, and to intensify collaboration between lab members.

Finally, we discuss potential solutions for general problems that could arise when practicing Open Science.

DOI : http://doi.org/10.5334/pb.494

Scholarly Communication and Open Access in Psychology: Current Considerations for Researchers

Author : Laura Bowering Mullen

Scholarly communication and open access practices in psychological science are rapidly evolving. However, most published works that focus on scholarly communication issues do not target a specific discipline, and instead take a more “one size fits all” approach.

When it comes to scholarly communication, practices and traditions vary greatly across the disciplines. It is important to look at issues such as open access (of all types), reproducibility, research data management, citation metrics, the emergence of preprint options, the evolution of new peer review models, coauthorship conventions, and use of scholarly networking sites such as ResearchGate and Academia.edu from a disciplinary perspective.

Important issues in scholarly publishing for psychology include authors’ uptake of open access megajournals, how open science is represented in psychology journals, the challenges of interdisciplinarity, and how authors avail themselves of green and gold open access strategies.

This overview presents a discipline-focused treatment of selected scholarly communication topics that will allow psychology researchers and others to get up to speed on this expansive topic.

Further study into researcher behavior in terms of scholarly communication in psychology would create more understanding of existing culture as well as provide early career researchers with a more effective roadmap to the current landscape.

As no other single work provides a study of scholarly communication and open access in psychology, this work aims to partially fill that niche.

DOI : https://doi.org/10.31234/osf.io/2d7um

A Principled Approach to Online Publication Listings and Scientific Resource Sharing

Authors : Jacquelijn Ringersma, Karin Kastens, Ulla Tschida, Jos van Berkum

The Max Planck Institute (MPI) for Psycholinguistics has developed a service to manage and present the scholarly output of its researchers. The PubMan database manages publication metadata and the full texts of works published by its scholars.

All relevant information regarding a researcher’s work is brought together in this database, including supplementary materials and links to the MPI database for primary research data.

The PubMan metadata is harvested into the MPI website CMS (Plone). The system developed for creating the publication lists allows the researcher to select from the harvested data and present it in a variety of formats.
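
As an illustration of the harvesting step described above (this sketch is editorial and not taken from the article, which does not name a protocol), the code below pulls Dublin Core records from a hypothetical OAI-PMH endpoint and renders them as a plain publication list; the endpoint URL and the oai_dc format are assumptions.

    # Minimal sketch: harvest Dublin Core records over OAI-PMH and print a
    # simple publication list. The endpoint below is a placeholder, not the
    # actual PubMan address.
    import xml.etree.ElementTree as ET
    from urllib.request import urlopen

    OAI_ENDPOINT = "https://repository.example.org/oai"  # hypothetical endpoint
    OAI_NS = "{http://www.openarchives.org/OAI/2.0/}"
    DC = {"dc": "http://purl.org/dc/elements/1.1/"}

    def harvest(endpoint):
        """Fetch one page of ListRecords in the oai_dc metadata format."""
        url = f"{endpoint}?verb=ListRecords&metadataPrefix=oai_dc"
        with urlopen(url) as response:
            tree = ET.parse(response)
        for record in tree.iter(OAI_NS + "record"):
            title = record.findtext(".//dc:title", default="(untitled)", namespaces=DC)
            creators = [c.text for c in record.findall(".//dc:creator", DC)]
            year = record.findtext(".//dc:date", default="n.d.", namespaces=DC)[:4]
            yield creators, year, title

    def as_plain_list(records):
        """Render each harvested record as a one-line reference."""
        return "\n".join(f"{'; '.join(c)} ({y}). {t}" for c, y, t in records)

    if __name__ == "__main__":
        print(as_plain_list(harvest(OAI_ENDPOINT)))

In the service described here, the harvested records feed the Plone CMS rather than a flat text file, but the select-and-format step is the same idea.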

URL : https://journal.code4lib.org/articles/2520

Journal Data Sharing Policies and Statistical Reporting Inconsistencies in Psychology

Authors : Michele Nuijten, Jeroen Borghuis, Coosje Veldkamp, Linda Alvarez, Marcel van Assen, Jelte Wicherts

In this paper, we present three studies that investigate the relation between data sharing and statistical reporting inconsistencies. Previous research found that reluctance to share data was related to a higher prevalence of statistical errors, often in the direction of statistical significance (Wicherts, Bakker, & Molenaar, 2011).

We therefore hypothesized that journal policies about data sharing and data sharing itself would reduce these inconsistencies. In Study 1, we compared the prevalence of reporting inconsistencies in two similar journals on decision making with different data sharing policies.

In Study 2, we compared reporting inconsistencies in articles published in PLOS journals (with a data sharing policy) and in Frontiers in Psychology (without a data sharing policy). In Study 3, we looked at papers published in the journal Psychological Science to check whether papers with or without an Open Practice Badge differed in the prevalence of reporting errors.

Overall, we found no relationship between data sharing and reporting inconsistencies. We did find that journal policies on data sharing are extremely effective in promoting data sharing.

We argue that open data is essential in improving the quality of psychological science, and we discuss ways to detect and reduce reporting inconsistencies in the literature.
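
The inconsistencies at issue are mismatches between a reported test statistic and its reported p-value, which can be detected automatically in the spirit of tools such as statcheck. The sketch below is an illustrative reimplementation for two-sided t-tests only, not the procedure used in these studies.

    # Illustrative sketch (not the statcheck package itself): recompute the
    # p-value of a reported two-sided t-test from its test statistic and
    # degrees of freedom, and flag an inconsistency when it cannot be
    # reconciled with the reported p-value.
    from scipy import stats

    def check_t_test(t_value, df, reported_p, decimals=2):
        """Compare a reported two-sided p-value (rounded to `decimals` places)
        with the p-value recomputed from t and df."""
        recomputed_p = 2 * stats.t.sf(abs(t_value), df)
        # Allow for rounding of the reported value: p = .04 covers [.035, .045).
        rounding_margin = 0.5 * 10 ** (-decimals)
        inconsistent = abs(recomputed_p - reported_p) > rounding_margin
        # A gross inconsistency flips the significance decision at alpha = .05.
        decision_error = inconsistent and ((reported_p < 0.05) != (recomputed_p < 0.05))
        return {"recomputed_p": round(recomputed_p, 4),
                "inconsistent": inconsistent,
                "decision_error": decision_error}

    # "t(28) = 2.20, p = .04" recomputes to p ≈ .036: consistent after rounding.
    print(check_t_test(2.20, 28, 0.04))
    # "t(28) = 1.70, p = .04" recomputes to p ≈ .100: inconsistent, and the
    # error flips the significance decision.
    print(check_t_test(1.70, 28, 0.04))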

DOI : https://dx.doi.org/10.17605/OSF.IO/SGBTA

Replicability and Reproducibility in Comparative Psychology

Author : Jeffrey R. Stevens

Psychology faces a replication crisis. The Reproducibility Project: Psychology sought to replicate the effects of 100 psychology studies. Though 97% of the original studies produced statistically significant results, only 36% of the replication studies did so (Open Science Collaboration, 2015).

This inability to replicate previously published results, however, is not limited to psychology (Ioannidis, 2005). Replication projects in medicine (Prinz et al., 2011) and behavioral economics (Camerer et al., 2016) resulted in replication rates of 25 and 61%, respectively, and analyses in genetics (Munafò, 2009) and neuroscience (Button et al., 2013) question the validity of studies in those fields. Science, in general, is reckoning with challenges in one of its basic tenets: replication.

Comparative psychology also faces the grand challenge of producing replicable research. Though social psychology has borne the brunt of the critique regarding failed replications, comparative psychology suffers from some of the same problems faced by social psychology (e.g., small sample sizes).

Yet, comparative psychology follows the methods of cognitive psychology by often using within-subjects designs, which may buffer it from replicability problems (Open Science Collaboration, 2015). In this Grand Challenge article, I explore the shared and unique challenges of and potential solutions for replication and reproducibility in comparative psychology.
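
One reason within-subjects designs may provide such a buffer is statistical power: for a comparable effect size, a paired test needs far fewer participants than a two-group comparison. The back-of-the-envelope calculation below is an editorial illustration with arbitrary inputs (d = 0.5, 80% power, alpha = .05), not an analysis from the article.

    # Illustrative power comparison (arbitrary inputs, not from the article):
    # participants needed for 80% power at alpha = .05 with a medium effect
    # (d = 0.5), between-subjects versus within-subjects (paired) t-test.
    import math
    from statsmodels.stats.power import TTestIndPower, TTestPower

    ALPHA, POWER, EFFECT = 0.05, 0.80, 0.5

    # Independent-groups t-test: solve_power returns the sample size per group.
    n_per_group = TTestIndPower().solve_power(effect_size=EFFECT, alpha=ALPHA, power=POWER)

    # Paired t-test: assumes the effect size on the difference scores is also
    # 0.5, which holds when the two conditions correlate at r = .5.
    n_pairs = TTestPower().solve_power(effect_size=EFFECT, alpha=ALPHA, power=POWER)

    print(f"Between-subjects: {math.ceil(n_per_group)} per group "
          f"({2 * math.ceil(n_per_group)} in total)")
    print(f"Within-subjects:  {math.ceil(n_pairs)} participants in total")

With these inputs, the two-group design needs roughly four times as many participants (about 128 versus 34), which is why within-subjects work can retain adequate power even with the small samples typical of comparative research.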

URL : http://journal.frontiersin.org/article/10.3389/fpsyg.2017.00862/full

Intellectual Property’s Great Fallacy

Intellectual property law has long been justified on the belief that external incentives are necessary to get people to produce artistic works and technological innovations that are easily copied.

This Essay argues that this foundational premise of the economic theory of intellectual property is wrong. Using recent advances in behavioral economics, psychology, and business-management studies, it is now possible to show that there are natural and intrinsic motivations that will cause technology and the arts to flourish even in the absence of externally supplied rewards, such as copyrights and patents.

URL : http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1746343