Scholarly Communication and Open Access in Psychology: Current Considerations for Researchers

Author : Laura Bowering Mullen

Scholarly communication and open access practices in psychological science are rapidly evolving. However, most published works that focus on scholarly communication issues do not target a specific discipline, and instead take a more “one size fits all” approach.

When it comes to scholarly communication, practices and traditions vary greatly across the disciplines. It is important to look at issues such as open access (of all types), reproducibility, research data management, citation metrics, the emergence of preprint options, the evolution of new peer review models, coauthorship conventions, and use of scholarly networking sites such as ResearchGate and Academia.edu from a disciplinary perspective.

Important issues in scholarly publishing for psychology include uptake of authors’ use of open access megajournals, how open science is represented in psychology journals, challenges of interdisciplinarity, and how authors avail themselves of green and gold open access strategies.

This overview presents a discipline-focused treatment of selected scholarly communication topics that will allow psychology researchers and others to get up to speed on this expansive topic.

Further study of researcher behavior around scholarly communication in psychology would deepen understanding of the existing culture and provide early career researchers with a more effective roadmap to the current landscape.

As no other single work provides a study of scholarly communication and open access in psychology, this work aims to partially fill that niche.

DOI : https://doi.org/10.31234/osf.io/2d7um

Correcting duplicate publications: follow up study of MEDLINE tagged duplications

Authors : Mario Malički, Ana Utrobičić, Ana Marušić

Introduction

As MEDLINE indexers tag similar articles as duplicates even when journals have not addressed the duplication(s), we sought to determine the reasons behind the tagged duplications, and whether the journals had undertaken or planned to undertake any actions to address them.

Materials and methods

On 16 January 2013, we extracted all tagged duplicate publications (DPs), analysed published notices, and then contacted MEDLINE and editors regarding cases unaddressed by notices.

For non-respondents, we compared full text of the articles. We followed up the study for the next 5 years to see if any changes occurred.

Results

We found 1011 indexed DPs, which represented 555 possible DP cases (in MEDLINE, both the original and the duplicate are assigned a DP tag). Six cases were excluded as we could not obtain their full text.

An additional 190 (35%) cases were incorrectly tagged as DPs. Of the 359 actual cases of DPs, 200 (56%) were due to publishers’ actions (e.g. identical publications in the same journal), and 159 (44%) due to authors’ actions (e.g. article submission to more than one journal). Of the 359 cases, 185 (52%) were addressed by notices, but only 25 (7%) were retracted.

Following our notifications, MEDLINE corrected 138 (73%) of the incorrectly tagged cases, and editors retracted 8 articles.

Conclusions

Despite clear policies on how to handle DPs, just over half (54%) of the DPs in MEDLINE were addressed by journals and only 9% were retracted. Publishers, editors, and indexers need to develop and implement standards for better correction of duplicate published records.

URL : Correcting duplicate publications: follow up study of MEDLINE tagged duplications

DOI : https://doi.org/10.11613/BM.2019.010201

Merits and Limits: Applying open data to monitor open access publications in bibliometric databases

Authors : Aliakbar Akbaritabar, Stephan Stahlschmidt

Identifying and monitoring Open Access (OA) publications might seem a trivial task, but practical efforts prove otherwise. Contradictory information often arises depending on the metadata employed.

We strive to assign OA status to publications in Web of Science (WOS) and Scopus while complementing it with different sources of OA information to resolve contradicting cases.

We linked publications from WOS and Scopus via DOIs and ISSNs to Unpaywall, Crossref, DOAJ and ROAD. Only about 50% of articles and reviews from WOS and Scopus could be matched via a DOI to Unpaywall.
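The DOI-based linking to Unpaywall described above can be sketched as follows. The record fields (`is_oa`, `oa_status`) mirror Unpaywall’s public v2 API, but the helper names, the contradiction check, and the offline handling are illustrative assumptions, not the authors’ actual pipeline.

```python
# Hedged sketch: classify a publication's OA status from an Unpaywall-style
# record and flag disagreement with a second source (e.g. a Crossref licence).

def unpaywall_url(doi: str, email: str) -> str:
    """Build a request URL for Unpaywall's public v2 API."""
    return f"https://api.unpaywall.org/v2/{doi}?email={email}"

def oa_status(record: dict) -> str:
    """Reduce an Unpaywall-style record to a coarse OA label."""
    if not record.get("is_oa"):
        return "closed"
    # Unpaywall reports e.g. "gold", "green", "hybrid", "bronze".
    return record.get("oa_status", "unknown")

def contradicts(unpaywall_record: dict, other_source_says_oa: bool) -> bool:
    """True when the two sources disagree on the basic OA/closed question."""
    return bool(unpaywall_record.get("is_oa")) != other_source_says_oa
```

In a real run, each DOI extracted from WOS or Scopus would be fetched over HTTP, and contradictory cases (the more-than-25% reported above) would be queued for manual checking; the sketch keeps the decision logic offline and testable.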

Matching with Crossref yielded 56 distinct licences, which in many cases define the legally binding access status of publications. But only 44% of publications hold a single licence on Crossref, while more than 50% have no licence information submitted to Crossref.

Contrasting OA information from Crossref licences with Unpaywall, we found contradictory cases amounting overall to more than 25%, which might be partially explained by the (ex-)inclusion of green OA.

A further manual check found about 17% of OA publications that are not accessible and 15% of non-OA publications that are accessible through publishers’ websites. These preliminary results suggest that identifying the OA status of publications remains a difficult and currently unfulfilled task.

URL : https://arxiv.org/abs/1902.03937

Open data to evaluate academic researchers: an experiment with the Italian Scientific Habilitation

Authors : Angelo Di Iorio, Silvio Peroni, Francesco Poggi

The need for scholarly open data is ever increasing. While there are large repositories of open access articles and free publication indexes, there are still only a few free citation networks, and their coverage is partial.

As a result, most evaluation processes based on citation counts rely on commercial citation databases. Things are changing under the pressure of the Initiative for Open Citations (I4OC), which campaigns for scholarly publishers to make their citation data fully open.

This paper investigates the growth of open citations with an experiment on the Italian Scientific Habilitation, the national process for university professor qualification, which currently uses data from commercial indexes.

We simulated the procedure using only open data and explored similarities and differences with the official results. The outcomes of the experiment show that the amount of open citation data currently available is not yet sufficient to obtain similar results.

URL : https://arxiv.org/abs/1902.03287

Improving the discoverability and web impact of open repositories: techniques and evaluation

Author : George Macgregor

In this contribution we experiment with a suite of repository adjustments and improvements performed on Strathprints, the institutional repository of the University of Strathclyde, Glasgow, powered by EPrints 3.3.13.

These adjustments were designed to support improved repository web visibility and user engagement, thereby improving usage. Although the experiments were performed on EPrints, it is thought that most of the adopted improvements are equally applicable to other repository platforms.

Following preliminary results reported elsewhere, and using Strathprints as a case study, this paper outlines the approaches implemented, reports on comparative search traffic data and usage metrics, and delivers conclusions on the efficacy of the techniques implemented.

The evaluation provides persuasive evidence that specific enhancements to technical aspects of a repository can result in significant improvements to repository visibility, resulting in a greater web impact and consequent increases in content usage.

COUNTER usage grew by 33% and traffic to Strathprints from Google and Google Scholar was found to increase by 63% and 99% respectively. Other insights from the evaluation are also explored.

The results are likely to positively inform the work of repository practitioners and open scientists.

URL : https://journal.code4lib.org/articles/14180

A Principled Approach to Online Publication Listings and Scientific Resource Sharing

Authors : Jacquelijn Ringersma, Karin Kastens, Ulla Tschida, Jos van Berkum

The Max Planck Institute (MPI) for Psycholinguistics has developed a service to manage and present the scholarly output of its researchers. The PubMan database manages publication metadata and the full texts of publications published by its scholars.

All relevant information regarding a researcher’s work is brought together in this database, including supplementary materials and links to the MPI database for primary research data.

The PubMan metadata is harvested into the MPI website CMS (Plone). The system developed for creating the publication lists allows the researcher to select from the harvested data and output it in a variety of formats.
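Repository systems of this kind commonly expose their metadata for harvesting over OAI-PMH in Dublin Core; whether PubMan’s harvest into Plone works exactly this way is an assumption here. A minimal sketch of parsing titles out of a ListRecords response:

```python
# Hedged sketch: extract dc:title values from an OAI-PMH ListRecords
# response. The sample XML is illustrative, not PubMan's actual output.
import xml.etree.ElementTree as ET

NS = {
    "oai": "http://www.openarchives.org/OAI/2.0/",
    "dc": "http://purl.org/dc/elements/1.1/",
}

def harvest_titles(oai_xml: str) -> list[str]:
    """Return every dc:title found anywhere in the response."""
    root = ET.fromstring(oai_xml)
    return [t.text for t in root.findall(".//dc:title", NS)]

SAMPLE = """<?xml version="1.0"?>
<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record><metadata>
      <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                 xmlns:dc="http://purl.org/dc/elements/1.1/">
        <dc:title>Example article on psycholinguistics</dc:title>
      </oai_dc:dc>
    </metadata></record>
  </ListRecords>
</OAI-PMH>"""
```

A harvester would fetch such responses page by page (following OAI-PMH resumption tokens) and push the extracted records into the CMS.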

URL : https://journal.code4lib.org/articles/2520

“When You Use Social Media You Are Not Working”: Barriers for the Use of Metrics in Social Sciences

Authors : Steffen Lemke, Maryam Mehrazar, Athanasios Mazarakis, Isabella Peters

The Social Sciences have long been struggling with quantitative forms of research assessment—insufficient coverage in prominent citation indices and overall lower citation counts than in STM subject areas have led to a widespread weariness regarding bibliometric evaluations among social scientists.

Fueled by the rise of the social web, new hope is often placed on alternative metrics that measure the attention scholarly publications receive online, in particular on social media. But almost a decade after the coining of the term altmetrics for this new group of indicators, the uptake of the concept in the Social Sciences still seems to be low.

Just like with traditional bibliometric indicators, one central problem hindering the applicability of altmetrics for the Social Sciences is the low coverage of social science publications on the respective data sources—which in the case of altmetrics are the various social media platforms on which interactions with scientific outputs can be measured.

Another reason is that social scientists have strong opinions about the usefulness of metrics for research evaluation which may hinder broad acceptance of altmetrics too. We conducted qualitative interviews and online surveys with researchers to identify the concerns which inhibit the use of social media and the utilization of metrics for research evaluation in the Social Sciences.

By analyzing the response data from the interviews in conjunction with the response data from the surveys, we identify the key concerns that inhibit social scientists from (1) applying social media for professional purposes and (2) making use of the wide array of metrics available.

Our findings show that aspects of time consumption, privacy, dealing with information overload, and prevalent styles of communication are predominant concerns inhibiting Social Science researchers from using social media platforms for their work.

Regarding indicators for research impact, we identify a widespread lack of knowledge about existing metrics, their methodologies, and their meanings as a major hindrance to their uptake by social scientists.

The results have implications for future developments of scholarly online tools and show that researchers could benefit considerably from additional formal training regarding the correct application and interpretation of metrics.

URL : “When You Use Social Media You Are Not Working”: Barriers for the Use of Metrics in Social Sciences

DOI : https://doi.org/10.3389/frma.2018.00039