Analysing Elsevier Journal Metadata with a New Specialized Workbench inside ICSR Lab

Authors : Ramadurai Petchiappan, Kristy James, Andrew Plume, Efthymios Tsakonas, Ana Marušić, Mario Malicki, Francisco Grimaldo, Bahar Mehmani

In this white paper we introduce Elsevier’s Peer Review Workbench, which will be available via the computational platform ICSR Lab. The workbench offers a unique dataset to researchers who want to conduct research on journal evaluation and peer review processes.

We describe its properties, advantages, and limitations, as well as the proposal application process. This is a living document and will be updated regularly.

DOI : https://doi.org/10.2139/ssrn.4211833

Reading Peer Review : PLOS ONE and Institutional Change in Academia

Authors : Martin Paul Eve, Cameron Neylon, Daniel Paul O’Donnell, Samuel Moore, Robert Gadie, Victoria Odeniyi, Shahina Parvin

This Element describes for the first time the database of peer review reports at PLOS ONE, the largest scientific journal in the world, to which the authors had unique access.

Specifically, this Element presents the background contexts and histories of peer review, the data-handling sensitivities of this type of research, the typical properties of reports in the journal to which the authors had access, a taxonomy of the reports, and their sentiment arcs.

This unique work thereby yields a compelling and unprecedented set of insights into the evolving state of peer review in the twenty-first century, at a crucial political moment for the transformation of science.

It also, though, presents a study in radicalism and the ways in which PLOS’s vision for science can be said to have effected change in the ultra-conservative contemporary university.

URL : Reading Peer Review : PLOS ONE and Institutional Change in Academia

Original location : https://www.cambridge.org/core/elements/reading-peer-review/42F027E4C67D246DD8C3AC440A68C7A7

Adoption of Transparency and Openness Promotion (TOP) Guidelines across Journals

Authors : Inga Patarčić, Jadranka Stojanovski

Journal policies continuously evolve to enable knowledge sharing and support reproducible science, but that evolution happens within a certain framework. The Transparency and Openness Promotion (TOP) guidelines comprise eight modular standards, each with three levels of increasing stringency, and can be used to evaluate the extent to which, and how stringently, journals promote open science.

The guidelines define standards for data citation; transparency of data, materials, code, and design and analysis; replication; and plan and study preregistration, along with two effective interventions: “Registered Reports” and “Open Science Badges”. The adoption levels, summed across standards, define a journal’s TOP Factor. In this paper, we analysed the status of adoption of TOP guidelines across two thousand journals reported in the TOP Factor metrics.
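As a back-of-the-envelope illustration of how the TOP Factor aggregates, here is a minimal Python sketch. The standard names and the 0–3 level coding follow the description above; this is an illustrative sketch, not the official TOP Factor implementation.

```python
# Illustrative sketch: a journal's TOP Factor as the sum of its adoption
# levels (0 = not implemented, 1-3 = increasing stringency) across the
# eight TOP standards.

TOP_STANDARDS = [
    "Data citation",
    "Data transparency",
    "Analytic methods (code) transparency",
    "Research materials transparency",
    "Design and analysis transparency",
    "Study preregistration",
    "Analysis plan preregistration",
    "Replication",
]

def top_factor(adoption_levels: dict) -> int:
    """Sum adoption levels (0-3) across standards; unlisted standards count as 0."""
    for standard, level in adoption_levels.items():
        if standard not in TOP_STANDARDS:
            raise ValueError(f"Unknown standard: {standard}")
        if level not in (0, 1, 2, 3):
            raise ValueError(f"Level must be 0-3, got {level}")
    return sum(adoption_levels.get(s, 0) for s in TOP_STANDARDS)

# Example: Level 1 data citation plus Level 2 data transparency
example = {"Data citation": 1, "Data transparency": 2}
print(top_factor(example))  # 3
```

Under this coding, a journal at the maximum level on all eight standards scores 24, matching the 24-point maximum used in the journal-policy audit summarised elsewhere in this digest.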

We show that the majority of the journals’ policies align with at least one of the TOP standards, most commonly “Data citation” (70%), followed by “Data transparency” (19%). Two-thirds of the adoptions of TOP standards are at stringency Level 1 (the least stringent), whereas only 9% are at stringency Level 3.

Adoption of TOP standards differs across scientific disciplines: multidisciplinary journals (N = 1505) and social science journals (N = 1077) show the greatest number of adoptions. The measures journals take to implement open science practices could be improved in three ways: (1) improvements could be made discipline-specific; (2) journals that have not yet adopted the TOP guidelines could do so; and (3) the stringency of adoptions could be increased.

URL : Adoption of Transparency and Openness Promotion (TOP) Guidelines across Journals

DOI : https://doi.org/10.3390/publications10040046

Open science and conflict of interest policies of medical and health sciences journals before and during the COVID-19 pandemic: A repeat cross-sectional study

Authors : Antoni D. Gardener, Ellen J. Hick, Chloe Jacklin, Gifford Tan, Aidan G. Cashin, Hopin Lee, David Nunan, Elaine C. Toomey, Georgia C. Richards

Objectives

To audit the transparent and open science standards of health and medical sciences journal policies and explore the impact of the COVID-19 pandemic.

Design

Repeat cross-sectional study.

Setting

19 journals listed in Google Scholar’s Top Publications for health and medical sciences.

Participants

Blood, Cell, Circulation, European Heart Journal, Gastroenterology, Journal of Clinical Oncology, Journal of the American College of Cardiology, Nature Genetics, Nature Medicine, Nature Neuroscience, Neuron, PLoS ONE, Proceedings of the National Academy of Sciences, Science Translational Medicine, The British Medical Journal, The Journal of the American Medical Association, The Lancet, The Lancet Oncology, and The New England Journal of Medicine.

Main outcome measures

We used the Transparency and Openness Promotion (TOP) guideline and the International Committee of Medical Journal Editors (ICMJE) requirements for disclosing conflicts of interest (COIs) to evaluate journals’ standards.

Results

TOP scores slightly improved during the COVID-19 pandemic, from a median of 5 (IQR: 2–12.5) out of a possible 24 points in February 2020 to 7 (IQR: 4–12) in May 2021, but overall, scores were very low at both time points. Journal policies scored highest for their adherence to data transparency and scored lowest for preregistration of study protocols and analysis plans and the submission of replication studies. Most journals fulfilled all ICMJE provisions for reporting COIs before (84%; n = 16) and during (95%; n = 18) the COVID-19 pandemic.

Conclusions

The COVID-19 pandemic has highlighted the importance of practising open science. However, requirements for open science practices in audited policies were overall low, which may impede progress in health and medical research. As key stakeholders in disseminating research, journals should promote a research culture of greater transparency and more robust open science practices.

URL : Open science and conflict of interest policies of medical and health sciences journals before and during the COVID-19 pandemic: A repeat cross-sectional study

DOI : https://doi.org/10.1177/20542704221132139

Knowledge Production: Analysing Gender- and Country-Dependent Factors in Research Topics through Term Communities

Authors : Parminder Bakshi-Hamm, Andreas Hamm

Scholarly publications are among the most tangible forms of knowledge production. It is therefore important to analyse them, amongst other features, for gender or country differences and the attendant inequalities.

While there are many quantitative studies of publication activities and success in terms of publication numbers and citation counts, a more content-related understanding of differences in the choice of research topics is rare.

The present paper suggests an innovative method of using term communities in co-occurrence networks for detecting and evaluating the gender- and country-specific distribution of topics in research publications. The method is demonstrated with a pilot study based on approximately a quarter of a million publication abstracts from seven diverse research areas.
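The first step of such a method, building a term co-occurrence network from abstracts, can be sketched in a few lines. This is an illustrative sketch, not the authors' implementation: it assumes two terms co-occur when they appear in the same abstract, and it omits the subsequent community-detection step on the resulting network.

```python
from collections import Counter
from itertools import combinations

def cooccurrence_edges(abstracts, min_count=1):
    """Count how often each pair of distinct terms appears in the same abstract.

    Returns a dict mapping alphabetically ordered term pairs to their
    co-occurrence count, keeping only pairs seen at least min_count times.
    """
    edges = Counter()
    for text in abstracts:
        # Naive tokenization for illustration; a real pipeline would
        # normalize, stem, and filter stop words.
        terms = sorted(set(text.lower().split()))
        for a, b in combinations(terms, 2):
            edges[(a, b)] += 1
    return {pair: n for pair, n in edges.items() if n >= min_count}

edges = cooccurrence_edges(["gene expression data", "gene expression network"])
print(edges[("expression", "gene")])  # 2
```

Community-detection algorithms (e.g. modularity-based clustering) would then partition this weighted network into the term communities that stand in for research topics.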

In this example, the method validly reconstructs all obvious topic preferences, for instance, country-dependent language-related preferences. It also produces new insight into country-specific research focuses. It emerges that in all seven subject areas studied, topic preferences are significantly different depending on whether all authors are women, all authors are men, or there are female and male co-authors, with a tendency of male authors towards theoretical core topics, of female authors towards peripheral applied topics, and of mixed-author teams towards modern interdisciplinary topics.

URL : Knowledge Production: Analysing Gender- and Country-Dependent Factors in Research Topics through Term Communities

DOI : https://doi.org/10.3390/publications10040045

Academic Tracker: Software for tracking and reporting publications associated with authors and grants

Authors : P. Travis Thompson, Christian D. Powell, Hunter N. B. Moseley

In recent years, United States federal funding agencies, including the National Institutes of Health (NIH) and the National Science Foundation (NSF), have implemented public access policies to make research supported by funding from these federal agencies freely available to the public.

Enforcement is primarily through annual and final reports submitted to these funding agencies, where all peer-reviewed publications must be registered through the appropriate mechanism as required by the specific federal funding agency. Unreported and/or incorrectly reported papers can result in delayed acceptance of annual and final reports and even funding delays for current and new research grants.

It is therefore important to ensure that every peer-reviewed publication is reported properly and in a timely manner. For large collaborative research efforts, tracking and properly registering peer-reviewed publications, along with generating accurate annual and final reports, can create a large administrative burden. With large collaborative teams, it is easy for these administrative tasks to be overlooked, forgotten, or lost in the shuffle. To help with this reporting burden, we have developed the Academic Tracker software package, implemented in the Python 3 programming language and supporting Linux, Windows, and Mac operating systems.

Academic Tracker helps with publication tracking and reporting by comprehensively searching major peer-reviewed publication tracking web portals, including PubMed, Crossref, ORCID, and Google Scholar, given a list of authors. Academic Tracker provides highly customizable reporting templates so information about the resulting publications is easily transformed into appropriate formats for tracking and reporting purposes.
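For a sense of the kind of lookup such a tool performs, here is a minimal stand-alone sketch against the public Crossref REST API (one of the portals named above). It builds an author query URL using Crossref's documented `query.author` parameter and extracts DOIs from a works response parsed offline, so no network access is needed. This is illustrative only and does not use Academic Tracker's own interface.

```python
import json
from urllib.parse import urlencode

CROSSREF_WORKS = "https://api.crossref.org/works"  # public Crossref REST API

def author_query_url(author: str, rows: int = 20) -> str:
    """Build a Crossref works query filtered by author name."""
    return f"{CROSSREF_WORKS}?{urlencode({'query.author': author, 'rows': rows})}"

def extract_dois(response_text: str) -> list:
    """Pull the DOIs out of a Crossref works response body."""
    payload = json.loads(response_text)
    return [item["DOI"] for item in payload["message"]["items"]]

# A tiny mock response in Crossref's envelope format, parsed offline.
sample = json.dumps({"message": {"items": [{"DOI": "10.1234/example"}]}})
print(author_query_url("Moseley"))
print(extract_dois(sample))  # ['10.1234/example']
```

A tracking tool would issue such queries per author, merge the results across portals (PubMed, ORCID, Google Scholar), and deduplicate by DOI before filling its reporting templates.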

The source code and extensive documentation are hosted on GitHub (https://moseleybioinformaticslab.github.io/academic_tracker/) and are also available on the Python Package Index (https://pypi.org/project/academic_tracker) for easy installation.

A quantitative and qualitative open citation analysis of retracted articles in the humanities

Authors : Ivan Heibi, Silvio Peroni

In this article, we show and discuss the results of a quantitative and qualitative analysis of open citations to retracted publications in the humanities domain. Our study was conducted by selecting retracted papers in the humanities domain and marking their main characteristics (e.g., retraction reason).

Then, we gathered the citing entities and annotated their basic metadata (e.g., title, venue, etc.) and the characteristics of their in-text citations (e.g., intent, sentiment, etc.). Using these data, we performed a quantitative and qualitative study of retractions in the humanities, presenting descriptive statistics and a topic modeling analysis of the citing entities’ abstracts and the in-text citation contexts.
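The annotation scheme described above can be pictured as a small data model. The field names and helper below are illustrative inventions for this sketch, not the authors' actual schema or code.

```python
from dataclasses import dataclass

@dataclass
class InTextCitation:
    """One annotated in-text citation of a retracted paper (illustrative schema)."""
    context: str                      # sentence(s) surrounding the citation
    intent: str                       # e.g. "background", "method", "criticism"
    sentiment: str                    # "positive", "neutral", or "negative"
    mentions_retraction: bool = False # does the citing text flag the retraction?

def negative_or_aware(citations):
    """Fraction of citations that flag the retraction or carry negative sentiment."""
    if not citations:
        return 0.0
    flagged = [c for c in citations
               if c.mentions_retraction or c.sentiment == "negative"]
    return len(flagged) / len(citations)

cits = [
    InTextCitation("...", "background", "neutral"),
    InTextCitation("...", "criticism", "negative"),
]
print(negative_or_aware(cits))  # 0.5
```

Aggregating such a fraction per discipline is one simple way to compare retraction awareness across the health sciences, humanities, and social sciences, as the study does qualitatively.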

As part of our main findings, we noticed no drop in the overall number of citations after the year of retraction, and few citing entities either mentioned the retraction or expressed a negative sentiment toward the cited publication.

In addition, on several occasions we noticed that citing entities from the health sciences domain showed greater concern and awareness about citing a retracted publication than those from the humanities and social sciences domains. Philosophy, arts, and history are the humanities areas that showed the greatest concern about the retractions.

URL : A quantitative and qualitative open citation analysis of retracted articles in the humanities

DOI : https://doi.org/10.1162/qss_a_00222