Exploring how the public “see” scientists: A systematic literature review, 1983–2024

Authors : Weiping Wang, Hongxuan Ji, Ying Wang, Zhisen Wang

The public image of scientists significantly influences scientific literacy, science education, professional identity, science communication, and societal attitudes toward public issues. However, there has not been a thorough and detailed review of this topic. This paper presents a Systematic Literature Review (SLR) of 233 high-quality articles examining public perceptions of scientists.

The findings indicate that studies emphasize vivid and emotionally engaging characteristics of scientists, reflecting contemporary trends, particularly during the pandemic. Research predominantly targets students across various educational levels, highlighting a gap between science education and science communication, with a reliance on quantitative methods despite the use of visualization tools.

Key research limitations include a lack of humanistic perspective, issues with validity and reproducibility, insufficient cultural context analysis, weak causal inferences, and limited integration of artificial intelligence and big data, which impede advancements in science education.

The paper concludes with recommendations for developing a more comprehensive conceptual framework to bridge the gaps between science education and communication, as well as their relationship with science teaching, in order to foster a positive public understanding of science.

URL : Exploring how the public “see” scientists: A systematic literature review, 1983–2024

DOI : https://doi.org/10.1057/s41599-025-05869-7

Scientific publishing without gatekeeping: an empirical investigation of eLife’s new peer review process

Authors : Rüdiger Mutz, Lutz Bornmann, Hans‑Dieter Daniel

At the end of January 2023, eLife introduced a new publishing model alongside the old, traditional one: all manuscripts submitted as preprints are peer-reviewed and published if the editorial team deems them worthy of review (“editorial triage”). The model abandons the gatekeeping function while retaining the previous “consultative approach to peer review”.

Even under the changed conditions, the question of the quality of judgements in the peer review process remains. In this study, the reviewers’ ratings of manuscripts submitted to eLife were examined through descriptive comparisons of the two peer review models and against selected quality criteria of peer review: interrater agreement and interrater reliability. eLife provided us with the data on all manuscripts submitted in 2023 under the new publishing model (group 3: N = 3,846), as well as manuscripts submitted under the old publishing model (group 1: N = 6,592 submissions from 2019; group 2: N = 364 submissions from 2023).

Interrater agreement and interrater reliability for the criteria “significance of findings” and “strength of support” were low, similar to what previous empirical studies have reported for gatekeeping journals.
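As a rough illustration of these two criteria (not the authors’ actual procedure, and with invented ratings), the following Python sketch computes percentage exact agreement and a quadratic-weighted Cohen’s kappa for two reviewers rating the same manuscripts on an ordinal scale:

```python
# Illustrative sketch only: percentage agreement and chance-corrected reliability
# for two reviewers rating the same manuscripts on an ordinal scale.
# The rating scale and the data below are hypothetical, not eLife's actual data.
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical ordinal ratings (e.g., 1 = low ... 4 = high "strength of support")
reviewer_a = np.array([3, 2, 4, 1, 3, 2, 4, 3])
reviewer_b = np.array([3, 3, 4, 2, 2, 2, 3, 3])

exact_agreement = np.mean(reviewer_a == reviewer_b)            # interrater agreement
kappa = cohen_kappa_score(reviewer_a, reviewer_b,
                          weights="quadratic")                 # interrater reliability

print(f"Exact agreement: {exact_agreement:.2f}")
print(f"Quadratic-weighted kappa: {kappa:.2f}")
```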

The fairness of peer review is not, or only slightly, compromised. We used the empirical results of our study to recommend several improvements to the new publishing model introduced by eLife, such as increasing transparency, masking author identity, or increasing the number of expert reviewers.

URL : Scientific publishing without gatekeeping: an empirical investigation of eLife’s new peer review process

DOI : https://doi.org/10.1007/s11192-025-05422-y

A decentralized future for the open-science databases

Authors : Gaurav Sharma, Viorel Munteanu, Nika Mansouri Ghiasi, Jineta Banerjee, Susheel Varma, Luca Foschini, Kyle Ellrott, Onur Mutlu, Dumitru Ciorbă, Roel A. Ophoff, Viorel Bostan, Christopher E. Mason, Jason H. Moore, Despoina Sousoni, Arunkumar Krishnan, Mihai Dimian, Gustavo Stolovitzky, Fabio G. Liberante, Taras K. Oleksyk, Serghei Mangul

Continuous and reliable access to curated biological data repositories is indispensable for accelerating rigorous scientific inquiry and fostering reproducible research. Centralized repositories, though widely used, are vulnerable to single points of failure arising from cyberattacks, technical faults, natural disasters, or funding and political uncertainties.

This can lead to widespread data unavailability, data loss, integrity compromises, and substantial delays in critical research, ultimately impeding scientific progress. Centralizing essential scientific resources in a single geopolitical or institutional hub is inherently dangerous, as any disruption can paralyze diverse ongoing research.

The rapid acceleration of data generation, combined with an increasingly volatile global landscape, necessitates a critical re-evaluation of the sustainability of centralized models. Implementing federated and decentralized architectures presents a compelling and future-oriented pathway to substantially strengthen the resilience of scientific data infrastructures, thereby mitigating vulnerabilities and ensuring the long-term integrity of data.

Here, we examine the structural limitations of centralized repositories, evaluate federated and decentralized models, and propose a hybrid framework for resilient, FAIR, and sustainable scientific data stewardship. Such an approach offers a significant reduction in exposure to governance instability, infrastructural fragility, and funding volatility, and also fosters fairness and global accessibility.

The future of open science depends on integrating these complementary approaches to establish a globally distributed, economically sustainable, and institutionally robust infrastructure that safeguards scientific data as a public good, further ensuring continued accessibility, interoperability, and preservation for generations to come.
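As a minimal sketch of the resilience argument, assuming hypothetical mirror URLs and a placeholder checksum rather than any existing federation protocol, a client could fall back across replicated repositories and verify data integrity like this:

```python
# Minimal illustration of resilience via replication: try federated mirrors
# in order and verify integrity against a known checksum. All URLs and the
# checksum below are placeholders, not real infrastructure.
import hashlib
import urllib.request

MIRRORS = [
    "https://mirror-eu.example.org/datasets/genome_v1.tar.gz",
    "https://mirror-us.example.org/datasets/genome_v1.tar.gz",
    "https://mirror-asia.example.org/datasets/genome_v1.tar.gz",
]
EXPECTED_SHA256 = "0" * 64  # placeholder checksum

def fetch_with_failover(mirrors, expected_sha256):
    """Return dataset bytes from the first mirror that answers and verifies."""
    for url in mirrors:
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                data = resp.read()
        except OSError:
            continue  # mirror unreachable: fall back to the next one
        if hashlib.sha256(data).hexdigest() == expected_sha256:
            return data  # integrity verified
    raise RuntimeError("No mirror delivered a valid copy of the dataset")

# data = fetch_with_failover(MIRRORS, EXPECTED_SHA256)
```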

DOI : https://doi.org/10.48550/arXiv.2509.19206

Trends of Publication of Negative Trials Over Time

Authors : Bruno Laviolle, Clara Locher, Jean-Sébastien Allain, Quentin Le Cornu, Pierre Charpentier, Marie Lefebvre, Clémence Le Pape, Cyril Leven, Clément Palpacuer, Clémence Pontoizeau, Eric Bellissant, Florian Naudet

Studies with negative results are less likely to be published than others, potentially leading to publication bias. Introduced in 2000, trial registration could have contributed to decreasing the proportion of unpublished studies. We assessed the proportion of negative randomized controlled trials (RCTs) over the last 20 years.

We searched Medline for RCTs published in 2000, 2005, 2010, 2015, and 2020 in the British Medical Journal, the Journal of the American Medical Association, the Lancet, and the New England Journal of Medicine. The primary endpoint was the proportion of negative studies (final comparison on the primary study endpoint not statistically significant or favoring the control arm) published in 2000 and 2020.

Factors independently associated with the publication of negative studies were identified using multivariable analysis. A total of 1,542 studies were included. The proportion of negative RCTs significantly increased between 2000 and 2020 (from 27.6% to 37.4%; P = 0.01); however, the trend over time was not significant (P = 0.203). In multivariable analysis, the following factors were associated with a higher proportion of published negative studies: superiority trials (P < 0.001), two-group trials (P < 0.001), number of patients ≥510 (P < 0.001), cardiology trials (P = 0.003), emergency/critical care trials (P < 0.001), obstetrics trials (P = 0.032), surgery trials (P = 0.006), and pneumology trials (P = 0.029).

Exclusive industry funding was associated with a lower proportion of published negative studies (P < 0.001). The proportion of published negative studies in 2020 was higher only when compared to 2000. During the two decades, no trend was noticeable. There is no clear relationship between trial registration and the publication of negative results over time.
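For illustration only, and using invented counts and trial-level data (the abstract reports only percentages and P values), the two kinds of comparisons above could be reproduced along these lines in Python:

```python
# Illustrative re-creation of the two analyses, with made-up numbers:
# the abstract gives only percentages, so all counts and data are hypothetical.
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency
import statsmodels.formula.api as smf

# --- Proportion of negative trials, 2000 vs 2020 (hypothetical counts) ---
#                 negative  non-negative
table = np.array([[80,      210],    # trials published in 2000
                  [140,     234]])   # trials published in 2020
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")

# --- Multivariable model: factors associated with a negative result ---
# Hypothetical trial-level dataset with a few of the covariates named above.
rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "negative": rng.integers(0, 2, n),       # 1 = negative trial
    "superiority": rng.integers(0, 2, n),    # superiority design
    "two_groups": rng.integers(0, 2, n),     # two-group trial
    "large_sample": rng.integers(0, 2, n),   # >= 510 patients
    "industry_only": rng.integers(0, 2, n),  # exclusive industry funding
})
model = smf.logit(
    "negative ~ superiority + two_groups + large_sample + industry_only",
    data=df,
).fit(disp=0)
print(model.summary2().tables[1])            # odds-ratio-scale coefficients and P values
```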

URL : Trends of Publication of Negative Trials Over Time

DOI : https://doi.org/10.1002/cpt.3535

Publish-Review-Curate Modelling for Data Paper and Dataset: A Collaborative Approach

Authors : Youngim Jung, Sungsoo Robert Ahn

Research datasets—capturing natural, societal, or artificial phenomena—are critical in generating new scientific insights, validating research models, and supporting data-intensive discovery. Data papers that describe and contextualise these datasets aim to ensure their findability, accessibility, interoperability, and reusability (FAIR) while providing academic credit to data creators.

However, the peer review of data papers and associated datasets presents considerable challenges, requiring reviewers to assess both the syntactic and semantic integrity of the data, metadata quality, and domain-specific scientific relevance. Furthermore, the coordination between journal editors, reviewers, and curators demands substantial effort, often leading to publication delays in the conventional review and then publishing framework.

This study proposes a novel Publish-Review-Curate (PRC) model tailored to the synchronised publication and review of data papers and their underlying datasets. Building on preprint and open science practices, the model defines a collaborative, multi-stakeholder workflow involving authors, peer reviewers, data experts, and journal editors.

The PRC model integrates open feedback, transparent peer review, and structured curation to improve research data’s quality, discoverability, and impact. By articulating conceptual and operational workflows, this study contributes a practical framework for modernising data publishing infrastructures and supporting the co-evaluation of narrative and data artefacts.
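The abstract describes the workflow only conceptually; as a purely hypothetical illustration (not the authors’ formal specification), the PRC stages and the artefacts attached to a submission could be modelled roughly as follows:

```python
# Hypothetical sketch of Publish-Review-Curate (PRC) stages and the artefacts
# attached to a submission, intended only to make the workflow concrete.
from dataclasses import dataclass, field
from enum import Enum, auto

class Stage(Enum):
    PREPRINT_PUBLISHED = auto()   # data paper and dataset posted openly
    OPEN_REVIEW = auto()          # peer reviewers and data experts comment
    CURATION = auto()             # curators check metadata and FAIR compliance
    VERSION_OF_RECORD = auto()    # editor links reviews, curation report, DOI

@dataclass
class Submission:
    data_paper: str
    dataset_doi: str
    stage: Stage = Stage.PREPRINT_PUBLISHED
    reviews: list[str] = field(default_factory=list)
    curation_notes: list[str] = field(default_factory=list)

    def advance(self) -> None:
        """Move the submission to the next PRC stage in order."""
        order = list(Stage)
        self.stage = order[min(order.index(self.stage) + 1, len(order) - 1)]

sub = Submission("Example data paper", "10.1234/example.doi")  # placeholder DOI
sub.advance()                                                  # -> OPEN_REVIEW
sub.reviews.append("Reviewer 1: metadata schema incomplete")
```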

URL : Publish-Review-Curate Modelling for Data Paper and Dataset: A Collaborative Approach

DOI : https://doi.org/10.1002/leap.2024

National Repository Infrastructure and Open Access Challenges: The Croatian Perspective

Authors : Ivana Matijević, Ivona Milovanović

Repositories are one of the key infrastructure components in achieving the goals of open science. In response to legal obligations, emerging trends, and challenges in open science, several Croatian institutions jointly established a national digital repository infrastructure in 2015 – the DABAR system (Digital Academic Archives and Repositories).

Its purpose is to provide a unified space for storing, preserving, and ensuring open access to the scholarly output of scientists and institutions within the Croatian science and higher education system.

After nearly a decade of operation, it is crucial to assess the role of this infrastructure today and evaluate whether it has successfully embodied the core principles of open science – openness, transparency, and the visibility of Croatian scientific and scholarly output. This paper presents the Croatian national repository infrastructure as a case study, offering insights for comparison with similar national infrastructures.

The study employs a quantitative research approach, divided into two parts to provide a comprehensive overview of the current state and future development of repositories in Croatia. The first part analyses quantitative data and repository statistics. The DABAR infrastructure currently comprises 182 repositories and hosts over 249,000 digital objects, yet only slightly more than 50% of them are openly accessible.

To investigate the reasons behind the high percentage of restricted or closed-access objects, a survey was conducted among institutions that primarily deposit such items.

The findings of this research contribute to a broader discussion on open science practices and repository management at both European and international levels. The results will serve as a foundation for further improvements to the infrastructure, the promotion of open science principles, and the development of systematic support mechanisms to encourage greater accessibility and transparency in scholarly communication.

URL : National Repository Infrastructure and Open Access Challenges: The Croatian Perspective

DOI : https://doi.org/10.53377/lq.23061

Analysis of scientific paper retractions due to data problems: Revealing challenges and countermeasures in data management

Authors : Wanfei Hu, Guiliang Yan, Jingyu Zhang, Zhenli Chen, Qing Qian, Sizhu Wu

Background

Scientific data, the cornerstone of scientific endeavors, face management challenges amid technological advances. While retractions have been analyzed, a rigorous focus on the data problems that lead to them has been missing.

Methods

This study collected 49,979 retraction records up to 17 December 2023. After screening, 16,842 records were related to data problems and 19,656 were due to other reasons. Descriptive statistics, hypothesis testing, and BERTopic (Bidirectional Encoder Representations from Transformers topic modelling) were applied, the latter to conduct a topic analysis of article titles.
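As a minimal sketch of such a topic analysis, using the open-source bertopic package with default settings and a public stand-in corpus (the study’s actual titles and configuration are not reproduced here):

```python
# Minimal BERTopic sketch: cluster short texts into topics, as one could do
# with retraction article titles. A public corpus stands in for the study's
# 16,842 data-related titles; settings are library defaults and need not
# match the study's configuration.
from bertopic import BERTopic
from sklearn.datasets import fetch_20newsgroups

docs = fetch_20newsgroups(subset="train",
                          remove=("headers", "footers", "quotes")).data[:2000]

topic_model = BERTopic(language="english", verbose=False)
topics, probs = topic_model.fit_transform(docs)

print(topic_model.get_topic_info().head())   # topic sizes and top keywords
print(topic_model.get_topic(0))              # keywords of the largest topic
```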

Results

The results show that since 2000, retractions due to data problems have increased significantly (p < 0.001), with the percentage in 2023 exceeding 75%. Among 16,842 data-related retractions, 59.0% were in Basic Life Sciences and 40.2% in Health Sciences. Data problems involve accuracy, reliability, validity, and integrity. There are significant differences (p < 0.001) in subjects, journal quartiles, retraction intervals, and other characteristics between data-related and other retractions. Data-related retractions are more concentrated in high-impact journals (Q1 37.6% and Q2 43.0%).

Conclusions

Institutions, publishers, and journals should adopt image-screening tools, enforce data deposition, standardize retraction notices, provide ethics training, and strengthen peer review to address these data problems, guiding better data management and healthier scientific development.

URL : Analysis of scientific paper retractions due to data problems: Revealing challenges and countermeasures in data management

DOI : https://doi.org/10.1080/08989621.2025.2531987