The impact of researchers’ perceived pressure on their publication strategies

Authors : David Johann, Jörg Neufeld, Kathrin Thomas, Justus Rathmann, Heiko Rauhut

This article investigates researchers’ publication strategies and how their perceived pressure to publish and to obtain external funding is related to these strategies. The analyses rely on data from the Zurich Survey of Academics (ZSoA), an online survey representative of academics working at higher education institutions in Germany, Austria, and Switzerland. The results suggest that academics pursue both instrumental and normative publication strategies.

The main finding is that academics who perceive high pressure to publish tend to employ instrumental publication strategies rather than normative ones: they are more likely to focus on the journal’s reputation and the speed of publication when selecting an outlet for peer review. Publishing results in open-access outlets or in native languages other than English is less important for those under pressure.

However, the extent to which researchers’ perceived pressure affects publication strategies also depends on other factors, such as the discrepancy between the time available for research and the time actually desired for research.

URL : The impact of researchers’ perceived pressure on their publication strategies

DOI : https://doi.org/10.1093/reseval/rvae011

Evolution of Peer Review in Scientific Communication

Author : Dmitry Kochetkov

It is traditionally believed that peer review is the backbone of academic journals and scientific communication, ensuring the quality of and trust in published materials. However, peer review only became an institutionalized practice in the second half of the 20th century, although the first scientific journals appeared three centuries earlier. By the beginning of the 21st century, the view had emerged that the traditional model of peer review is in deep crisis.

The aim of this article is to formulate a prospective model of peer review for scientific communication. The article discusses the evolution of the institution of scientific peer review and the formation of the current crisis. The author analyzed the modern landscape of innovations in peer review and scientific communication. Based on this analysis, three main peer review models in relation to the editorial workflow were identified: pre-publication peer review (the traditional model), registered reports, and post-publication review (including peer review of preprints).

The author argues that the third model offers the best way to implement the main functions of scientific communication.

URL : Evolution of Peer Review in Scientific Communication

DOI : https://doi.org/10.31235/osf.io/b2ra3

On the Fast Track to Full Gold Open Access

Author : Robert Kudelić

The world of scientific publishing is changing; the days of the old subscription-based revenue model for publishers seem to be over, and we are entering a new era. An ever-increasing number of journals from disparate publishers appear to be going Gold Open Access, yet have we rigorously ascertained the issue in its entirety, or are we touting the strengths while forgetting constructive criticism and a careful weighing of the evidence?

We therefore present the current state of the art on this more relevant than ever topic, in a compact review/bibliometrics style, and suggest solutions that are most likely to be acceptable to all parties. The analysis also suggests a link between trends in scientific publishing and tumultuous world events, which in turn has special significance for the publishing environment on the current world stage.

URL : On the Fast Track to Full Gold Open Access

arXiv : https://arxiv.org/abs/2311.08313

Collaborative design to bridge theory and practice in science communication

Authors :

The science communication field strives to connect theory and practice. This essay delves into the potential of collaborative design to bridge this gap. Collaborative design in science communication can involve scientists, science communication researchers, designers, and other stakeholders in developing new science communication solutions.

By incorporating diverse perspectives and expertise, it can help create more effective and evidence-based communication strategies that cater to the needs of audiences. To integrate these demands, a structured approach is necessary. This paper discusses two established frameworks, Design-Based Research and Design Thinking, and applies practical insights to envision the impact of collaborative design on the future of science communication.

URL : Collaborative design to bridge theory and practice in science communication

DOI : https://doi.org/10.22323/2.23020401

May contain English – The assessment effect on language in publications and how this has manifested over a decade of DOAB titles

Authors : Danny Kingsley, Ronald Snijder

Research assessment is a major driver of research behaviour. The current emphasis on journal citations in a limited number of English-focused journals has multiple effects. The need to publish in English even when it is not the local language affects the type of research undertaken and further consolidates a global North-centric view of science.

The bibliometric databases on which assessments of universities and journals are based are owned by two large corporate organisations, and this concentration of the market has in turn concentrated the research environment. Open infrastructure offers an alternative option for the research endeavour.

The OAPEN online open access library and the Directory of Open Access Books form part of this infrastructure, and we consider the pattern of languages present in the directories over time.

DOI : https://doi.org/10.31235/osf.io/c8yq3

A survey of how biology researchers assess credibility when serving on grant and hiring committees

Authors : Iain Hrynaszkiewicz, Beruria Novich, James Harney, Veronique Kiermer

Researchers who serve on grant review and hiring committees have to make decisions about the intrinsic value of research in short periods of time, and research impact metrics such as the Journal Impact Factor (JIF) exert undue influence on these decisions. Initiatives such as the Coalition for Advancing Research Assessment (CoARA) and the Declaration on Research Assessment (DORA) emphasize responsible use of quantitative metrics and avoidance of journal-based impact metrics for research assessment. Further, our previous qualitative research suggested that assessing credibility, or trustworthiness, of research is important to researchers not only when they seek to inform their own research but also in the context of research assessment committees.

To confirm the findings from our previous interviews in quantitative terms, we surveyed 485 biology researchers who have served on committees for grant review or hiring and promotion decisions, to understand how they assess the credibility of research outputs in these contexts. We found that concepts like credibility, trustworthiness, quality, and impact lack consistent definitions and interpretations by researchers, a pattern already observed in our interviews.

We also found that assessing credibility is very important to most (81%) researchers serving on these committees, but fewer than half of respondents are satisfied with their ability to assess it. A substantial proportion of respondents (57%) report using journal reputation and the JIF to assess credibility, proxies that research assessment reformers consider inappropriate for this purpose because they do not rely on intrinsic characteristics of the research.

This gap between the importance of an assessment and satisfaction with the ability to conduct it was reflected in multiple aspects of credibility we tested, and it was greatest for researchers seeking to assess the integrity of research (such as identifying signs of fabrication, falsification, or plagiarism) and the suitability and completeness of research methods. Non-traditional research outputs associated with Open Science practices, such as the sharing of research data, code, protocols, and preprints, are particularly hard for researchers to assess, despite the potential of Open Science practices to signal trustworthiness.

Our results suggest opportunities to develop better guidance and better signals to support the evaluation of research credibility and trustworthiness – and ultimately support research assessment reform, away from the use of inappropriate proxies for impact and towards assessing the intrinsic characteristics and values researchers see as important.

DOI : https://doi.org/10.31222/osf.io/ht836

Reporting of interventional clinical trial results in a French academic center: a survey of completed studies

Authors : Anne Sophie Alix Doucet, Constant Vinatier, Loïc Fin, Hervé Léna, Hélène Rangé, Clara Locher, Florian Naudet

Background: The dissemination of clinical trial results is an important scientific and ethical endeavour. This survey of completed interventional studies in a French academic center describes their reporting status.

Methods: We explored all interventional studies sponsored by Rennes University Hospital identified on the French Open Science Monitor, which tracks trials registered on EUCTR or ClinicalTrials.gov and provides an automatic assessment of results reporting. For each study, we ascertained the actual reporting of results using systematic searches of the hospital’s internal database and bibliographic databases (Google Scholar, PubMed), and by contacting all principal investigators (PIs). We describe several features (including total budget and number of trial participants) of the studies that did not report any results.

Results: The French Open Science Monitor identified 93 interventional studies, among which 10 (11%) reported results. In contrast, our survey identified 36 studies (39%) reporting primary analysis results and an additional 18 (19%) reporting results only for secondary analyses (without results for their primary analysis). The overall budget for studies that did not report any results was estimated at €5,051,253, for a total of 6,735 trial participants. The most frequent reasons for the absence of results reported by PIs were lack of time (18, 42%) and logistical difficulties, such as a delay in obtaining results or another blocking factor (12, 28%). An association was found between non-publication and negative results (adjusted odds ratio = 4.70, 95% confidence interval [1.67; 14.11]).

Conclusions: Even allowing for the fact that automatic searches underestimate the number of studies with published results, the level of reporting was disappointingly low. This amounts to a waste of trial participants’ involvement and of money. Corrective actions are needed.

URL : Reporting of interventional clinical trial results in a French academic center: a survey of completed studies

DOI : https://doi.org/10.21203/rs.3.rs-3782467/v1