Over-promotion and caution in abstracts of preprints during the COVID-19 crisis

Authors : Frederique Bordignon, Liana Ermakova, Marianne Noel

The abstract is known to be a promotional genre in which researchers tend to exaggerate the benefits of their research and use promotional discourse to catch the reader’s attention. The COVID-19 pandemic has prompted intensive research and has changed traditional publishing through the massive adoption of preprints by researchers.

Our aim is to investigate whether the crisis and the ensuing scientific and economic competition have changed the lexical content of abstracts. We propose a comparative study of abstracts associated with preprints issued in response to the pandemic relative to abstracts produced during the closest pre-pandemic period.

We show that, alongside an increase (on average and in percentage) of positive words (especially ‘effective’) and a slight decrease of negative words, there is a strong increase in hedge words (the most frequent of which are the modal verbs ‘can’ and ‘may’).

Hedge words counterbalance the excessive use of positive words and thus invite readers, who probably extend beyond the ‘usual’ audience, to treat the reported results with caution.

The abstracts of preprints urgently produced in response to the COVID-19 crisis stand between uncertainty and over-promotion, illustrating the balance that authors have to achieve between promoting their results and appealing for caution.

DOI : https://doi.org/10.1002/leap.1411

Do researchers know what the h-index is? And how do they estimate its importance?

Authors : Pantea Kamrani, Isabelle Dorsch, Wolfgang G. Stock

The h-index is a widely used researcher-level scientometric indicator based on a simple combination of publication and citation counts. In this article, we pursue two goals: collecting empirical data on researchers’ personal estimations of the importance of the h-index, both for themselves and for their academic disciplines, and on researchers’ concrete knowledge of the h-index and the way it is calculated.

We worked with an online survey (including a knowledge test on the calculation of the h-index), which was completed by 1,081 German university professors. We distinguished between the results for all participants and, additionally, the results by gender, generation, and field of knowledge.

We found a clear binary division between the academic knowledge fields: For the sciences and medicine the h-index is important for the researchers themselves and for their disciplines, while for the humanities and social sciences, economics, and law the h-index is considerably less important.

Two fifths of the professors do not know the details of the h-index, or wrongly believe they know what it is and failed our test. Researchers’ knowledge of the h-index is much weaker in the humanities and the social sciences.

As the h-index is important for many researchers, and as not all researchers are very knowledgeable about this author-specific indicator, it seems necessary to improve researchers’ scholarly metrics literacy.
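For readers unfamiliar with the indicator tested in this survey, the calculation can be sketched in a few lines (the function name and the example citation counts are illustrative, not taken from the study):

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    at least h papers with at least h citations each."""
    # Sort citation counts in descending order, then find the last
    # rank at which the citation count still meets or exceeds the rank.
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# A researcher with papers cited 10, 8, 5, 4 and 3 times has h = 4:
# four papers each have at least 4 citations, but not five with >= 5.
print(h_index([10, 8, 5, 4, 3]))  # 4
```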


DOI : https://doi.org/10.1007/s11192-021-03968-1

TeamTree analysis: A new approach to evaluate scientific production

Author : Frank W. Pfrieger

Advances in science and technology depend on the work of research teams and on the publication of results through peer-reviewed articles, which represent a growing socio-economic resource. Current methods to mine the scientific literature regarding a field of interest focus on content, but the workforce credited by authorship remains largely unexplored.

Notably, appropriate measures of scientific production are debated. Here, a new bibliometric approach named TeamTree analysis is introduced that visualizes the development and composition of the workforce driving a field.

A new citation-independent measure, which scales with the h-index, estimates impact based on publication record, genealogical ties, and collaborative connections.

This author-centered approach complements existing tools to mine the scientific literature and to evaluate research across disciplines.


DOI : https://doi.org/10.1371/journal.pone.0253847

Article Processing Charges based publications: to which extent the price explains scientific impact?

Authors : Abdelghani Maddi, David Sapinho

The present study analyses the relationship between the normalized citation score (NCS) of scientific publications and the article processing charge (APC) amounts of Gold Open Access publications.

To do so, we use APC information provided by the OpenAPC database and the citation scores of publications in the Web of Science (WoS) database. The dataset covers the period from 2006 to 2019, with 83,752 articles published in 4,751 journals belonging to 267 distinct publishers.

Results show that, contrary to the belief that a higher price guarantees greater visibility, paying dearly does not necessarily increase the impact of publications. First, large publishers with high impact are not the most expensive.

Second, the publishers with the highest APCs are not necessarily the best in terms of impact; the correlation between APCs and impact is moderate. Furthermore, the econometric analysis shows that a publication’s impact is strongly determined by the quality of the journal in which it is published. International collaboration also plays an important role in citation scores.
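As a rough illustration of the normalization behind an NCS-style indicator: each paper’s citation count is divided by the mean citation count of its reference group, typically papers from the same field and year. This is a generic mean-normalization sketch, not the authors’ exact method; the field names and citation counts below are invented:

```python
from collections import defaultdict

def normalized_citation_scores(papers):
    """Compute a simple field- and year-normalized citation score:
    each paper's citations divided by the mean citations of all
    papers sharing its (field, year) reference group.
    `papers` is a list of dicts with 'field', 'year', 'citations'."""
    # Group citation counts by (field, year) reference set.
    groups = defaultdict(list)
    for p in papers:
        groups[(p["field"], p["year"])].append(p["citations"])
    # Mean citations per reference set.
    means = {key: sum(counts) / len(counts) for key, counts in groups.items()}
    # Score > 1 means the paper is cited above its reference-set average.
    return [
        p["citations"] / means[(p["field"], p["year"])]
        if means[(p["field"], p["year"])] else 0.0
        for p in papers
    ]
```

For example, two biology papers from 2018 cited 10 and 30 times (group mean 20) receive scores of 0.5 and 1.5 respectively.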

URL : https://arxiv.org/abs/2107.07348

From indexation policies through citation networks to normalized citation impacts: Web of Science, Scopus, and Dimensions as varying resonance chambers

Authors : Stephan Stahlschmidt, Dimity Stephen

Dimensions was introduced as an alternative bibliometric database to the well-established Web of Science (WoS) and Scopus. However, all three databases have fundamental differences in coverage and content, resulting from their owners’ indexation philosophies.

In light of these differences, we explore here, using a citation network analysis and assessment of normalised citation impact of “duplicate” publications, whether the three databases offer structurally different perspectives of the bibliometric landscape or if they are essentially homogenous substitutes.

Our citation network analysis of core and exclusive 2016–2018 publications revealed a large set of core publications indexed in all three databases that are highly self-referential. In comparison, each database selected a set of exclusive publications that appeared to hold similarly low levels of relevance to the core set and to one another, with slightly more internal communication between exclusive publications in Scopus and Dimensions than in WoS.

Our comparison of normalised citations for 41,848 publications indexed in all three databases found that German sectors were evaluated as more impactful in Scopus and Dimensions compared to WoS, particularly sectors with an applied research focus.

We conclude that the databases do present structurally different perspectives, although Scopus and Dimensions, with their additional circle of applied research, diverge more from the basic research-focused WoS than they do from one another.

URL : https://arxiv.org/abs/2106.01695

Researchers’ attitudes towards the h-index on Twitter 2007–2020: criticism and acceptance

Authors : Mike Thelwall, Kayvan Kousha

The h-index is an indicator of the scientific impact of an academic publishing career. Its hybrid publishing/citation nature and its inherent bias against younger researchers, women, people in low-resourced countries, and those not prioritizing publishing arguably give it little value for most formal and informal research evaluations.

Nevertheless, it is well-known by academics, used in some promotion decisions, and is prominent in bibliometric databases, such as Google Scholar. In the context of this apparent conflict, it is important to understand researchers’ attitudes towards the h-index.

This article used public tweets in English to analyse how scholars discuss the h-index in public: is it mentioned, are tweets about it positive or negative, and has interest decreased since its shortcomings were exposed?

The January 2021 Twitter Academic Research initiative was harnessed to download all English tweets mentioning the h-index from the start of Twitter in 2006 until the end of 2020. The results showed a constantly increasing number of tweets.

Whilst the most popular tweets unapologetically used the h-index as an indicator of research performance, 28.5% of tweets were critical of its simplistic nature and others (8%) joked about it. The results suggest that interest in the h-index is still increasing online, even though the scientists willing to evaluate it in public tend to be critical.

Nevertheless, in limited situations it may be effective at succinctly conveying the message that a researcher has had a successful publishing career.

DOI : https://doi.org/10.1007/s11192-021-03961-8

The rise of multiple institutional affiliations in academia

Authors : Hanna Hottenrott, Michael E. Rose, Cornelia Lawson

This study provides the first systematic, international, large-scale evidence on the extent and nature of multiple institutional affiliations on journal publications. Studying more than 15 million authors and 22 million articles from 40 countries, we document that in 2019 almost one in three articles was (co-)authored by researchers with multiple affiliations, and that the share of authors with multiple affiliations has increased from around 10% to 16% since 1996.

The growth of multiple affiliations is prevalent in all fields and is stronger in high-impact journals. About 60% of multiple affiliations are between institutions within the academic sector.

International co-affiliations, which account for about a quarter of multiple affiliations, most often involve institutions from the United States, China, Germany, and the United Kingdom, suggesting a core–periphery network. Network analysis also reveals a number of communities of countries that are more likely to share affiliations.

We discuss potential causes and show that the timing of the rise in multiple affiliations can be linked to the introduction of more competitive funding structures such as “excellence initiatives” in a number of countries. We discuss implications for science and science policy.


DOI : https://doi.org/10.1002/asi.24472