Do researchers know what the h-index is? And how do they estimate its importance?

Authors : Pantea Kamrani, Isabelle Dorsch, Wolfgang G. Stock

The h-index is a widely used scientometric indicator at the researcher level, based on a simple combination of publication and citation counts. In this article, we pursue two goals: collecting empirical data on researchers’ personal estimations of the importance of the h-index, both for themselves and for their academic disciplines, and on researchers’ concrete knowledge of the h-index and the way it is calculated.
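
For readers who want to check themselves against the survey’s knowledge test, here is a minimal sketch of the standard calculation (the exact test items are not reproduced here): a researcher has index h if h of their publications have each been cited at least h times.

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Five papers cited [10, 8, 3, 1, 0] times yield h = 3: three papers
# have at least 3 citations each, but there are no four with at least 4.
print(h_index([10, 8, 3, 1, 0]))  # -> 3
```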

We worked with an online survey (including a knowledge test on the calculation of the h-index), which was completed by 1,081 German university professors. We report the results for all participants and, additionally, break them down by gender, generation, and field of knowledge.

We found a clear binary division between the academic knowledge fields: in the sciences and medicine, the h-index is important to researchers themselves and to their disciplines, while in the humanities, the social sciences, economics, and law it is considerably less important.

Two fifths of the professors either do not know the details of the h-index or wrongly believe they know what it is and failed our test. Researchers’ knowledge of the h-index is much lower in the humanities and the social sciences.

As the h-index is important for many researchers, and as not all researchers are knowledgeable about this author-specific indicator, it seems necessary to strengthen researchers’ scholarly metrics literacy.


DOI : https://doi.org/10.1007/s11192-021-03968-1

Web analytics for open access academic journals: justification, planning and implementation

Authors : Alex Vitela Caraveo, Cristóbal Urbano

An overview is presented of resources and web analytics strategies useful for building solutions that capture usage statistics and assess audiences for open access academic journals.

A set of metrics complementary to citations is proposed to help journal editors and managers provide evidence of the performance of the journal as a whole, and of each article in particular, in the web environment.

The selected measurements and indicators seek to generate added value for editorial management and ensure the journal’s sustainability. The proposal covers three areas: counts of visits and downloads, optimization of the website alongside campaigns to attract visitors, and preparation of a dashboard for strategic evaluation.
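
As a rough illustration of the first area, the kind of per-article aggregation such a dashboard could be built on might look like the following sketch; the event records and field names are hypothetical, not taken from the article.

```python
from collections import defaultdict

# Hypothetical raw usage events (article_id, event_type), as a journal
# platform or analytics tool might log them; not data from the article.
events = [
    ("art-01", "visit"), ("art-01", "visit"), ("art-01", "download"),
    ("art-02", "visit"), ("art-02", "download"), ("art-02", "download"),
]

# Aggregate into the per-article indicators a strategic dashboard could show.
counts = defaultdict(lambda: {"visit": 0, "download": 0})
for article_id, event_type in events:
    counts[article_id][event_type] += 1

for article_id, c in sorted(counts.items()):
    ratio = c["download"] / c["visit"] if c["visit"] else 0.0
    print(f"{article_id}: {c['visit']} visits, {c['download']} downloads, "
          f"{ratio:.2f} downloads per visit")
```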

It is concluded that, by creating web performance measurement plans based on the resources and proposals analysed, journals may be in a better position to plan data-driven web optimization, attract authors and readers, and offer the accountability that the actors involved in the editorial process need in order to assess their open access business model.

DOI : https://doi.org/10.1344/BiD2020.45.20

Requiem for impact factors and high publication charges

Authors : Chris R Triggle, Ross MacDonald, David J. Triggle, Donald Grierson

Journal impact factors, publication charges, and the assessment of the quality and accuracy of scientific research are critical concerns for researchers, managers, funders, policy makers, and society. Editors and publishers compete for impact factor rankings to demonstrate how important their journals are, and researchers strive to publish in perceived top journals despite high publication and access charges.

This raises questions of how top journals are identified, whether assessments of impact are accurate, and whether the high publication charges borne by the research community are justified, bearing in mind that researchers also collectively provide free peer review to the publishers.

Although traditional journals accelerated peer review and publication during the COVID-19 pandemic, preprint servers made a greater impact, with over 30,000 open access articles becoming available, accelerating a trend already seen in other fields of research.

We review and comment on the advantages and disadvantages of a range of assessment methods and the way in which they are used by researchers, managers, employers and publishers.

We argue that new approaches to assessment are required to provide a realistic and comprehensive measure of the value of research and journals, and we support open access publishing at a modest, affordable price that benefits research producers and consumers.


DOI : https://doi.org/10.1080/08989621.2021.1909481

How is science clicked on Twitter? Click metrics for Bitly short links to scientific publications

Authors : Zhichao Fang, Rodrigo Costas, Wencan Tian, Xianwen Wang, Paul Wouters

To provide some context for the potential engagement behavior of Twitter users around science, this article investigates how Bitly short links to scientific publications embedded in scholarly Twitter mentions are clicked on Twitter.

Based on the click metrics of over 1.1 million Bitly short links referring to Web of Science (WoS) publications, our results show that around 49.5% of them were never clicked by Twitter users. For the Bitly short links that did receive clicks from Twitter, the majority of those clicks accumulated within a short period after the links were first tweeted.

Bitly short links to publications in the Social Sciences and Humanities tend to attract more clicks from Twitter than links to publications in other subject fields. This article also assesses the extent to which Twitter clicks correlate with other impact indicators.

Twitter clicks are weakly correlated with scholarly impact indicators (WoS citations and Mendeley readers), but moderately correlated with other Twitter engagement indicators (total retweets and total likes).
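
As a rough illustration of what “weak” versus “moderate” correlation means here, the following sketch computes rank correlations on invented link-level counts; Spearman correlation is assumed for illustration, since the article’s exact procedure is not reproduced here.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)

# Invented link-level counts for illustration; not data from the study.
clicks = rng.poisson(5, size=500)
citations = rng.poisson(3, size=500) + (clicks * 0.1).astype(int)  # weak link
retweets = rng.poisson(2, size=500) + (clicks * 0.5).astype(int)   # stronger link

rho_cit, _ = spearmanr(clicks, citations)
rho_rt, _ = spearmanr(clicks, retweets)
print(f"clicks vs. citations: rho = {rho_cit:.2f}")  # weak correlation
print(f"clicks vs. retweets:  rho = {rho_rt:.2f}")   # moderate correlation
```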

In light of these results, we highlight the importance of paying more attention to the click metrics of URLs in scholarly Twitter mentions, to improve our understanding of how science information is disseminated and received on Twitter.


DOI : https://doi.org/10.1002/asi.24458

Conjoint analysis of researchers’ hidden preferences for bibliometrics, altmetrics, and usage metrics

Authors : Steffen Lemke, Athanasios Mazarakis, Isabella Peters

The number of scholarly articles published annually is growing steadily, as is the number of indicators through which the impact of publications is measured. Little is known about how the increasing variety of available metrics affects researchers’ processes of selecting literature to read.

We conducted ranking experiments embedded in an online survey with 247 participating researchers, most of them from the social sciences. Participants completed a series of tasks in which they were asked to rank fictitious publications by their expected relevance, based on the publications’ scores on six prototypical metrics.

By applying logistic regression, cluster analysis, and manual coding of survey answers, we obtained detailed data on how prominent metrics for research impact influence our participants’ decisions about which scientific articles to read.
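
The following sketch illustrates the general idea of such a regression on invented choice data, with scikit-learn standing in for whatever software the authors actually used: each row pairs a fictitious publication’s metric scores with whether a participant selected it to read.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Invented data: scores of fictitious publications on six prototypical
# metrics, and whether a participant chose the publication to read.
metrics = ["citations", "jif", "mendeley", "tweets", "downloads", "views"]
X = rng.poisson(10, size=(300, len(metrics))).astype(float)
logits = 0.4 * X[:, 0] + 0.2 * X[:, 1] - 6.0  # choices driven by the first two
y = (rng.random(300) < 1 / (1 + np.exp(-logits))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
for name, coef in zip(metrics, model.coef_[0]):
    print(f"{name:>9}: {coef:+.2f}")  # larger coefficient = stronger influence
```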

Survey answers revealed a combination of qualitative and quantitative characteristics that researchers consult when selecting literature, while the regression analysis showed that, among quantitative metrics, citation counts tend to carry the greatest weight, followed by Journal Impact Factors.

Our results suggest that many researchers hold a comparatively favorable view of bibliometrics, while skepticism toward altmetrics is widespread.

The findings underline the importance of equipping researchers with solid knowledge about specific metrics’ limitations, as these metrics seem to play significant roles in researchers’ everyday relevance assessments.


DOI : https://doi.org/10.1002/asi.24445

Preprints as accelerator of scholarly communication: An empirical analysis in Mathematics

Authors : Zhiqi Wang, Yue Chen, Wolfgang Glänzel

In this study we analyse the key driving factors of preprints in enhancing scholarly communication. To this end we use four groups of metrics: one reflecting scholarly communication and based on bibliometric indicators (Web of Science and Scopus citations), and three reflecting usage (usage counts in Web of Science), capture (Mendeley readers), and social media attention (tweets).

With these we measure two effects associated with preprint publishing: publication delay and impact. We define and use several indicators to assess the impact of journal articles that have previous preprint versions in arXiv. In particular, the indicators measure several time spans characterizing the publishing process of arXiv preprints and the reviewing process of the journal versions, as well as the ageing patterns of citations to preprints.
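
One of these time spans, the delay between a preprint’s first arXiv submission and the publication of the journal version, reduces to simple date arithmetic; the dates below are invented for illustration.

```python
from datetime import date

# Hypothetical record linking an arXiv preprint to its journal version.
preprint_submitted = date(2019, 3, 12)  # first arXiv version
journal_published = date(2020, 1, 30)   # journal publication date

delay = (journal_published - preprint_submitted).days
print(f"Publication delay: {delay} days")  # -> 324 days
```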

In addition, we compare the observed patterns between preprints and non-OA articles without any previous preprint versions in arXiv. We observe that the “early-view” and “open-access” effects of preprints contribute to a measurable citation and readership advantage for preprints.

Articles with preprint versions are more likely to be mentioned on social media and show a shorter Altmetric attention delay. Usage and capture counts correlate only moderately with citations, yet still more strongly than tweets do. The different slopes of the regression lines between the indicators reflect the different orders of magnitude of usage, capture, and citation data.

URL : https://arxiv.org/abs/2011.11940

Our Study is Published, But the Journey is Not Finished!

Authors : Olivier Pourret, Katsuhiko Suzuki, Yoshio Takahashi

Each June, we receive e-mails from publishers welcoming the evolution of their journals’ journal impact factor (JIF). The JIF is a controversial metric (Callaway 2016), and it is worth asking, “What’s behind it?”

In this age of “publish or perish” (Harzing 2007), we take much time and effort to write our papers and get them published. But how much time and effort do we put into finding readers or ensuring that we are reaching the right audience? Are metrics, such as the JIF, good guides for how well we are doing at reaching our target audience?

DOI : https://doi.org/10.2138/gselements.16.4.229