How can revivals of scientific publications be explained using bibliometric methods? A case study discovering booster papers for the 1985 Physics Nobel Prize paper

Authors : Robin Haunschild, Werner Marx, Jürgen Weis

The unusual citation profile of the 1985 Physics Nobel Prize paper has been analyzed. The number of citing papers per year reaches a maximum of 123 citing papers in the mid-1980s and then rises again to more than 200 citing papers per year about two decades later.
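
Such a citing-papers-per-year profile can be rebuilt from any citation-index export that lists the citing papers with their publication years. A minimal sketch, assuming a CSV export with a "PY" (publication year) column; the file name and column label are placeholders, not details from the paper:

```python
from collections import Counter
import csv

# Tally citing papers per publication year from an assumed citation-index export.
# "citing_papers.csv" and its "PY" column are placeholders for the actual export format.
with open("citing_papers.csv", newline="", encoding="utf-8") as f:
    years = [int(row["PY"]) for row in csv.DictReader(f) if row["PY"].isdigit()]

profile = Counter(years)
for year in sorted(profile):
    print(year, profile[year])
```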

The publication set of the citing papers was analyzed in terms of co-authorships and research topics. The USA and (more recently) the People’s Republic of China appear prominently among the countries of the citing authors. A keyword analysis of the citing papers revealed research dealing with topological insulators as one of the major newly evolving research topics. An analysis of the co-cited papers has been performed via reference publication year spectroscopy (RPYS).

The most frequently co-cited papers (the peak papers of the RPYS spectrogram) were identified and discussed. As a result, we found two primary and three secondary booster papers that renewed interest in the 1985 Physics Nobel Prize paper.
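
The core of RPYS is easy to sketch: the cited references of all citing papers are pooled, counted per reference publication year, and each annual count is compared with the median of the counts in a five-year window around that year, so that peak years (and thus candidate booster or origin papers) stand out. A minimal sketch, assuming the cited-reference years have already been extracted into a list:

```python
from collections import Counter
from statistics import median

def rpys(reference_years, window=2):
    """Return (year, count, deviation-from-5-year-median) triples.

    reference_years: publication years of all cited references pooled
    across the citing papers (an assumed, pre-extracted input).
    """
    counts = Counter(reference_years)
    spectrum = []
    for y in range(min(counts), max(counts) + 1):
        # Median of the counts in the window y-2 .. y+2 (the usual RPYS smoothing).
        local = [counts.get(y + d, 0) for d in range(-window, window + 1)]
        deviation = counts.get(y, 0) - median(local)
        spectrum.append((y, counts.get(y, 0), deviation))
    return spectrum

# Years with clearly positive deviations point to frequently co-cited "peak papers".
for year, n, dev in rpys([1980, 1980, 1982, 1985, 1985, 1985, 2005, 2005]):
    if dev > 0:
        print(year, n, dev)
```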

DOI : https://doi.org/10.1007/s11192-023-04906-z

Growth rates of modern science: a latent piecewise growth curve approach to model publication numbers from established and new literature databases

Authors : Lutz Bornmann, Robin Haunschild, Rüdiger Mutz

The growth of science is a prominent topic in science-of-science studies. In recent years, two new bibliographic databases have been introduced that can be used to study growth processes in science reaching back centuries: Dimensions from Digital Science and Microsoft Academic.

In this study, we used publication data from these new databases and added publication data from two established databases (Web of Science from Clarivate Analytics and Scopus from Elsevier) to investigate scientific growth processes from the beginning of the modern science system until today.

We estimated regression models that simultaneously include the publication counts from the four databases. The results of the unrestricted growth calculations show that the overall growth rate amounts to 4.10%, corresponding to a doubling time of 17.3 years. The comparison of various segmented regression models in the current study revealed that models with four or five segments fit the publication data best.
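
The reported doubling time follows directly from the annual growth rate: for exponential growth at rate g, the doubling time is ln(2)/ln(1+g), which for g = 4.10% gives roughly 17.3 years. A quick check (plain arithmetic, no data from the study):

```python
import math

g = 0.0410  # overall annual growth rate reported in the study
doubling_time = math.log(2) / math.log(1 + g)
print(round(doubling_time, 1))  # -> 17.3 years
```

The segmented (piecewise) models mentioned above simply allow this growth rate to differ between historical segments, with the breakpoints estimated from the data.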

We demonstrated that these segments with different growth rates can be interpreted very well, since they are related to phases of economic (e.g., industrialization) or political (e.g., the Second World War) development.

In this study, we additionally analyzed scientific growth in two broad fields (Physical and Technical Sciences as well as Life Sciences) and the relationship between scientific and economic growth in the UK.

The comparison between the two fields revealed only slight differences. The comparison of the British economic and scientific growth rates showed that the economic growth rate is slightly lower than the scientific growth rate.

DOI : https://doi.org/10.1057/s41599-021-00903-w

Do altmetrics assess societal impact in the same way as case studies? An empirical analysis testing the convergent validity of altmetrics based on data from the UK Research Excellence Framework (REF)

Authors : Lutz Bornmann, Robin Haunschild, Jonathan Adams

Altmetrics have been proposed as a way to assess the societal impact of research. Although altmetrics are already in use as impact or attention metrics in different contexts, it is still not clear whether they really capture or reflect societal impact.

This study is based on altmetrics, citation counts, research output and case study data from the UK Research Excellence Framework (REF), and peers’ REF assessments of research output and societal impact. We investigated the convergent validity of altmetrics by using two REF datasets: publications submitted as research output (PRO) to the REF and publications referenced in case studies (PCS).

Case studies, which are intended to demonstrate societal impact, should cite the most relevant research papers. We used the MHq’ indicator for assessing impact – an indicator which has been introduced for count data with many zeros.

The results of the first part of the analysis show that news media as well as mentions on Facebook, in blogs, in Wikipedia, and in policy-related documents have higher MHq’ values for PCS than for PRO.

Thus, the altmetric indicators seem to have convergent validity for these data. In the second part of the analysis, altmetrics were correlated with REF reviewers' average scores on PCS. The negative or near-zero correlations call the convergent validity of altmetrics in that context into question.
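
The second part of the analysis amounts to rank-correlating altmetric counts with the reviewers' average scores for the case-study publications. A minimal sketch of such a correlation, assuming the two series have already been assembled for the same publications (the numbers below are purely illustrative, not REF data):

```python
from scipy.stats import spearmanr

# Illustrative placeholders only: altmetric counts (many zeros) and
# reviewers' average scores for the same set of publications.
altmetric_counts = [0, 0, 3, 1, 0, 12, 0, 5]
reviewer_scores = [2.0, 3.5, 2.5, 3.0, 4.0, 2.0, 3.5, 3.0]

rho, p_value = spearmanr(altmetric_counts, reviewer_scores)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.2f}")
```

A rank correlation is a natural choice here because altmetric counts are heavily skewed count data, which makes Pearson correlations unreliable.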

We suggest that they may capture a different aspect of societal impact (which can be called unknown attention) to that seen by reviewers (who are interested in the causal link between research and action in society).

URL : https://arxiv.org/abs/1807.03977

Allegation of scientific misconduct increases Twitter attention

Authors : Lutz Bornmann, Robin Haunschild

The web-based microblogging system Twitter is a very popular altmetrics source for measuring the broader impact of science. In this case study, we demonstrate how problematic the use of Twitter data for research evaluation can be, even when the aim is lowered from measuring impact to merely measuring attention.

We collected the Twitter data for the paper published by Yamamizu et al. (2017). An investigative committee found that the main figures in the paper were fraudulent.
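
Twitter attention for a single paper is usually retrieved via an aggregator rather than from Twitter directly. The sketch below queries Altmetric's public details endpoint for a DOI and prints any Twitter-related fields found in the response; the endpoint, the field names, and the placeholder DOI are all assumptions about that service, not details from the study:

```python
import requests

# Assumed public Altmetric details endpoint (subject to rate limits and change).
DOI = "10.1234/example.doi"  # placeholder DOI, not the paper studied
resp = requests.get(f"https://api.altmetric.com/v1/doi/{DOI}", timeout=10)

if resp.status_code == 200:
    data = resp.json()
    # Print whichever fields mention tweets/Twitter, whatever they happen to be called.
    for key, value in data.items():
        if "tweet" in key.lower() or "twitter" in key.lower():
            print(key, value)
else:
    print("No altmetric record found:", resp.status_code)
```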

URL : https://arxiv.org/abs/1802.00606

To what extent does the Leiden Manifesto also apply to altmetrics? A discussion of the manifesto against the background of research into altmetrics

Authors : Lutz Bornmann, Robin Haunschild

Purpose

Hicks, Wouters, Waltman, de Rijcke, and Rafols (2015) formulated the so-called Leiden Manifesto, in which they assembled ten principles for a meaningful evaluation of research on the basis of bibliometric data.

Approach

This work attempts to show the relevance of the Leiden Manifesto for altmetrics.

Results

As the discussion of the ten principles against the background of current knowledge of and research into altmetrics shows, the principles are also highly relevant for altmetrics and should be taken into account when altmetrics are applied.

Originality

Altmetrics are already frequently used in research evaluation. It is therefore important that users of altmetrics data are aware that the Leiden Manifesto is also relevant in this area.

URL : https://figshare.com/articles/To_what_extent_does_the_Leiden_Manifesto_also_apply_to_altmetrics_A_discussion_of_the_manifesto_against_the_background_of_research_into_altmetrics/1464981