Authors : Yifang Ma, Brian Uzzi
Scientific prizes are among the greatest recognitions a scientist can receive from their peers, and arguably shape the direction of a field by conferring credibility on persons, ideas, and disciplines, providing financial rewards, and promoting rituals that reinforce scientific communities.
The proliferation of prizes and links among prizes suggest that the prize network embodies information about scientists and ideas poised to grow in acclaim. Using comprehensive new data on prizes and prizewinners worldwide and across disciplines, we examine the growth dynamics and interlocking relationships found in the worldwide scientific prize network.
We focus on understanding how the knowledge linkages among prizes and scientists’ propensities for prizewinning are related to knowledge pathways across disciplines and stratification within disciplines.
We find several key links between prizes and scientific advances.
First, despite a proliferation of diverse prizes over time and across the globe, prizes have become increasingly concentrated within a relatively small group of scientific elites, and ties within this elite are increasingly clustered, suggesting that a relatively constrained number of ideas and scholars lead science.
Second, we find that certain prizes are strongly interlocked within and between disciplines by scientists who win multiple prizes, revealing the key pathways by which knowledge systematically gains credit and spreads through the network.
Third, we find that genealogical and co-authorship networks strongly predict who wins one or more prizes and explain the high level of interconnection among acclaimed scientists and their path-breaking ideas.
URL : https://arxiv.org/abs/1808.09412
Authors : Vincent Larivière, Cassidy R. Sugimoto
For several years, the scientific community as a whole has been calling for reliable indicators to measure the impact of research. The unprecedented fervour around measuring research influence, combined with new modes of knowledge dissemination in the digital age, has revolutionized the field of scientometrics.
This is a discipline that encompasses all the ways we collect scholarly documents and quantitatively analyse their production and their uses, from citations to tweets. The data and indicators thus gathered are used to understand science, stimulate research, and allocate resources.
Curiously, no book exists that explains the historical foundations, concepts, and sources of scientometrics, offers an informed critique of them, or makes recommendations for their optimal use. Hence the importance of this one.
In its own way, everyone is an actor in the knowledge society and should care about the tools that help guide its evolution: that is why this book is addressed to everyone, scholars and laypeople alike.
URL : https://pum.umontreal.ca/catalogue/mesurer-la-science/
Authors : Saskia Woutersen-Windhouwer, Jaroen Kuijper
In Amsterdam, the libraries of the University of Amsterdam (UvA) and the Amsterdam University of Applied Sciences (AUAS) cooperate closely. In this cooperation, the differences between a research university (in this case, the UvA) and a university of applied sciences (in this case, the AUAS) become particularly clear when we look at the aim and implementation of open access policies.
The open access plan of the AUAS removes not only financial and legal barriers, but also language barriers.
This makes the research output FAIR (findable, accessible, interoperable and reusable) to the primary target group of the product, and more importantly, it enables interaction between the AUAS and a wide audience, consisting of researchers from other disciplines, and a wide range of professionals, enterprises, civil servants, schools and citizens.
In the search for co-financing by enterprises and other stakeholders, and to fulfil their valorisation requirements, these target groups are currently becoming more important for research universities as well. Here, we show what research universities can learn from the open access policy of the AUAS.
URL : How to reach a wider audience with open access publishing: what research universities can learn from universities of applied sciences
DOI : http://doi.org/10.18352/lq.10237
Authors : Ari Melo Mariano, Maíra Rocha Santos
Measurement is a complicated but very necessary task. Many indices have been created in an effort to define the quality of knowledge produced, but they have attracted strong criticism, having become synonymous with individualism, competition and mere productivity; furthermore, they fail to steer science towards addressing local demands or towards producing international knowledge through collaboration.
Institutions, countries, publishers, governments and authors have a latent need to create quality and productivity indices because they can serve as filters that influence far-reaching decision making and even decisions on the professional promotion of university teachers.
Even so, in the present-day context, the very creators of those indices admit that they were not designed for that purpose, given that different research areas, the age of the researcher, the country and the language spoken all have an influence on the index calculations.
Accordingly, this research sets out three indices designed to steer science towards its universal objective by valuing collaboration and the dissemination of knowledge.
It is hoped that the proposed indices may provoke new discussions and the proposal of new, more assertive indicators for the analysis of scientific research quality.
URL : https://arxiv.org/abs/1807.07595
Authors: Rachel Ann Miles, Stacy Konkiel, Sarah Sutton
Academic librarians, especially in the field of scholarly communication, are often expected to understand and engage with research impact indicators. However, much of the current literature speculates about how academic librarians are using and implementing research impact indicators in their practice.
This study analyzed the results from a 2015 survey administered to over 13,000 academic librarians at Carnegie-classified R1 institutions in the United States. The survey concentrated on academic librarians’ familiarity with and usage of research impact indicators.
This study uncovered findings related to academic librarians’ various levels of familiarity with research impact indicators and how they implement and use research impact indicators in their professional development and in their library job duties.
In general, academic librarians with regular scholarly communication support duties tend to have higher levels of familiarity with research impact indicators. Academic librarians are most familiar with citation counts and usage statistics, and least familiar with altmetrics.
During consultations with faculty, the Journal Impact Factor (JIF) and citation counts are more likely to be addressed than the author h-index, altmetrics, qualitative measures, and expert peer reviews.
The survey results also hint at a growing interest in altmetrics among academic librarians for their professional advancement.
Academic librarians are continually challenged to keep pace with the changing landscape of research impact metrics and research assessment models. By keeping pace and implementing research impact indicators in their own practices, academic librarians can provide a crucial service to the wider academic community.
URL : Scholarly Communication Librarians’ Relationship with Research Impact Indicators: An Analysis of a National Survey of Academic Librarians in the United States
DOI : http://doi.org/10.7710/2162-3309.2212
Authors : Lutz Bornmann, Robin Haunschild, Jonathan Adams
Altmetrics have been proposed as a way to assess the societal impact of research. Although altmetrics are already in use as impact or attention metrics in different contexts, it is still not clear whether they really capture or reflect societal impact.
This study is based on altmetrics, citation counts, research output and case study data from the UK Research Excellence Framework (REF), and peers’ REF assessments of research output and societal impact. We investigated the convergent validity of altmetrics by using two REF datasets: publications submitted as research output (PRO) to the REF and publications referenced in case studies (PCS).
Case studies, which are intended to demonstrate societal impact, should cite the most relevant research papers. We used the MHq' indicator for assessing impact, an indicator introduced for count data with many zeros.
The results of the first part of the analysis show that news media as well as mentions on Facebook, in blogs, in Wikipedia, and in policy-related documents have higher MHq’ values for PCS than for PRO.
Thus, the altmetric indicators seem to have convergent validity for these data. In the second part of the analysis, altmetrics were correlated with REF reviewers' average scores on PCS. The negative or close-to-zero correlations call into question the convergent validity of altmetrics in that context.
We suggest that they may capture a different aspect of societal impact (which can be called unknown attention) from that seen by reviewers (who are interested in the causal link between research and action in society).
URL : https://arxiv.org/abs/1807.03977
Authors : Megan C Evans, Christopher Cvitanovic
Scientists are increasingly required to demonstrate the real-world, tangible impacts arising from their research. Despite significant advances in scholarship dedicated to understanding and improving the relationships between science, policy and practice, much of the existing literature remains high-level, theoretical, and not immediately accessible to early career researchers (ECRs) who work outside of the policy sciences.
In this paper, we draw on the literature and our own experiences working in the environmental sciences to provide an accessible resource for ECRs seeking to achieve policy impact in their chosen field. First, we describe key concepts in public policy to provide sufficient background for the non-expert.
Next, we articulate a number of practical steps and tools that can help ECRs to identify and enhance the policy relevance of their research, better understand the policy world in practice and identify a range of pathways to achieving impact.
Finally, we draw on our personal experiences to highlight some of the key individual characteristics and values that are needed to operate more effectively at the interface of science, policy and practice.
Our hope is that the information and tools provided here can help to empower ECRs to create their own pathways to impact that best suit their individual goals, circumstances, interests and strengths.
URL : An introduction to achieving policy impact for early career researchers
Alternative location : https://www.nature.com/articles/s41599-018-0144-2