How to reach a wider audience with open access publishing: what research universities can learn from universities of applied sciences

Authors : Saskia Woutersen-Windhouwer, Jaroen Kuijper

In Amsterdam, the libraries of the University of Amsterdam (UvA) and the Amsterdam University of Applied Sciences (AUAS) cooperate closely. In this cooperation, the differences between a research university (in this case the UvA) and a university of applied sciences (in this case the AUAS) become particularly clear when we look at the aim and implementation of open access policies.

The open access plan of the AUAS removes not only financial and legal barriers, but also language barriers.

This makes the research output FAIR (findable, accessible, interoperable and reusable) for the primary target group of the product and, more importantly, enables interaction between the AUAS and a wide audience: researchers from other disciplines and a broad range of professionals, enterprises, civil servants, schools and citizens.

In the search for co-financing by enterprises and other stakeholders, and in order to fulfil their valorisation requirements, these target groups are also becoming more important for research universities. Here, we show what research universities can learn from the open access policy of the AUAS.

URL : How to reach a wider audience with open access publishing: what research universities can learn from universities of applied sciences

DOI : http://doi.org/10.18352/lq.10237

Universalizing science: alternative indices to direct research

Authors : Ari Melo Mariano, Maíra Rocha Santos

Measurement is a complicated but very necessary task. Many indices have been created in an effort to define the quality of the knowledge produced, but they have attracted strong criticism: they have become synonymous with individualism, competition and mere productivity, and they fail to steer science towards addressing local demands or towards producing international knowledge through collaboration.

Institutions, countries, publishers, governments and authors have a latent need to create quality and productivity indices because they can serve as filters that influence far-reaching decision making and even decisions on the professional promotion of university teachers.

Even so, in the present-day context, the very creators of those indices admit that they were not designed for that purpose, given that different research areas, the age of the researcher, the country and the language spoken all have an influence on the index calculations.

Accordingly, this research sets out three indices designed to steer science towards its universal objective by valuing collaboration and the dissemination of knowledge.

It is hoped that the proposed indices may provoke new discussions and the proposal of new, more accurate indicators for the analysis of scientific research quality.
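The abstract does not describe how the three proposed indices are calculated, so they are not reproduced here. As background for the criticism the authors start from, the following minimal sketch of the standard h-index shows why purely citation-based measures depend on field size and career age; the citation counts are invented for illustration.

```python
# Minimal sketch of the standard h-index, shown only to illustrate why
# citation-only indices are sensitive to field and career age; it is NOT one
# of the three indices proposed by the authors.

def h_index(citation_counts):
    """Return the largest h such that h papers have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, citations in enumerate(counts, start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

# Two hypothetical researchers with the same number of papers: a senior author
# in a high-citation field outscores an early-career author in a small field,
# even if the quality of the work is comparable.
print(h_index([50, 40, 33, 20, 8, 4]))  # 5
print(h_index([9, 6, 3, 2, 1, 0]))      # 3
```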

URL : https://arxiv.org/abs/1807.07595

Scholarly Communication Librarians’ Relationship with Research Impact Indicators: An Analysis of a National Survey of Academic Librarians in the United States

Authors : Rachel Ann Miles, Stacy Konkiel, Sarah Sutton

INTRODUCTION

Academic librarians, especially in the field of scholarly communication, are often expected to understand and engage with research impact indicators. However, much of the current literature speculates about how academic librarians are using and implementing research impact indicators in their practice.

METHODS

This study analyzed the results from a 2015 survey administered to over 13,000 academic librarians at Carnegie-classified R1 institutions in the United States. The survey concentrated on academic librarians’ familiarity with and usage of research impact indicators.

RESULTS

This study uncovered findings related to academic librarians’ various levels of familiarity with research impact indicators and how they implement and use research impact indicators in their professional development and in their library job duties.

DISCUSSION

In general, academic librarians with regular scholarly communication support duties tend to have higher levels of familiarity with research impact indicators. Academic librarians are most familiar with citation counts and usage statistics and least familiar with altmetrics.

During consultations with faculty, the Journal Impact Factor (JIF) and citation counts are more likely to be addressed than the author h-index, altmetrics, qualitative measures, and expert peer reviews.

The survey results also hint at a growing interest in altmetrics among academic librarians for their professional advancement.
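As an illustration of how the association between duties and familiarity reported above could be examined, the sketch below cross-tabulates a hypothetical set of survey responses by duty profile and familiarity level. The column names and values are invented for the example and are not the survey's actual variables.

```python
# Hypothetical illustration: do librarians with regular scholarly communication
# (SC) duties report higher familiarity with research impact indicators?
import pandas as pd

responses = pd.DataFrame({
    "sc_duties": ["regular", "regular", "occasional", "none", "regular", "none"],
    "familiarity": ["high", "high", "medium", "low", "medium", "low"],
})

# Cross-tabulate familiarity level by duty profile, as row-wise proportions.
table = pd.crosstab(responses["sc_duties"], responses["familiarity"], normalize="index")
print(table)
```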

CONCLUSION

Academic librarians are continually challenged to keep pace with the changing landscape of research impact metrics and research assessment models. By keeping pace and implementing research impact indicators in their own practices, academic librarians can provide a crucial service to the wider academic community.

URL : Scholarly Communication Librarians’ Relationship with Research Impact Indicators: An Analysis of a National Survey of Academic Librarians in the United States

DOI : http://doi.org/10.7710/2162-3309.2212

Do altmetrics assess societal impact in the same way as case studies? An empirical analysis testing the convergent validity of altmetrics based on data from the UK Research Excellence Framework (REF)

Authors : Lutz Bornmann, Robin Haunschild, Jonathan Adams

Altmetrics have been proposed as a way to assess the societal impact of research. Although altmetrics are already in use as impact or attention metrics in different contexts, it is still not clear whether they really capture or reflect societal impact.

This study is based on altmetrics, citation counts, research output and case study data from the UK Research Excellence Framework (REF), and peers’ REF assessments of research output and societal impact. We investigated the convergent validity of altmetrics by using two REF datasets: publications submitted as research output (PRO) to the REF and publications referenced in case studies (PCS).

Case studies, which are intended to demonstrate societal impact, should cite the most relevant research papers. We used the MHq' indicator for assessing impact, an indicator introduced for count data with many zeros.

The results of the first part of the analysis show that news media as well as mentions on Facebook, in blogs, in Wikipedia, and in policy-related documents have higher MHq’ values for PCS than for PRO.

Thus, the altmetric indicators seem to have convergent validity for these data. In the second part of the analysis, altmetrics were correlated with REF reviewers' average scores on PCS. The negative or close-to-zero correlations call the convergent validity of altmetrics in that context into question.

We suggest that they may capture a different aspect of societal impact (which can be called unknown attention) from that seen by reviewers (who are interested in the causal link between research and action in society).
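A minimal sketch of the kind of convergent-validity check described in this abstract: correlating an altmetric count with reviewers' average scores per publication. The numbers below are invented, and the study's own MHq' indicator is not reproduced here.

```python
# Hedged sketch: rank correlation between an altmetric count and average
# reviewer scores. All values are hypothetical placeholders, not REF data.
from scipy.stats import spearmanr

tweet_counts    = [0, 3, 1, 12, 0, 5, 2, 0]                  # hypothetical altmetric counts per paper
reviewer_scores = [3.2, 2.8, 3.5, 2.9, 3.8, 3.0, 3.4, 3.1]   # hypothetical average reviewer scores

rho, p_value = spearmanr(tweet_counts, reviewer_scores)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.2f})")
# A rho close to zero or negative, as the study reports, would argue against
# convergent validity between altmetrics and peer assessments of societal impact.
```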

URL : https://arxiv.org/abs/1807.03977

An introduction to achieving policy impact for early career researchers

Authors : Megan C Evans, Christopher Cvitanovic

Scientists are increasingly required to demonstrate the tangible, real-world impacts arising from their research. Despite significant advances in scholarship dedicated to understanding and improving the relationships between science, policy and practice, much of the existing literature remains high-level, theoretical, and not immediately accessible to early career researchers (ECRs) working outside the policy sciences.

In this paper, we draw on the literature and our own experiences working in the environmental sciences to provide an accessible resource for ECRs seeking to achieve policy impact in their chosen field. First, we describe key concepts in public policy to provide sufficient background for the non-expert.

Next, we articulate a number of practical steps and tools that can help ECRs to identify and enhance the policy relevance of their research, better understand the policy world in practice and identify a range of pathways to achieving impact.

Finally, we draw on our personal experiences to highlight some of the key individual characteristics and values that are needed to operate more effectively at the interface of science, policy and practice.

Our hope is that the information and tools provided here can help to empower ECRs to create their own pathways to impact that best suit their individual goals, circumstances, interests and strengths.

URL : An introduction to achieving policy impact for early career researchers

Alternative location : https://www.nature.com/articles/s41599-018-0144-2

Google Scholar as a data source for research assessment

Authors : Emilio Delgado López-Cózar, Enrique Orduna-Malea, Alberto Martín-Martín

The launch of Google Scholar (GS) marked the beginning of a revolution in the scientific information market. This search engine, unlike traditional databases, automatically indexes information from the academic web. Its ease of use, together with its wide coverage and fast indexing speed, have made it the first tool most scientists currently turn to when they need to carry out a literature search.

Additionally, the fact that its search results were accompanied from the beginning by citation counts, as well as the later development of secondary products which leverage this citation data (such as Google Scholar Metrics and Google Scholar Citations), made many scientists wonder about its potential as a source of data for bibliometric analyses.

The goal of this chapter is to lay the foundations for the use of GS as a supplementary source (and in some disciplines, arguably the best alternative) for scientific evaluation.

First, we present a general overview of how GS works. Second, we present empirical evidence on its main characteristics (size, coverage, and growth rate). Third, we carry out a systematic analysis of the main limitations of this search engine as a tool for the evaluation of scientific performance.

Lastly, we discuss the main differences between GS and other more traditional bibliographic databases in light of the correlations found between their citation data. We conclude that Google Scholar presents a broader view of the academic world because it has brought to light a great amount of sources that were not previously visible.
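A minimal sketch of one way the coverage comparison discussed here could be expressed: the share of documents retrieved via Google Scholar that a traditional database also indexes. The DOI sets below are hypothetical placeholders, not data from the chapter.

```python
# Hedged sketch of a coverage-overlap comparison between two sources.
gs_dois  = {"10.1000/a", "10.1000/b", "10.1000/c", "10.1000/d", "10.1000/e"}
wos_dois = {"10.1000/a", "10.1000/c"}

overlap = gs_dois & wos_dois   # documents indexed by both sources
gs_only = gs_dois - wos_dois   # documents visible only via Google Scholar

print(f"Indexed by both sources: {len(overlap)}")
print(f"Visible only via Google Scholar: {len(gs_only)}")
print(f"Share of GS documents not covered elsewhere: {len(gs_only) / len(gs_dois):.0%}")
```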

URL : https://arxiv.org/abs/1806.04435

Collaboration Diversity and Scientific Impact

Authors : Yuxiao Dong, Hao Ma, Jie Tang, Kuansan Wang

The shift from individual effort to collaborative output has benefited science: work pursued collaboratively has increasingly led to more impactful research than work pursued individually.

However, understanding of how the diversity of a collaborative team influences the production of knowledge and innovation is sorely lacking. Here, we study this question by breaking down the process of scientific collaboration across 32.9 million papers from the last five decades.

We find that the probability of producing a top-cited publication increases with the diversity of a team of collaborators, namely the number of distinct institutions represented by the team.

We discover a striking phenomenon: a smaller yet more diverse team is more likely to generate highly innovative work than a relatively larger team drawn from a single institution.

We demonstrate that the synergy of collaboration diversity is universal across different generations, research fields, and tiers of institutions and individual authors.

Our findings suggest that collaboration diversity correlates strongly and positively with the production of scientific innovation, pointing to a potential rethinking of the policies used by funding agencies and authorities to fund research projects and, more broadly, of the principles used to organize teams, organizations, and societies.
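A minimal sketch of the core measurement described in this abstract: the share of top-cited papers as a function of the number of distinct institutions on a paper. The records and the top-cited threshold below are invented for illustration; the study itself uses 32.9 million papers and its own definition of top-cited work.

```python
# Hedged sketch: probability of a "top-cited" paper per level of institutional diversity.
import pandas as pd

papers = pd.DataFrame({
    "n_institutions": [1, 1, 2, 2, 3, 3, 4, 1, 2, 4],   # distinct institutions per paper (toy data)
    "citations":      [4, 10, 35, 8, 60, 12, 90, 2, 50, 15],
})

# Flag papers in the top 20% by citations (threshold chosen only for the example).
threshold = papers["citations"].quantile(0.80)
papers["top_cited"] = papers["citations"] >= threshold

# Share of top-cited papers for each level of institutional diversity.
print(papers.groupby("n_institutions")["top_cited"].mean())
```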

URL : https://arxiv.org/abs/1806.03694