Traditional metrics largely overlook grey literature. The new altmetrics, introduced in 2010 as "new, online scholarly tools" that allow us "to make new filters" (Altmetrics Manifesto), can include all kinds of scholarly output, which makes them interesting for grey literature.
The topic of our paper is the connection between altmetrics and grey literature. Do altmetrics offer new opportunities for the development and impact of grey literature?
In particular, the paper explores how altmetrics could add value to grey literature, especially how reference managers, repositories, academic search engines and social networks can produce altmetrics for dissertations, reports, conference papers etc.
We also explore whether and how new altmetric tools incorporate grey literature as a source for impact assessment. The discussion analyses the potential but also the limits of the actual application of altmetrics to grey literature and highlights the importance of unique identifiers, above all the DOI.
For the moment, grey literature has missed the opportunity to get on board with the new movement.
However, getting grey literature into the heart of the coming mainstream adoption of altmetrics is not only essential for the future of grey literature in open science but also for academic and institutional control of research output and societal impact. This can be a special mission for academic librarians.
The use of Web 2.0 has become an essential part of present-day life. People use Web 2.0 social media services for many purposes, and academic activities are prominent among them: searching, sharing, discussing, and messaging scholarly content.
The wider use of social media has given birth to various buzzwords, and 'altmetrics' is one of them. In simple terms, altmetrics provide online measures of scholars or scholarly content derived from Web 2.0 social media platforms.
Altmetrics are diverse in nature and fall into five categories: (i) recommended, (ii) cited, (iii) saved, (iv) discussed and (v) viewed. Altmetrics are becoming widely used by publishers (to showcase the research impact of authors to readers), by librarians and repository managers (to add value to their libraries and institutional repositories) and by researchers (to complement reading by instantly visualising a paper's online attention).
The study aims to investigate the relationships between the consumption of e-journals distributed via the Elsevier ScienceDirect platform, publication (articles) and impact (citations) in a sample of 13 French universities, from 2003 to 2009.
It adopts a value perspective as it questions whether or not publication activity and impact are some kind of return led by consumption. A bibliometric approach was used to explore the relations between these three variables.
The analysis developed indicators inspired by the mathematical h-Index technique. Results show that the relation between consumption, publication and citations depends on the discipline’s profile, the intensity of research and the size of each institution.
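The abstract does not specify how the h-Index-inspired indicators were constructed, but the classic h-index computation they build on can be sketched as follows (a generic implementation with illustrative numbers, not the study's actual indicators or data):

```python
def h_index(counts):
    """Return the largest h such that h items each have a count >= h.

    The classic h-index applies this to an author's citation counts;
    the same ranking idea can be adapted to consumption or
    publication data, as the study does.
    """
    h = 0
    for rank, c in enumerate(sorted(counts, reverse=True), start=1):
        if c >= rank:
            h = rank  # the first `h` items all have counts >= h
        else:
            break
    return h

# Example: citation counts for six papers; four papers have >= 4 citations
print(h_index([10, 8, 5, 4, 3, 0]))  # 4
```

The same function works unchanged whether `counts` holds citations per article, downloads per journal, or articles per year, which is what makes the h-index pattern easy to transpose across the three variables the study relates.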
Moreover, although relations have been observed between the three variables, it is not possible to determine which variable comes first to explain the phenomena. The study concludes by showing strong correlations, which nevertheless do not lead to clear causal relations.
The article has practical implications: academic library managers who want to show the added value of their e-journal collections can replicate the study's approach, and policy makers can take e-journal usage into account as an informative tool to predict the importance of publication activity.
Originality: The study is the first French contribution to e-journal value studies. Its originality consists in developing a value viewpoint that relies on a bibliometric approach.
Authors : Frank Mueller-Langer, Marc Scheufen, Patrick Waelbroeck
Universities in developing countries have rarely been able to subscribe to academic journals in the past. The “Online Access to Research in the Environment” initiative (OARE) provides institutions in developing countries with free online access to more than 5,700 environmental science journals.
Here we analyze the effect of OARE registration on scientific output by research institutions in five developing countries. We apply a difference-in-difference estimation method using panel data for 18,955 journal articles from 798 research institutions.
We find that online access via OARE increases publication output by at least 43% while lower-ranked institutions located in remote areas benefit less. These results are robust when we apply instrumental variables to account for the information diffusion process and a Bayesian estimation method to control for self-selection into the initiative.
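The difference-in-difference logic behind the estimate can be illustrated with a minimal numerical sketch (all figures below are toy numbers, not the study's data):

```python
# Difference-in-differences: compare the change in publication output
# of OARE-registered ("treated") institutions with the change in
# non-registered ("control") institutions over the same period.
# The control group's change absorbs the common time trend, so the
# remaining difference is attributed to OARE access.

# Mean articles per institution (before, after) the registration period
treated_before, treated_after = 10.0, 16.0
control_before, control_after = 9.0, 11.0

did = (treated_after - treated_before) - (control_after - control_before)
print(did)  # 4.0: treatment effect net of the common time trend
```

In the study itself this comparison is run as a panel regression over 18,955 articles rather than group means, but the identifying subtraction is the same.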
Authors : Trisha Greenhalgh, James Raftery, Steve Hanney, Matthew Glover
Impact occurs when research generates benefits (health, economic, cultural) in addition to building the academic knowledge base. Its mechanisms are complex and reflect the multiple ways in which knowledge is generated and utilised.
Much progress has been made in measuring both the outcomes of research and the processes and activities through which these are achieved, though the measurement of impact is not without its critics.
We review the strengths and limitations of six established approaches (Payback, Research Impact Framework, Canadian Academy of Health Sciences, monetisation, societal impact assessment, UK Research Excellence Framework) plus recently developed and largely untested ones (including metrics and electronic databases).
We conclude that (1) different approaches to impact assessment are appropriate in different circumstances; (2) the most robust and sophisticated approaches are labour-intensive and not always feasible or affordable; (3) whilst most metrics tend to capture direct and proximate impacts, more indirect and diffuse elements of the research-impact link can and should be measured; and (4) research on research impact is a rapidly developing field with new methodologies on the horizon.
Many research libraries are looking for new ways to demonstrate value for their parent institutions. Metrics, assessment, and promotion of research continue to grow in importance, but have not always fallen into the scope of services for the research library.
Montana State University (MSU) Library recognized a need and interest to quantify the citation record and scholarly output of our university. With this vision in mind, we began positioning citation collection as the data engine that drives scholarly communication, deposits into our IR, and assessment of research activities.
We envisioned a project that might: provide transparency around the acts of scholarship at our university; celebrate the research we produce; and build new relationships between our researchers.
Authors : Simon Wakeling, Peter Willett, Claire Creaser, Jenny Fry, Stephen Pinfield, Valérie Spezi
In this paper we present the first comprehensive bibliometric analysis of eleven open-access mega-journals (OAMJs).
OAMJs are a relatively recent phenomenon, and have been characterised as having four key characteristics: large size; broad disciplinary scope; a Gold-OA business model; and a peer-review policy that seeks to determine only the scientific soundness of the research rather than evaluate the novelty or significance of the work. Our investigation focuses on four key modes of analysis: journal outputs (the number of articles published and changes in output over time); OAMJ author characteristics (nationalities and institutional affiliations); subject areas (the disciplinary scope of OAMJs, and variations in sub-disciplinary output); and citation profiles (the citation distributions of each OAMJ, and the impact of citing journals).
We found that while the total output of the eleven mega-journals grew by 14.9% between 2014 and 2015, this growth is largely attributable to the increased output of Scientific Reports and Medicine.
We also found substantial variation in the geographical distribution of authors. Several journals have a relatively high proportion of Chinese authors, and we suggest this may be linked to these journals’ high Journal Impact Factors (JIFs).
The mega-journals were also found to vary in subject scope, with several journals publishing disproportionately high numbers of articles in certain sub-disciplines.
Our citation analysis offers support for Björk & Catani's suggestion that OAMJs' citation distributions can be similar to those of traditional journals, while noting considerable variation in citation rates across the eleven titles.
We conclude that while the OAMJ term is useful as a means of grouping journals which share a set of key characteristics, there is no such thing as a “typical” mega-journal, and we suggest several areas for additional research that might help us better understand the current and future role of OAMJs in scholarly communication.