Authors : David M. Nichols, Michael B. Twidale
The characterization of scholarly communication is dominated by citation-based measures. In this paper we propose several metrics to describe different facets of open access and open research.
We discuss measures to represent the public availability of articles along with their archival location, licenses, access costs, and supporting information. Calculations illustrating these new metrics are presented using the authors’ publications.
We argue that explicit measurement of openness is necessary for a holistic description of research outputs.
URL : http://hdl.handle.net/10289/10842
Authors : Qing Ke, Yong-Yeol Ahn, Cassidy R. Sugimoto
Metrics derived from Twitter and other social media—often referred to as altmetrics—are increasingly used to estimate the broader social impacts of scholarship. Such efforts, however, may produce highly misleading results, as the entities that participate in conversations about science on these platforms are largely unknown.
For instance, if altmetric activity is generated mainly by scientists, does it really capture the broader social impact of science? Here we present a systematic approach to identifying and analyzing scientists on Twitter.
Our method can identify scientists across many disciplines without relying on external bibliographic data, and can easily be adapted to identify other stakeholder groups in science.
We investigate the demographics, sharing behaviors, and interconnectivity of the identified scientists.
We find that Twitter has been employed by scholars across the disciplinary spectrum, with an over-representation of social, computer, and information scientists; an under-representation of mathematical, physical, and life scientists; and a better representation of women than in scholarly publishing.
Analysis of the sharing of URLs reveals a distinct imprint of scholarly sites, yet only a small fraction of shared URLs are science-related. We find assortative mixing with respect to discipline in the networks between scientists, suggesting that disciplinary walls are maintained in social media.
Our work contributes to the literature both methodologically and conceptually—we provide new methods for disambiguating and identifying particular actors on social media and describing the behaviors of scientists, thus providing foundational information for the construction and use of indicators on the basis of social media metrics.
Title : A systematic identification and analysis of scientists on Twitter
DOI : https://doi.org/10.1371/journal.pone.0175368
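The assortative-mixing finding above can be illustrated with a short sketch of Newman's attribute assortativity coefficient. This is not the authors' code, and the discipline labels and edges are hypothetical; it only shows what "assortative mixing with respect to discipline" measures:

```python
def attribute_assortativity(edges):
    """Newman's attribute assortativity coefficient for an undirected
    network whose edges are given as (label_u, label_v) pairs -- here,
    the disciplines of the two linked scientists.

    Returns 1.0 for perfectly assortative mixing (links stay within a
    discipline) and negative values for disassortative mixing.
    """
    labels = sorted({label for edge in edges for label in edge})
    idx = {label: i for i, label in enumerate(labels)}
    n = len(labels)

    # Symmetric mixing matrix e[i][j]: fraction of edge ends joining
    # discipline i to discipline j.
    e = [[0.0] * n for _ in range(n)]
    m = 2 * len(edges)
    for u, v in edges:
        e[idx[u]][idx[v]] += 1 / m
        e[idx[v]][idx[u]] += 1 / m

    a = [sum(row) for row in e]                # marginal distribution
    trace = sum(e[i][i] for i in range(n))     # within-discipline fraction
    sum_ab = sum(ai * ai for ai in a)
    if sum_ab == 1.0:
        return 1.0  # single-discipline network: trivially assortative
    return (trace - sum_ab) / (1 - sum_ab)
```

For example, a toy network where links stay within disciplines, such as `[("cs", "cs"), ("bio", "bio")]`, yields 1.0, while a single cross-disciplinary link `[("cs", "bio")]` yields -1.0; a positive value on a real Twitter network would indicate the disciplinary walls the abstract describes.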
« This report presents the findings and recommendations of the Independent Review of the Role of Metrics in Research Assessment and Management. The review was chaired by Professor James Wilsdon, supported by an independent and multidisciplinary group of experts in scientometrics, research funding, research policy, publishing, university management and administration.
This review has gone beyond earlier studies to take a deeper look at potential uses and limitations of research metrics and indicators. It has explored the use of metrics across different disciplines, and assessed their potential contribution to the development of research excellence and impact. It has analysed their role in processes of research assessment, including the next cycle of the Research Excellence Framework (REF). It has considered the changing ways in which universities are using quantitative indicators in their management systems, and the growing power of league tables and rankings. And it has considered the negative or unintended effects of metrics on various aspects of research culture. »
URL : http://microblogging.infodocs.eu/wp-content/uploads/2015/07/2015_metric_tide.pdf
Related URL : http://www.hefce.ac.uk/media/HEFCE,2014/Content/Pubs/Independentresearch/2015/The,Metric,Tide/2015_metric_tide.pdf
« Evaluating and comparing the academic performance of a journal, a researcher or a single paper has long been a critical, necessary but also controversial issue. Most existing metrics invalidate comparison across different fields of science, or even between different types of papers in the same field. This paper proposes a new metric, called return on citation (ROC), which is simply a citation ratio but applies to evaluating the paper, the journal and the researcher in a consistent way, allowing comparison across different fields of science and between different types of papers, and discouraging unnecessary and coercive/self-citation. »
URL : http://arxiv.org/abs/1412.8420
Open Access and the Changing Landscape of Research Impact Indicators: New Roles for Repositories :
« The debate about the need to revise metrics that evaluate research excellence has been ongoing for years, and a number of studies have identified important issues that have yet to be addressed. Internet and other technological developments have enabled the collection of richer data and new approaches to research assessment exercises. Open access strongly advocates for maximizing research impact by enhancing seamless accessibility. In addition, new tools and strategies have been used by open access journals and repositories to showcase how science can benefit from free online dissemination. Latest players in the debate include initiatives based on alt-metrics, which enrich the landscape with promising indicators. To start with, the article gives a brief overview of the debate and the role of open access in advancing a new frame to assess science. Next, the work focuses on the strategy that the Spanish National Research Council’s repository DIGITAL.CSIC is implementing to collect a rich set of statistics and other metrics that are useful for repository administrators, researchers and the institution alike. A preliminary analysis of data hints at correlations between free dissemination of research through DIGITAL.CSIC and enhanced impact, reusability and sharing of CSIC science on the web. »
URL : http://www.mdpi.com/2304-6775/1/2/56
Universality of scholarly impact metrics :
« We present a method to quantify the disciplinary bias of any scholarly impact metric, and introduce a simple universal metric that allows comparison of the impact of scholars across scientific disciplines. »
URL : http://arxiv.org/abs/1305.6339
Repositories in Google Scholar Metrics or what is this document type doing in a place as such? :
« The present paper analyzes GS Metrics, Google's newest product, which aims at ranking journals according to their h-index. Specifically, we analyze GS Metrics' decision to treat journals and repositories as equals and therefore to include both in the product. The authors position themselves against this decision and provide several arguments of different kinds warning against the product's shortcomings. The first is conceptual, relating to the definitions of journal and repository. Second, they address the methodological issues that mixing repositories and journals can bring about. They then examine in depth many other flaws that GS Metrics presents. Finally, GS Metrics and its possible use as an evaluation tool are discussed, and possible solutions to its shortcomings are provided. »
URL : http://cybermetrics.cindoc.csic.es/articles/v16i1p4.html