Evaluation of research activities of universities of Ukraine and Belarus: a set of bibliometric indicators and its implementation

Authors : Vladimir Lazarev, Serhii Nazarovets, Alexey Skalaban

Monitoring the bibliometric indicators used in university rankings is considered part of a university library's activity. In order to carry out a comparative assessment of the research activities of the universities of Ukraine and Belarus, the authors introduce a set of bibliometric indicators.

A comparative assessment of the research activities of the corresponding universities was carried out, and data on the leading universities are presented. The sensitivity of one of the indicators to rapid changes in universities' research activity, together with the fact that the other is normalized across fields of science, gives the proposed set an advantage over the one used in practice in the corresponding national rankings.
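
The abstract does not spell out the indicators themselves, but the field normalization it mentions can be illustrated with a toy calculation; the universities, fields, citation counts and baseline values below are invented for the example and are not the authors' data.

```python
from collections import defaultdict

# Hypothetical records: (university, field, citations) for individual papers.
papers = [
    ("University A", "physics", 12),
    ("University A", "economics", 3),
    ("University B", "physics", 7),
    ("University B", "economics", 9),
]

# World average citations per paper in each field (illustrative numbers only).
field_baseline = {"physics": 10.0, "economics": 4.0}

# Field-normalized score: each paper's citations divided by its field baseline,
# averaged per university (the logic behind indicators such as MNCS).
totals, counts = defaultdict(float), defaultdict(int)
for university, field, citations in papers:
    totals[university] += citations / field_baseline[field]
    counts[university] += 1

for university in totals:
    print(university, round(totals[university] / counts[university], 2))
```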

URL : https://arxiv.org/abs/1711.02059

Usage Bibliometrics as a Tool to Measure Research Activity

Authors : Edwin A. Henneken, Michael J. Kurtz

Measures for research activity and impact have become an integral ingredient in the assessment of a wide range of entities (individual researchers, organizations, instruments, regions, disciplines).

Traditional bibliometric measures, such as publication- and citation-based indicators, provide an essential part of this picture, but cannot describe it completely.

Since reading scholarly publications is an essential part of the research life cycle, it is only natural to introduce measures for this activity in attempts to quantify the efficiency, productivity and impact of an entity.

Citations and reads are significantly different signals, so taken together, they provide a more complete picture of research activity. Most scholarly publications are now accessed online, making the study of reads and their patterns possible.

Click-stream logs allow us to follow information access by the entire research community, in real time. Publication and citation datasets reflect only the activity of authors. In addition, download statistics help us identify publications that have significant impact but do not attract many citations.
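
A toy illustration of using reads to flag publications that are widely downloaded yet rarely cited might look like the sketch below; the identifiers, counts and thresholds are invented for the example.

```python
# Hypothetical per-paper usage data: (identifier, reads, citations).
records = [
    ("2015ApJ...800....1X", 5200, 4),
    ("2016MNRAS.460....2Y", 300, 150),
    ("2017A&A...600....3Z", 4100, 2),
]

# Flag papers whose read counts are high while their citation counts stay low;
# the thresholds are arbitrary and only illustrate the idea.
READ_THRESHOLD = 1000
CITATION_CEILING = 10
for paper_id, reads, citations in records:
    if reads >= READ_THRESHOLD and citations < CITATION_CEILING:
        print(f"{paper_id}: {reads} reads, {citations} citations -- widely read, rarely cited")
```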

Click-stream signals are arguably more complex than, say, citation signals. For one, they are a superposition of different classes of readers. Systematic downloads by crawlers also contaminate the signal, as does browsing behavior.

We discuss the complexities associated with clickstream data and how, with proper filtering, statistically significant relations and conclusions can be inferred from download statistics.
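
The kind of filtering alluded to here can be sketched as follows; the log format, user-agent strings and deduplication rule are assumptions made for the illustration, not the authors' actual pipeline.

```python
import re

# Hypothetical access-log entries: (user_agent, ip, paper_id).
log = [
    ("Mozilla/5.0 (X11; Linux x86_64)", "10.0.0.5", "paper-42"),
    ("Googlebot/2.1 (+http://www.google.com/bot.html)", "66.249.66.1", "paper-42"),
    ("Mozilla/5.0 (X11; Linux x86_64)", "10.0.0.5", "paper-42"),  # repeat click
]

BOT_PATTERN = re.compile(r"bot|crawler|spider", re.IGNORECASE)

# Keep one read per (ip, paper) pair and drop known crawlers -- a crude stand-in
# for the statistical filtering described in the paper.
seen, reads = set(), []
for user_agent, ip, paper_id in log:
    if BOT_PATTERN.search(user_agent):
        continue
    key = (ip, paper_id)
    if key in seen:
        continue
    seen.add(key)
    reads.append((ip, paper_id))

print(f"{len(reads)} filtered reads out of {len(log)} raw requests")
```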

We describe how download statistics can be used to describe research activity at different levels of aggregation, ranging from organizations to countries. These statistics show a correlation with socio-economic indicators.
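
As a rough sketch of aggregating filtered reads at the country level and relating them to a socio-economic indicator, one might proceed as below; all figures and the choice of indicator are invented, and `statistics.correlation` requires Python 3.10+.

```python
import statistics

# Hypothetical yearly read counts per country and an illustrative socio-economic
# indicator (here labelled as R&D spending, in arbitrary units).
reads_per_country = {"NL": 120_000, "DE": 480_000, "UA": 60_000, "BY": 30_000}
rnd_spending = {"NL": 18.0, "DE": 110.0, "UA": 0.6, "BY": 0.3}

countries = sorted(reads_per_country)
x = [reads_per_country[c] for c in countries]
y = [rnd_spending[c] for c in countries]

# Pearson correlation between national read counts and the indicator.
print(round(statistics.correlation(x, y), 3))
```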

A comparison will be made with traditional bibliometric indicators. We will argue that astronomy is representative of more general trends.

URL : https://arxiv.org/abs/1706.02153

A scientists’ view of scientometrics: Not everything that counts can be counted

Authors : Ralph Kenna, Olesya Mryglod, Bertrand Berche

Like it or not, attempts to evaluate and monitor the quality of academic research have become increasingly prevalent worldwide. Performance reviews range from the level of individuals, through research groups and departments, to entire universities.

Many of these are informed by, or are functions of, simple scientometric indicators, and the results of such exercises impact careers, funding and prestige. However, there is sometimes a failure to appreciate that scientometrics are, at best, very blunt instruments, and their incorrect usage can be misleading.
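
The h-index is presumably among the "simple scientometric indicators" the authors have in mind; a minimal computation, with made-up citation counts, shows how little of a publication record a single number captures.

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations."""
    cites = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(cites, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Two hypothetical researchers with the same h-index but very different records,
# hinting at why such an indicator is a blunt instrument.
print(h_index([100, 50, 20, 3, 1]))  # prints 3
print(h_index([3, 3, 3, 0, 0]))      # also prints 3
```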

Rather than accepting the rise and fall of individuals and institutions on the basis of such imprecise measures, calls have been made for indicators to be regularly scrutinised and for improvements to the evidence base in this area.

It is thus incumbent upon the scientific community, especially the physics, complexity-science and scientometrics communities, to scrutinise metric indicators. Here, we review recent attempts to do this and show that some metrics in widespread use cannot be used as reliable indicators of research quality.

URL : https://arxiv.org/abs/1703.10407

Evaluating research outputs: innovations in the humanities and social sciences (SHS)?

Contents :

  • Opening address, F Ruggiu
  • Evaluating the quality of publications in economics, C Bosquet [et al.]
  • A testimony: the case of psychology journals, J Pétard
  • RIBAC: a tool serving research actors in the humanities and social sciences, M Dassa [et al.]
  • Evaluation in the humanities and social sciences: how to measure what counts, MC Maurel
  • Journal rankings in the humanities and social sciences: new European perspectives, G Mirdal
  • Open Access and the evaluation of research outputs in the European Research Area, C Ramjoué
  • An introduction to open access in research, C Kosmopoulos
  • JournalBase: an international comparative study of databases of scholarly journals in the humanities and social sciences (SHS), M Dassa [et al.]
  • Research indicators in the humanities and social sciences, J Dubucs
  • Scholarly evaluation in the humanities and social sciences: methodological questions and prospective solutions, G Filliatreau
  • The humanities and social sciences through the prism of AERES evaluation, P Glaude

URL : http://journalbase.sciencesconf.org/conference/journalbase/eda_fr.pdf

The involvement of French-speaking university libraries in research evaluation through the handling of scholarly publications: Belgium – France – Switzerland – Canada :

« For many years the digital revolution has directly affected information and documentation professionals. More recently, new pressures have been weighing on libraries. Economic, institutional and scientific constraints are pushing the sector to question itself down to its very foundations. Through a survey addressed to the staff of French-speaking university libraries in Belgium, France, Switzerland and Canada, this dissertation attempts to assess the extent to which university library staff perceive the changes under way in their professional environment in general, and in particular with regard to research evaluation and the handling of scholarly publications. The results reveal national disparities and support some overall observations. In general, staff are favourable to changes in the handling of scholarly publications and to research evaluation. But overall, they are neither involved in, nor prepared for, a greater integration of their activities into the research context within their institution. »

URL : http://memsic.ccsd.cnrs.fr/mem_00741049

Towards a Book Publishers Citation Reports. First approach using the Book Citation Index :

« The absence of books and book chapters in the Web of Science Citation Indexes (SCI, SSCI and A&HCI) has always been considered an important flaw, but the Thomson Reuters 'Book Citation Index' database finally became available in October 2010, indexing 29,618 books and 379,082 book chapters. The Book Citation Index opens a new window of opportunities for analyzing these fields from a bibliometric point of view. The main objective of this article is to analyze different impact indicators referring to the scientific publishers included in the Book Citation Index for the Social Sciences and Humanities fields during 2006-2011. In this way we construct what we have called the 'Book Publishers Citation Reports'. For this, we present a total of 19 rankings according to the different disciplines in Humanities & Arts and Social Sciences & Law, with six indicators for scientific publishers. »
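
The publisher-level aggregation behind such rankings can be sketched as follows; the records are invented and the two summary statistics shown are illustrative stand-ins, not the six indicators actually used in the article.

```python
from collections import defaultdict

# Hypothetical Book Citation Index records: (publisher, discipline, citations).
books = [
    ("Springer",  "History",   14),
    ("Springer",  "History",    2),
    ("Routledge", "History",    9),
    ("Routledge", "Sociology",  5),
]

# Per publisher and discipline: number of indexed books, total and mean citations.
stats = defaultdict(lambda: {"books": 0, "cites": 0})
for publisher, discipline, cites in books:
    key = (publisher, discipline)
    stats[key]["books"] += 1
    stats[key]["cites"] += cites

for (publisher, discipline), s in sorted(stats.items()):
    mean = s["cites"] / s["books"]
    print(f"{publisher:10s} {discipline:10s} books={s['books']} cites={s['cites']} mean={mean:.1f}")
```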

URL : http://arxiv.org/abs/1207.7067

The Five Stars of Online Journal Articles — a Framework for Article Evaluation :

« I propose five factors — peer review, open access, enriched content, available datasets and machine-readable metadata — as the Five Stars of Online Journal Articles, a constellation of five independent criteria within a multi-dimensional publishing universe against which online journal articles can be evaluated, to see how well they match up with current visions for enhanced research communications. Achievement along each of these publishing axes can vary, analogous to the different stars within the constellation shining with varying luminosities. I suggest a five-point scale for each, by which a journal article can be evaluated, and provide diagrammatic representations for such evaluations. While the criteria adopted for these scales are somewhat arbitrary, and while the rating of a particular article on each axis may involve elements of subjective judgment, these Five Stars of Online Journal Articles provide a conceptual framework by which to judge the degree to which any article achieves or falls short of the ideal, which should be useful to authors, editors and publishers. I exemplify such evaluations using my own recent publications of relevance to semantic publishing. »
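
A minimal data-structure sketch of such a five-axis evaluation might look like this; the 0-4 scale bounds and the example scores are assumptions, while the five criteria themselves come from the article.

```python
from dataclasses import dataclass

@dataclass
class FiveStarRating:
    """Score an article on each of the five criteria (assumed 0-4 scale)."""
    peer_review: int
    open_access: int
    enriched_content: int
    available_datasets: int
    machine_readable_metadata: int

    def profile(self):
        # Return the per-axis scores as a dictionary, e.g. for plotting.
        return {name: getattr(self, name) for name in (
            "peer_review", "open_access", "enriched_content",
            "available_datasets", "machine_readable_metadata")}

# Example evaluation of a hypothetical article.
article = FiveStarRating(peer_review=3, open_access=4, enriched_content=2,
                         available_datasets=1, machine_readable_metadata=2)
print(article.profile())
```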

URL : http://www.dlib.org/dlib/january12/shotton/01shotton.html