Towards a Book Publishers Citation Reports. First approach using the Book Citation Index :

“The absence of books and book chapters in the Web of Science Citation Indexes (SCI, SSCI and A&HCI) has always been considered an important flaw, but the Thomson Reuters ‘Book Citation Index’ database finally became available in October 2010, indexing 29,618 books and 379,082 book chapters. The Book Citation Index opens a new window of opportunity for analyzing these fields from a bibliometric point of view. The main objective of this article is to analyze different impact indicators for the scientific publishers included in the Book Citation Index for the Social Sciences and Humanities fields during 2006-2011. In this way we construct what we have called the ‘Book Publishers Citation Reports’. For this, we present a total of 19 rankings according to the different disciplines in Humanities & Arts and Social Sciences & Law, with six indicators for scientific publishers.”

URL : http://arxiv.org/abs/1207.7067
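
The rankings described above rest on a simple aggregation: book-level citation counts grouped by publisher and discipline. A minimal sketch of one such indicator (citations per book) follows; the sample data, field names, and choice of indicator are invented for illustration and do not reproduce the paper’s six indicators.

```python
# Hypothetical sketch: ranking publishers within a discipline by citations
# per indexed book. Records and the indicator are invented for illustration.
from collections import defaultdict

# Each record describes one indexed book: (publisher, discipline, citations).
records = [
    ("Springer", "History", 12),
    ("Routledge", "History", 7),
    ("Springer", "History", 3),
    ("Routledge", "Law", 9),
]

totals = defaultdict(lambda: {"books": 0, "citations": 0})
for publisher, discipline, citations in records:
    t = totals[(publisher, discipline)]
    t["books"] += 1
    t["citations"] += citations

# Rank by mean citations per book, one possible publisher-level indicator.
ranking = sorted(totals.items(),
                 key=lambda kv: kv[1]["citations"] / kv[1]["books"],
                 reverse=True)
for (publisher, discipline), t in ranking:
    print(discipline, publisher, round(t["citations"] / t["books"], 2))
```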

The Five Stars of Online Journal Articles — a Framework for Article Evaluation :

“I propose five factors — peer review, open access, enriched content, available datasets and machine-readable metadata — as the Five Stars of Online Journal Articles, a constellation of five independent criteria within a multi-dimensional publishing universe against which online journal articles can be evaluated, to see how well they match up with current visions for enhanced research communications. Achievement along each of these publishing axes can vary, analogous to the different stars within the constellation shining with varying luminosities. I suggest a five-point scale for each, by which a journal article can be evaluated, and provide diagrammatic representations for such evaluations. While the criteria adopted for these scales are somewhat arbitrary, and while the rating of a particular article on each axis may involve elements of subjective judgment, these Five Stars of Online Journal Articles provide a conceptual framework by which to judge the degree to which any article achieves or falls short of the ideal, which should be useful to authors, editors and publishers. I exemplify such evaluations using my own recent publications of relevance to semantic publishing.”

URL : http://www.dlib.org/dlib/january12/shotton/01shotton.html
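
Read concretely, an evaluation under this framework is just five independent scores, one per axis. A minimal sketch follows, assuming a 0-4 encoding of each five-point scale; the encoding and validation are illustrative, not the author’s own notation.

```python
# Minimal sketch: the Five Stars as a data structure with one five-point
# score per axis (encoded here as 0-4; the encoding is an assumption).
from dataclasses import dataclass, asdict

@dataclass
class FiveStarsRating:
    peer_review: int
    open_access: int
    enriched_content: int
    available_datasets: int
    machine_readable_metadata: int

    def __post_init__(self):
        # Reject scores outside the assumed five-point scale.
        for axis, score in asdict(self).items():
            if not 0 <= score <= 4:
                raise ValueError(f"{axis} must be between 0 and 4")

# Example evaluation of a hypothetical article.
rating = FiveStarsRating(peer_review=4, open_access=3, enriched_content=2,
                         available_datasets=4, machine_readable_metadata=1)
print(asdict(rating))
```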

Réseaux de co-citations et Open Access : pour un renouveau des méthodes d’évaluation :

“In recent years, the methodology for evaluating scientific publications has come under growing scrutiny, both from researchers and from the information professionals called upon to apply it. The classic model in use in most administrative bodies, which originates from the Institute for Scientific Information (ISI) in Philadelphia, shows a number of limitations, notably a lack of nuance and crude calculation methods that seem to favour the quantitative, accounting side at the expense of quality. Alternative models propose a fine-grained, nuanced application of the mathematical method of the random walk, through iterative execution of Lawrence Page’s algorithm (used by Google to rank and display Web pages) in its weighted form, known as Weighted PageRank. Combined with co-citation detection, this method makes it possible to visualize invisible colleges and affinities between journal titles and, above all, to trace citations. This traceability is decisive for the qualitative dimension of evaluation, which seems undervalued in the classic model.”

URL : http://archivesic.ccsd.cnrs.fr/sic_00589630/fr/
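
The computational core of the alternative model evoked above is the iterative Weighted PageRank calculation over a citation graph. A minimal sketch follows, with an invented three-node graph and the usual damping factor; it assumes every node has at least one outgoing edge (a real implementation must also handle dangling nodes).

```python
# Minimal sketch of iterative weighted PageRank on a tiny citation graph.
# Nodes could be journals; edge weights could be citation counts. All data
# and parameters here are invented for illustration.

edges = [("A", "B", 2.0), ("A", "C", 1.0), ("B", "C", 3.0), ("C", "A", 1.0)]
nodes = {"A", "B", "C"}
d, n_iter = 0.85, 50  # damping factor and number of iterations

rank = {v: 1.0 / len(nodes) for v in nodes}
out_weight = {v: sum(w for u, _, w in edges if u == v) for v in nodes}

for _ in range(n_iter):
    new_rank = {v: (1 - d) / len(nodes) for v in nodes}
    for u, v, w in edges:
        # Each node passes on its rank in proportion to its edge weights.
        new_rank[v] += d * rank[u] * w / out_weight[u]
    rank = new_rank

print(sorted(rank.items(), key=lambda kv: kv[1], reverse=True))
```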

Fractional counting of citations in research evaluation: An option for cross- and interdisciplinary assessments :

“In the case of the scientometric evaluation of multi- or interdisciplinary units, one risks comparing apples with oranges: each paper has to be assessed against an appropriate reference set. We suggest that the set of citing papers can first be considered as the relevant representation of the field of impact. In order to normalize for differences in citation behavior among fields, citations can be counted fractionally, in proportion to the length of the reference lists of the citing papers. This new method enables us to compare units with different disciplinary affiliations at the paper level and also to assess the statistical significance of differences among sets. Twenty-seven departments of Tsinghua University in Beijing are compared in this way. Among them, the Department of Chinese Language and Linguistics moves up from 19th to second position in the ranking. The overall impact of 19 of the 27 departments is not significantly different at the 5% level when thus normalized for different citation potentials.”

URL : http://arxiv.org/abs/1012.0359
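
The normalization is simple to state: instead of adding 1 for each citation received, one adds 1 divided by the length of the citing paper’s reference list, so that citations from fields with long reference lists weigh less. A minimal sketch with invented data:

```python
# Minimal sketch of fractional citation counting: each citation is weighted
# by 1 / (length of the citing paper's reference list). Data are invented.

# For each cited paper, the reference-list lengths of its citing papers.
citing_ref_list_lengths = {
    "linguistics_paper": [12, 9, 15],           # short lists: each citation weighs more
    "biomedicine_paper": [45, 60, 38, 52, 41],  # long lists: each weighs less
}

for paper, lengths in citing_ref_list_lengths.items():
    integer_count = len(lengths)
    fractional_count = sum(1.0 / n for n in lengths)
    print(f"{paper}: {integer_count} citations, "
          f"fractional count {fractional_count:.3f}")
```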

Web Services for Bibliometrics

Institutional repositories have spread in universities, where they provide services for recording, distributing, and preserving the institution’s intellectual output. When the Lausanne “academic server”, named SERVAL, was launched at the end of 2008, the Faculty of Biology and Medicine addressed the quality of its metadata from the outset. Accuracy is fundamental, since research funds are allocated on the basis of the statistics and indicators provided by the repository. The head of the faculty also charged the medical library with exploring different ways to measure and assess research output. The first step for the Lausanne university medical library was to implement the PubMed and Web of Science web services, in order to extract clean bibliographic information from these databases directly into the repository.
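
As an indication of what such an ingestion step can look like, the sketch below queries the public NCBI E-utilities ESummary service for one PubMed record. The repository-side code is not described in the source, so this is only an illustration; the Web of Science web service requires a licensed key and is omitted.

```python
# Minimal sketch: fetch clean bibliographic metadata for one PubMed ID via
# the public NCBI E-utilities ESummary service (JSON output).
import json
import urllib.request

def pubmed_summary(pmid: str) -> dict:
    """Return the ESummary record for one PubMed ID as a dictionary."""
    url = ("https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esummary.fcgi"
           f"?db=pubmed&id={pmid}&retmode=json")
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["result"][pmid]

record = pubmed_summary("1")  # substitute the PubMed ID of interest
print(record["title"], record["source"], record["pubdate"])
```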

The medical library is now testing other web services (from CrossRef, Web of Science, etc.), mainly to generate quantitative data on research impact. The approach is essentially based on citation linking. Although the utility of citation-based bibliometric evaluation is still debated, the most prevalent output measures used for research evaluation remain those based on citation analysis. Even when a new scientific evaluation indicator is proposed, such as the h-index, it remains tied to citations, and its results are often compared with those of citation analysis. The presentation will review the web services that might be used in institutional repositories to collect and aggregate citation information for researchers’ publications.
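
Citation linking of this kind can be illustrated with the public CrossRef REST API, which exposes a per-work citation count. The sketch below shows one possible building block, not the library’s actual tool chain.

```python
# Minimal sketch: retrieve the number of citing works CrossRef records for
# a DOI, via its public REST API.
import json
import urllib.request

def crossref_citation_count(doi: str) -> int:
    """Return CrossRef's 'is-referenced-by-count' for a DOI."""
    url = f"https://api.crossref.org/works/{doi}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["message"]["is-referenced-by-count"]

# A well-known DOI as a demonstration; substitute the DOI of interest.
print(crossref_citation_count("10.1038/171737a0"))
```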

URL : http://archivesic.ccsd.cnrs.fr/sic_00540289/fr/