The open access movement seeks to encourage all researchers to make their works openly available and free of paywalls so more people can access their knowledge. Yet some researchers who study open access (OA) continue to publish their work in paywalled journals and fail to make it open.
This project set out to study just how many published research articles about OA fall into this category, how many are being made open (whether by being published in a gold OA or hybrid journal or through open deposit), and how library and information science authors compare to authors from other disciplines researching this field.
Given the growing number of tools available to help researchers find open versions of articles, this study also compared these new tools with Google Scholar in their ability to disseminate OA research.
From a sample of articles published since 2010, collected from Web of Science, the study found that although a majority of research articles about OA are open in some form, a little more than a quarter are not.
Library science researchers made their work open at a lower rate than non-library science researchers. An examination of the copyright status of articles published in hybrid and open journals showed that authors were more likely to retain copyright ownership when publishing in an open journal than in a hybrid journal.
Articles were more likely to be published with a Creative Commons license if published in an open journal compared to those published in hybrid journals.
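The OA-discovery tools mentioned above typically resolve a DOI to its open-access status. A minimal sketch of that lookup, assuming the public Unpaywall v2 API as one representative tool (the study may have used a different one); the field names `is_oa` and `oa_status` follow Unpaywall's documented response format:

```python
"""Hedged sketch: checking a DOI's open-access status via the Unpaywall v2 API.

Unpaywall is one example of the newer OA-discovery tools; field names follow
its public API, and the categories mirror those used in the abstract above.
"""
import json
import urllib.request

UNPAYWALL = "https://api.unpaywall.org/v2/{doi}?email={email}"


def unpaywall_url(doi: str, email: str) -> str:
    """Build the request URL for one DOI (the API requires an email address)."""
    return UNPAYWALL.format(doi=doi, email=email)


def classify(record: dict) -> str:
    """Reduce an Unpaywall record to the abstract's categories:
    'gold' (OA journal), 'hybrid', 'green' (repository deposit), or 'closed'."""
    if not record.get("is_oa"):
        return "closed"
    status = record.get("oa_status", "")
    return status if status in {"gold", "hybrid", "green"} else "other-oa"


def fetch_status(doi: str, email: str) -> str:
    """Network call: fetch and classify one DOI (not exercised offline)."""
    with urllib.request.urlopen(unpaywall_url(doi, email)) as resp:
        return classify(json.load(resp))
```

For example, `classify({"is_oa": True, "oa_status": "green"})` returns `"green"`, corresponding to an article made open through deposit rather than at the journal.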
Funding acknowledgements found in scientific publications have been used to study the impact of funding on research since the 1970s. However, no broad-scale indexing of this paratextual element existed until 2008, when Thomson Reuters' Web of Science began adding funding acknowledgement information to its bibliographic records.
Because this information adds a new dimension to bibliometric data that can be systematically exploited, it is important to understand its characteristics and the implications for its use.
This paper analyses the presence and distribution of funding acknowledgement data covered in Web of Science.
Our results show that before 2009, funding acknowledgement coverage is extremely low and therefore not reliable. Since 2008, funding information has been collected mainly for publications indexed in the Science Citation Index Expanded (SCIE); more recently (2015), funding texts have also been indexed for publications in the Social Sciences Citation Index (SSCI).
Arts & Humanities Citation Index (AHCI) content is not indexed for funding acknowledgement data. Moreover, English-language publications are the most reliably covered.
Finally, not all types of documents are equally covered for funding information indexation and only articles and reviews show consistent coverage.
This characterization of the funding acknowledgement information collected by Thomson Reuters can therefore help users understand both the possibilities offered by the data and their limitations.
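The coverage patterns described above can be checked directly against a Web of Science tagged plain-text export, where funding data appear under the FU (funding agency and grant) and FX (funding text) field tags. A minimal sketch, assuming the standard two-letter-tag export format and ignoring file header/footer lines:

```python
"""Hedged sketch: measuring funding-acknowledgement coverage in a
Web of Science tagged plain-text export (FU/FX field tags)."""


def parse_wos_records(text: str) -> list[dict]:
    """Parse a WoS tagged export into one dict per record.
    Records end with an 'ER' line; continuation lines start with spaces."""
    records, current, tag = [], {}, None
    for line in text.splitlines():
        if line.startswith("ER"):          # end of record
            records.append(current)
            current, tag = {}, None
        elif line[:2].strip():             # new field tag, e.g. 'PY', 'FU'
            tag = line[:2]
            current[tag] = line[3:].strip()
        elif tag:                          # continuation of the current field
            current[tag] += " " + line.strip()
    return records


def funding_coverage(records: list[dict]) -> float:
    """Share of records carrying any funding acknowledgement (FU or FX)."""
    if not records:
        return 0.0
    funded = sum(1 for r in records if "FU" in r or "FX" in r)
    return funded / len(records)
```

Running `funding_coverage` over records grouped by publication year (PY tag) or document type (DT tag) would reproduce the kind of coverage breakdown the paper reports.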
Journal classification systems play an important role in bibliometric analyses. The two most important bibliographic databases, Web of Science and Scopus, each provide a journal classification system. However, no study has systematically investigated the accuracy of these classification systems. To examine and compare the accuracy of journal classification systems, we define two criteria on the basis of direct citation relations between journals and categories.
We use Criterion I to select journals that have weak connections with their assigned categories, and we use Criterion II to identify journals that are not assigned to categories with which they have strong connections. If a journal satisfies either of the two criteria, we conclude that its assignment to categories may be questionable. Accordingly, we identify all journals with questionable classifications in Web of Science and Scopus.
Furthermore, we perform a more in-depth analysis for the field of Library and Information Science to assess whether our proposed criteria are appropriate and whether they yield meaningful results. It turns out that according to our citation-based criteria Web of Science performs significantly better than Scopus in terms of the accuracy of its journal classification system.
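The two criteria can be illustrated with a small sketch. Here the connection strength of a journal to a category is taken to be the share of the journal's citation links pointing to journals in that category; the thresholds `T_WEAK` and `T_STRONG` are illustrative assumptions, not the paper's actual values:

```python
"""Hedged sketch of the two citation-based criteria for flagging
questionable journal-to-category assignments. Thresholds are assumptions."""
T_WEAK, T_STRONG = 0.05, 0.30


def connection_strengths(citations_to_cat: dict[str, int]) -> dict[str, float]:
    """Normalise a journal's citation counts per category to shares."""
    total = sum(citations_to_cat.values())
    return {c: n / total for c, n in citations_to_cat.items()} if total else {}


def questionable(assigned: set[str], citations_to_cat: dict[str, int]) -> bool:
    """Criterion I: some assigned category is only weakly connected.
    Criterion II: some strongly connected category is not assigned."""
    s = connection_strengths(citations_to_cat)
    crit1 = any(s.get(c, 0.0) < T_WEAK for c in assigned)
    crit2 = any(v >= T_STRONG and c not in assigned for c, v in s.items())
    return crit1 or crit2
```

For instance, a journal assigned to Library and Information Science whose citations go almost entirely to Computer Science journals would be flagged by Criterion I, while one never assigned to a category receiving a large share of its citations would be flagged by Criterion II.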
We compare the visibility and performance of Latin American and Caribbean (LAC) publications in the Core collection indexes included in the Web of Science (WoS) — Science Citation Index Expanded, Social Sciences Citation Index, and Arts & Humanities Citation Index — and the Scielo Citation Index (SciELO CI) which was integrated into the larger WoS platform in 2014.
The purpose of this comparison is to contribute to our understanding of the communication of scientific knowledge produced in Latin America and the Caribbean, and to provide some reflections on the potential benefits of the articulation of regional indexing exercises into WoS for a better understanding of geographic and disciplinary contributions.
How is the regional level of SciELO CI related to the global one of WoS? In WoS, LAC authors are integrated at the global level in international networks; for example, as postdocs. In SciELO CI, south-south collaboration is more central, and the focus is shifted towards social problems. The articulation of SciELO into WoS may improve the international standardization (for example, of referencing) in the regional journals, but comes at the price of losing independence of the journal inclusion criteria.
« The international databases of the Institute for Scientific Information (ISI) are indispensable but incomplete tools for evaluating research performance and for providing statistical indicators on the volume of a country's scientific output. In this context, we present the results of a bibliometric study of the scientific output of the Faculté de Médecine et de Pharmacie-Casablanca. We emphasize the possibilities offered by open access (the green road and the gold road) for increasing the visibility of local output. »
A Comparison between Two Main Academic Literature Collections: Web of Science and Scopus Databases
« Nowadays, the world's scientific community publishes an enormous number of papers in different scientific fields. In such an environment, it is essential to know which databases are equally efficient and objective for literature searches. The two most extensive databases appear to be Web of Science and Scopus. Besides supporting literature searches, these two databases are used to rank journals in terms of their productivity and the total citations received, indicating the journals' impact, prestige, or influence. This article attempts to provide a comprehensive comparison of these databases to answer questions that researchers frequently ask, such as: How are Web of Science and Scopus different? In which aspects are these two databases similar? Or, if researchers are forced to choose one of them, which one should they prefer? To answer these questions, the two databases are compared on their qualitative and quantitative characteristics. »
Institutional repositories have spread in universities, where they provide services for recording, distributing, and preserving the institution's intellectual output. When the Lausanne « academic server », named SERVAL, was launched at the end of 2008, the Faculty of Biology and Medicine addressed the issue of metadata quality from the outset. Accuracy is fundamental, since research funds are allocated on the basis of the statistics and indicators provided by the repository. The head of the faculty also charged the medical library with exploring different ways to measure and assess research output. The first step for the Lausanne university medical library was to implement the PubMed and Web of Science web services, so as to extract clean bibliographic information from the databases directly into the repository.
Now the medical library is testing other web services (from CrossRef, Web of Science, etc.), mainly to generate quantitative data on research impact. The approach is essentially based on citation linking. Although the utility of citation-based bibliometric evaluation is still debated, the most prevalent output measures used for research evaluation remain those based on citation analysis. Even when a new scientific evaluation indicator is proposed, such as the h-index, its link with citation is always apparent, and the results of a new indicator are often compared with citation analysis. The presentation will review the web services that might be used in institutional repositories to collect and aggregate citation information for researchers' publications.
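One of the web services mentioned, the CrossRef REST API, exposes a per-DOI citation tally in the `is-referenced-by-count` field of its `works` endpoint. A minimal sketch of aggregating such counts over a researcher's publication list, assuming that documented field (the repository integration itself is simplified away):

```python
"""Hedged sketch: aggregating citation counts for repository publications
via the CrossRef REST API ('works' endpoint, 'is-referenced-by-count')."""
import json
import urllib.request


def crossref_url(doi: str) -> str:
    """Build the 'works' endpoint URL for one DOI."""
    return f"https://api.crossref.org/works/{doi}"


def citation_count(message: dict) -> int:
    """Extract the citation count from a CrossRef 'works' response message."""
    return int(message.get("is-referenced-by-count", 0))


def fetch_count(doi: str) -> int:
    """Network call: fetch one DOI's citation count (not exercised offline)."""
    with urllib.request.urlopen(crossref_url(doi)) as resp:
        return citation_count(json.load(resp)["message"])


def aggregate(messages: list[dict]) -> int:
    """Total citations over a list of CrossRef response messages,
    e.g. one per publication recorded in the repository."""
    return sum(citation_count(m) for m in messages)
```

A repository could run `aggregate` over the CrossRef records for each researcher's DOIs to produce the kind of citation-based indicator discussed above.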