Author : Science-Metrix Inc.
This report details population-level measurements of the open access (OA) availability of publications indexed in two bibliometric databases—the Web of Science (WoS) by Clarivate Analytics and Scopus by Elsevier. This was achieved by matching the bibliometric database populations to the 1science database to determine the availability of the papers in OA form.
A comparative analysis of the recall and precision levels of the 1science database was performed using Scopus and the WoS. This helped to characterize the 1science database. Two policy-relevant indicators were selected for in-depth analyses: country affiliation of authors on publications, and scientific disciplines. These indicators were selected because they are very frequently used in bibliometric studies, including those performed by the National Science Foundation (NSF), and they appear in the NSF’s Science and Engineering Indicators.
URL : http://www.science-metrix.com/sites/default/files/science-metrix/publications/science-metrix_open_access_availability_scientific_publications_report.pdf
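The recall/precision comparison described above can be sketched in a few lines; the function and the sample DOI sets below are hypothetical illustrations, not Science-Metrix's actual matching pipeline.

```python
# Sketch: recall/precision of an OA-discovery database (a stand-in for
# 1science) measured against a verified gold standard. DOIs are invented.

def recall_precision(reported_oa, verified_oa):
    """reported_oa: DOIs the database reports as available in OA form;
    verified_oa: DOIs independently verified to be OA."""
    true_positives = len(reported_oa & verified_oa)
    recall = true_positives / len(verified_oa) if verified_oa else 0.0
    precision = true_positives / len(reported_oa) if reported_oa else 0.0
    return recall, precision

reported = {"10.1000/a", "10.1000/b", "10.1000/c"}  # database says OA
verified = {"10.1000/a", "10.1000/b", "10.1000/d"}  # actually OA
r, p = recall_precision(reported, verified)          # both 2/3 here
```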
Authors : Jeroen Bosman, Bianca Kramer
Across the world there is growing interest in open access publishing among researchers, institutions, funders and publishers alike. It is assumed that open access levels are growing, but hitherto the exact levels and patterns of open access have been hard to determine and detailed quantitative studies are scarce.
Using newly available open access status data from oaDOI in Web of Science, we are now able to explore year-on-year open access levels across research fields, languages, countries, institutions, funders and topics, and try to relate the resulting patterns to disciplinary, national and institutional contexts.
With data from the oaDOI API we also look at the detailed breakdown of open access by types of gold open access (pure gold, hybrid and bronze), using universities in the Netherlands as an example.
There is huge diversity in open access levels on all dimensions, with unexpected levels for, for example, Portuguese-language publications, Astronomy & Astrophysics as a research field, countries such as Tanzania, Peru and Latvia, and Zika as a topic.
We explore methodological issues and offer suggestions to improve conditions for tracking open access status of research output. Finally, we suggest potential future applications for research and policy development. We have shared all data and code openly.
Title : Open access levels: a quantitative exploration using Web of Science and oaDOI data
DOI : https://doi.org/10.7287/peerj.preprints.3520v1
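The gold/hybrid/bronze breakdown drawn from the oaDOI API can be illustrated with a small classifier; the record fields below are simplified stand-ins for the API's actual response shape, so treat this as a sketch rather than a faithful client.

```python
def oa_type(record):
    """Classify a simplified oaDOI-style record into the OA types used in
    the breakdown above. Field names approximate the API's vocabulary."""
    if not record.get("is_oa"):
        return "closed"
    if record.get("journal_is_oa"):
        return "pure gold"                       # fully OA journal
    if record.get("host_type") == "publisher":
        # OA at the publisher but not in a fully OA journal: an explicit
        # license makes it hybrid, the absence of one makes it bronze
        return "hybrid" if record.get("license") else "bronze"
    return "green"                               # repository copy

example = {"is_oa": True, "journal_is_oa": False,
           "host_type": "publisher", "license": None}
kind = oa_type(example)                          # "bronze"
```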
Authors : Rodrigo Costas, Jeroen van Honk, Thomas Franssen
In this paper we present a novel methodology for identifying scholars with a Twitter account. By combining bibliometric data from Web of Science and Twitter users identified by Altmetric.com we have obtained the largest set of individual scholars matched with Twitter users made so far.
Our methodology combines matching algorithms that consider different linguistic elements of both author names and Twitter names, followed by a rule-based scoring system that weights the co-occurrence of several elements related to the names, individual characteristics and activities of the matched Twitter users and scholars.
Our results indicate that about 2% of the overall population of scholars in the Web of Science is active on Twitter. By domain, we find a strong presence of researchers from the Social Sciences and the Humanities; the Natural Sciences show the lowest share of scholars on Twitter.
Researchers on Twitter also tend to be younger than those who are not on Twitter. As this is a bibliometric-based approach, it is important to highlight the method's reliance on the number of publications produced and tweeted by scholars; accordingly, the share of scholars on Twitter ranges between 1% and 5% depending on their level of productivity. Further research is suggested to improve and expand the methodology.
URL : https://arxiv.org/abs/1712.05667
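A minimal sketch of the kind of name matching and rule-based scoring the authors describe might look as follows; the normalization steps and weights are invented for illustration and are not taken from the paper.

```python
import re
import unicodedata

def normalize(name):
    """Lowercase, strip accents and punctuation for crude comparison."""
    name = unicodedata.normalize("NFKD", name)
    name = "".join(c for c in name if not unicodedata.combining(c))
    return re.sub(r"[^a-z ]", "", name.lower()).strip()

def match_score(author_name, twitter_name, twitter_handle):
    """Rule-based score weighting shared name elements (weights invented)."""
    author_parts = set(normalize(author_name).split())
    twitter_parts = set(normalize(twitter_name).split())
    score = 2 * len(author_parts & twitter_parts)   # shared name tokens
    parts = normalize(author_name).split()
    surname = parts[-1] if parts else ""
    if surname and surname in normalize(twitter_handle):
        score += 1                                  # surname inside handle
    return score

s = match_score("Rodrigo Costas", "Rodrigo Costas", "rcostas")  # 5
```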
Journals were central to Eugene Garfield’s research interests. Among other things, journals are considered as units of analysis for bibliographic databases such as the Web of Science and Scopus. In addition to providing a basis for disciplinary classifications of journals, journal citation patterns span networks across boundaries to variable extents.
Using betweenness centrality (BC) and diversity, we elaborate on the question of how to distinguish and rank journals in terms of interdisciplinarity. Interdisciplinarity, however, is difficult to operationalize in the absence of an operational definition of disciplines; the diversity of a unit of analysis is sample-dependent. BC can be considered as a measure of multi-disciplinarity.
Diversity of co-citation in a citing document has been considered as an indicator of knowledge integration, but an author can also generate trans-disciplinary—that is, non-disciplined—variation by citing sources from other disciplines.
Diversity in the bibliographic coupling among citing documents can analogously be considered as diffusion or differentiation of knowledge across disciplines. Because the citation networks in the cited direction reflect both structure and variation, diversity in this direction is perhaps the best available measure of interdisciplinarity at the journal level.
Furthermore, diversity is based on a summation and can therefore be decomposed; differences among (sub)sets can be tested for statistical significance. In the appendix, a general-purpose routine for measuring diversity in networks is provided.
Title : Betweenness and diversity in journal citation networks as measures of interdisciplinarity—A tribute to Eugene Garfield
DOI : https://doi.org/10.1007/s11192-017-2528-2
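The decomposability that the appendix exploits can be seen with plain Shannon entropy over a journal's citation distribution across categories; note that the paper's diversity measures additionally weight disparity between categories, so this is only the variety/balance component, and the categories and counts below are invented.

```python
from math import log

def shannon_diversity(counts):
    """Shannon entropy of a citation distribution over categories.
    counts: mapping category -> number of citations."""
    total = sum(counts.values())
    h = 0.0
    for n in counts.values():
        if n > 0:
            p = n / total
            h -= p * log(p)
    return h

focused = {"LIS": 100}                            # cites one field only
spread = {"LIS": 40, "CS": 30, "Sociology": 30}   # cites across fields
h_focused = shannon_diversity(focused)            # 0.0
h_spread = shannon_diversity(spread)              # ~1.09
```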
Author : Teresa Schultz
The open access movement seeks to encourage all researchers to make their works openly available and free of paywalls so more people can access their knowledge. Yet some researchers who study open access (OA) continue to publish their work in paywalled journals and fail to make it open.
This project set out to study just how many published research articles about OA fall into this category, how many are being made open (whether by being published in a gold OA or hybrid journal or through open deposit), and how library and information science authors compare to other disciplines researching this field.
Because of the growth of tools available to help researchers find open versions of articles, this study also sought to compare these new tools with Google Scholar in their ability to disseminate OA research.
From a sample collected from Web of Science of articles published since 2010, the study found that although a majority of research articles about OA are open in some form, a little more than a quarter are not.
A smaller share of library science researchers made their work open compared to non-library science researchers. Looking at the copyright of articles published in hybrid and open journals, authors were more likely to retain copyright ownership if they published in an open journal than were authors in hybrid journals.
Articles were more likely to be published with a Creative Commons license if published in an open journal compared to those published in hybrid journals.
Title : Practicing What You Preach: Evaluating Access of Open Access Research
DOI : https://dx.doi.org/10.17605/OSF.IO/YBDR8
Funding acknowledgements found in scientific publications have been used to study the impact of funding on research since the 1970s. However, no broad-scale indexation of this paratextual element existed until 2008, when Thomson Reuters' Web of Science started to add funding acknowledgement information to its bibliographic records.
As this new information provides a new dimension to bibliometric data that can be systematically exploited, it is important to understand the characteristics of these data and the underlying implications for their use.
This paper analyses the presence and distribution of funding acknowledgement data covered in Web of Science.
Our results show that prior to 2009, funding acknowledgement coverage is extremely low and therefore not reliable. Since 2008, funding information has been collected mainly for publications indexed in the Science Citation Index Expanded (SCIE); more recently (2015), inclusion of funding texts for publications indexed in the Social Sciences Citation Index (SSCI) has been implemented.
Arts & Humanities Citation Index (AHCI) content is not indexed for funding acknowledgement data. Moreover, English-language publications are the most reliably covered.
Finally, not all types of documents are equally covered for funding information indexation and only articles and reviews show consistent coverage.
The characterization of the funding acknowledgement information collected by Thomson Reuters can therefore help understand the possibilities offered by the data but also their limitations.
URL : http://arxiv.org/abs/1604.04780
Journal classification systems play an important role in bibliometric analyses. The two most important bibliographic databases, Web of Science and Scopus, each provide a journal classification system. However, no study has systematically investigated the accuracy of these classification systems. To examine and compare the accuracy of journal classification systems, we define two criteria on the basis of direct citation relations between journals and categories.
We use Criterion I to select journals that have weak connections with their assigned categories, and we use Criterion II to identify journals that are not assigned to categories with which they have strong connections. If a journal satisfies either of the two criteria, we conclude that its assignment to categories may be questionable. Accordingly, we identify all journals with questionable classifications in Web of Science and Scopus.
Furthermore, we perform a more in-depth analysis for the field of Library and Information Science to assess whether our proposed criteria are appropriate and whether they yield meaningful results. It turns out that according to our citation-based criteria Web of Science performs significantly better than Scopus in terms of the accuracy of its journal classification system.
URL : http://arxiv.org/abs/1511.00735
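The two criteria can be sketched as citation-share checks; the 10% threshold and the sample numbers below are hypothetical, chosen only to illustrate the logic, not the thresholds used in the paper.

```python
def questionable(citation_counts, assigned):
    """Flag a journal whose category assignment looks questionable.
    citation_counts: category -> citations between journal and category.
    assigned: categories the database assigns the journal to.
    Criterion I: an assigned category gets a weak share (< 10%, invented).
    Criterion II: the strongest-connected category is not assigned."""
    total = sum(citation_counts.values())
    if total == 0:
        return False
    shares = {c: n / total for c, n in citation_counts.items()}
    weak_assignment = any(shares.get(c, 0.0) < 0.10 for c in assigned)
    strongest = max(shares, key=shares.get)
    return weak_assignment or strongest not in assigned

cites = {"LIS": 50, "Computer Science": 45, "History": 5}
flagged = questionable(cites, assigned={"LIS", "History"})        # True
ok = questionable({"LIS": 60, "CS": 40}, assigned={"LIS", "CS"})  # False
```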