Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World

Author : Raminta Pranckutė

Nowadays, the importance of bibliographic databases (DBs) has increased enormously, as they are the main providers of publication metadata and bibliometric indicators universally used both for research assessment and for everyday scholarly tasks. Because the reliability of these tasks depends first of all on the data source, all users of these DBs should be able to choose the most suitable one.

Web of Science (WoS) and Scopus are the two main bibliographic DBs. A comprehensive evaluation of the DBs' coverage is practically impossible without extensive bibliometric analyses or literature reviews, but most DB users lack bibliometric expertise and/or are unwilling to invest additional time in such evaluations.

Apart from that, the convenience of a DB's interface, its performance, the impact indicators it provides and its additional tools may also influence users' choice. The main goal of this work is to provide all potential users with an all-inclusive description of the two main bibliographic DBs by gathering in one place the findings presented in the most recent literature and the information provided by the DBs' owners.

This overview should aid all stakeholders employing publication and citation data in selecting the most suitable DB.

DOI : https://doi.org/10.3390/publications9010012

Consistency of interdisciplinarity measures

Authors : Qi Wang, Jesper Wiborg Schneider

Assessing interdisciplinarity is an important and challenging task in bibliometric studies. Previous studies tend to emphasize that the nature and concept of interdisciplinarity are ambiguous and uncertain (e.g., Leydesdorff & Rafols, 2010; Rafols & Meyer, 2010; Sugimoto & Weingart, 2014).

As a consequence, various measures of interdisciplinarity have been proposed. However, few studies have examined the relations between these measures. In this context, this paper aims to systematically review these interdisciplinarity measures and explore their inherent relations.
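
For readers unfamiliar with these measures: one of the most widely used is Rao-Stirling diversity, which combines the variety, balance, and disparity of the subject categories a paper draws on. A minimal sketch of the formula (the category shares and distances below are hypothetical, not from the paper):

```python
def rao_stirling(proportions, distance):
    """Rao-Stirling diversity: sum of p_i * p_j * d_ij over ordered pairs
    of distinct categories, where p_i is the share of a paper's references
    in category i and d_ij is the dissimilarity between categories i and j.
    (Some variants sum unordered pairs, halving the value.)"""
    delta = 0.0
    for i in proportions:
        for j in proportions:
            if i != j:
                delta += proportions[i] * proportions[j] * distance[(i, j)]
    return delta

# Hypothetical reference shares and category distances:
p = {"Physics": 0.5, "Biology": 0.3, "Economics": 0.2}
d = {("Physics", "Biology"): 0.6, ("Biology", "Physics"): 0.6,
     ("Physics", "Economics"): 0.9, ("Economics", "Physics"): 0.9,
     ("Biology", "Economics"): 0.8, ("Economics", "Biology"): 0.8}
print(round(rao_stirling(p, d), 3))  # → 0.456
```

A higher value means references are spread more evenly across more distant categories, which is why seemingly similar measures can rank the same papers differently when they weight variety, balance, or disparity differently.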

We examine these measures in relation to the Web of Science (WoS) journal subject categories (SCs) and to an interdisciplinary research center at Aarhus University.

In line with the conclusion of Digital Science (2016), our results reveal that the current situation of interdisciplinarity measurement in science studies is confusing and unsatisfying. We obtained surprisingly dissimilar results with measures that supposedly should measure similar features.

We suggest that interdisciplinarity as a measurement construct should be used and interpreted with caution in future research evaluation and research policies.

URL : https://arxiv.org/abs/1810.00577

A Multi-dimensional Investigation of the Effects of Publication Retraction on Scholarly Impact

Authors : Xin Shuai, Isabelle Moulinier, Jason Rollins, Tonya Custis, Frank Schilder, Mathilda Edmunds

Over the past few decades, the rate of publication retractions has increased dramatically in academia. In this study, we investigate retractions from a quantitative perspective, aiming to answer two fundamental questions.

First, how do retractions influence the scholarly impact of retracted papers, authors, and institutions? Second, does this influence propagate to the wider academic community through scholarly associations?

Specifically, we analyzed a set of retracted articles indexed in Thomson Reuters Web of Science (WoS), and ran multiple experiments to compare changes in scholarly impact against a control set of non-retracted articles, authors, and institutions.

We further applied the Granger Causality test to investigate whether different scientific topics are dynamically affected by retracted papers occurring within those topics.
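
For readers unfamiliar with the test: a series x is said to Granger-cause a series y when lagged values of x improve a purely autoregressive forecast of y. A minimal sketch of that logic, with a hypothetical toy series (the paper's actual lag choices and topic time series are not given here; for real analyses, `statsmodels.tsa.stattools.grangercausalitytests` implements the full battery of tests):

```python
import numpy as np

def granger_f(y, x, lag=1):
    """Minimal Granger-causality F statistic: does adding `lag` lagged
    values of x improve a lag-`lag` autoregression of y?"""
    n = len(y)
    Y = y[lag:]
    # Restricted model: intercept plus y's own lags.
    Xr = np.column_stack(
        [np.ones(n - lag)] + [y[lag - k - 1:n - k - 1] for k in range(lag)]
    )
    # Unrestricted model: additionally include x's lags.
    Xu = np.column_stack(
        [Xr] + [x[lag - k - 1:n - k - 1] for k in range(lag)]
    )
    rss = lambda X: np.sum((Y - X @ np.linalg.lstsq(X, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Xr), rss(Xu)
    df2 = (n - lag) - Xu.shape[1]  # observations minus parameters
    return ((rss_r - rss_u) / lag) / (rss_u / df2)

# Toy series in which x[t-1] drives y[t]: the F statistic comes out large.
rng = np.random.default_rng(0)
x = rng.standard_normal(200)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 0.3 * y[t - 1] + x[t - 1] + 0.1 * rng.standard_normal()
print(granger_f(y, x, lag=1))
```

A large F statistic (compared against the F distribution with `lag` and `df2` degrees of freedom) rejects the hypothesis that x adds no predictive power for y.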

Our results show two key findings: first, the scholarly impact of retracted papers and authors significantly decreases after retraction, and the most severe impact decrease correlates to retractions based on proven purposeful scientific misconduct; second, this retraction penalty does not seem to spread through the broader scholarly social graph, but instead has a limited and localized effect.

Our findings may provide useful insights for scholars or science committees to evaluate the scholarly value of papers, authors, or institutions related to retractions.

URL : https://arxiv.org/abs/1602.09123

The coverage of Microsoft Academic: Analyzing the publication output of a university

Authors : Sven E. Hug, Martin P. Brändle

This is the first in-depth study on the coverage of Microsoft Academic (MA). The coverage of a verified publication list of a university was analyzed on the level of individual publications in MA, Scopus, and Web of Science (WoS).

Citation counts were analyzed and issues related to data retrieval and data quality were examined. A Perl script was written to retrieve metadata from MA. We find that MA covers journal articles, working papers, and conference items to a substantial extent. MA surpasses Scopus and WoS clearly with respect to book-related document types and conference items but falls slightly behind Scopus with regard to journal articles.

MA shows the same biases as Scopus and WoS with regard to the coverage of the social sciences and humanities, non-English publications, and open-access publications. Rank correlations of citation counts are high between MA and the benchmark databases.
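
Rank correlation here means comparing how the databases order papers by citation count, rather than the raw counts themselves, typically via Spearman's coefficient, i.e., Pearson correlation computed on ranks. A minimal sketch with hypothetical citation counts (no tie handling, unlike the averaged ranks used in practice):

```python
def spearman(a, b):
    """Spearman rank correlation: the Pearson correlation of the ranks.
    (No tie handling -- assumes distinct citation counts.)"""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    ra, rb = ranks(a), ranks(b)
    n = len(a)
    ma, mb = sum(ra) / n, sum(rb) / n
    cov = sum((p - ma) * (q - mb) for p, q in zip(ra, rb))
    va = sum((p - ma) ** 2 for p in ra)
    vb = sum((q - mb) ** 2 for q in rb)
    return cov / (va * vb) ** 0.5

# Hypothetical citation counts for the same five papers in two databases:
ma_counts = [12, 40, 3, 25, 7]   # Microsoft Academic
wos_counts = [10, 35, 2, 30, 5]  # Web of Science
print(round(spearman(ma_counts, wos_counts), 2))  # → 1.0 (same ordering)
```

A value near 1.0, as the study reports between MA and the benchmark databases, means the databases rank papers almost identically even where absolute citation counts differ.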

We find that the publication year is correct for 89.5% of all publications and the number of authors for 95.1% of the journal articles. Given the fast and ongoing development of MA, we conclude that MA is on the verge of becoming a bibliometric superpower. However, comprehensive studies on the quality of MA data are still lacking.

URL : https://arxiv.org/abs/1703.05539

Quantifying the changing role of past publications

Our current societies increasingly rely on electronic repositories of collective knowledge. An archetype of these databases is the Web of Science (WoS) that stores scientific publications. In contrast to several other forms of knowledge — e.g., Wikipedia articles — a scientific paper does not change after its “birth”.

Nonetheless, from the moment a paper is published it exists within the evolving web of other papers, thus, its actual meaning to the reader changes.

To track how scientific ideas (represented by groups of scientific papers) appear and evolve, we apply a novel combination of algorithms explicitly allowing for papers to change their groups. We (i) identify the overlapping clusters of the undirected yearly co-citation networks of the WoS (1975-2008) and (ii) match these yearly clusters (groups) to form group timelines.
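
A co-citation link joins two papers whenever a later paper cites both of them; the yearly networks in step (i) are built from such links. A minimal sketch of the edge-building step, with hypothetical reference lists (the overlapping clustering and timeline matching are beyond a few lines):

```python
from itertools import combinations
from collections import Counter

def cocitation_edges(reference_lists):
    """Weighted co-citation edges: two papers are linked whenever some
    citing paper references both; the weight counts such co-occurrences."""
    edges = Counter()
    for refs in reference_lists:
        for a, b in combinations(sorted(set(refs)), 2):
            edges[(a, b)] += 1
    return edges

# Hypothetical reference lists of three citing papers from one year:
citing = [["p1", "p2", "p3"], ["p1", "p2"], ["p2", "p3"]]
edges = cocitation_edges(citing)
print(edges[("p1", "p2")])  # → 2 (co-cited by the first two papers)
```

Repeating this for each year's citing papers yields the sequence of undirected weighted networks whose clusters the authors then match across years.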

After visualizing the longest-lived groups of the entire data set, we assign topic labels to the groups. We find that in the entire Web of Science, multidisciplinarity is clearly over-represented among cutting-edge ideas. In addition, we provide detailed examples of papers that (i) change their topic labels and (ii) move between groups.

URL : http://arxiv.org/abs/1605.00509

The application of bibliometrics to research evaluation in the humanities and social sciences: an exploratory study using normalized Google Scholar data for the publications of a research institute

In the humanities and social sciences, bibliometric methods for the assessment of research performance are (so far) less common. The current study takes a concrete example in an attempt to evaluate a research institute from the area of social sciences and humanities with the help of data from Google Scholar (GS).

In order to use GS for a bibliometric study, we developed procedures for the normalization of citation impact, building on the procedures of classical bibliometrics. In order to test the convergent validity of the normalized citation impact scores, we calculated normalized scores for a subset of the publications based on data from WoS or Scopus.
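
The basic idea of item-level citation normalization is to divide a paper's citation count by the mean citation count of a reference set of comparable papers (same field, publication year, and document type), so that a score of 1.0 means "cited at the average rate". A minimal sketch with hypothetical numbers (the authors' GS-specific procedures are more involved):

```python
def normalized_impact(citations, reference_set):
    """Item-level normalized citation score: the paper's citation count
    divided by the mean citation count of comparable papers (same field,
    year, and document type). 1.0 means cited at the average rate."""
    mean = sum(reference_set) / len(reference_set)
    return citations / mean

# Hypothetical paper with 12 citations; comparable papers average 8:
print(normalized_impact(12, [4, 8, 12, 8]))  # → 1.5
```

Because both GS-based and WoS/Scopus-based scores are built this way, they can be compared directly even though the underlying raw citation counts differ between databases.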

Although the scores calculated from GS and from WoS/Scopus are not identical for the different publication types considered here, they are similar enough to yield the same assessment of the institute investigated in this study: for example, the institute's papers whose journals are covered in WoS are cited at about an average rate (compared with the other papers in those journals).

URL : https://figshare.com/articles/The_application_of_bibliometrics_to_research_evaluation_in_the_humanities_and_social_sciences_an_exploratory_study_using_normalized_Google_Scholar_data_for_the_publications_of_a_research_institute/1293588