Correcting duplicate publications: follow up study of MEDLINE tagged duplications

Authors : Mario Malički, Ana Utrobičić, Ana Marušić


As MEDLINE indexers tag similar articles as duplicates even when journals have not addressed the duplication(s), we sought to determine the reasons behind the tagged duplications and whether the journals had undertaken, or planned to undertake, any actions to address them.

Materials and methods

On 16 January 2013, we extracted all tagged duplicate publications (DPs), analysed published notices, and then contacted MEDLINE and editors regarding cases unaddressed by notices.

For non-respondents, we compared the full texts of the articles. We followed the cases for the next 5 years to see whether any changes occurred.


We found 1011 indexed DPs, which represented 555 possible DP cases (in MEDLINE, both the original and the duplicate are assigned a DP tag). Six cases were excluded as we could not obtain their full text.

An additional 190 (35%) cases were incorrectly tagged as DPs. Of the 359 actual cases of DPs, 200 (54%) were due to publishers’ actions (e.g. identical publications in the same journal) and 159 (46%) to authors’ actions (e.g. submission of an article to more than one journal). Of the 359 cases, 185 (52%) were addressed by notices, but only 25 (7%) were retracted.

Following our notifications, MEDLINE corrected 138 (73%) incorrectly tagged cases, and editors retracted 8 articles.


Despite clear policies on how to handle DPs, just half (54%) of the DPs in MEDLINE were addressed by journals and only 9% retracted. Publishers, editors, and indexers need to develop and implement standards for better correction of duplicate published records.

URL : Correcting duplicate publications: follow up study of MEDLINE tagged duplications


Merits and Limits: Applying open data to monitor open access publications in bibliometric databases

Authors : Aliakbar Akbaritabar, Stephan Stahlschmidt

Identifying and monitoring Open Access (OA) publications might seem a trivial task, but practical efforts prove otherwise. Contradictory information often arises depending on the metadata employed.

We strive to assign OA status to publications in Web of Science (WOS) and Scopus, complementing them with different sources of OA information to resolve contradictory cases.

We linked publications from WOS and Scopus via DOIs and ISSNs to Unpaywall, Crossref, DOAJ and ROAD. Only about 50% of articles and reviews from WOS and Scopus could be matched via a DOI to Unpaywall.
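The DOI-based linkage described above can be illustrated with a minimal sketch. This is not the authors' code: the record structures and field names are hypothetical, and the only assumption drawn from practice is that DOIs are case-insensitive and often appear with URL or `doi:` prefixes, so they must be normalized before joining.

```python
# Illustrative sketch (field names hypothetical): match bibliographic
# records from a citation database to an OA source by normalized DOI.

def normalize_doi(doi):
    """DOIs are case-insensitive; strip common prefixes and lowercase."""
    if not doi:
        return None
    doi = doi.strip().lower()
    for prefix in ("https://doi.org/", "http://dx.doi.org/", "doi:"):
        if doi.startswith(prefix):
            doi = doi[len(prefix):]
    return doi or None

def match_by_doi(wos_records, unpaywall_records):
    """Return (matched pairs, unmatched WOS records)."""
    # Index the OA records by normalized DOI for O(1) lookup.
    oa_index = {normalize_doi(r["doi"]): r for r in unpaywall_records
                if normalize_doi(r["doi"])}
    matched, unmatched = [], []
    for rec in wos_records:
        key = normalize_doi(rec.get("doi"))
        if key in oa_index:
            matched.append((rec, oa_index[key]))
        else:
            unmatched.append(rec)  # no DOI, or DOI absent from OA source
    return matched, unmatched
```

Records lacking a DOI can never match, which is one reason only about half of the articles could be linked to Unpaywall in the study.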

Matching with Crossref yielded 56 distinct licences, which in many cases define the legally binding access status of publications. However, only 44% of publications hold a single licence on Crossref, while more than 50% have no licence information deposited in Crossref.

Contrasting OA information from Crossref licences with Unpaywall, we found contradictory cases amounting overall to more than 25%, which might be partially explained by the inclusion or exclusion of green OA.
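The effect of including or excluding green OA on the contradiction rate can be sketched as follows. This is an illustrative assumption-laden example, not the authors' method: the record fields, the licence heuristic, and the status labels are hypothetical stand-ins for the real Crossref and Unpaywall metadata.

```python
# Illustrative sketch (fields and heuristic are assumptions): flag records
# where licence-derived OA status contradicts the Unpaywall verdict.

OPEN_LICENCE_PREFIXES = (
    "https://creativecommons.org/licenses/",  # CC BY, CC BY-SA, ...
)

def oa_from_licences(licence_urls):
    """True if any licence implies OA, False if none do,
    None if no licence information is available."""
    if not licence_urls:
        return None
    return any(url.startswith(OPEN_LICENCE_PREFIXES) for url in licence_urls)

def find_contradictions(records, count_green_as_oa=True):
    """records: dicts with 'licences' (licence URLs) and 'upw_status'
    ('gold', 'green', 'bronze', 'hybrid' or 'closed')."""
    contradictions = []
    for rec in records:
        lic_oa = oa_from_licences(rec.get("licences"))
        if lic_oa is None:
            continue  # no licence data: nothing to contrast
        status = rec["upw_status"]
        # Toggling green OA in or out changes the contradiction count.
        upw_oa = status != "closed" and (count_green_as_oa or status != "green")
        if lic_oa != upw_oa:
            contradictions.append(rec)
    return contradictions
```

Running the same data with `count_green_as_oa` toggled shows how a repository-only (green) copy can turn an agreement into a contradiction, which is the mechanism the abstract points to.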

A further manual check found that about 17% of OA publications are not accessible and 15% of non-OA publications are accessible through publishers’ websites. These preliminary results suggest that identifying the OA status of publications remains a difficult and currently unfulfilled task.


Open data to evaluate academic researchers: an experiment with the Italian Scientific Habilitation

Authors : Angelo Di Iorio, Silvio Peroni, Francesco Poggi

The need for scholarly open data is ever increasing. While there are large repositories of open access articles and free publication indexes, there are still few examples of free citation networks, and their coverage is partial.

One consequence is that most evaluation processes based on citation counts rely on commercial citation databases. Things are changing under the pressure of the Initiative for Open Citations (I4OC), whose goal is to campaign for scholarly publishers to make their citation data fully open.

This paper investigates the growth of open citations with an experiment on the Italian Scientific Habilitation, the national process for qualifying university professors in Italy, which currently relies on data from commercial indexes.

We simulated the procedure using only open data and explored similarities and differences with the official results. The outcomes of the experiment show that the amount of open citation data currently available is not yet sufficient to obtain similar results.


Improving the discoverability and web impact of open repositories: techniques and evaluation

Author : George Macgregor

In this contribution we experiment with a suite of repository adjustments and improvements performed on Strathprints, the institutional repository of the University of Strathclyde, Glasgow, powered by EPrints 3.3.13.

These adjustments were designed to support improved repository web visibility and user engagement, thereby improving usage. Although the experiments were performed on EPrints it is thought that most of the adopted improvements are equally applicable to any other repository platform.

Following preliminary results reported elsewhere, and using Strathprints as a case study, this paper outlines the approaches implemented, reports on comparative search traffic data and usage metrics, and delivers conclusions on the efficacy of the techniques implemented.

The evaluation provides persuasive evidence that specific enhancements to technical aspects of a repository can result in significant improvements to repository visibility, resulting in a greater web impact and consequent increases in content usage.

COUNTER usage grew by 33% and traffic to Strathprints from Google and Google Scholar was found to increase by 63% and 99% respectively. Other insights from the evaluation are also explored.

The results are likely to positively inform the work of repository practitioners and open scientists.


A Principled Approach to Online Publication Listings and Scientific Resource Sharing

Authors : Jacquelijn Ringersma, Karin Kastens, Ulla Tschida, Jos van Berkum

The Max Planck Institute (MPI) for Psycholinguistics has developed a service to manage and present the scholarly output of its researchers. The PubMan database manages publication metadata and the full texts of publications published by its scholars.

All relevant information regarding a researcher’s work is brought together in this database, including supplementary materials and links to the MPI database for primary research data.

The PubMan metadata is harvested into the MPI website CMS (Plone). The system developed for creating the publication lists allows researchers to compile a selection of the harvested data in a variety of formats.


“When You Use Social Media You Are Not Working”: Barriers for the Use of Metrics in Social Sciences

Authors : Steffen Lemke, Maryam Mehrazar, Athanasios Mazarakis, Isabella Peters

The Social Sciences have long struggled with quantitative forms of research assessment: insufficient coverage in prominent citation indices and overall lower citation counts than in STM subject areas have led to a widespread wariness regarding bibliometric evaluations among social scientists.

Fueled by the rise of the social web, new hope is often placed on alternative metrics that measure the attention scholarly publications receive online, in particular on social media. But almost a decade after the coining of the term altmetrics for this new group of indicators, the uptake of the concept in the Social Sciences still seems to be low.

Just like with traditional bibliometric indicators, one central problem hindering the applicability of altmetrics for the Social Sciences is the low coverage of social science publications in the respective data sources, which in the case of altmetrics are the various social media platforms on which interactions with scientific outputs can be measured.

Another reason is that social scientists have strong opinions about the usefulness of metrics for research evaluation which may hinder broad acceptance of altmetrics too. We conducted qualitative interviews and online surveys with researchers to identify the concerns which inhibit the use of social media and the utilization of metrics for research evaluation in the Social Sciences.

By analyzing the interview responses in conjunction with the survey responses, we identify the key concerns that inhibit social scientists from (1) using social media for professional purposes and (2) making use of the wide array of metrics available.

Our findings show that aspects of time consumption, privacy, dealing with information overload, and prevalent styles of communication are predominant concerns inhibiting Social Science researchers from using social media platforms for their work.

Regarding indicators of research impact, we identify a widespread lack of knowledge about existing metrics, their methodologies, and their meanings as a major hindrance to their uptake by social scientists.

The results have implications for future developments of scholarly online tools and show that researchers could benefit considerably from additional formal training regarding the correct application and interpretation of metrics.

URL : “When You Use Social Media You Are Not Working”: Barriers for the Use of Metrics in Social Sciences


Few Open Access Journals are Plan S Compliant

Authors : Jan Erik Frantsvåg, Tormod Eismann Strømme

Much of the debate on Plan S seems to concentrate on how to make toll access journals open access, taking for granted that existing open access journals are Plan S compliant.

We suspected this was not so, and set out to explore this using DOAJ’s journal metadata. We conclude that an overwhelmingly large majority of open access journals are not Plan S compliant, and that it is small HSS publishers not charging APCs that are least compliant and will face major challenges with becoming compliant.

Plan S needs to give special consideration to smaller publishers and/or non-APC-based journals.

URL : Few Open Access Journals are Plan S Compliant

Alternative location :