The impact of the open-access status on journal indices: a review of medical journals

Authors : Saif Aldeen AlRyalat, Mohammad Saleh, Mohammad Alaqraa, Alaa Alfukaha, Yara Alkayed, Maryann Abaza, Hadeel Abu Saa, Mohamed Alshamiry


Over the past few decades, the number of open access (OA) journals has increased in almost all disciplines. This increase in OA journals was accompanied by an increase in funding to support this movement.

Medical fields are among the most highly funded, which has further encouraged their journals to move toward OA publishing. Here, we aim to compare OA and non-OA journals in terms of citation metrics and other indices.


We collected data on the included journals from the Scopus Source List on 1st November 2018. We filtered the list for medical journals only. For each journal, we extracted data regarding citation metrics, scholarly output, and whether the journal is OA or non-OA.


On the 2017 Scopus list of journals, there were 5,835 medical journals. Upon analyzing the difference between medical OA and non-OA journals, we found that OA journals had a significantly higher CiteScore (p < 0.001), percent cited (p < 0.001), and source normalized impact per paper (SNIP) (p < 0.001), whereas non-OA journals had higher scholarly output (p < 0.001).

Among the five largest journal publishers, Springer Nature published the highest frequency of OA articles (31.5%), while Wiley-Blackwell had the lowest frequency among its medical journals (4.4%).


Among medical journals, although non-OA journals still have higher output in terms of articles per year, OA journals have higher citation metrics.

URL : The impact of the open-access status on journal indices: a review of medical journals

Transformations disciplinaires en Littérature et Sciences Humaines à l’heure numérique

Author : Xavier-Laurent Salvador

The humanities and social sciences (LSHS) share a disciplinary specificity: the study of the documentary object. That object is no longer preserved in a library but is consulted in a chemically stable form of the silicon that constitutes its substance.

The emergence of new epistemological sources and the multiplication of the technological skills required to run an editorial chain foster a new dynamic, the Digital Humanities, which at this stage could be defined as a discipline whose object of study is, strictly speaking, the digital document itself, possibly perceived as a second original, and whose aim is a free, shared digital encyclopedism that breaks with the processes of scholarly publishing.


Opening Up Open Access Institutional Repositories to Demonstrate Value: Two Universities’ Pilots on Including Metadata-Only Records

Authors : Karen Bjork, Rebel Cummings-Sauls, Ryan Otto


Institutional repository managers are continuously looking for new ways to demonstrate the value of their repositories. One way to do this is to create a more inclusive repository that provides reliable information about the research output produced by faculty affiliated with the institution.


This article details two pilot projects that evaluated how their repositories could track faculty research output through the inclusion of metadata-only (no full-text) records.

The purpose of each pilot project was to determine the feasibility of including such records and to assess their long-term impact on the repository’s mission statement, staffing, and collection development policies.


This article shares the results of the pilot project and explores the impact for faculty and end users as well as the implications for repositories.

URL : Opening Up Open Access Institutional Repositories to Demonstrate Value: Two Universities’ Pilots on Including Metadata-Only Records


Blockchain and OECD data repositories: opportunities and policymaking implications

Authors : Miguel-Angel Sicilia, Anna Visvizi


The purpose of this paper is to employ the case of Organization for Economic Cooperation and Development (OECD) data repositories to examine the potential of blockchain technology in the context of addressing basic contemporary societal concerns, such as transparency, accountability and trust in the policymaking process. Current approaches to sharing data employ standardized metadata, in which the provider of the service is assumed to be a trusted party.

However, derived data, analytic processes, or links to policies are in many cases not shared in the same form, thus breaking the provenance trace and making it difficult to repeat analyses conducted in the past. Similarly, it becomes tricky to test whether the conditions that justified implemented policies still apply.

A higher level of reuse would require a decentralized approach to sharing both data and analytic scripts and software. This could be supported by a combination of blockchain and decentralized file system technology.


The findings presented in this paper have been derived from an analysis of a case study, i.e., analytics using data made available by the OECD. The set of data the OECD provides is vast and is used broadly.

The argument is structured as follows. First, the current issues and topics shaping the debate on blockchain are outlined. Then, the main artifacts on which simple or complex analytic results are based are redefined for some concrete purposes.

The requirements on provenance, trust and repeatability are discussed with regard to the proposed architecture, and a proof of concept using smart contracts is used to reason about relevant scenarios.


A combination of decentralized file systems and an open blockchain such as Ethereum supporting smart contracts can ascertain that the set of artifacts used for the analytics is shared. This enables the sequence underlying the successive stages of research and/or policymaking to be preserved.
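As a rough illustration of this architecture (a minimal sketch, not the paper's actual proof of concept; all names and data are hypothetical), the following Python code models the two ingredients the abstract names: content-addressed storage of artifacts, as a decentralized file system would provide, and an append-only registry of provenance records, standing in for an on-chain smart contract:

```python
import hashlib


def content_address(data: bytes) -> str:
    """Derive a content hash, as a decentralized file system would address a file."""
    return hashlib.sha256(data).hexdigest()


class ProvenanceRegistry:
    """Toy stand-in for an on-chain registry: an append-only list of records
    linking each derived artifact to the inputs it was computed from."""

    def __init__(self):
        self._ledger = []  # append-only; records are never mutated or removed

    def register(self, artifact: bytes, parents=()) -> str:
        record = {
            "artifact": content_address(artifact),
            "parents": list(parents),  # hashes of source data / analytic scripts
        }
        self._ledger.append(record)
        return record["artifact"]

    def trace(self, artifact_hash: str) -> list:
        """Walk the provenance chain from a result back to its raw inputs."""
        by_hash = {r["artifact"]: r for r in self._ledger}
        seen, stack = [], [artifact_hash]
        while stack:
            h = stack.pop()
            seen.append(h)
            stack.extend(by_hash.get(h, {}).get("parents", []))
        return seen


# Example: raw dataset + analysis script -> result table, with full provenance
reg = ProvenanceRegistry()
data_h = reg.register(b"raw indicator data")
script_h = reg.register(b"analysis script v1")
result_h = reg.register(b"result table", parents=[data_h, script_h])

# Tracing the result recovers every artifact the conclusion depends on
assert set(reg.trace(result_h)) == {result_h, data_h, script_h}
```

Because each record references its parents by content hash, changing any input changes its hash and visibly breaks the chain, which is the property that makes ex post verification of evidence possible.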

This suggests that, in turn and ex post, it becomes possible to test whether the evidence supporting certain findings and/or policy decisions still holds. Moreover, unlike traditional databases, blockchain technology makes it possible to store immutable records.

This means that the artifacts can be used for further exploitation or repetition of results. In practical terms, the use of blockchain technology creates the opportunity to enhance the evidence-based approach to policy design and policy recommendations that the OECD fosters.

That is, it might enable the stakeholders not only to use the data available in the OECD repositories but also to assess corrections to a given policy strategy or modify its scope.

Research limitations/implications

Blockchains and related technologies are still maturing, and several questions related to their use and potential remain underexplored. Several issues require particular consideration in future research, including anonymity, scalability and stability of the data repository.

This research took OECD data repositories as an example precisely to make the point that more research and closer dialogue between the research and policymaking communities are needed to embrace the challenges and opportunities that blockchain technology generates.

Several questions that this research prompts have not been addressed. For instance, the question of how the sharing-economy concept could be applied to the specifics of this case in the context of blockchain has not been dealt with.

Practical implications

The practical implications of the research presented here can be summarized in two ways. On the one hand, by suggesting how a combination of decentralized file systems and an open blockchain, such as Ethereum supporting smart contracts, can ascertain that artifacts are shared, this paper paves the way toward a discussion on how to make this approach and solution reality.

The approach and architecture proposed in this paper would provide a way to increase the scope of the reuse of statistical data and results and thus would improve the effectiveness of decision making as well as the transparency of the evidence supporting policy.

Social implications

Decentralizing analytic artifacts will add to existing open data practices an additional layer of benefits for different actors, including but not limited to policymakers, journalists, analysts and/or researchers without the need to establish centrally managed institutions.

Moreover, due to the degree of decentralization and the absence of a single entry point, the vulnerability of data repositories to cyberthreats might be reduced. Simultaneously, by ensuring that artifacts derived from data held in those distributed repositories are made immutable therein, full reproducibility of conclusions concerning the data becomes possible.

In the field of data-driven policymaking processes, it might allow policymakers to devise more accurate ways of addressing pressing issues and challenges.


This paper offers the first blueprint of a form of sharing that complements open data practices with the decentralized approach of blockchain and decentralized file systems.

The case of OECD data repositories is used to highlight that while data storing is important, the real added value of blockchain technology rests in the possible change on how we use the data and data sets in the repositories. It would eventually enable a more transparent and actionable approach to linking policy up with the supporting evidence.

From a different angle, throughout the paper the case is made that not only the data but also the artifacts from the analyses conducted should be made persistent on a blockchain.

What is at stake is the full reproducibility of conclusions based on a given set of data, coupled with the possibility of ex post testing the validity of the assumptions and evidence underlying those conclusions.


Do open educational resources improve student learning? Implications of the access hypothesis

Authors : Phillip J. Grimaldi, Debshila Basu Mallick, Andrew E. Waters, Richard G. Baraniuk

Open Educational Resources (OER) have been lauded for their ability to reduce student costs and improve equity in higher education. Research examining whether OER provide learning benefits has produced mixed results, with most studies showing null effects.

We argue that the common methods used to examine OER efficacy are unlikely to detect positive effects, based on the predictions of the access hypothesis. The access hypothesis states that OER benefit learning by providing access to critical course materials, and therefore predicts that OER should only benefit students who would not otherwise have access to those materials.

Through the use of simulation analysis, we demonstrate that even if there is a learning benefit of OER, standard research methods are unlikely to detect it.
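The dilution argument behind this kind of simulation can be sketched in a few lines of Python (a toy illustration with hypothetical parameters, not the authors' actual analysis): if only the minority of students who lack access gain from OER, a whole-class comparison averages that gain away.

```python
import random
import statistics

random.seed(0)


def simulate_course(n=1000, no_access_rate=0.2, boost=10.0):
    """Simulate one course comparison under the access hypothesis.

    Scores are drawn from N(70, 10). In the OER section, only the fraction
    of students who would otherwise lack the materials receives the boost.
    Returns the observed difference in mean scores (OER minus control).
    """
    control = [random.gauss(70, 10) for _ in range(n)]
    oer = [
        random.gauss(70, 10) + (boost if random.random() < no_access_rate else 0)
        for _ in range(n)
    ]
    return statistics.mean(oer) - statistics.mean(control)


# A 10-point benefit for 20% of students shows up as only about a
# 2-point average difference, easily lost in within-class noise.
diff = simulate_course()
```

Under these assumed parameters, the true per-student effect (10 points) is five times larger than the whole-class effect a standard comparison would estimate, which is why such studies tend toward null results.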

URL : Do open educational resources improve student learning? Implications of the access hypothesis


Sci-Hub, a challenge for academic and research libraries

Authors : Llarina González-Solar, Viviana Fernández-Marcial

Sci-Hub emerged into the field of scientific communication in 2011 as a platform for free access to scientific papers. It is the most popular of the so-called shadow libraries, systems that overcome the limits of legal access to scientific publications, standing apart from the open access movement.

Beyond the media coverage that has served to boost its popularity, several studies reveal the impact of Sci-Hub among researchers, who have embraced this initiative. Sci-Hub has revealed new forms of access to scientific information, affecting academic and research libraries, which cannot remain on the sidelines.

This study addresses the Sci-Hub phenomenon and its implications for academic and research libraries from different points of view, through a bibliographic review and an analysis of examples of action.


How to Fight Fair Use Fear, Uncertainty, and Doubt : The Experience of One Open Educational Resource

Author : Lindsey Weeramuni

At the launch of one of the early online open educational resources (OER) in 2002, the approach to addressing copyright was uncertain. Did the university or the faculty own their material? How would the third-party material be handled? Was all of its use considered fair use under Section 107 of the U.S. Copyright Act (Title 17, United States Code) because of its educational purpose?

Or was permission-seeking necessary for this project to succeed and protect the integrity of faculty and university? For many years, this OER was conservative in its approach to third-party material, avoiding making fair use claims on the theory that it was too risky and difficult to prove in the face of an infringement claim.

Additionally, being one of the early projects of its kind, there was fear of becoming a target for ambitious copyright holders wanting to make headlines (and perhaps win lawsuits). It was not until 2009 that the Code of Best Practices in Fair Use for OpenCourseWare was written by a community of practitioners who believed that if fair use worked for documentary filmmakers, video creators, and others (including big media), it worked in open education as well.

Once this Code was adopted, universities and institutions were able to offer richer and more complete course content to their users than before. This paper explains how it happened at this early open educational resource offering.

URL : How to Fight Fair Use Fear, Uncertainty, and Doubt : The Experience of One Open Educational Resource