Opening Review in LIS Journals: A Status Report

Author : Emily Ford


Peer-review practices in scholarly publishing are changing. Digital publishing mechanisms allow for open peer review, a peer-review process in which author and reviewer identities are disclosed to one another.

This model of peer review is increasingly implemented in scholarly publishing. In science, technology, engineering, and math (STEM) disciplines, open peer review is implemented in journal publishing processes, and, in the humanities and social sciences, it is often coupled with new scholarship practices, such as the digital humanities.

This article reports findings from an exploratory study on peer-review and publishing practices in Library and Information Science (LIS), focusing on LIS’s relationships with open peer review.


Editors of LIS journals were surveyed regarding journal peer review and publishing practices.


This article reports the general “pulse” of attitudes and conversations regarding open peer review and discusses its challenges in LIS. Results show an ideological split between traditionally published journals and open access and association-affiliated journals. Open access and association-affiliated journal editors are more likely to consider investigating open peer review.


The LIS community of journal editors, authors, reviewers, and readers needs to discuss open peer review, as well as experiment with it. Experiments with open peer review in scholarly LIS publishing will inform our praxis as librarians.

URL : Opening Review in LIS Journals: A Status Report


Researchers’ Individual Publication Rate Has Not Increased in a Century

Authors : Daniele Fanelli, Vincent Larivière

Debates over the pros and cons of a “publish or perish” philosophy have inflamed academia for at least half a century. Growing concerns are expressed, in particular, about policies that reward “quantity” at the expense of “quality,” because these might prompt scientists to unduly multiply their publications by fractioning (“salami slicing”), duplicating, rushing, simplifying, or even fabricating their results.

To assess the reasonableness of these concerns, we analyzed the publication patterns of over 40,000 researchers who, between 1900 and 2013, published two or more papers within 15 years, in any of the disciplines covered by the Web of Science.

The total number of papers published by researchers during their early career period (first fifteen years) has increased in recent decades, but so has their average number of co-authors. If we take the latter factor into account, by measuring productivity fractionally or by only counting papers published as first author, we observe no increase in productivity throughout the century.
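To make the adjustment concrete, here is a minimal sketch of the two corrected measures described above, fractional counting and first-author counting. The record format, names, and numbers are hypothetical illustrations, not the authors’ actual Web of Science pipeline.

```python
# Illustrative sketch of fractional and first-author productivity counting.
# The paper records below are made up; real data would come from Web of Science.

def fractional_count(papers, researcher):
    """Each paper adds 1 / (number of co-authors) to the researcher's tally."""
    return sum(1.0 / len(p["authors"]) for p in papers if researcher in p["authors"])

def first_author_count(papers, researcher):
    """Count only the papers on which the researcher appears as first author."""
    return sum(1 for p in papers if p["authors"][0] == researcher)

papers = [
    {"authors": ["Doe", "Roe", "Poe"]},  # adds 1/3 to Doe's fractional count
    {"authors": ["Doe"]},                # adds 1 fractionally and 1 as first author
]

print(fractional_count(papers, "Doe"))    # 1.333...
print(first_author_count(papers, "Doe"))  # 1
```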

Even after the 1980s, adjusted productivity has not increased for most disciplines and countries. These results are robust to methodological choices and are actually conservative with respect to the hypothesis that publication rates are growing.

Therefore, the widespread belief that pressures to publish are causing the scientific literature to be flooded with salami-sliced, trivial, incomplete, duplicated, plagiarized and false results is likely to be incorrect or at least exaggerated.

URL : Researchers’ Individual Publication Rate Has Not Increased in a Century


Changes in the digital scholarly environment and issues of trust: An exploratory, qualitative analysis

Authors : Anthony Watkinson, David Nicholas, Clare Thornley, Eti Herman, Hamid R. Jamali, Rachel Volentine, Suzie Allard, Kenneth Levine, Carol Tenopir

The paper reports on some of the results of a research project into how changes in digital behaviour and services impact the concepts of trust and authority held by researchers in the sciences and social sciences in the UK and the USA.

Interviews were used in conjunction with focus groups to establish the form and topics of the questions put to a larger international sample in an online questionnaire. The results of these 87 interviews were analysed to determine whether or not attitudes have changed in terms of sources of information used, citation behaviour in choosing references, and dissemination practices.

It was found that there was marked continuity in attitudes, though with an increased emphasis on personal judgement over both established and new metrics. Journals (or books in some disciplines) were more highly respected than other sources and remained the vehicle for formal scholarly communication.

The interviews confirmed that, though an open access model did not in most cases lead to mistrust of a journal, a substantial number of researchers were worried about approaches from so-called predatory OA journals. Established researchers did not, on the whole, use social media in their professional lives, but a question about outreach revealed that it was recognised as effective in reaching a wider audience.

There was a remarkable similarity in practice across research attitudes in all the disciplines covered and in both countries where interviews were held.


What does ‘green’ open access mean? Tracking twelve years of changes to journal publisher self-archiving policies

Authors : Elizabeth Gadd, Denise Troll Covey

Traces the 12-year self-archiving policy journey of the original 107 publishers listed on the SHERPA/RoMEO Publisher Policy Database in 2004, through to 2015. Maps the RoMEO colour codes (‘green’, ‘blue’, ‘yellow’ and ‘white’) and related restrictions and conditions over time.

Finds that while the volume of publishers allowing some form of self-archiving (pre-print, post-print or both) has increased by 12% over the 12 years, the volume of restrictions around how, where and when self-archiving may take place has increased 119%, 190% and 1000% respectively.

A significant positive correlation was found between the increase in self-archiving restrictions and the introduction of Gold paid open access options. Suggests that by conveying only the version of a paper that authors may self-archive, the RoMEO colour codes do not address all the key elements of the Bethesda Definition of Open Access.

Compares the number of RoMEO ‘green’ publishers over time with those meeting the definition for ‘redefined green’ (allowing embargo-free deposit of the post-print in an institutional repository). Finds that RoMEO ‘green’ increased by 8% and ‘redefined green’ decreased by 35% over the 12 years.

Concludes that the RoMEO colour codes no longer convey a commitment to green open access as originally intended. Calls for open access advocates, funders, institutions and authors to redefine what ‘green’ means to better reflect a publisher’s commitment to self-archiving.


Impact de l’Open Access sur les citations : une étude de cas

Authors : Frédérique Bordignon, Mathieu Andro

Many studies in the international literature have sought to assess the impact of Open Access on the citation rate of scientific articles. The present study, written in French, is limited to the 2010 publications of the Ecole des Ponts.

It nevertheless offers a French-speaking professional readership a state of the art of previous studies on the subject, and its originality lies in measuring the average number of citations per month before and after the Open Access “release” of articles, thereby avoiding most of the biases encountered in this type of approach.

In addition to confirming, as many others have before, a clear Open Access advantage in citation rates in computer science, earth and space sciences, engineering, environmental sciences, mathematics, and physics and astronomy, it also shows that an early “release” can have a more favourable impact than a late one in certain disciplinary fields, such as mathematics and physics/astronomy.
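As a rough illustration of the before-and-after comparison described above, the sketch below computes an average number of citations per month before and after an article’s Open Access release; all dates and citation events are hypothetical, not the Ecole des Ponts data.

```python
# Hypothetical illustration of citations-per-month before and after an OA release.
from datetime import date

def months_between(start, end):
    """Whole-month span between two dates (at least 1 to avoid division by zero)."""
    return max((end.year - start.year) * 12 + (end.month - start.month), 1)

def citations_per_month(citation_dates, start, end):
    """Average citations per month received in the half-open window [start, end)."""
    return sum(1 for d in citation_dates if start <= d < end) / months_between(start, end)

published = date(2010, 1, 1)    # publication date (made up)
oa_release = date(2012, 6, 1)   # date the article was made Open Access (made up)
today = date(2015, 1, 1)        # end of the observation window (made up)
citations = [date(2011, 3, 1), date(2012, 8, 1), date(2013, 2, 1), date(2014, 7, 1)]

print(citations_per_month(citations, published, oa_release))  # rate before the OA release
print(citations_per_month(citations, oa_release, today))      # rate after the OA release
```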

URL : Impact de l’Open Access sur les citations : une étude de cas

The Authorship Dilemma: Alphabetical or Contribution?

Authors : Margareta Ackerman, Simina Brânzei

Scientific communities have adopted different conventions for ordering authors on publications.

Are these choices inconsequential, or do they have significant influence on individual authors, the quality of the projects completed, and research communities at large? What are the trade-offs of using one convention over another?

In order to investigate these questions, we formulate a basic two-player game-theoretic model, which already illustrates interesting phenomena that can occur in more realistic settings.

We find that alphabetical ordering can improve research quality, while contribution-based ordering leads to a denser collaboration network and a greater number of publications.

Contrary to the assumption that free riding is a weakness of the alphabetical ordering scheme, this phenomenon can occur under any contribution scheme, and the worst case occurs under contribution-based ordering.

Finally, we show how authors working on multiple projects can cooperate to attain optimal research quality and eliminate free riding given either contribution scheme.


Measuring Scientific Impact Beyond Citation Counts

Authors : Robert M. Patton, Christopher G. Stahl, Jack C. Wells

The measurement of scientific progress remains a significant challenge, exacerbated by the use of multiple types of metrics that are often incorrectly used, overused, or even explicitly abused.

Several metrics, such as the h-index or the journal impact factor (JIF), are often used as a means to assess whether an author, article, or journal creates an “impact” on science. Unfortunately, external forces can be used to manipulate these metrics, thereby diluting the value of their intended, original purpose.

This work highlights these issues and the need to define “impact” more clearly, as well as the need for better metrics that leverage full content analysis of publications.
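For reference, the h-index mentioned above is simple to compute: an author has index h if h of their papers have each received at least h citations. The sketch below uses invented citation counts purely for illustration.

```python
# Illustrative h-index computation; the citation counts are invented.

def h_index(citation_counts):
    """Largest h such that at least h papers have h or more citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([25, 8, 5, 3, 3, 1]))  # 3: only three papers have 4+ citations, so h stops at 3
```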