Citation.js: a format-independent, modular bibliography tool for the browser and command line

Author : Lars G Willighagen

Background

Given the vast number of standards and formats for bibliographical data, any program working with bibliographies and citations has to be able to interpret such data. This paper describes the development of Citation.js (https://citation.js.org/), a tool to parse and format according to those standards.

The program follows modern guidelines for software in general and JavaScript in particular, such as version control, source code analysis, integration testing and semantic versioning.

Results

The result is an extensible tool that has already seen adoption in a variety of environments and use cases: as part of a server-side page generator of a publishing platform, as part of a local extensible document generator, and as part of an in-browser converter of extracted references.

Use cases range from transforming a list of DOIs or Wikidata identifiers into a BibTeX file on the command line, to displaying RIS references on a webpage with added Altmetric badges, to generating “How to cite this” sections on a blog.

The accuracy of conversions is currently 27% for properties and 60% for types on average, and a typical initialization takes 120 ms in browsers and 1 s with Node.js on the command line.

Conclusions

Citation.js is a library supporting various formats of bibliographic information in a broad selection of use cases and environments. Given the support for plugins, more formats can be added with relative ease.


DOI : https://doi.org/10.7287/peerj.preprints.27466v2

Share or perish: Social media and the International Journal of Mental Health Nursing

Authors : Paul McNamara, Kim Usher

The impact of published research is sometimes measured by the number of citations an individual article accumulates. However, the time from publication to citation can be extensive. Years may pass before authors are able to measure the impact of their publication. Social media provides individuals and organizations a powerful medium with which to share information.

The power of social media is sometimes harnessed to share scholarly works, especially journal article citations and quotes. A non‐traditional bibliometric is required to understand the impact social media has on disseminating scholarly works/research.

The International Journal of Mental Health Nursing (IJMHN) appointed a social media editor as of 1 January 2017 to implement a strategy to increase the impact and reach of the journal’s articles.

To measure the impact of the IJMHN social media strategy, quantitative data for the eighteen months prior to the social media editor’s start date and the eighteen months after that date (i.e., from 1 July 2015 to 30 June 2018) were acquired and analysed.

Quantitative evidence demonstrates the effectiveness of one journal’s social media strategy in increasing the reach and readership of the articles it publishes.

This information may be of interest to those considering where to publish their research, those wanting to amplify the reach of their research, those who fund research, and journal editors and boards.

DOI : https://doi.org/10.1111/inm.12600

Comparing quality of reporting between preprints and peer-reviewed articles in the biomedical literature

Authors : Clarissa F. D. Carneiro, Victor G. S. Queiroz, Thiago C. Moulin, Carlos A. M. Carvalho, Clarissa B. Haas, Danielle Rayêe, David E. Henshall, Evandro A. De-Souza, Felippe Espinelli, Flávia Z. Boos, Gerson D. Guercio, Igor R. Costa, Karina L. Hajdu, Martin Modrák, Pedro B. Tan, Steven J. Burgess, Sylvia F. S. Guerra, Vanessa T. Bortoluzzi, Olavo B. Amaral

Preprint usage is growing rapidly in the life sciences; however, questions remain on the relative quality of preprints when compared to published articles. An objective dimension of quality that is readily measurable is completeness of reporting, as transparency can improve the reader’s ability to independently interpret data and reproduce findings.

In this observational study, we compared random samples of articles published in bioRxiv and in PubMed-indexed journals in 2016 using a quality of reporting questionnaire. We found that peer-reviewed articles had, on average, higher quality of reporting than preprints, although this difference was small.

We found larger differences favoring PubMed in subjective ratings of how clearly titles and abstracts presented the main findings and how easy it was to locate relevant reporting information.

Interestingly, an exploratory analysis showed that preprints with figures and legends embedded within text had reporting scores similar to PubMed articles.

These differences cannot be directly attributed to peer review or editorial processes, as manuscripts might already differ before submission due to greater uptake of preprints by particular research communities.

Nevertheless, our results show that quality of reporting in preprints in the life sciences is within a similar range as that of peer-reviewed articles, albeit slightly lower on average, supporting the idea that preprints should be considered valid scientific contributions.

An ongoing second phase of the project is comparing preprints to their own published versions in order to more directly assess the effects of peer review.


DOI : https://doi.org/10.1101/581892

Assessing the size of the affordability problem in scholarly publishing

Authors : Alexander Grossmann, Björn Brembs

For many decades, the hyperinflation of subscription prices for scholarly journals has concerned scholarly institutions. After years of fruitless efforts to solve this “serials crisis”, open access has been proposed as the latest potential solution. However, the prices for open access publishing are also high and are rising well beyond inflation.

What has been missing from the public discussion so far is a quantitative approach to determine the actual costs of efficiently publishing a scholarly article using state-of-the-art technologies, such that informed decisions can be made as to appropriate price levels.

Here we provide a granular, step-by-step calculation of the costs associated with publishing primary research articles, from submission, through peer-review, to publication, indexing and archiving.

We find that these costs range from less than US$200 per article in modern, large scale publishing platforms using post-publication peer-review, to about US$1,000 per article in prestigious journals with rejection rates exceeding 90%.

The publication costs for a representative scholarly article today amount to around US$400. We discuss the additional non-publication items that make up the difference between publication costs and final price.


DOI : https://doi.org/10.7287/peerj.preprints.27809v1

Open access policies of leading medical journals: a cross-sectional study

Authors : Tim S Ellison, Laura Schmidt, Amy Williams, Christopher C Winchester

Objectives

Academic and not-for-profit research funders are increasingly requiring that the research they fund must be published open access, with some insisting on publishing with a Creative Commons Attribution (CC BY) licence to allow the broadest possible use.

We aimed to clarify the open access variants provided by leading medical journals and record the availability of the CC BY licence for commercially funded research.

Methods

We identified medical journals with a 2015 impact factor of ≥15.0 on 24 May 2017, then excluded from the analysis journals that only publish review articles. Between 29 June 2017 and 26 July 2017, we collected information about each journal’s open access policies from their websites and/or by email contact.

We contacted the journals by email again between 6 December 2017 and 2 January 2018 to confirm our findings.

Results

Thirty-five medical journals publishing original research from 13 publishers were included in the analysis. All 35 journals offered some form of open access allowing articles to be free-to-read, either immediately on publication or after a delay of up to 12 months.

Of these journals, 21 (60%) provided immediate open access with a CC BY licence under certain circumstances (eg, to specific research funders). Of these 21, 20 only offered a CC BY licence to authors funded by non-commercial organisations and one offered this option to any funder who required it.

Conclusions

Most leading medical journals do not offer authors reporting commercially funded research an open access licence that allows unrestricted sharing and adaptation of the published material.

The journals’ policies are therefore not aligned with open access declarations and guidelines. Commercial research funders lag behind academic funders in the development of mandatory open access policies, and it is time for them to work with publishers to advance the dissemination of the research they fund.


DOI : http://dx.doi.org/10.1136/bmjopen-2018-028655

Please, no more scientific journals! The strategy of the scientific publication system

Author : Miguel A. Fortuna

In the same way ecosystems tend to increase maturity by decreasing the flow of energy per unit biomass, we should move towards a more mature science by publishing fewer but higher-quality papers and getting away from joining large teams in small roles. That is, we should decrease our scientific productivity for good.

URL : https://arxiv.org/abs/1906.02927

The F3-index. Valuing reviewers for scholarly journals

Authors : Federico Bianchi, Francisco Grimaldo, Flaminio Squazzoni

This paper presents an index that measures reviewer contribution to the editorial processes of scholarly journals. Following a metaphor of ranking algorithms in sports tournaments, we created an index that scores reviewers on different context-specific dimensions, i.e., report delivery time, report length, and the alignment of recommendations with editorial decisions.
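A composite reviewer index of this kind could be sketched as follows. The function name, the equal weights, and the normalisation caps here are all assumptions for illustration; the actual F3-index formula is defined in the paper.

```javascript
// Hypothetical sketch of a composite reviewer score combining the three
// dimensions named above: delivery time, report length, and alignment of
// the recommendation with the editorial decision. Weights and caps are
// illustrative assumptions, not the paper's formula.
function reviewerScore(reviews, maxDays = 60, maxWords = 1000) {
  if (reviews.length === 0) return 0;
  let total = 0;
  for (const r of reviews) {
    const speed = 1 - Math.min(r.days, maxDays) / maxDays;   // faster is better
    const length = Math.min(r.words, maxWords) / maxWords;   // longer is better
    const aligned = r.matchedDecision ? 1 : 0;               // agreed with the decision
    total += (speed + length + aligned) / 3;                 // equal weights (assumption)
  }
  return total / reviews.length; // average over the reviewer's reports, in [0, 1]
}
```

Under this sketch, a reviewer who delivers quickly, writes substantial reports, and whose recommendations match editorial decisions scores close to 1, while a slow, terse, misaligned reviewer scores close to 0, which is the kind of separation the index is meant to surface.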

To test the index, we used a dataset of peer review in a multi-disciplinary journal, comprising 544 reviewers on 606 submissions over six years. Although limited by sample size, the test showed that the index efficiently identifies both outstanding contributors and weakly performing reviewers.

Our index is flexible, allows for extensions and could be incorporated into available scholarly journal management tools. It can assist editors in rewarding high-performing reviewers and managing editorial turnover.


DOI : https://doi.org/10.1016/j.joi.2018.11.007