Why we publish where we do: Faculty publishing values and their relationship to review, promotion and tenure expectations

Authors : Meredith T. Niles, Lesley A. Schimanski, Erin C. McKiernan, Juan P. Alperin

Using an online survey of academics at 55 randomly selected institutions across the US and Canada, we explore priorities for publishing decisions and their perceived importance within review, promotion, and tenure (RPT).

We find that respondents most value journal readership, while they believe their peers most value prestige and related metrics such as impact factor when submitting their work for publication.

Respondents indicated that total number of publications, number of publications per year, and journal name recognition were the most valued factors in RPT. Older and tenured respondents (most likely to serve on RPT committees) were less likely to value journal prestige and metrics for publishing, while untenured respondents were more likely to value these factors.

These results suggest disconnects between what academics value and what they think their peers value, and between how tenured and untenured faculty weigh journal prestige and metrics, both in publishing decisions and in their perceptions of RPT.

URL : Why we publish where we do: Faculty publishing values and their relationship to review, promotion and tenure expectations

Do Download Reports Reliably Measure Journal Usage? Trusting the Fox to Count Your Hens?

Authors : Alex Wood-Doughty, Ted Bergstrom, Douglas G. Steigerwald

Download rates of academic journals have joined citation counts as commonly used indicators of the value of journal subscriptions. While citations reflect worldwide influence, the value of a journal subscription to a single library is more reliably measured by the rate at which it is downloaded by local users.

If reported download rates accurately measure local usage, there is a strong case for using them to compare the cost-effectiveness of journal subscriptions. We examine data for nearly 8,000 journals downloaded at the ten universities in the University of California system during a period of six years.

We find that, controlling for the number of articles, publisher, and year of download, the ratio of downloads to citations differs substantially among academic disciplines.

After adding academic disciplines to the control variables, there remain substantial “publisher effects”, with some publishers reporting significantly more downloads than would be predicted by the characteristics of their journals.

These cross-publisher differences suggest that the currently available download statistics, which are supplied by publishers, are not sufficiently reliable to allow libraries to make subscription decisions based on price and reported downloads, at least without making an adjustment for publisher effects in download reports.
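
To make the notion of a “publisher effect” concrete, the following is a toy TypeScript sketch; it is not the authors' actual regression (which also controls for number of articles and year of download), and the data structure and function names are illustrative only. For each journal-year it computes downloads per citation, averages within disciplines, and then reports how far each publisher's journals sit, on average, above or below their discipline mean.

```typescript
// Toy illustration only: one simple way to surface "publisher effects" in download reports.
// A positive effect means a publisher's journals report more downloads per citation
// than other journals in the same disciplines.

interface JournalYear {
  journal: string;
  publisher: string;
  discipline: string;
  downloads: number;  // downloads reported by the publisher for this journal-year
  citations: number;  // citations, used here as a rough benchmark of usage
}

function publisherEffects(data: JournalYear[]): Map<string, number> {
  // 1. Mean downloads-per-citation within each discipline.
  const byDiscipline = new Map<string, { total: number; n: number }>();
  for (const d of data) {
    const ratio = d.downloads / Math.max(d.citations, 1);
    const s = byDiscipline.get(d.discipline) ?? { total: 0, n: 0 };
    s.total += ratio;
    s.n += 1;
    byDiscipline.set(d.discipline, s);
  }
  const disciplineMean = new Map<string, number>();
  for (const [disc, s] of byDiscipline) disciplineMean.set(disc, s.total / s.n);

  // 2. Average deviation from the discipline mean, grouped by publisher.
  const byPublisher = new Map<string, { total: number; n: number }>();
  for (const d of data) {
    const ratio = d.downloads / Math.max(d.citations, 1);
    const delta = ratio - (disciplineMean.get(d.discipline) ?? 0);
    const s = byPublisher.get(d.publisher) ?? { total: 0, n: 0 };
    s.total += delta;
    s.n += 1;
    byPublisher.set(d.publisher, s);
  }
  const effects = new Map<string, number>();
  for (const [pub, s] of byPublisher) effects.set(pub, s.total / s.n);
  return effects;
}
```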

URL : Do Download Reports Reliably Measure Journal Usage? Trusting the Fox to Count Your Hens?

DOI : https://doi.org/10.5860/crl.80.5.694

Developing a model for university presses

Authors : Megan Taylor, Kathrine S H Jensen

This article presents a model for developing a university press based around three guiding principles and six key stages of the publishing process, with associated activities.

The model is designed to be applicable to a range of business models, including subscription, open access and hybrid. The guiding principles, publishing stages and strategic points all constitute the building blocks necessary to implement and maintain a sustainable university press.

At the centre of the model are three interconnected guiding principles: strategic alignment, stakeholder relationships and demonstrating impact.

The publishing process outlined in the outer ring of the model is made up of six sections: editorial, production, dissemination, preservation, communication and analytics.

These sections are based on the main stages that a journal article or monograph goes through, from the proposal or commissioning stage through to publication and beyond.

The model highlights the overall importance of working in partnership and building relationships as key to developing and maintaining a successful press.

URL : Developing a model for university presses

DOI : https://doi.org/10.1629/uksg.469

Perceived publication pressure in Amsterdam: Survey of all disciplinary fields and academic ranks

Authors : Tamarinde L. Haven, Lex M. Bouter, Yvo M. Smulders, Joeri K. Tijdink

Publications determine to a large extent the possibility to stay in academia (“publish or perish”). While some pressure to publish may incentivise high quality research, too much publication pressure is likely to have detrimental effects on both the scientific enterprise and on individual researchers.

Our research question was: What is the level of perceived publication pressure in the four academic institutions in Amsterdam, and does the pressure to publish differ between academic ranks and disciplinary fields?

Surveying researchers in Amsterdam with the revised Publication Pressure Questionnaire, we find that a negative attitude towards the current publication climate is present across academic ranks and disciplinary fields.

Postdocs and assistant professors (M = 3.42) perceive the greatest publication stress, while PhD students (M = 2.44) perceive a significant lack of resources to relieve publication stress. Results indicate the need for a healthier publication climate in which the quality and integrity of research are rewarded.

URL : Perceived publication pressure in Amsterdam: Survey of all disciplinary fields and academic ranks

DOI : https://doi.org/10.1371/journal.pone.0217931

Citation.js: a format-independent, modular bibliography tool for the browser and command line

Author : Lars G Willighagen

Background

Given the vast number of standards and formats for bibliographical data, any program working with bibliographies and citations has to be able to interpret such data. This paper describes the development of Citation.js (https://citation.js.org/), a tool to parse and format bibliographical data according to those standards.

The program follows modern guidelines for software in general and for JavaScript in particular, such as version control, source code analysis, integration testing and semantic versioning.

Results

The result is an extensible tool that has already seen adoption in a variety of sources and use cases: as part of a server-side page generator of a publishing platform, as part of a local extensible document generator, and as part of an in-browser converter of extracted references.

Use cases range from transforming a list of DOIs or Wikidata identifiers into a BibTeX file on the command line, to displaying RIS references on a webpage with added Altmetric badges, to generating “How to cite this” sections on a blog.
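
As an illustration of the first use case above (turning a DOI into a BibTeX entry), here is a minimal TypeScript sketch. It assumes the citation-js npm package and its v0.4-style API (a Cite class exposing Cite.async and format); those details are not spelled out in this summary, and resolving a DOI requires network access.

```typescript
// Minimal sketch, assuming the citation-js package and its v0.4-style API (Cite.async + format).
// @ts-ignore -- citation-js does not ship TypeScript type definitions
import Cite from 'citation-js';

// Resolve a DOI against external metadata services and render it as a BibTeX entry.
async function doiToBibtex(doi: string): Promise<string> {
  const cite = await Cite.async(doi);  // fetch and parse metadata for the DOI
  return cite.format('bibtex');        // other outputs include 'data' (CSL-JSON) and 'bibliography'
}

// Example: the DOI of this preprint (listed below).
doiToBibtex('10.7287/peerj.preprints.27466v2')
  .then((bibtex) => console.log(bibtex))
  .catch((err) => console.error('Conversion failed:', err));
```

Swapping 'bibtex' for a styled 'bibliography' output is roughly how the “How to cite this” sections mentioned above could be produced.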

The accuracy of conversions is currently 27 % for properties and 60 % for types on average, and a typical initialization takes 120 ms in browsers and 1 s with Node.js on the command line.

Conclusions

Citation.js is a library supporting various formats of bibliographic information in a broad selection of use cases and environments. Given the support for plugins, more formats can be added with relative ease.

URL : Citation.js: a format-independent, modular bibliography tool for the browser and command line

DOI : https://doi.org/10.7287/peerj.preprints.27466v2

Share or perish: Social media and the International Journal of Mental Health Nursing

Authors : Paul McNamara, Kim Usher

The impact of published research is sometimes measured by the number of citations an individual article accumulates. However, the time from publication to citation can be extensive. Years may pass before authors are able to measure the impact of their publication. Social media provides individuals and organizations a powerful medium with which to share information.

The power of social media is sometimes harnessed to share scholarly works, especially journal article citations and quotes. A non-traditional bibliometric is required to understand the impact social media has on disseminating scholarly works/research.

The International Journal of Mental Health Nursing (IJMHN) appointed a social media editor as of 1 January 2017 to implement a strategy to increase the impact and reach of the journal’s articles.

To measure the impact of the IJMHN social media strategy, quantitative data for the eighteen months prior to the social media editor's start date and the eighteen months after it (i.e. from 1 July 2015 to 30 June 2018) were acquired and analysed.

Quantitative evidence demonstrates the effectiveness of one journal’s social media strategy in increasing the reach and readership of the articles it publishes.

This information may be of interest to those considering where to publish their research, those wanting to amplify the reach of their research, those who fund research, and journal editors and boards.

DOI : https://doi.org/10.1111/inm.12600

Comparing quality of reporting between preprints and peer-reviewed articles in the biomedical literature

Authors : Clarissa F. D. Carneiro, Victor G. S. Queiroz, Thiago C. Moulin, Carlos A. M. Carvalho, Clarissa B. Haas, Danielle Rayêe, David E. Henshall, Evandro A. De-Souza, Felippe Espinelli, Flávia Z. Boos, Gerson D. Guercio, Igor R. Costa, Karina L. Hajdu, Martin Modrák, Pedro B. Tan, Steven J. Burgess, Sylvia F. S. Guerra, Vanessa T. Bortoluzzi, Olavo B. Amaral

Preprint usage is growing rapidly in the life sciences; however, questions remain on the relative quality of preprints when compared to published articles. An objective dimension of quality that is readily measurable is completeness of reporting, as transparency can improve the reader’s ability to independently interpret data and reproduce findings.

In this observational study, we compared random samples of articles published in bioRxiv and in PubMed-indexed journals in 2016 using a quality of reporting questionnaire. We found that peer-reviewed articles had, on average, higher quality of reporting than preprints, although this difference was small.

We found larger differences favoring PubMed in subjective ratings of how clearly titles and abstracts presented the main findings and how easy it was to locate relevant reporting information.

Interestingly, an exploratory analysis showed that preprints with figures and legends embedded within text had reporting scores similar to PubMed articles.

These differences cannot be directly attributed to peer review or editorial processes, as manuscripts might already differ before submission due to greater uptake of preprints by particular research communities.

Nevertheless, our results show that quality of reporting in preprints in the life sciences is within a similar range as that of peer-reviewed articles, albeit slightly lower on average, supporting the idea that preprints should be considered valid scientific contributions.

An ongoing second phase of the project is comparing preprints to their own published versions in order to more directly assess the effects of peer review.

URL : Comparing quality of reporting between preprints and peer-reviewed articles in the biomedical literature

DOI : https://doi.org/10.1101/581892