Open access megajournals: The publisher perspective (Part 2: Operational realities)

Authors : Simon Wakeling, Valérie Spezi, Jenny Fry, Claire Creaser, Stephen Pinfield, Peter Willett

This paper is the second of two Learned Publishing articles in which we report the results of a series of interviews with senior publishers and editors exploring open access megajournals (OAMJs).

Megajournals (of which PLoS One is the best-known example) represent a relatively new approach to scholarly communication and can be characterized as large, broad-scope, open access journals that take an innovative approach to peer review, basing acceptance decisions solely on the technical or scientific soundness of the article.

Based on interviews with 31 publishers and editors, this paper reports the perceived cultural, operational, and technical challenges associated with launching, growing, and maintaining a megajournal.

We find that overcoming these challenges while delivering the societal benefits associated with OAMJs is seen to require significant investment in people and systems, as well as an ongoing commitment to the model.

URL : Open access megajournals: The publisher perspective (Part 2: Operational realities)

Alternative location : http://onlinelibrary.wiley.com/doi/10.1002/leap.1118/full

 

Open access megajournals: The publisher perspective (Part 1: Motivations)

Authors : Simon Wakeling, Valérie Spezi, Jenny Fry, Claire Creaser, Stephen Pinfield, Peter Willett

This paper is the first of two Learned Publishing articles in which we report the results of a series of interviews with senior publishers and editors exploring open access megajournals (OAMJs).

Megajournals (of which PLoS One is the best known example) represent a relatively new approach to scholarly communication and can be characterized as large, broad-scope, open access journals that take an innovative approach to peer review, basing acceptance decisions solely on the technical or scientific soundness of the article.

This model is often said to support the broader goals of the open science movement. Based on in-depth interviews with 31 publishers and editors representing 16 different organizations (10 of which publish a megajournal), this paper reports how the term ‘megajournal’ is understood and publishers’ rationale and motivations for launching (or not launching) an OAMJ.

We find that while there is general agreement on the common characteristics of megajournals, there is not yet a consensus on their relative importance. We also identify seven motivating factors said to drive the launch of an OAMJ and link each of these factors to potential societal and business benefits.

These results suggest that the often polarized debate surrounding OAMJs is a consequence of the extent to which observers perceive publishers to be motivated by these societal or business benefits.

URL : Open access megajournals: The publisher perspective (Part 1: Motivations)

Alternative location : http://onlinelibrary.wiley.com/doi/10.1002/leap.1117/full

 

 

A prospective study on an innovative online forum for peer reviewing of surgical science

Authors : Martin Almquist, Regula S. von Allmen, Dan Carradice, Steven J. Oosterling, Kirsty McFarlane, Bas Wijnhoven

Background

Peer review is important to the scientific process. However, the present system has been criticised for bias, lack of transparency, and failure to detect significant breakthroughs and errors. At the British Journal of Surgery (BJS), after surveying authors’ and reviewers’ opinions on peer review, we piloted an open online forum with the aim of improving the peer review process.

Methods

In December 2014, a web-based survey assessing attitudes towards open online review was sent to reviewers with a BJS account in ScholarOne. From April to June 2015, authors were invited to allow their manuscripts to undergo online peer review in addition to the standard peer review process.

The quality of each review was evaluated by editors and editorial assistants using a validated instrument based on a Likert scale.

Results

The survey was sent to 6635 reviewers. In all, 1454 (21.9%) responded. Support for online peer review was strong, with only 10% stating that they would not subject their manuscripts to online peer review. The most prevalent concern related to intellectual property, highlighted in 118 of 284 comments (41.5%).

Out of 265 eligible manuscripts, 110 were included in the online peer review trial. Around 7000 potential reviewers were invited to review each manuscript.

In all, 44 of 110 manuscripts (40%) received a total of 100 reviews from 59 reviewers, alongside 115 conventional reviews. The quality of the open forum reviews was lower than that of the conventional reviews (2.13 (± 0.75) versus 2.84 (± 0.71); P < 0.001).

Conclusion

Open online peer review is feasible in this setting, but it attracts few reviews, and those are of lower quality than conventional peer reviews.

URL : A prospective study on an innovative online forum for peer reviewing of surgical science

DOI : https://doi.org/10.1371/journal.pone.0179031

 

Using Peer Review to Support Development of Community Resources for Research Data Management

Authors : Heather Soyka, Amber Budden, Viv Hutchison, David Bloom, Jonah Duckles, Amy Hodge, Matthew S. Mayernik, Timothée Poisot, Shannon Rauch, Gail Steinhart, Leah Wasser, Amanda L. Whitmire, Stephanie Wright

Objective

To ensure that resources designed to teach skills and best practices for scientific research data sharing and management are useful, the maintainers of those materials need to evaluate and update them to ensure their accuracy, currency, and quality.

This paper advances the use and process of outside peer review for community resources in addressing ongoing accuracy, quality, and currency issues. It further describes the next step of moving the updated materials to an online collaborative community platform for future iterative review in order to build upon mechanisms for open science, ongoing iteration, participation, and transparent community engagement.

Setting

Research data management resources were developed in support of the DataONE (Data Observation Network for Earth) project, which has deployed a sustainable, long-term network to ensure the preservation of, and access to, multi-scale, multi-discipline, and multi-national environmental and biological science data (Michener et al. 2012).

Created by members of the Community Engagement and Education (CEE) Working Group in 2011-2012, the freely available Educational Modules included three complementary components (slides, handouts, and exercises) that were designed to be adaptable for use in classrooms as well as for research data management training.

Methods

Because the modules were initially created and launched in 2011-2012, the current members of the (renamed) Community Engagement and Outreach (CEO) Working Group were concerned that the materials could already be, or could quickly become, outdated and should be reviewed for accuracy, currency, and quality.

In November 2015, the Working Group developed an evaluation rubric for use by outside reviewers. Review criteria were developed based on surveys and usage scenarios from previous DataONE projects.

Peer reviewers were selected from the DataONE community network for their expertise in the areas covered by one of the 11 educational modules. Reviewers were contacted in March 2016, and were asked to volunteer to complete their evaluations online within one month of the request, by using a customized Google form.

Results

For the 11 modules, 22 completed reviews were received by April 2016 from outside experts. Comments on all three components of each module (slides, handouts, and exercises) were compiled and evaluated by the postdoctoral fellow attached to the CEO Working Group.

These reviews contributed to the full evaluation and revision by members of the Working Group of all educational modules in September 2016. This review process, as well as the potential lack of funding for ongoing maintenance by Working Group members or paid staff, provoked the group to transform the modules to a more stable, non-proprietary format, and move them to an online open repository hosting platform, GitHub.

These decisions were made to foster sustainability, community engagement, version control, and transparency.

Conclusion

Outside peer review of the modules by experts in the field was beneficial for highlighting areas of weakness or overlap in the education modules. The modules were initially created in 2011-2012 by an earlier iteration of the Working Group, and updates were needed due to constantly evolving practices in the field.

Because the review process was lengthy (approximately one year) relative to the rate of innovation in data management practices, the Working Group discussed other options that would allow community members to make updates available more quickly.

The intent of migrating the modules to an online collaborative platform (GitHub) is to allow for iterative updates and ongoing outside review, and to provide further transparency about accuracy, currency, and quality in the spirit of open science and collaboration.

Documentation about this project may be useful for others trying to develop and maintain educational resources for engagement and outreach, particularly in communities and spaces where information changes quickly, and open platforms are already in common use.

URL : Using Peer Review to Support Development of Community Resources for Research Data Management

DOI : https://doi.org/10.7191/jeslib.2017.1114

Rethinking Data Sharing and Human Participant Protection in Social Science Research: Applications from the Qualitative Realm

Authors : Dessi Kirilova, Sebastian Karcher

While data sharing is becoming increasingly common in quantitative social inquiry, qualitative data are rarely shared. One factor inhibiting data sharing is a concern about human participant protections and privacy.

Protecting the confidentiality and safety of research participants is a concern for both quantitative and qualitative researchers, but it raises specific concerns within the epistemic context of qualitative research.

Thus, the applicability of protection models emerging from the quantitative realm must be carefully evaluated before they are applied in the qualitative realm. At the same time, qualitative scholars already employ a variety of human-participant protection strategies, implicitly or informally, during the research process.

In this practice paper, we assess available strategies for protecting human participants and how they can be deployed. We describe a spectrum of possible data management options, such as de-identification and applying access controls, including some already employed by the Qualitative Data Repository (QDR) in tandem with its pilot depositors.

Throughout the discussion, we consider the tension between modifying data or restricting access to them, and retaining their analytic value.

We argue that developing explicit guidelines for sharing qualitative data generated through interaction with humans will allow scholars to address privacy concerns and increase the secondary use of their data.

URL : Rethinking Data Sharing and Human Participant Protection in Social Science Research: Applications from the Qualitative Realm

DOI : http://doi.org/10.5334/dsj-2017-043

 

Standardising and harmonising research data policy in scholarly publishing

Authors : Iain Hrynaszkiewicz, Aliaksandr Birukou, Mathias Astell, Sowmya Swaminathan, Amye Kenall, Varsha Khodiyar

To address the complexities researchers face during publication, and the potential community-wide benefits of wider adoption of clear data policies, the publisher Springer Nature has developed a standardised, common framework for the research data policies of all its journals. An expert working group was convened to audit and identify common features of research data policies of the journals published by Springer Nature, where policies were present.

The group then consulted with approximately 30 editors within the organisation, covering all research disciplines. The group also consulted with academic editors, librarians, and funders, which informed the development of the framework and the creation of supporting resources.

Four types of data policy were defined in recognition that some journals and research communities are more ready than others to adopt strong data policies. As of January 2017, more than 700 journals had adopted a standard policy, and this number is growing weekly. To enable potential standardisation and harmonisation of data policy across funders, institutions, repositories, societies, and other publishers, the policy framework was made available under a Creative Commons license.

However, the framework requires wider debate with these stakeholders and an Interest Group within the Research Data Alliance (RDA) has been formed to initiate this process.

This paper was presented at the 12th International Digital Curation Conference, Edinburgh, UK on 22 February 2017 and will be submitted to the International Journal of Digital Curation.

URL : Standardising and harmonising research data policy in scholarly publishing

DOI : https://doi.org/10.1101/122929

Recommended versus Certified Repositories: Mind the Gap

Authors : Sean Edward Husen, Zoë G. de Wilde, Anita de Waard, Helena Cousijn

Researchers are increasingly required to make research data publicly available in data repositories. Although several organisations propose criteria to recommend and evaluate the quality of data repositories, there is no consensus on what constitutes a good data repository.

In this paper, we investigate, first, which data repositories are recommended by various stakeholders (publishers, funders, and community organizations) and second, which repositories are certified by a number of organisations.

We then compare these two lists of repositories, and the criteria for recommendation and certification. We find that criteria used by organisations recommending and certifying repositories are similar, although the certification criteria are generally more detailed.

We distil the lists of criteria into seven main categories: “Mission”, “Community/Recognition”, “Legal and Contractual Compliance”, “Access/Accessibility”, “Technical Structure/Interface”, “Retrievability” and “Preservation”.

Although the criteria are similar, the lists of repositories recommended by the various agencies differ considerably. Of all the recommended repositories, fewer than 6% have obtained certification.

As certification is becoming more important, steps should be taken to decrease this gap between recommended and certified repositories, and ensure that certification standards become applicable, and applied, to the repositories which researchers are currently using.

URL : Recommended versus Certified Repositories: Mind the Gap

DOI : https://doi.org/10.5334/dsj-2017-042