Ethics approval in applications for open-access clinical trial data: An analysis of researcher statements to clinicalstudydatarequest.com

Authors : Derek So, Bartha M. Knoppers

Although a number of online platforms now share patient-level clinical trial data from industry sponsors, they differ considerably in the role that local ethics approval plays in their research proposal review processes.

The first and largest of these platforms is ClinicalStudyDataRequest.com (CSDR), which includes over three thousand trials from thirteen sponsors including GlaxoSmithKline, Novartis, Roche, Sanofi, and Bayer. CSDR asks applicants to state whether they have received ethics approval for their research proposal, but in most cases does not require that they submit evidence of approval.

However, the website does require that applicants without ethics approval state the reason it was not required. To examine researchers’ perspectives on this topic, we coded every response to that question received by CSDR between June 2014 and February 2017.

Of 111 applicants who stated they were exempt from ethics approval, 63% mentioned de-identification, 57% mentioned the use of existing data, 33% referred to local or jurisdictional regulations, and 20% referred to the approvals obtained by the original study.

We conclude by examining the experience of CSDR within the broader context of the access mechanisms and policies currently being used by other data sharing platforms, and discuss how our findings might be used to help clinical trial data providers design clear and informative access documents.

DOI : https://doi.org/10.1371/journal.pone.0184491

Effectiveness of Anonymization in Double-Blind Review

Authors : Claire Le Goues, Yuriy Brun, Sven Apel, Emery Berger, Sarfraz Khurshid, Yannis Smaragdakis

Double-blind review relies on the authors’ ability and willingness to effectively anonymize their submissions. We explore anonymization effectiveness at ASE 2016, OOPSLA 2016, and PLDI 2016 by asking reviewers if they can guess author identities.

We find that 74%-90% of reviews contain no correct guess, and that reviewers who self-identify as experts on a paper’s topic are more likely to attempt a guess, but no more likely to guess correctly.

We present our findings, summarize the PC chairs’ comments about administering double-blind review, discuss the advantages and disadvantages of revealing author identities part of the way through the process, and conclude by advocating for the continued use of double-blind review.

URL : https://arxiv.org/abs/1709.01609

We’ve failed: Pirate black open access is trumping green and gold and we must change our approach

Author : Toby Green

Key points

Sci-Hub has made nearly all articles freely available using a black open access model, leaving green and gold models in its dust.

Why, after 20 years of effort, have green and gold open access not achieved more? Do we need ‘tae think again’?

If human nature is to postpone change for as long as possible, are green and gold open access fundamentally flawed?

Open and closed publishing models depend on bundle pricing paid by one stakeholder, with the others getting a free ride. Is unbundling a fairer model?

If publishers changed course and unbundled their product, would this open a legal, fairer route to 100% open access and see off the pirates?

URL : http://onlinelibrary.wiley.com/doi/10.1002/leap.1116/full

Open access megajournals: The publisher perspective (Part 2: Operational realities)

Authors : Simon Wakeling, Valérie Spezi, Jenny Fry, Claire Creaser, Stephen Pinfield, Peter Willett

This paper is the second of two Learned Publishing articles in which we report the results of a series of interviews with senior publishers and editors exploring open access megajournals (OAMJs).

Megajournals (of which PLoS One is the best known example) represent a relatively new approach to scholarly communication and can be characterized as large, broad-scope, open access journals, which take an innovative approach to peer review, basing acceptance decisions solely on the technical or scientific soundness of the article.

Based on interviews with 31 publishers and editors, this paper reports the perceived cultural, operational, and technical challenges associated with launching, growing, and maintaining a megajournal.

We find that overcoming these challenges while delivering the societal benefits associated with OAMJs is seen to require significant investment in people and systems, as well as an ongoing commitment to the model.

URL : http://onlinelibrary.wiley.com/doi/10.1002/leap.1118/full

Open access megajournals: The publisher perspective (Part 1: Motivations)

Authors : Simon Wakeling, Valérie Spezi, Jenny Fry, Claire Creaser, Stephen Pinfield, Peter Willett

This paper is the first of two Learned Publishing articles in which we report the results of a series of interviews with senior publishers and editors exploring open access megajournals (OAMJs).

Megajournals (of which PLoS One is the best known example) represent a relatively new approach to scholarly communication and can be characterized as large, broad-scope, open access journals that take an innovative approach to peer review, basing acceptance decisions solely on the technical or scientific soundness of the article.

This model is often said to support the broader goals of the open science movement. Based on in-depth interviews with 31 publishers and editors representing 16 different organizations (10 of which publish a megajournal), this paper reports how the term ‘megajournal’ is understood and publishers’ rationale and motivations for launching (or not launching) an OAMJ.

We find that while there is general agreement on the common characteristics of megajournals, there is not yet a consensus on their relative importance. We also find seven motivating factors that were said to drive the launch of an OAMJ and link each of these factors to potential societal and business benefits.

These results suggest that the often polarized debate surrounding OAMJs is a consequence of the extent to which observers perceive publishers to be motivated by these societal or business benefits.

URL : http://onlinelibrary.wiley.com/doi/10.1002/leap.1117/full

A prospective study on an innovative online forum for peer reviewing of surgical science

Authors : Martin Almquist, Regula S. von Allmen, Dan Carradice, Steven J. Oosterling, Kirsty McFarlane, Bas Wijnhoven

Background

Peer review is important to the scientific process. However, the present system has been criticised for bias, a lack of transparency, and failure to detect significant breakthroughs and errors. At the British Journal of Surgery (BJS), after surveying authors’ and reviewers’ opinions on peer review, we piloted an open online forum with the aim of improving the peer review process.

Methods

In December 2014, a web-based survey assessing attitudes towards open online review was sent to reviewers with a BJS account in ScholarOne. From April to June 2015, authors were invited to allow their manuscripts to undergo online peer review in addition to the standard peer review process.

The quality of each review was evaluated by editors and editorial assistants using a validated instrument based on a Likert scale.

Results

The survey was sent to 6635 reviewers. In all, 1454 (21.9%) responded. Support for online peer review was strong, with only 10% stating that they would not subject their manuscripts to online peer review. The most prevalent concern was about intellectual property, being highlighted in 118 of 284 comments (41.5%).

Out of 265 eligible manuscripts, 110 were included in the online peer review trial. Around 7000 potential reviewers were invited to review each manuscript.

In all, 44 of 110 manuscripts (40%) received a total of 100 reviews from 59 reviewers, alongside 115 conventional reviews. The quality of the open forum reviews was lower than that of conventional reviews (2.13 (± 0.75) versus 2.84 (± 0.71), P<0.001).

Conclusion

Open online peer review is feasible in this setting, but it attracts few reviews, and these are of lower quality than conventional peer reviews.

DOI : https://doi.org/10.1371/journal.pone.0179031

Using Peer Review to Support Development of Community Resources for Research Data Management

Authors : Heather Soyka, Amber Budden, Viv Hutchison, David Bloom, Jonah Duckles, Amy Hodge, Matthew S. Mayernik, Timothée Poisot, Shannon Rauch, Gail Steinhart, Leah Wasser, Amanda L. Whitmire, Stephanie Wright

Objective

To ensure that resources designed to teach skills and best practices for scientific research data sharing and management are useful, the maintainers of those materials need to evaluate and update them to ensure their accuracy, currency, and quality.

This paper advances the use and process of outside peer review for community resources in addressing ongoing accuracy, quality, and currency issues. It further describes the next step of moving the updated materials to an online collaborative community platform for future iterative review in order to build upon mechanisms for open science, ongoing iteration, participation, and transparent community engagement.

Setting

Research data management resources were developed in support of the DataONE (Data Observation Network for Earth) project, which has deployed a sustainable, long-term network to ensure the preservation of, and access to, multi-scale, multi-discipline, and multi-national environmental and biological science data (Michener et al. 2012).

Created by members of the Community Engagement and Education (CEE) Working Group in 2011-2012, the freely available Educational Modules included three complementary components (slides, handouts, and exercises) that were designed to be adaptable for use in classrooms as well as for research data management training.

Methods

Because the modules were initially created and launched in 2011-2012, the current members of the (renamed) Community Engagement and Outreach (CEO) Working Group were concerned that the materials might be, or could quickly become, outdated, and should be reviewed for accuracy, currency, and quality.

In November 2015, the Working Group developed an evaluation rubric for use by outside reviewers. Review criteria were developed based on surveys and usage scenarios from previous DataONE projects.

Peer reviewers were selected from the DataONE community network for their expertise in the areas covered by one of the 11 educational modules. Reviewers were contacted in March 2016, and were asked to volunteer to complete their evaluations online within one month of the request, by using a customized Google form.

Results

For the 11 modules, 22 completed reviews were received by April 2016 from outside experts. Comments on all three components of each module (slides, handouts, and exercises) were compiled and evaluated by the postdoctoral fellow attached to the CEO Working Group.

These reviews contributed to the Working Group’s full evaluation and revision of all educational modules in September 2016. This review process, together with the potential lack of funding for ongoing maintenance by Working Group members or paid staff, prompted the group to convert the modules to a more stable, non-proprietary format and move them to GitHub, an open online repository hosting platform.

These decisions were made to foster sustainability, community engagement, version control, and transparency.

Conclusion

Outside peer review of the modules by experts in the field was beneficial for highlighting areas of weakness or overlap in the education modules. The modules were initially created in 2011-2012 by an earlier iteration of the Working Group, and updates were needed due to constantly evolving practices in the field.

Because the review process was lengthy (approximately one year) relative to the pace of innovation in data management practices, the Working Group discussed other options that would allow community members to make updates available more quickly.

The intent of migrating the modules to an online collaborative platform (GitHub) is to allow for iterative updates and ongoing outside review, and to provide further transparency about accuracy, currency, and quality in the spirit of open science and collaboration.

Documentation about this project may be useful for others trying to develop and maintain educational resources for engagement and outreach, particularly in communities and spaces where information changes quickly, and open platforms are already in common use.

DOI : https://doi.org/10.7191/jeslib.2017.1114