Transparency versus anonymity: which is better to eliminate bias in peer review?

Authors: Faye Holst, Kim Eggleton, Simon Harris

Peer review is a critical component of the scientific process. When conducted properly by dedicated and competent reviewers, it helps to safeguard the quality, validity, authority and rigour of academic work. However, bias in peer review is well documented and can skew the objectivity of the review and hinder the fair assessment of research.

To mitigate bias and enhance accountability, IOP Publishing has introduced two different, but complementary, approaches across all of its peer-reviewed, open access (OA) journals: double-anonymous peer review and transparent peer review.

Double-anonymous peer review, in which both reviewer and author identities are concealed, is designed to tackle inequality in the scholarly publishing process by reducing bias with respect to gender, race, country of origin or affiliation.

Transparent peer review shows readers the full peer review history, including reviewer reports, editor decision letters and the authors’ responses alongside the published article. Making this process visible to the community increases accountability, allows reviewers to be recognized more for their work and can aid the training of aspiring reviewers.

IOP Publishing is the first physics publisher to adopt both of these approaches portfolio wide. In this article we discuss how applying these methods has altered different elements of the publishing process. Early indicators show that there may be a marked difference in acceptance rates across regions.

URL : Transparency versus anonymity: which is better to eliminate bias in peer review?

DOI : http://doi.org/10.1629/uksg.584

Tragedy of the Data Commons

Accurate data is vital to enlightened research and policymaking, particularly publicly available data that is redacted to protect the identity of individuals.

Legal academics, however, are campaigning against data anonymization as a means to protect privacy, contending that the wealth of information available on the Internet enables malfeasors to reverse-engineer the data and identify the individuals within it.

Privacy scholars advocate for new legal restrictions on the collection and dissemination of research data. This Article challenges the dominant wisdom, arguing that properly de-identified data is not only safe, but of extraordinary social utility.

It makes three core claims. First, legal scholars have misinterpreted the relevant literature from computer science and statistics, and thus have significantly overstated the futility of anonymizing data. Second, the available evidence demonstrates that the risks from anonymized data are theoretical – they rarely, if ever, materialize. Finally, anonymized data is crucial to beneficial social research, and constitutes a public resource – a commons – under threat of depletion.

The Article concludes with a radical proposal: since current privacy policies overtax valuable research without reducing any realistic risks, law should provide a safe harbor for the dissemination of research data.

URL : http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1789749