From Peer-Reviewed to Peer-Reproduced in Scholarly Publishing: The Complementary Roles of Data Models and Workflows in Bioinformatics

Motivation

Reproducing the results from a scientific paper can be challenging due to the absence of data and the computational tools required for their analysis. In addition, details relating to the procedures used to obtain the published results can be difficult to discern due to the use of natural language when reporting how experiments have been performed.

The Investigation/Study/Assay (ISA), Nanopublications (NP), and Research Objects (RO) models are conceptual data modelling frameworks that can structure such information from scientific papers. Computational workflow platforms can also be used to reproduce analyses of data in a principled manner. We assessed the extent to which ISA, NP, and RO models, together with the Galaxy workflow system, can capture the experimental processes and reproduce the findings of a previously published paper reporting on the development of SOAPdenovo2, a de novo genome assembler.
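
As an illustration of the nanopublication (NP) model named above, the sketch below builds a single nanopublication as named RDF graphs (head, assertion, provenance, publication info). It is a minimal sketch assuming the Python rdflib library; every URI except the paper's DOI is an invented placeholder, not content taken from the paper.

```python
# Minimal nanopublication sketch using rdflib.
# All example.org URIs and the assertion triple are hypothetical placeholders.
from rdflib import Dataset, Namespace, URIRef, Literal
from rdflib.namespace import RDF, XSD

NP = Namespace("http://www.nanopub.org/nschema#")
PROV = Namespace("http://www.w3.org/ns/prov#")
EX = Namespace("http://example.org/np1#")  # hypothetical nanopublication base

ds = Dataset()

# Assertion graph: the scientific claim itself (placeholder triple).
assertion = ds.graph(EX.assertion)
assertion.add((EX.soapdenovo2, EX.assembles, EX.YH_genome))

# Provenance graph: where the assertion came from (here, the paper's DOI).
provenance = ds.graph(EX.provenance)
provenance.add((EX.assertion, PROV.wasDerivedFrom,
                URIRef("https://doi.org/10.1371/journal.pone.0127612")))

# Publication info graph: metadata about the nanopublication itself.
pubinfo = ds.graph(EX.pubinfo)
pubinfo.add((EX.np1, PROV.generatedAtTime,
             Literal("2015-01-01T00:00:00", datatype=XSD.dateTime)))

# Head graph ties the three parts together as one nanopublication.
head = ds.graph(EX.head)
head.add((EX.np1, RDF.type, NP.Nanopublication))
head.add((EX.np1, NP.hasAssertion, EX.assertion))
head.add((EX.np1, NP.hasProvenance, EX.provenance))
head.add((EX.np1, NP.hasPublicationInfo, EX.pubinfo))

print(ds.serialize(format="trig"))
```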

Results

Executable workflows were developed using Galaxy, which reproduced results that were consistent with the published findings. A structured representation of the information in the SOAPdenovo2 paper was produced by combining the use of ISA, NP, and RO models. By structuring the information in the published paper using these data and scientific workflow modelling frameworks, it was possible to explicitly declare elements of experimental design, variables, and findings.
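
Executable Galaxy workflows of this kind are typically re-run against a Galaxy server through its API; the sketch below shows one way to do that with the BioBlend client library. The server URL, API key, workflow ID, and input file name are hypothetical placeholders, not details taken from the paper.

```python
# Driving a Galaxy workflow programmatically with BioBlend (sketch).
from bioblend.galaxy import GalaxyInstance

# Hypothetical server and API key.
gi = GalaxyInstance(url="https://usegalaxy.org", key="YOUR_API_KEY")

# Create a history to hold the inputs and outputs of the run.
history = gi.histories.create_history(name="SOAPdenovo2 reproduction")

# Upload an input dataset (e.g. sequencing reads) into that history.
upload = gi.tools.upload_file("reads.fastq", history["id"])
dataset_id = upload["outputs"][0]["id"]

# Invoke a previously imported workflow, mapping the uploaded dataset
# to its first input step.
invocation = gi.workflows.invoke_workflow(
    workflow_id="WORKFLOW_ID",
    inputs={"0": {"src": "hda", "id": dataset_id}},
    history_id=history["id"],
)
print("Invocation started:", invocation["id"])
```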

The models served as guides in the curation of scientific information, and this led to the identification of inconsistencies in the original published paper, thereby allowing its authors to publish corrections in the form of an erratum.

DOI : 10.1371/journal.pone.0127612

Hybrid Review: Taking SoTL Beyond Traditional Peer Review for Journal Publication

“Developments in emergent technology offer innovative solutions for facilitating a hybrid review process; we examine a combination of private peer review and open public review uniquely relevant for disseminating the scholarship of teaching and learning (SoTL) through the development of the Journal of Instructional Research (JIR). An analysis of the hybrid review process (combining the strengths of traditional peer review with an integrative public review process) revealed substantial reviewer participation that contributed to a well-rounded, engaged review process. Public review feedback constructively addressed the value and relevance of the implications, methodology, content, and written quality of the manuscripts; an additional layer of private peer review further refined the manuscripts to determine suitability for publication. This, in turn, created a space where refinement of content, structure, and design of SoTL research was achieved through an interactive process of scholarly inquiry and dialogue.”

DOI: http://dx.doi.org/10.3998/3336451.0018.202

Peer review: still king in the digital age

“The article presents one of the main findings of an international study of 4,000 academic researchers that examined how trustworthiness is determined in the digital environment when it comes to scholarly reading, citing, and publishing. The study shows that peer review is still the most trustworthy characteristic of all. There is, though, a common perception that open access journals are not peer reviewed or do not have proper peer-review systems. Researchers appear to have moved inexorably from a print-based system to a digital system, but this has not significantly changed the way they decide what to trust. They do not trust social media. Only a minority (significantly, mostly young and early career researchers) thought that social media are anything more than tools for personal interaction, peripheral to their professional/academic lives. There are other significant differences according to the age of the researcher. Thus, when choosing an outlet for publication of their work, young researchers are much less concerned with whether it is peer reviewed.”

URL : http://ciber-research.eu/download/20140120-Peer_review-Learned_Publishing_2015.pdf

Data without Peer: Examples of Data Peer Review in the Earth Sciences

“Peer review of data is an important process if data is to take its place as a first-class research output. Much has been written about the theoretical aspects of peer review, but not as much about the actual process of doing it. This paper takes an experimental view: it selects seven datasets, all from the Earth Sciences and with DOIs from DataCite, and attempts to review them, with varying levels of success. Key issues identified from these case studies include the necessity of human-readable metadata, the accessibility of datasets, and the permanence of links to, and accessibility of, metadata stored in other locations.”
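
Parts of such a data review can be scripted; the sketch below checks two of the issues raised, namely whether a dataset DOI resolves and whether its metadata is retrievable from DataCite. It is a minimal sketch assuming the Python requests library and a placeholder DOI, not a review procedure taken from the paper.

```python
# Automated checks for one step of data peer review (sketch).
import requests

doi = "10.5281/zenodo.0000000"  # hypothetical dataset DOI

# 1. Does the DOI resolve? Follow redirects from the DOI proxy.
landing = requests.get(f"https://doi.org/{doi}", allow_redirects=True, timeout=30)
print("DOI resolves:", landing.ok, "->", landing.url)

# 2. Is machine-readable metadata available from the DataCite REST API?
meta = requests.get(f"https://api.datacite.org/dois/{doi}", timeout=30)
if meta.ok:
    attrs = meta.json()["data"]["attributes"]
    print("Title:", attrs["titles"][0]["title"])
    print("Publisher:", attrs.get("publisher"))
else:
    print("No DataCite metadata found (HTTP", meta.status_code, ")")
```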

URL : http://www.dlib.org/dlib/january15/callaghan/01callaghan.html

Open Science and peer-review in the humanities

“The purpose of this paper is to consider alternatives to the traditional system of peer review. I will argue that new methods of review should be more in accordance with the principles of Open Science. Current modes of carrying out peer review function as barriers against more transparent ways of doing research. I will focus on peer reviewing as it is done in the humanities. These disciplines seem to be clinging particularly tightly to traditional ways of publishing and doing peer review. After looking at traditional peer review and the troubles related to it, I will discuss alternative ways of reviewing scholarly material. The anonymity of reviewers and authors, the appropriate time to make papers public, and how to reward reviewers are topics of importance in this context.”

URL : http://eprints.rclis.org/24136/

Exposing the predators. Methods to stop predatory journals

“The internet is greatly improving the impact of scholarly journals, but it also poses new threats to their quality. Publishers have arisen that abuse the Gold Open Access model, in which the author pays a fee to have an article published, to make money with so-called predatory journals. These publishers falsely claim to conduct peer review, which makes them more prone to publishing fraudulent and plagiarised research. This thesis looks at three possible methods to stop predatory journals: blacklists and whitelists, open peer review systems, and new metrics. Blacklists and whitelists set out rules and regulations that credible publishers and journals should follow. Open peer review systems should make it harder for predatory publishers to make false claims about their peer review process. Metrics should measure more aspects of research impact and become less liable to gaming. The question is which of these three methods is the best candidate to stop predatory journals. As all three methods have their drawbacks, especially for new but high-quality journals, none of them can stop predatory journals on its own. Rather, we need a system in which researchers, publishers, and reviewers communicate more openly about the research they create, disseminate, and read. But above all, we need to find a way to take away the incentives for researchers and publishers to engage in fraudulent practices.”

URL : http://hdl.handle.net/1887/28943

Why Do We Still Have Journals?

“The Web has greatly reduced the barriers to entry for new journals and other platforms for communicating scientific output, and the number of journals continues to multiply. This leaves readers and authors with the daunting cognitive challenge of navigating the literature and discerning contributions that are both relevant and significant. Meanwhile, measures of journal impact that might guide the use of the literature have become more visible and consequential, leading to “impact gamesmanship” that renders the measures increasingly suspect. The incentive system created by our journals is broken. In this essay, I argue that the core technology of journals is not their distribution but their review process. The organization of the review process reflects assumptions about what a contribution is and how it should be evaluated. Through their review processes, journals can certify contributions, convene scholarly communities, and curate works that are worth reading. Different review processes thereby create incentives for different kinds of work. It’s time for a broader dialogue about how we connect the aims of the social science enterprise to our system of journals.”

URL : http://asq.sagepub.com/content/59/2/193.full