Toward a new model of scientific publishing: discussion and a proposal :

“The current system of publishing in the biological sciences is notable for its redundancy, inconsistency, sluggishness, and opacity. These problems persist, and grow worse, because the peer review system remains focused on deciding whether or not to publish a paper in a particular journal rather than providing (1) a high-quality evaluation of scientific merit and (2) the information necessary to organize and prioritize the literature. Online access has eliminated the need for journals as distribution channels, so their primary current role is to provide authors with feedback prior to publication and a quick way for other researchers to prioritize the literature based on which journal publishes a paper. However, the feedback provided by reviewers is not focused on scientific merit but on whether to publish in a particular journal, which is generally of little use to authors and an opaque and noisy basis for prioritizing the literature. Further, each submission of a rejected manuscript requires the entire machinery of peer review to creak to life anew. This redundancy incurs delays, inconsistency, and increased burdens on authors, reviewers, and editors. Finally, reviewers have no real incentive to review well or quickly, as their performance is not tracked, let alone rewarded. One of the consistent suggestions for modifying the current peer review system is the introduction of some form of post-publication reception, and the development of a marketplace where the priority of a paper rises and falls based on its reception from the field (see other articles in this special topics). However, the information that accompanies a paper into the marketplace is as important as the marketplace’s mechanics. 
Beyond suggestions concerning the mechanisms of reception, we propose an update to the system of publishing in which publication is guaranteed, but pre-publication peer review still occurs, giving the authors the opportunity to revise their work following a mini pre-reception from the field. This step also provides a consistent set of rankings and reviews to the marketplace, allowing for early prioritization and stabilizing its early dynamics. We further propose to improve the general quality of reviewing by providing tangible rewards to those who do it well.”
URL : http://www.frontiersin.org/computational_neuroscience/10.3389/fncom.2011.00055/full

Citation and Peer Review of Data: Moving Towards Formal Data Publication

“This paper discusses many of the issues associated with formally publishing data in academia, focusing primarily on the structures that need to be put in place for peer review and formal citation of datasets. Data publication is becoming increasingly important to the scientific community, as it will provide a mechanism for those who create data to receive academic credit for their work and will allow the conclusions arising from an analysis to be more readily verifiable, thus promoting transparency in the scientific process. Peer review of data will also provide a mechanism for ensuring the quality of datasets, and we provide suggestions on the types of activities one expects to see in the peer review of data. A simple taxonomy of data publication methodologies is presented and evaluated, and the paper concludes with a discussion of dataset granularity, transience and semantics, along with a recommended human-readable citation syntax.”

URL : http://www.ijdc.net/index.php/ijdc/article/view/181

Wikis in scholarly publishing :

“Scientific research is a process concerned with the creation, collective accumulation, contextualization, updating and maintenance of knowledge. Wikis provide an environment that makes it possible to collectively accumulate, contextualize, update and maintain knowledge in a coherent and transparent fashion. Here, we examine the potential of wikis as platforms for scholarly publishing. In the hope of stimulating further discussion, the article itself was drafted on Species ID – http://species-id.net – a wiki that hosts a prototype for wiki-based scholarly publishing, where it can be updated, expanded or otherwise improved.”

URL : http://iospress.metapress.com/content/q42617538838t6j2/

Science and Technology Committee – Eighth Report : Peer review in scientific publications :

“Peer review in scholarly publishing, in one form or another, has always been regarded as crucial to the reputation and reliability of scientific research. In recent years there have been an increasing number of reports and articles assessing the current state of peer review. In view of the importance of evidence-based scientific information to government, it seemed appropriate to undertake a detailed examination of the current peer-review system as used in scientific publications, both to see whether it is operating effectively and to shine light on new and innovative approaches. We also explored some of the broader issues around research impact, publication ethics and research integrity.

We found that despite the many criticisms and the little solid evidence on the efficacy of pre-publication editorial peer review, it is considered by many as important and not something that can be dispensed with. There are, however, many ways in which current pre-publication peer-review practices can and should be improved and optimised, although we recognise that different types of peer review are suitable to different disciplines and research communities. Innovative approaches—such as the use of pre-print servers, open peer review, increased transparency and online repository-style journals—should be explored by publishers, in consultation with their journals and taking into account the requirements of their research communities. Some of these new approaches may help to reduce the necessary burden on researchers, and also help accelerate the pace of publication of research. We encourage greater recognition of the work carried out by reviewers, by both publishers and employers. All publishers need to have in place systems for recording and acknowledging the contribution of those involved in peer review.

Publishers also have a responsibility to ensure that the people involved in the peer-review process are adequately trained for the role that they play. Training for editors, authors and reviewers varies across the publishing sector and across different research institutions. We encourage publishers to work together to develop standards—which could be applied across the industry—to ensure that all editors, whether staff or academic, are fully equipped for the job that they do. Furthermore, we consider that all early-career researchers should be given the option for training in peer review; responsibility for this lies primarily with the funders of research.

Funders of research have an interest in ensuring that the work they fund is both scientifically sound and reproducible. We consider that it should be a fundamental aim of the peer-review process that all publications are scientifically sound. Reproducibility should be the gold standard that all peer reviewers and editors aim for when assessing whether a manuscript has supplied sufficient information to allow others to repeat and build on the experiments. As such, the presumption must be that, unless there is a strong reason otherwise, data should be fully disclosed and made publicly available. In line with this principle, data associated with all publicly funded research should, where possible, be made widely and freely available. The work of researchers who expend time and effort adding value to their data, to make it usable by others, should be acknowledged and encouraged.

While pre-publication peer review (the first records of which date back to the 17th century) continues to play an important role in ensuring that the scientific record is sound, the growth of post-publication peer review and commentary represents an enormous opportunity for experimentation with new media and social networking tools. Online communications allow the widespread sharing of links to articles, ensuring that interesting research is spread across the world, facilitating rapid commentary and review by the global audience. They also have a valuable role to play in alerting the community to potential deficiencies and problems with published work. We encourage the prudent use of online tools for post-publication review and commentary as a means of supplementing pre-publication review.

On the subject of impact, it was clear to us that the publication of peer-reviewed articles, particularly those that are published in journals with high Impact Factors, has a direct effect on the careers of researchers and the reputations of research institutions. Assessing the impact or perceived importance of research before it is published requires subjective judgement. We therefore have concerns about the use of journal Impact Factor as a proxy measure for the quality of individual articles. While we have been assured by research funders that they do not use this as a proxy measure for the quality of research or of individual articles, representatives of research institutions have suggested that publication in a high-impact journal is still an important consideration when assessing individuals for career progression. We consider that research institutions should be cautious about this approach as there is an element of chance in getting articles accepted in such journals. We have heard in the course of this inquiry that there is no substitute for reading the article itself in assessing the worth of a piece of research.

Finally, we found that the integrity of the peer-review process can only ever be as robust as the integrity of the people involved. Ethical and scientific misconduct—such as in the Wakefield case—damages peer review and science as a whole. Although it is not the role of peer review to police research integrity and identify fraud or misconduct, it does, on occasion, identify suspicious cases. While there is guidance in place for journal editors when ethical misconduct is suspected, we found the general oversight of research integrity in the UK to be unsatisfactory. We note that the UK Research Integrity Futures Working Group report recently made sensible recommendations about the way forward for research integrity in the UK, which have not been adopted. We recommend that the Government revisit the recommendation that the UK should have an oversight body for research integrity that provides “advice and support to research employers and assurance to research funders”, across all disciplines. Furthermore, while employers must take responsibility for the integrity of their employees’ research, we recommend that there be an external regulator overseeing research integrity. We also recommend that all UK research institutions have a specific member of staff leading on research integrity.”

URL : http://www.publications.parliament.uk/pa/cm201012/cmselect/cmsctech/856/85602.htm

Peer Review in Academic Promotion and Publishing: Its Meaning, Locus, and Future :

“Since 2005, and with generous support from the A.W. Mellon Foundation, The Future of Scholarly Communication Project at UC Berkeley’s Center for Studies in Higher Education (CSHE) has been exploring how academic values—including those related to peer review, publishing, sharing, and collaboration—influence scholarly communication practices and engagement with new technological affordances, open access publishing, and the public good.

The current phase of the project focuses on peer review in the Academy; this deeper look at peer review is a natural extension of our findings in Assessing the Future Landscape of Scholarly Communication: An Exploration of Faculty Values and Needs in Seven Disciplines (Harley et al. 2010), which stressed the need for a more nuanced academic reward system that is less dependent on citation metrics, the slavish adherence to marquee journals and university presses, and the growing tendency of institutions to outsource assessment of scholarship to such proxies as default promotion criteria.

This investigation is made urgent by a host of new challenges facing institutional peer review, such as assessing interdisciplinary scholarship, hybrid disciplines, the development of new online forms of edition making and collaborative curation for community resource use, heavily computational subdisciplines, large-scale collaborations around grand challenge questions, an increase in multiple authorship, a growing flood of low-quality publications, and the call by governments, funding bodies, universities, and individuals for the open access publication of taxpayer-subsidized research, including original data sets.

The challenges of assessing the current and future state of peer review are exacerbated by pressing questions of how the significant costs of high-quality scholarly publishing can be borne in the face of calls for alternative, usually university-based and open access, publishing models for both journals and books.

There is additionally the insidious and destructive “trickle down” of tenure and promotion requirements from elite research universities to less competitive and non-research-intensive institutions.

The entire system is further stressed by the mounting—and often unrealistic—government pressure on scholars in developed and emerging economies alike to publish their research in the most select peer-reviewed outlets, ostensibly to determine the distribution of government funds (via research assessment exercises) and/or to meet national imperatives to achieve research distinction internationally.

The global effect is a growing glut of low-quality publications that strains the efficient and effective practice of peer review, a practice that is, itself, primarily subsidized by universities in the form of faculty salaries. Library budgets and preservation services have not kept pace with this expansion of peer-reviewed publication, and faculty time spent on peer review, in all of its guises, is being exhausted.

As part of our ongoing research, CSHE hosted two meetings to address the relationship between peer review in publication and that carried out for tenure and promotion. Our discussions included: The Dominant System of Peer Review: Types, Standards, Uses, Abuses, and Costs; A Very Tangled Web: Alternatives to the Current System of Peer Review; Creating New Models: The Role of Societies, Presses, Libraries, Information Technology Organizations, Commercial Publishers, and Other Stakeholders; and Open Access “Mandates” and Resolutions versus Developing New Models.

This report includes (1) an overview of the state of peer review in the Academy at large, (2) a set of recommendations for moving forward, (3) a proposed research agenda to examine in depth the effects of academic status-seeking on the entire academic enterprise, (4) proceedings from the workshop on the four topics noted above, and (5) four substantial and broadly conceived background papers on the workshop topics, with associated literature reviews.

The document explores, in particular, the tightly intertwined phenomena of peer review in publication and academic promotion, the values and associated costs to the Academy of the current system, experimental forms of peer review in various disciplinary areas, the effects of scholarly practices on the publishing system, and the possibilities and real costs of creating alternative loci for peer review and publishing that link scholarly societies, libraries, institutional repositories, and university presses.

We also explore the motivations and ingredients of successful open access resolutions that are directed at peer-reviewed article-length material. In doing so, this report suggests that creating a wider array of institutionally acceptable and cost-effective alternatives to peer reviewing and publishing scholarly work could maintain the quality of academic peer review, support greater research productivity, reduce the explosive growth of low-quality publications, increase the purchasing power of cash-strapped libraries, better support the free flow and preservation of ideas, and relieve the burden on overtaxed faculty of conducting too much peer review.”

URL : http://cshe.berkeley.edu/publications/publications.php?id=379

Towards Scholarly Communication 2.0: Peer-to-Peer Review & Ranking in Open Access Preprint Repositories :

“In this paper we present our unified peer-to-peer review model for Open Access preprint repositories. Its objective is to improve the efficiency and effectiveness of digital scholarly communication. The key elements of this model are standardized quality assessment instruments, public and private communication channels, special rankings and novel incentives. The model allows scholars to proficiently evaluate both the manuscripts and their peer reviews. These scrutinized manuscripts and peer reviews will then be made available to the relevant parties. These standardized quality assessments allow for new quality metrics for papers and peer reviews. The Reviewer Impact, which represents the peer review proficiency and peer review output of scholars, is one such metric. The model includes diverse rankings for scholars to appear in to receive better odds of having their own manuscripts noticed, read, peer reviewed and cited. Their specific ranking is proportional to their Reviewer Impact and the overall quality of their manuscripts. The Open Access preprint repository model is a suitable foundation for our model because of its high degree of accessibility, but little to no certification of its deposited manuscripts. With this combination we envision a novel, Open Access, peer-to-peer scholarly communication model that functions independently of, but not incompatibly with, the traditional journal publishing model: Scholarly Communication 2.0.”

URL : http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1681478

Combining peer review and metrics to assess journals for inclusion in Scopus :

“Peer review has been in place for centuries as an accepted process to validate manuscripts submitted for publication in scientific journals. Yet a similarly rigorous assessment of content also happens a level up, when looking at the quality of journals that apply for indexing in bibliographic databases. Scopus, an abstract & citation database provided by Elsevier that indexes 18,000 scientific titles, is receiving an increasing number of title suggestions – almost 5,000 in 2009 alone. Some of the suggested journals are dedicated to niche areas and/or are published in languages other than English. To ensure a fair and transparent evaluation of these titles and to address the rising interest in being indexed, Scopus redesigned its entire title evaluation process – basing it on a metrical scorecard and on the principles of peer review. By developing an online editorial system – the Scopus Title Evaluation Platform (STEP) – Scopus also created the prerequisite of an improved communication with publishers and editors about their journals.”

URL : http://www.ingentaconnect.com/content/alpsp/lp/2010/00000023/00000004/art00011