Content Volatility of Scientific Topics in Wikipedia: A Cautionary Tale

Wikipedia has quickly become one of the most frequently accessed encyclopedic references, despite the ease with which content can be changed and the potential for ‘edit wars’ surrounding controversial topics. Little is known about how this potential for controversy affects the accuracy and stability of information on scientific topics, especially those with associated political controversy. Here we present an analysis of the Wikipedia edit histories for seven scientific articles and show that topics we consider politically but not scientifically “controversial” (such as evolution and global warming) experience more frequent edits with more words changed per day than pages we consider “noncontroversial” (such as the standard model in physics or heliocentrism).

For example, over the period we analyzed, the global warming page was edited on average (geometric mean ±SD) 1.9±2.7 times resulting in 110.9±10.3 words changed per day, while the standard model in physics was only edited 0.2±1.4 times resulting in 9.4±5.0 words changed per day. The high rate of change observed in these pages makes it difficult for experts to monitor accuracy and contribute time-consuming corrections, to the possible detriment of scientific accuracy. As our society turns to Wikipedia as a primary source of scientific information, it is vital we read it critically and with the understanding that the content is dynamic and vulnerable to vandalism and other shenanigans.
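The per-day figures above can be reproduced from an article's revision history. A minimal sketch in Python, assuming the daily edit counts have already been extracted, and using a log(x + 1) transform to take a geometric mean over counts that include zero-edit days (the paper's exact treatment of zeros may differ):

```python
import math

def geometric_mean(values):
    """Geometric mean via a log(x + 1) transform, so days with
    zero edits do not collapse the product to zero."""
    logs = [math.log(v + 1) for v in values]
    return math.exp(sum(logs) / len(logs)) - 1

# Hypothetical daily edit counts for one article over one week
daily_edits = [0, 3, 1, 5, 0, 2, 4]
print(round(geometric_mean(daily_edits), 2))
```

The same routine applied to daily words-changed counts yields the second statistic.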

URL : Content Volatility of Scientific Topics in Wikipedia: A Cautionary Tale

DOI : 10.1371/journal.pone.0134454

Open Access Indicators and Scholarly Communications in Latin America

This book is the result of a joint research and development project undertaken in 2013 by UNESCO in partnership with the Public Knowledge Project (PKP), the Scientific Electronic Library Online (SciELO), the Network of Scientific Journals of Latin America, the Caribbean, Spain and Portugal (RedALyC), Africa Journals Online (AJOL), the Latin American School of Social Sciences in Brazil (FLACSO-Brazil), and the Latin American Council of Social Sciences (CLACSO). It aims to contribute to the understanding of scholarly production, use and reach through measures that are open and inclusive. The book is divided into two sections.

The first section presents a narrative summary of Open Access in Latin America, including a description of the major regional initiatives that are collecting and systematizing data related to Open Access scholarship, and of the available data that can be used to understand the (i) growth, (ii) reach, and (iii) impact of Open Access in developing regions. It ends with recommendations for future activities. The second section includes in-depth case studies describing the indicators and methodologies of the peer-reviewed journal portals SciELO and Redalyc, along with a case study of a subject digital repository maintained by CLACSO.

URL : https://microblogging.infodocs.eu/wp-content/uploads/2015/08/alperin2014.pdf

Alternative location : http://hdl.handle.net/10760/25122

Retraction policies of top scientific journals ranked by impact factor

Objective

This study gathered information about the retraction policies of the top 200 scientific journals, ranked by impact factor.

Methods

Editors of the top 200 science journals for the year 2012 were contacted by email.

Results

One hundred forty-seven journals (74%) responded to a request for information. Of these, 95 (65%) had a retraction policy. Of journals with a retraction policy, 94% had a policy that allows the editors to retract articles without authors’ consent.

Conclusions

The majority of journals in this sample had a retraction policy, and almost all of them would retract an article without the authors’ permission.

URL : http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4511053/

Publishing Ethics and Predatory Practices: A Dilemma for All Stakeholders of Science Communication

Publishing scholarly articles in traditional and newly-launched journals is a responsible task, requiring diligence from authors, reviewers, editors, and publishers. The current generation of scientific authors has ample opportunities for publicizing their research. However, they have to selectively target journals and publish in compliance with the established norms of publishing ethics. Over the past few years, numerous illegitimate or predatory journals have emerged in most fields of science. By exploiting gold Open Access publishing, these journals paved the way for low-quality articles that threatened to change the landscape of evidence-based science.

Authors, reviewers, editors, established publishers, and learned associations should be informed about predatory publishing practices and contribute to the trustworthiness of scholarly publications. In line with this, there have been several attempts to distinguish legitimate from illegitimate journals: blacklisting unethical journals (Jeffrey Beall’s list), issuing statements on transparency and best publishing practices (the draft document from the Open Access Scholarly Publishers Association and other global organizations), and tightening the indexing criteria of the Directory of Open Access Journals. None of these measures alone has proved sufficient. All stakeholders of science communication should be aware of the multiple facets of unethical practices and publish well-checked and evidence-based articles.

You are invited to submit…

The academic community is under great pressure to publish. This pressure is compounded by high rejection rates at many journals. A more recent trend is for some journals to send invitations directly to researchers, inviting them to submit a manuscript. Many researchers find these invitations annoying and are unsure how best to respond to them. We collected electronic invitations to submit a manuscript to a journal between April 1, 2014, and March 31, 2015.

We analyzed their content and cross-tabulated them against the journals listed in Beall’s list of potential predatory journals. During this period, 311 invitations were received for 204 journals; the majority of the invitations (n = 244; 79%) came from journals in Beall’s list. The invitations arrived throughout the calendar year, and some journals sent up to six invitations. The majority of invitations claimed that the journal provided peer review (n = 179; 57.6%), although none mentioned an expedited review process. Similarly, more than half claimed the journal was open access (n = 186; 59.8%), and the majority included an unsubscribe link (n = 187; 60.1%). About half of the invitations came from biomedical journals (n = 179).
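The cross-tabulation described above reduces to set membership and counting. A minimal sketch, using hypothetical journal names and a tiny stand-in for Beall's list (the real list contained thousands of entries):

```python
from collections import Counter

# Hypothetical invitation log: (journal name, date received)
invitations = [
    ("Journal of Advanced Everything", "2014-05-02"),
    ("Journal of Advanced Everything", "2014-09-14"),
    ("Journal of Advanced Everything", "2015-01-20"),
    ("Respectable Quarterly", "2014-11-30"),
]

# Hypothetical stand-in for Beall's list of potential predatory journals
bealls_list = {"Journal of Advanced Everything"}

# Cross-tabulate: how many invitations came from listed journals?
flagged = sum(1 for journal, _ in invitations if journal in bealls_list)
print(f"{flagged}/{len(invitations)} invitations from listed journals "
      f"({flagged / len(invitations):.0%})")

# Repeat senders: which journal sent the most invitations?
per_journal = Counter(journal for journal, _ in invitations)
print(per_journal.most_common(1))
```

The same tally over per-journal counts is how a finding like "some journals sent up to six invitations" would be obtained.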

We discuss strategies researchers and institutions can consider to reduce the number of invitations received and strategies to handle those invitations that make it to the recipients’ inbox, thus helping to maintain the credibility and reputation of researchers and institutions.

URL : You are invited to submit…

Related URL : http://www.biomedcentral.com/1741-7015/13/180

Sizing the Problem of Improving Discovery and Access to NIH-Funded Data: A Preliminary Study

Objective

This study informs efforts to improve the discoverability of and access to biomedical datasets by providing a preliminary estimate of the number and type of datasets generated annually by research funded by the U.S. National Institutes of Health (NIH). It focuses on those datasets that are “invisible” or not deposited in a known repository.

Methods

We analyzed NIH-funded journal articles that were published in 2011, cited in PubMed and deposited in PubMed Central (PMC) to identify those that indicate data were submitted to a known repository. After excluding those articles, we analyzed a random sample of the remaining articles to estimate how many and what types of invisible datasets were used in each article.

Results

About 12% of the articles explicitly mention deposition of datasets in recognized repositories, leaving 88% whose datasets are invisible. Among articles with invisible datasets, we found an average of 2.9 to 3.4 datasets per article, suggesting that approximately 200,000 to 235,000 invisible datasets were generated from NIH-funded research published in 2011. Approximately 87% of the invisible datasets consist of data newly collected for the research reported; 13% reflect reuse of existing data. More than 50% of the datasets were derived from live human or non-human animal subjects.
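The extrapolation behind the 200,000 to 235,000 range is a simple product of three quantities. A sketch with a hypothetical article count (this summary does not report the underlying number of NIH-funded 2011 articles; 78,500 is an assumed figure chosen only to be consistent with the reported range):

```python
# Hypothetical inputs; only the percentages and the resulting range are
# reported above, so the article count here is an assumption.
nih_articles_2011 = 78_500        # assumed NIH-funded articles in 2011
invisible_share = 0.88            # 88% of articles name no repository
datasets_per_article = (2.9, 3.4) # reported average range per article

low, high = (nih_articles_2011 * invisible_share * d
             for d in datasets_per_article)
print(f"{low:,.0f} to {high:,.0f} invisible datasets")
```

With these inputs the product lands at roughly 200,000 to 235,000, matching the reported estimate.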

Conclusion

In addition to providing a rough estimate of the total number of datasets produced per year by NIH-funded researchers, this study identifies additional issues that must be addressed to improve the discoverability of and access to biomedical research data: the definition of a “dataset,” determination of which (if any) data are valuable for archiving and preservation, and better methods for estimating the number of datasets of interest. Lack of consensus amongst annotators about the number of datasets in a given article reinforces the need for a principled way of thinking about how to identify and characterize biomedical datasets.

URL : Sizing the Problem of Improving Discovery and Access to NIH-Funded Data: A Preliminary Study

DOI : 10.1371/journal.pone.0132735

Identifying Open Access Articles within the Top Ten Closed Access LIS Journals: A Global Perspective

Librarians have embraced the open access movement. They work to raise awareness of issues surrounding scholarly communication, to educate faculty about authors’ rights, and to help implement and maintain institutional repositories (IRs). But for all of the research and commentary from librarians about the importance of IRs and of making research freely available, there still exists the glaring contradiction that few librarians and Library and Information Science (LIS) authors provide free access to their own research publications.

In this study, we will look at the open access availability of articles from the top 20 closed access LIS journals and discuss some factors that may explain the discrepancies between LIS authors’ attitudes towards open access and their own self-archiving practices.

URL : http://digitalcommons.unl.edu/libphilprac/1245/