Open Source Archaeology: Ethics and Practice

‘Open Source Archaeology: Ethics and Practice’ brings together authors and researchers in the field of open-source archaeology, defined as encompassing the ethical imperative for open public access to the results of publicly-funded research; practical solutions to open-data projects; open-source software applications in archaeology; public information sharing projects in archaeology; open-GIS; and the open-context system of data management and sharing.

This edited volume is designed to discuss important issues around open access to data and software in academic and commercial archaeology, as well as to summarise both the current state of theoretical engagement and the state of technological development in the field of open archaeology.

URL : http://www.degruyter.com/view/product/460080

Open Access Meets Discoverability: Citations to Articles Posted to Academia.edu

Using matching and regression analyses, we measure the difference in citations between articles posted to Academia.edu and other articles from similar journals, controlling for field, impact factor, and other variables. Based on a sample size of 31,216 papers, we find that a paper in a median impact factor journal uploaded to Academia.edu receives 16% more citations after one year than a similar article not available online, 51% more citations after three years, and 69% after five years. Articles posted to Academia.edu also received 58% more citations after five years than articles posted only to other online venues, such as personal and departmental home pages.
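As a rough, illustrative sketch of the kind of regression the abstract describes (not the authors' actual matching-plus-regression specification), the snippet below fits log citation counts on log journal impact factor plus an indicator for whether an article was posted online; the data are synthetic and the variable names are assumptions.

```python
# Illustrative sketch only: synthetic data and a simplified model form,
# not the study's actual specification.
import numpy as np

rng = np.random.default_rng(0)
n = 1000

impact_factor = rng.lognormal(mean=0.8, sigma=0.5, size=n)   # journal impact factor
posted = rng.integers(0, 2, size=n)                          # 1 = posted to Academia.edu
# Synthetic citation counts with a multiplicative "posting" effect
log_citations = (0.5 + 0.9 * np.log(impact_factor) + 0.15 * posted
                 + rng.normal(0.0, 0.4, size=n))

# OLS of log citations on log impact factor and the posting indicator:
# exp(beta_posted) - 1 approximates the percentage citation advantage.
X = np.column_stack([np.ones(n), np.log(impact_factor), posted])
beta, *_ = np.linalg.lstsq(X, log_citations, rcond=None)
print(f"Estimated citation advantage: {np.exp(beta[2]) - 1:.1%}")
```

The study additionally controls for field and compares matched samples, so this sketch only conveys the general shape of the analysis.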

URL : Open Access Meets Discoverability: Citations to Articles Posted to Academia.edu

DOI : 10.1371/journal.pone.0148257

Research data explored: an extended analysis of citations and altmetrics

In this study, we explore the citedness of research data, its distribution over time and its relation to the availability of a digital object identifier (DOI) in the Thomson Reuters database Data Citation Index (DCI).

We investigate if cited research data “impacts” the (social) web, reflected by altmetrics scores, and if there is any relationship between the number of citations and the sum of altmetrics scores from various social media platforms.

Three tools are used to collect altmetrics scores, namely PlumX, ImpactStory, and Altmetric.com, and the corresponding results are compared. We found that out of the three altmetrics tools, PlumX has the best coverage. Our experiments revealed that research data remain mostly uncited (about 85 %), although there has been an increase in citing data sets published since 2008.

The percentage of cited research data with a DOI in the DCI has decreased in recent years. Only nine repositories account for research data with DOIs and two or more citations. The number of cited research data with altmetrics “footprints” is even lower (4–9 %) but shows higher coverage of research data from the last decade. In our study, we also found no correlation between the number of citations and the total number of altmetrics scores.

Yet, certain data types (i.e. survey, aggregate data, and sequence data) are more often cited and also receive higher altmetrics scores. Additionally, we performed citation and altmetric analyses of all research data published between 2011 and 2013 in four different disciplines covered by the DCI.

In general, these results correspond very well with the ones obtained for research data cited at least twice and also show low numbers in citations and in altmetrics. Finally, we observed that there are disciplinary differences in the availability and extent of altmetrics scores.
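The correlation finding above can be pictured with a simple rank-correlation check between citation counts and summed altmetrics scores; the sketch below uses synthetic counts and an assumed sample size, not the DCI data or the study's exact method.

```python
# Illustrative sketch: Spearman rank correlation between citation counts and
# summed altmetrics scores for a set of data sets. All numbers are synthetic.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(1)
citations = rng.poisson(lam=2, size=500)           # citations per data set
altmetrics_total = rng.poisson(lam=1, size=500)    # summed scores across platforms

rho, p_value = spearmanr(citations, altmetrics_total)
print(f"Spearman rho = {rho:.3f}, p = {p_value:.3f}")
```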

URL : http://link.springer.com/article/10.1007/s11192-016-1887-4

Meaningful Metrics: A 21st Century Librarian’s Guide to Bibliometrics, Altmetrics, and Research Impact

What does it mean to have meaningful metrics in today’s complex higher education landscape? With a foreword by Heather Piwowar and Jason Priem, this highly engaging and activity-laden book serves to introduce readers to the fast-paced world of research metrics from the unique perspective of academic librarians and LIS practitioners.

Starting with the essential histories of bibliometrics and altmetrics, and continuing with in-depth descriptions of the core tools and emerging issues at stake in the future of both fields, Meaningful Metrics is a convenient all-in-one resource designed to be used by a range of readers, from those with little to no background on the subject to those looking to become movers and shakers in the current scholarly metrics movement. Authors Borchardt and Roemer offer tips, tricks, and real-world examples that illustrate how librarians can support the successful adoption of research metrics, whether at their institutions or across academia as a whole.

URL : http://www.ala.org/acrl/sites/ala.org.acrl/files/content/publications/booksanddigitalresources/digital/9780838987568_metrics_OA.pdf

Promoting Open Access and Innovation: From Synergies to Le Centre de Recherche Interuniversitaire sur les Humanités Numériques

This article discusses the relationship between digital humanities and disciplinary boundaries in the last decade, primarily in the context of the national project Synergies.

It first offers an overview of Synergies as a concrete example of the way technological change affects the very notion of disciplines through the attempt to create a platform that was interdisciplinary by nature, and then discusses the creation of a new digital humanities centre in Québec, Le Centre de recherche interuniversitaire sur les humanités numériques, and the ways it was conceived as encompassing a range of disciplinary approaches.

URL : Promoting Open Access and Innovation: From Synergies to Le Centre de Recherche Interuniversitaire sur les Humanités Numériques

Alternative location : http://src-online.ca/index.php/src/article/view/214/457

OA in the Library Collection: The Challenges of Identifying and Maintaining Open Access Resources

While librarians, researchers, and the general public have embraced the concept of Open Access (OA), librarians still have a difficult time managing OA resources. To find out why, Bulock and Hosburgh surveyed librarians about their experiences managing OA resources and the strengths and weaknesses of management systems.

At this session, they shared survey results, reflected on OA workflows at their own libraries, and updated audience members on relevant standards and initiatives. Survey respondents reported challenges related to hybrid OA, inaccurate metadata, and inconsistent communication along the serials supply chain. Recommended solutions included the creation of consistent, centralized article-level metadata and the development of OA collection development principles for libraries.

URL : http://scholarship.rollins.edu/as_facpub/136/

Improving the peer-review process and editorial quality: key errors escaping the review and editorial process in top scientific journals

We apply a novel mistake index to assess trends in the proportion of corrections published between 1993 and 2014 in Nature, Science and PNAS. The index revealed a progressive increase in the proportion of corrections published in these three high-quality journals.

The index appears to be independent of the journal impact factor and the number of items published, as suggested by a comparative analysis of 16 top scientific journals of different impact factors and disciplines. A more detailed analysis shows that time-to-correction increased significantly over time and also differed among journals (Nature, 233 days; Science, 136 days; PNAS, 232 days).

A detailed review of 1,428 errors showed that 60% of corrections were related to figures, authors, references or results. Across the three severity categories established, 34.7% of corrections were considered mild, 47.7% moderate, and 17.6% severe, with the distribution also differing among journals. Errors occurring during the printing process were responsible for 5% of corrections in Nature, 3% in Science and 18% in PNAS.

Measuring temporal trends in the quality of scientific manuscripts can help editors and reviewers identify the most common mistakes, increasing the rigor of peer review and improving the quality of published scientific manuscripts.
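As a loose illustration of a "mistake index" of the kind described above, one could compute, per year, the proportion of published items that later received a correction and then look for a trend; the counts below are invented and the index definition is an assumption, not the authors' exact formula.

```python
# Illustrative sketch of a yearly "mistake index" (corrections / items published)
# and a simple linear trend over 1993-2014. All counts are synthetic.
import numpy as np

years = np.arange(1993, 2015)
items_published = np.full(years.size, 800)                          # assumed items per year
corrections = np.round(np.linspace(8, 40, years.size)).astype(int)  # assumed corrections per year

mistake_index = corrections / items_published

# Positive slope indicates a rising proportion of corrections over time.
slope, intercept = np.polyfit(years, mistake_index, deg=1)
print(f"Trend: {slope:.5f} per year; index in {years[-1]} = {mistake_index[-1]:.3f}")
```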

URL : Improving the peer-review process and editorial quality: key errors escaping the review and editorial process in top scientific journals

DOI : https://doi.org/10.7717/peerj.1670