DuEPublicA: Automated bibliometric reports based on the University Bibliography and external citation data

Author : Eike T. Spielberg

This paper describes a web application to generate bibliometric reports based on the University Bibliography and the Scopus citation database. Our goal is to offer an alternative to the easy-to-prepare automated reports available from commercial sources.

These often suffer from incomplete coverage of publication types and from difficulties in attributing publications to people, institutes, and universities. Using our University Bibliography as the source for selecting relevant publications solves both problems.

As a local system, set up and maintained by the library, it can include every publication type we want. Because the University Bibliography is linked to the identity management system of the university, publications can easily be selected for individual people, institutes, and the whole university.

The program is designed as a web application, which collects publications from the University Bibliography, enriches them with citation data from Scopus and performs three kinds of analyses:
1. A general analysis (number and type of publications, publications per year, etc.),
2. A citation analysis (average citations per publication, h-index, uncitedness; see the sketch after this list), and
3. An affiliation analysis (home and partner institutions).
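
As a rough illustration of the citation analysis (a minimal sketch with invented example data, not the DuEPublicA code itself), the three numbers in item 2 can be computed from per-publication citation counts of the kind Scopus returns:

import java.util.Arrays;

public class CitationAnalysis {

    public static void main(String[] args) {
        // Hypothetical per-publication citation counts fetched from Scopus.
        int[] citations = {12, 7, 7, 3, 1, 0, 0};

        // Average citations per publication.
        double average = Arrays.stream(citations).average().orElse(0.0);

        // h-index: the largest h such that h publications have >= h citations.
        Integer[] desc = Arrays.stream(citations).boxed()
                .sorted((a, b) -> b - a).toArray(Integer[]::new);
        int h = 0;
        while (h < desc.length && desc[h] >= h + 1) {
            h++;
        }

        // Uncitedness: the share of publications that were never cited.
        double uncitedness = Arrays.stream(citations)
                .filter(c -> c == 0).count() / (double) citations.length;

        System.out.printf("average=%.2f, h-index=%d, uncitedness=%.0f%%%n",
                average, h, uncitedness * 100);
    }
}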

We tried to keep the code highly generic, so that other citation databases (Web of Science, IEEE) or other bibliographies can be integrated with little effort. The application is written in Java and XML and uses XSL transformations and LaTeX to generate bibliometric reports as HTML pages and in PDF format.
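
The report-generation step can be sketched with Java's standard javax.xml.transform API (the file and stylesheet names below are hypothetical, not DuEPublicA's actual layout): one stylesheet renders the enriched publication data as HTML, and a second stylesheet emitting LaTeX would feed the PDF route.

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class ReportRenderer {
    public static void main(String[] args) throws Exception {
        // Apply an XSL stylesheet to the XML-serialized publication data.
        TransformerFactory factory = TransformerFactory.newInstance();
        Transformer toHtml = factory.newTransformer(new StreamSource("report-html.xsl"));
        toHtml.transform(new StreamSource("publications.xml"),
                         new StreamResult("report.html"));
        // A stylesheet emitting LaTeX would be applied the same way; the
        // resulting .tex file is then compiled to PDF by an external LaTeX run.
    }
}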

Warnings and alerts are automatically included if the citation analysis covers only a small fraction of the publications from the University Bibliography. In addition, we describe a small tool that helps to collect author details for an analysis.

URL : http://journal.code4lib.org/articles/12549

Journal Data Sharing Policies and Statistical Reporting Inconsistencies in Psychology

Authors : Michele Nuijten, Jeroen Borghuis, Coosje Veldkamp, Linda Alvarez, Marcel van Assen, Jelte Wicherts

In this paper, we present three studies that investigate the relation between data sharing and statistical reporting inconsistencies. Previous research found that reluctance to share data was related to a higher prevalence of statistical errors, often in the direction of statistical significance (Wicherts, Bakker, & Molenaar, 2011).

We therefore hypothesized that journal policies about data sharing and data sharing itself would reduce these inconsistencies. In Study 1, we compared the prevalence of reporting inconsistencies in two similar journals on decision making with different data sharing policies.

In Study 2, we compared reporting inconsistencies in articles published in PLOS (with a data sharing policy) and Frontiers in Psychology (without a data sharing policy). In Study 3, we examined papers published in the journal Psychological Science to check whether those with and without an Open Practice Badge differed in the prevalence of reporting errors.

Overall, we found no relationship between data sharing and reporting inconsistencies. We did find that journal policies on data sharing are extremely effective in promoting data sharing.

We argue that open data is essential in improving the quality of psychological science, and we discuss ways to detect and reduce reporting inconsistencies in the literature.

DOI : https://dx.doi.org/10.17605/OSF.IO/SGBTA

A Proposed Currency System for Academic Peer Review Payments Using the BlockChain Technology

Author : Michael Spearpoint

Peer review of scholarly papers is seen as a critical step in the publication of high-quality outputs in reputable journals.

However, there appear to be few incentives for researchers to agree to conduct suitable reviews in a timely fashion, and in some cases unscrupulous practices occur in the production of academic research output. Innovations in internet-based technologies mean that some of these challenges can be addressed.

In particular, this paper proposes a new currency system, with the blockchain as its basis, that provides a number of solutions. Potential benefits and problems of using the technology are discussed in the paper; these will need further investigation should the idea develop further.

Ultimately, the currency could be used as an alternative publication metric for authors, institutions and journals.
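
The paper itself stays at the conceptual level; purely to illustrate the mechanism it builds on (the credit scheme and all names below are invented, not the paper's design), a blockchain is a list of records in which each block commits to its predecessor through a cryptographic hash, so earlier review payments cannot be altered unnoticed:

import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.ArrayList;
import java.util.List;

// Toy hash-chained ledger of review credits; not the paper's actual design.
public class ReviewLedger {

    record Block(String previousHash, String reviewer, int credits, String hash) {}

    private final List<Block> chain = new ArrayList<>();

    // Append a payment of review credits, linking it to the previous block.
    void pay(String reviewer, int credits) throws Exception {
        String prev = chain.isEmpty() ? "GENESIS" : chain.get(chain.size() - 1).hash();
        chain.add(new Block(prev, reviewer, credits, sha256(prev + reviewer + credits)));
    }

    static String sha256(String input) throws Exception {
        MessageDigest digest = MessageDigest.getInstance("SHA-256");
        StringBuilder hex = new StringBuilder();
        for (byte b : digest.digest(input.getBytes(StandardCharsets.UTF_8))) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        ReviewLedger ledger = new ReviewLedger();
        ledger.pay("reviewer-A", 1); // e.g. one credit per completed review
        ledger.pay("reviewer-B", 1);
        System.out.println(ledger.chain);
    }
}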

URL : http://www.mdpi.com/2304-6775/5/3/19

Sci-Hub provides access to nearly all scholarly literature

Authors : Daniel S. Himmelstein, Ariel R. Romero, Bastian Greshake Tzovaras, Casey S. Greene

The website Sci-Hub provides access to scholarly literature via full text PDF downloads. The site enables users to access articles that would otherwise be paywalled. Since its creation in 2011, Sci-Hub has grown rapidly in popularity.

However, until now, the extent of Sci-Hub’s coverage was unclear. As of March 2017, we find that Sci-Hub’s database contains 68.9% of all 81.6 million scholarly articles, which rises to 85.2% for those published in closed access journals.

Furthermore, Sci-Hub contains 77.0% of the 5.2 million articles published by inactive journals. Coverage varies by discipline, with 92.8% coverage of articles in chemistry journals compared to 76.3% for computer science.

Coverage also varies by publisher, with the coverage of the largest publisher, Elsevier, at 97.3%. Our interactive browser at greenelab.github.io/scihub allows users to explore these findings in more detail.

Finally, we estimate that over a six month period in 2015–2016, Sci-Hub provided access for 99.3% of valid incoming requests. Hence, the scope of this resource suggests the subscription publishing model is becoming unsustainable. For the first time, the overwhelming majority of scholarly literature is available gratis to anyone.

URL : https://greenelab.github.io/scihub-manuscript/

Medical Theses and Derivative Articles: Dissemination Of Contents and Publication Patterns

Authors : Mercedes Echeverria, David Stuart, Tobias Blanke

Doctoral theses are an important source of publications in universities, although little research has been carried out on the publications resulting from theses, the so-called derivative articles.

This study investigates how derivative articles can be identified through text analysis based on the full text of a set of medical theses and of the articles with which they share authorship.

The text similarity analysis consisted of comparing the full text of the articles section by section, following the organization of scientific discourse (IMRaD: Introduction, Methods, Results, and Discussion), using the TurnItIn plagiarism tool.

The study found that the text similarity rate in the Discussion section can be used to discriminate derivative articles from non-derivative articles.
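
TurnItIn's matching algorithm is proprietary, so the following is only a generic stand-in: a term-frequency cosine similarity between two Discussion sections, showing how a section-level similarity rate of this kind could be computed.

import java.util.HashMap;
import java.util.Map;

// Illustrative only; TurnItIn's actual similarity scoring differs.
public class SectionSimilarity {

    // Count how often each lower-cased word occurs in the text.
    static Map<String, Integer> termFrequencies(String text) {
        Map<String, Integer> tf = new HashMap<>();
        for (String token : text.toLowerCase().split("\\W+")) {
            if (!token.isEmpty()) {
                tf.merge(token, 1, Integer::sum);
            }
        }
        return tf;
    }

    // Cosine similarity of two term-frequency vectors, in [0, 1].
    static double cosine(Map<String, Integer> a, Map<String, Integer> b) {
        double dot = 0.0;
        for (Map.Entry<String, Integer> e : a.entrySet()) {
            dot += e.getValue() * b.getOrDefault(e.getKey(), 0);
        }
        double normA = 0.0, normB = 0.0;
        for (int v : a.values()) normA += (double) v * v;
        for (int v : b.values()) normB += (double) v * v;
        return (normA == 0 || normB == 0) ? 0.0 : dot / Math.sqrt(normA * normB);
    }

    public static void main(String[] args) {
        String thesisDiscussion = "...";  // Discussion section of the thesis
        String articleDiscussion = "..."; // Discussion section of the article
        double similarity = cosine(termFrequencies(thesisDiscussion),
                                   termFrequencies(articleDiscussion));
        // A high Discussion-section similarity would flag a derivative article.
        System.out.printf("Discussion similarity: %.2f%n", similarity);
    }
}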

Additional findings were:
1. The thesis author held the first author position in 85% of derivative articles,
2. Supervisors participated as co-authors in 100% of derivative articles,
3. The authorship credit retained by the thesis author was 42% in derivative articles,
4. Derivative articles had 5 co-authors per article on average, versus 6.4 in non-derivative articles, and
5. 87.5% of derivative articles were published before, or in the same year as, thesis completion.

URL : https://arxiv.org/abs/1707.04439

Scientist impact factor (SIF): a new metric for improving scientists’ evaluation?

Authors : Giuseppe Lippi, Camilla Mattiuzzi

Background

The publication of scientific research is the mainstay of knowledge dissemination, but it is also an essential criterion in the evaluation of scientists for attracting funds and for career progression.

Although the most widespread approach for evaluating scientists is currently based on the H-index, the total impact factor (IF), and the overall number of citations, these metrics are plagued by some well-known drawbacks. Therefore, with the aim of improving the process of evaluating scientists, we developed a new and potentially useful indicator of recent scientific output.

Methods

The new metric, the scientist impact factor (SIF), was calculated as all citations received, in the two years following the publication year, by the articles a scientist published in that year, divided by the overall number of articles published in that year. The metric was then tested by analyzing data for the 40 top scientists of our local university.
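
Read as a formula, the SIF for a publication year y is the sum of citations received in years y+1 and y+2 by the articles published in year y, divided by the number of those articles. A small sketch under that reading, with invented numbers:

import java.util.List;

// A sketch of the SIF as defined above; example data is invented.
public class ScientistImpactFactor {

    // One article: its publication year and the citations it received in
    // the two years after publication.
    record Article(int publicationYear, int citationsNextTwoYears) {}

    static double sif(List<Article> articles, int year) {
        List<Article> published = articles.stream()
                .filter(a -> a.publicationYear() == year)
                .toList();
        if (published.isEmpty()) return 0.0;
        int citations = published.stream()
                .mapToInt(Article::citationsNextTwoYears)
                .sum();
        return citations / (double) published.size();
    }

    public static void main(String[] args) {
        List<Article> articles = List.of(
                new Article(2014, 6), new Article(2014, 2), new Article(2014, 0),
                new Article(2013, 9));
        // SIF(2014) = (6 + 2 + 0) / 3 = 2.67
        System.out.printf("SIF(2014) = %.2f%n", sif(articles, 2014));
    }
}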

Results

No correlation was found between the SIF and the H-index (r=0.15; P=0.367) or the two-year H-index (r=−0.01; P=0.933), whereas the H-index and the two-year H-index were highly correlated (r=0.57; P<0.001). A highly significant correlation was also observed between the number of articles published in one year and the total number of citations to these articles in the two following years (r=0.62; P<0.001).

Conclusions

According to our data, the SIF may be a useful measure to complement current metrics for evaluating scientific output. Its use may be especially helpful for young scientists, for whom the SIF reflects scientific output over the past two years, thus increasing their chances of applying for and obtaining competitive funding.

URL : http://atm.amegroups.com/article/view/15375