A Bibliometric study of Directory of Open Access Journals: Special reference to Microbiology

Author : K S Savita

The present study aims to determine the number of free e-journals in the field of Microbiology available in DOAJ.

For this study the author adopted a bibliometric method and analyzed the journals by country-wise, language-wise, and subject-heading-wise distribution.
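The distribution analysis described above amounts to tallying journal records by attribute. A minimal sketch in Python follows; the sample records are hypothetical illustrations, not data from the study.

```python
# Tally hypothetical DOAJ-style journal records by country, language,
# and subject heading, as in a bibliometric distribution analysis.
from collections import Counter

journals = [
    {"country": "India", "language": "English", "subject": "Microbiology"},
    {"country": "Brazil", "language": "Portuguese", "subject": "Microbiology"},
    {"country": "India", "language": "English", "subject": "Virology"},
]

def distribution(records, key):
    """Count how many records fall under each value of the given attribute."""
    return Counter(r[key] for r in records)
```

For example, `distribution(journals, "country")` yields a count per publishing country, and the same call with `"language"` or `"subject"` gives the other two distributions.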


Alternative location : http://ijidt.com/index.php/ijidt/article/view/466

Towards an Ethical Framework for Publishing Twitter Data in Social Research: Taking into Account Users’ Views, Online Context and Algorithmic Estimation

Authors : Matthew L Williams, Pete Burnap, Luke Sloan

New and emerging forms of data, including posts harvested from social media sites such as Twitter, have become part of the sociologist’s data diet. In particular, some researchers see an advantage in the perceived ‘public’ nature of Twitter posts, representing them in publications without seeking informed consent.

While such practice may not be at odds with Twitter’s terms of service, we argue there is a need to interpret these through the lens of social science research methods that imply a more reflexive ethical approach than provided in ‘legal’ accounts of the permissible use of these data in research publications.

To challenge some existing practice in Twitter-based research, this article brings to the fore: (1) views of Twitter users through analysis of online survey data; (2) the effect of context collapse and online disinhibition on the behaviours of users; and (3) the publication of identifiable sensitive classifications derived from algorithms.


DOI : http://dx.doi.org/10.1177/0038038517708140

Replicability and Reproducibility in Comparative Psychology

Author : Jeffrey R. Stevens

Psychology faces a replication crisis. The Reproducibility Project: Psychology sought to replicate the effects of 100 psychology studies. Though 97% of the original studies produced statistically significant results, only 36% of the replication studies did so (Open Science Collaboration, 2015).

This inability to replicate previously published results, however, is not limited to psychology (Ioannidis, 2005). Replication projects in medicine (Prinz et al., 2011) and behavioral economics (Camerer et al., 2016) resulted in replication rates of 25 and 61%, respectively, and analyses in genetics (Munafò, 2009) and neuroscience (Button et al., 2013) question the validity of studies in those fields. Science, in general, is reckoning with challenges in one of its basic tenets: replication.

Comparative psychology also faces the grand challenge of producing replicable research. Though social psychology has borne the brunt of most of the critique regarding failed replications, comparative psychology suffers from some of the same problems faced by social psychology (e.g., small sample sizes).

Yet, comparative psychology follows the methods of cognitive psychology by often using within-subjects designs, which may buffer it from replicability problems (Open Science Collaboration, 2015). In this Grand Challenge article, I explore the shared and unique challenges of and potential solutions for replication and reproducibility in comparative psychology.


Alternative location : http://journal.frontiersin.org/article/10.3389/fpsyg.2017.00862/full

Public Libraries as Publishers: Critical Opportunity

Author : Kathryn M. Conrad

Libraries have a long and distinguished history of publishing, dating back to their earliest days. Traditionally, libraries published to expose their collections through bibliographies, facsimiles, and catalogs.

While the Internet has made discovery and dissemination of library holdings easier than ever before, digital publishing technologies have also unlocked compelling new purposes for library publishing, including through Open Access publishing initiatives.

The self-publishing explosion and availability of self-publishing tools and services geared to libraries have heralded new opportunities for libraries, especially public libraries, to engage their communities in new ways.

By supporting self-publishing initiatives in their communities, public libraries can promote standards of quality in self-publishing, provide unique opportunities to engage underserved populations, and become true archives of their communities.

DOI : http://dx.doi.org/10.3998/3336451.0020.106

Peer reviewing: a private affair between the individual researcher and the publishing houses, or a responsibility of the university?

Authors : Leif Longva, Eirik Reierth, Lars Moksness, Bård Smedsrød

Peer reviewing is mandatory for scientific journals as quality control of submitted manuscripts, for universities to rank applicants for scientific positions, and for funding agencies to rank grant applications.

In spite of this deep dependency on peer reviewing throughout the entire academic realm, universities exhibit a peculiar lack of interest in this activity.

The aim of this article is to show that by taking an active interest in peer reviewing the universities will take control over the management and policy shaping of scientific publishing, a regime that is presently largely controlled by the big publishing houses.

The benefits of gaining control of scientific publishing policy include the possibility to implement open access publishing and to reduce the unjustifiably high subscription rates currently charged by some of the major publishing houses.

A common international clean-up action is needed to move this pivotal element of scientific publishing from the dark hiding places of the scientific journals to where it should be managed: namely, at the universities.

In addition to the economic benefits, we postulate that placing peer reviewing at the universities will improve the quality of published research.

DOI : http://dx.doi.org/10.3998/3336451.0020.103

Knowledge discovery through text-based similarity searches for astronomy literature

Author : Wolfgang Kerzendorf

The increase in the number of researchers coupled with the ease of publishing and distribution of scientific papers (due to technological advancements) has resulted in a dramatic increase in astronomy literature.

This has likely led to the predicament that the body of the literature is too large for traditional human consumption and that related and crucial knowledge is not discovered by researchers. In addition to the increased production of astronomical literature, recent decades have also brought several advancements in computer linguistics.

In particular, machine-aided processing of the literature might make it possible to convert this stream of papers into a coherent knowledge set. In this paper, we present the application of computer linguistics techniques to astronomy literature.

Specifically, we developed a tool that, given an input paper, finds similar articles based purely on text content.

We find that our technique performs robustly in comparison with other tools that recommend articles given a reference paper (known as recommender systems). Our novel tool shows the great power of combining computer linguistics with astronomy literature and suggests that additional research in this endeavor will likely produce even better tools that will help researchers cope with the vast amounts of knowledge being produced.
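Text-based similarity search of this kind is commonly built on TF-IDF vectors compared by cosine similarity. The following is a minimal, self-contained sketch of that general approach, not the authors' actual pipeline; the tokenizer and the toy corpus are illustrative assumptions.

```python
# Sketch of a text-similarity recommender: represent each document as a
# TF-IDF weight vector, then return the corpus document whose vector has
# the highest cosine similarity to the query document.
import math
from collections import Counter

def tokenize(text):
    return [w.lower().strip(".,;:") for w in text.split()]

def tfidf_vectors(docs):
    tokenized = [tokenize(d) for d in docs]
    df = Counter()                      # document frequency per term
    for toks in tokenized:
        df.update(set(toks))
    n = len(docs)
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        # term frequency * smoothed inverse document frequency
        vectors.append({t: (c / len(toks)) * math.log((1 + n) / (1 + df[t]))
                        for t, c in tf.items()})
    return vectors

def cosine(a, b):
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def most_similar(query_idx, docs):
    """Index of the corpus document most similar to docs[query_idx]."""
    vecs = tfidf_vectors(docs)
    scores = [(i, cosine(vecs[query_idx], v))
              for i, v in enumerate(vecs) if i != query_idx]
    return max(scores, key=lambda s: s[1])[0]
```

A production system would add stemming, stop-word handling, and likely dense embeddings, but the ranking principle is the same.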

URL : https://arxiv.org/abs/1705.05840

Is there agreement on the prestige of scholarly book publishers in the Humanities? DELPHI over survey results

Authors : Elea Giménez-Toledo, Jorge Mañana-Rodríguez

Despite their important role in supporting assessment processes, evaluation systems and the categorizations they use are frequently criticized. Considering acceptance by the scientific community an essential condition for using rankings or categorizations in research evaluation, the aim of this paper is to test the results of rankings of scholarly book publishers’ prestige, Scholarly Publishers Indicators (SPI hereafter).

SPI is a public, survey-based ranking of scholarly publishers’ prestige (among other indicators). The latest version of the ranking (2014) was based on an expert consultation with a large number of respondents.

In order to validate and refine the results for Humanities’ fields as proposed by the assessment agencies, a Delphi technique was applied with a panel of randomly selected experts over the initial rankings.

The results show an equalizing effect of the technique over the initial rankings as well as a high degree of concordance between its theoretical aim (consensus among experts) and its empirical results (summarized with Gini Index).

The resulting categorization is understood as more conclusive and more likely to be accepted by those under evaluation.
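The Gini index mentioned above is a standard concentration measure: 0 indicates perfect equality across values and values approaching 1 indicate maximal concentration. A minimal sketch follows; the computation is the standard mean-difference formulation, and any sample inputs are illustrative, not SPI survey data.

```python
# Gini index via the sorted mean-difference formulation:
# G = sum_i (2i - n - 1) * x_i / (n * sum(x)), with 1-based i over sorted x.
def gini(values):
    """Return the Gini index of a list of non-negative scores."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    return sum((2 * i - n - 1) * x for i, x in enumerate(xs, 1)) / (n * total)
```

For instance, a set of identical expert scores yields 0, while scores concentrated on a single publisher push the index toward 1.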

URL : https://arxiv.org/abs/1705.04517