A Case Study for a New Peer-Review Journal on Race and Ethnicity in American Higher Education

Author : Cristobal Salinas Jr.

In this exploratory case study, the interests, attitudes, and opinions of participants of the National Conference on Race and Ethnicity (NCORE) in American Higher Education are presented.

This case study sought to understand how college and university administrators and faculty perceived the need to create a peer-reviewed journal that aimed to support and create opportunities to publish research, policy, practices, and procedures within the context of race and ethnicity in American higher education.

The findings of this study reflect that the vast majority of those surveyed (n = 605) and interviewed (n = 5) support, and are interested in, having a peer-reviewed journal that focuses on race and ethnicity in American higher education.

DOI : https://doi.org/10.3390/publications6020026

To What Extent is Inclusion in the Web of Science an Indicator of Journal ‘Quality’?

Authors : Diego Chavarro, Ismael Rafols, Puay Tang

The assessment of research based on the journal in which it is published is a widely adopted practice. Some research assessments use the Web of Science (WoS) to identify “high quality” journals, which are assumed to publish excellent research.

The authority of WoS on journal quality stems from its selection of journals based on editorial standards and scientific impact criteria. These can be considered as universalistic criteria, meaning that they can be applied to any journal regardless of its place of publication, language, or discipline.

In this article we examine the coverage by WoS of journals produced in Latin America, Spain, and Portugal. We use a logistic regression to estimate the probability that a journal is covered by WoS given universalistic criteria (editorial standards and scientific impact of the journal) and particularistic criteria (country, language, and discipline of the journal).

We find that it is not possible to predict the inclusion of journals in WoS from universalistic criteria alone, because particularistic variables such as the journal's country, discipline, and language are also related to inclusion in WoS.
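The model described above can be illustrated with a minimal sketch. The data, variable names (`editorial_standards`, `scientific_impact`, `english_language`), and coefficients below are entirely synthetic and hypothetical, not the authors' dataset or code; the sketch simply fits a logistic regression of WoS inclusion on one particularistic and two universalistic covariates by Newton-Raphson:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Universalistic criteria (hypothetical, standardised scores)
editorial_standards = rng.normal(size=n)
scientific_impact = rng.normal(size=n)
# Particularistic criterion (hypothetical): 1 if the journal publishes in English
english_language = rng.integers(0, 2, size=n)

# Simulate inclusion depending on BOTH kinds of criteria, mirroring the
# paper's finding that universalistic criteria alone do not predict it
eta = -1.0 + 1.2 * editorial_standards + 0.8 * scientific_impact + 1.5 * english_language
included = rng.random(n) < 1.0 / (1.0 + np.exp(-eta))

# Fit the logistic regression by Newton-Raphson (plain NumPy, no extra deps)
X = np.column_stack([np.ones(n), editorial_standards, scientific_impact, english_language])
y = included.astype(float)
beta = np.zeros(X.shape[1])
for _ in range(25):
    mu = 1.0 / (1.0 + np.exp(-X @ beta))   # predicted inclusion probability
    w = mu * (1.0 - mu)                     # IRLS weights
    beta += np.linalg.solve(X.T @ (X * w[:, None]), X.T @ (y - mu))

print({name: round(b, 2) for name, b in
       zip(["intercept", "editorial", "impact", "english"], beta)})
```

In this toy setup the coefficient on the particularistic variable remains clearly nonzero after controlling for the universalistic ones, which is the qualitative pattern the authors report for country, language, and discipline.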

We conclude that using WoS as a universalistic tool for research assessment can disadvantage science published in journals with adequate editorial standards and scientific merit. We discuss the implications of these findings within the research evaluation literature, specifically for countries and disciplines not extensively covered by WoS.

URL : https://dx.doi.org/10.2139/ssrn.2990653

Beyond Fact Checking: Reconsidering the Status of Truth of Published Articles

Authors : David Pontille, Didier Torny

Since the 17th century, scientific knowledge has been produced through a collective process, involving specific technologies used to perform experiments, to regulate modalities for participation of peers or lay people, and to ensure validation of the facts and publication of major results.

In such a world, guided by the quest for a new kind of truth against previous beliefs, various forms of misconduct – from subtle plagiarism to the outright fabrication of data and results – have largely been considered minimal, if not nonexistent.

Yet "betrayers of the truth" have been alleged in numerous fraud cases, at least from the 1970s onward, and the phenomenon is now a growing concern in many academic circles. Facing numerous alerts, journals have generalized dedicated editorial formats to notify their readers of emerging doubts affecting articles they had published.

This short piece focuses exclusively on these formats, which consist in "flagging" certain articles to mark their problematic status. The visibility given to these flags and policies undermines the very basic components of the economy of science: how long can we collectively pretend that peer-reviewed knowledge should be the anchor for facing a "post-truth" world?

URL : https://halshs.archives-ouvertes.fr/halshs-01576348

Les classements à l’international des revues en SHS (International rankings of journals in the humanities and social sciences)

Authors : David Pontille, Didier Torny

Although several journal rankings were developed as early as the 1970s, what is novel about those that emerged during the 2000s is their status as instruments of public policy. This is the case in Australia, Brazil, France, Flanders, Norway, and the Netherlands, where this evaluation technology is in force for certain fields – notably the humanities and social sciences (SSH).

In this article, we analyse the modes of existence of this specific evaluation technology. Although the generic formula of the "journal ranking" has spread internationally, different versions have developed in parallel: their modes of production, the values defended by their promoters and users, and their concrete forms vary widely.

We show that the space of variation of SSH journal rankings is always bounded by two options: favouring "good research", which, through cumulative advantage, risks leading to a (hyper)normal science that fosters dispositions of social conformity among researchers; or encouraging the emergence of minority communities (linguistic, disciplinary, interdisciplinary) and promoting the diversity of methods, theories, and objects, at the risk of leading to forms of relativism or to an archipelisation of research.

URL : https://hal-mines-paristech.archives-ouvertes.fr/hal-01256027

Analysing researchers’ outreach efforts and the association with publication metrics: A case study of Kudos

Authors : Mojisola Erdt, Htet Htet Aung, Ashley Sara Aw, Charlie Rapple, Yin-Leng Theng

With the growth of scholarly collaboration networks and social communication platforms, members of the scholarly community are experimenting with their approach to disseminating research outputs, in an effort to increase their audience and outreach.

However, from a researcher’s point of view, it is difficult to determine whether efforts to make work more visible are worthwhile (in terms of the association with publication metrics) and within that, difficult to assess which platform or network is most effective for sharing work and connecting to a wider audience.

We undertook a case study of Kudos (https://www.growkudos.com), a web-based service that claims to help researchers increase the outreach of their publications, to examine the most effective tools for sharing publications online, and to investigate which actions are associated with improved metrics.

We extracted a dataset from Kudos of 830,565 unique publications claimed by authors, of which 20,775 had actions taken to explain or share them via Kudos; for 4,867 of these, full text download data from publishers were available.

Findings show that researchers are most likely to share their work on Facebook, but links shared on Twitter are more likely to be clicked on. A Mann-Whitney U test revealed that the treatment group (publications with actions in Kudos) had a significantly higher median of 149 full text downloads per publication (23.1% more) than the control group (publications with no actions in Kudos), which had a median of 121 full text downloads per publication.

These findings suggest that performing actions on publications, such as sharing, explaining, or enriching, could help to increase the number of full text downloads of a publication.
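The comparison reported above can be sketched with synthetic data. The counts below are illustrative draws centred on the study's reported medians (149 vs. 121), not the Kudos dataset; the sketch simply runs the same kind of Mann-Whitney U test on two simulated groups:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(42)

# Hypothetical full text download counts per publication, centred on the
# medians reported in the study (treatment 149 vs. control 121)
treatment = rng.poisson(lam=149, size=500)  # publications with actions in Kudos
control = rng.poisson(lam=121, size=500)    # publications with no actions

# Nonparametric comparison of the two groups, as in the study
stat, p_value = mannwhitneyu(treatment, control, alternative="two-sided")
print(f"median treatment={np.median(treatment):.0f}, "
      f"median control={np.median(control):.0f}, p={p_value:.3g}")
```

The Mann-Whitney U test is a sensible choice here because download counts are skewed and the test compares distributions by rank rather than assuming normality.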

DOI : https://doi.org/10.1371/journal.pone.0183217

Citation Count Analysis for Papers with Preprints

Authors : Sergey Feldman, Kyle Lo, Waleed Ammar

We explore the degree to which papers prepublished on arXiv garner more citations, in an attempt to paint a sharper picture of fairness issues related to prepublishing. A paper’s citation count is estimated using a negative-binomial generalized linear model (GLM) that includes a binary variable indicating whether the paper has been prepublished.

We control for author influence (via the authors’ h-index at the time of paper writing), publication venue, and overall time that paper has been available on arXiv. Our analysis only includes papers that were eventually accepted for publication at top-tier CS conferences, and were posted on arXiv either before or after the acceptance notification.

We observe that papers submitted to arXiv before acceptance have, on average, 65% more citations in the following year compared to papers submitted after. We note that this finding is not causal, and discuss possible next steps.
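The model above can be illustrated with a small sketch on synthetic data. Everything below is hypothetical (covariates, effect size seeded at the paper's 65% figure, simulated counts), not the authors' data or code; for simplicity the fit uses a Poisson log-link regression, a quasi-likelihood simplification of the paper's negative-binomial GLM that models log E[citations] in the same way:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000

# Hypothetical covariates: author h-index and a binary indicator for
# posting on arXiv before the acceptance notification
h_index = rng.normal(10.0, 3.0, size=n)
prepublished = rng.integers(0, 2, size=n)

# Simulate overdispersed citation counts with a multiplicative
# prepublication effect of 1.65 (echoing the reported 65%)
mu_true = np.exp(0.5 + 0.05 * h_index + np.log(1.65) * prepublished)
citations = rng.negative_binomial(5, 5.0 / (5.0 + mu_true))

# Fit a log-link count regression by Newton-Raphson (plain NumPy)
X = np.column_stack([np.ones(n), h_index, prepublished])
y = citations.astype(float)
beta = np.array([np.log(y.mean()), 0.0, 0.0])  # stable starting point
for _ in range(30):
    mu = np.exp(np.clip(X @ beta, -30, 30))  # clip to avoid overflow
    beta += np.linalg.solve(X.T @ (X * mu[:, None]), X.T @ (y - mu))

# exp of the indicator's coefficient is the multiplicative citation effect
multiplier = np.exp(beta[2])
print(f"estimated citation multiplier for prepublished papers: {multiplier:.2f}")
```

Exponentiating the coefficient on the prepublication indicator recovers a multiplier close to the simulated 1.65; as the authors stress, such an association is not evidence of a causal effect.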

URL : https://arxiv.org/abs/1805.05238

The evolving preprint landscape: Introductory report for the Knowledge Exchange working group on preprints

Authors : Jonathan Tennant, Serge Bauin, Sarah James, Juliane Kant

In 1961, the USA National Institutes of Health (NIH) launched a program called Information Exchange Groups, designed for the circulation of biological preprints, but this shut down in 1967 (Confrey, 1996; Cobb, 2017).

In 1991, the arXiv repository was launched for physics, computer science, and mathematics, which is when preprints (or ‘e-prints’) began to increase in popularity and attention (Wikipedia ArXiv#History; Jackson, 2002). The Social Science Research Network (SSRN) was launched in 1994, and in 1997 Research Papers in Economics (Wikipedia RePEc) was launched.

In 2008, the research network platforms Academia.edu and ResearchGate were both launched and allowed sharing of research papers at any stage. In 2013, two new biological preprint servers were launched, bioRxiv (by Cold Spring Harbor Laboratory) and PeerJ Preprints (by PeerJ) (Wikipedia BioRxiv; Wikipedia PeerJ).

Between these major ongoing initiatives were various, somewhat less-successful attempts to launch preprint servers, including Nature Precedings (folded in April 2012) and Netprints from the British Medical Journal (Wikipedia Nature Precedings; BMJ, 1999).

Now, a range of innovative services, organisations, and platforms are rapidly developing around preprints, prompting this overview of the present ecosystem on behalf of Knowledge Exchange.

DOI : https://dx.doi.org/10.17605/OSF.IO/796TU