Open science and modified funding lotteries can impede the natural selection of bad science

Authors : Paul E. Smaldino, Matthew A. Turner, Pablo A. Contreras Kallens

Assessing scientists using exploitable metrics can lead to the degradation of research methods even without any strategic behaviour on the part of individuals, via ‘the natural selection of bad science.’

Institutional incentives to maximize metrics like publication quantity and impact drive this dynamic. Removing these incentives is necessary, but institutional change is slow.

However, recent developments suggest possible solutions with more rapid onsets. These include what we call open science improvements, which can reduce publication bias and improve the efficacy of peer review. In addition, there have been increasing calls for funders to move away from prestige- or innovation-based approaches in favour of lotteries.

Using computational modelling, we investigated whether such changes are likely to improve the reproducibility of science even in the presence of persistent incentives for publication quantity.

We found that modified lotteries, which allocate funding randomly among proposals that pass a threshold for methodological rigour, effectively reduce the rate of false discoveries, particularly when paired with open science improvements that increase the publication of negative results and improve the quality of peer review.

In the absence of funding that targets rigour, open science improvements can still reduce false discoveries in the published literature but are less likely to improve the overall culture of research practices that underlie those publications.
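The modified lottery described in this abstract — random allocation among proposals that clear a methodological-rigour bar — can be sketched in a few lines. This is an illustrative toy, not the authors' model; the proposal names, scores, and threshold below are invented for the example.

```python
import random

def modified_lottery(proposals, rigour_threshold, n_awards, seed=None):
    """Fund a random subset of the proposals that meet a rigour threshold.

    `proposals` is a list of (name, rigour_score) pairs. If fewer proposals
    pass the threshold than there are awards, all of them are funded.
    """
    rng = random.Random(seed)
    eligible = [name for name, rigour in proposals if rigour >= rigour_threshold]
    if len(eligible) <= n_awards:
        return eligible
    return rng.sample(eligible, n_awards)

# Hypothetical proposals: B fails the rigour bar and is never funded.
proposals = [("A", 0.9), ("B", 0.4), ("C", 0.7), ("D", 0.8)]
winners = modified_lottery(proposals, rigour_threshold=0.6, n_awards=2, seed=42)
```

The key property is that, above the threshold, funding is independent of any exploitable metric such as publication count.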

URL : Open science and modified funding lotteries can impede the natural selection of bad science

DOI : https://doi.org/10.1098/rsos.190194

Assessing the Quality of Scientific Papers

Authors : Roman Vainshtein, Gilad Katz, Bracha Shapira, Lior Rokach

A multitude of factors are responsible for the overall quality of scientific papers, including readability, linguistic quality, fluency, semantic complexity, and of course domain-specific technical factors.

These factors vary from one field of study to another. In this paper, we propose a measure and method for assessing the overall quality of the scientific papers in a particular field of study.

We evaluate our method in the computer science domain, but it can be applied to other technical and scientific fields. Our method is based on corpus linguistics techniques, which enable the extraction of the information and knowledge associated with a specific domain.

For this purpose, we have created a large corpus, consisting of papers from very high impact conferences. First, we analyze this corpus in order to extract rich domain-specific terminology and knowledge.

Then we use the acquired knowledge to estimate the quality of scientific papers by applying our proposed measure. We examine our measure on high and low scientific impact test corpora.

Our results show a significant difference in the measure scores of the high and low impact test corpora. Second, we develop a classifier based on our proposed measure and compare it to the baseline classifier.

Our results show that the classifier based on our measure outperformed the baseline classifier. Based on these results, the proposed measure and technique can be used for the automated assessment of scientific papers.
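The abstract's core idea — extract domain terminology from a high-impact reference corpus, then score a paper by how much of that terminology it uses — can be illustrated with a deliberately simplified sketch. The paper's actual measure is richer; the term-extraction and scoring functions below are assumptions for illustration only.

```python
from collections import Counter

def domain_terms(reference_corpus, top_k):
    """Extract the top_k most frequent tokens from a reference corpus.

    A real corpus-linguistics pipeline would use stop-word filtering,
    multi-word terms, and statistical term-hood measures; raw frequency
    is used here only to keep the sketch self-contained.
    """
    counts = Counter(
        token for paper in reference_corpus for token in paper.lower().split()
    )
    return {term for term, _ in counts.most_common(top_k)}

def quality_score(paper_text, terms):
    """Fraction of the paper's tokens that belong to the domain terminology."""
    tokens = paper_text.lower().split()
    if not tokens:
        return 0.0
    return sum(token in terms for token in tokens) / len(tokens)
```

A classifier like the one the authors describe could then threshold or learn on such scores to separate high- and low-impact corpora.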

URL : https://arxiv.org/abs/1908.04200

Raising Visibility in the Digital Humanities Landscape: Academic Engagement and the Question of the Library’s Role

Authors : Kathleen Kasten-Mutkus, Laura Costello, Darren Chase

Academic libraries have an important role to play in supporting digital humanities projects in their communities. Librarians at Stony Brook University Libraries host Open Mic events for digital humanities researchers, teachers, and students on campus.

Inspired by a desire to better serve digital humanists with existing projects, this event was initially organized to increase the visibility of scholars and students with nascent projects and to connect these digital humanists to library-supported resources and to one another.

For the Libraries, the Open Mic was an opportunity to understand the scope and practices of the digital humanities community at Stony Brook, and to identify ways to make meaningful interventions.

An open mic is a uniquely suitable event format in that it embodies a dynamic, permissive, multidisciplinary presentation space that is as much for exercising new and ongoing research (and technologies) as it is for making discoveries and connections.

The success of these events can be measured in the establishment of the University Libraries as a nexus for digital humanities work, consultations, instruction, workshops, and community on a campus without a designated digital humanities center.

The digital humanities Open Mic event at Stony Brook University locates the digital humanities within the library’s repertoire, while signaling that the library is — in a number of essential ways — open.

URL : http://www.digitalhumanities.org/dhq/vol/13/2/000420/000420.html

On a Quest for Cultural Change – Surveying Research Data Management Practices at Delft University of Technology

Authors : Heather Andrews Mancilla, Marta Teperek, Jasper van Dijck, Kees den Heijer, Robbert Eggermont, Esther Plomp, Yasemin Turkyilmaz-van der Velden, Shalini Kurapati

The Data Stewardship project is a new initiative from the Delft University of Technology (TU Delft) in the Netherlands. Its aim is to create mature working practices and policies regarding research data management across all TU Delft faculties.

The novelty of this project relies on having a dedicated person, the so-called ‘Data Steward’, embedded in each faculty to approach research data management from a more discipline-specific perspective. It is within this framework that a research data management survey was carried out at the faculties that had a Data Steward in place by July 2018.

The goal was to get an overview of general data management practices and to use the results as a benchmark for the project. Response rates ranged from 11% to 37%, depending on the faculty.

Overall, the results show similar trends across all faculties, and indicate a lack of awareness of data management topics such as automatic data backups, data ownership, the relevance of data management plans, the FAIR data principles and the usage of research data repositories.

The results also show great interest in data management, as roughly 80% or more of the respondents in each faculty claimed to be interested in data management training and wished to see a summary of the survey results.

Thus, the survey helped identify the topics on which the Data Stewardship project is currently focusing, through awareness campaigns and training at both the university and faculty levels.

URL : On a Quest for Cultural Change – Surveying Research Data Management Practices at Delft University of Technology

Meta-Research: Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations

Authors : Erin C McKiernan, Lesley A Schimanski, Carol Muñoz Nieves, Lisa Matthias, Meredith T Niles, Juan P Alperin

We analyzed how often and in what ways the Journal Impact Factor (JIF) is currently used in review, promotion, and tenure (RPT) documents of a representative sample of universities from the United States and Canada. Forty per cent of research-intensive institutions and 18% of master's institutions mentioned the JIF or closely related terms.

Of the institutions that mentioned the JIF, 87% supported its use in at least one of their RPT documents, 13% expressed caution about its use, and none heavily criticized it or prohibited its use. Furthermore, 63% of institutions that mentioned the JIF associated the metric with quality, 40% with impact, importance, or significance, and 20% with prestige, reputation, or status.

We conclude that use of the JIF is encouraged in RPT evaluations, especially at research-intensive universities, and that there is work to be done to avoid the potential misuse of metrics like the JIF.

URL : Meta-Research: Use of the Journal Impact Factor in academic review, promotion, and tenure evaluations

DOI : https://doi.org/10.7554/eLife.47338.001

The Transcript OPEN Library Political Science Model: A Sustainable Way into Open Access for E-Books in the Humanities and Social Science

Authors : Alexandra Jobmann, Nina Schönfelder

The strategic goal of the project “National Contact Point Open Access OA2020-DE” is to create the conditions for a large-scale open-access transformation in accordance with the Alliance of German Science Organizations.

In close collaboration with the publisher transcript, we developed a business model that strengthens the transformation process for e-books in the humanities and social sciences.

It largely addresses the drawbacks of existing models. Moreover, it is manageable, sustainable, transparent, and scalable for both publishers and libraries. This case report describes the setup of the model, its successful implementation for the branch “political science” of transcript in 2019, and provides a Strengths–Weaknesses–Opportunities–Threats (SWOT) analysis.

We believe that it has the potential to become one of the major open-access business models for research monographs and anthologies in the humanities and social sciences, especially for non-English e-books.

URL : The Transcript OPEN Library Political Science Model: A Sustainable Way into Open Access for E-Books in the Humanities and Social Science

DOI : https://doi.org/10.3390/publications7030055

The Impact of Open Access on Teaching—How Far Have We Come?

Authors : Elizabeth Gadd, Chris Morrison, Jane Secker

This article seeks to understand how far the United Kingdom higher education (UK HE) sector has progressed towards open access (OA) availability of the scholarly literature it requires to support courses of study.

It uses Google Scholar, Unpaywall and Open Access Button to identify OA copies of a random sample of articles copied under the Copyright Licensing Agency (CLA) HE Licence to support teaching. The quantitative data analysis is combined with interviews of, and a workshop with, HE practitioners to investigate four research questions.

Firstly, what is the nature of the content being used to support courses of study? Secondly, do UK HE establishments regularly incorporate searches for open access availability into their acquisition processes to support teaching? Thirdly, what proportion of content used under the CLA Licence is also available on open access and appropriately licenced? Finally, what percentage of content used by UK HEIs under the CLA Licence is written by academics and thus has the potential for being made open access had there been support in place to enable this?

Key findings include the fact that no interviewees incorporated OA searches into their acquisitions processes. Overall, 38% of articles required to support teaching were available as OA in some form but only 7% had a findable re-use licence; just 3% had licences that specifically permitted inclusion in an ‘electronic course-pack’.

Eighty-nine percent of journal content was written by academics (34% by UK-based academics). Of these, 58% were written since 2000 and thus could arguably have been made available openly had academics been supported to do so.
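The study's classification of articles — closed, OA without a findable re-use licence, or OA with a specific licence — can be sketched as a small function over Unpaywall-style records. The field names (`is_oa`, `best_oa_location`, `license`) follow the shape of the public Unpaywall API, but this is an assumption for illustration, not the authors' actual workflow.

```python
def classify_oa(record):
    """Classify an Unpaywall-style record for teaching reuse.

    Returns "closed" when no OA copy is found, "oa_no_licence" when an OA
    copy exists but carries no findable re-use licence, and "oa_<licence>"
    (e.g. "oa_cc-by") when a licence is recorded.
    """
    if not record.get("is_oa"):
        return "closed"
    location = record.get("best_oa_location") or {}
    licence = location.get("license")
    if licence is None:
        return "oa_no_licence"
    return f"oa_{licence}"
```

Run over a sample of articles, such a classifier would yield the kind of breakdown the study reports: many OA copies overall, but few with a licence permitting inclusion in an electronic course-pack.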

URL : The Impact of Open Access on Teaching—How Far Have We Come?

DOI : https://doi.org/10.3390/publications7030056