Open Science by Design

Contributors : National Academies of Sciences, Engineering, and Medicine; Policy and Global Affairs; Board on Research Data and Information; Committee on Toward an Open Science Enterprise

Openness and sharing of information are fundamental to the progress of science and to the effective functioning of the research enterprise. The advent of scientific journals in the 17th century helped power the Scientific Revolution by allowing researchers to communicate across time and space, using the technologies of that era to generate reliable knowledge more quickly and efficiently.

Harnessing today’s stunning, ongoing advances in information technologies, the global research enterprise and its stakeholders are moving toward a new open science ecosystem.

Open science aims to ensure the free availability and usability of scholarly publications, the data that result from scholarly research, and the methodologies, including code or algorithms, that were used to generate those data.

Open Science by Design is aimed at overcoming barriers and moving toward open science as the default approach across the research enterprise.

This report explores specific examples of open science and discusses a range of challenges, focusing on stakeholder perspectives. It is meant to provide guidance to the research enterprise and its stakeholders as they build strategies for achieving open science and take the next steps.

URL : https://www.nap.edu/catalog/25116/open-science-by-design-realizing-a-vision-for-21st-century

What Value Do Journal Whitelists and Blacklists Have in Academia?

Authors : Jaime A. Teixeira da Silva, Panagiotis Tsigaris

This paper aims to address the issue of predatory publishing, sensu lato. To achieve this, we offer our perspectives, starting initially with some background surrounding the birth of the concept, even though the phenomenon may have already existed long before the popularization of the term “predatory publishing”.

The issue of predation or “predatory” behavior in academic publishing is no longer limited to open access (OA). Many of the mainstream publishers that were exclusively subscription-based are now evolving towards a state of complete OA.

Academics seeking reliable sources of journals to publish their work tend to rely on a journal’s metrics such as citations and indexing, and on whether it is blacklisted or whitelisted.

Jeffrey Beall raised awareness of the risks of “predatory” OA publishing, and his blacklists of “predatory” OA journals and publishers began to be used for official purposes to distinguish valid from perceived invalid publishing venues.

We first reflect on why we believe the blacklists created by Beall were flawed: their weak set of criteria confused non-predatory with truly predatory journals, producing false positives, while also failing to blacklist truly predatory journals, producing false negatives.

Historically, most critiques of “predatory publishing” have relied excessively on Beall’s blacklists to base their assumptions and conclusions, but there is a need to look beyond these.

There are currently a number of blacklists and whitelists circulating in academia, but they all have imperfections: the resurrected Beall blacklists; Crawford’s OA gray list, based on Beall’s lists; Cabell’s new blacklist, with about 11,000 journals; the DOAJ, with about 11,700 OA journals; and the UGC list, with over 32,600 journals prior to its recent (May 2018) purge of 4,305 journals.

The reader is led into a discussion about blacklists’ lack of reliability, using the scientific framework of conducting research to assess whether a journal could be predatory at the pre- and post-study levels. We close our discussion by offering arguments why we believe blacklists are academically invalid.

URL : What Value Do Journal Whitelists and Blacklists Have in Academia?

DOI : https://doi.org/10.1016/j.acalib.2018.09.017

Leveraging Concepts in Open Access Publications

Authors : Andrea Bertino, Luca Foppiano, Laurent Romary, Pierre Mounier

Aim

This paper addresses the integration of a Named Entity Recognition and Disambiguation (NERD) service within a group of open access (OA) publishing digital platforms and considers its potential impact on both research and scholarly publishing.

This application, called entity-fishing, was initially developed by Inria in the context of the EU FP7 project CENDARI (Lopez et al., 2014) and provides automatic entity recognition and disambiguation against Wikipedia and Wikidata. Distributed with an open-source licence, it was deployed as a web service in the DARIAH infrastructure hosted at the French HumaNum.

Methods

In this paper, we focus on the specific issues related to its integration on five OA platforms specialized in the publication of scholarly monographs in social sciences and humanities as part of the work carried out within the EU H2020 project HIRMEOS (High Integration of Research Monographs in the European Open Science infrastructure).

Results and Discussion

In the following sections, we give a brief overview of the current status and evolution of OA publications and how HIRMEOS aims to contribute to this.

We then give a comprehensive description of the entity-fishing service, focusing on its concrete applications in real use cases together with some further possible ideas on how to exploit the generated annotations.

Conclusions

We show that entity-fishing annotations can improve both the research and the publishing process. Entity-fishing annotations can be used to achieve a better and quicker understanding of the specific disciplinary language of certain monographs and so encourage non-specialists to use them.

In addition, a systematic implementation of the entity-fishing service can be used by publishers to generate thematic indexes within book collections to allow better cross-linking and query functions.

URL : https://hal.inria.fr/hal-01900303/

Sustainable open access for scholarly journals in 6 years – the incubator model at Utrecht University Library Open Access Journals

Authors : Jeroen Sondervan, Fleur Stigter

Key points

  • Humanities and social science journals need flexible funding models.
  • Pragmatism and collaboration are key to transforming traditional publishing initiatives.
  • The Uopen Journals model sets a 6‐year development target for developing sustainable journals.
  • Actively involved editors are key to a journal’s success.

The future of global research: A case study on the use of scenario planning in the publishing industry

Authors : Samira Rhoods, Anca Babor

Key points

  • Scenario planning is fun and engaging and is a good opportunity to revisit your company’s core strengths and competitive advantage!
  • Scenario planning should drive long‐term thinking in organizations.
  • It will change the nature of the strategic conversation and can be used to help validate business innovation.
  • Scenarios can help to engage with other organizations in the industry and help people work together to create preferred future outcomes.
  • The complexity of scenario planning should not be underestimated and shortcuts do not work.

URL : The future of global research: A case study on the use of scenario planning in the publishing industry

DOI : https://doi.org/10.1002/leap.1152

Identifying the challenges in implementing open science

Authors : Sarah E. Ali-Khan, Antoine Jean, E. Richard Gold

Areas of open science (OS) policy and practice are already relatively well-advanced in several countries and sectors through the initiatives of some governments, funders, philanthropy, researchers and the community. Nevertheless, the current research and innovation system, including in the focus of this report, the life sciences, remains weighted against OS.

In October 2017, thought-leaders from across the world gathered at an Open Science Leadership Forum in the Washington DC office of the Bill and Melinda Gates Foundation to share their views on what successful OS looks like.

We focused on OS partnerships as this is an emerging model that aims to accelerate science and innovation. These outcomes are captured in a first meeting report: Defining Success in Open Science.

On several occasions, these conversations turned to the challenges that must be addressed and new policies required to effectively and sustainably advance OS practice.

In this report, we therefore describe the concerns raised and, supplemented by our review of the literature, what is needed to address them, and we suggest the stakeholder groups that may be best placed to begin taking action.

It emerges that, to be successful, OS will require the active engagement of all stakeholders: while the research community must develop research questions and identify partners and networks, policy communities need to create an environment that supports experimentation by removing barriers.

This report aims to contribute to ongoing discussions about OS and its implementation. It is also part of a step-wise process to develop and mobilize a toolkit of quantitative and qualitative indicators to assist global stakeholders in implementing high value OS collaborations.

Currently in co-development through an open and international process, this set of measures will allow the generation of needed evidence on the influence of OS partnerships on research, innovation, and critical social and economic goals.

URL : Identifying the challenges in implementing open science

DOI : http://dx.doi.org/10.12688/mniopenres.12805.1

The History, Advocacy and Efficacy of Data Management Plans

Authors : Nicholas Smale, Kathryn Unsworth, Gareth Denyer, Daniel Barr

Data management plans (DMPs) have increasingly been encouraged as a key component of institutional and funding body policy. Although DMPs necessarily place administrative burden on researchers, proponents claim that DMPs have myriad benefits, including enhanced research data quality, increased rates of data sharing, and institutional planning and compliance benefits.

In this manuscript, we explore the international history of DMPs and describe institutional and funding body DMP policy. We find that the presumed economic and societal benefits of increased rates of data sharing were the original driver of funding bodies mandating DMPs.

Today, 86% of UK Research Councils and 63% of US funding bodies require submission of a DMP with funding applications. Given that no major Australian funding bodies require DMP submission, it is of note that 37% of Australian universities have taken the initiative to internally mandate DMPs.

Institutions both within Australia and internationally frequently promote the professional benefits of DMP use, and endorse DMPs as ‘best practice’. We analyse one such typical DMP implementation at a major Australian institution, finding that DMPs have low levels of apparent translational value.

Indeed, an extensive literature review suggests there is very limited published systematic evidence that DMP use has any tangible benefit for researchers, institutions or funding bodies.

We are therefore led to question why DMPs have become the go-to tool for research data professionals and advocates of good data practice. By delineating multiple use-cases and highlighting the need for DMPs to be fit for intended purpose, we question the view that a good DMP is necessarily that which encompasses the entire data lifecycle of a project.

Finally, we summarise recent developments in the DMP landscape, and note a positive shift towards evidence-based research management through more researcher-centric, educative, and integrated DMP services.

URL : The History, Advocacy and Efficacy of Data Management Plans

DOI : https://doi.org/10.1101/443499