Do Open Access Mandates Work? A Systematized Review of the Literature on Open Access Publishing Rates

Authors : Elena Azadbakht, Tara Radniecki, Teresa Schultz, Amy W. Shannon

To encourage the sharing of research, various entities—including public and private funders, universities, and academic journals—have enacted open access (OA) mandates or data sharing policies.

It is unclear, however, whether these OA mandates and policies increase the rate of OA publishing and data sharing within the research communities impacted by them. A team of librarians conducted a systematized review of the literature to answer this question. A comprehensive search of several scholarly databases and grey literature sources resulted in 4,689 unique citations.

However, only five articles met the inclusion criteria and were deemed to have an acceptable risk of bias. This sample showed that although the majority of the mandates described in the literature were correlated with a subsequent increase in OA publishing or data sharing, the presence of various confounders and the differing methods the studies’ authors used to collect and analyze data made it impossible to establish a causal relationship.

URL : Do Open Access Mandates Work? A Systematized Review of the Literature on Open Access Publishing Rates

DOI : https://doi.org/10.31274/jlsc.15444

Not open for all: accessibility of open textbooks

Authors : Elena Azadbakht, Teresa Schultz, Jennifer Arellano

In order for open educational resources (OERs) to be truly open to all, they must be accessible to learners with disabilities, including those with visual, auditory, physical and cognitive disabilities.

This study sought to determine the accessibility of a randomly selected sample of 355 open textbooks using a custom rubric based upon the World Wide Web Consortium’s (W3C’s) Web Content Accessibility Guidelines (WCAG), version 2.1, primarily at Levels A and AA. Included books fell into one of four format types: HTML files/websites, PDFs, Microsoft Word documents and EPUBs.

The average number of ‘fails’ – instances in which a book ran afoul of a rubric category – across the whole sample was 5.93 and the median was 6, out of a total of 14 or 15 categories, depending on the format type.

Overall, most of the books did not meet basic accessibility requirements, such as including alternative text for images, properly coding/tagging tables and following a logical heading order.
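Two of the criteria above lend themselves to a quick illustration in code. The sketch below is not the authors’ rubric; it is a minimal, hypothetical example of how WCAG-style checks for alternative text and heading order might be automated for HTML-format books, using Python and BeautifulSoup.

```python
# Hypothetical illustration only; not the rubric used in the study.
from bs4 import BeautifulSoup

def images_missing_alt(html: str) -> list:
    """Return the <img> tags that lack a non-empty alt attribute (WCAG 1.1.1)."""
    soup = BeautifulSoup(html, "html.parser")
    return [img for img in soup.find_all("img") if not img.get("alt")]

def heading_order_is_logical(html: str) -> bool:
    """Check that heading levels never skip downward (e.g. an h2 followed directly by an h4)."""
    soup = BeautifulSoup(html, "html.parser")
    levels = [int(h.name[1]) for h in soup.find_all(["h1", "h2", "h3", "h4", "h5", "h6"])]
    return all(later - earlier <= 1 for earlier, later in zip(levels, levels[1:]))

sample = "<h1>Chapter</h1><h3>Skipped a level</h3><img src='fig1.png'>"
print(len(images_missing_alt(sample)))   # 1 image without alt text
print(heading_order_is_logical(sample))  # False: h1 jumps straight to h3
```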

URL : Not open for all: accessibility of open textbooks

DOI : http://doi.org/10.1629/uksg.557

Research Data Management Services in Academic Libraries in the US: A Content Analysis of Libraries’ Websites

Authors : Ayoung Yoon, Teresa Schultz

Examining the landscape of research data management services in academic libraries is timely and significant, both for libraries on the front line of offering these services and for those that are already ahead.

Such an examination provides an overall understanding of where research data management programs stand and where they are heading; it also sheds light on current practices, data management recommendations and tool adoption, and reveals areas in need of improvement and support.

This study examined the research data (management) services in academic libraries in the United States through a content analysis of 185 library websites, with four main areas of focus: service, information, education, and network.

The results from the content analysis of these webpages reveal that libraries need to advance and engage more actively in providing services, supplying information online, and developing educational services.

There is also wide variation among libraries’ data management services and programs, as reflected in their web presence.

URL : http://crl.acrl.org/index.php/crl/article/view/16788/18346

Practicing What You Preach: Evaluating Access of Open Access Research

Author : Teresa Schultz

The open access movement seeks to encourage all researchers to make their works openly available and free of paywalls so more people can access their knowledge. Yet some researchers who study open access (OA) continue to publish their work in paywalled journals and fail to make it open.

This project set out to study just how many published research articles about OA fall into this category, how many are being made open (whether by being published in a gold OA or hybrid journal or through open deposit), and how library and information science authors compare to other disciplines researching this field.

Because of the growth of tools available to help researchers find open versions of articles, this study also sought to compare these new tools with Google Scholar in their ability to disseminate OA research.

From a sample collected from Web of Science of articles published since 2010, the study found that although a majority of research articles about OA are open in some form, a little more than a quarter are not.

Library science researchers made their work open at a lower rate than non-library science researchers. In examining the copyright status of articles published in hybrid and open journals, authors were more likely to retain copyright ownership if they published in an open journal than if they published in a hybrid journal.

Articles were more likely to be published with a Creative Commons license if published in an open journal compared to those published in hybrid journals.

URL : Practicing What You Preach: Evaluating Access of Open Access Research

DOI : https://dx.doi.org/10.17605/OSF.IO/YBDR8


Citations as Data: Harvesting the Scholarly Record of your University to Enrich Institutional Knowledge and Support Research

Authors : Ayoung Yoon, Teresa Schultz

Many research libraries are looking for new ways to demonstrate value for their parent institutions. Metrics, assessment, and promotion of research continue to grow in importance, but have not always fallen into the scope of services for the research library.

Montana State University (MSU) Library recognized a need and interest to quantify the citation record and scholarly output of our university. With this vision in mind, we began positioning citation collection as the data engine that drives scholarly communication, deposits into our IR, and assessment of research activities.

We envisioned a project that might: provide transparency around the acts of scholarship at our university; celebrate the research we produce; and build new relationships between our researchers.

The result was our MSU Research Citation application (https://arc.lib.montana.edu/msu-research-citations/) and our research publication promotion services (http://www.montana.edu/research/publications/). The application and accompanying services are predicated on the principle that each citation is a discrete data object that can be searched, browsed, exported, and reused.

In this formulation, the record of our research publications is the data that can open up possibilities for new library projects and services.
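As a rough illustration of the “citation as a discrete data object” idea, the sketch below models a single citation as a small record that can be searched, exported and reused. The class name and fields are hypothetical and are not the schema of the MSU Research Citation application.

```python
# Hypothetical sketch of a citation treated as a discrete, reusable data object;
# not the actual data model behind the MSU Research Citation application.
from dataclasses import dataclass, field, asdict
import json

@dataclass
class Citation:
    title: str
    authors: list
    journal: str
    year: int
    doi: str
    keywords: list = field(default_factory=list)

    def matches(self, query: str) -> bool:
        """Naive full-text search across the citation's fields."""
        text = " ".join([self.title, self.journal, *self.authors, *self.keywords]).lower()
        return query.lower() in text

    def to_json(self) -> str:
        """Export the citation for reuse, e.g. in an IR deposit or an assessment report."""
        return json.dumps(asdict(self), indent=2)

record = Citation("An Example Article", ["A. Author"], "An Example Journal", 2016, "10.1234/example")
print(record.matches("example"))  # True
print(record.to_json())
```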

URL : http://crl.acrl.org/content/early/2016/11/16/crl16-1023.short