OER: Lessons from the Field

Authors : Roy Kaufman, Andrew Campana

As open educational resources (OER) expand in the US and elsewhere, attention should be paid to the challenges of implementing OER and to solutions to those challenges. OER hold great promise for instructing students in primary and secondary (K-12) classrooms because, unlike traditional curriculum materials, OER content can be legally and freely copied, used, adapted, and reshared by anyone.

Notwithstanding these benefits, OER developers have not yet worked out certain structural issues that can make it difficult for teachers and students to use OER, impeding the adoption and broader acceptance of even the best-designed OER curricula.

Links that disappear over time; device management; data and privacy concerns; challenges of quality, scope, sequence, and alignment; copyright issues; and the sustainability of OER curricula are all challenges that OER advocates and curriculum designers often miss, ignore, or avoid.

These challenges, however, can be overcome through thoughtful planning and partnerships, as has been done in the US with the successful Louisiana Guidebooks and other OER course materials.

URL : OER: Lessons from the Field

DOI : https://doi.org/10.1629/uksg.464

How to Fight Fair Use Fear, Uncertainty, and Doubt: The Experience of One Open Educational Resource

Author : Lindsey Weeramuni

At the launch of one of the early online open educational resources (OER) in 2002, the approach to addressing copyright was uncertain. Did the university or the faculty own their material? How would the third-party material be handled? Was all of its use considered fair use under Section 107 of the U.S. Copyright Act (Title 17, United States Code) because of its educational purpose?

Or was seeking permission necessary for the project to succeed and to protect the integrity of the faculty and the university? For many years, this OER was conservative in its approach to third-party material, avoiding fair use claims on the theory that fair use was too risky and too difficult to prove in the face of an infringement claim.

Additionally, as one of the early projects of its kind, it feared becoming a target for ambitious copyright holders wanting to make headlines (and perhaps win lawsuits). It was not until 2009 that the Code of Best Practices in Fair Use for OpenCourseWare was written, by a community of practitioners who believed that if fair use worked for documentary filmmakers, video creators, and others (including big media), it worked in open education as well.

Once this Code was adopted, universities and institutions were able to offer richer and more complete course content to their users than before. This paper explains how that happened at this early open educational resource offering.

URL : How to Fight Fair Use Fear, Uncertainty, and Doubt: The Experience of One Open Educational Resource

DOI : https://doi.org/10.17161/jcel.v3i1.9751

Integrating Data Science Tools into a Graduate Level Data Management Course

Authors : Pete E. Pascuzzi, Megan R. Sapp Nelson

Objective

This paper describes a project to revise an existing research data management (RDM) course to include instruction in computer skills with robust data science tools.

Setting

A Carnegie R1 university.

Brief Description

Graduate student researchers need training in the basic concepts of RDM. However, they generally lack experience with the robust data science tools needed to implement these concepts holistically. Two library instructors fundamentally redesigned an existing RDM course to include instruction with such tools.

The course was divided into lecture and lab sections to accommodate the increased instructional burden. Learning objectives and assessments were designed at a higher order so that students could demonstrate not only that they understood course concepts but also that they could use their computer skills to implement those concepts.
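As a hedged illustration only: the abstract does not name the specific tools taught, so the following sketch assumes Python and pandas, with hypothetical file and column names, to show the kind of lab exercise that pairs an RDM concept (documenting a dataset) with a data science tool.

    import pandas as pd

    # Load a raw dataset (the file name is hypothetical).
    df = pd.read_csv("survey_responses.csv")

    # Build a minimal data dictionary: one row per variable, recording
    # its name, inferred type, missing-value count, and an example value.
    data_dictionary = pd.DataFrame({
        "variable": df.columns,
        "dtype": [str(t) for t in df.dtypes],
        "n_missing": df.isna().sum().values,
        "example": [df[c].dropna().iloc[0] if df[c].notna().any() else None
                    for c in df.columns],
    })

    # Save the dictionary alongside the data so the documentation travels with it.
    data_dictionary.to_csv("survey_responses_dictionary.csv", index=False)

An exercise like this lets students demonstrate an RDM concept by implementing it in code rather than merely describing it.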

Results

Twelve students completed the first iteration of the course. Feedback from these students was very positive; they appreciated the combination of theoretical concepts, computer skills, and hands-on activities. Based on student feedback, future iterations of the course will include more “flipped” content, including video lectures and interactive computer tutorials, to maximize active learning time in both lecture and lab.

The substance of this article is based upon poster presentations at RDAP Summit 2018.

URL : Integrating Data Science Tools into a Graduate Level Data Management Course

DOI : https://doi.org/10.7191/jeslib.2018.1152

Academic E-book Usability from the Student’s Perspective

Authors : Esta Tovstiadi, Natalia Tingle, Gabrielle Wiersma

Objective

This article describes how librarians systematically compared e-book platforms to identify which features and design choices affect usability and user satisfaction.

Methods

This study employed task-based usability testing, including the “think-aloud protocol.” Students at the University of Colorado Boulder completed a series of typical tasks to compare the usability and measure user satisfaction with academic e-books.

For each title, five students completed the tasks on three e-book platforms: the publisher platform and two aggregators. Thirty-five students evaluated seven titles on nine academic e-book platforms.
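As a hypothetical sketch of how such task-based results might be summarized (the study's actual data layout and analysis are not described in this abstract), assume a log with one row per student-task attempt and columns platform, task, completed, and satisfaction:

    import pandas as pd

    # Hypothetical results log: one row per student-task attempt.
    results = pd.read_csv("usability_results.csv")

    # Summarize per platform: share of tasks completed, mean satisfaction
    # rating, and number of attempts observed.
    summary = results.groupby("platform").agg(
        completion_rate=("completed", "mean"),
        mean_satisfaction=("satisfaction", "mean"),
        attempts=("task", "count"),
    )

    print(summary.sort_values("completion_rate", ascending=False))

A table like this makes platform-level comparisons (e.g., aggregators versus publisher platforms) straightforward to read off.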

Results

This study identified each platform’s strengths and weaknesses based on students’ experiences and preferences. The usability tests indicated that students preferred Ebook Central over EBSCO and strongly preferred the aggregators over publisher platforms.

Conclusions

Librarians can use student expectations and preferences to guide e-book purchasing decisions. Preferences may vary by institution, but variations in e-book layout and functionality impact students’ ability to complete tasks successfully and influence their affinity for, or satisfaction with, any given platform.

Usability testing is a useful tool for gauging user expectations and identifying preferences for features, functionality, and layout.

URL : Academic E-book Usability from the Student’s Perspective

DOI : https://doi.org/10.18438/eblip29457

The Rutgers Open Access Policy goes into effect: Faculty reaction and implementation lessons learned

Authors : Jane Otto, Laura Bowering Mullen

From laying the groundwork for the successful passage of a university-wide Open Access policy, through the development and planning that goes into a successful implementation, to “Day One” when the official university policy goes into effect, there is a long list of factors that affect faculty interest, participation and compliance.

The authors, Mullen and Otto, having detailed the passage of the Rutgers University Open Access Policy and the planning for its implementation in earlier work, analyze and share the specifics that followed the rollout of the Policy and that continue to affect participation.

This case study presents some strategies and systems used to enhance author self-archiving in the newly minted SOAR (Scholarly Open Access at Rutgers) portal of the Rutgers institutional repository, including involvement of departmental liaison librarians, effective presentation of metrics, and a focus on targeted communication with faculty.

A particular focus is the roadblocks encountered as faculty began to deposit their scholarship, and the lessons learned. Early reactions from faculty and graduate students (doctoral students and postdocs) to various aspects of the Policy, as well as their use of SOAR for depositing their work, are also included.

DOI : https://doi.org/10.7282/T3D50QDM

Measuring Open Access Policy Compliance: Results of a Survey

Authors : Shannon Kipphut-Smith, Michael Boock, Kimberly Chapman, Michaela Willi Hooper

INTRODUCTION

In the last decade, a significant number of institutions have adopted open access (OA) policies. Many of those working with OA policies are tasked with measuring policy compliance.

This article reports on a survey of Coalition of Open Access Policy Institutions (COAPI) members designed to better understand the methods currently used for measuring and communicating OA policy success.

METHODS

This electronic survey was distributed to the COAPI member listserv, inviting participation both from institutions that have passed and implemented policies and from those still developing them.

RESULTS

Responses to a number of questions, covering topics such as policy workflows, quantitative and qualitative measurement activities and related tools, and challenges encountered, varied widely; they are shared here.

DISCUSSION

It is clear that a number of COAPI members struggle with identifying what should be measured and what tools and methods are appropriate. The survey illustrates how each institution measures compliance differently, making it difficult to benchmark against peer institutions.

CONCLUSION

As a result of this survey, we recommend that institutions working with OA policies be as transparent as possible about their data sources and methods when calculating deposit rates and other quantitative measures.
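For example, a deposit rate is typically a simple ratio, but its meaning depends entirely on how the numerator and denominator are sourced; the figures below are purely illustrative and not drawn from the survey.

    # Hypothetical deposit-rate calculation; counts and sources are illustrative.
    eligible_outputs = 1240    # e.g., policy-covered articles found in a citation index
    repository_deposits = 487  # e.g., matching records in the institutional repository

    deposit_rate = repository_deposits / eligible_outputs
    print(f"Deposit rate: {deposit_rate:.1%}")  # Deposit rate: 39.3%

Two institutions reporting the same rate could be measuring quite different things if one counts all faculty publications and the other counts only those indexed in a particular database, which is why transparency about sources matters.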

It is hoped that this transparency will result in a set of qualitative and quantitative best practices for assessing OA policies, one that standardizes assessment terminology and articulates why institutions may want to measure their policies.

URL : Measuring Open Access Policy Compliance: Results of a Survey

DOI : https://doi.org/10.7710/2162-3309.2247

How are we Measuring Up? Evaluating Research Data Services in Academic Libraries

Authors : Heather L. Coates, Jake Carlson, Ryan Clement, Margaret Henderson, Lisa R Johnston, Yasmeen Shorish

INTRODUCTION

In the years since the emergence of federal funding agency data management and sharing requirements (http://datasharing.sparcopen.org/data), research data services (RDS) have expanded to dozens of academic libraries in the United States.

As these services have matured, service providers have begun to assess them. Given the lack of practical guidance in the literature, we seek to begin the discussion with several case studies and an exploration of four approaches suitable for assessing these emerging services.

DESCRIPTION OF PROGRAM

This article examines five case studies that vary by staffing, drivers, and institutional context in order to begin a practice-oriented conversation about how to evaluate and assess research data services in academic libraries.

The case studies highlight some commonly discussed challenges, including insufficient training and resources, competing demands for evaluation efforts, and the tension between evidence that can be easily gathered and that which addresses our most important questions.

We explore reflective practice, formative evaluation, developmental evaluation, and evidence-based library and information practice for ideas to advance practice.

NEXT STEPS

Data specialists engaged in providing research data services need strategies and tools with which to make decisions about their services. These range from identifying stakeholder needs, to refining existing services, to determining when to extend a service and when to discontinue a declining one.

While the landscape of research data services is broad and diverse, there are common needs that we can address as a community. To that end, we have created a community-owned space to facilitate the exchange of knowledge and existing resources.

URL : How are we Measuring Up? Evaluating Research Data Services in Academic Libraries

DOI : https://doi.org/10.7710/2162-3309.2226