Integrating Data Science Tools into a Graduate Level Data Management Course

Authors: Pete E. Pascuzzi, Megan R. Sapp Nelson

Objective

This paper describes a project to revise an existing research data management (RDM) course to include instruction in computer skills with robust data science tools.

Setting

A Carnegie R1 university.

Brief Description

Graduate student researchers need training in the basic concepts of RDM. However, they generally lack experience with robust data science tools to implement these concepts holistically. Two library instructors fundamentally redesigned an existing research RDM course to include instruction with such tools.

The course was divided into lecture and lab sections to accommodate the increased instructional burden. Learning objectives and assessments targeted higher-order skills so that students could demonstrate not only that they understood course concepts but that they could use their computer skills to implement them.

Results

Twelve students completed the first iteration of the course. Feedback from these students was very positive, and they appreciated the combination of theoretical concepts, computer skills, and hands-on activities. Based on student feedback, future iterations of the course will include more “flipped” content, including video lectures and interactive computer tutorials, to maximize active learning time in both lecture and lab.

The substance of this article is based upon poster presentations at RDAP Summit 2018.

URL : Integrating Data Science Tools into a Graduate Level Data Management Course

DOI : https://doi.org/10.7191/jeslib.2018.1152

Academic E-book Usability from the Student’s Perspective

Authors : Esta Tovstiadi, Natalia Tingle, Gabrielle Wiersma

Objective

This article describes how librarians systematically compared different e-book platforms to identify which features and design elements impact usability and user satisfaction.

Methods

This study employed task-based usability testing, including the “think-aloud protocol.” Students at the University of Colorado Boulder completed a series of typical tasks to compare the usability and measure user satisfaction with academic e-books.

For each title, five students completed the tasks on three e-book platforms: the publisher platform and two aggregators. Thirty-five students evaluated seven titles on nine academic e-book platforms.

Results

This study identified each platform’s strengths and weaknesses based on students’ experiences and preferences. The usability tests indicated that students preferred Ebook Central over EBSCO and strongly preferred the aggregators over publisher platforms.

Conclusions

Librarians can use student expectations and preferences to guide e-book purchasing decisions. Preferences may vary by institution, but variations in e-book layout and functionality impact students’ ability to successfully complete tasks and influence their affinity for or satisfaction with any given platform.

Usability testing is a useful tool for gauging user expectations and identifying preferences for features, functionality, and layout.

URL : Academic E-book Usability from the Student’s Perspective

DOI : https://doi.org/10.18438/eblip29457

The Rutgers Open Access Policy goes into effect: Faculty reaction and implementation lessons learned

Authors : Jane Otto, Laura Bowering Mullen

From laying the groundwork for the successful passage of a university-wide Open Access policy, through the development and planning that goes into a successful implementation, to “Day One” when the official university policy goes into effect, there is a long list of factors that affect faculty interest, participation and compliance.

The authors, Mullen and Otto, having detailed earlier aspects of the Rutgers University Open Access Policy passage and implementation planning, analyze and share the specifics that followed the rollout of the Policy and that continue to affect participation.

This case study presents some strategies and systems used to enhance author self-archiving in the newly minted SOAR (Scholarly Open Access at Rutgers) portal of the Rutgers institutional repository, including involvement of departmental liaison librarians, effective presentation of metrics, and a focus on targeted communication with faculty.

A focus is the roadblocks encountered as faculty began to deposit their scholarship, and the lessons learned from them. Early reactions from faculty and graduate students (doctoral students and postdocs) to various aspects of the Policy, as well as their use of SOAR for depositing their work, are included.

DOI : https://doi.org/10.7282/T3D50QDM

Measuring Open Access Policy Compliance: Results of a Survey

Authors : Shannon Kipphut-Smith, Michael Boock, Kimberly Chapman, Michaela Willi Hooper

INTRODUCTION

In the last decade, a significant number of institutions have adopted open access (OA) policies. Many of those working with OA policies are tasked with measuring policy compliance.

This article reports on a survey of Coalition of Open Access Policy Institutions (COAPI) members designed to better understand the methods currently used for measuring and communicating OA policy success.

METHODS

This electronic survey was distributed to the COAPI member listserv, inviting both institutions that have passed and implemented policies and those that are still developing policies to participate.

RESULTS

Responses to a number of questions on topics such as policy workflows, quantitative and qualitative measurement activities and related tools, and challenges varied widely; these responses are shared here.

DISCUSSION

It is clear that a number of COAPI members struggle with identifying what should be measured and what tools and methods are appropriate. The survey illustrates how each institution measures compliance differently, making it difficult to benchmark against peer institutions.

CONCLUSION

As a result of this survey, we recommend that institutions working with OA policies be as transparent as possible about their data sources and methods when calculating deposit rates and other quantitative measures.

It is hoped that this transparency will result in the development of a set of qualitative and quantitative best practices for assessing OA policies that standardizes assessment terminology and articulates why institutions may want to measure policies.

URL : Measuring Open Access Policy Compliance: Results of a Survey

DOI : https://doi.org/10.7710/2162-3309.2247

How are we Measuring Up? Evaluating Research Data Services in Academic Libraries

Authors : Heather L. Coates, Jake Carlson, Ryan Clement, Margaret Henderson, Lisa R Johnston, Yasmeen Shorish

INTRODUCTION

In the years since the emergence of federal funding agency data management and sharing requirements (http://datasharing.sparcopen.org/data), research data services (RDS) have expanded to dozens of academic libraries in the United States.

As these services have matured, service providers have begun to assess them. Given a lack of practical guidance in the literature, we seek to begin the discussion with several case studies and an exploration of four approaches suitable to assessing these emerging services.

DESCRIPTION OF PROGRAM

This article examines five case studies that vary by staffing, drivers, and institutional context in order to begin a practice-oriented conversation about how to evaluate and assess research data services in academic libraries.

The case studies highlight some commonly discussed challenges, including insufficient training and resources, competing demands for evaluation efforts, and the tension between evidence that can be easily gathered and that which addresses our most important questions.

We explore reflective practice, formative evaluation, developmental evaluation, and evidence-based library and information practice for ideas to advance practice.

NEXT STEPS

Data specialists engaged in providing research data services need strategies and tools with which to make decisions about their services. These range from identifying stakeholder needs to refining existing services to determining when to extend or discontinue declining services.

While the landscape of research data services is broad and diverse, there are common needs that we can address as a community. To that end, we have created a community-owned space to facilitate the exchange of knowledge and existing resources.

URL : How are we Measuring Up? Evaluating Research Data Services in Academic Libraries

DOI : http://doi.org/10.7710/2162-3309.2226

Worth the Wait? Using Past Patterns to Determine Wait Periods for E-Books Released After Print

Author : Karen Kohn

This paper asks if there is an optimal wait period for e-books that balances libraries’ desire to acquire books soon after their publication with the frequent desire to purchase books electronically whenever feasible.

Analyzing 13,043 titles that Temple University Libraries received on its e-preferred approval plan in 2014–15, the author examines the delay between the publication of print books and the publication of their electronic versions. The analysis finds that most books on the approval plan are published electronically within a week of the print edition. Recommended wait periods are provided for different subjects.

URL : Worth the Wait? Using Past Patterns to Determine Wait Periods for E-Books Released After Print

DOI : https://doi.org/10.5860/crl.79.1.35

Scaling Research Data Management Services Along the Maturity Spectrum: Three Institutional Perspectives

Authors : Cinthya Ippoliti, Amy Koshoffer, Renaine Julian, Micah Vandegrift, Devin Soper, Sophie Meridien

Research data services promise to advance many academic libraries’ strategic goals of becoming partners in the research process and integrating library services with modern research workflows. Academic librarians are well positioned to make an impact in this space due to their expertise in managing, curating, and preserving digital information, and a history of engaging with scholarly communications writ large.

Some academic libraries have quickly developed infrastructure and support for every activity ranging from data storage and curation to project management and collaboration, while others are just beginning to think about addressing the data needs of their researchers.

Regardless of which end of the spectrum they identify with, libraries are still seeking to understand the research landscape and define their role in the process.

This article seeks to blend both a general perspective regarding these issues with actual case studies derived from three institutions, University of Cincinnati, Oklahoma State University, and Florida State University, all of which are at different levels of implementation, maturity, and campus involvement.

URL : Scaling Research Data Management Services Along the Maturity Spectrum: Three Institutional Perspectives

DOI : https://dx.doi.org/10.17605/OSF.IO/WZ8FN