How are we Measuring Up? Evaluating Research Data Services in Academic Libraries

Authors : Heather L. Coates, Jake Carlson, Ryan Clement, Margaret Henderson, Lisa R. Johnston, Yasmeen Shorish

INTRODUCTION

In the years since the emergence of federal funding agency data management and sharing requirements (http://datasharing.sparcopen.org/data), research data services (RDS) have expanded to dozens of academic libraries in the United States.

As these services have matured, service providers have begun to assess them. Given a lack of practical guidance in the literature, we seek to begin the discussion with several case studies and an exploration of four approaches suitable for assessing these emerging services.

DESCRIPTION OF PROGRAM

This article examines five case studies that vary by staffing, drivers, and institutional context in order to begin a practice-oriented conversation about how to evaluate and assess research data services in academic libraries.

The case studies highlight some commonly discussed challenges, including insufficient training and resources, competing demands for evaluation efforts, and the tension between evidence that can be easily gathered and that which addresses our most important questions.

We explore reflective practice, formative evaluation, developmental evaluation, and evidence-based library and information practice for ideas to advance practice.

NEXT STEPS

Data specialists engaged in providing research data services need strategies and tools with which to make decisions about their services. These decisions range from identifying stakeholder needs and refining existing services to determining when to extend or discontinue declining services.

While the landscape of research data services is broad and diverse, there are common needs that we can address as a community. To that end, we have created a community-owned space to facilitate the exchange of knowledge and existing resources.

URL : How are we Measuring Up? Evaluating Research Data Services in Academic Libraries

DOI : http://doi.org/10.7710/2162-3309.2226

Data Management Plan Requirements for Campus Grant Competitions: Opportunities for Research Data Services Assessment and Outreach

Objective

To examine the effects of research data services (RDS) on the quality of data management plans (DMPs) required for a campus-level faculty grant competition, as well as to explore opportunities that the local DMP requirement presented for RDS outreach.

Methods

Nine reviewers each scored a randomly assigned portion of the DMPs from 82 competition proposals. Each DMP was scored by three reviewers, and the three scores were averaged to obtain the final score. Interrater reliability was measured using intraclass correlation.
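To make the scoring and reliability steps concrete, here is a minimal sketch in Python using the pingouin library and invented reviewer scores; it assumes a fully crossed toy design (every DMP rated by the same three reviewers), which simplifies the study's actual partial assignment across nine reviewers. This is not the authors' code or data.

```python
import pandas as pd
import pingouin as pg

# Hypothetical long-format ratings: each DMP scored by the same three reviewers.
scores = pd.DataFrame({
    "dmp_id":   [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "reviewer": ["A", "B", "C"] * 4,
    "score":    [4.0, 3.5, 4.5, 2.0, 2.5, 3.0, 5.0, 4.5, 4.0, 3.0, 3.5, 2.5],
})

# Intraclass correlation as a measure of interrater reliability.
icc = pg.intraclass_corr(data=scores, targets="dmp_id",
                         raters="reviewer", ratings="score")
print(icc[["Type", "ICC", "CI95%"]])

# Final DMP score = mean of the reviewers' scores, mirroring the averaging step above.
final_scores = scores.groupby("dmp_id")["score"].mean()
print(final_scores)
```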

Unpaired t-tests were used to compare mean DMP scores for faculty who utilized RDS with those who did not. Unpaired t-tests were also used to compare mean DMP scores for proposals that were funded with those that were not. One-way ANOVA was used to compare mean DMP scores among proposals from six broad disciplinary categories.
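The group comparisons described above can be illustrated with a short scipy.stats sketch; the score arrays and group sizes below are invented for illustration and are not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical mean DMP scores for faculty who did and did not use RDS consultations.
rds_users     = np.array([4.2, 3.9, 4.5, 4.8, 3.7, 4.1])
non_rds_users = np.array([3.1, 2.8, 3.5, 3.0, 2.6, 3.3])
t_rds, p_rds = stats.ttest_ind(rds_users, non_rds_users)

# Hypothetical scores for funded vs. unfunded proposals.
funded   = np.array([3.9, 3.4, 4.1, 3.6, 3.8])
unfunded = np.array([3.7, 3.2, 3.8, 3.5, 3.3, 3.4])
t_fund, p_fund = stats.ttest_ind(funded, unfunded)

# One-way ANOVA across six hypothetical disciplinary categories.
rng = np.random.default_rng(0)
disciplines = [rng.normal(loc=3.5, scale=0.5, size=12) for _ in range(6)]
f_stat, p_anova = stats.f_oneway(*disciplines)

print(f"RDS vs. non-RDS users: t = {t_rds:.2f}, p = {p_rds:.3f}")
print(f"Funded vs. unfunded:   t = {t_fund:.2f}, p = {p_fund:.3f}")
print(f"Disciplinary ANOVA:    F = {f_stat:.2f}, p = {p_anova:.3f}")
```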

Results

Analyses showed that RDS consultations had a statistically significant effect on DMP scores. Differences between DMP scores for funded versus unfunded proposals and among disciplinary categories were not significant. The DMP requirement also provided a number of expected and unexpected outreach opportunities for RDS.

Conclusions

Requiring DMPs for campus grant competitions can provide important assessment and outreach opportunities for research data services.

While these results might not be generalizable to DMP review processes at federal funding agencies, they do suggest the importance, at any level, of developing a shared understanding of what constitutes a high-quality DMP among grant applicants, grant reviewers, and RDS providers.

URL : Data Management Plan Requirements for Campus Grant Competitions

DOI : http://dx.doi.org/10.7191/jeslib.2016.1089