Why academics under-share research data: A social relational theory

Authors : Janice Bially Mattern, Joseph Kohlburn, Heather Moulaison-Sandy

Despite their professed enthusiasm for open science, faculty researchers have been shown not to share their data freely; when they do share, they take a minimal approach. A robust research agenda in LIS has documented the under-sharing practices in which they engage and the motivations they profess.

Using theoretical frameworks from sociology to complement research in LIS, this article examines the broader context in which researchers are situated, theorizing the social relational dynamics in academia that influence faculty decisions and practices relating to data sharing.

We advance a theory that suggests that the academy has entered a period of transition, and faculty resistance to data sharing through foot-dragging is one response to shifting power dynamics. If the theory is borne out empirically, proponents of open access will need to find a way to encourage open academic research practices without undermining the social value of academic researchers.

URL : Why academics under-share research data: A social relational theory

DOI : https://doi.org/10.1002/asi.24938

Data Management Plans: Implications for Automated Analyses

Authors : Ngoc-Minh Pham, Heather Moulaison-Sandy, Bradley Wade Bishop, Hannah Gunderman

Data management plans (DMPs) are an essential part of planning data-driven research projects and ensuring long-term access and use of research data and digital objects; however, as text-based documents, DMPs must be analyzed manually for conformance to funder requirements.

This study presents a comparison of DMP evaluations for 21 funded projects using 1) an automated analysis that identifies elements aligned with best practices in support of open research initiatives and 2) a manually applied scorecard measuring these same elements.

The automated analysis revealed that terms related to availability (90% of DMPs), metadata (86% of DMPs), and sharing (81% of DMPs) were reliably supplied. Manual analysis revealed that 86% (n = 18) of funded DMPs were adequate, with strong discussions of data management personnel (average score: 2 out of 2), data sharing (average score: 1.83 out of 2), and limitations to data sharing (average score: 1.65 out of 2).
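The study does not detail its tooling here, but an automated term scan of the kind described can be sketched as simple keyword matching against thematic categories. The category names below mirror those reported (availability, metadata, sharing); the term lists themselves are illustrative assumptions, not the study's actual codebook.

```python
# Hypothetical sketch of an automated DMP term scan.
# Term lists are assumed for illustration, not drawn from the study.
CATEGORIES = {
    "availability": ["available", "availability", "access", "repository"],
    "metadata": ["metadata", "schema", "documentation"],
    "sharing": ["share", "sharing", "disseminate"],
}

def scan_dmp(text):
    """Return the set of categories whose terms appear in a DMP's text."""
    lower = text.lower()
    return {cat for cat, terms in CATEGORIES.items()
            if any(term in lower for term in terms)}

def coverage(dmps):
    """Percentage of DMPs that mention each category at least once."""
    counts = {cat: 0 for cat in CATEGORIES}
    for text in dmps:
        for cat in scan_dmp(text):
            counts[cat] += 1
    n = len(dmps) or 1
    return {cat: round(100 * c / n) for cat, c in counts.items()}
```

A scan like this explains why automated results are coarser than a scorecard: it can report that a term appears in, say, 90% of DMPs, but not how adequately the underlying practice is described.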

This study reveals that the automated approach to DMP assessment yields similar, though less granular, results than manual assessment of the DMPs, while being more efficient to produce. Additional observations and recommendations are presented to make data management planning exercises and automated analysis even more useful going forward.

URL : Data Management Plans: Implications for Automated Analyses

DOI : http://doi.org/10.5334/dsj-2023-002