The amount of open access content stored in repositories has increased dramatically, which has created new technical and organisational challenges for bringing this content together. The COnnecting REpositories (CORE) project has been dealing with these challenges by aggregating and enriching content from hundreds of open access repositories, increasing the discoverability and reusability of millions of open access manuscripts.
Because repository managers and library directors often want detailed insight into the content harvested from their repositories, and a degree of control over it, CORE now faces the challenge of enabling content providers to manage their content within the aggregation and to steer the harvesting process. To improve the quality and transparency of the aggregation process and to create a two-way collaboration between the CORE project and the content providers, we propose the CORE Dashboard.
URL : https://www.liberquarterly.eu/articles/10138/
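CORE harvests repository metadata over OAI-PMH. As a minimal sketch of what the harvesting side of such an aggregator consumes, the snippet below parses the Dublin Core payload of an OAI-PMH ListRecords response using only the Python standard library; the sample response and record identifier are invented for illustration, and a real harvest would also follow resumptionTokens to page through the full record set.

```python
import xml.etree.ElementTree as ET

# Namespaces used in OAI-PMH responses carrying Dublin Core metadata.
OAI = "{http://www.openarchives.org/OAI/2.0/}"
DC = "{http://purl.org/dc/elements/1.1/}"

def parse_list_records(xml_text):
    """Extract (identifier, title) pairs from an OAI-PMH ListRecords response."""
    root = ET.fromstring(xml_text)
    records = []
    for rec in root.iter(OAI + "record"):
        ident = rec.findtext(f"{OAI}header/{OAI}identifier")
        title = rec.findtext(f".//{DC}title")  # first dc:title anywhere under the record
        records.append((ident, title))
    return records

# A trimmed, invented example response; real harvests page via resumptionToken.
sample = """<OAI-PMH xmlns="http://www.openarchives.org/OAI/2.0/">
  <ListRecords>
    <record>
      <header><identifier>oai:example.org:1234</identifier></header>
      <metadata>
        <oai_dc:dc xmlns:oai_dc="http://www.openarchives.org/OAI/2.0/oai_dc/"
                   xmlns:dc="http://purl.org/dc/elements/1.1/">
          <dc:title>Open Access and Repositories</dc:title>
        </oai_dc:dc>
      </metadata>
    </record>
  </ListRecords>
</OAI-PMH>"""

print(parse_list_records(sample))
```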
Recent additional open access (OA) requirements for publications by authors at UK higher education institutions have required amendments to institutional support mechanisms. These additional requirements arose primarily from the Research Councils UK Open Access Policy, applicable from April 2013, and the new OA policy for Research Excellence Framework eligibility, published in March 2014 and applicable from April 2016.
Further provision also had to be made for compliance with the UK Charities Open Access Fund, the European Union, other funder policies, and internal reporting requirements.
In response, the University of Glasgow has enhanced its OA processes and systems. This case study charts our journey towards managing OA via our EPrints repository. The aim was to consolidate and manage OA information in one central place to increase the efficiency of recording, tracking and reporting. We are delighted that considerable time savings and a reduction in errors have been achieved by dispensing with spreadsheets to record decisions about OA.
Managing open access with EPrints software: a case study
DOI : http://doi.org/10.1629/uksg.277
Staffing and Workflow of a Maturing Institutional Repository :
“Institutional repositories (IRs) have become established components of many academic libraries. As an IR matures it will face the challenge of how to scale up its operations to increase the amount and types of content archived. These challenges involve staffing, systems, workflows, and promotion. In the past eight years, Kansas State University’s IR (K-REx) has grown from a platform for student theses, dissertations, and reports to also include faculty works. The initial workforce of a single faculty member was expanded as a part of a library-wide reorganization, resulting in a cross-departmental team that is better able to accommodate the expansion of the IR. The resultant need to define staff responsibilities and develop resources to manage the workflows has led to the innovations described here, which may prove useful to the greater library community as other IRs mature.”
URL : http://jlsc-pub.org/jlsc/vol1/iss3/4/
Institutional Repositories: Exploration of Costs and Value :
“Little is known about the costs academic libraries incur to implement and manage institutional repositories and the value these institutional repositories offer to their communities. To address this, the authors report the findings of their 29 question survey of academic libraries with institutional repositories. We received 49 usable responses. Thirty-four of these responses completed the entire survey. While meant to be exploratory, results are varied and depend on the context of the institution. This context includes, among other things, the size of the repositories and of the institutions, the software used to operate the repositories, such as open source or proprietary, and whether librarians mediate content archiving, or content producers directly deposit their own material. The highlights of our findings, based on median values, suggest that institutions that mediate submissions incur less expense than institutions that allow self-archiving, institutions that offer additional services incur greater annual operating costs than those who do not, and institutions that use open source applications have lower implementation costs but comparable annual operating costs with institutions that use proprietary solutions. Furthermore, changes in budgeting, from special initiative to absorption into the regular budget, suggest a trend in sustainable support for institutional repositories. Our findings are exploratory but suggest that future inquiry into costs and the value of institutional repositories should be directed at specific types of institutions, such as by Carnegie Classification category.”
URL : http://www.dlib.org/dlib/january13/burns/01burns.html
Bibliographic Metadata Harvesting to Support the Management of an Institutional Repository :
“This thesis approaches the problem of automatic harvesting of bibliographic metadata records from several indexing services, in the context of the population of institutional repositories. Since the manual insertion of records is a tedious and error-prone task, the automation of the process intends to facilitate the management of a repository. However, the automated harvesting of records has to deal with the problem of identifying authors and with the need to consolidate duplicate records retrieved from different services. In an approach to the automation of the aforementioned task, we introduce a system that proposes to harvest bibliographic metadata records from different information sources publicly available, identify and consolidate the retrieved records that are considered duplicates and make available the results of such consolidation to external parties that are interested in the information, such as an institutional repository. The proposed system was tested with real bibliographic metadata corresponding to scientific publications of a subset of faculty members at Instituto Superior Técnico. The results of the evaluation show that, despite the required time to identify and consolidate, the merged records contain a valid aggregation of all available information in the system and can be efficiently accessed by external entities through a machine-to-machine interface.”
URL : https://dspace.ist.utl.pt/bitstream/2295/1271450/1/dissertacao.pdf
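The consolidation step the thesis describes can be approximated with a simple fingerprint-and-merge approach. The sketch below is an assumption for illustration, not the thesis's actual algorithm: records whose normalized title and year match are treated as duplicates, and their fields are merged by keeping the first non-empty value seen.

```python
import re
from collections import defaultdict

def fingerprint(record):
    """Crude duplicate key: lowercased title with punctuation collapsed, plus year."""
    title = re.sub(r"[^a-z0-9]+", " ", record["title"].lower()).strip()
    return (title, record.get("year"))

def consolidate(records):
    """Group records sharing a fingerprint and merge their fields,
    preferring the first non-empty value for each field."""
    groups = defaultdict(list)
    for rec in records:
        groups[fingerprint(rec)].append(rec)
    merged = []
    for dupes in groups.values():
        out = {}
        for rec in dupes:
            for field, value in rec.items():
                if value and not out.get(field):
                    out[field] = value
        merged.append(out)
    return merged

# Invented example: two variants of the same record from different services.
records = [
    {"title": "Open Access: A Survey", "year": 2012, "doi": ""},
    {"title": "Open access -- a survey", "year": 2012, "doi": "10.1000/xyz"},
]
print(consolidate(records))
```

A production system would need stronger matching (author disambiguation, DOI comparison, fuzzy titles), which is exactly the harder problem the thesis addresses.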
In 2009, the Institution for Social and Policy Studies (ISPS) at Yale University began building an open access digital collection of social science experimental data, metadata, and associated files produced by ISPS researchers.
The digital repository was created to support the replication of research findings and to enable further data analysis and instruction. Content is submitted to a rigorous process of quality assessment and normalization, including transformation of statistical code into R, an open-source statistical software environment.
Other requirements included: (a) that the repository be integrated with the current database of publications and projects publicly available on the ISPS website; (b) that it offered open access to datasets, documentation, and statistical software program files; (c) that it utilized persistent linking services and redundant storage provided within the Yale Digital Commons infrastructure; and (d) that it operated in accordance with the prevailing standards of the digital preservation community.
In partnership with Yale’s Office of Digital Assets and Infrastructure (ODAI), the ISPS Data Archive was launched in the fall of 2010.
We describe the process of creating the repository, discuss prospects for similar projects in the future, and explain how this specialized repository fits into the larger digital landscape at Yale.
URL : http://www.ijdc.net/index.php/ijdc/article/view/222
An activity-based costing model for long-term preservation and dissemination of digital research data: the case of DANS :
“Financial sustainability is an important attribute of a trusted, reliable digital repository. The authors of this paper use the case study approach to develop an activity-based costing (ABC) model. This is used for estimating the costs of preserving digital research data and identifying options for improving and sustaining relevant activities. The model is designed in the environment of the Data Archiving and Networked Services (DANS) institute, a well-known trusted repository. The DANS–ABC model has been tested on empirical cost data from activities performed by 51 employees in frames of over 40 different national and international projects. Costs of resources are being assigned to cost objects through activities and cost drivers. The ‘euros per dataset’ unit of costs measurement is introduced to analyse the outputs of the model. Funders, managers and other decision-making stakeholders are being provided with understandable information connected to the strategic goals of the organisation. The latter is being achieved by linking the DANS–ABC model to another widely used managerial tool—the Balanced Scorecard (BSC). The DANS–ABC model supports costing of services provided by a data archive, while the combination of the DANS–ABC with a BSC identifies areas in the digital preservation process where efficiency improvements are possible.”
URL : http://link.springer.com/article/10.1007/s00799-012-0092-1
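The paper's "euros per dataset" unit of cost measurement can be illustrated with a toy activity-based calculation: resource costs flow to cost objects through activities and cost drivers, and the total is divided by annual dataset throughput. The activities, shares, and figures below are invented for the example, not DANS data.

```python
# Toy activity-based costing sketch. Each activity has an annual resource
# cost and a driver-derived share attributable to preservation/dissemination.
# All figures are illustrative only.
activities = {
    # activity: (annual cost in euros, share attributed via cost driver)
    "ingest and curation": (120_000, 1.0),
    "storage and infrastructure": (80_000, 0.5),
    "user support": (60_000, 0.25),
}
datasets_per_year = 2_000

# Assign resource costs to the cost object through the activity shares.
total = sum(cost * share for cost, share in activities.values())
print(f"euros per dataset: {total / datasets_per_year:.2f}")  # prints: euros per dataset: 87.50
```

Linking such a model to a Balanced Scorecard, as the paper does, then ties these unit costs back to the organisation's strategic goals.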