Digitize Me, Visualize Me, Search Me: Open Science and its Discontents

[…] Digitize Me, Visualize Me, Search Me takes as its starting point the so-called ‘computational turn’ to data-intensive scholarship in the humanities.

The phrase ‘the computational turn’ has been adopted to refer to the process whereby techniques and methodologies drawn from (in this case) computer science and related fields – including science visualization, interactive information visualization, image processing, network analysis, statistical data analysis, and the management, manipulation and mining of data – are being used to produce new ways of approaching and understanding texts in the humanities: what is sometimes thought of as ‘the digital humanities’.

The concern in the main has been with either digitizing ‘born analog’ humanities texts and artifacts (e.g. making annotated editions of the art and writing of William Blake available to scholars and researchers online), or gathering together ‘born digital’ humanities texts and artifacts (videos, websites, games, photography, sound recordings, 3D data), and then taking complex and often extremely large-scale data analysis techniques from computing science and related fields and applying them to these humanities texts and artifacts – to this ‘big data’, as it has been called.

Witness Lev Manovich and the Software Studies Initiative’s use of ‘digital image analysis and new visualization techniques’ to study ‘20,000 pages of Science and Popular Science magazines… published between 1872-1922, 780 paintings by van Gogh, 4535 covers of Time magazine (1923-2009) and one million manga pages’ (Manovich, 2011), and Dan Cohen and Fred Gibbs’s text mining of ‘the 1,681,161 books that were published in English in the UK in the long nineteenth century’ (Cohen, 2010).

What Digitize Me, Visualize Me, Search Me endeavours to show is that such data-focused transformations in research can be seen as part of a major alteration in the status and nature of knowledge. It is an alteration that, according to the philosopher Jean-François Lyotard, has been taking place since at least the 1950s.

It involves nothing less than a shift away from a concern with questions of what is right and just, and toward a concern with legitimating power by optimizing the social system’s performance in instrumental, functional terms. This shift has significant consequences for our idea of knowledge.

[…] In particular, Digitize Me, Visualize Me, Search Me suggests that the turn in the humanities toward data-driven scholarship, science visualization, statistical data analysis, etc. can be placed alongside all those discourses currently being put forward – in both the academy and society – in the name of greater openness, transparency, efficiency and accountability.

URL: http://livingbooksaboutlife.org/pdfs/bookarchive/DigitizeMe.pdf

An Institutional Approach to Developing Research Data Management Infrastructure

This article outlines the work that the University of Oxford is undertaking to implement a coordinated data management infrastructure. The rationale for the approach being taken by Oxford is presented, with particular attention paid to the role of each service division. This is followed by a consideration of the relative advantages and disadvantages of institutional data repositories, as opposed to national or international data centres. The article then focuses on two ongoing JISC-funded projects, ‘Embedding Institutional Data Curation Services in Research’ (Eidcsr) and ‘Supporting Data Management Infrastructure for the Humanities’ (Sudamih).

Both projects are intra-institutional collaborations and involve working with researchers to develop particular aspects of infrastructure, including: University policy, systems for the preservation and documentation of research data, training and support, software tools for the visualisation of large images, and creating and sharing databases via the Web (Database as a Service).

URL: http://www.ijdc.net/index.php/ijdc/article/view/198

Citation and Peer Review of Data: Moving Towards Formal Data Publication

“This paper discusses many of the issues associated with formally publishing data in academia, focusing primarily on the structures that need to be put in place for peer review and formal citation of datasets. Data publication is becoming increasingly important to the scientific community, as it will provide a mechanism for those who create data to receive academic credit for their work and will allow the conclusions arising from an analysis to be more readily verifiable, thus promoting transparency in the scientific process. Peer review of data will also provide a mechanism for ensuring the quality of datasets, and we provide suggestions on the types of activities one expects to see in the peer review of data. A simple taxonomy of data publication methodologies is presented and evaluated, and the paper concludes with a discussion of dataset granularity, transience and semantics, along with a recommended human-readable citation syntax.”

URL: http://www.ijdc.net/index.php/ijdc/article/view/181

Building an Open Data Repository: Lessons and Challenges

Author: Limor Peer

The Internet has transformed scholarly research in many ways. Open access to data and other research output has been touted as a crucial step toward transparency and quality in science. This paper takes a critical look at what it takes to share social science research data, from the perspective of a small data repository at Yale University’s Institution for Social and Policy Studies.

The ISPS Data Archive was built to create an open access digital collection of social science experimental data, metadata, and associated files produced by ISPS researchers, for the purpose of replication of research findings, further analysis, and teaching.

This paper describes the development of the ISPS Data Archive and discusses the inter-related challenges of replication, integration, and stewardship. It argues that open data requires effort, investment of resources, and planning. By itself, it does not enhance knowledge.

URL: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1931048

Evaluating the Impact of Open Data Websites

“Over the past few years, the steady increase in the number of government open data websites has led to a call for appropriate evaluation tools. While some (Noveck, 2009) have expressed optimism as to the potential of government open data, others (Coglianese, 2009; Hindman, 2009) have been more hesitant. This paper therefore aims to answer the following question: how does one evaluate the success of open data websites in reaching democratic objectives? In doing so, it explores past academic studies and examines the researcher’s experience with interpretive inquiry. Using Data.gov as an example, it argues that survey-based research, a common tool in information systems analysis, may not be suited to open data websites. Instead, it suggests a content analysis methodology, which hopes to inform future research on the subject.”

URL: http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1926201

Report on the Legal Status of Research Data in the Four Partner Countries

This report compares the legal status of research data in the four KE partner countries. It also addresses where European copyright and database law creates flaws and obstacles impeding access to research data, and it singles out pre-conditions for making data openly available.

URL: http://repository.jisc.ac.uk/id/eprint/6280