“Rome Wasn’t Digitized in a Day”: Building a Cyberinfrastructure for Digital Classics

Cogent and insightful, Rome Wasn’t Digitized in a Day: Building a Cyberinfrastructure for Digital Classics rewards the reader with a many-faceted exploration of classical studies: the history of this complex and multidimensional field, its development of computer-based resources and tools over the last several decades, its current opportunities and needs in a digital era, and prospects for its future evolution as envisioned by digital classicists. Alison Babeu reminds us early in her report of the astonishing reach of classical studies, a field that includes the disciplines of history, literature, linguistics, art, anthropology, science, and mythology, among others, bounded by the Mycenaean culture at its most distant past and continuing to the seventh century C.E. Not surprisingly, within this historical compass the sources for classicists are equally complex: stone fragments, papyri, pottery shards, the plastic arts, coins, and some of the most breathtaking physical structures the world has known.

In the course of this report, the substantial gains in the use of digital technologies in service to classical studies become obvious. Over the past 40 years, remarkable resources have been built, including large-scale text databases in a variety of languages; digital repositories for archaeological data, as well as for coins and cuneiform tablets; and datasets of texts for paleography and epigraphical studies. Applications that assist the scholar in morphological analysis, citation linking, text mining, and treebank construction, among others, are impressive. The challenges are also significant: problems persist with the accuracy of OCR scans, with the interoperability of multimedia data that combine texts, images, and other forms of cultural expression, and with the daunting magnitude of so many languages in so many different scripts.

The intellectual return on this investment in technology as a service to classical studies is equally startling and complex. One of the more salient developments has been the reconceptualization of the text. As recently as a generation ago, the “text” in classics was most often defined as a definitive edition, a printed artifact that was by nature static, usually edited by a single scholar, and representing a compilation and collation of several extant variations. Today, through the power and fluidity of digital tools, a text can mean something very different: there may be no canonical artifact, but instead a dataset of its many variations, with none accorded primacy. A work of ancient literature is now more often deeply contextualized, its transmission over time more nuanced, and its continuity among the various instantiations more accurately articulated. The performative nature of some of the great works—the epics of Homer are a prime example—can be captured more rigorously by digital technology, which can layer the centuries of manuscript fragments to produce a sharper understanding of what was emphasized in the epics over time and what passages or stories appear less important from one era to another, affording new insight into the cultural appropriation of these fundamental expressions of the human condition.

Achieving these new perspectives has required a cultural change in the classics. Scholarship in the digital environment is more collaborative, and can include students as integral contributors to the research effort. The connections, continuities, and cultural dialogue to which classical works were subject are reflected by new teams of scholars, working across traditional disciplines (which can often include computer science) to develop new methodological approaches and intellectual strategies in pursuit of knowledge about the ancient world. In this regard, the digital classics encompass new alignments of traditional hierarchies, academic boundaries, and technologies.

The Council on Library and Information Resources is pleased to publish this far-reaching study. The issues and perspectives to which it gives voice pertain significantly to the humanities at large. Its appearance is especially relevant as plans to build very large digital libraries in Europe and the United States flourish. Indeed, a transdisciplinary approach will be essential in constructing a digital environment with the scale and sophistication necessary to support advanced research, teaching, and lifelong learning. As this study suggests, we must continue to engage humanists, engineers, scientists, and all manner of pedagogical expertise in pursuit of a new, transformative educational ecology.

URL : http://www.clir.org/pubs/reports/pub150/pub150.pdf

Knowledge without Borders : GÉANT 2020 as the European Communications Commons

The GÉANT Expert Group’s report on the 2020 Vision for European Research and Education Networking was delivered today to Neelie Kroes, European Commission Vice-President for the Digital Agenda. The report presents the experts’ views on the future of the pan-European research and education network GÉANT. It makes specific recommendations to policy makers, funding bodies, and the research and education networks community for supporting and expanding knowledge communities, pushing the state of the art in technology, and adapting to change from both a governance and a funding point of view.

The GÉANT Expert Group, chaired by Prof. Žiga Turk and composed of six other high-level European experts in different fields of policy, technology and science, was set up in 2010 with the mission to “articulate a 2020 vision for European Research and Education networking and identify an action plan for realizing this vision”.

URL : https://www.terena.org/about/ga/ga36/GEANTExpertGroup.pdf

Implementing Web 2.0 Design Patterns in an Institutional Repository May Increase Community Participation

Objective: To investigate whether Web 2.0 can enhance participation in institutional repositories (IRs) and whether its widespread use can lead to success in this context. Another purpose was to emphasize how an IR with a Web 2.0 approach can connect individuals through their creative and intellectual outputs, no matter what form of shared material is contributed.

Design: Comparative study.

Setting: Two IRs at Teachers College, Columbia University, which is a graduate and professional school of education in New York City.

Subjects: Students, faculty, and staff using the PocketKnowledge and CPC IRs.

Methods: Cocciolo compared two different IRs, PocketKnowledge and Community Program Collections (CPC). PocketKnowledge followed these Web 2.0 design patterns: users control their own data; users should be trusted; flexible tags are preferred over hierarchical taxonomies; the attitude should be playful; and software gets better the more people use it. These design patterns were compared with the traditional design of the CPC IR, which organized information around a taxonomy (e.g., programs and departments), gave users no control over their own content, and kept authority centralized. Data were collected during a 22-month period. The PocketKnowledge IR was studied from September 2006 to July 2008, compiling information on both contributions and contributors. Contributions made by library staff to make archival collections available were excluded from the data sets, because the study focused on community participation in the learning environment. The CPC was studied between November 2004 and July 2006. Data collected included the contributions made to the system and information on the role of the contributor (e.g., student, faculty, or staff).

Main Results: Participation was much greater in the Web 2.0 system (PocketKnowledge) than in the non-Web 2.0 system (CPC). Contributions to the latter, the CPC, came primarily from faculty (59%), with a smaller proportion (11%) coming from students. This trend was reversed in the Web 2.0 system, where 79% of the contributions came from students. However, as a group, faculty were better represented than the student body among contributors to the Web 2.0 system (23% and 8% respectively). Faculty members who created an account (without contributing) represented 30% of the population. These observations suggest that Web 2.0 is attractive to students as a space for sharing their intellectual creations, while at the same time it does not alienate the faculty. Nevertheless, although 31% of the student body had created a user account for PocketKnowledge, the Web 2.0 system, only 8% of the students actually contributed to this IR. The study examined only participation rates and did not investigate what motivated contributions to PocketKnowledge. The results can nonetheless be extended by observing that a limitation of previous IRs is their focus on the library goals of collecting and preserving scholarly work, with little attention to what prompts faculty to contribute. Despite the satisfactory participation in the two IRs of interest, the author argued that the incentive to contribute is tied more to the faculty member’s role as teacher than to the role as researcher. This reflects the ambition of faculty to improve the classroom-based experience by ensuring that their students are as engaged as possible with the teachers’ areas of expertise. In other words, a faculty contribution is motivated by knowing that students will become familiar with what is contributed.
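
To make the reported proportions concrete, here is a minimal Python sketch of how participation shares by contributor role might be tabulated from a contribution log. The records, user identifiers, and resulting percentages are hypothetical illustrations, not data from Cocciolo’s study.

```python
from collections import Counter

# Hypothetical contribution log: one entry per contribution, tagged with the
# contributor's role (student, faculty, or staff), loosely mirroring the kind
# of data described in the Methods section above.
contributions = [
    {"contributor": "u01", "role": "student"},
    {"contributor": "u02", "role": "faculty"},
    {"contributor": "u01", "role": "student"},
    {"contributor": "u03", "role": "staff"},
]

def participation_by_role(records):
    """Return each role's share of total contributions, as percentages."""
    counts = Counter(r["role"] for r in records)
    total = sum(counts.values())
    return {role: 100 * n / total for role, n in counts.items()}

print(participation_by_role(contributions))
# e.g. {'student': 50.0, 'faculty': 25.0, 'staff': 25.0}
```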

Conclusion: This study suggests that IRs can achieve greater participation by shifting the focus from library goals to the objective of building localized teaching and learning communities that connect individuals through their respective intellectual outputs. Creating a system that supports such exchange will still advance library goals by capturing faculty’s scholarly work, and Web 2.0 offers a set of approaches and design patterns for establishing systems that promote this kind of community participation. Greater student participation in an IR may in turn prompt increased faculty participation, because the IR will be focused more on the teaching and learning community than on the research community. The major finding of the study is thus that the Web 2.0 design pattern approach resulted in greater community participation.

URL : http://ejournals.library.ualberta.ca/index.php/EBLIP/article/view/9932

Digitize Me, Visualize Me, Search Me : Open Science and its Discontents

[…] Digitize Me, Visualize Me, Search Me takes as its starting point the so-called ‘computational turn’ to data-intensive scholarship in the humanities.

The phrase ‘the computational turn’ has been adopted to refer to the process whereby techniques and methodologies drawn from (in this case) computer science and related fields – including science visualization, interactive information visualization, image processing, network analysis, statistical data analysis, and the management, manipulation and mining of data – are being used to produce new ways of approaching and understanding texts in the humanities; what is sometimes thought of as ‘the digital humanities’.

The concern in the main has been with either digitizing ‘born analog’ humanities texts and artifacts (e.g. making annotated editions of the art and writing of William Blake available to scholars and researchers online), or gathering together ‘born digital’ humanities texts and artifacts (videos, websites, games, photography, sound recordings, 3D data), and then taking complex and often extremely large-scale data analysis techniques from computing science and related fields and applying them to these humanities texts and artifacts – to this ‘big data’, as it has been called.

Witness Lev Manovich and the Software Studies Initiative’s use of ‘digital image analysis and new visualization techniques’ to study ‘20,000 pages of Science and Popular Science magazines… published between 1872-1922, 780 paintings by van Gogh, 4535 covers of Time magazine (1923-2009) and one million manga pages’ (Manovich, 2011), and Dan Cohen and Fred Gibbs’s text mining of ‘the 1,681,161 books that were published in English in the UK in the long nineteenth century’ (Cohen, 2010).
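
As a small, hedged illustration of the kind of text-mining technique invoked here, the Python sketch below counts term frequencies across a tiny corpus. The sample sentences are hypothetical stand-ins; they are not drawn from Manovich’s datasets or from Cohen and Gibbs’s nineteenth-century books, which operate at vastly larger scale.

```python
import re
from collections import Counter

# Hypothetical miniature corpus standing in for a large digitized collection.
corpus = [
    "Steam engines and iron works spread across the nineteenth century.",
    "The nineteenth century saw science magazines reach a mass readership.",
]

def term_frequencies(documents, top_n=5):
    """Lowercase and tokenize each document, then count overall term frequency."""
    counts = Counter()
    for doc in documents:
        counts.update(re.findall(r"[a-z']+", doc.lower()))
    return counts.most_common(top_n)

print(term_frequencies(corpus))
# e.g. [('the', 2), ('nineteenth', 2), ('century', 2), ...]
```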

What Digitize Me, Visualize Me, Search Me endeavours to show is that such data-focused transformations in research can be seen as part of a major alteration in the status and nature of knowledge. It is an alteration that, according to the philosopher Jean-François Lyotard, has been taking place since at least the 1950s.

It involves nothing less than a shift away from a concern with questions of what is right and just, and toward a concern with legitimating power by optimizing the social system’s performance in instrumental, functional terms. This shift has significant consequences for our idea of knowledge.

[…] In particular, Digitize Me, Visualize Me, Search Me suggests that the turn in the humanities toward data-driven scholarship, science visualization, statistical data analysis, etc. can be placed alongside all those discourses that are being put forward at the moment – in both the academy and society – in the name of greater openness, transparency, efficiency and accountability.

URL : http://livingbooksaboutlife.org/pdfs/bookarchive/DigitizeMe.pdf

Knowledge Sharing Among Inventors: Some Historical Perspectives

This chapter documents instances from past centuries where inventors freely shared knowledge of their innovations with other inventors. It is widely believed that such knowledge sharing is a recent development, as in Open Source Software.

Our survey shows, instead, that innovators have long practiced “collective invention” at times, including inventions in such key technologies as steam engines, iron, steel, and textiles.

Generally, innovator behavior was substantially richer than the heroic portrayal often found in textbooks and museums. Knowledge sharing promoted innovation, sometimes coexisting with patents and at other times not, suggesting that policy should foster both knowledge sharing and invention incentives.

URL : http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1944201