Social engagement and institutional repositories: a case study

Author : Susan Boulton

This article explores the community reach and societal impact of institutional repositories, in particular Griffith Research Online (GRO), Griffith University’s institutional repository.

To promote research on GRO, and to encourage people to click through to the repository content, a pilot social media campaign and some subsequent smaller social media activities were undertaken in 2018.

After briefly describing these campaigns, this article offers reflections on the activities and proposes options for the future direction of social engagement, for GRO in particular and for institutional repositories in general.

This undertaking necessitates a shift in focus from repositories as a resource for the scholarly community to a resource for the community at large. The campaign also highlighted the need to look beyond performance metrics to social media metrics as a measure of the social and community impact of a repository.

Whilst the article is written from one Australian university’s perspective, the drivers and challenges behind researchers and universities translating their research into economic, social, environmental and cultural impacts are national and international.

The primary takeaway message is for libraries to take more of a proactive stance and to kick-start conversations within their institutions and with their clients to actively partner in creating opportunities to share research.

URL : Social engagement and institutional repositories: a case study


Rethinking the Journal Impact Factor and Publishing in the Digital Age

Authors : Mark S. Nestor, Daniel Fischer, David Arnold, Brian Berman, James Q. Del Rosso

Clinical and experimental literature search has changed significantly over the past few decades, and with it, the way in which we value information. Today, our need for immediate access to relevant and specific literature, regardless of specialty, has led to a growing demand for open access to publications.

The Journal Impact Factor (JIF) has been a long-time standard for representing the quality or “prestige” of a journal, but it appears to be losing its relevance. Here, we define the JIF and deconstruct its validity as a modern measure of a journal’s quality, discuss the current models of academic publication, including their advantages and shortcomings, and discuss the benefits and shortcomings of a variety of open-access models, including costs to the author.

We have quantified a nonsubscribed physician’s access to full articles associated with dermatologic disease and aesthetics cited on PubMed. For some of the most common dermatology conditions, 23.1% of citations (ranging from 17.2% for melasma to 31.9% for malignant melanoma) were available as free full articles, and for aesthetic procedures, 18.9% of citations (ranging from 11.9% for laser hair removal to 27.9% for botulinum toxin) were available as free full articles.

Finally, we discuss existing alternative metrics for measuring journal impact and propose the adoption of a superior publishing model, one that satisfies modern day standards of scholarly knowledge pursuit and dissemination of scholarly publications for dermatology and all of medical science.


Envisioning the scientific paper of the future

Authors : Natalie M. Sopinka, Laura E. Coristine, Maria C. DeRosa, Chelsea M. Rochman, Brian L. Owens, Steven J. Cooke

Consider for a moment the rate of advancement in the scientific understanding of DNA. It is formidable: from Friedrich Miescher’s nuclein extraction in the 1860s to Rosalind Franklin’s double helix X-ray in the 1950s to revolutionary next-generation sequencing in the late 2000s.

Now consider the scientific paper, the medium used to describe and publish these advances. How is the scientific paper advancing to meet the needs of those who generate and use scientific information?

We review four essential qualities for the scientific paper of the future: it (i) remains a robust, peer-reviewed source of trustworthy information, (ii) is communicated to diverse users in diverse ways, (iii) is open access, and (iv) has measurable impact beyond the Impact Factor.

Since its inception, scientific literature has proliferated. We discuss the continuation and expansion of practices already in place including: freely accessible data and analytical code, living research and reviews, changes to peer review to improve representation of under-represented groups, plain language summaries, preprint servers, evidence-informed decision-making, and altmetrics.

URL : Envisioning the scientific paper of the future


The stability of Twitter metrics: A study on unavailable Twitter mentions of scientific publications

Authors : Zhichao Fang, Jonathan Dudek, Rodrigo Costas

This paper investigates the stability of Twitter counts of scientific publications over time. For this, we conducted an analysis of the availability statuses of over 2.6 million Twitter mentions received by the 1,154 most tweeted scientific publications recorded by Altmetric.com up to October 2017.

Results show that of the Twitter mentions for these highly tweeted publications, about 14.3% have become unavailable by April 2019. Deletion of tweets by users is the main reason for unavailability, followed by suspension and protection of Twitter user accounts.

This study proposes two measures for describing the Twitter dissemination structures of publications: Degree of Originality (i.e., the proportion of original tweets received by a paper) and Degree of Concentration (i.e., the degree to which retweets concentrate on a single original tweet).
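Taking the parenthetical definitions at face value, the two measures can be sketched in a few lines of Python. The record layout and function names below are illustrative assumptions, not the authors’ implementation:

```python
from collections import Counter

def degree_of_originality(tweets):
    """Proportion of a paper's Twitter mentions that are original tweets
    rather than retweets."""
    if not tweets:
        return 0.0
    originals = sum(1 for t in tweets if t["retweet_of"] is None)
    return originals / len(tweets)

def degree_of_concentration(tweets):
    """Share of the paper's retweets that point at its single
    most-retweeted original tweet."""
    retweeted_ids = [t["retweet_of"] for t in tweets if t["retweet_of"] is not None]
    if not retweeted_ids:
        return 0.0
    top_count = Counter(retweeted_ids).most_common(1)[0][1]
    return top_count / len(retweeted_ids)

# Toy mention list: two original tweets ("a", "b"); all retweets target "a".
mentions = [
    {"id": "a",  "retweet_of": None},
    {"id": "b",  "retweet_of": None},
    {"id": "r1", "retweet_of": "a"},
    {"id": "r2", "retweet_of": "a"},
    {"id": "r3", "retweet_of": "a"},
]
print(degree_of_originality(mentions))    # 2 originals / 5 mentions = 0.4
print(degree_of_concentration(mentions))  # 3 of 3 retweets on one tweet = 1.0
```

On this toy paper, a low Degree of Originality combined with a high Degree of Concentration means that deleting the single original tweet "a" would wipe out most of the paper’s Twitter count at once, which is exactly the instability risk the study describes.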

Twitter metrics of publications with relatively low Degree of Originality and relatively high Degree of Concentration are observed to be at greater risk of becoming unstable due to the potential disappearance of their Twitter mentions.

In light of these results, we emphasize the importance of paying attention to the potential risk of unstable Twitter counts, and the significance of identifying the different Twitter dissemination structures when studying the Twitter metrics of scientific publications.


Altmetrics data providers: A meta-analysis review of the coverage of metrics and publication

Author : José-Luis Ortega

The aim of this paper is to review the current and most relevant literature on the use of altmetric providers since 2012. This review is supported by a meta-analysis of the coverage and metric counts obtained by more than 100 publications that have used these bibliographic platforms for altmetric studies.

The article is the most comprehensive analysis of altmetric data providers (Lagotto, Altmetric.com, ImpactStory, Mendeley, PlumX, Crossref Event Data) and explores the coverage of publications, social media and events from a longitudinal view. Disciplinary differences were also analysed.

The results show that most of the studies are based on Altmetric.com data. This provider is the service that captures most mentions from social media sites, blogs and news outlets. PlumX has better coverage, counting more Mendeley readers, but captures fewer events.

Crossref Event Data has distinctive coverage of mentions from Wikipedia, while Lagotto and ImpactStory are falling into disuse because of their limited reach.

URL : Altmetrics data providers: A meta-analysis review of the coverage of metrics and publication


Altmetrics and societal impact measurements: Match or mismatch? A literature review

Authors : Iman Tahamtan, Lutz Bornmann

Can alternative metrics (altmetrics) data be used to measure societal impact? We wrote this literature overview of empirical studies in order to find an answer to this question. The overview includes two parts.

The first part, “societal impact measurements”, explains possible methods and problems in measuring the societal impact of research, case studies for societal impact measurement, societal impact considerations at funding organizations, and the societal problems that should be solved by science.

The second part of the review, “altmetrics”, addresses a major question in research evaluation, which is whether altmetrics are proper indicators for measuring the societal impact of research. In the second part we explain the data sources used for altmetrics studies and the importance of field-normalized indicators for impact measurements.

This review suggests that impact measurement should be oriented towards pressing societal problems. Case studies in which the societal impact of particular pieces of research is explained seem to provide a legitimate method for measuring societal impact.

In the use of altmetrics, field-specific differences should be considered by applying field normalization (in cross-field comparisons). Altmetrics data such as social media counts might mainly reflect the public interest and discussion of scholarly works rather than their societal impact.
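Field normalization in its simplest form divides a paper’s raw count by the average count of papers in its field, so that a score of 1.0 means “at field average” regardless of how tweet-heavy the field is. The sketch below is one common mean-normalization approach, not necessarily the specific indicator the authors discuss; the data layout is an assumption:

```python
from statistics import mean

def field_normalized_scores(papers):
    """Divide each paper's altmetric count by the mean count of its field,
    making scores comparable across fields (1.0 = field average)."""
    counts_by_field = {}
    for p in papers:
        counts_by_field.setdefault(p["field"], []).append(p["count"])
    field_means = {f: mean(c) for f, c in counts_by_field.items()}
    return {p["id"]: p["count"] / field_means[p["field"]] for p in papers}

# Hypothetical counts: medicine papers attract far more mentions than math.
papers = [
    {"id": "p1", "field": "medicine", "count": 40},
    {"id": "p2", "field": "medicine", "count": 10},
    {"id": "p3", "field": "math",     "count": 4},
    {"id": "p4", "field": "math",     "count": 1},
]
scores = field_normalized_scores(papers)
# p1: 40/25 = 1.6 and p3: 4/2.5 = 1.6 — the same relative visibility
# despite very different raw counts, which is the point of normalization.
```

Without this step, a cross-field comparison would simply rank medicine above math on raw counts, mistaking field-level social media habits for impact.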

Altmetrics (Twitter data) might be especially fruitfully employed for research evaluation purposes, if they are used in the context of network approaches. Conclusions based on altmetrics data in research evaluation should be drawn with caution.

URL : Altmetrics and societal impact measurements: Match or mismatch? A literature review


How much research shared on Facebook is hidden from public view? A comparison of public and private online activity around PLOS ONE papers

Authors : Asura Enkhbayar, Stefanie Haustein, Germana Barata, Juan Pablo Alperin

Despite its undisputed position as the biggest social media platform, Facebook has never entered the main stage of altmetrics research. In this study, we argue that this lack of attention by altmetrics researchers is not due to a lack of relevant activity on the platform, but to the challenges of collecting Facebook data, which has so far been limited to activity that takes place in a select group of public pages and groups.

We present a new method of collecting shares, reactions, and comments across the platform, including private timelines, and use it to gather data for all articles published from 2015 to 2017 in the journal PLOS ONE.

We compare the gathered data with altmetrics collected and aggregated by Altmetric. The results show that 58.7% of papers shared on the platform happen outside of public view and that, when collecting all shares, the volume of activity approximates patterns of engagement previously only observed for Twitter.

Both results suggest that the role and impact of Facebook as a medium for science and scholarly communication have been underestimated. Furthermore, they emphasise the importance of openness and transparency around the collection and aggregation of altmetrics.