« In this study, we compare the impact of open access (OA) and non-open access (non-OA) articles. As our sample, we selected 1,761 Nature Communications articles published from 1 Jan. 2012 to 31 Aug. 2013, comprising 587 OA articles and 1,174 non-OA articles. Citation data and daily updated article-level metrics were harvested directly from the nature.com platform and analyzed from both static and temporal-dynamic perspectives. The OA citation advantage is confirmed, and it also holds when the comparison is extended from citations to article views and social media attention. More importantly, we find that OA papers not only accumulate far more total downloads but also keep attracting sustained, steady downloads over a long period. For article downloads, non-OA papers receive only a short burst of attention, whereas the advantage of OA papers persists for much longer. »
« The digital revolution has made it easier for political scientists to share and access high-quality research online. However, many articles are stored in proprietary databases that some institutions cannot afford. High-quality, peer-reviewed, top-tier journal articles that have been made open access (OA) (i.e., freely available online) theoretically should be accessed and cited more easily than articles of similar quality that are available only to paying customers. Research into the efficacy of OA publishing thus far has focused mainly on the natural sciences, and the results have been mixed. Because OA has not been as widely adopted in the social sciences, disciplines such as political science have received little attention in the OA research. In this article, we seek to determine the efficacy of OA in political science. Our primary hypothesis is that OA articles will be cited at higher rates than articles that are toll access (TA), which means available only to paying customers. We test this hypothesis by analyzing the mean citation rates of OA and TA articles from eight top-ranked political science journals. We find that OA publication results in a clear citation advantage in political science publishing. »
« Evaluating and comparing the academic performance of a journal, a researcher or a single paper has long been a critical and necessary, but also controversial, issue. Most existing metrics do not permit comparison across different fields of science, or even between different types of papers in the same field. This paper proposes a new metric, called return on citation (ROC), which is simply a citation ratio but applies to evaluating the paper, the journal and the researcher in a consistent way, allowing comparison across different fields of science and between different types of papers, and discouraging unnecessary, coercive and self-citation. »
« In this paper, we examine the evolution of the impact of older scholarly articles. We attempt to answer four questions. First, how often are older articles cited, and how has this changed over time? Second, how does the impact of older articles vary across different research fields? Third, is the change in the impact of older articles accelerating or slowing down? Fourth, are these trends different for much older articles?
To answer these questions, we studied citations from articles published in 1990-2013. We computed the fraction of citations to older articles from articles published each year as the measure of impact. We considered articles that were published at least 10 years before the citing article as older articles. We computed these numbers for 261 subject categories and 9 broad areas of research. Finally, we repeated the computation for two other definitions of older articles, 15 years and older and 20 years and older.
There are three conclusions from our study. First, the impact of older articles has grown substantially over 1990-2013. In 2013, 36% of citations were to articles that are at least 10 years old; this fraction has grown 28% since 1990. The fraction of older citations increased over 1990-2013 for 7 out of 9 broad areas and 231 out of 261 subject categories. Second, the increase over the second half (2002-2013) was double the increase in the first half (1990-2001).
Third, the trend of a growing impact of older articles also holds for even older articles. In 2013, 21% of citations were to articles >= 15 years old with an increase of 30% since 1990 and 13% of citations were to articles >= 20 years old with an increase of 36%.
Now that finding and reading relevant older articles is about as easy as finding and reading recently published articles, significant advances aren’t getting lost on the shelves and are influencing work worldwide for years after. »
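The fraction-of-older-citations measure used above is straightforward to compute. A minimal sketch, using hypothetical toy data (the actual study processed citations across 261 subject categories):

```python
# Hypothetical toy data: each citation is a (citing_year, cited_year) pair.
citations = [
    (2013, 1995), (2013, 2010), (2013, 2001), (2013, 2012),
    (1990, 1975), (1990, 1988), (1990, 1989), (1990, 1985),
]

def older_citation_fraction(citations, year, age=10):
    """Fraction of a given year's citations that point to articles
    published at least `age` years before the citing article."""
    this_year = [(c, p) for (c, p) in citations if c == year]
    older = [(c, p) for (c, p) in this_year if c - p >= age]
    return len(older) / len(this_year)

print(older_citation_fraction(citations, 2013))  # 0.5: two of the four 2013 citations are >= 10 years old
```

Repeating the computation with `age=15` or `age=20` gives the alternative definitions of "older articles" used in the study.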
« Citation indicators are increasingly used in book-based disciplines to support peer review in the evaluation of authors and to gauge the prestige of publishers. However, because global citation databases seem to offer weak coverage of books outside the West, it is not clear whether the influence of non-Western books can be assessed with citations. To investigate this, citations were extracted from Google Books and Google Scholar to 1,357 arts, humanities and social sciences (AHSS) books published by 5 university presses during 1961–2012 in 1 non-Western nation, Malaysia. A significant minority of the books (23% in Google Books and 37% in Google Scholar, 45% in total) had been cited, with a higher proportion cited if they were older or in English. The combination of Google Books and Google Scholar is therefore recommended, with some provisos, for non-Western countries seeking to differentiate between books with some impact and books with no impact, to identify the highly-cited works or to develop an indicator of academic publisher prestige. »
« In this paper, we examine the evolution of the impact of non-elite journals. We attempt to answer two questions. First, what fraction of the top-cited articles are published in non-elite journals and how has this changed over time. Second, what fraction of the total citations are to non-elite journals and how has this changed over time.
We studied citations to articles published in 1995-2013. We computed the 10 most-cited journals and the 1000 most-cited articles each year for all 261 subject categories in Scholar Metrics. We marked the 10 most-cited journals in a category as the elite journals for the category and the rest as non-elite.
There are two conclusions from our study. First, the fraction of top-cited articles published in non-elite journals increased steadily over 1995-2013. While the elite journals still publish a substantial fraction of high-impact articles, many more authors of well-regarded papers in diverse research fields are choosing other venues.
The number of top-1000 papers published in non-elite journals for the representative subject category went from 149 in 1995 to 245 in 2013, a growth of 64%. Looking at broad research areas, 4 out of 9 areas saw at least one-third of the top-cited articles published in non-elite journals in 2013. For 6 out of 9 areas, the fraction of top-cited papers published in non-elite journals for the representative subject category grew by 45% or more.
Second, now that finding and reading relevant articles in non-elite journals is about as easy as finding and reading articles in elite journals, researchers are increasingly building on and citing work published everywhere. Considering citations to all articles, the percentage of citations to articles in non-elite journals went from 27% in 1995 to 47% in 2013. Six out of nine broad areas had at least 50% of citations going to articles published in non-elite journals in 2013. »
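The elite/non-elite split described above can be sketched in a few lines. This is a toy illustration with hypothetical data and only the top 2 journals marked elite (the study marks the top 10 per subject category):

```python
from collections import Counter

# Hypothetical toy data: the journal cited by each citation.
cited_journals = (
    ["Elite A"] * 6 + ["Elite B"] * 5 + ["Venue C"] * 4 +
    ["Venue D"] * 3 + ["Venue E"] * 2
)

def non_elite_citation_share(cited_journals, n_elite=2):
    """Mark the n_elite most-cited journals as elite and return the
    share of citations going to all other venues."""
    counts = Counter(cited_journals)
    elite = {j for j, _ in counts.most_common(n_elite)}
    non_elite = sum(c for j, c in counts.items() if j not in elite)
    return non_elite / len(cited_journals)

print(non_elite_citation_share(cited_journals))  # 0.45
```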
« The aim of this research was to identify the motivations for citing Wikipedia in scientific papers. The number of citations to Wikipedia, the location of the citations, the type of citing papers, and the subjects of citing and cited articles were also determined and compared across subject fields. From all English articles indexed in Scopus in 2007 and 2012 that cited Wikipedia, 602 articles were selected using stratified random sampling. Content analysis and bibliometric methods were used to carry out the research. Results showed that there are 20 motivations for citing Wikipedia, the most frequent being to provide general information, definitions, and facts and figures. Citations to Wikipedia most often appeared in the introduction and other introductory sections of papers. Computer science, the internet and chemistry were the most frequently cited subjects. The use of Wikipedia in articles is increasing both in quantity and in diversity. However, there are disciplinary differences in both the amount and the nature of this use. »
The Citation Merit of Scientific Publications :
« We propose a new method to assess the merit of any set of scientific papers in a given field based on the citations they receive. Given a field and a citation impact indicator, such as the mean citation or the h-index, the merit of a given set of n articles is identified with the probability that a randomly drawn set of n articles from a given pool of articles in that field has a lower citation impact according to the indicator in question. The method allows for comparisons between sets of articles of different sizes and fields. Using a dataset acquired from Thomson Scientific that contains the articles published in the periodical literature in the period 1998–2007, we show that the novel approach yields rankings of research units different from those obtained by a direct application of the mean citation or the h-index. »
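The probability described above can be estimated by Monte Carlo sampling. A minimal sketch with a hypothetical field pool, using the mean citation as the indicator (the paper also allows the h-index):

```python
import random
import statistics

def merit(unit_citations, field_pool, trials=20000, seed=0):
    """Estimate the merit of a set of n articles: the probability that
    n articles drawn at random (without replacement) from the field's
    pool have a LOWER mean citation count than the unit's set."""
    rng = random.Random(seed)
    n = len(unit_citations)
    target = statistics.mean(unit_citations)
    lower = sum(
        statistics.mean(rng.sample(field_pool, n)) < target
        for _ in range(trials)
    )
    return lower / trials

# Hypothetical field pool of per-article citation counts.
pool = [0, 0, 1, 1, 2, 2, 3, 4, 5, 8, 10, 15, 20, 30, 50]
print(merit([20, 30, 50], pool))  # close to 1: a strong set for this pool
```

Because the merit is a probability, it is directly comparable between sets of different sizes and between fields with different citation cultures, which is the point of the method.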
URL : http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0049156
Towards a Book Publishers Citation Reports. First approach using the Book Citation Index :
« The absence of books and book chapters from the Web of Science Citation Indexes (SCI, SSCI and A&HCI) has always been considered an important flaw, but Thomson Reuters' 'Book Citation Index' database finally became available in October 2010, indexing 29,618 books and 379,082 book chapters. The Book Citation Index opens a new window of opportunity for analyzing these fields from a bibliometric point of view. The main objective of this article is to analyze different impact indicators for the scientific publishers included in the Book Citation Index for the Social Sciences and Humanities during 2006-2011. In this way we construct what we have called the 'Book Publishers Citation Reports'. To this end, we present a total of 19 rankings for the different disciplines in Humanities & Arts and Social Sciences & Law, with six indicators for scientific publishers. »
URL : http://arxiv.org/abs/1207.7067
A citation analysis of top research papers of computer science :
« The study evaluates the top papers of Computer Science as reflected in Science Direct. Moreover, it aims to determine the authorship pattern, ranking of authors, ranking of country productivity, ranking of journals, and highly cited papers in Computer Science. The citation data were collected from the quarterly list of the 25 hottest research articles in the subject field of Computer Science in the Science Direct database. In the present study, 20 issues of the alert service, from January/March 2005 to October/December 2010, containing a total of 495 articles in Computer Science, were analyzed. The study reveals that, of the 495 top papers, three-authored articles are slightly ahead of two-authored articles, followed by four-authored articles, and that the country productivity of the USA is at the top, followed by the UK, Taiwan, China, and Canada. Moreover, it finds that the European Journal of Operational Research occupies the top position, followed by Computers in Human Behavior and Pattern Recognition. »
URL : http://hdl.handle.net/10760/16859