Does Online Availability Increase Citations? Theory and Evidence from a Panel of Economics and Business Journals :
“Does online availability boost citations? The answer has implications for issues ranging from the value of a citation to the sustainability of open-access journals. Using panel data on citations to economics and business journals, we show that the enormous effects found in previous studies were an artifact of their failure to control for article quality, disappearing once we add fixed effects as controls. The absence of an aggregate effect masks heterogeneity across platforms: JSTOR stands apart from others, boosting citations by around 10%. We examine other sources of heterogeneity, including whether JSTOR increases cites from authors in developing more than developed countries and increases cites to “long-tail” more than “superstar” articles. Our theoretical analysis informs the econometric specification and allows us to translate our results for citation increases into welfare terms.”
URL : http://ssrn.com/abstract=1746243
Comparing journals from different fields of Science and Social Science through a JCR Subject Categories Normalized Impact Factor :
“The journal Impact Factor (IF) is not comparable among fields of Science and Social Science because of systematic differences in publication and citation behaviour across disciplines. In this work, a decomposition of the field aggregate impact factor into five normally distributed variables is presented. Considering these factors, a Principal Component Analysis is employed to find the sources of the variance in the JCR subject categories of Science and Social Science. Although publication and citation behaviour differs largely across disciplines, principal components explain more than 78% of the total variance, and the average number of references per paper is not the primary factor explaining the variance in impact factors across categories. The Categories Normalized Impact Factor (CNIF), based on the JCR subject category list, is proposed and compared with the IF. This normalization is achieved by considering all the indexing categories of each journal. An empirical application, with one hundred journals in two or more subject categories of economics and business, shows that the gap between rankings is reduced by around 32% in the journals analyzed. This gap is obtained as the maximum distance among the ranking percentiles from all categories where each journal is included.”
URL : http://arxiv.org/abs/1304.5107
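As a rough illustration of the gap metric described in the abstract above, a minimal sketch (the paper's exact percentile convention is not given here, so the ranking direction and journal names are assumptions) of the maximum distance among a journal's ranking percentiles across its categories:

```python
# Hypothetical sketch of the "ranking percentile gap" across JCR subject categories.
def percentile_rank(journal, ranking):
    """Percentile of a journal within one category ranking (1.0 = top journal).
    `ranking` is a list ordered best-first; position 0 gets percentile 1.0."""
    pos = ranking.index(journal)
    return 1.0 - pos / len(ranking)

def ranking_gap(journal, rankings):
    """Maximum distance among the journal's percentiles over all its categories."""
    percentiles = [percentile_rank(journal, r) for r in rankings]
    return max(percentiles) - min(percentiles)

# Toy data: the same journal ranked in two JCR categories.
econ = ["J1", "J2", "J3", "J4"]       # Economics ranking, best first
business = ["J2", "J4", "J1", "J3"]   # Business ranking, best first
print(ranking_gap("J1", [econ, business]))
```

A category-normalized indicator like the CNIF aims to shrink this gap, so the same journal lands at similar percentiles regardless of which of its categories it is ranked in.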
A longitudinal comparison of citation rates and growth among open access journals :
“The study documents the growth in the number of journals and articles along with the increase in normalized citation rates of open access (OA) journals listed in the Scopus bibliographic database between 1999 and 2010. Longitudinal statistics on growth in journals/articles and citation rates are broken down by funding model, discipline, and whether the journal was launched or had converted to OA. The data were retrieved from the web sites of SCImago Journal and Country Rank (journal/article counts), Journal Metrics (SNIP2 values), Scopus (journal discipline) and the Directory of Open Access Journals (DOAJ) (OA and funding status). OA journals/articles have grown much faster than subscription journals but still make up less than 12% of the journals in Scopus. Two-year citation averages for journals funded by article processing charges (APCs) have reached the same level as subscription journals. Citation averages of OA journals funded by other means continue to lag well behind OA journals funded by APCs and subscription journals. We hypothesize this is less an issue of quality than due to the fact that such journals are commonly published in languages other than English and tend to be located outside the four major publishing countries.”
URL : http://www.openaccesspublishing.org/apc9/acceptedversion.pdf
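The "two-year citation average" compared in the study above is the familiar Impact-Factor-style ratio. A minimal sketch with made-up counts (the study's actual data come from Scopus/Journal Metrics, not from this toy input):

```python
# Sketch: a two-year citation average for a journal in a given census year.
def two_year_citation_avg(cites_by_pubyear, articles_by_year, year):
    """cites_by_pubyear[y]: citations received in `year` by articles published in y.
    articles_by_year[y]: number of articles the journal published in y."""
    cites = cites_by_pubyear[year - 1] + cites_by_pubyear[year - 2]
    articles = articles_by_year[year - 1] + articles_by_year[year - 2]
    return cites / articles

# Toy counts for a single journal, census year 2010.
cites = {2008: 120, 2009: 80}
articles = {2008: 60, 2009: 40}
print(two_year_citation_avg(cites, articles, 2010))  # (120 + 80) / (60 + 40)
```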
Scientific collaboration and article citations: What are the practices in medical journals? :
“Objectives: The best way to characterize scientific collaboration is to study the co-authorship of articles. Two indicators are of interest: the number of authors and the international character of the collaboration. The objective is to study the correlation between these two indicators and the number of citations.
Methods: We selected two pharmacy and medicine journals in order to compare practices. We used a sample of about 800 articles published between 2002 and 2005, for which we collected citations up to 2010. We converted our numerical variables, number of authors and number of citations, into categorical variables.
Results: The “authors” and “citations” variables are not independent.
Conclusions: The least-cited articles are often published by a single author or by a very small team, whereas the international character of articles is a factor that generally increases the number of citations. This micro-analysis also provided a better understanding of certain editorial practices.”
URL : http://hal.archives-ouvertes.fr/hal-00775307
Beyond citations: Scholars’ visibility on the social Web :
“Traditionally, scholarly impact and visibility have been measured by counting publications and citations in the scholarly literature. However, increasingly scholars are also visible on the Web, establishing presences in a growing variety of social ecosystems.
But how wide and established is this presence, and how do measures of social Web impact relate to their more traditional counterparts? To answer this, we sampled 57 presenters from the 2010 Leiden STI Conference, gathering publication and citation counts as well as data from the presenters’ Web “footprints.”
We found Web presence widespread and diverse: 84% of scholars had homepages, 70% were on LinkedIn, 23% had public Google Scholar profiles, and 16% were on Twitter. For sampled scholars’ publications, social reference manager bookmarks were compared to Scopus and Web of Science citations; we found that Mendeley covers more than 80% of sampled articles, and that Mendeley bookmarks are significantly correlated (r=.45) to Scopus citation counts.”
URL : http://2012.sticonference.org/Proceedings/vol1/Bar-Ilan_Beyond_98.pdf
Citation of Open Access Resources by African Researchers in Corrosion Chemistry :
“The authors performed a citation analysis of 15 source papers that were written in the field of corrosion chemistry. Each of the source papers was written on the specific topic of the corrosion of mild steel. The researchers were from a variety of different countries, including China, Egypt, Germany, India, Lesotho, Nigeria, Saudi Arabia, South Africa, Spain, Turkey, and the United States. The authors found that articles that had one or more researchers from an African country cited works that had versions in the Open Access (OA) domain twice as often (12.2%) when compared to articles that did not have an author from Africa (5.5%). The authors also evaluated the types of Open Access resources that the researchers cited, and the error rates found within their citations.”
URL : http://hdl.handle.net/10760/17568
A measure of total research impact independent of time and discipline :
“Authorship and citation practices evolve with time and differ by academic discipline. As such, indicators of research productivity based on citation records are naturally subject to historical and disciplinary effects. We observe these effects on a corpus of astronomer career data constructed from a database of refereed publications. We employ a simple mechanism to measure research output using author and reference counts available in bibliographic databases to develop a citation-based indicator of research productivity. The total research impact (tori) quantifies, for an individual, the total amount of scholarly work that others have devoted to his/her work, measured in the volume of research papers. A derived measure, the research impact quotient (riq), is an age-independent measure of an individual’s research ability. We demonstrate that these measures are substantially less vulnerable to temporal debasement and cross-disciplinary bias than the most popular current measures. The proposed measures of research impact, tori and riq, have been implemented in the Smithsonian/NASA Astrophysics Data System.”
URL : http://arxiv.org/abs/1209.2124
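A hedged sketch of the idea behind tori as described above: each citation is weighted by the size of the citing paper's reference list, and credit is split among the cited paper's co-authors. This is a reading of the abstract, not the ADS implementation, whose exact weighting may differ; the division of riq by career length below is likewise an assumption:

```python
# Hypothetical sketch of tori/riq-style indicators (not the official ADS formulas).
from math import sqrt

def tori(papers):
    """papers: one dict per paper by the individual, with
    'n_authors'  : number of co-authors on that paper, and
    'citing_refs': reference-list lengths of each paper citing it.
    Each citation contributes 1/(refs of citing paper * authors of cited paper)."""
    total = 0.0
    for p in papers:
        for n_refs in p["citing_refs"]:
            total += 1.0 / (n_refs * p["n_authors"])
    return total

def riq(papers, career_years):
    """Assumed age normalization: sqrt(tori) divided by career length in years."""
    return sqrt(tori(papers)) / career_years

record = [
    {"n_authors": 2, "citing_refs": [20, 40]},  # two-author paper, cited twice
    {"n_authors": 1, "citing_refs": [25]},      # solo paper, cited once
]
print(round(tori(record), 4))
```

The intuition is that a citation from a paper with a short reference list "devotes" more of that paper to your work than one citation among hundreds, and that co-authored papers split the resulting credit.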