With Great Power Comes Great Responsibility: the Importance of Rejection, Power, and Editors in the Practice of Scientific Publishing :
« Peer review is an important element of scientific communication but deserves quantitative examination. We used data from the manuscript handling service Manuscript Central for ten mid-tier ecology and evolution journals to test whether the number of external reviews completed improved citation rates for all accepted manuscripts. Contrary to a previous study examining this issue using resubmission data as a proxy for reviews, we show that citation rates of manuscripts do not correlate with the number of individuals that provided reviews. Importantly, externally-reviewed papers do not outperform editor-only reviewed published papers in terms of visibility within a 5-year citation window. These findings suggest that in many instances editors can be all that is needed to review papers (or at least to conduct the critical first review assessing general suitability) if the primary purpose of peer review is to filter, and that journals can consider reducing the number of referees assigned to ecology and evolution papers. »
Data reuse and the open data citation advantage :
« Background: Attribution to the original contributor upon reuse of published data is important both as a reward for data creators and to document the provenance of research findings. Previous studies have found that papers with publicly available datasets receive a higher number of citations than similar studies without available data. However, few previous analyses have had the statistical power to control for the many variables known to predict citation rate, which has led to uncertain estimates of the “citation benefit”. Furthermore, little is known about patterns in data reuse over time and across datasets.
Method and Results: Here, we look at citation rates while controlling for many known citation predictors and investigate the variability of data reuse. In a multivariate regression on 10,555 studies that created gene expression microarray data, we found that studies that made data available in a public repository received 9% (95% confidence interval: 5% to 13%) more citations than similar studies for which the data was not made available. Date of publication, journal impact factor, open access status, number of authors, first and last author publication history, corresponding author country, institution citation history, and study topic were included as covariates. The citation benefit varied with date of dataset deposition: a citation benefit was most clear for papers published in 2004 and 2005, at about 30%. Authors published most papers using their own datasets within two years of their first publication on the dataset, whereas data reuse papers published by third-party investigators continued to accumulate for at least six years. To study patterns of data reuse directly, we compiled 9,724 instances of third party data reuse via mention of GEO or ArrayExpress accession numbers in the full text of papers. The level of third-party data use was high: for 100 datasets deposited in year 0, we estimated that 40 papers in PubMed reused a dataset by year 2, 100 by year 4, and more than 150 data reuse papers had been published by year 5. Data reuse was distributed across a broad base of datasets: a very conservative estimate found that 20% of the datasets deposited between 2003 and 2007 had been reused at least once by third parties.
Conclusion: After accounting for other factors affecting citation rate, we find a robust citation benefit from open data, although a smaller one than previously reported. We conclude there is a direct effect of third-party data reuse that persists for years beyond the time when researchers have published most of the papers reusing their own data. Other factors that may also contribute to the citation benefit are considered. We further conclude that, at least for gene expression microarray data, a substantial fraction of archived datasets are reused, and that the intensity of dataset reuse has been steadily increasing since 2003. »
URL : https://peerj.com/articles/175/
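The abstract above estimates a 9% citation benefit from a multivariate regression on log-scale citation counts with many covariates. A minimal sketch of that idea, on entirely synthetic data (the covariates and coefficients below are invented stand-ins, not the paper's actual model or estimates):

```python
# Illustrative sketch (synthetic data, not the paper's model): estimating
# an "open data citation benefit" on the log-citation scale while
# controlling for other predictors, via ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Synthetic covariates loosely mirroring those named in the abstract.
impact_factor = rng.normal(3.0, 1.0, n)   # journal impact factor
n_authors = rng.poisson(4, n) + 1         # number of authors
open_data = rng.integers(0, 2, n)         # 1 = data in a public repository

# True generating model: open data adds ~9% citations, i.e. +log(1.09).
log_cites = (0.5
             + 0.30 * impact_factor
             + 0.05 * np.log(n_authors)
             + np.log(1.09) * open_data
             + rng.normal(0, 0.5, n))

# Design matrix with an intercept; fit by least squares.
X = np.column_stack([np.ones(n), impact_factor, np.log(n_authors), open_data])
beta, *_ = np.linalg.lstsq(X, log_cites, rcond=None)

# Back-transform the open-data coefficient into a percentage benefit.
benefit = np.exp(beta[3]) - 1
print(f"estimated citation benefit: {benefit:.1%}")
```

The point of the exercise: with the other predictors in the design matrix, the coefficient on the open-data indicator isolates the adjusted citation difference, which is the quantity the paper reports with its confidence interval.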
The Rich Get Richer and the Poor Get Poorer: The Effect of Open Access on Cites to Science Journals Across the Quality Spectrum :
« An open-access journal allows free online access to its articles, obtaining revenue from fees charged to submitting authors. Using panel data on science journals, we are able to circumvent some problems plaguing previous studies of the impact of open access on citations. We find that moving from paid to open access increases cites by 8% on average in our sample, but the effect varies across the quality of content. Open access increases cites to the best content (top-ranked journals or articles in upper quintiles of citations within a volume) but reduces cites to lower-quality content. We construct a model to explain these findings in which being placed on a broad open-access platform can increase the competition among articles for readers’ attention. We can find structural parameters allowing the model to fit the quintile results quite closely. »
URL : http://ssrn.com/abstract=2269040
Delayed Open Access – an overlooked high-impact category of openly available scientific literature:
« Delayed open access (OA) refers to scholarly articles in subscription journals made available openly on the web directly through the publisher at the expiry of a set embargo period. Though a substantial number of journals have practiced delayed OA since they started publishing e-versions, empirical studies concerning open access have often overlooked this body of literature. This study provides comprehensive quantitative measurements by identifying delayed OA journals and collecting data concerning their publication volumes, embargo lengths, and citation rates. Altogether 492 journals were identified, publishing a combined total of 111,312 articles in 2011. 77.8% of these articles were made open access within 12 months of publication, with 85.4% becoming available within 24 months. A journal impact factor analysis revealed that delayed OA journals have on average twice the citation rates of closed subscription journals, and three times those of immediate OA journals. Overall the results demonstrate that delayed OA journals constitute an important segment of the openly available scholarly journal literature, both by their sheer article volume and by including a substantial proportion of high-impact journals. »
URL : http://hanken.halvi.helsinki.fi/portal/en/publications/delayed-open-access%28a2eb7a79-1078-4657-9d57-4f9f5a1ff228%29.html
Does Online Availability Increase Citations? Theory and Evidence from a Panel of Economics and Business Journals :
« Does online availability boost citations? The answer has implications for issues ranging from the value of a citation to the sustainability of open-access journals. Using panel data on citations to economics and business journals, we show that the enormous effects found in previous studies were an artifact of their failure to control for article quality, disappearing once we add fixed effects as controls. The absence of an aggregate effect masks heterogeneity across platforms: JSTOR stands apart from others, boosting citations around 10%. We examine other sources of heterogeneity including whether JSTOR increases cites from authors in developing more than developed countries and increases cites to “long-tail” more than “superstar” articles. Our theoretical analysis informs the econometric specification and allows us to translate our results for citation increases into welfare terms. »
URL : http://ssrn.com/abstract=1746243
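The abstract above attributes earlier studies' inflated estimates to a failure to control for article quality, fixed effects being the remedy. A toy sketch of the within (fixed-effects) estimator on a synthetic journal-year panel (the panel, the +10% true effect, and all numbers below are invented for illustration):

```python
# Toy fixed-effects sketch (synthetic panel, not the paper's data):
# demeaning log citations within each journal removes time-invariant
# "quality", isolating the within-journal effect of going online.
import numpy as np

rng = np.random.default_rng(2)
n_journals, n_years = 300, 10

quality = rng.normal(0, 1, n_journals)          # unobserved, time-invariant
# Each journal goes online in a random year between year 3 and year 7.
online = np.arange(n_years) >= rng.integers(3, 8, n_journals)[:, None]
true_effect = 0.10                              # +10% cites when online

log_cites = (quality[:, None]
             + true_effect * online
             + rng.normal(0, 0.2, (n_journals, n_years)))

# Within transformation: subtract each journal's own mean, which wipes
# out the quality term entirely.
y = log_cites - log_cites.mean(axis=1, keepdims=True)
x = online - online.mean(axis=1, keepdims=True)

beta = (x * y).sum() / (x * x).sum()            # within estimator
print(f"estimated online effect: {beta:.3f}")
```

Because high-quality journals both go online earlier and attract more citations, a naive pooled regression on this panel would overstate the effect; the demeaning step is what removes that confound.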
Comparing journals from different fields of Science and Social Science through a JCR Subject Categories Normalized Impact Factor :
« The journal Impact Factor (IF) is not comparable among fields of Science and Social Science because of systematic differences in publication and citation behaviour across disciplines. In this work, a decomposition of the field aggregate impact factor into five normally distributed variables is presented. Considering these factors, a Principal Component Analysis is employed to find the sources of the variance in the JCR subject categories of Science and Social Science. Although publication and citation behaviour differs largely across disciplines, the principal components explain more than 78% of the total variance, and the average number of references per paper is not the primary factor explaining the variance in impact factors across categories. A Categories Normalized Impact Factor (CNIF) based on the JCR subject category list is proposed and compared with the IF. This normalization is achieved by considering all the indexing categories of each journal. An empirical application to one hundred journals indexed in two or more subject categories of economics and business shows that the gap between rankings is reduced by around 32% for the journals analyzed. This gap is measured as the maximum distance among the ranking percentiles across all categories in which each journal is included. »
URL : http://arxiv.org/abs/1304.5107
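The abstract above rests on a Principal Component Analysis of correlated field-level indicators, with the leading components explaining more than 78% of the variance. A minimal PCA sketch on synthetic data (the five indicators below are hypothetical stand-ins, not the paper's actual decomposition of the aggregate impact factor):

```python
# Minimal PCA sketch (synthetic data, not the JCR dataset): standardize
# five correlated field-level indicators, eigendecompose their
# correlation matrix, and report the variance share of the leading
# principal components.
import numpy as np

rng = np.random.default_rng(1)
n_fields = 200

# Five indicators driven by two latent factors plus a little noise,
# so most variance is recoverable from two components.
latent = rng.normal(size=(n_fields, 2))
mixing = rng.normal(size=(2, 5))
X = latent @ mixing + 0.1 * rng.normal(size=(n_fields, 5))

# Standardize each indicator, then form the correlation matrix.
Z = (X - X.mean(axis=0)) / X.std(axis=0)
corr = (Z.T @ Z) / n_fields

# Eigenvalues in descending order; their shares are the explained variance.
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
explained = eigvals / eigvals.sum()
print("variance explained by first two components:",
      round(explained[:2].sum(), 3))
```

In the paper's setting the analogous cumulative share over the leading components is what licenses summarizing cross-category variation with a small number of factors.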
A longitudinal comparison of citation rates and growth among open access journals :
« The study documents the growth in the number of journals and articles along with the increase in normalized citation rates of open access (OA) journals listed in the Scopus bibliographic database between 1999 and 2010. Longitudinal statistics on growth in journals/articles and citation rates are broken down by funding model, discipline, and whether the journal was launched as or had converted to OA. The data were retrieved from the web sites of SCImago Journal and Country Rank (journal/article counts), Journalmetrics (SNIP2 values), Scopus (journal discipline) and the Directory of Open Access Journals (DOAJ) (OA and funding status). OA journals/articles have grown much faster than subscription journals but still make up less than 12% of the journals in Scopus. Two-year citation averages for journals funded by article processing charges (APCs) have reached the same level as subscription journals. Citation averages of OA journals funded by other means continue to lag well behind both APC-funded OA journals and subscription journals. We hypothesize this is less an issue of quality than a consequence of such journals commonly publishing in languages other than English and tending to be located outside the four major publishing countries. »
URL : http://www.openaccesspublishing.org/apc9/acceptedversion.pdf
Scientific collaboration and article citations: what are the practices in medical journals? :
« Objectives: The best way to characterize scientific collaboration is to study the co-authorship of articles. Two indicators are of interest: the number of authors and the international character of the author list. The objective is to study the correlation between these two indicators and the number of citations.
Methods: We selected two pharmacy and medicine journals in order to compare practices. We used a sample of about 800 articles published between 2002 and 2005, for which we collected citations up to 2010. We transformed our numerical variables, number of authors and number of citations, into categorical variables.
Results: The variables "authors" and "citations" are not independent.
Conclusions: The least-cited articles are often published by a single author or by a very small team, whereas the international character of an article is a factor that generally increases its citation count. This micro-analysis also provided a better understanding of certain editorial practices. »
URL : http://hal.archives-ouvertes.fr/hal-00775307
Beyond Citations: Scholars’ Visibility on the Social Web :
« Traditionally, scholarly impact and visibility have been measured by counting publications and citations in the scholarly literature. However, scholars are increasingly visible on the Web as well, establishing presences in a growing variety of social ecosystems. But how wide and established is this presence, and how do measures of social Web impact relate to their more traditional counterparts? To answer this, we sampled 57 presenters from the 2010 Leiden STI Conference, gathering publication and citation counts as well as data from the presenters’ Web “footprints.” We found Web presence widespread and diverse: 84% of scholars had homepages, 70% were on LinkedIn, 23% had public Google Scholar profiles, and 16% were on Twitter. For sampled scholars’ publications, social reference manager bookmarks were compared to Scopus and Web of Science citations; we found that Mendeley covers more than 80% of sampled articles, and that Mendeley bookmarks are significantly correlated (r=.45) with Scopus citation counts. »
URL : http://2012.sticonference.org/Proceedings/vol1/Bar-Ilan_Beyond_98.pdf
Citation of Open Access Resources by African Researchers in Corrosion Chemistry :
« The authors performed a citation analysis of 15 source papers that were written in the field of corrosion chemistry. Each of the source papers was written on the specific topic of the corrosion of mild steel. The researchers were from a variety of different countries, including China, Egypt, Germany, India, Lesotho, Nigeria, Saudi Arabia, South Africa, Spain, Turkey, and the United States. The authors found that articles that had one or more researchers from an African country cited works that had versions in the Open Access (OA) domain twice as often (12.2%) when compared to articles that did not have an author from Africa (5.5%). The authors also evaluated the types of Open Access resources that the researchers cited, and the error rates found within their citations. »
URL : http://hdl.handle.net/10760/17568