The presence of High-impact factor Open Access Journals in Science, Technology, Engineering and Medicine (STEM) disciplines


This study aims to establish to what extent high-quality open access journals are available as publication outlets, by examining their distribution across scientific disciplines, including the distribution of those journals that charge no article processing fees.

The study is based on a systematic comparison between the journals included in the Directory of Open Access Journals (DOAJ) and the journals indexed in the Journal Citation Reports (JCR) Science edition 2013, released by Thomson Reuters.

The impact factors of Open Access (OA) journals were lower than those of other journals by a small but statistically significant amount. OA journals are present in the upper quartile (by impact factor) of 85 of the 176 categories examined (48.8%). Only 16 categories (9%) contained no OA journal with an impact factor.

URL : The presence of High-impact factor Open Access Journals in Science, Technology, Engineering and Medicine (STEM) disciplines


A New Ranking Scheme for the Institutional Scientific Performance


We propose a new performance indicator to evaluate the productivity of research institutions through their published scientific papers. The new quality measure comprises two principal components: the normalized impact factor of the journal in which the paper was published, and the number of citations the paper has received per year since publication. In both components, the scientific impact is weighted by the contribution of authors from the evaluated institution.

As a whole, our new metric, the institutional performance score, takes into account both journal-based impact and article-specific impact. We apply this new scheme to evaluate the research output of Turkish institutions specializing in astronomy and astrophysics over the period 1998-2012. We discuss the implications of the new metric and emphasize its benefits in comparison with other proposed institutional performance indicators.
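The two-component structure described in this abstract can be sketched in code. The following is a hypothetical illustration only: the paper's exact weighting and normalization are not reproduced here, and all field names (`journal_if`, `field_mean_if`, `inst_authors`, etc.) are assumptions, not the authors' notation. It assumes a simple additive combination of the two components, each weighted by the institution's share of authorship.

```python
def performance_score(papers, current_year):
    """Illustrative institutional performance score.

    papers: list of dicts with keys (all names assumed for this sketch):
      'journal_if'     - impact factor of the publishing journal
      'field_mean_if'  - mean impact factor of the journal's field,
                         used here to normalize the IF across fields
      'citations'      - total citations the paper has received
      'year'           - publication year
      'inst_authors', 'total_authors' - author counts for weighting
    """
    score = 0.0
    for p in papers:
        # Weight by the evaluated institution's authorship contribution.
        weight = p["inst_authors"] / p["total_authors"]
        # Component 1: field-normalized impact factor of the journal.
        nif = p["journal_if"] / p["field_mean_if"]
        # Component 2: citations per year since publication.
        years = max(current_year - p["year"], 1)
        citations_per_year = p["citations"] / years
        score += weight * (nif + citations_per_year)
    return score
```

A paper with IF 4.0 in a field averaging 2.0, with 10 citations over 5 years and half the authors from the institution, would contribute 0.5 * (2.0 + 2.0) = 2.0 to the score under these assumptions.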


Retraction policies of top scientific journals ranked by impact factor



This study gathered information about the retraction policies of the top 200 scientific journals, ranked by impact factor.


Editors of the top 200 science journals for the year 2012 were contacted by email.


One hundred forty-seven journals (74%) responded to the request for information. Of these, 95 (65%) had a retraction policy; of those, 94% had a policy that allows the editors to retract articles without the authors' consent.


The majority of journals in this sample had a retraction policy, and almost all of them would retract an article without the authors’ permission.


The Dawn of Open Access to Phylogenetic Data


“The scientific enterprise depends critically on the preservation of and open access to published data. This basic tenet applies acutely to phylogenies (estimates of evolutionary relationships among species). Increasingly, phylogenies are estimated from increasingly large, genome-scale datasets using increasingly complex statistical methods that require increasing levels of expertise and computational investment. Moreover, the resulting phylogenetic data provide an explicit historical perspective that critically informs research in a vast and growing number of scientific disciplines. One such use is the study of changes in rates of lineage diversification (speciation – extinction) through time. As part of a meta-analysis in this area, we sought to collect phylogenetic data (comprising nucleotide sequence alignment and tree files) from 217 studies published in 46 journals over a 13-year period. We document our attempts to procure those data (from online archives and by direct request to corresponding authors), and report results of analyses (using Bayesian logistic regression) to assess the impact of various factors on the success of our efforts. Overall, complete phylogenetic data for of these studies are effectively lost to science. Our study indicates that phylogenetic data are more likely to be deposited in online archives and/or shared upon request when: (1) the publishing journal has a strong data-sharing policy; (2) the publishing journal has a higher impact factor, and; (3) the data are requested from faculty rather than students. Importantly, our survey spans recent policy initiatives and infrastructural changes; our analyses indicate that the positive impact of these community initiatives has been both dramatic and immediate. Although the results of our study indicate that the situation is dire, our findings also reveal tremendous recent progress in the sharing and preservation of phylogenetic data.”

URL : The Dawn of Open Access to Phylogenetic Data

DOI: 10.1371/journal.pone.0110268

The impact factors of open access and subscription journals across fields


“We have compared the 2-year and 5-year impact factors (IFs), normalized impact factors (NIFs) and rank normalized impact factors (RNIFs) of open access (OA) and subscription journals across the 22 major fields delineated in Essential Science Indicators. Journal Citation Reports (JCR) 2012 has assigned a 2-year IF to 1,073 OA and 7,290 subscription journals and a 5-year IF to 811 OA and 6,705 subscription journals. Overall, 12.8% of journals listed in JCR are OA, but a higher percentage of journals are OA in 9 fields, including multidisciplinary (31%), agriculture (19.1%) and microbiology (19.1%). Overall, the 2-year IF is higher than the 5-year IF in about 31.5% of both OA and subscription journals, but among physics journals, two-thirds of OA journals and 58% of subscription journals have a higher 2-year IF. For multidisciplinary journals the mean RNIF is higher for OA journals than for subscription journals. A higher proportion of subscription journals had a mean RNIF above 0.5: 361 of 1,073 OA journals (33.6%) and 3,857 of 7,280 subscription journals (52.9%) had a 2-year mean RNIF above 0.5, and 277 of 811 OA journals (34.2%) and 3,453 of 6,705 subscription journals (51.5%) had a 5-year mean RNIF above 0.5. Moving to OA has proven to be advantageous to developing country journals; it has helped a large number of Latin American and many Indian journals improve their IF.”


Why Do We Still Have Journals?


“The Web has greatly reduced the barriers to entry for new journals and other platforms for communicating scientific output, and the number of journals continues to multiply. This leaves readers and authors with the daunting cognitive challenge of navigating the literature and discerning contributions that are both relevant and significant. Meanwhile, measures of journal impact that might guide the use of the literature have become more visible and consequential, leading to “impact gamesmanship” that renders the measures increasingly suspect. The incentive system created by our journals is broken. In this essay, I argue that the core technology of journals is not their distribution but their review process. The organization of the review process reflects assumptions about what a contribution is and how it should be evaluated. Through their review processes, journals can certify contributions, convene scholarly communities, and curate works that are worth reading. Different review processes thereby create incentives for different kinds of work. It’s time for a broader dialogue about how we connect the aims of the social science enterprise to our system of journals.”


Comparing journals from different fields of Science and Social Science through a JCR Subject Categories Normalized Impact Factor


Comparing journals from different fields of Science and Social Science through a JCR Subject Categories Normalized Impact Factor :

“The journal Impact Factor (IF) is not comparable among fields of Science and Social Science because of systematic differences in publication and citation behaviour across disciplines. In this work, a decomposition of the field aggregate impact factor into five normally distributed variables is presented. Considering these factors, a Principal Component Analysis is employed to find the sources of the variance in the JCR subject categories of Science and Social Science. Although publication and citation behaviour differs largely across disciplines, principal components explain more than 78% of the total variance and the average number of references per paper is not the primary factor explaining the variance in impact factors across categories. The Categories Normalized Impact Factor (CNIF) based on the JCR subject category list is proposed and compared with the IF. This normalization is achieved by considering all the indexing categories of each journal. An empirical application, with one hundred journals in two or more subject categories of economics and business, shows that the gap between rankings is reduced by around 32% in the journals analyzed. This gap is obtained as the maximum distance among the ranking percentiles from all categories where each journal is included.”
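The key idea, normalizing a journal's IF against every JCR subject category in which it is indexed rather than a single one, can be illustrated as follows. This is a sketch under stated assumptions, not the paper's actual CNIF formula: it assumes the journal's IF is divided by each of its categories' aggregate IF and then averaged across those categories.

```python
def cnif_sketch(journal_if, category_aggregate_ifs):
    """Hypothetical illustration of a category-normalized IF.

    journal_if: the journal's own impact factor.
    category_aggregate_ifs: aggregate IF of each JCR subject category
    the journal is indexed in (one value per category).

    The journal's IF is normalized by each category's aggregate IF,
    then averaged, so multi-category journals are not judged against
    a single, possibly unrepresentative, field baseline.
    """
    normalized = [journal_if / agg for agg in category_aggregate_ifs]
    return sum(normalized) / len(normalized)
```

For instance, an economics-and-business journal with IF 2.0 indexed in two categories with aggregate IFs 1.0 and 4.0 would score (2.0 + 0.5)/2 = 1.25 under this sketch, sitting above average in one field and below in the other.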