This book is the result of a joint research and development project undertaken in 2013 by UNESCO in partnership with the Public Knowledge Project (PKP), the Scientific Electronic Library Online (SciELO), the Network of Scientific Journals of Latin America, the Caribbean, Spain and Portugal (RedALyC), African Journals OnLine (AJOL), the Latin American Faculty of Social Sciences, Brazil (FLACSO-Brazil), and the Latin American Council of Social Sciences (CLACSO). The book aims to contribute to the understanding of scholarly production, use and reach through measures that are open and inclusive. It is divided into two sections.
The first section presents a narrative summary of Open Access in Latin America, describing the major regional initiatives that collect and systematize data on Open Access scholarship, and the available data that can be used to understand the (i) growth, (ii) reach, and (iii) impact of Open Access in developing regions. This section ends with recommendations for future activities. The second section offers in-depth case studies describing the indicators and methodologies of the peer-reviewed journal portals SciELO and Redalyc, and of a subject digital repository maintained by CLACSO.
URL : http://microblogging.infodocs.eu/wp-content/uploads/2015/08/alperin2014.pdf
Alternative location : http://hdl.handle.net/10760/25122
This article compares the Faculty of 1000 (F1000) quality-filtering results and Mendeley usage data with traditional bibliometric indicators, using a sample of 1,397 Genomics and Genetics articles published in 2008 and selected by F1000 Faculty Members (FMs). Both Mendeley user counts and F1000 article factors (FFas) correlate significantly with citation counts and the associated Journal Impact Factors; however, the correlations for Mendeley user counts are much stronger than those for FFas.
F1000 may be good at revealing the merit of an article from an expert practitioner's point of view, while Mendeley user counts may be more closely related to traditional citation impact. Articles that attract exceptionally high citation counts are generally disorder- or disease-related, while those with extremely high social bookmark user counts are mainly historical or introductory.
URL : http://2012.sticonference.org/Proceedings/vol2/Li_F1000_541.pdf
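The comparison this entry describes rests on rank correlations between altmetric counts and citation counts. As a minimal sketch (the invented reader and citation counts below are illustrative, not data from the study), Spearman's rho can be computed in plain Python:

```python
# Illustrative sketch (not the study's code): Spearman rank correlation
# between hypothetical Mendeley reader counts and citation counts per article.

def ranks(values):
    """Assign average ranks (1-based), handling ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over the group of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tied group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical per-article data: Mendeley readers vs. citations
mendeley = [120, 45, 300, 10, 80]
citations = [95, 30, 210, 5, 60]
print(round(spearman(mendeley, citations), 3))  # perfectly monotonic here -> 1.0
```

In practice one would use `scipy.stats.spearmanr`; the hand-rolled version above just makes the ranking step explicit.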
Google Scholar Metrics: an unreliable tool for assessing scientific journals :
« We introduce Google Scholar Metrics (GSM), a new bibliometric product of Google that aims at providing the H-index for scientific journals and other information sources. We conduct a critical review of GSM showing its main characteristics and possibilities as a tool for scientific evaluation. We discuss its coverage along with the inclusion of repositories, bibliographic control, and its options for browsing and searching. We conclude that, despite Google Scholar’s value as a source for scientific assessment, GSM is an immature product with many shortcomings, and therefore we advise against its use for evaluation purposes. However, the improvement of these shortcomings would place GSM as a serious competitor to the other existing products for evaluating scientific journals. »
URL : http://digibug.ugr.es/handle/10481/21540
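The indicator GSM reports per journal is the h-index: the largest h such that the journal published h articles each cited at least h times. A minimal sketch, with assumed citation counts (not GSM's implementation):

```python
# Sketch of the h-index computation GSM applies per journal:
# sort citation counts in decreasing order and find the largest position i
# whose count is still >= i.

def h_index(citations):
    cited = sorted(citations, reverse=True)
    h = 0
    for i, c in enumerate(cited, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# Hypothetical citation counts for one journal's articles
print(h_index([25, 8, 5, 3, 3, 2, 1, 0]))  # -> 3 (3 articles with >=3, but not 4 with >=4)
```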
The Leiden Ranking: scientific environment and configuration (Le classement de Leiden : environnement scientifique et configuration) :
« The Leiden Ranking is considered today as quite a pertinent and valuable alternative to the Shanghai Ranking. A significant number of its indicators involve field-specific citation characteristics (Field Citation Scores) and calculations based on the distribution of the data. It is produced by the CWTS centre of Leiden University, The Netherlands. »
URL : http://archivesic.ccsd.cnrs.fr/sic_00696098
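The field-specific citation scores mentioned in this entry rest on a simple normalization idea: each paper's citation count is divided by the world average for its field (and publication year), so fields with different citation cultures become comparable. A sketch of that mean normalized citation score, with invented field baselines:

```python
# Sketch of the field-normalization idea behind Leiden Ranking indicators:
# normalized score = citations / expected citations for the paper's field.
# The baselines below are invented for illustration only.

field_baseline = {"genomics": 20.0, "mathematics": 3.0}  # assumed avg citations/paper

def mncs(papers):
    """Mean normalized citation score over (field, citations) pairs."""
    return sum(c / field_baseline[f] for f, c in papers) / len(papers)

papers = [("genomics", 40), ("mathematics", 3), ("mathematics", 6)]
print(round(mncs(papers), 2))  # (2.0 + 1.0 + 2.0) / 3 -> 1.67
```

A score above 1.0 means the set of papers is cited more than the world average for its fields; the real indicator also normalizes by publication year and document type.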
Which alternative tools for bibliometrics in a research institute? :
« Nowadays, bibliometrics is a frequently used tool in scientific and technical information; it can be useful for quantifying scientific production and for collective or individual evaluations. The Web of Science (Thomson ISI) and the impact factor calculated from the JCR are the best-known references. We underline the limits and drawbacks of these overused indicators, especially the biases of the h-index. Alternative tools are emerging today. Our presentation compares all these products and examines their interest for librarians and researchers. »
URL : http://archivesic.ccsd.cnrs.fr/sic_00668741
The visibility of Wikipedia in scholarly publications :
« Publications in the Institute of Scientific Information’s (ISI, currently Thomson Reuters) Web of Science (WoS) and Elsevier’s Scopus databases were utilized to collect data about Wikipedia research and citations to Wikipedia. The growth of publications on Wikipedia research, the most active researchers, their associated institutions, academic fields and their geographic distribution are treated in this paper. The impact and influence of Wikipedia were identified, utilizing cited work found in WoS and Scopus. Additionally, leading authors, affiliated institutions, countries, academic fields, and publications that frequently cite Wikipedia are identified. »
URL : http://firstmonday.org/htbin/cgiwrap/bin/ojs/index.php/fm/article/view/3492/3031
Who Shares? Who Doesn’t? Factors Associated with Openly Archiving Raw Research Data :
« Many initiatives encourage investigators to share their raw datasets in hopes of increasing research efficiency and quality. Despite these investments of time and money, we do not have a firm grasp of who openly shares raw research data, who doesn’t, and which initiatives are correlated with high rates of data sharing. In this analysis I use bibliometric methods to identify patterns in the frequency with which investigators openly archive their raw gene expression microarray datasets after study publication.
Automated methods identified 11,603 articles published between 2000 and 2009 that describe the creation of gene expression microarray data. Associated datasets in best-practice repositories were found for 25% of these articles, increasing from less than 5% in 2001 to 30%–35% in 2007–2009. Accounting for sensitivity of the automated methods, approximately 45% of recent gene expression studies made their data publicly available.
First-order factor analysis on 124 diverse bibliometric attributes of the data creation articles revealed 15 factors describing authorship, funding, institution, publication, and domain environments. In multivariate regression, authors were most likely to share data if they had prior experience sharing or reusing data, if their study was published in an open access journal or a journal with a relatively strong data sharing policy, or if the study was funded by a large number of NIH grants. Authors of studies on cancer and human subjects were least likely to make their datasets available.
These results suggest research data sharing levels are still low and increasing only slowly, and data is least available in areas where it could make the biggest impact. Let’s learn from those with high rates of sharing to embrace the full potential of our research output. »
URL : http://www.plosone.org/article/info%3Adoi%2F10.1371%2Fjournal.pone.0018657
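The step from the 30–35% of articles with detected datasets to the "approximately 45%" figure is a sensitivity correction: if the automated detection only finds a fraction of truly shared datasets, the true sharing rate is roughly the observed rate divided by that fraction. The 0.73 sensitivity below is an assumption chosen so the arithmetic mirrors the abstract, not a number from the paper:

```python
# Back-of-envelope sketch of the sensitivity adjustment described in the
# abstract: true_rate ~ observed_rate / detection_sensitivity.
# The sensitivity value is assumed for illustration.

def adjusted_rate(observed_rate, sensitivity):
    """Correct an observed detection rate for an imperfect detector."""
    return observed_rate / sensitivity

observed = 0.33      # ~30-35% of 2007-2009 articles with detected datasets
sensitivity = 0.73   # assumed sensitivity of the automated detection methods
print(round(adjusted_rate(observed, sensitivity), 2))  # ~0.45
```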