The international visibility of French-language scholarly journals: the case of the information and communication sciences (SIC) and Études de Communication

“The information and communication sciences (SIC) are a specifically French discipline. The Anglo-Saxon world has maintained the distinction between Library and Information Science on the one hand and Communication and Media Studies on the other. As a journal belonging to the SIC field, Études de Communication addresses themes that belong both to LIS and to media studies. The international visibility of French-language journals, in a world that is becoming ever more anglicized, is problematic. When the scientific field in which a journal is rooted does not exist outside the francophone world, this problem becomes even more acute. The aim here is to define the notion of visibility and what is at stake in it, both in general and in the more specific field of the SIC, to examine the place of French in the scientific world, and to consider a practical case, that of Études de Communication, with its characteristics and its solutions.”

URL : http://memsic.ccsd.cnrs.fr/mem_00741376

In science “there is no bad publicity”: Papers criticized in technical comments have high scientific impact

Technical comments are special types of scientific publications whose aim is to correct or criticize previously published papers. Comments are often perceived negatively by the authors of the criticized articles, because they are believed to make the commented papers appear less worthy or trustworthy in the eyes of the scientific community.

Thus, there is a tendency to think that criticized papers are predestined to have low scientific impact. We show here that such a belief is not supported by empirical evidence. We consider thirteen major publication outlets in science and perform a large-scale analysis of the citation patterns of criticized publications.

We find that commented papers not only have average citation rates much higher than those of non-commented articles, but also unexpectedly over-populate the set of the most cited publications within a journal. Since comments are published soon after the papers they criticize, they can be viewed as early indicators of those papers' future impact.

Our results represent one of the clearest observations of the popular wisdom that “any publicity is good publicity”, according to which success may follow from negative criticism, a claim for which there have been very few empirical validations so far.

Our results also go beyond this, touching on core topics of research in the philosophy of science, because they emphasize the fundamental importance of scientific disputes for the production and dissemination of knowledge.
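The kind of comparison the authors describe can be illustrated with a short sketch: average citation rates of commented versus non-commented papers, and the share of commented papers among a journal's most cited publications. The snippet below is not the authors' code; the record fields `citations` and `commented` and the top-cited threshold are assumptions made for illustration only.

```python
from statistics import mean

def compare_commented_papers(papers, top_fraction=0.05):
    """Compare citation rates of commented vs. non-commented papers.

    `papers` is a list of dicts with hypothetical keys:
      'citations' (int)  - citation count of the paper
      'commented' (bool) - whether a technical comment was published on it
    Returns average citation rates for both groups and the share of
    commented papers among the journal's most cited publications.
    """
    commented = [p["citations"] for p in papers if p["commented"]]
    others = [p["citations"] for p in papers if not p["commented"]]

    # Share of commented papers among the top-cited fraction of the journal.
    ranked = sorted(papers, key=lambda p: p["citations"], reverse=True)
    top_n = max(1, int(len(ranked) * top_fraction))
    top_commented_share = sum(p["commented"] for p in ranked[:top_n]) / top_n

    return {
        "avg_citations_commented": mean(commented) if commented else 0.0,
        "avg_citations_other": mean(others) if others else 0.0,
        "share_commented_in_top": top_commented_share,
    }

# Toy example with invented numbers (not data from the study):
papers = [
    {"citations": 120, "commented": True},
    {"citations": 15, "commented": False},
    {"citations": 42, "commented": False},
    {"citations": 310, "commented": True},
    {"citations": 8, "commented": False},
]
print(compare_commented_papers(papers, top_fraction=0.4))
```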

URL : http://arxiv.org/abs/1209.4997

A measure of total research impact independent of time and discipline

“Authorship and citation practices evolve with time and differ by academic discipline. As such, indicators of research productivity based on citation records are naturally subject to historical and disciplinary effects. We observe these effects on a corpus of astronomer career data constructed from a database of refereed publications. We employ a simple mechanism to measure research output using author and reference counts available in bibliographic databases to develop a citation-based indicator of research productivity. The total research impact (tori) quantifies, for an individual, the total amount of scholarly work that others have devoted to his/her work, measured in the volume of research papers. A derived measure, the research impact quotient (riq), is an age independent measure of an individual’s research ability. We demonstrate that these measures are substantially less vulnerable to temporal debasement and cross-disciplinary bias than the most popular current measures. The proposed measures of research impact, tori and riq, have been implemented in the Smithsonian/NASA Astrophysics Data System.”
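As a rough illustration of how an author- and reference-normalized measure of this kind can be computed from bibliographic records, here is a minimal sketch. It is not the ADS implementation: the exact normalization used for tori and riq differs in detail, and the field names and the career-length normalization below are assumptions.

```python
import math

def tori(cited_papers):
    """Approximate 'total research impact' for one individual.

    `cited_papers` is a list of this person's papers; each is a dict with
    hypothetical keys:
      'n_authors'      - number of authors of the cited paper
      'citing_n_refs'  - reference counts of the papers citing it
    Each citation is weighted by 1 / (number of references in the citing
    paper), and credit is shared among the cited paper's authors.
    """
    total = 0.0
    for paper in cited_papers:
        weight = sum(1.0 / n for n in paper["citing_n_refs"] if n > 0)
        total += weight / paper["n_authors"]
    return total

def riq(cited_papers, career_years):
    """Age-normalized variant: square root of tori divided by career length."""
    return math.sqrt(tori(cited_papers)) / career_years

# Toy example (invented numbers):
papers = [
    {"n_authors": 2, "citing_n_refs": [30, 45, 25]},
    {"n_authors": 4, "citing_n_refs": [60, 10]},
]
print(round(tori(papers), 3), round(riq(papers, career_years=10), 4))
```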

URL : http://arxiv.org/abs/1209.2124

Web Impact Factor (WIF) and Link Analysis of Indian Institute of Technologies (IITs): A Webometric Study

“This paper examines and explores the web impact factor through a webometric study of the 16 Indian Institutes of Technology (IITs) in India. It identifies the domain systems of the websites, analyzes the number of web pages and link pages, and calculates the simple web impact factor (WIF), the self-link web impact factor and the external-link web impact factor of all the IITs. It also shows that some IITs have a higher number of web pages, but that their link pages are comparatively few, so their websites fall behind in their simple, self-link and external-link web impact factors.”
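For readers unfamiliar with the measures, the sketch below computes the three ratios following the webometric definitions commonly used in such studies (link pages divided by the site's own page count). The counts would normally come from a search engine or crawler; the figures and variable names here are purely illustrative, not data from the study.

```python
def web_impact_factors(total_link_pages, self_link_pages,
                       external_link_pages, web_pages):
    """Compute simple, self-link and external-link web impact factors.

    All arguments are counts for a single website, e.g. one IIT domain:
      total_link_pages    - pages linking to the site (self + external)
      self_link_pages     - pages within the site linking to the site itself
      external_link_pages - pages outside the site linking to it
      web_pages           - number of pages indexed for the site
    Each WIF is a ratio of link pages to the site's own page count.
    """
    if web_pages == 0:
        raise ValueError("web_pages must be positive")
    return {
        "simple_wif": total_link_pages / web_pages,
        "self_link_wif": self_link_pages / web_pages,
        "external_wif": external_link_pages / web_pages,
    }

# Toy example with invented counts:
print(web_impact_factors(total_link_pages=5400,
                         self_link_pages=3100,
                         external_link_pages=2300,
                         web_pages=12000))
```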

URL : http://digitalcommons.unl.edu/libphilprac/789/

Maximizing the impacts of your research: a handbook for social scientists

“There are few academics who are interested in doing research that simply has no influence on anyone else in academia or outside. Some perhaps will be content to produce ‘shelf-bending’ work that goes into a library (included in a published journal or book), and then over the next decades ever-so-slightly bends the shelf it sits on. But we believe that they are in a small minority. The whole point of social science research is to achieve academic impact by advancing your discipline, and (where possible) by having some positive influence also on external audiences – in business, government, the media, civil society or public debate.

For the past year a team of academics based at the London School of Economics, the University of Leeds and Imperial College London have been working on the Impact of Social Sciences project aimed at developing precise methods for measuring and evaluating the impact of research in the public sphere. We believe our data will be of interest to all UK universities to better capture and track the impacts of their social science research and applications work.

Part of our task is to develop guidance for colleagues interested in this field. In the past, there has been no one source of systematic advice on how to maximize the academic impacts of your research in terms of citations and other measures of influence. And almost no sources at all have helped researchers to achieve greater visibility and impacts with audiences outside the university. Instead researchers have had to rely on informal knowledge and picking up random hints and tips here and there from colleagues, and from their own personal experience.

This Handbook remedies this key gap and, we hope, will help researchers achieve a more professional and focused approach to their research from the outset. It provides a large menu of sound and evidence-based advice and guidance on how to ensure that your work achieves its maximum visibility and influence with both academic and external audiences. As with any menu, readers need to pick and choose the elements that are relevant for them. We provide detailed information on what constitutes good practice in expanding the impact of social science research. We also survey a wide range of new developments, new tools and new techniques that can help make sense of a rapidly changing field.”

URL : http://eprints.lse.ac.uk/35758/

Open access versus subscription journals: a comparison of scientific impact

Authors: Bo-Christer Björk, David Solomon

Background

In the past few years there has been an ongoing debate as to whether the proliferation of open access (OA) publishing would damage the peer review system and put the quality of scientific journal publishing at risk.

Our aim was to inform this debate by comparing the scientific impact of OA journals with subscription journals, controlling for journal age, the country of the publisher, discipline and (for OA publishers) their business model.

Methods

The 2-year impact factors (the average number of citations to the articles in a journal) were used as a proxy for scientific impact. The Directory of Open Access Journals (DOAJ) was used to identify OA journals as well as their business model.

Journal age and discipline were obtained from Ulrich's periodicals directory. Comparisons were performed at the journal level as well as at the article level, where the results were weighted by the number of articles published in a journal.

A total of 610 OA journals were compared with 7,609 subscription journals using Web of Science citation data while an overlapping set of 1,327 OA journals were compared with 11,124 subscription journals using Scopus data.
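The journal-level versus article-weighted comparison described in the Methods can be sketched as follows. This is not the authors' code; the record fields `impact_factor`, `n_articles` and `is_oa` are hypothetical, and the toy figures only illustrate how weighting by article counts changes the group averages.

```python
def average_impact(journals, weighted=False):
    """Average 2-year impact factor over a set of journals.

    `journals` is a list of dicts with hypothetical keys:
      'impact_factor' - 2-year impact factor of the journal
      'n_articles'    - number of articles the journal published
    With weighted=True each journal counts in proportion to its article
    output (an article-level comparison); otherwise every journal counts
    once (a journal-level comparison).
    """
    if weighted:
        total_articles = sum(j["n_articles"] for j in journals)
        return sum(j["impact_factor"] * j["n_articles"]
                   for j in journals) / total_articles
    return sum(j["impact_factor"] for j in journals) / len(journals)

# Toy example comparing OA and subscription journals (invented figures):
journals = [
    {"impact_factor": 2.1, "n_articles": 150, "is_oa": True},
    {"impact_factor": 1.4, "n_articles": 600, "is_oa": True},
    {"impact_factor": 3.0, "n_articles": 200, "is_oa": False},
    {"impact_factor": 2.4, "n_articles": 900, "is_oa": False},
]
oa = [j for j in journals if j["is_oa"]]
subs = [j for j in journals if not j["is_oa"]]
for label, group in (("OA", oa), ("subscription", subs)):
    print(label, round(average_impact(group), 2),
          round(average_impact(group, weighted=True), 2))
```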

Results

Overall, average citation rates, both unweighted and weighted for the number of articles per journal, were about 30% higher for subscription journals. However, after controlling for discipline (medicine and health versus other), age of the journal (three time periods) and the location of the publisher (four largest publishing countries versus other countries) the differences largely disappeared in most subcategories except for journals that had been launched prior to 1996.

OA journals that fund publishing with article processing charges (APCs) are on average cited more than other OA journals. In medicine and health, OA journals founded in the last 10 years are receiving about as many citations as subscription journals launched during the same period.

Conclusions

Our results indicate that OA journals indexed in Web of Science and/or Scopus are approaching the same scientific impact and quality as subscription journals, particularly in biomedicine and for journals funded by article processing charges.

URL : http://www.biomedcentral.com/1741-7015/10/73