



Open Forum:

This tab has been created on the home page of the EURASC website as a substitute for the previous one, which was placed in the restricted "members area".

The purpose of this new tab is to give all EURASC members the possibility to discuss scientific subjects of their choice, novelties in their field, research policy, and so on; to exchange information; to keep up a kind of correspondence through newsletters; and to create a forum for discussion.

All members are invited to submit short forum letters or short responses to published ones.

Contributions in PDF format should be sent to the following address: forum@eurasc.org.
They will be published online in this tab.



Performance Assessment of Belgian Researchers

Michel Gevers

Emeritus Professor UCL, Guest Professor VUB

Performance evaluations and rankings have become all-pervasive. Rating agencies are evaluating countries and banks, consulting agencies are assessing the performance of companies, and human resource offices are evaluating their workers. The evaluation culture has invaded all aspects of our life; only the evaluators appear to have escaped this evaluation frenzy. Historians will perhaps remember the start of the 21st century as a period when more effort was spent on evaluating than on producing.

The evaluation culture is also present everywhere in the universities. Universities are compared by the famous rankings (Shanghai, Times Higher Education, etc.), while professors are evaluated at every step of their career, whether for a promotion or for a research funding application.

The criteria are constantly revised, in part because every new evaluation criterion leads to adaptation strategies. As soon as the research community finds out that a particular research performance criterion has become dominant in the evaluation committees, some of its members adapt their research and publication strategy to maximize their score on the newly adopted criterion, rather than aiming at producing the most creative research results. For a while it was the number of publications that mattered, until it became obvious that it is easy to publish many papers in low-quality journals. After a period in which the assumed quality of the journals was taken into account, as measured, for example, by the infamous impact factors, the trend in the last 5 to 10 years has been to consider the number of citations of a document (book, book chapter, journal article, conference paper) as the most important measure of research performance for a researcher. The idea is that if a paper is highly cited by researchers in the same field, it is probably because it contains important new ideas; conversely, if a paper is never cited, it probably means that it was not deep or ground-breaking.

Today, the number of citations has become accepted as the leading indicator for the evaluation of research quality, at least in fundamental research. It is encapsulated in a number of criteria that have been devised to produce the most compact possible measure of performance, the most notorious one being the h-index. A researcher has an h-index of 20, say, if 20 of his/her publications have each been cited at least 20 times. A productive researcher will proudly exhibit his/her h-index, while a researcher who is less well endowed will hope that his/her evaluation committee will not look it up on the specialized websites. One should add that publication rhythms and citation traditions vary widely between disciplines, and that one can never compare citation records or h-indices across disciplines.
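
To make the definition concrete, here is a minimal sketch (in Python) of how an h-index could be computed from a researcher's list of per-publication citation counts; the citation record below is invented for illustration.

def h_index(citations):
    # Largest h such that at least h publications have h or more citations each.
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, count in enumerate(counts, start=1):
        if count >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical citation record: one entry per publication.
record = [48, 33, 30, 22, 21, 20, 20, 15, 9, 4, 1, 0]
print(h_index(record))  # -> 9: nine publications have at least 9 citations each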

The use and misuse of the h-index in the evaluation of researchers is widely criticized within the research community, and rightly so, because the maximisation of one’s h-index often conflicts with the pursuit of long-term visionary research. Evaluation committees like to make their lives simple, and they therefore long for a unique indicator that would allow them to rank researchers without any effort, rather than facing the hard task of evaluating the creativity, depth and impact of the contributions of these researchers themselves. In addition, the dominance of a single performance indicator inevitably produces perverse effects among some members of the research community. To give just one example, some journals now request that authors cite other authors who have published in the same journal, so as to increase the citation record of that journal. I want to make it clear that one should keep a very critical eye on the use and abuse of the h-index and the other sophisticated performance indices that are constantly being devised or refined.

Keeping in mind the cautionary remarks made above, but given that the number of citations has become so prevalent in the evaluation of fundamental research all over the world, and that it is indeed one important indicator of the impact of a research paper or book within a scientific discipline, I have found it interesting to compare these citation numbers for 17 OECD countries that are considered to be among the most research-active. All the analyses in this report have been performed using data available from the websites of the World Bank, the OECD and SCImago, a website that specializes in the ranking of journals and countries on the basis of citations of papers and that is powered by SCOPUS, a database of scientific documents and citations maintained by Elsevier. The data in this report do not use the h-index, but citation numbers of documents produced by the 17 countries.

On the SCImago website one can compute citation numbers of documents published in all possible countries over the period 1996-2010, or separately for each year of that period. The number of citations given for a specific country and for a particular year X is the number of citations of all papers published by authors of that country during the year X and cited during the years X, X+1, X+2, etc., until 2010. Thus, the citation numbers for the year 2008, say, refer to the number of citations of all documents published in 2008 and cited in 2008, 2009 or 2010. When referring to the period 1996-2010, all documents published during that period are considered.
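
As an illustration of this counting rule, the following sketch (the document records and field names are hypothetical, not SCImago's actual data format) counts the citations of a given country's documents published in year X, keeping only citations received between year X and the 2010 horizon.

def citations_for(documents, country, year, horizon=2010):
    # Citations of documents published by `country` in `year`,
    # received during the years year, year+1, ..., horizon.
    total = 0
    for doc in documents:
        if doc["country"] == country and doc["published"] == year:
            total += sum(1 for cited_in in doc["citation_years"]
                         if year <= cited_in <= horizon)
    return total

# Hypothetical record: a 2008 Belgian paper cited in 2008, 2009, 2010 and 2011.
docs = [{"country": "BE", "published": 2008, "citation_years": [2008, 2009, 2010, 2011]}]
print(citations_for(docs, "BE", 2008))  # -> 3: the 2011 citation falls outside the window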

A comparison between countries based on the total number of citations would not be meaningful, of course, because it would not take account of the size of the population of the countries involved, the number of researchers, or the budgets allocated to research. In order to compare countries of such varying populations, one needs to introduce some normalization. In this study, different types of normalization have been introduced: in the first, the number of citations per document produced by authors of the 17 countries is examined; in the second, the number of citations of a country is divided by the budget allocated to research in the higher education institutions of that country; in the third, an index called Normalized Impact (NI) is studied, which will be explained later.
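
To fix ideas, here is a small sketch of the first two normalizations (all figures below are invented for illustration; the Normalized Impact index is described further on):

def citations_per_document(total_citations, total_documents):
    # First normalization: average number of citations per published document.
    return total_citations / total_documents

def citations_per_budget(total_citations, budget_musd):
    # Second normalization: citations per million USD of
    # higher-education research budget.
    return total_citations / budget_musd

# Hypothetical country totals over 1996-2010.
print(citations_per_document(1_200_000, 150_000))  # -> 8.0 citations per document
print(citations_per_budget(1_200_000, 30_000))     # -> 40.0 citations per million USD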

Citations per document

 


Proposals presented by Members at the General Assembly in Athens

Herbert Gleiter

(Karlsruhe Institute of Technology (KIT), Institut für Nanotechnologie)



Dear Mrs. de Rode, dear Mr. Capasso,


I would like to summarize some of the ideas and results I obtained by talking to several key people of the German National Academy of Sciences and the German Academy of Engineering, and through my membership in the National Academies of India and of the US.
In principle, the future role of EURASC in Europe may be envisioned as that of a purely scientifically oriented Academy, similar, for example, to the American Academy of Arts and Sciences (AAAS). The structure, goals, etc. of the AAAS are listed on the AAAS home page, www.amacad.org.

If EURASC decides to adopt this purely scientific mission, the question of the funding of EURASC has to be discussed and solved. The AAAS is supported by a generous flow of donations and, moreover, holds endowed funds of several billion USD (cf. the AAAS homepage). The AAAS publishes quarterly a collection of articles on open questions of general interest, such as the financial crisis, mass incarceration, the nuclear future, etc. These articles are written by outstanding people in these fields, most of whom are AAAS members. All of this is funded by the AAAS.

Alternatively, EURASC may see its mission in giving advice to the European Union. This role would be comparable to that of, e.g., the US National Academies. The structure, goals, etc. of the US National Academies may be found on their homepage, www.nationalacademies.org. They publish regular reports to Congress, numerous papers on current issues, etc. Their staff comprises more than a hundred permanent employees. They are generously funded by the US Government because the National Academies were started under President Lincoln by means of the "Act of Incorporation", which guarantees the funding of the Academies as well as their close and influential relationship with the US Government.

The German National Academy of Sciences (Leopoldina) - homepage www.leopoldina-halle.de - has a similar historical background. It was chartered by Emperor Leopold I in 1677 as the "Sacri Romani Imperii Academia Caesareo-Leopoldina Naturae Curiosorum". This start guaranteed the outstanding position of this Academy in Europe, as well as its funding. During WW II it lost this status. However, in 2008 the Government of (by then reunited) Germany declared the Leopoldina to be the National Academy of Sciences of Germany once again. This declaration included (similarly to the US National Academies) its role as an advisor to the Government, as well as its funding by the Government.

All of this seems to teach us the following. If EURASC wants to become the European Academy of Sciences, a declaration by the Government in Brussels seems to be required. This declaration would have to include the financial support of EURASC and define it as the advisor to the European Government. In fact, after WW II the Leopoldina tried to regain its role as the National Academy by negotiating with the different State Academies in Germany. These negotiations went on for 8 years without any result.

I got a comparable response when talking to several top representatives of National Academies in Europe: we do not need a European Academy. Moreover, we already have EASAC and ALLEA (homepage www.EASAC.eu). These organisations take care of all cooperation between the existing National Academies, the press releases, the joint publications, etc. In addition, Academia Europaea is already a member of EASAC, so why do we need EURASC?

This brings me to a point we may have to consider before we approach Brussels. We should clarify with the Academia Europaea and the European Academy of Sciences and Arts (at Salzburg) how we and how they see their roles in Europe. The obvious question is and was: can we join up with these two Academies and form one EURASC?

Naturally, this letter raises far more questions than it answers. However, before we begin to discuss the future role and structure of EURASC on January 14, 2011, we should try to define the mission of EURASC.

 

