
Measuring up: take two

Lists are only valuable if they are correct. The h-index top 20 in the last issue of Resource did a few scientists an injustice. To put that right, here's a second attempt.

The ink on ‘Measuring up’ was barely dry before Michael Muller sent an indignant email. Since when did he no longer belong on the list? He was quickly followed by Hans Tramper, who said that ‘as an old hand he wasn’t that concerned’. And hadn’t we left out Tjakko Abee, suggested someone else. Yes, we had. Annoying, but not entirely unexpected. Fortunately, the complaints stopped there. In the new list shown below, these errors have been corrected, unavoidably at the expense of three others. Louise Vet, Jacques Vervoort and Martien Groenen, please accept our apologies.

Expert judgement

Getting it right first time: wasn’t that possible? In principle yes, but compiling a list like this is no easy matter. Wageningen UR doesn’t keep its own records of scores like these. What’s more, a scientist who wants to know her own h-rating has to look it up herself. In fact, there is only one good way to compile the top 20: expert judgement. In other words, sounding out a broad selection of people who could or should be in the know. So that is what has been done. But it’s a fallible method. Among those thousands of scientists, it is easy to overlook someone. Which means that even the amended and extended list may still contain errors.

Improbable

Besides, an h-score by no means tells the whole story. Total output, the total number of citations, the average per article and the number of citations for the most frequently cited article give a much more complete picture of a researcher’s scientific worth. The statistics in the table below reveal striking differences. For example, Peter Holman achieves an h of 48 with just 120 articles, which is reflected in the improbable average of more than 100 citations per article. Although, obviously, 2,288 (!) citations for a top article bumps up the score rather nicely.

WoS

The total number of citations also varies widely, with Daan Kromhout the real star at over 25,000 citations. Willem de Vos isn’t far behind, but he needed 180 more articles to get to his position. Incidentally, the data come from Web of Science, currently the most widely used and most complete source in the h-field. On request, WoS will display each scientist’s citation score per year in graph form. Staff of Wageningen UR have access to WoS via the library website ([www.library.wur.nl](http://www.library.wur.nl/)).
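For readers unfamiliar with how such a score is arrived at, here is a minimal sketch of the standard h-index calculation: a researcher has index h when h of their articles have each been cited at least h times. The citation counts in the example are invented for illustration and are not figures from the table.

```python
# Minimal sketch of the standard h-index calculation.
# The citation counts below are invented for illustration;
# they are not data from Web of Science or the table.

def h_index(citations):
    """Largest h such that at least h articles have >= h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Four articles have at least 4 citations each (10, 8, 5 and 4), so h = 4.
print(h_index([10, 8, 5, 4, 3]))  # prints 4
```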
