
Measuring up: take two

Lists are only valuable if they are correct. The h-index top 20 in the last issue of Resource did a few scientists an injustice. To put that right, here is a second attempt.

The ink on 'Measuring up' was barely dry before Michael Muller sent an indignant email: since when did he no longer belong on the list? He was quickly followed by Hans Tramper, who said that 'as an old hand he wasn't that concerned'. And hadn't we left out Tjakko Abee, someone else suggested. Yes, we had. Annoying, but not entirely unexpected. Fortunately, the complaints stopped there. In the new list shown below these errors have been corrected, unavoidably at the expense of three others. Louise Vet, Jacques Vervoort and Martien Groenen, please accept our apologies.

Expert judgement

Get it right first time: wasn't that possible? In principle, yes, but compiling a list like this is no easy matter. Wageningen UR does not keep its own records of these scores, and a scientist who wants to know her own h-index has to look it up herself. In practice there is only one good way to compile the top 20: expert judgement, in other words sounding out a broad selection of people who could or should be in the know. So that is what was done. But it is a fallible method: among those thousands of scientists it is easy to overlook someone, which means that even the amended and extended list may still contain errors.

Improbable

Besides, an h-score by no means tells the whole story. Total output, the total number of citations, the average number of citations per article and the citation count of the most frequently cited article give a much more complete picture of a researcher's scientific standing. These statistics, shown in the table below, reveal striking differences. Peter Hollman, for example, achieves h = 48 with just 120 articles, which is reflected in an improbable average of more than 100 citations per article, although 2288 (!) citations for his top article obviously bumps up the score rather nicely.

WoS

The total number of citations also varies greatly, with Daan Kromhout the real star at over 25,000 citations. Willem de Vos is not far behind, but he needed some 180 more articles to get there. Incidentally, the data come from Web of Science (WoS), currently the most widely used and most complete source for h-index data. On request, WoS will display each scientist's citations per year as a graph. Staff of Wageningen UR have access to WoS via the library website (www.library.wur.nl).

| researcher, field | h-index | total citations | citations per article (avg.) | citations, top article | total articles |
|---|---|---|---|---|---|
| Willem de Vos, microbiology | 77 | 22677 | 36 | 427 | 626 |
| Daan Kromhout, epidemiology | 76 | 25567 | 57 | 2288 | 445 |
| Maarten Koornneef, genetics | 65 | 13301 | 63 | 553 | 210 |
| Willem van Riemsdijk, soil chemistry | 57 | 10482 | 43 | 488 | 242 |
| Edith Feskens, nutrition | 56 | 14781 | 48 | 2288 | 309 |
| Marcel Dicke, entomology | 56 | 10396 | 39 | 790 | 268 |
| Willem Norde, physical chemistry | 53 | 9984 | 42 | 710 | 233 |
| Frans Kok, nutrition | 52 | 10414 | 33 | 480 | 326 |
| Peter Hollman, food chemistry | 48 | 12162 | 101 | 2288 | 120 |
| Ton Bisseling, molecular biology | 48 | 7266 | 42 | 258 | 173 |
| Martien Cohen Stuart, physical chemistry | 48 | 8676 | 25 | 257 | 353 |
| Frank Berendse, nature management | 47 | 6013 | 42 | 641 | 150 |
| Michael Muller, nutrigenomics | 46 | 7093 | 47 | 573 | 154 |
| Just Vlak, virology | 46 | 7103 | 24 | 233 | 291 |
| Tjakko Abee, food microbiology | 45 | 5734 | 30 | 155 | 192 |
| Fons Stams, microbiology | 46 | 7202 | 25 | 206 | 279 |
| Hans Tramper, bioprocess technology | 45 | 7491 | 20 | 189 | 369 |
| Pierre de Wit, phytopathology | 44 | 5944 | 39 | 237 | 159 |
| Marten Scheffer, aquatic ecology | 43 | 7548 | 52 | 1224 | 146 |
| Michiel Kleerebezem, microbiology | 43 | 5469 | 34 | 474 | 163 |
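
For readers curious how the first column is arrived at, below is a minimal Python sketch of how an h-index can be computed from a list of per-article citation counts. The sample citation counts are invented for illustration and are not taken from Web of Science; only the final line repeats the arithmetic behind Peter Hollman's average from the table.

```python
def h_index(citations):
    """Largest h such that at least h articles each have at least h citations."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h


if __name__ == "__main__":
    # Hypothetical citation counts for a small body of work (illustration only).
    sample = [120, 85, 60, 44, 30, 21, 12, 9, 5, 2]
    print("h-index:", h_index(sample))  # 8: eight articles cited at least 8 times each

    # The 'citations per article (avg.)' column is simply total citations
    # divided by total articles, e.g. Peter Hollman: 12162 / 120 is roughly 101.
    print("average per article:", round(12162 / 120))
```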
