
Measuring up – Academia adopts the h-index

Since the h-index was launched in 2005, it has become hard to imagine the academic world without it: a simple, clear-cut number that sums up a researcher's impact. But it turns out to have its limitations. In pursuit of the real value of h. Plus the inevitable list: whose h-factor is the highest?

Never heard of the h-index or the h-factor? Call yourself an academic? The researchers of 2012 cannot imagine a world without the h-index. Rarely has a metric been adopted as rapidly as the index launched by Jorge Hirsch in 2005. It is based on citation counts: the number of times a scientific publication is referred to. A scientist with an h-factor of ten has ten publications to her name that have each been cited at least ten times, but not eleven that have each been cited at least eleven times. The index translates the impact (the number of citations) and the scope (the number of articles) of a scientist's work into a single number. A simple but brilliant idea, as it turned out.
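For anyone who wants to see the rule at work, the calculation fits in a few lines. The sketch below is purely illustrative Python, not part of any citation database; the citation counts are invented:

```python
def h_index(citations):
    """Largest h such that h publications have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited paper first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still clears the bar
        else:
            break
    return h

# Ten papers cited [25, 18, 12, 11, 10, 10, 10, 6, 4, 1] times give h = 7:
# seven papers with at least seven citations, but not eight with at least eight.
print(h_index([25, 18, 12, 11, 10, 10, 10, 6, 4, 1]))  # 7
```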

The higher the better

Seven years down the line, the h-index has become inescapable. Researchers' importance is measured by their h score. No academic CV is complete without it. Something that happened during the research for this article seems very telling. The editors drew up a list of the Wageningen academics with the highest h-factor, and emailed the researchers concerned to check that the data were correct. The first replies came back within five minutes and almost everyone had responded within a day, from Wageningen, Hanoi, South Africa, all over the world. Some wrote to correct the figures, some to explain or comment on the index, and others were simply curious about their position in the rankings. Clearly the h-index is hot. And the higher the better.

Top of the class at Wageningen UR is Willem de Vos (h = 77). He heads the top 20, just ahead of Daan Kromhout. So does this make Willem de Vos Wageningen's best scientist? 'You could interpret it like that. I'm not doing badly', he responds modestly. But De Vos is the first to express strong reservations about the index. 'It's just a figure. There are all sorts of ways of looking at it. It's the bureaucrats who are particularly taken with the h-factor. It tells you something about citations, but it is not the most significant factor in a scientist's importance. The real point is the content of your publications, of course.'

De Vos does have a couple of explanations for his high score. Age plays a big role: by its very nature, the h-factor can only increase. Once cited, always cited. 'I started young myself, had my first publication as first author at 25, and became a professor at 32. That helps. What is more, I have worked in several new professional fields. That makes a difference too. It makes you a pioneer and your articles get referred to a lot. And you don't do it by yourself, you should take that into account. I have a big group of people that I work with.'

To correct for age, De Vos says the h-factor often gets divided by the number of years since the researcher got a PhD. 'That should come out well over two, if you are reasonably good. Mine is almost three.' Just Vlak (h = 46) points to another method, known as the Eigen factor: the h-index divided by the number of years that the researcher has been working. 'The h-index is a good measure of performance, but it is not infallible. You can also look at how recently the top ten of someone's most cited articles were published. That gives a better idea of the current relevance of their contribution to science. Scientists with a high h score, but with most of the citations dating back to the 1980s and 90s, would not appear to be working at the leading edge anymore.'
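In code, both age corrections amount to a single division. A minimal sketch with invented numbers (the article does not give De Vos's exact career length; the quotient he describes is known elsewhere as Hirsch's m):

```python
def per_year_quotient(h, years):
    """h divided by career years: De Vos's rule of thumb uses years since
    the PhD, Vlak's variant the years the researcher has been working."""
    return h / years

# Hypothetical: h = 77 after 27 post-PhD years comes out at roughly 2.9,
# comfortably over De Vos's 'well over two' bar and close to his 'almost three'.
print(round(per_year_quotient(77, 27), 1))  # 2.9
```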

Inconsistencies

Most of the 'highfliers' approached point out the limitations of the h-index. Daan Kromhout (h = 76) comments on the piggyback effect: authors who get credit for articles they hardly contributed to. According to Kromhout, this plays an especially big role in articles about large-scale epidemiological studies, genetic research and meta-analyses. 'Many researchers get impressive citation scores that way, and a high h-index, without having made a significant intellectual contribution to the article.'

And then there is the system's unreliability. 'Sadly, one of my most-cited articles was not included', responds Willem Norde (h = 53). 'For some reason or other, Web of Science has missed two volumes of Colloids and Surfaces B: Biointerfaces.' This cost Norde one full point. Others have had similar experiences.

An outspoken opponent of the use of the h-index is the Centre for Science and Technology Studies (CWTS) at the University of Leiden. The centre specializes in bibliometric research and recently issued a statement warning of the pitfalls of the index. According to scientometrician Ludo Waltman (h = 9), the h-index concept produces some 'strange results'. 'We call that the inconsistency of the h-index. If two researchers who have h-indices of 9 and 10 respectively write an article together, their rankings can be reversed. The one with a 9 goes up to 11 while the one with a 10 stays put. They swap positions, even though they wrote the article together.' The question is of course how often such inconsistencies arise. 'That is difficult to say. But that is not what bothers us. The point is that those inconsistencies exist.'

According to Waltman, the h-index also disadvantages researchers who are very selective in their publication strategy. 'The h-index is a combination of quality and quantity. But people who focus strongly on quantity are at an advantage. Anyone who really wants to deliver something special doesn't publish a lot, and is therefore at a disadvantage. The h-index is too focused on quantity.'

Besides these theoretical objections, the CWTS also has reservations about the practical uses of the h-index. According to Waltman, it is hard to get a clear picture of the way the h-index is used in the policy of universities and institutes. 'We are concerned about that. When there is a chair vacant, candidates' h-indices are increasingly taken into account. That is not bad in itself, but the h-index sums everything up in a single number, while little attention is paid to the underlying complexities of someone's scientific significance. The h-index is applied far too mechanistically.'
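Waltman's swap is easy to reproduce. The sketch below uses constructed numbers in the spirit of the CWTS counterexample; here the pair co-author two well-cited papers, and the exact figures are invented for illustration:

```python
def h_index(citations):
    ranked = sorted(citations, reverse=True)
    return max([r for r, c in enumerate(ranked, 1) if c >= r], default=0)

a = [13] * 9   # researcher A: nine papers, 13 citations each -> h = 9
b = [10] * 10  # researcher B: ten papers, 10 citations each  -> h = 10
print(h_index(a), h_index(b))                  # 9 10

joint = [13, 13]  # two co-authored papers, each collecting 13 citations
print(h_index(a + joint), h_index(b + joint))  # 11 10 -> the ranking flips
```

The same joint work lifts A past B because A's existing papers were cited just above the new threshold, while B's sat exactly on it.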

Apples and pears

Rector Martin Kropff (h = 31) vigorously disputes this picture. 'If I appoint a professor, I want to know how high his or her h-index is. That is true. But there is a story behind it. We expect someone to be serious about publishing and to be cited. But we do look at the publications themselves, for example, and at which journals they appear in.' It also makes a difference, Kropff says, what sort of appointment is involved. 'For a personal professor it's all about the research. But a chair holder has to be able to do much more than that. He has to lead a group of people, to be a good teacher, to be able to inspire others and bring them to greater heights. Someone with a high h-index has a big impact. But that is not all there is to it. In Wageningen multidisciplinary work, working in teams, is important as well. That is our strength. What is more, it is not just the research itself that matters to us, but the impact of our research. And sometimes it is better to appoint young people, who will have a lower h-index by definition. So a high h-index doesn't automatically win.'

'The h-index is important, but it is just one of the things we use and is certainly not the be-all and end-all', agrees talent scout Henrieke de Ruiter (h = …). 'You really have to use it with great caution.' According to De Ruiter, you should look at the score in the light of the subject area; otherwise you will be comparing apples with pears. 'You need to know what the h-indices of the top researchers in that field are. In one field you will be in the top 5 with an h-index of 30, whereas in another even an h-index of 40 won't put you in the top 20.'

Kropff also points out the big differences in publication culture between academic fields. 'In the social sciences there has long been a culture of writing books instead of articles. And they are not taken into account in calculating the h-factor. In our tenure track system we do take them into account, though. There you get points for writing a book as well. My own most-cited work ever is a book, so that does not feature in the citation lists. And then there's the fact that there are far more people working in some fields than in others. That means more publications and more citations. A great deal is published in molecular biology, for instance.'

HCP index

There has been criticism of the h-index ever since it was launched. The literature includes dozens of attempts to come up with a system that compensates for its perceived shortcomings. Waltman has had a go too: he came up with the HCP index. HCP stands for highly cited publications. 'That means that you only look at the number of articles with citations above a set lower limit. You could, for example, set the lower limit at the top 20 percent of the most-cited articles in a field. That uses the same idea as the h-index: an article only counts if it comes above a certain level. But the HCP doesn't have the inconsistencies of the h-index. The bias towards quantity is gone, and you can compare academic fields with each other more easily.' But Waltman has few illusions about the chances of his invention toppling the h-index from its pedestal. The system is too popular for that; it would be a losing battle. 'I have just seen another job ad for a research post which asked for your h-index.'
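As a sketch, the counting rule Waltman describes might look as follows. How the top-20-percent boundary for a given field is computed is not spelled out in the article, so the threshold here is simply passed in, and the citation counts are hypothetical:

```python
def hcp_count(citations, field_threshold):
    """Highly cited publications: papers whose citations clear a
    field-specific floor, e.g. the count marking the top 20 percent
    of that field."""
    return sum(1 for c in citations if c >= field_threshold)

# Hypothetical field where the top-20-percent boundary is 15 citations:
print(hcp_count([40, 22, 15, 9, 3], field_threshold=15))  # 3
```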

Which h?

One definitive h-index does not exist. It depends which database you use: Web of Science (ISI), Scopus (Elsevier) or Google Scholar (Google). Web of Science goes back to the early twentieth century, whereas Scopus only started counting journal articles in 1996. Both services also make their own selection from the estimated 6,000 scientific journals in circulation. This makes for remarkable differences, certainly for scientists who were already active before 1996. Runner-up Daan Kromhout, for example, has an h-index of 76 in Web of Science but scores only 60 in Scopus: a full 16 points lower. Google Scholar's citation profiles have only been online a few months and the service has yet to prove itself. The search engine counts reports, scientific books and theses as well as articles, which results in considerably higher h-indices. And Google Scholar is the only one of the three that is accessible to the general public.
