WUR is high in the rankings because of the demonstrably big impact of Wageningen research. And this is largely due to a smart publishing strategy, say information specialists.
Illustration: Pascal Tieman
Getting published is vital for research. A matter of survival, in fact, for the individual researcher: ‘publish or perish’. And no other Dutch university is as good at getting published as Wageningen University & Research. The institution’s international success can largely be put down to this.
So say information specialists Peter van der Togt of the Forum library and Philipp Fondermann of Elsevier Information Systems in an article about the Wageningen publication strategy in Procedia Computer Science. That strategy is straightforward. Van der Togt sums it up: ‘Make sure you get published in Q1 journals, because that gives you the greatest chance of a big impact.’ Q1 stands for the top 25 percent of journals in a field, ranked by their impact; Q2 to Q4 stand for the other quartiles.
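The quartile idea is simple ranking arithmetic. A minimal sketch, with made-up journal names and impact factors (not any official ranking), of how journals in one field could be binned into Q1–Q4:

```python
# Hypothetical illustration: assign impact quartiles to journals in one
# field by ranking them on impact factor. Data is invented for the example.

def quartile(rank, total):
    """Return 'Q1'..'Q4' for a 1-based rank within `total` journals."""
    return f"Q{min(4, (rank - 1) * 4 // total + 1)}"

journals = [("A", 9.1), ("B", 5.2), ("C", 3.3), ("D", 2.8),
            ("E", 1.9), ("F", 1.1), ("G", 0.7), ("H", 0.4)]

# Rank from highest to lowest impact factor, then label each quartile.
ranked = sorted(journals, key=lambda j: j[1], reverse=True)
quartiles = {name: quartile(i + 1, len(ranked))
             for i, (name, _) in enumerate(ranked)}
# The two highest-impact journals land in Q1, the two lowest in Q4.
```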
This publication strategy is based on the work of erstwhile information specialist and bibliometrics guru Wouter Gerritsma. Scientists’ impact can be measured by the number of times their articles are cited by colleagues, but it takes a few years for an article to prove its worth. Gerritsma discovered that the Q value of the journal in which an article is published is a good predictor of the impact the article will have. ‘So publishing in a Q1 journal is a good indicator of a big impact,’ explains Van der Togt. ‘The impact assessment of tenure trackers is based on this.’
The emphasis on smarter, more calculated publishing has got through to all the branches of the organization, and has borne fruit, according to Van der Togt. The figures bear out the claim (see the graphs). The proportion of Wageningen articles in Q1 journals has risen over 10 years by nearly one quarter, to 63 percent. The share of articles that belong to the 10 percent most frequently cited articles in their field went up by 10 percentage points in the same period. Well over 25 percent of all Wageningen articles get into that top 10 percent nowadays. The relative impact of the average Wageningen article went up from 1.75 to 2.75. That means that a Wageningen article is cited 2.75 times as often as the average article in the same field during the same period worldwide. No other Dutch university can match that impact.
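The relative-impact figure above is just a ratio. A minimal sketch, with invented citation counts, of how such a field-normalized score is computed:

```python
# Illustrative only: the "relative impact" mentioned above is an article's
# citation count divided by the worldwide average for articles in the same
# field and period. The numbers below are made up for the example.

def relative_impact(citations, field_average):
    """Ratio of an article's citations to the field/period world average."""
    return citations / field_average

# An article cited 55 times in a field averaging 20 citations scores 2.75,
# i.e. it is cited 2.75 times as often as the average article in its field.
score = relative_impact(55, 20)
```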
Richard Visser, head of Plant Breeding and Dean of Research, confirms that the attitude to publishing has changed. ‘We have started looking more consciously at the impact of journals. Previously 90 percent of our articles were published in Euphytica. That was the best journal in the field of breeding. Many institutes had their own “in-house journals”. With the rise of impact factors for journals, people here started saying we should publish in other journals too, journals with a higher impact. Nowadays we publish most often in Theoretical and Applied Genetics. And we have also deliberately started publishing in journals like Plant Cell in order to reach a different readership.’
Yet Visser is alert to the dangers of letting the impact factor play too dominant a role. ‘What you want first and foremost is to reach the audience that is closest to your research. That means you sometimes publish in journals with a lower impact factor because they are the ones your readers read. We publish in Profyta, for instance, a periodical for the ornamental plant sector which doesn’t have an impact factor at all. It’s about passing on information to the sector as well.’ According to Visser, PhD candidates have a big say in the choice of journal as well. ‘Some of them just want to publish seven or eight articles and it doesn’t matter much where. But others only want to be published in Nature or Science, or the like.’
Besides more targeted publishing, Wageningen researchers have also started publishing more. That is to say: more peer-reviewed articles. Van der Togt: ‘You can see a clear shift from writing in the form of proceedings, reports and books to articles. Ten years ago, 60 percent of Wageningen’s academic output was articles, and that is 80 percent now.’ Van der Togt puts all these developments down to the new publication strategy. ‘But you cannot prove that 100 percent of course. There is no baseline measurement: we cannot compare the situation with a WUR without that strategy. What we can do is to carry out these bibliometric analyses, because then our metadata are in order. In the old days people only used to look at the number of articles, but now they look at the quality too. On that score Wageningen is ahead of the game.’
But does the increased impact of Wageningen researchers actually mean the researchers themselves have got better? ‘Well, they have certainly got better at publishing,’ says Van der Togt cautiously. ‘But to publish in those better journals, you have to put more effort into your articles. So the articles are better and that’s quality too.’ Professor Visser agrees with him. ‘If you aim higher, your data have to be better and you have to present them in a better way. By publishing better you become a better scientist.’