In an increasingly competitive academic world we are all eager to know which university is ‘the best’. And a raft of league tables comes to our aid, including the Times Higher Education, Shanghai, QS and Leiden rankings. There is strong criticism of these lists, though. And of the way universities make selective use of the results. But there is no way round them, says information specialist Wouter Gerritsma.
Critics of rankings must have sighed at the news that the American magazine US News is launching a new league table for universities all over the world. Yet another. The critics of these lists do not mince their words, although they have become a little predictable. The rankings are not comparing like with like, they say, because every country has its own higher education system. They are also accused of a bias towards English-speaking countries, because all the main international scientific journals are in English. Rich (American) universities also benefit from the capacity to ‘buy’ a place at the top of the list by attracting famous scientists and Nobel Prize winners. Many of the criticisms come from universities that are lower down the rankings than they would like. Not that they hesitate to shout from the rooftops if they suddenly do better the next time round.
But nobody can avoid the rankings, because they have become standard practice. This goes for the publishers, but also for the universities, given that potential students increasingly often consult them before deciding where to do a degree. ‘The universities and students in Asia are particularly competitive,’ says Wouter Gerritsma of Wageningen UR. ‘The Chinese who come here have all seen where Wageningen UR stands in the ranking. And Asian universities use the league tables to decide who they want to work with.’
So more and more universities are doing their best to be placed high in the rankings. The University of Maastricht, for example, an outspoken critic of the rankings, established a committee this month tasked with improving its position in them. ‘Increasingly, they determine the reputation and the visibility of the university, and what is more, people base their choice of degree programme on them,’ explains the new head of International Rankings to the Maastricht university magazine Observant. Maastricht has already taken steps to boost its citation scores.
Gerritsma, who delivers the bibliometric data to the compilers of the rankings, has mixed feelings about them. Some are better than others, he thinks. To his mind, the Times Higher Education ranking is better than the Shanghai one because the Times takes into account education, international collaboration and contract earnings. This puts Wageningen UR far higher in the Times ranking than in the Shanghai one. Shanghai also relies more than the Times on an assessment of reputation, a figure which can only be obtained by subjective means, namely a questionnaire. By reputation, Harvard is the best university in the world. So Harvard always comes top of the rankings in which reputation is an important criterion. ‘Then you are not really evaluating quality; in fact you are basing your assessment on the last ranking, because that influences people’s judgements,’ says Paul Wouters. Paul Wouters is director of the Centre for Science and Technology Studies (CWTS) in Leiden, where he measures and analyses academic production and quality. He also produces the only Dutch ranking: the Leiden Ranking. So he knows what goes into this kind of league table. Or does he? ‘One of the problems is the lack of transparency. It is impossible to find out how the main rankings are put together. I cannot study the data on which others base their rankings.’
According to Wouters, the most influential rankings – those of Times Higher Education, Shanghai and QS – do not measure the quality of the university but its publications score. This way it is always large, well-endowed universities that come out on top. They also place a higher value on papers in Nature and Science than on those in other journals, while ‘those are by no means the most important journals in all sectors. This constitutes a bias towards the natural sciences.’
The influential rankings are based, moreover, on information provided by the universities themselves, without any quality control. This enables universities to massage the data to put them in a better light. For example: you can count your PhD candidates as staff (aio’s in Dutch) or as students. Since the Dutch universities started copying overseas universities in counting PhD candidates as students, they have all risen up the rankings.
So the most influential rankings are not the best ones, concludes Wouters. He himself prefers rankings which evaluate on only one or two quality criteria and are not dependent on questionnaires or on data provided by the universities themselves. Then one of the rankings stands out, says Wouters, and that is the Leiden Ranking. Since Wouters just happens to be the creator of the Leiden Ranking, he is aware that he could be accused of blowing his own trumpet. Wouters: ‘But in the Leiden Ranking we make no claim to knowing which university is the best. Using the Web of Science database, we paint a picture of a research group: how much do they publish, and how often are they cited compared with others in their field. That is what we stick to.’
So what is Wageningen UR’s rightful place in the international rankings? Because Wageningen UR specializes in agriculture, nutrition and the environment, you need to look at rankings dealing with these specialisms, thinks Gerritsma. Wageningen hovers around 20th place in the Times Higher ranking for life sciences universities, and is consistently in the top 3 in rankings for agricultural universities worldwide. Gerritsma: ‘We publish large numbers of articles in the field of plant and animal sciences and of environmental sciences. In terms of size we are in sixth place in the world. So it is very logical that we are high on a field-specific ranking like this. What I mainly look at is how well we are doing compared with other strong agriculture and life sciences faculties.’
So the overall scores on the big global rankings do not tell us much, says Gerritsma. An Asian wanting to study environmental sciences in Europe needs to know that Wageningen is in the global top 10 in that field.
Universities are not just assessed for the quality of their research, but also for their education. People who want to know which degree programmes score highest can consult three Dutch platforms: Studiekeuze123, Keuzegids Universiteiten and Elsevier. Interestingly, these are all based on the same data. So how does that work?
Anyone who dives into the world of academic rankings soon gets tangled up in a network of providers and methods. This does not make comparing educational rankings any easier. At the top of the pyramid stands the Studiekeuze123 foundation, commissioned by the ministry of education to provide a guide to the higher education landscape in the Netherlands. The foundation maintains a large database with statistics from the universities association VSNU and the accreditation organization NVAO. It also conducts an annual survey among all 70,000 higher education students (the National Student Survey or NSE) to find out what they think of their degree programme and institution.
All this data ends up on the website Studiekeuze123.nl, the bible for school leavers. The website contains masses of data: about the degree programme, the institution and student evaluations of every imaginable aspect of the programme. But what is not routinely offered is information on which programme or institution scores highest. The ministry wants openness about quality, but no rankings with winners and losers.
But users, of course, do want this kind of verdict. And they are well-served by the Keuzegids Universiteiten and Elsevier. The Keuzegids is published by the Centre for Higher Education Information (CHOI) in Leiden. The guide makes use of the same data as Studiekeuze123, but presents and orders it so that it is easy to compare programmes and institutions. In other words, it is a ranking.
The second guide to make use of the Studiekeuze123 database is that of Elsevier. This weekly magazine launched a permanent website for the purpose this year: bestestudies.elsevier.nl. Elsevier too compares programmes and institutions, but makes an editor’s choice of elements that are important for the quality of the education. Sports and canteen facilities, for instance, are left out of the picture.
Elsevier does not rank the programmes, but does indicate which ones are either well above or far below average. The aim is to give a more realistic picture per subject area, since the differences between programmes are often extremely small.
But whichever system you use, Wageningen consistently comes out of the comparisons as a front runner when it comes to excellent degree programmes. According to Anja van den Broek of research bureau ResearchNed, this has a lot to do with the small classes Wageningen is known for and the specialized nature of the programmes. ‘What is more, it is a lively university and nowadays it offers a lot of hip degree programmes. For the time being, Wageningen is expected to stay at the top.’
With thanks to ResearchNed, the research bureau that does the data processing for both Studiekeuze123 and Elsevier.
The rankings ranked by Wouter Gerritsma
- Times Higher Education (THE), a British ranking, influential and comprehensive;
- Shanghai Ranking, also known as the Academic Ranking of World Universities (ARWU), the main Asian ranking;
- QS World University Rankings, British, produced by the former compiler of the THE ranking;
- The Leiden Ranking, made by the Centre for Science and Technology Studies (CWTS) in Leiden. ‘A research ranking which only counts publications and citations.’
- US News and World Report. New this year: the first American ranking to assess non-American universities as well.
Illustration: Henk van Ruitenbeek