For decades, academic quality and impact have been assessed in the same way. But is that method adequate in a world fundamentally changed by digitalization and social media? Altmetrics offer an alternative.
Microbiologist Erwin Zoetendal does not have to worry about whether his work is noticed by the general public. Since he helped prove that transplanting gut flora – faeces, in other words – cures persistent intestinal infections, he has received a steady flow of emails from patients, school students and curious members of the public. And the evidence for this sizeable public interest is in the figures: his study has been mentioned in 41 newspaper articles, 31 blog posts, 1086 tweets and 87 Facebook posts – more than any other Wageningen publication in recent years. Keeping track of how academic publications are doing in the media – including social media – is a new approach to assessing a publication’s impact.
Currently there is a lot of experimentation going on with these kinds of alternative assessment systems, also known as alternative metrics, or altmetrics. Proponents of these methods see them as a useful addition to or improvement on current impact assessments. For decades researchers have been judged by the number of citations of their work by colleagues and by the ‘impact factor’ – the average number of citations per article – of the journals in which they are published. These figures, and others derived from them, have a big influence on who gets grants and jobs and thus a career in the academic world. But the traditional methods of assessment have their shortcomings. These were listed in a manifesto for altmetrics published by four researchers in 2010. One issue is that the impact factor says very little about the quality of individual articles but is nevertheless used as if it does. Secondly, the number of citations someone gets can easily be manipulated, the critics claim. What is more, the number of citations does not tell you how good the work is considered to be. ‘Read the nonsense in this paper’ is just as much a citation as ‘look at this important contribution’.
Altmetrics are said to offer several advantages over this system. You can see more quickly whether an article is going to be influential, and a broader concept of impact is used than the views of fellow scientists. But altmetrics have their own idiosyncrasies. Media impact can be manipulated too. Methods which are now used to promote commercial messages in the social media could soon play a role in science if this approach is taken. The emphasis on media attention also increases the role of media logic, with its focus on strange and striking findings, in the assessment of scientific impact. This is clearly illustrated by the fact that the Altmetrics Top 20 for 2014 included not just articles about ebola and stem cell research, but also articles about time travellers on the internet, penises in female insects and alcohol abuse by James Bond.
Altmetrics are still in their infancy, says Wouter Gerritsma, information specialist at the WUR library. All sorts of potential yardsticks are being tried out without anyone knowing whether they are sound. Companies such as Altmetric and Plum Analytics are newcomers which monitor the number of mentions in social media and on academic platforms. But even the well-known academic names are experimenting with adding detailed statistics, such as the number of times an article has been viewed or downloaded, to their conventional citation scores. And Gerritsma thinks this is just a start. He believes it is possible to generate much more specific information, such as keeping track of how many times an article is cited in the process of drawing up legislation. This would give a better statistical overview of the influence of Alterra reports, for example.
Although they form something of a global movement, altmetrics remain relatively unknown among scientists. Microbiologist Zoetendal, for instance, is pleased to hear that his article scores highly, but has never looked up the statistics himself. But there are Wageningen researchers who keep an eye on their score on the altmetrics lists. Colin Khoury, a PhD candidate at the Centre for Crop Systems Analysis, discovered altmetrics when he published an article in PNAS. This journal works together with the company Altmetric, which monitors the attention paid to all scientific articles – as with Zoetendal’s – in all kinds of media and on platforms. Next to every article on PNAS is a ‘donut’ showing its impact score, which expresses public interest in a single figure. Not every mention carries the same weight – a mention in Science counts for more than one on Twitter.
Khoury is very enthusiastic about the new yardstick. ‘It is an ideal method,’ he emails from Colombia, where he is doing his research. ‘My co-authors and I have used the method to demonstrate our social media impact in research proposals, for example.’ In Altmetric’s application he sees not only a single overall score but also precisely which media have cited him. He also gets to know more about his readers: where they come from, for instance, their professions and even their place in the academic pecking order. Not that many researchers will find themselves cited as much as Khoury, however – his article went all around the world and drew a tremendous amount of attention, so his altmetric score is sky-high, even for PNAS. Interest is also growing among groups other than scientists themselves. ‘You can see that universities are starting to hear about it, too,’ says Gerritsma. Recently, for instance, he gave a presentation for the Wageningen graduate schools, where people are considering whether altmetrics can help them in future to find the fairest possible way of assessing relevance.

Teething troubles
Gerritsma does not wish to dismiss the teething troubles of the new system, but he thinks many of them will be solved as altmetrics are further developed. ‘This development simply needs time. In the next few years we shall see what works and what doesn’t.’ Meanwhile, altmetrics are gradually gaining ground in Wageningen. The library hopes soon to make visible on its website, in a user-friendly way, how often Wageningen articles have been viewed and read. And even Zoetendal seems to be cautiously warming to the idea. ‘Perhaps I should take a look at my scores after all.’
Are you a researcher and are you curious as to what altmetrics might mean for your articles? Then look up ‘altmetrics’ on resource-online.nl. Here you will find a ranking of the most popular Wageningen articles of recent years, as well as useful links.