Predatory publishers have appeared on the scene in the wake of successful open access (OA) journals such as PLoS ONE. OA journals are free for readers but authors pay a publication fee. This makes it possible to set up a journal of this kind and make money without having to invest in such things as peer review, proper archiving and editing. Of course there have always been bad journals, says information specialist Wouter Gerritsma, but the internet is making things worse. 'It has become much easier to become a publisher. You just have to have a website and you can publish articles as PDFs.' Worldwide there are now hundreds of academic journals of dubious quality and, to say the least, dubious intentions. They charge publication fees varying from a few hundred to more than a thousand euros.
Wageningen researchers sometimes get taken for a ride too. Professor of Crop Physiology Paul Struik, for example, published two papers last year in academic journals belonging to the predatory publisher Academic Journals. He regrets it now, he says. 'The quality of the layout was bad and it was difficult to get changes made. What is more, they were extremely aggressive in demanding payment. I received 10 reminders and sent the same proof of payment 10 times.' After publication, Struik was also put on a list of reviewers, only to find his mailbox filling up with an 'incomprehensible mess' of manuscripts.
Other researchers make deliberate use of the many highly accessible new journals. These can provide them with a place to publish disappointing results, for example. 'I still had an introductory chapter from my thesis in which there was nothing very new,' says researcher Roland Melse. A new OA journal was happy to publish it without any significant changes. 'So I hadn't written it for nothing, after all.' According to Google Scholar, it has been cited 13 times already.
Some researchers are astonished to find their articles turning up on the 'dark side'. Researcher Hans Stigter, for example, says he never gave permission for the publication of an article with his name on it in a dubious journal. Others, however, are not at all keen to talk about their 'dubious' publications. They email grumpily back saying that it was their PhD students who had contact with these journals, or that they had nothing to do with the publication in question.
In order to lure unsuspecting scientists, predators must look respectable and trustworthy. What better way to do that than with a big name from the scientific field concerned? So publishers spam scientists continuously with requests to join their editorial teams. A handful of Wageningers fell for it.
Associate professor of Phytopathology Bart Thomma accepted a position at The Open Plant Science Journal in 2007, when mailings of this kind were still rare. He still hasn't done any work in this capacity, so he has neither positive nor negative experiences. The decisive factor in his agreeing to the role was the presence of a highly respected fellow scientist. This may sound naïve to bystanders, but Thomma points out that even for established journals, informal contacts and reputations are important. A CV is superfluous because your publications are really your calling card. 'Perhaps it is indeed naïve and I let myself be impressed by the names on the list,' says Thomma now.
For professor of Organic Chemistry Han Zuilhof, too, his experience as an editor became a headache. 'I was approached by email by OMICS, with a reference to a respectable colleague and friend at the Weizmann Institute,' says Zuilhof. With some hesitation, he agreed to become an editor, but his doubts quickly grew. 'The impact factor they gave for an existing OMICS journal turned out to be a fiction, and I didn't get any articles to review either: precisely nil.' When Zuilhof received no answer to his questions, it was the last straw. He resigned without having seen a single paper. But OMICS still boasts of having him as an editor.
Librarians nowadays have their hands full filtering out predators and warning people about them. 'Knowing whether a publisher is to be trusted is often a case of gut feeling,' says Wouter Gerritsma. What he does with suspect journals is ask researchers about their publishing experience. Such publishers can also often be spotted by their unprofessional sites. Take World Science Publisher (motto: 'make easy publication'), whose website is a free Blogspot blog full of spelling errors.
Gerritsma is afraid, though, that a few rotten apples could damage the reputation of the promising open access system. For this reason he would like to see the prompt development of an objective assessment method for sorting good journals from bad (see box). For the time being, the American librarian Beall has become the leading authority. His list (Beall's list) is currently the best indicator of the reliability of a publisher.
However, Beall thinks researchers should learn to see through fakes themselves. They should be able to resist the temptation of superfast publication, and they should be careful what they read and cite, because the unpredictable and low quality of peer review in predatory journals leaves a lot of scope for unethical behavior such as plagiarism.
But that sort of clumsiness is precisely what can give the game away. American publication specialist Phil Davies succeeded in getting an article published that consisted of a computer-generated mishmash of words. So if you have nothing better to do one drizzly November afternoon, you too could try the fast route to an academic career. Who knows, you might soon find yourself in the - no kidding - Antarctica Journal of Mathematics.
Predatory journals can earn money by riding the wave of the massive boom in open access publishing. It is not easy to compare the quality of these journals objectively. Scientists try to do so using the average number of citations per article (the impact factor), but that factor can only be calculated after three years. So, led by former Wageningen librarian Leo Waaijers, the ICT organization SURF is working on a new method. This approach will be based on the quality of the peer review. Is it transparent, for instance? Is the process described in detail, and is the evaluation as objective as possible? This information should be on the journals' sites.
The approach was tested at a recent meeting. 'It went remarkably well,' says Waaijers. 'The predatory journals were spotted immediately. The objection made against Beall's list, that it is too subjective, cannot be made against this instrument.' The project will be rounded off early next year. The final form the method will take has not been decided, but it may be an app.