After the recent scandal around lax assessment practices at Inholland applied sciences university, it seems more important than ever to stick to clear and assessable examination requirements. As a result, the examiners of 750 modules at Wageningen have to draw up a solid assessment plan. A necessary evil.
But lecturer Katja Teerds has now given the course a thorough overhaul. The link between behaviour and hormones is now the theme running through the whole course. There is a new textbook, and the exam asks not so much for factual knowledge as for an ability to make connections and to integrate knowledge. ‘That makes the course much nicer', says Teerds. ‘I now get mainly positive reactions.'
The figures back her up. In their course evaluations students indicate that they consider the course academically solid and that they gain a lot from it, giving grades around the 4.5 mark. And the pass rate has improved from 36 to 54 percent. The exam was still not very positively evaluated though, with a 3. ‘I still have to work on that', says Teerds. ‘The students found the new book tough because there are a lot of examples in it, so which examples do they have to know? I'm going to make that clear in the new study guide.'
‘Good that it's being done'
Never again Inholland, is the dominant sentiment in Dutch higher education. The institute rigged exam results, cutting off its nose to spite its face (recruitment went down 30 percent) as well as seriously denting the image of Dutch higher education. The government responded by tightening up examination requirements. All universities now have to demonstrate that their assessment procedures are transparent and that exams are reliable and valid. The first programmes to notice this in Wageningen will be the 35 degree programmes due for their six-yearly inspection this year.
In total, assessment plans have to be made for 750 courses at Wageningen University. The examiners check whether the exams are sound and whether students get the grades they deserve. ‘It's a devil of a job', sighs one of the teachers involved. Only to add, ‘but it is good that it is being done.' Other teachers enjoy having outsiders take an interest in ‘their' subjects, and think it is useful.
Valid, reliable and transparent
The magic words for exams are: valid, reliable and transparent. Validity means that the exam clearly reflects the goals of the course. ‘An obvious example: you don't test oral presentation skills with a written exam', says Maurice Franssen, teacher of Organic Chemistry and secretary of the examination board on Technology and Nutrition. The assessment must also be reliable: if two examiners grade the same paper, they should arrive at the same grade. And thirdly, the assessment must be transparent: students should know in advance what they have to know and how they will be assessed. ‘An example is group work, in which we also want to assess the participants individually', says Franssen. ‘Then it must be perfectly clear in the study guide and the assessment plan how we are going to do that.'
The examination boards haven't come across any major slip-ups. ‘But we simply have to be consistent', says Dick van der Hoek. ‘If we have formulated a learning goal, it should be reflected in the exam. And if one student gets a 7 and another a 6.5 with comparable answers, you have to be able to explain why.' Van der Hoek is evaluating the assessment plans of 20 courses at Environment and Landscape. He is almost halfway through.
But making sure the rules are kept is not the only goal of quality control on exams. It is a positive development for students too, says Van der Hoek. ‘Students are studying more and more purposefully. The exam used to be a selection criterion. Nowadays it is a way of providing guidance: we explain properly exactly what students need to know for each component and how much weight is given to it, so that students know how much energy to spend on which topics. They are no longer taken by surprise in the exam.'
This reflects another of the university's objectives that can be achieved through the examination boards' rounds of visits to the examiners. Explaining better what students have to do, learn and understand should lead to fewer delays in completing a degree course. Such delays are bad for both students and the university, now that the Bachelor's courses have been clearly separated from the Master's courses. In other words: arbitrariness and lack of clarity concerning exams have no place in this day and age.
Wageningen students give their courses high marks, as can be seen every year in the Higher Education Degree Guide. But students no longer express their appreciation directly to the teachers. Only 20 to 30 percent of the students fill in the course evaluations, say Franssen and Van der Hoek. ‘So teachers do not get a good impression of how popular their course is.' Formerly, course evaluations were filled in much more systematically. Students used to be handed the questionnaires on paper by the teacher at the end of the last class, and everyone had to fill them in before leaving the room. Since a digital form was introduced, the response from students has been low. Katja Teerds, who teaches Behavioural Endocrinology, has a problem with this too. ‘I got a response of only 14 percent, dramatically bad. The exam was in July and everyone's mind is on the holiday by then. But it means you don't get a real sense of what students think of your course.' Teerds has a solution: ‘a compulsory evaluation to fill in straight after the exam.'