Higher pass rate and lower workload thanks to comparing tool

At the Rotterdam Business School (HRBS), educational consultant Eline van Hamersveld and her team experimented with the Comproved comparing tool. The goal? To reduce teachers’ workload. The result? The workload went down and the students achieved a higher pass rate. How did they go about it? We asked Eline.

How did you get to know Comproved?

“At Creative Marketing and Sales, a course at the HRBS for which I work, some colleagues had seen the tool at an information market. They were intrigued by the concept and asked me to investigate what it could do for us.”

What was your motivation for getting started with the tool?

“There were several reasons why we wanted to test the tool. First, teachers indicated that they wanted to reduce their workload. In addition, we wanted to increase the role of peers in the learning process. Furthermore, from an educational perspective, I wanted to investigate whether we could assess more reliably with the tool. Finally, and I think this is an important one, we wanted to introduce variety into the way we assess.”

What exactly are you using the tool for?

“We tested the tool in a course called ‘Differentiation’. That course is about how companies position themselves and whether there is a gap in the market. The students investigate the positioning issue and write up their findings in a paper. Previously, 165 long papers were submitted to the instructor at the end of the block, who then had to read and grade each one. That was a lot of work.”

“We have modified the course. Students now write a one-pager. Halfway through the course, they give each other peer feedback using the comparing tool, and based on that feedback they rewrite their paper. They then assess and comment on each other’s final products with the tool. Finally, the instructor assigns a grade to one high-ranked and one low-ranked paper, and the tool calculates the remaining grades.”
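The grading step Eline describes, where the instructor grades only one high-ranked and one low-ranked paper and the tool derives the rest, can be pictured as a simple linear interpolation: the two anchor grades pin the scale, and every other paper’s grade follows from its position. This is a hypothetical sketch, not Comproved’s actual calculation; the function name and score scale are assumptions.

```python
def grades_from_anchors(scores, hi_score, hi_grade, lo_score, lo_grade):
    """Hypothetical sketch: map comparative-judgement scores to grades
    by linear interpolation between two instructor-graded anchor papers
    (one high-ranked, one low-ranked)."""
    slope = (hi_grade - lo_grade) / (hi_score - lo_score)
    return [round(lo_grade + slope * (s - lo_score), 1) for s in scores]

# Example: the anchor at score 2.0 received grade 8.0,
# the anchor at score 0.0 received grade 4.0.
print(grades_from_anchors([0.0, 0.5, 1.0, 2.0], 2.0, 8.0, 0.0, 4.0))
# → [4.0, 5.0, 6.0, 8.0]
```

The point of the design is that two human judgements are enough to calibrate the whole ranking, which is where the workload reduction comes from.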

The ranking based on peer assessment did justice to the actual quality of the papers. The students had a good sense of which paper was of better quality and which was not.

How did you go about integrating the tool into the course?

“We were granted budget and time from the Teaching and Learning Technology Workplace to test the tool. We first organized a mini-test with some teachers and students. We wanted to know what the students thought of the tool and whether they could easily work with it. Because that mini-test went well, we wanted to actually integrate the tool into the course.”

“We then redesigned the course with the teachers. We first devoted a lesson to getting to know the tool. That way the students could already see the tool and try it out using examples. Halfway through the course we organized a formative moment with a peer assessment using the tool. At the end of the block, the tool was also used summatively. Then students had to come to school during test week to make their comparisons.”

“Afterwards, we held a calibration session with the teachers where we looked at some of the student work. The ranking based on the peer assessment did justice to the actual quality: the students had a good sense of which paper was of better quality and which was not.”
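How does a single ranking emerge from many pairwise peer comparisons? Comparative judgement tools typically fit a statistical model such as Bradley–Terry to the comparison outcomes. The sketch below uses the standard MM fixed-point update for that model; it illustrates the general technique and is not a description of Comproved’s actual implementation.

```python
def bradley_terry(n_items, comparisons, iters=200):
    """Estimate relative quality strengths from pairwise outcomes
    given as (winner, loser) pairs, via the MM fixed-point update
    for the Bradley-Terry model. Higher strength = ranked higher.
    A generic sketch, not Comproved's code."""
    strengths = [1.0] * n_items
    wins = [0] * n_items
    for winner, _ in comparisons:
        wins[winner] += 1
    for _ in range(iters):
        new = []
        for i in range(n_items):
            # Sum over every comparison this item took part in.
            denom = sum(1.0 / (strengths[a] + strengths[b])
                        for a, b in comparisons if i in (a, b))
            new.append(wins[i] / denom if denom else strengths[i])
        total = sum(new)
        strengths = [s * n_items / total for s in new]  # normalize
    return strengths

# Example: paper 0 beat papers 1 and 2; paper 1 beat paper 2.
s = bradley_terry(3, [(0, 1), (0, 2), (1, 2)])
ranking = sorted(range(3), key=lambda i: -s[i])  # best first
```

Each student only ever answers the easy question “which of these two papers is better?”, and the model aggregates those judgements into the ranking the teachers later calibrated.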

Were you satisfied with the results?

“The teachers were very enthusiastic because they found it valuable to work with peer feedback halfway through the course. They also saw that the students learned more from the feedback: students used the insights from the peer feedback to improve their papers toward the final product. Furthermore, the teachers felt that the tool reduced the workload because the work was more spread out. Of course, they had some work setting up the new tool and the assessments, but in the end they did not have to read all 165 papers, just a few final products.”

We saw that the average grade was higher and more students passed the course while the learning objectives remained the same.

What do you think is the biggest advantage of the tool?

“We think the main benefit is that it makes the students learn more. We were able to compare the grades with the year we did not use the tool. We saw that the average grade was higher. Also, there was a higher pass rate among the students while the learning objectives remained the same. So we think the students learned more and better.”

Did you also experience any disadvantages?

“Some work does go into the preparation and guidance. For example, it is very important that students are properly guided in giving valuable feedback. For that, they definitely need instruction and practice. It is also best to embed feedback-giving in the lessons. This was, of course, a first-year course in the first semester. As students practice more often, they will acquire feedback literacy faster. We are thinking about expanding the use of the tool to other courses and other moments in the academic year.”

Do you have any tips for teachers who also want to use the tool?

“You need time and space to embed the tool properly. It is not something you do quickly; you have to become proficient in it. Also look into other people’s experiences. What are the pitfalls? Not paying attention to feedback-giving beforehand, for example. And finally, you need to brief your students properly. Students sometimes asked us why they had to grade each other’s papers; after all, they thought that was the teachers’ job. We explained that they could learn a lot from it and that it would help them master the assessment criteria. Once we explained that, the students understood and were excited to get started with the tool.”

Want to learn more about comparative judgement and the comparing tool? Download our ebook
