Project introduction and background information
The popularity of quiz tools is based, in part, on the voting/revoting paradigm, according to which students first provide their initial answers to multiple-choice questions (i.e., the "voting phase"), receive aggregated feedback based on class responses (i.e., collective feedback), and then answer the same questions a second time (i.e., the "revoting phase") before they receive the correct answers and participate in the class discussion that follows. Collective feedback, however, is often limited to the popularity of each question choice. This is true for many well-established tools such as Kahoot, Socrative, and Wooclap. Consequently, students may feel encouraged to rely on probabilistic strategies, changing their initial answers to the most popular one (i.e., conformity bias). To address this information gap, studies on online assessment and group awareness have suggested including additional feedback metrics that could better describe the characteristics of the population that voted for each question choice. At the same time, the literature on metacognition suggests that eliciting metacognitive judgments from students may have a positive impact on their understanding and metamemory (e.g., "How confident are you that you got the right answer?"; "How much did you prepare for today's quiz?").
Objective and expected outcomes
The project utilizes quiz activities that elicit metacognitive judgments during the voting phase and use students' responses as collective feedback metrics in the revoting phase. In addition, the project explores the potential of allowing students to write short justifications for their answers, thus offering the opportunity for elaboration in an otherwise closed-type interaction. Therefore, in some quiz activities, the students not only see the popularity of the different choices but also how their peers justified their answers. Finally, the project explores how introducing group gamification within the context of a quiz affects student performance. Based on the above, the project aims to:
- Create instructional material with guidelines and best practices for teachers on designing meaningful and challenging quizzes that go beyond memory recall.
- Explore how using different metacognitive judgments as feedback metrics could benefit students' learning in quiz activities.
- Explore how writing short justifications may affect students' performance in quizzes.
- Explore how introducing group gamification (i.e., a group leaderboard) could affect students' performance in quizzes.
Results and learnings
Feedback on easy, challenging, and wicked questions
Enriching collective feedback by showing how confident and prepared the students are can be very beneficial and counteract the conformity bias, but mostly when a question is challenging (e.g., when the students are closely divided between a couple of choices). In the case of an easy question, the popularity of the correct answer is too high (usually over 70%), so students tend to switch their answers to the most popular one during the revoting phase. Unfortunately, this also happens in wicked questions (i.e., in questions where a wrong choice is by far the most popular one).
The feelings of preparation and confidence are good predictors of performance
Correlation analyses have repeatedly shown that students' self-reported feelings of how prepared they are for the quiz and how confident they are that they got the right answer are strongly and positively correlated with their performance in the voting phase. In other words, the preparation and confidence metrics can be useful indicators of the correct answer. This is also why using such metacognitive judgments can counteract the conformity bias in challenging questions: when popularity information is not enough, students rely on other feedback information to find the right answer.
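The kind of association described above can be illustrated with a Pearson correlation coefficient. The sketch below uses invented numbers purely for illustration; it is not the project's actual data or analysis code.

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical data: self-reported confidence (1-5 scale) and quiz scores (0-10)
confidence = [2, 3, 5, 4, 1, 5, 3, 4]
scores = [4, 5, 9, 7, 3, 10, 6, 8]

# A value close to +1 indicates a strong positive association
print(round(pearson_r(confidence, scores), 2))  # → 0.97
```

A coefficient near +1, as in this toy example, is what "strongly and positively correlated" refers to; in practice one would also report a significance test alongside the coefficient.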
Justifications, confidence, and performance
The more confident the students feel, the longer the justifications they write. Even though the length of the justification is not always correlated with the actual performance, the analysis showed that asking students to take a moment and justify their answers can be beneficial for them. Further analysis is needed to identify whether the improved performance is based on actual reflection on the question, on the prolonged time-on-task that a writing task imposes, or on the disruption of superficial answering strategies (i.e., clicking without thinking).
Group gamification can be perceived in different ways
When asked, the students were divided on whether adding a competitive component to a learning activity is a positive thing. Results suggested that, in several cases, students who participated in a quiz without gamification elements outperformed students who were competing against each other. Negative feelings towards a competitive activity could be one reason for such a result; an alternative explanation is that gamification consumed valuable time on task that the students would otherwise have used more productively.
Recommendations
Aim for challenging quizzes
Ideally, a quiz should be challenging enough to differentiate between levels of knowledge. Questions that are too easy or too difficult do not give the teacher a clear picture of the students' knowledge, and they can cause a false feeling of achievement or disappointment in the students.
Eliciting metacognitive judgments is easy and beneficial even without the collective feedback
Adding a simple question asking students about their confidence could activate reflection or increase attention. So, even if using these metacognitive judgments as collective feedback in the revoting phase is technically challenging, it is still advisable to elicit them at least in the voting phase. They will also give the teacher a better picture of the students' state during the quiz activity.
Educate teachers and students on the role of collective feedback
When a revoting phase is used, it can be useful to explicitly explain to teachers and students that a quiz is not a popularity contest and that, very often, a popular answer is the result of a common misconception. Participants should therefore always take the collective feedback with a pinch of salt.
Enrich the collective feedback when possible
That said, it is strongly advisable to enrich the collective feedback, whenever possible, so that it presents more information about the audience than just the popularity of the question choices.
Use gamification with caution
As has been noted repeatedly in the literature, not all learning activities benefit from gamification, and not all students appreciate competition. Integrating gamification into a learning activity should take into account the instructional design and the characteristics of the specific audience.
Written justifications improve the learning experience
Not all closed-type questions are appropriate for justification: the question should allow some room for discussion; otherwise, it is difficult for students to provide meaningful justifications. The process of writing justifications makes the activity more meaningful, but also more demanding and time-consuming. It is nevertheless advisable to include the possibility of written justifications, especially in self-paced quiz activities that students can do at home.
Practical outcomes
The project produced several quizzes in the Psychology bachelor's programme and the Educational Science and Technology master's programme of the University of Twente. In addition, the PI will hold a workshop for teachers on good practices.
Finally, the project background and objectives draw from the PI's past research activities:
Papadopoulos, P. M., Obwegeser, N., & Weinberger, A. (2021). Concurrent and retrospective metacognitive judgements as feedback in audience response systems: Impact on performance and self-assessment accuracy. Computers & Education Open, 2, 100046. https://doi.org/10.1016/j.caeo.2021.100046

Papadopoulos, P. M., Obwegeser, N., & Weinberger, A. (2021). Let me explain! The effects of writing and reading short justifications on students' performance, confidence and opinions in audience response systems. Journal of Computer Assisted Learning, 1-11. https://doi.org/10.1111/jcal.12608

Papadopoulos, P. M., Natsis, A., Obwegeser, N., & Weinberger, A. (2019). Enriching feedback in audience response systems: Analysis and implications of objective and subjective metrics on students' performance and attitudes. Journal of Computer Assisted Learning, 35(2), 305-316. https://doi.org/10.1111/jcal.12332
DISCLAIMER
The project has been funded by the Teaching Academy of the University of Twente through the WSV Fund.