
Quizzes, question banks and random draws

Image: a white Rubik’s cube.

Hans Knutsson, senior lecturer in business administration at EHL, describes the experiences he and his colleagues have had with online exams and the use of Canvas New Quizzes.

Photo by Honey Yanibel Minaya Cruz on Unsplash 

At the School of Economics, where I work as a senior lecturer in business administration, we have spent the last two years switching platforms to Canvas. We started early, and we were very lucky to do so – the corona pandemic forced us into a digital format practically overnight. What some of us found slow and tricky has now become a well laid-out path for switching, as quickly as possible, from a physical classroom to a digital one, sometimes with hundreds of students at a time. The work has mostly exceeded expectations, and Zoom, easily accessible through Canvas, has become a natural meeting place.

Pedagogically, the most difficult thing has been to conduct exams in a good way. At the School of Economics we have worked with Inspera and Peergrade in parallel with Canvas for a few years, but now some of the staff have discovered Canvas New Quizzes as a good alternative to traditional literature exams. We increasingly find ourselves speaking in terms of question banks, point deductions and “restricted backtracking”. Multiple-choice questions have been an easy form of online examination.

The problem we saw early on with multiple-choice questions in an unsupervised home environment is – as a colleague put it – that it is not just an “open book exam” but an “open everything that exists exam”, and it is wide open to cheating.

Our way of handling this challenge was to create large question banks and to draw questions randomly from them for the students to answer within a limited time. Canvas also has a nice set of functions that let us prevent students from revisiting previous questions, ensure that the questions do not appear in the same order for everyone (even if the random selection of questions addresses most issues in that respect), and let the order of the response alternatives vary between students.
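To make the idea concrete, here is a minimal sketch in Python of the random-draw principle. This is not Canvas code, only an illustration of what happens conceptually when each student gets their own draw from the bank and their own ordering of the response alternatives; the function and field names are invented for the example.

```python
import random

def draw_exam(question_bank, num_questions, seed=None):
    """Conceptual sketch only, not Canvas code: each student receives a
    random subset of questions from the bank, and the order of the
    response alternatives is shuffled per student."""
    rng = random.Random(seed)
    questions = rng.sample(question_bank, num_questions)
    exam = []
    for q in questions:
        alternatives = q["alternatives"][:]   # copy before shuffling
        rng.shuffle(alternatives)
        exam.append({"prompt": q["prompt"], "alternatives": alternatives})
    return exam

# Example: a bank of 100 questions, each student answers a random draw of 20.
bank = [{"prompt": f"Question {i + 1}", "alternatives": ["A", "B", "C", "D"]}
        for i in range(100)]
exam_for_one_student = draw_exam(bank, 20, seed=1)
```

Because each student draws a different subset in a different order, two students sitting the exam at the same time have little to gain from comparing screens.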

Herein lies the solution: if students cannot easily identify all the questions being asked and cannot quickly read the response alternatives in a coordinated and structured way, the potential for collaboration decreases. The time constraint in particular has raised some questions from both colleagues and students, but in my opinion time is an essential factor in assessing how well a student understands a question. The national university entrance exam is also carried out under time pressure, to name a well-known example. So, instead of allowing more time, we have introduced point deductions for incorrect answers, which Canvas lets us do in a very simple way.
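As an illustration of why deductions discourage guessing, here is a small, hypothetical scoring rule in Python. Again, this is not Canvas’s own implementation; the penalty of one third of a question’s points is an assumption chosen so that blind guessing among four alternatives has an expected value of zero.

```python
def score_answer(points, correct, penalty_fraction=1/3):
    """Hypothetical negative-marking rule, not Canvas's scoring code.

    A correct answer gives full points; an incorrect answer deducts a
    fraction of them. With four alternatives and a penalty of one third,
    blind guessing has expected value 0.25*p - 0.75*(p/3) = 0.
    """
    return points if correct else -penalty_fraction * points

# A 3-point question: a correct answer earns 3, an incorrect one costs 1.
print(score_answer(3, True))   # 3
print(score_answer(3, False))  # -1.0
```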

It has been easy, and therefore enjoyable, to work in Canvas, and several of us are now ready to raise the bar further, among other things by starting to use question formats other than multiple choice. The technical possibilities that New Quizzes provides are great; it is our own habitual notions of how to test a student’s knowledge and abilities that will be the biggest obstacle in meeting the challenge of online tests. The teacher’s preconceived notion of what an exam should look like is probably more of a limitation than the technology itself.