We were interested to know whether we could use our Qwizdom audience response handsets for students to input their answers to paper-based multiple choice tests that were given in examination halls. The impetus for this was to have the papers marked both quickly and by machine.
We produced an exam that included a number of different types of multiple choice question to demonstrate that MCQs need not be simple tests of memory. We included:
· Multiple Choice (choose one from four) to test knowledge.
· Multiple Choice (choose one from four) to test comprehension / interpretation.
· Multiple Answer (choose two from five) to test knowledge and judgement.
· Assertion-Reason Question.
· Multiple answers with positive and negative scoring on the responses.
· Text input (short answer – one or two words).
· Interpretive question based on a text with True/False and No inference responses.
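The positive-and-negative scoring mentioned above can be sketched in a few lines. This is an illustrative marking scheme only; the option labels and mark values are our assumptions, not Qwizdom's or AnswerKey's own scoring rules.

```python
# Hypothetical scoring for a "choose two from five" question where correct
# options earn positive marks, incorrect options earn negative marks, and the
# total cannot fall below a floor of zero.

def score_multiple_answer(selected, correct, pos=1, neg=-1, floor=0):
    """Score a multiple-answer response: +pos per correct option chosen,
    neg per incorrect option chosen, never below `floor`."""
    score = 0
    for option in selected:
        score += pos if option in correct else neg
    return max(score, floor)

# Example: correct answers are B and D; a student who chose B and E
# gains one mark and loses one, scoring 0 overall.
print(score_multiple_answer({"B", "E"}, {"B", "D"}))  # → 0
```

A floor of zero stops negative marking from producing sub-zero totals, which is the usual convention in such schemes.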
The first trial involved members of the e-learning team taking the exam and inputting their answers into Qwizdom Q6 handsets. It was fortunate that we did this run-through before inviting academics and students to be part of a larger test, as we learnt a great deal and made numerous revisions to the exam paper and the process.
1. We wanted to run the session on a laptop as there are not always computers in our exam halls. For reasons we never discovered, the software was slow to load, so we decided to reboot the laptop. We then had to wait for downloaded updates to install, resulting in a 20-minute delay before we could start. In a real exam this would have been a disaster. The second time we ran the test we ensured that the laptop was prepared well in advance and carried to the exam hall already switched on and ready to go.
2. We thought the default question numbering on the Q6 screens (e.g. q#001 in the top left corner) was too small, and that students might be confused about which question they were on. Our questions were too long to display on the screens, so we had that option switched off. We therefore redesigned the questions so that the question text was simply “Question #” and displayed that instead. On the handset screens students could then see both the default q#001 in the top left corner and, in a much larger font in the middle of the screen, “Question 1”.
3. Our exam question booklet had instructions on the inside cover explaining how to use the handset. In light of feedback from the first trial, we moved these to the back cover so they could be read before students turned their exam booklets over at the start of the exam. We also rewrote many of the instructions to make them clearer.
4. We realised that some of the AnswerKey questions and answers were wrong despite having been proofread (e.g. the number of answer options on the handset did not correspond to that on the paper). This just shows that you must trial such an exam before giving it to students!
By offering vouchers that could be spent in our on-campus coffee shop, we attracted 30 volunteers to sit the test under exam conditions.
1. We did not have any problems with the laptop this time.
2. At the beginning of the exam we asked students to switch on their handsets (we handed them out switched off to avoid some going into standby mode before the start time) and to join the session with their student number. We went through the instructions about how to use the handsets and told anyone who had problems with their handset during the exam to raise their hand. We decided this was better than trying to write instructions for all contingencies on the exam paper (e.g. what to do if connectivity was lost).
3. No students had significant problems, though two students’ handsets lost connectivity. This was easily restored and did not faze anyone.
4. In a real exam we would have finalised our participants list before the start of the exam, but for various reasons we did not do this on the trial and we added people to the list at the last minute. The process we used was to add each student’s student number, first name and surname to an Excel worksheet, export it as a CSV text file, and import that into AnswerKey. After the exam we realised that some names were missing from the results. The students had taken the exam, but their results had been saved against the default word “participant”. After much head-scratching we discovered that we had imported an old file into AnswerKey. The lesson learnt is that you should always use version numbers in participants-list file names. Or simply use one file!
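The versioning lesson above can be made mechanical by generating the CSV with a date- and version-stamped filename, so an old file cannot silently be imported in place of the current one. The column layout below follows the workflow we describe (student number, first name, surname); AnswerKey’s exact import format is an assumption here.

```python
# Sketch: write a participants list as a version-stamped CSV file.
import csv
from datetime import date

def write_participants(participants, basename="participants", version=1):
    """Write (student_number, first_name, surname) rows to a CSV whose
    filename carries today's date and a version number."""
    filename = f"{basename}_{date.today().isoformat()}_v{version}.csv"
    with open(filename, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["student_number", "first_name", "surname"])
        writer.writerows(participants)
    return filename

name = write_participants([("1234567", "Ada", "Lovelace")])
print(name)
```

Stamping the filename makes it obvious at import time which list is current, which is exactly the check we were missing.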
5. We also noticed another student whose name did not appear even though he had been on the CSV file that was imported. The reason was that the student number we had typed was incorrect: the student had entered his correct student number, which did not match the number on the CSV file.
6. In one way we were lucky that students could join the session without being on the participants list, but this was simply an oversight on our part: we had not restricted access. In a real exam we would restrict it, to ensure that all input from handsets is stored against a number and a name.
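Both of the problems above (a mistyped number on the roster, and results filed under the default “participant”) would surface early if the numbers entered on the handsets were reconciled against the imported list before results are finalised. A minimal sketch, assuming we have both sets of numbers available:

```python
# Sketch: reconcile handset-entered student numbers against the roster.
def reconcile(roster_numbers, entered_numbers):
    """Return (unknown, missing): numbers entered on handsets that are not on
    the roster, and roster entries with no handset response."""
    roster = set(roster_numbers)
    entered = set(entered_numbers)
    return sorted(entered - roster), sorted(roster - entered)

# Example: 1001 and 1002 are on the roster; handsets reported 1002 and 1003.
unknown, missing = reconcile({"1001", "1002"}, {"1002", "1003"})
print(unknown, missing)  # → ['1003'] ['1001']
```

Any entry in either list is a discrepancy worth chasing while the students are still in the room.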
7. The exam progressed smoothly with no incidents, and the answers the students entered could be seen in real time on the laptop. At the end of the exam we were able to extract all the grades very easily.
Feedback from students.
We asked the students for feedback after the exam, and although all of them completed the test without real issues, some made valid points.
1. I found the handsets really easy to use but I got confused when it came to questions 10,11,12,13,14 as the layout was slightly different and I found I kept pressing ‘change’ instead of going onto the next question. Other than that I thought the handsets were a brilliant idea!
[These were questions that required multiple responses to be selected. The correct way of doing this is to navigate to each answer with the up/down arrows, select it with the Enter key, and then press the Send key. In previous questions, where only one answer was required, students did not need to select the answer with the Enter key before pressing the Send key.]
2. It was a bit difficult to start off with- only because it was unfamiliar, but it was very easy to get used to. (By about the third question). Easy to type- it’s just like texting. Nothing much to say really.
3. Both my friend and I thought they worked great. The handsets were very easy to use (similar to my old style Nokia phone) so very user friendly. My connection didn’t fail and I wasn’t confused at any point.
4. I have arthritis in my right thumb which I have not had for very long but is really painful (I am only 48) so I found it slightly difficult as the buttons were quite small and fiddly.
5. I personally found it a little confusing matching up questions on the page and on the clickers. I also felt that it failed the ‘Technology Works When It’s Invisible’ test. I felt I was focusing more on using the clickers than on the answers. I personally feel that online tests may be a better option.
6. Mostly I found it easy to use. However, when the question asked for text to be typed the handset was VERY slow at accepting each typed character. In a real exam situation this would be very stressful. I don’t think it should be used for more than one-word answers or preferably not at all for worded responses. I would say it’s only useful for the multiple-choice answers.
7. I am familiar with Qwizdom handsets, as my employer uses them as part of the experience for children who attend xxx. No issues to report.
Our reflections.
1. We have concerns about some students being confused about which question they are on. This could perhaps be addressed with a stronger verbal message at the start of the exam, i.e. to check the number on the screen before answering. Otherwise the trial seemed to work well.
2. Before advocating that we run proper exams in this way, I would like to see it used more often in less high stakes situations, perhaps progress tests where the results do not count towards the assessment of a module.
3. Where the small size of the handset window is an accessibility issue for some students, it might be possible to supply laptops with the same questions in a different quiz application.
4. Generally it worked fine: all the students completed the exam, no one had to mark the answer sheets, and the results were available immediately.