Way back before the summer we started working on a version of Exeter’s paper-based digital literacy self-awareness activity called iTest. If you’re not familiar with iTest, it’s a quiz that asks users to answer questions about how they use technology in their academic life and then gives them scores in six broad areas and advice about how to increase their digital literacy in these areas. For more information please see Exeter’s Collaborate Blog.
How it started.
I first saw iTest at a Changing the Learning Landscape event at Exeter back in March (link to previous blog posting) and thought it looked very useful – something I would like to have when we do new student induction sessions at Newman. However, I thought it would be cool to have it available on the students’ phones. After that CLL event there was the opportunity to apply for a small amount of funding (£500) to fund a “mini project” based on the CLL event. I spoke to two academics in our IT department who said they’d be interested in developing such an app and we were lucky enough to win an award.
The development of the app has been interesting for many reasons. There were quite a few technical challenges to overcome, but my involvement has been on the front end, which is what I will describe here.
What is an app?
First, I should say that early on in this project we debated at length what the word ‘app’ meant. For me it meant something you downloaded onto your phone, but for my IT colleagues it had another meaning. The result is that we have a web application that can be viewed in any web browser; to my mind it looks great on a phone but not so good on a laptop. The app is also connected to a database, which gives it certain functionality we wanted and could make it more powerful with further development.
The functionality we wanted was that the institution using the app would be able to change the category names (Digital Dodger, Media Mogul, etc.) as well as the questions and the feedback. This data is stored in the database, and an admin interface allows the changes to be made via a web browser.
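We haven't published the database layout, but a minimal sketch of the idea (editable categories, questions and feedback held in a database, with renames done as simple updates) might look like this in SQLite; the table and column names here are my own assumptions, not the app's actual schema:

```python
import sqlite3

# Hypothetical, simplified schema for illustration only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE categories (
    id   INTEGER PRIMARY KEY,          -- six rows, one per category
    name TEXT NOT NULL                 -- e.g. 'Digital Dodger', editable by the admin
);
CREATE TABLE questions (
    id          INTEGER PRIMARY KEY,   -- 30 rows, five per category
    category_id INTEGER NOT NULL REFERENCES categories(id),
    text        TEXT NOT NULL
);
CREATE TABLE feedback (
    category_id INTEGER NOT NULL REFERENCES categories(id),
    band        TEXT NOT NULL,         -- 'low', 'medium' or 'high'
    html        TEXT NOT NULL,         -- feedback text is stored as HTML
    PRIMARY KEY (category_id, band)
);
""")

# Renaming a category is then a single update from the admin interface:
conn.execute("INSERT INTO categories (name) VALUES (?)", ("Digital Dodger",))
conn.execute("UPDATE categories SET name = ? WHERE name = ?",
             ("Tech Avoider", "Digital Dodger"))
conn.commit()
```

Because the quiz reads everything from these tables at run time, an institution can rebrand the categories or rewrite the questions without touching the application code.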
The admin interface.
This consists of four screens allowing the admin to edit the welcome text, the category names, the questions and the feedback pages.
Originally we had thought to save the students’ responses into the database so that they could feed into their Personal Learning Profiles, but we opted instead to keep things simple and not ask the user to create a username and password, as this might have put some students off. We might revisit this for next semester.
One of the technical challenges that the developers faced, and which I think they enjoyed, was randomizing the 30 iTest questions while still extracting scores in six categories. This desire to stay true to the Exeter version by mixing the categories up meant that we could not use any existing quiz generator we knew about. (There’s probably one out there, so if you know of one, please let us know.) At one point in the development we considered abandoning this and asking the student to choose a category. That would have allowed far more flexibility in writing the questions and feedback, but in the end we thought we should stay as close to the original as we could.
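To illustrate the mechanism (the real code isn't shown here, so the data structures and question texts below are invented): if each question is tagged with its category before shuffling, the per-category totals can be recovered however the questions were ordered, with each answer scoring 1, 2 or 3 as in the paper version.

```python
import random
from collections import defaultdict

CATEGORIES = ["Digital Dodger", "Career Builder", "Digital Guru",
              "Media Mogul", "Information Junkie", "Online Networker"]

# Five questions per category (placeholder texts), 30 in all.
questions = [{"category": c, "text": f"{c} question {i + 1}"}
             for c in CATEGORIES for i in range(5)]

def run_quiz(questions, answer_fn):
    """Present the questions in random order and total the scores per category."""
    order = random.sample(questions, len(questions))   # shuffle without repeats
    totals = defaultdict(int)
    for q in order:
        totals[q["category"]] += answer_fn(q)          # answer_fn returns 1, 2 or 3
    return dict(totals)

# Example: a user who always picks the middle answer scores 2 x 5 = 10 per category.
scores = run_quiz(questions, lambda q: 2)
```

The shuffling only changes presentation order; because the category tag travels with each question, the six scores come out the same regardless of the order the user saw.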
One small leaflet into one small phone doesn’t go.
At the outset it was our intention simply to create an interactive digital version of Exeter’s leaflet; however, shoehorning the contents of the 12-page A5 leaflet into a tiny smartphone screen was a challenge. For example, the paper version had an introductory text which we thought was too long for the first screen on a phone.
The paper version also had 30 statements rather than questions,
e.g. I make use of digital technology that few other students are using.
with the answers:
- Not true of me
- Sometimes true of me
- Definitely true of me
In the interests of brevity, and so that we could refer to them as questions in our instructions (as this seemed more natural), we changed the statements to questions, e.g. How confident are you in being able to find and subscribe to a podcast connected to your studies?
and used the answers:
- Not very.
- So so.
The design of the app necessitated that all answers be the same for every question (as they were in the paper version), but this made writing the questions quite difficult, especially those in the Digital Dodger category, and is one drawback of our design.
When I first looked at the paper-based iTest I thought it would transfer from Exeter to Newman without a problem, but as I read the statements carefully and tried to convert them into questions I realised some did not work for us.
Here are two examples:
- My field of study is heavily influenced by the technologies and/or media we use.
- My studies would be impossible without digital technologies.
I think the idea was to profile students who had chosen disciplines heavily dependent upon technology, like computer science or probably most science subjects. At Newman we do not offer such subjects, so these questions became irrelevant for us.
The six categories.
The Exeter version presented six types of user:
- Digital Dodger
- Career Builder
- Digital Guru
- Media Mogul
- Information Junkie
- Online Networker
The Digital Dodger was described as a student who is reluctant to integrate technology into their academic life. However, the feedback given in this category did not imply that this was necessarily negative: “You have your own reasons for not engaging with technology in your studies and it does not necessarily impact negatively on your academic work.” We agreed with the sentiment but decided to make it a negative. In the app we did not have much space to present lots of information (I know we could have presented pages of scrollable information, but we thought no one would read it), so it became easier to make the Digital Dodger something to be avoided.
This in itself presented problems in writing questions with the same answers as the other five categories, especially as the design of the app meant that they had to score the same way, i.e. the top answer had to score 1, the middle answer 2 and the last answer 3 (as per the original version).
Another reason for making Digital Dodger completely different is that in our induction sessions this year we overtly discussed digital literacies with our new students. We talked about five different literacies and outlined how we would support students in these areas and we discussed preparing for the workplace in terms of acquiring digital skills and using digital technologies to present yourself professionally. The categories we use in induction are:
- IT Literacy
- Information Literacy
- Media Literacy
- Communication & Collaboration
- Learning Skills
So it seemed obvious to map these areas onto the Exeter categories:
- IT Literacy – Digital Guru
- Information Literacy – Information Junkie
- Media Literacy – Media Mogul
- Communication & Collaboration – Online Networker
- Learning Skills (incorporated into the others)
- Preparing for the workplace – Career Builder
I have already mentioned that we had to alter some questions because they did not fit our students. When we decided to match the categories with the digital literacy categories we intended to talk about at induction, we also decided to change the questions so that they related more directly to our induction sessions.
One of the benefits of the admin app is that you can easily change the questions and this is what we did. We more or less started again. Some of the ideas behind the original questions remain but basically there are 30 new questions.
The Feedback pages.
We also had to modify the feedback pages. The original version had a three-column page, with a column for each score – low, medium and high. In each column it detailed the implications of the score for the student, said what was important for the student and presented some advice about how to increase the score (or decrease it for Digital Dodgers).
This we thought was too much to be displayed in what was to be a light-hearted self-awareness app on a phone.
Also, although we understood that self-awareness might include understanding who you are, we decided to take out the sections called “important to your experience as a student”. We figured the students would know this instinctively and needn’t be told. We also decided to remove the descriptions that said the score “indicates that you…”. Again, we thought this might be self-evident. So we stripped the feedback back to the bare minimum: “An X is… To improve your score you might do Y.” (The paper version had a page describing the six types of user, and we moved these descriptions onto each of the six feedback pages.)
We also bullet pointed things the student could do to improve their scores in each area to try to make it less wordy and hopefully more useful.
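To give a concrete sense of the low/medium/high banding: with five questions per category, each scoring 1–3, a category total falls between 5 and 15, and a small function can pick which feedback text to show. The thresholds below are my own invention for illustration, not the ones the app actually uses:

```python
def feedback_band(total: int) -> str:
    """Map a category total (5-15) to a feedback band.

    Thresholds are assumed, splitting the 5-15 range into rough thirds.
    """
    if total <= 8:
        return "low"
    elif total <= 11:
        return "medium"
    return "high"

band = feedback_band(10)   # a middle-of-the-road scorer gets "medium"
```

Whatever the exact cut-offs, the band then keys the lookup of the stored feedback text for that category.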
The text that appears on the feedback pages is stored in the database and is entered there as HTML. At one stage we had the feedback pages external to the app, but that presented two problems: it wasn’t easy to navigate back, and the institution running the app would also have to create and host separate feedback pages. So we returned to the database model.
Links to live iTests.
This link will take you to the Newman version of iTest. As already explained it works best on a phone.
Here’s a link to the original version. We don’t imagine anyone would run with this, but we provide it to show what a more direct translation looks like: iTest – Original version
The good thing about the app is that it is easily customisable.
Where we are now.
Because we strayed so far from the paper-based version, we have produced two versions: a ‘Newman’ version and an ‘Original’ version. We have the code to give to anyone else who wants to run it on their own server, although as yet we do not have the instructions ready to explain how to host and modify it yourself. Coming soon!
We did promise to host this for anyone to use, so we will also create a generic version; this will not be difficult, as we just need to remove the references to Newman services from the feedback pages.
Students have used the app but we have not collected formal feedback. We are using it amongst a raft of initiatives to raise awareness of digital literacies.
How it could be further developed.
My ideal would be to have feedback pages that were easy to edit and format – ideally wiki pages, so that students themselves could provide pointers on how to enhance these digital literacies. Maybe a student project?
It would also be good if we could link it to a VLE so that users were automatically authenticated so we could monitor usage and record results.
Although this is not my area of expertise, I would also like it to be more responsive to the device it is used on. It looks good on a phone but not so good on widescreen monitors, and because it is viewed through a browser it is subject to browser settings and idiosyncrasies. I might see what can be produced with Xerte using the questions and feedback we have written.
Thanks to Exeter.
We’ve learnt a lot from working on iTest and thank the team at Exeter – Stuart Redhead, Matthew Newcombe and Elisabeth Dunne – for putting it out under a Creative Commons Licence.
If you have any comments on this, please leave them on the blog or email me directly at bob.ridge-stearn[at]newman.ac.uk
A Technical Perspective.
I’ll try to get one of my colleagues to provide some sort of technical perspective to all this.