This is a further outcome of the discussion in the HELF group around the issue of efficiently managing clickers. The following is from Vicki Simpson at Surrey. I love it, but wow … does it scare me … 2500 clickers.
However, the solution could probably transfer to Bath, in the sense that we can create QR Codes (one per unique device number) with a web interface. We could include the email reminders etc. (all of this is developed from the QR Code Submission Project). I could imagine a similar approach for staff. However, given the few clickers we have (400), I’m not sure where this would add value.
I’ll ask Vicki about utilisation, and about situations where demand > supply.
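To make the “QR Code per unique device number” idea concrete, here’s a minimal sketch of how each handset could get its own scannable URL. The base URL and numbering scheme below are assumptions for illustration, not a real Bath endpoint; a library such as the third-party `qrcode` package could then render each URL as an image.

```python
# Sketch: one unique URL per handset, which a QR library could render
# as an image. BASE_URL is a made-up placeholder, not a real service.
BASE_URL = "https://example.bath.ac.uk/clickers"

def handset_url(device_number: int) -> str:
    """Return the URL to encode in the QR Code for one handset."""
    return f"{BASE_URL}/handset/{device_number:04d}"

# One URL per handset; with our ~400 handsets this is a tiny batch job.
urls = [handset_url(n) for n in range(1, 401)]
print(urls[0])
print(len(urls))
```

Scanning a handset’s code would then land staff or students on that device’s record, where reminders and bookings could hang off the same web interface.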
Overview (for staff and students)
– we have @2500 handsets
– staff member wishing to use handsets completes an online booking form which comes to us
– we check the request against availability and approve/decline
– if approved, staff member instructs students to go to the Library and borrow a handset (loan period of 1 semester)
– handsets are in lockable cases in the Library, which students self-issue
– case can be opened using a decoupler which is in the Library foyer (i.e. once they have passed the security gates)
– student gets Library reminder to return handset (as with a book loan)
– not off the shelf, we sourced the component parts
– small lockable cases imported from Hong Kong
– foam inserts designed in-house and made by foam specialist
– paper sleeves designed in-house and printed/die-cut
– handsets etched using ‘Selectamark’
– tattle security strip and a bar code in each case, just like a book – uses the existing Library software
– temp staff employed for assembly (approx four people/two weeks)
I can put you in touch with Paul Burt in our team if you would like more information about this.
– The semester-long loan period is partly for practical reasons (the Library would be stretched if handsets were being returned each week, for example) and partly to encourage more regular handset use by staff, but the downside is that if a lecturer only uses the handsets once or twice, the handsets are sitting in a student’s bag when they could be used by someone else. However, the efficiencies gained in terms of managing, issuing and tracking the handsets are huge and in our opinion offset these scenarios.
– If a staff member only wants to use the handsets once (e.g. someone trying it for the first time), we have a small set (@100) which they can borrow direct from us for a single session, but they are responsible for handing them out and covering any losses.
– Students don’t always borrow handsets (usually down to forgetfulness/too much effort to walk to the Library!) so they need reminding and nudging to do so. Uptake is also affected by good practice – if a lecturer uses EVS well, students will be more motivated to borrow a handset.
Hope that helps. Worth looking at Paul’s presentation as it has some photos of the cases and the Library facilities.
The following are the outcomes of a really good meeting with Mark Ames around the use of QR Codes for campus inductions. This offers a really good opportunity to maintain the profile of the technology, and to look for value-added applications.
I’ll present at the next Induction Operations Group (7th June?). This will be an authentic activity, similar to the suggested roll-out.
Collect an audio story about the building, i.e. a 3-minute mp3 containing a number of different voices. For demo purposes I’ll create one around the e-learning Team in Wessex House, i.e. what we do, impact, interesting stuff etc.
This will be delivered in a similar way to the e-learning podcast model, i.e. a blog post (containing a link to the audio file) and some blurb, including an explicit prompt encouraging people to leave comments. However, it is not going to be RSS’d, so we could look at using bath.blogs.ac.uk.
An A5 laminated card will be created; this will contain the QR Code linking directly to the audio file, a short URL to the specific blog post about the building (which will include a statement encouraging them to leave their thoughts and comments), and info pointing to tigtags.com/getqr.
We’ll monitor the use of the site through Google Analytics, and see if we can work out how to track the number of click-throughs to the audio file from the QR Code.
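One way to make QR-driven click-throughs visible in Google Analytics is to tag the URL encoded in the QR Code with campaign (utm) parameters, so scans are reported as their own traffic source rather than blending into direct visits. A minimal sketch, assuming made-up blog URLs and campaign names:

```python
from urllib.parse import urlencode

def tagged_url(base_url: str, building: str) -> str:
    """Append Google Analytics campaign (utm) parameters so visits
    arriving via the QR Code on the laminated card are counted
    separately from ordinary web traffic."""
    params = {
        "utm_source": "qr-code",         # where the visit came from
        "utm_medium": "laminated-card",  # the physical carrier
        "utm_campaign": f"induction-{building}",
    }
    return f"{base_url}?{urlencode(params)}"

# Placeholder blog URL for the Wessex House demo post
url = tagged_url("https://blogs.bath.ac.uk/induction/wessex-house",
                 "wessex-house")
print(url)
```

The tagged URL is what gets rendered into the QR Code; the plain short URL on the card could carry a different `utm_medium` so the two routes can be compared.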
If the Induction Operations Group think this is an interesting innovation which should be rolled out, then I’ve agreed the following.
Mark Ames provides the list of buildings (up to 4) and a list of people to talk to. e-learning arrange to record 1-minute vox-pop style discussions, and create a short (5-minute) audio track for each of the buildings. We then create the blog, supporting material and the A5 laminated sheets.
I’ll suggest to Vic / James they can lead on the recording etc., and use a very similar methodology (template) to our e-learning podcast.
Some questions which spring to mind are …
- what are the aims of the University’s central e-learning team? Andy to answer
- what do you think is a major achievement by the e-learning team this year? Andy & Nitin to answer … link to OPuS / Blog evidence
- how does e-learning work in the departments? Geraldine, Rachel and Rania to answer
- as a student, how will I work with the e-learning team? Andy to answer
Just demo’d the metric scanner we are going to use for the QR Code submissions. The feedback was very positive, and they are going to buy one 🙂
Points to note for James
- user requirements – display details, name, unit id etc., but they don’t want any extra steps … so scan and auto-submit.
- I said it should update SAMIS straight away, so they can check using Business Objects, and the student will get emailed. Is that the case?
- I said using it would involve logging into a web page (authentication via university username and password), then scanning. I’ll provide some supporting documentation.
- ACE (and Mech Eng) have a submission near the end of April – previous problems with the iPhone, so can they use this? They are happy to log in to Moodle as a stop-gap to access the form … note: I’d like it hosted at http://www.bath.ac.uk/barcodes/submission/, using LDAP and the Grouper software (I assume)
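The points above can be sketched end to end: the scanner effectively “types” the barcode contents into a focused form field, so the only server-side work before auto-submitting is parsing that payload into the details shown to the user. The payload layout (`studentID|unitCode`) is an assumption for illustration, not the real SAMIS barcode format:

```python
# Sketch of the "scan and auto-submit" step. The payload layout
# (studentID|unitCode) is a made-up example, not the SAMIS format.
from datetime import datetime

def parse_scan(payload: str) -> dict:
    """Split a scanned barcode into the fields displayed to the user
    (name lookup etc. would happen against the student record)."""
    student_id, unit_code = payload.split("|")
    return {
        "student_id": student_id,
        "unit_code": unit_code,
        # Timestamp recorded at scan time, so staff can check it later
        "received": datetime.now().isoformat(timespec="seconds"),
    }

record = parse_scan("s1234567|ME20014")
print(record["student_id"], record["unit_code"])
```

In the real system this record would be written through to SAMIS and trigger the confirmation email; the sketch only covers the parsing step.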
I’ve been exploring the use of the wikitude augmented reality browser on the iPhone. The scenario would be to develop a set of POIs around a building which include learning materials provided by staff, and student-generated material. The first task was to better understand how the student might use the software. I was particularly interested in the screen settings. It appears wikitude is designed to be as inclusive as possible, which is great; however, it does lead to a slight problem when designing learning activities: you need to turn off some of the information sources. In addition, support information will need to be provided about reducing the noise from POIs not included in the learning activity, for instance by reducing the search radius to as small a distance as possible.
Well, I took a little step into a big world and tried wikitude.me, and wikitude on the “team” iPhone.
All very straightforward. I logged into wikitude.me using Twitter credentials and made a point of interest using the embedded Google Earth screen. The POI was some text about the office, and a web link.
This morning, I fired up the iPhone, pointed it at the office and it was there … read the text, followed the link.
So to get started the barriers to entry are very very low.
It does create lots of noise around my POI as it draws from lots of other sources (not just wikitude.me). For an induction tour of campus, or a specific learning activity, I’ll need to close down the other information feeds and sort out how to make an individual journey (it looks like this means using a KML file).
Another point worth further investigation is the use of GPS-trk on the iPhone. This should allow students to create a KML file, with all the lovely pictures etc., so enabling student-generated content 🙂
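For a feel of what the KML behind such a journey looks like, here’s a minimal sketch that builds a single-placemark KML document with the standard library. The office name, blurb and coordinates are made up for illustration; a real wikitude.me layer or GPS-trk export would carry one placemark per stop on the tour:

```python
import xml.etree.ElementTree as ET

def make_poi_kml(name: str, description: str, lon: float, lat: float) -> str:
    """Build a one-Placemark KML document for a point of interest."""
    kml = ET.Element("kml", xmlns="http://www.opengis.net/kml/2.2")
    doc = ET.SubElement(kml, "Document")
    placemark = ET.SubElement(doc, "Placemark")
    ET.SubElement(placemark, "name").text = name
    ET.SubElement(placemark, "description").text = description
    point = ET.SubElement(placemark, "Point")
    # KML lists longitude first, then latitude (note the order)
    ET.SubElement(point, "coordinates").text = f"{lon},{lat}"
    return ET.tostring(kml, encoding="unicode")

# Made-up coordinates, roughly central Bath, for illustration only
poi = make_poi_kml("e-learning office",
                   "What we do, impact, interesting stuff",
                   -2.3295, 51.3781)
print(poi)
```

Because the same KML opens in Google Earth, students could check their tour on the desktop before viewing it through the AR browser.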
More to come ….
Thanks to Vic Jenkins and Phil Wilson … I’ve been starting to explore the use of wikitude.me to create an augmented reality experience around the University of Bath. This is interesting as it allows you to develop the augmented reality layer using a KML file … so Google Earth, here we go ….
nice intro to augmented reality >>> http://go.bath.ac.uk/ji9w
I spent a little more time on the Moodle4iPhone course on my iTouch.
The first point is in terms of navigation: I think I started to get it. The zoom in / zoom out worked well, and I started to develop effective navigation strategies on the device. So you might be able to teach old dogs new tricks 🙂
They’d made some changes to the choices. These work better; however, I still think my advice to academics who wish to use the tool would be to consider what you want to achieve and try to design your activity appropriately. I feel the fewer options, the better 🙂
I spent time on the glossary tasks. The main stumbling block came when adding my own term: with the form open, the course navigation appearing at the top, and the onscreen keyboard active, it didn’t work. On my device the text input was masked by the top navigation layer, so I couldn’t see what I was writing. This meant I had to keep flicking between different views to get it to work.