This is a summary of the discussion with Jane, Alison et al. regarding the use of an objective assessment for interview candidates. Note that the test/assessment runs in the morning and the interview panels are in the afternoon.
Originally, they were going to run the assessment in WOLSEY (bbd), in a computer lab, then print the results from the gradebook and combine these with the thoughts and comments of the interview panel to inform their decisions.
However, on reflection (following a risk assessment) it was suggested and agreed that they would not use WOLSEY for this, but an OMR (optical mark recognition) solution instead. The rationale being:
- difficult to find a computer room near the interview rooms
- need to invest a significant amount of time in testing the IT room (PCs) and setting it up with secure browsers
- need to invest a significant amount of time with other teams to ensure it works, and provide more staff to invigilate and support on the day
- need to sort out usernames and passwords for the candidates
- can’t guarantee the system’s performance will be acceptable on the day with our current set-up
- don’t need the results instantaneously for staff, and students wouldn’t see their results (scores) or feedback
- only using an MCQ format, and don’t want to change to any other question type
- the candidate orientation activity (i.e. “you’ll do questions like this”) involves looking at freely available questions from the Open University
- only one chance with the candidates: they can’t be called back in if the system fails. The fallback, if the system did fail, was to have them complete the test on paper. Given it is an interview, and given the large number of points of failure (because of the number of teams involved), is the risk worth taking?
- also, given the large number of people involved, is it cost-effective to use a CAA (computer-assisted assessment) approach compared to a less technical solution?
They do want:
- the results within a few hours
- to be able to analyse (evaluate) the appropriateness and degree of difficulty of the questions for the candidates
- they do want to manage question banks and test design
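The difficulty analysis mentioned above is usually expressed as a facility index per question: the proportion of candidates who answered it correctly. A minimal sketch follows; the response data, question IDs, and answer key here are hypothetical, not drawn from the WOLSEY gradebook.

```python
def facility_index(responses, answer_key):
    """For each question, return the proportion of candidates who
    answered correctly (0.0 = nobody, 1.0 = everybody)."""
    counts = {q: 0 for q in answer_key}
    for candidate in responses:
        for q, correct in answer_key.items():
            if candidate.get(q) == correct:
                counts[q] += 1
    n = len(responses)
    return {q: counts[q] / n for q in answer_key}

# Hypothetical MCQ answer key and responses for three candidates
answer_key = {"Q1": "B", "Q2": "D"}
responses = [
    {"Q1": "B", "Q2": "D"},
    {"Q1": "B", "Q2": "A"},
    {"Q1": "C", "Q2": "D"},
]
print(facility_index(responses, answer_key))
```

A very low facility index flags a question that may be too hard (or mis-keyed); a value near 1.0 suggests it discriminates poorly between candidates.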
So, WOLSEY will be used to manage the questions and to manage the gradebook/reporting. The test will be created and printed from WOLSEY, and we’ll use a generic answer sheet from the OMR system; this makes it very easy to author both aspects. The candidates are given the test; once they complete it, a member of the e-Learning Team collects the answer sheets (for the pilot phase only) and scans them using the nearest available UCS photocopier/scanner. The e-Learning Team member will import the results into the WOLSEY gradebook. The test papers and the results held in the FormReturn software are then destroyed; WOLSEY becomes the single point of truth. The discipline team access WOLSEY to print and run reports against the gradebook.
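The import step above, taking the OMR results into the gradebook, could look something like the following sketch. The CSV layout (a candidate ID plus a 1/0 mark per question) and the gradebook row format are assumptions for illustration; the real FormReturn export and WOLSEY import formats will differ.

```python
import csv
import io

# Hypothetical OMR export: one row per candidate, one column per question,
# with 1 = correct, 0 = incorrect. Field names are assumptions.
omr_export = """candidate_id,Q1,Q2,Q3
C001,1,0,1
C002,1,1,1
"""

def to_gradebook_rows(omr_csv):
    """Sum the per-question marks into one total score per candidate,
    producing rows ready for a gradebook import (format assumed)."""
    rows = []
    for rec in csv.DictReader(io.StringIO(omr_csv)):
        cid = rec.pop("candidate_id")
        total = sum(int(mark) for mark in rec.values())
        rows.append({"candidate_id": cid, "score": total})
    return rows

print(to_gradebook_rows(omr_export))
```

Whatever the real formats turn out to be, keeping this step as a single scripted conversion (rather than manual re-keying) limits transcription errors between the OMR software and the gradebook.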
Therefore, the proposed solution is to use an OMR system: the answer sheets are scanned using standard UCS photocopier/printer/scanner machines, the scans are emailed to an inbox, and the images are then uploaded to the OMR software.
The advantages of this are: the test can be run in any room; it is very reliable; it can be hand-marked very quickly if required; and there are no perceived technical barriers for anyone concerned, so the likelihood of failure is much lower than with CAA.
We’ll be trying http://www.formreturn.com/