Tagged: omr

Creating a form … use lots of segments

A note for me (and the rest of the team): when creating an answer sheet with lots of questions, use lots of segments for clustered questions. The problem is aligning the question number with the answer box … mysterious things seem to happen. So the best solution is lots of segments plus the snap-grid.

OMR – FormReturn & Maths Notation

Thanks Louise … the Maths questions are in. Interestingly, they include lots of maths notation and cover 30-odd questions (11 pages). I’d suggest we print the questions as a question book, with candidates answering on a single sheet of paper. This has a number of advantages, including significantly reducing the amount of scanning and processing needed. It also gets around the problem I’m encountering: how to get maths notation into FormReturn. The import model is based around PDF, but the maths notation is not coming across 😦

I’m not saying it can’t be done; it’s more that it’s not obvious through the interface 🙂

The other advantage of this deployment method, for this use, is that we can hand out the right mark sheets (they’ll have the candidates’ names on them) and then ask everyone to open the question book at the same time, as opposed to having the questions integrated within the marking sheet, where some people would see them earlier than others.


OMR, FormReturn software for the high-stakes candidate interviews

A few notes.

Firstly, it works 🙂 The process is as follows: create the question sheet, which can either pull candidate IDs from a CSV file or be set up so candidates complete their own details. Then print all the question pages out, hand them out, collect them in, scan them using our multi-purpose printers and send them to yourself (as a PDF), open the file up in the software and the reports are there!!! You can then export them as you wish.

My preferred model for the exam in December is:

  • Alison and Jane complete a CSV file listing the candidates’ names (first name, last name and any unique ID they might use) – see the sketch after this list
  • Jane and Louise send me the final questions – I’ll create two tests (literacy and numeracy). These should be Word docs, and can include scoring details, e.g. negative marking
  • Andy will make sure the test papers are created – for simplicity (and given the low numbers) these will be pre-printed with the candidates’ names – I’ll test with Louise, Jane and Alison
  • Jane, Louise and Alison run and invigilate the test, then collect in the sheets, scan them and send them to me (after test one, I’ll shift the address to the generic e-learning address)
  • Andy will process the results, create the report and upload it to a shared Google spreadsheet
  • Jane, Louise and Alison access this page for results
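
To make the first step concrete, here’s a minimal sketch of how that candidates file could be produced (Python, standard library only). The column names – firstname, lastname, candidate_id – are my assumption rather than FormReturn’s required headings, so check them against the source-data import screen:

    import csv

    # Hypothetical candidate records - the real data comes from Alison and Jane.
    # Column names are an assumption; match them to whatever the FormReturn
    # source-data import actually expects.
    candidates = [
        {"firstname": "Jane", "lastname": "Example", "candidate_id": "C001"},
        {"firstname": "Louise", "lastname": "Example", "candidate_id": "C002"},
    ]

    with open("candidates.csv", "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["firstname", "lastname", "candidate_id"]
        )
        writer.writeheader()
        writer.writerows(candidates)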

For the first run, one of the e-learning team will attend 🙂

Lessons I’ve learnt – user-tested by Aaron and David

  • make sure there is space on the page for candidates to write their names
  • give clear instructions – candidates need to completely fill in the bubble
  • try to get all the questions for a test onto a single sheet
  • the process produces lots of files, so a clear naming convention is needed
  • save lots of questions (blocks) in segments – get the author to categorise the questions into blocks
  • the report can be exported as CSV; it contains lots of detail, including the total score for the test (form_score) and a breakdown by question (answer and score) – see the processing sketch after this list
  • if a question is not completed, or is completed but not read by the system, it is left blank and a score of 0 is recorded
  • the form password is the unique number generated at the top of each form; this is mapped to a unique ID, which in the exported spreadsheet can be linked to a name (this info is for my purposes only)
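
To turn that export into the shared results page, something along these lines should work (Python, standard library only). form_score comes from the export as described above; the ID column name (form_id here) and the file names are my assumptions – check them against the header row of a real export:

    import csv

    # Load the ID-to-name mapping from the candidates file
    # (candidate_id is my assumed column name, as in the earlier sketch).
    names = {}
    with open("candidates.csv", newline="") as f:
        for row in csv.DictReader(f):
            names[row["candidate_id"]] = row["firstname"] + " " + row["lastname"]

    # Read the FormReturn export. form_score is in the export; the ID column
    # name ("form_id") is an assumption - check a real export's header row.
    results = []
    with open("formreturn_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            cid = row["form_id"]
            results.append([names.get(cid, cid), row["form_score"]])

    # Write the summary that gets pasted into the shared Google spreadsheet.
    with open("results_summary.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["candidate", "form_score"])
        writer.writerows(results)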

Where next?

  • questions from Louise and Jane
  • I’ll create the tests and user-test them with Jane, Louise and Alison 🙂