Category: Projects

JISC call – Tech transfer around assessment (strand C)

The JISC call on Assessment and Feedback is currently out >>

We (UCS) are particularly interested in Strand C – the technology transfer component. The technology we’d like to transfer is a location aware assessment engine. This uses QR Codes (with short URLs) as a means of access, together with a multiple choice quiz engine.

We are also interested in finding other institutions who have a specific (and existing) need for this technology. If you are interested in joining this proposal, please email me …

The tool has been tested; it needs a few tweaks, and we are developing the authoring and reporting tool during August and September. For more information, see

A potential scenario of use is as follows:


Pre-formative assessment

  • A lecturer develops a set of questions around a physical space; each question is associated with a physical object, and includes the question stem and four possible responses.
  • Students are arranged in teams; they register a team name and complete a practice test to make sure they can complete the questions on their mobile devices.
  • During the lecture the lecturer explains the test, maps it to the curriculum, etc., and sets the ground rules.

The formative assessment

  • In their groups the students complete the test in the different locations, on their mobile devices. They complete each question, and submit the responses.

Post formative assessment

  • The lecturer looks at the results, looking for patterns in performance, for instance responses by question.
  • During the next lecture the leader board is displayed, particular groups are selected to explain their responses to certain questions, and generic feedback is provided.
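As an illustration only (this is not the engine’s actual code), the scoring and leader board step in the scenario above could be sketched like this — the team names, questions, answer key, and one-point-per-correct-answer rule are all made-up assumptions:

```python
from collections import defaultdict

# Hypothetical answer key and team submissions: (team, question, response).
answer_key = {"q1": "B", "q2": "D", "q3": "A"}

submissions = [
    ("Team Red", "q1", "B"),
    ("Team Red", "q2", "C"),
    ("Team Blue", "q1", "B"),
    ("Team Blue", "q2", "D"),
]

def leaderboard(submissions, answer_key):
    """Tally one point per correct response and rank teams, highest first."""
    scores = defaultdict(int)
    for team, question, response in submissions:
        if answer_key.get(question) == response:
            scores[team] += 1
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)

print(leaderboard(submissions, answer_key))
# → [('Team Blue', 2), ('Team Red', 1)]
```

The sorted list is exactly what the lecturer would need to pick out groups to explain their responses, or to spot questions everyone got wrong.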


Why location aware?

The location aspect is very exciting as it allows us to develop a number of questions for the student to complete in situ, when they can see, feel and touch the stimulus material.

Why QR Codes?

The QR Code based approach allows quick and easy access to a web based resource, which reduces the technical barriers for students and enhances the efficiency of completing the task on the mobile device.
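To make the idea concrete: each QR code just encodes a short URL, and the server maps that short path onto the full question page. A minimal sketch of that lookup table, with made-up paths and URLs (not the engine’s real routing), might be:

```python
# Hypothetical short-URL table: each QR code encodes a short path,
# which the server resolves to the full question page.
SHORT_URLS = {
    "/q/a1": "https://example.ac.uk/quiz/room-a/question-1",
    "/q/a2": "https://example.ac.uk/quiz/room-a/question-2",
}

def resolve(short_path):
    """Return the full question URL for a scanned short path, or None if unknown."""
    return SHORT_URLS.get(short_path)

print(resolve("/q/a1"))
# → https://example.ac.uk/quiz/room-a/question-1
```

Keeping the encoded string short matters because shorter strings produce simpler QR codes, which scan more reliably on a phone camera.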

Could this be solely a mobile assessment?

Yes, the scenario above is based on the idea that the lecturer would blend the activity with group feedback through a face to face learning activity. However, the system would be able to provide formative feedback (generic and individual) for each question.

How might the transfer work?

A proposed model would be;

  • October 2011: familiarise people with the system, ensure it meets their needs, identify what support they’ll need for the transfer (both technical and staff / student support), set up advisory group meetings for all partner institutions to attend.
  • November 2011: based on feedback on specific needs complete the software
  • December 2011: roll out to the institutions, and share dates when the tool will be used
  • January – April 2012: use in partner institutions, with support and feedback on use. This will include a development sprint in March to make any tweaks to the software
  • May 2012 – June 2012: Write reports
  • July 2012 – complete the packaging of the open source solution, with documentation and reports
  • July 2012 – October 2012 – dissemination


Screen capture on the iPad: Providing audio and video feedback via Show Me

Aaron has been looking at iPad apps for common learning tasks, and I think he’s identified a really useful tool.

I’d suggest this offers really exciting opportunities for providing feedback on student work, or walk-throughs of solutions, etc., without the need to go via a desktop computer. The basic principle is it records you annotating a whiteboard (using a stylus or finger, and your voice), then produces a video file which is uploaded to the ShowMe web site. You can import pictures into the whiteboard space if you require.

The Show Me web space is a little YouTube-esque, so you can have an unlisted space where people can’t search and find your videos. This means you can very quickly create and upload feedback, solutions, etc., for sharing from your iPad.

Why do I love it, you ask? It is a very low threshold technology … having encouraged staff to adopt similar approaches via tablet technology and desktops, we always hit a barrier with respect to the technology. This, however, is so easy 🙂

However, let us not get carried away; the main drawback at the moment is that it’s difficult to manage more than one image at a time. Hence, I’d have questions around scalability, and there are still lots of issues about managing access to sensitive feedback files. So again, you’ll need to think through the workflow quite carefully. However, as a proof of concept it is really cool.

project: OERs & UCS

Post conversations with MS and CD, the following is being suggested to explore the potential of OERs at UCS.

Project Aim

  • to explore a sustainable and scalable model for the roll out of OERs at UCS. In particular;
    • getting started guide for the lecturer: What is an OER? How do you discover them and how might you use them in your teaching? (work package 1)
    • how do other UK HEIs create and manage OERs? What lessons can UCS draw from other institutions’ experience? (work package 2)

Phase 1: Complete July 2011

The Elevate Team will;

  • write a short paper which sets the context of OERs, and the aims and timeframe of this investigation to be presented to the Executive Committee: start of April
  • getting started guide: start of May
  • how do other UK HEIs create and manage OERs? What lessons can UCS draw? end of July

OMR – FormReturn – Iteration 3

Authoring was really smooth this time 🙂

The Elevate Team responsibilities are as follows;

  • create the answer sheet as a PDF and email to the discipline expert

This time, no changes to the question booklet, so we can run again. Discipline expert supplies the users (applicants in this case). Therefore, the process for us is …

Step 1 >> create the applicants_list_month_year_csv and store in Dropbox

Step 2 >> in Form Return, open the generic segments for both the Maths and Literacy Test (stored in the dropbox folder)

Step 3 >> make a new form, and include the segment. Add a barcode and the text

Step 4 >> add source data, a new table (follow emerging naming convention), and import the applicants_list_month_year_csv  from Step 1

Step 5 >> publish and test (print some, complete, scan, upload data and download) – post test, delete the data

Step 6 >> email the answer sheet PDFs to the discipline expert
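Step 1 above (building the applicants list CSV for FormReturn’s source-data import) could be sketched in Python like this — the column names, applicant data, and output filename are illustrative assumptions, not the real convention:

```python
import csv

# Hypothetical applicant records; field names are illustrative only.
applicants = [
    {"application_number": "100234", "surname": "Smith", "forename": "Jo"},
    {"application_number": "100235", "surname": "Jones", "forename": "Sam"},
]

def write_applicants_csv(path, rows):
    """Write the applicants list as a CSV with a header row,
    in the shape a source-data import would expect."""
    with open(path, "w", newline="") as handle:
        writer = csv.DictWriter(
            handle, fieldnames=["application_number", "surname", "forename"]
        )
        writer.writeheader()
        writer.writerows(rows)

write_applicants_csv("applicants_list_month_year.csv", applicants)
```

The file then just gets dropped into the Dropbox folder and imported as a new table in Step 4.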

All done in 30 mins 🙂

Outcomes from UCS FAQ Engine Meeting

The following notes are the outcomes / comments from the UCS FAQ Engine meeting … I’ll incorporate these into the project plan etc.,


  • Tariq & Karen >> send list to Andy of people you want to attend the first workshop – by 17th Feb
  • Karl >> will confirm the install on a Virtual Box by end of Feb
  • Laura >> think about the URL. Karl will set up as – will use alias if need be
  • Andy >> once software installed will look at the reporting tools
  • David >> look at using our Google analytics
  • Andy >> move css / branding into March
  • Andy >> expand roles and responsibilities to include right / access management for authors – so can differentiate and include sign off for certain teams
  • Laura >> will recruit Chantelle into the project
  • Andy >> include an activity in the March workshop on design tips for FAQs, and how to use them with other information flows (emails)
  • Andy >> arrange workshop, for early March
  • Andy >> sort out project document store

VLE Review: How are we currently using the VLE in Teaching and Learning?

As part of our emerging VLE review, I need to answer the following question;

How are staff currently using the VLE in their teaching and learning?

To answer this question, I propose the following methodology;

  • sample a number of courses and classify course use into broad types  (I’ll use the IoE pedagogical template model – see below)
  • highlight any particularly innovative approaches
  • run a number of staff focus groups to ask the question, how are you using Blackboard, and what do you want from a VLE in your teaching?

This will also feed into wider discussions around whether our current VLE is fit for our purposes for the next few years.

For more info on the IoE templates >>

The drivers for this work include;

  • I’m new to the institution, so I’d like an answer to this question
  • we need to quantify the answer so we can develop appropriate staff development programmes and communities of practice
  • the license costs of Blackboard

The IoE templates develop the work of Mason and others. They are divided into 7 types;

Blended (includes some component of Face to Face teaching)

  • online admin support
  • follow up
  • parallel
  • F2F events

Distance / Online (purely distance learning, no face to face component)

  • distance online support
  • online resource based
  • online discussion based

The user scenarios are as follows

Online Admin Support

This uses the VLE course as a repository for electronic information, as the main emphasis of the teaching and learning takes place when staff and students meet face to face. The online material is likely to include: administrative information (such as course announcements, contact information, and calendar dates), the course handbook, readings, teaching material (presentations), and submission of assignments (formative and summative).

In terms of the quality of the learning experience, it can be suggested that if a student did not access the course but attended all face to face events, their learning experience would not be significantly affected. This is because the learning activities, and assessment and feedback, are undertaken in the face to face environment.

Researcher’s Notes >> after going through 50 courses, a key question I asked was “could they have included the material (docs etc.,) in the handbook as a printed version?” If yes, I’d categorise it as Online Admin Support – hence tasks which just state a timetable or action (read before next session) are within this category

Follow Up

The follow up course differs from the Online Admin Support course as it involves the student undertaking learning activities within the VLE which are not available in face to face teaching. For instance, there might be a number of learning activities (summative and formative quizzes, discussion board activities) which students need to undertake between the face to face events. These will be explicitly used by the lecturer in the face to face teaching and be part of the feedback loop.

Another key difference is that some of the online material is not available face to face, for instance, the inclusion of examples of previous students’ work, a glossary of terms, and more sophisticated use of links to online resources. In addition, an indicator would include the use of blogs and wikis within learning activities.

In terms of the quality of the learning experience, it can be suggested students who do not access the online material will be at a disadvantage (have a poorer learning experience) compared to those who do.

NOTE >> an aspiration for the Elevate Team should be to encourage staff to shift towards the “Follow up” course and away from the “Online Admin Support” course


Parallel

The parallel course is a variation on the follow up course. It is where the course leader runs their learning activities in parallel between the online space and the face to face component. The scenario would be where a number of learning activities are completed entirely online. It is not assumed the two learning spaces need to connect; in other words, the tutor will not carry the outcomes from the online activities into the next face to face meeting. This is a key difference when compared to the Follow Up course.

In terms of the quality of the learning experience, it can be suggested students who do not access the online material will be at a disadvantage (have a poorer learning experience) compared to those who do.

Researcher’s Notes >> try to unpick whether the activity closure point is undertaken online or in face to face teaching. If closed online (i.e. automated quiz feedback) and there is no obvious reference that it will be discussed in class, then classify as Parallel; otherwise, maybe Follow Up or F-2-F Events

Face to Face Events

This course type is another variation on the follow up course. Here the majority of learning activities are undertaken in the online space, and the face to face events are used more to bring closure and feedback to the online activities, or to prepare the cohort for the next set of online activities.

This type of classification would include the “block teaching” model.

In terms of the quality of the learning experience, it can be suggested students must participate online if they are to learn.

NOTE >> an aspiration for the Elevate Team should be to shift the provision of our staff development programme to this type of course. Also, we should challenge staff to think about designing their learning activities for this approach.

We can organise these blended courses as;

  1. Blended (includes face to face teaching)
    1. Online Admin Support
    2. Online Learning
      1. Follow up
      2. Parallel
      3. Face to Face Events

NOTE >> an aspiration for the Elevate Team should be to encourage and develop staff to shift their model of use along the blended learning spectrum from Admin Support to Face to Face events.

The distance courses are easier to classify, and there are fewer of them within a UCS context. The key differentiator is where the learning activities sit: in the printed pack / CD or in the VLE course. Otherwise, the classification follows a very similar pattern to the blended courses.

Distance online support

This course focuses on providing administration and support to the student, as the learning activities are outlined in the course pack or CD.

In terms of the quality of the learning experience, it can be suggested that if the student did not access the course and only accessed the support elements, their learning experience would not be significantly affected. This is because the learning activities, and assessment and feedback, are undertaken via the course pack.

The shift of the learning activities from the course pack to taking advantage of the course tools moves the classification to online learning. The course classification can be subdivided into those which use predominantly discussion board based learning activities (online discussion based) or e-learning tutorials / content (online resource based).

We can organise these online courses as;

  1. Distance (no face to face teaching)
    1. Distance Online Support
    2. Distance Online Learning
      1. Online resource based
      2. Online discussion based

Evaluation Framework

So, how am I going to answer the question about staff use? I’ll sample a hundred courses or so, then make a subjective call (based on the info above) about classifying each into a type of use, ticking the appropriate box in the data collection form. This sample will be across LNs and Schools.
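Once the data collection form is filled in, summarising the sample is trivial. A sketch of the tally, using entirely made-up classifications (the real data doesn’t exist yet), might look like:

```python
from collections import Counter

# Hypothetical classifications ticked on the data collection form,
# one entry per sampled course.
classified = [
    "Online Admin Support", "Follow Up", "Online Admin Support",
    "Parallel", "Online Admin Support", "Face to Face Events",
]

# Count how many courses fell into each type, most common first.
tally = Counter(classified)
for course_type, count in tally.most_common():
    print(f"{course_type}: {count}")
```

A table like this, broken down by School, would be exactly the quantified answer needed to target staff development at shifting courses along the blended spectrum.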

mock up for harvey … location aware activity

The objectives of this activity are for you to think about how the Ipswich Dock area has changed over time, and what the likely impacts are on people who live and work in the area.


1. How busy is it, and what type of people are they?

Complete a pedestrian count template (paper) and upload the data in the form <<link>>

2. How has the area changed through time?

Watch the slide show for your zone (still images from flickr of area in the past). Audio record on your device, your thoughts on how the area has changed, and what

Zone 1: <<Link>>

Zone 2: <<Link>>

Zone 3: <<Link>>

Additional Resources

Thoughts from previous people on the course can be found at: <<link>>


OMR – FormReturn – Iteration 2

The next OMR based exam will be next week. I’ve nearly finished off a few enhancements to the process. In particular, getting the student to include (as shaded bubbles) their Application Number as their ID. This means no manipulation of the data by me post-exam to get the results spreadsheet correct; it automates the process. The other enhancement has been the inclusion of negative marking, i.e., +1 for the right answer, 0 if they select “I don’t know”, and -1 for all the rest. All very simple to do, although following a conversation with Andrew (UCS’s Educational Developer), I’ll be looking at the use of confidence weighting next time.
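The negative marking rule is simple enough to sketch in a few lines — this is an illustration of the +1 / 0 / -1 scheme, not FormReturn’s own scoring (the “X” code for “I don’t know” is a made-up convention):

```python
def mark(response, correct, dont_know="X"):
    """Negative marking: +1 for the right answer,
    0 for 'I don't know', -1 for anything else."""
    if response == correct:
        return 1
    if response == dont_know:
        return 0
    return -1

# One student's responses against the answer key.
responses = ["A", "X", "C"]
key = ["A", "B", "B"]
total = sum(mark(r, k) for r, k in zip(responses, key))
print(total)  # 1 + 0 - 1 = 0
```

The point of the scheme is that guessing has an expected cost, so “I don’t know” becomes the rational choice when genuinely unsure — which is also what makes confidence weighting an interesting next step.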

I also worked through some process issues for how we will scale this as a service. So, I’m now looking for some new recruits 🙂

e-stream … post meeting notes

Great chat today about the e-stream pilot project we (the elevate team) are taking over.

Phase 1 – support three key user scenarios (Library for off air recordings, Elevate Team for video case studies, and Marketing (?)) – Jan to July

It sounds like Sarah is happy with the functionality; she agrees it will need some local branding and appropriate policies to be written, and she has things ready to go from her perspective. It sounds like this will be passed across to the resources team.

An important consideration for her is the poor / unreliable performance. Talking to Karl, he’d suggest further discussions with Planet e-stream. There might be some concerns around the hardware spec; it might need improving if we are to support a large number of concurrent users.

All agreed to work with a model which gives us the lowest user management requirements. So: make everything freely accessible, use AD to manage schemas (user access) for those wishing to access course-only material, and individual item permissions for managing other materials. It sounds like the library would like to offer staff the ability to arrange / book the off air recording service.

Part of the project aims would be to explore the most effective and efficient way of managing video assets and access rights. Another aim would be to look at the way we integrate e-stream with other e-learning tools, and marketing tools.

We agreed Sarah would complete some final testing, using a test AD student account (we’ll provide) and staff in participating libraries / colleges. Once David is back (he’ll be the operational manager for the e-stream pilot), he’ll connect with Wendy & John at Planet e-stream to start conversations about the application: getting them in for a visit, talking through the software and the best way to support it, working up the questions we need to answer, and working out how to evaluate the pilot. We also need to re-visit the costs for the module that allows us to upload content without the encoder.

We’ll work up the look and feel (stylesheet), and start to work up some supporting materials. The show stopper is performance; if we (with ITS – Karl) can’t get this sorted, we aren’t going anywhere 😦 If OK, we’ll set aside a week-long sprint to sort out the branding. Karl will ensure we have appropriate access rights.

I’ll discuss e-stream at my meeting with Marketing, and also ask a question around hardware with ITS






UCS FAQ Engine Pilot – Some notes

A meet up with Karl and John – the following are a few notes (in no order)

  • Laura – infozone will manage the project
  • run a 12 month pilot service on a VM box
  • will pilot with a few Helpdesk teams, say elearning (ELEVATE), ITSHelpdesk, and a few others
  • responsibilities – ITS (OS / VM – Karl, DBA – Will, backup), ELEVATE (Application, style / branding – input from Ben, staff (author) training)
  • have a dev and live set up >>> url? http://www.ucs/infozoneonline
  • need a plan with a set of clear questions / targets (see below)
  • expectation management >>> this is a pilot, with some potential issues of scalability – open to the web, but not to search engines
  • will look at LDAP hook up to see what offers, especially around user generated content
  • develop a window of opportunity for enhancements (sprint type approaches)
  • write a service catalogue

Needs (technical side … between ITS and ELEVATE) before pilot starts

  • SLA
  • change control process procedure (based on word template)


Pilot questions / targets

  • set up a central space for the hosting of FAQs by teams
  • enhance the user experience of discovering answers (online, just in time) – evaluated via logs and a user survey
  • develop a community of practice across the helpdesk teams w.r.t. standardising practice and use within the face to face help desk – evaluated via putting in place guidance on writing FAQs, tag conventions, and category names, plus a focus group
  • profiling and inclusion of FAQs within other systems: the UCS web site, VLE, and team blogs

Phase 1: Jan

  • work up project management team
  • write the project plan
  • review of available functionality and how FAQ engines are being used at other institutions
  • help desk teams mock up suggested layout (categories and sub categories)

Phase 2: Feb

  • sign off look and feel
  • staff training
  • write lots of FAQs

Phase 3: Mar

  • roll out to users