The following is some feedback on the learning activity which Helen rolled out for students.
Our role in the process was slightly different from the previous role of the learning technologists at UCS: instead of helping only on the tool side, we also focussed on the learning design aspect. The model we adopted was based around an action research approach, where the emphasis is on Helen to design, develop, implement, evaluate and reflect on the activity, and to feed this into the next iteration or transfer the lessons when working on other learning designs. Therefore, our role was not to create the activity for her, but rather to facilitate her understanding of the issues. We did provide the key support in two areas: the tools, and the student evaluation.
The activity was a VLE (e-learning) activity for students which sits between a number of face-to-face teaching sessions. The activity is designed on a "distance resource intensive" model, whereby the student is expected to work through a linear learning path in their own time, with instructions, information and activities being provided. The learning design broke down into a screencast introduction (PowerPoint with an audio talk-over), three activities (involving reading or watching videos) with some guiding questions, and a summary (which included the intended learning outcomes).
The Elevate team designed and deployed a short questionnaire (three questions) in a Google Form within the activity to capture student views. Students were encouraged to complete it through announcements on the VLE course (where the activity was hosted).
The responses (n=7) suggested the learning activity was very effective for them, especially as it allowed them to work through the material at their own pace and revisit ideas. This was often compared to the lecture format, where they could not easily do this (or had no control over the pace). For instance: "This was a good learning aid for me as I could access it at home and return to it and replay it so that I could understand what I was asked. I was also able to download articles as i was working through and stop the presentation while I read the article which made it more relevant".
However, one respondent did comment that they prefer to learn in a more collaborative way: "some of it is really good and interesting, but I need other people to bounce ideas and themes of to find what I need to learn". The current design did not accommodate this need.
When asked how this might be improved, a number of respondents focused on technical issues around not being able to access the material.
I’d focus on two aspects where this resource could be strongly enhanced without radical changes.
1. Clearer information for the student. When reviewing the material and imagining myself as a student, I thought I’d need clear instructions at the start to help frame the activity (what are the intended learning outcomes, what am I expected to do, how much time should I spend on the activity (minimum), and how might I get help if I need it?). I’d also suggest revisiting the questions / tasks within each activity to see if they can be broken down into something more tangible for the student, and adopting a consistent format for setting the task (title), instructions, resource links and scaffolding.
2. More effective closure of the activity. The activity drifted at the end, in terms of effective closure. I was wondering: would you bring the ideas together in the next face-to-face event, or is the summary it? If the summary is it, then in terms of providing effective feedback, we need to rethink the last activity. For instance, a popular model would be to include a couple of quiz questions (with feedback) so students could gauge their level of understanding after completing the activity. This would also help you (the lecturer) monitor how well people have understood the activity. Alternatively, if they are in a face-to-face teaching session the week after, then close it down with a discussion / clicker activity.
Finally, I’d make a point which might require a change in the learning design. I’m not sure the design is very motivating for a large number of students. If I were a student, I’m not sure I’d complete the whole activity, even though I’m intrinsically motivated. The design does not include any obvious feedback opportunities for the student; therefore, I’ve no way of ensuring my learning is on track. This is likely to have a big impact on motivation and completion. A good starting point would be to think about the principles of good feedback, and map these to the activity (see http://www.jisc.ac.uk/media/documents/programmes/elearning/digiassass_eada.pdf, pp. 14-15).