I have now reviewed 54-odd courses (the number may be higher by the time you read this post) for this period's data collection. The aim of this post is to set out how I am interpreting the questions.
For background info search learnUCS on this blog.
The summary results (raw data) from this data collection are available from:
The interpretation would be:
Indicator: Course Classification (Overarching)
This will form part of the headline statement, as it is a subjective marker from me about how I would classify the course. It is based upon the Institute of Education's pedagogical templates.
The course administration classification reflects the proposition that LearnUCS is used in teaching and learning at UCS as a repository for electronic information. The main focus of teaching and learning takes place when staff and students meet face to face. The online material is likely to include administrative information (such as course announcements, contact information, and calendar dates), the course handbook, readings, teaching material (presentations), and the submission of assignments (formative and summative).
The way I interpret the course is to ask: if the course were rebuilt as an electronic handbook (PDF), would the learning experience be the same? If yes, that indicates use as a course administration tool. If no, the course is being used within a learning context.
I have included a number of data collection points around the theme of personalisation. My working assumption is that the sense of ownership and engagement is higher where the individual has taken time to personalise their online course. The degree of personalisation by staff using LearnUCS is identified through three questions: has the default menu been changed? Has the course colour scheme been changed? Has a course banner been uploaded?
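For readers interested in how these three yes/no questions could be combined into a simple measure, here is a minimal sketch. The function name and field order are my own illustrative choices, not part of LearnUCS or the data collection instrument.

```python
# Hypothetical sketch: combine the three personalisation markers into a
# simple 0-3 count per course. Names are illustrative, not from LearnUCS.

def personalisation_score(menu_changed: bool,
                          colours_changed: bool,
                          banner_uploaded: bool) -> int:
    """Count how many of the three personalisation markers are present."""
    return sum([menu_changed, colours_changed, banner_uploaded])

# Example: a course with a custom banner but default menu and colours.
print(personalisation_score(False, False, True))  # → 1
```

A simple count like this makes it easy to compare the spread of personalisation across the sample, without weighting one marker above another.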
Indicator: Tool use
I have included a list of tools being used on the course. Within this question I am not interpreting the effectiveness of the tool, just identifying whether there is evidence of use. There are a number of reasons for this approach. Firstly, by clustering the tools by affordance you can quickly build up a sense of use across the institution. For instance, some tools are strongly associated with assessment (quiz, peer assessment), while others are associated with active and collaborative student learning (wiki, discussion board). Secondly, this should surface some interesting examples of good practice.
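The clustering-by-affordance idea above can be sketched as a simple lookup table plus a count. The cluster names and tool-to-cluster mapping are illustrative assumptions drawn from the examples in the text, not a fixed taxonomy.

```python
# A minimal sketch of clustering observed tool use by affordance.
# The mapping is an assumption based on the examples given in the post.
from collections import Counter

AFFORDANCE = {
    "quiz": "assessment",
    "peer assessment": "assessment",
    "wiki": "active/collaborative learning",
    "discussion board": "active/collaborative learning",
}

def cluster_counts(observed_tools):
    """Count evidence of tool use per affordance cluster."""
    return Counter(AFFORDANCE.get(tool, "other") for tool in observed_tools)

# Example: tools observed across a handful of reviewed courses.
print(cluster_counts(["quiz", "wiki", "quiz", "discussion board"]))
# → Counter({'assessment': 2, 'active/collaborative learning': 2})
```

Because the question only records evidence of use, a frequency count per cluster is enough to build an institution-wide picture without judging how well each tool was used.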
Indicator: Content use
I have included a short list of types of content being used in courses. Again, I am not interpreting effectiveness. However, I am interested in gaining a sense of the use of multimedia within courses, whether material staff have uploaded themselves or third-party material.
Indicator: Learning Design
This question does require a subjective interpretation, where I need to evidence whether the lecturer has developed a learning activity across a number of items or tools. So it can't simply be a text-based item that includes a question or reflective pointer. This indicator is strongly correlated with the overarching course classification indicator.
Indicator: Scaffolding the learners
I have included a number of data collection points around the theme of lecturers helping to provide support or scaffolding for the learner. Examples include: are they using tasks to provide guidance and workload management? Are they using the contacts tool? And are they taking the extra step of including a description when creating items?
Indicator: Is the course empty?
All the courses in the sample should contain content, given the default position that a course has to be released by staff before it is listed to students. However, I do include a data collection point to catch cases where staff were enrolled and made their course available, but there was nothing in it.
The following link is to the summary responses (live data) for my data collection. This will allow you to interpret the emerging evidence. The initial view (the sample is not yet statistically significant) is:
UCS continues to use its VLE as a course administration tool, and this use has not changed radically over the last two years.
My next steps over the next two weeks are to complete the data collection phase and identify some innovative uses.