How are we using LearnUCS (VLE) at UCS – first sweep of a large data set

Given we have nearly finished the data collection phase, I thought it would be useful to share a short review of the data.

Firstly, a big thanks to Aaron Burrell, who has analysed lots of courses 🙂

The structure of the study is outlined in a previous post. The sample size for this post is 163 Blackboard courses, so the findings are still not statistically significant, but they are getting close.
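For a rough sense of what a sample of 163 courses buys us, here is a minimal sketch of the margin of error for a proportion at that sample size. The total number of live Blackboard courses used below is an illustrative placeholder, not a figure from the study.

```python
import math

def margin_of_error(sample_size: int, population_size: int,
                    z: float = 1.96, p: float = 0.5) -> float:
    """95% margin of error for a proportion, with a finite population correction."""
    standard_error = math.sqrt(p * (1 - p) / sample_size)
    fpc = math.sqrt((population_size - sample_size) / (population_size - 1))
    return z * standard_error * fpc

# Hypothetical population of 600 live courses -- not a figure from the report.
print(f"+/- {margin_of_error(163, 600):.1%}")  # roughly +/- 6.6%
```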

A previous report is available from: http://ucselevate.blogspot.co.uk/p/reports-publications.html

The broad aims of the exercise are to answer two questions: how are we using the VLE at UCS, and has our use changed over time?

Unfortunately, comparison with the previous report is difficult, as the earlier study was not statistically significant.

I’ve clustered the analysis into the following categories:

Indicator: Course Classification (Overarching)

This draws upon the definitions from the Institute of Education’s pedagogical template to answer, at a broad level, whether the VLE module is being used as an online administration tool or for online learning.

The classification for online administration is where LearnUCS is used as a repository for electronic information. The main focus of the teaching and learning takes place when staff and students meet face to face. The online material is likely to include: administrative information (such as course announcements, contact information, and calendar dates), the course handbook, readings, teaching materials (presentations), and submission of assignments, essays and reports (formative and summative).

This contrasts with the online learning classification, where the lecturer(s) integrate the tools within LearnUCS into their learning and teaching models.

The litmus test is to imagine the course rebuilt as an electronic handbook (for example, in PDF format): would the learning experience be the same? If yes, the VLE is being used as a course administration tool; if no, it is being used within a learning context.

The analysis highlights that VLE use at UCS is focussed on providing online administration (91%), with only 9% of modules integrating the tools within a learning activity. This is a very slight increase in the use of online learning compared with the previous report.

  • Online Administration: 91%
  • Online Learning: 9%

A later post will review the 9% of courses within the sample which have used the VLE as an online learning tool.

Indicator: Personalisation

The analysis includes a number of data collection points around personalisation, which serves as an indicator of a sense of ownership and engagement. It can be suggested that staff who personalise their courses will be more open to using different tools and approaches within the VLE in their teaching. The degree of personalisation by staff using LearnUCS is identified through three questions: has the default menu been changed? Has the course colour scheme been changed? Have they uploaded a course banner?

  • Have they changed the default menu? Yes: 34% No: 66%
  • Have they changed the default colour scheme? Yes: 19% No: 81%
  • Have they added a course banner? Yes: 4% No: 96%

The emerging message is encouraging: it appears over 30% have changed the default menu. This is important because it implies a conscious decision about ensuring the menu meets their learning and teaching needs. On the flip side, 66% of courses have not changed the default menu. There is no comparable data in the previous study.
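As a side note, here is a minimal sketch of how these three yes/no questions could be tallied into a simple per-course personalisation score. The field names and example records are hypothetical and not the study’s actual data or tooling.

```python
from dataclasses import dataclass

@dataclass
class CourseAudit:
    """One row of a (hypothetical) audit sheet for a single course."""
    course_id: str
    changed_menu: bool
    changed_colours: bool
    added_banner: bool

    def personalisation_score(self) -> int:
        # Count how many of the three personalisation indicators are present (0-3).
        return sum([self.changed_menu, self.changed_colours, self.added_banner])

audits = [
    CourseAudit("MOD101", changed_menu=True, changed_colours=False, added_banner=False),
    CourseAudit("MOD102", changed_menu=False, changed_colours=False, added_banner=False),
]

# Share of courses answering "yes" to each question.
for indicator in ("changed_menu", "changed_colours", "added_banner"):
    share = sum(getattr(a, indicator) for a in audits) / len(audits)
    print(f"{indicator}: {share:.0%} yes")
```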

Indicator: Tool use

The intention is to identify which tools are being used. There are a number of reasons for this approach: firstly, by clustering the tools by affordance you can quickly build up a sense of use across the institution (see the short sketch after the list below). For instance, some tools are strongly associated with assessment (quiz, peer assessment), while others are associated with active, collaborative student learning (wiki, discussion board). Secondly, this should flush out some interesting examples of good practice.

  • Use of Announcements (more than 5): 49%
  • Use of Blogs: 8%
  • Use of Contacts: 4%
  • Use of Discussion Boards: 14%
  • Use of Journals: 2%
  • Use of Peer Assessment: 0%
  • Use of Tasks: 6%
  • Use of Tests: 6%
  • Use of Surveys: 4%
  • Use of Question Pools: 0%
  • Use of Wikis: 7%
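Below is a minimal sketch of that clustering, using the percentages from the list above. The cluster groupings are illustrative assumptions based on the affordances mentioned, not an official taxonomy.

```python
# Percentages of sampled courses using each tool, taken from the list above.
tool_use = {
    "announcements": 49, "blogs": 8, "contacts": 4, "discussion_boards": 14,
    "journals": 2, "peer_assessment": 0, "tasks": 6, "tests": 6,
    "surveys": 4, "question_pools": 0, "wikis": 7,
}

# Illustrative affordance clusters based on the groupings mentioned in the text.
affordance_clusters = {
    "assessment": ["tests", "peer_assessment", "question_pools", "surveys"],
    "active_and_collaborative": ["wikis", "discussion_boards", "blogs", "journals"],
    "administration": ["announcements", "contacts", "tasks"],
}

for cluster, tools in affordance_clusters.items():
    # Print per-tool uptake grouped by affordance; summing the percentages would
    # over-count courses that use more than one tool from the same cluster.
    print(cluster, {tool: tool_use[tool] for tool in tools})
```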

The data shows some interesting patterns, including:

  1. discussion boards have relatively high use (14%), which is surprising given this is not a tool we are regularly asked about.
  2. the test (quiz) tool in LearnUCS has increased in use (6%).

Indicator: Content use

The intention is not to interpret the effectiveness of the content, but rather to gain a sense of the use of multimedia within courses, whether material staff have uploaded themselves or third-party material.

  • Use of Learning Module: 15%
  • Use of audio file: 10%
  • Use of video file: 19%
  • Use of still image: 11%
  • Use of embedded 3rd-party material: 11%
  • Use of external links tab: 33%

Indicator: Learning Design

This question requires a subjective interpretation by the researcher, who needs to evidence whether the lecturer(s) have developed learning activities using a range of content types and tools. A text item containing a reflective question is not, on its own, enough. This indicator is strongly correlated with the overarching course classification indicator.

  • Is there any evidence of an online learning task through the material? Yes: 10% No: 90%

Indicator: Scaffolding the learners

A recommendation to lecturers when designing and developing their courses is to include item and folder descriptions to help scaffold learners around what to expect and how the tools and content relate to their learning tasks. This requires lecturers to take the extra step of including a description when creating items.

  • Do they include descriptors for items and folders? Yes: 41% No: 59%

The outcomes are really pleasing, as 41% of courses include a descriptor for the content or tool learners are about to access.

Indicator: Is the course empty?

Courses were sampled if they had a lecturer enrolled and were available to students. This data collection point checks whether the available courses actually have any content. In terms of the learning experience, it would be poor for a learner to access the VLE and see their module listed, only to find it empty.

  • Is the course available but empty? Yes: 14% No: 86%

The evidence indicates 14% of courses are available but empty. This is a concern for managing learner expectations, and it will need to be addressed within our staff development model.
