The following are some of my developing ideas, based on the initial responses to a survey I’m currently running with users of the University’s Moodle Service. The survey runs until 15th November, and its aim is to enable the e-learning Team to gather information on Moodle performance during the start of this academic year (2009-10).
As background, we are aware that, although Moodle did not stop working, there were some periods when service performance was very slow. For instance, during the first four days of the new academic year we had the following number of unique users logging on (these figures would need to be multiplied by the number of logins per user, and there were also issues with concurrent accesses). This is over 40% of our student FTE!
More details are available on the Moodle Service Blog (http://blogs.bath.ac.uk/moodle). We also had a number of emails to our firstname.lastname@example.org queue. One email in particular was concerned with availability for distance learners. I’m therefore keen to identify what the experience was like for staff and students who were attempting to complete tasks.
The following are the initial responses (there is still a week until the survey closes). We’ve had over 150 responses 🙂 The methodology was to collaboratively develop the questionnaire (big thanks to the e-learning Team), then promote it via a post on the Moodle Service Blog, a resource in Moodle (on the post-login page), a forum post in the Staff Area course, and an email to the University of Bath’s Directors of Studies (UG, PG). I’ve not included any of the qualitative responses.
It is evident that the majority of responses are from students (91 out of 154).
The intention was to capture how satisfied people were with the Moodle Service over the key period. The following are the responses when asked, overall, how satisfied were you with the Moodle Service.
It was very pleasing that the majority of respondents (59%) were either satisfied or very satisfied with the Moodle Service. Even more pleasing, only 4% were not satisfied at all. It can be interpreted that the vast majority of users were satisfied with the service.
This trend continued with the specific questions concerning responsiveness and availability. For both of these indicators, the “very satisfied” category increased. In terms of overall availability, a higher proportion of people were very satisfied with the service (32%).
An interesting observation is that the pattern of satisfaction with respect to responsiveness shifts slightly towards the satisfied / neutral choices. This isn’t a surprise, as the system was available over the period, i.e., it never stopped working to the point of being unavailable to all users. However, at some points it became very slow, i.e., responsiveness declined. This is captured in the responses.
The findings from the survey suggest we (the Moodle Service supplier) still have a significant way to go in terms of the quality of our service. Yes, a significant number of users are very satisfied with the service; however, a small number are not at all satisfied. This may be a small number, but they cannot be ignored. The target for the start of the next academic year will be to ensure the service is more responsive, so that we see an improvement in the responses.
The e-learning Team have been using RT to manage our generic e-learning email queue.
Based on this information, we should be able to answer the following questions:
- how busy is the e-learning helpdesk?
- assuming this is a typical year, when do we need to provide more resource for the helpdesk?
- given this resource is a pooled team, how should our other staff development activities accommodate the needs of the helpdesk?
The story in figures is as follows:
Table 1: Number of created tickets (by month)
- Oct 2008 – 289 (average 10.0 days to complete ticket) – average weekly logins: 8810
- Nov 2008 – 120 (average 8.6 days to complete ticket) – average weekly logins: 8654
- Dec 2008 – 109 (average 4.2 days to complete ticket) – average weekly logins: 6593
- Jan 2009 – 204 (average 2.7 days to complete ticket) – average weekly logins: 8281
- Feb 2009 – 142 (average 3.6 days to complete ticket) – average weekly logins: 8663
- Mar 2009 – 92 (average 4.1 days to complete ticket) – average weekly logins: 7634
- Apr 2009 – 81 (average 3.3 days to complete ticket) – average weekly logins: 6930
- May 2009 – 63 (average 1.09 days to complete ticket) – average weekly logins:
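To see how the ticket volumes relate to the login figures, the Table 1 data can be tallied with a short script. This is just a sketch using the numbers above; May’s login figure is missing in the table, so that month is excluded from the comparison.

```python
# Table 1 data: month -> (tickets created, avg days to complete, avg weekly logins)
# May's weekly login figure was not recorded, so it is stored as None.
tickets = {
    "Oct 2008": (289, 10.0, 8810),
    "Nov 2008": (120, 8.6, 8654),
    "Dec 2008": (109, 4.2, 6593),
    "Jan 2009": (204, 2.7, 8281),
    "Feb 2009": (142, 3.6, 8663),
    "Mar 2009": (92, 4.1, 7634),
    "Apr 2009": (81, 3.3, 6930),
    "May 2009": (63, 1.09, None),
}

# The two busiest months by ticket volume.
peak = sorted(tickets, key=lambda m: tickets[m][0], reverse=True)[:2]
print("Busiest months:", peak)

# Pearson correlation between ticket volume and weekly logins,
# over the months where both figures are available.
pairs = [(t, l) for (t, _, l) in tickets.values() if l is not None]
n = len(pairs)
mean_t = sum(t for t, _ in pairs) / n
mean_l = sum(l for _, l in pairs) / n
cov = sum((t - mean_t) * (l - mean_l) for t, l in pairs)
sd_t = sum((t - mean_t) ** 2 for t, _ in pairs) ** 0.5
sd_l = sum((l - mean_l) ** 2 for _, l in pairs) ** 0.5
r = cov / (sd_t * sd_l)
print("Correlation (tickets vs logins): %.2f" % r)
```

Running this confirms October and January as the peaks, and shows a positive correlation between monthly ticket volume and average weekly logins: the busier the service, the busier the helpdesk.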
So what do these figures tell us? Firstly, the email helpdesk is very busy. The peak months are October and January, which align neatly with the start of each semester. These periods therefore need to be resourced accordingly, by ensuring we do not run too many staff development events during those weeks.
Interestingly, the weekly login figures follow a similar pattern.
How might we reduce the burden on email support and provide high-quality, just-in-time support? The answer lies in unpicking the types of questions we are being asked. The next piece of work will therefore be to look at the emails submitted in October and identify the types of question being asked. We’ll need to develop some kind of classification matrix and then identify solutions to certain types of request.
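A classification matrix of this kind could be sketched as a simple keyword lookup. The categories and keywords below are purely illustrative assumptions, not the matrix we’ll actually build — that has to come from reading the October emails themselves.

```python
from collections import Counter

# Hypothetical categories and trigger keywords -- illustrative only.
CATEGORIES = {
    "login problems": ("password", "log in", "login", "account"),
    "course access": ("enrol", "missing course", "can't see my course"),
    "upload issues": ("upload", "attachment", "file"),
    "performance": ("slow", "timeout", "unavailable"),
}

def classify(subject: str) -> str:
    """Return the first category whose keywords match, or 'other'."""
    text = subject.lower()
    for category, keywords in CATEGORIES.items():
        if any(k in text for k in keywords):
            return category
    return "other"

# Tallying a handful of made-up ticket subjects:
subjects = [
    "Cannot log in to Moodle",
    "Moodle very slow this morning",
    "How do I upload my assignment?",
    "Forgotten password",
]
print(Counter(classify(s) for s in subjects))
```

Even a rough tally like this would tell us which categories dominate, and so which requests could be deflected with self-help guidance rather than one-to-one email support.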