The aims of this post are to use the aggregated findings from three staff digital capabilities audits at UCS to inform a more effective staff development model for technology-enabled learning (TEL), and to encourage you to undertake similar work within your own institutions.
The first question is, why did we introduce a staff digital capabilities audit? The answer lies in trying to identify current staff digital capabilities so that "targeted" staff development interventions could be introduced. It was envisaged the data collected would add value at a number of levels:
- offering a staff development tool to empower individuals to take ownership of their development
- providing an aggregated view from a course / departmental team perspective to more effectively identify bespoke developmental requirements
For more background on the individual studies, read the posts: https://andyramsden.wordpress.com/tag/dcsa/
The discussion is informed by three small data collection points. The questionnaires were collected from:
- Course team and departmental meetings
  - Dept 1 (n=10): April 2015
  - Dept 2 (n=10): May 2015
- Online, open to any staff at UCS (n=17): April – June 2015
Given the small sample sizes and the data collection methods, it needs to be acknowledged that this is not a statistically significant study. However, the outcomes do provide an insight, and an opportunity to reflect on our current practice.
The collation process was to take all the results and identify the questions where the "negative / no" responses outnumbered the "positive / yes" responses. These are collated in Table 1, organised by group.
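For readers wanting to repeat this with their own audit data, the collation step can be sketched in a few lines of Python. This is a hypothetical illustration only: the question wording, group names and response counts below are invented for the example, not taken from the UCS audits.

```python
# Sketch of the collation step: for each audit question, count "yes" and
# "no" responses within a group, and flag the questions where "no"
# outnumbers "yes". All names and data here are illustrative.
from collections import Counter

# responses[group] is a list of (question, answer) pairs from that audit
responses = {
    "Dept 1": [("Used the quiz tool", "no"), ("Used the quiz tool", "yes"),
               ("Used the blog tool", "no")],
    "Online": [("Used the quiz tool", "no"), ("Used the blog tool", "yes")],
}

def flag_negative_questions(group_responses):
    """Return the questions where 'no' responses exceed 'yes' responses."""
    yes = Counter(q for q, a in group_responses if a == "yes")
    no = Counter(q for q, a in group_responses if a == "no")
    return sorted(q for q in no if no[q] > yes.get(q, 0))

# Flagged questions per group, as collated in Table 1
flagged = {group: flag_negative_questions(r) for group, r in responses.items()}
```

The same rule applied to the real questionnaire data would reproduce the per-group columns of Table 1.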
Table 1 illustrates that there are a number of technologies and tools which staff have not used within the last 12 months. For some tools, there is a strong pattern across all sample groups.
Table 2 applies a criteria matrix to rank the findings into priorities. The criteria are based on the relative levels of "no" compared to "yes" responses, and the extent to which these span the three groups.
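The prioritisation can be sketched in the same spirit: rank each tool first by how many of the groups flagged it, then by the overall excess of "no" over "yes" responses. Again, the tool names and counts below are invented for illustration, not the audit figures behind Table 2.

```python
# Sketch of the Table 2 criteria matrix: rank flagged tools by coverage
# (how many groups flagged them) and then by total "no" minus "yes"
# responses. All counts here are illustrative, not real audit data.

# counts[tool][group] = (no_count, yes_count)
counts = {
    "Quiz tool": {"Dept 1": (8, 2), "Dept 2": (7, 3), "Online": (12, 5)},
    "Blog tool": {"Dept 1": (6, 4), "Online": (10, 7)},
}

def priority(tool):
    groups = counts[tool]
    coverage = len(groups)                       # groups flagging this tool
    excess = sum(no - yes for no, yes in groups.values())
    return (coverage, excess)

# Highest-priority tools first, as in Table 2
ranked = sorted(counts, key=priority, reverse=True)
```

Sorting on the (coverage, excess) tuple means a tool flagged by all three groups always outranks one flagged by two, with the size of the "no" majority breaking ties.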
Table 2 highlights that the collective experience and awareness across the institution of using certain tools is very low; in particular, the tasks, quiz and blog tools within Blackboard (LearnUCS). This lack of exposure raises concerns about a disconnect between the strategic and operational levels within the institution. For instance, our current Learning and Teaching Strategy prioritises the need to explore innovative and creative assessment and feedback models. Given this priority, the expected solution would include the use of objective testing for summative and formative assessments. However, looking at the outcomes of the staff digital capabilities audit, significant resource needs to be directed at developing staff experience of using the quiz tool before they can make connections within their own innovative practice. It also revisits the often-discussed topic: "how are lecturers supposed to make effective decisions about the technology mix within their teaching, learning and assessment models, if they don't know what they don't know?"
This raises the question, "how might we develop staff digital capabilities?"
Table 3 draws together the free-text responses around perceived barriers to adoption for the two face-to-face groups. Note the online audit (self-diagnostic) does not include this set of questions.
It is always difficult to draw meaningful messages from this type of analysis (especially given the very small sample size). The application of the 4Es framework (Collis et al., 2001) offers an insight: it prompts the question of who owns, or has responsibility for, taking these issues forward within the institution.
The responses imply time is the most common perceived barrier. The prioritisation of time for lecturers to design, develop and deploy TEL-based initiatives is an institutional issue, informed by workloads, culture and expectations. However, there are clear opportunities at the course and departmental level to use the limited time more effectively.
One approach would be to develop a team-based staff development programme at the course or departmental level. The focus on teams reflects the fact that the early and late majority adopters of an innovation typically represent around 70% of a population, with peer pressure and other outside forces being the key influences on whether this group decides to adopt the innovation (Hixon et al., 2012). Schneckenberg (2009) proposes that the real barriers to TEL in HEIs are linked to cultural and motivational factors among lecturers; in particular, conflicting demands from research and enterprise activities, which are perceived as more important than TEL-based teaching innovations. This suggests a staff development model which focuses predominantly on the individual will be ineffective and will only support early adopters.
An approach to tackle the previous observation would be a team-based, problem-based project approach, in which the team sets its own problem and targets, evidences success at other institutions, and designs and develops the TEL-enhanced learning model. The session design needs to reflect the fact that lecturers will not spend time on tools unless they see value in using them; therefore, there is more opportunity in exploring the jobs they need to do (Macdonald and Poniatowska, 2011). This is in line with the Carpe Diem learning design, which "drew from creative processes, agile development, and storyboarding" (Salmon and Wright, 2014:54). An enhancement to Carpe Diem would be to re-run it with the same discipline experts at regular intervals, developing enhancements drawn from the lessons learnt while implementing the innovations. This is in line with Gibbs' learning cycle framework.
This raises the question, who should be part of the team?
Owens (2012) observes the dependence between the technology and the learning design: "Without a working knowledge of technologies, lecturers do not know what is possible … equally, without knowledge of the … pedagogical design of these technologies … online learning environments will remain an underutilised and ineffective resource" (Owens, 2012:398).
The team needs to include representation from discipline experts, e-learning developers, academic skills advisers, librarians, curriculum developers and students.
The final question is, what has this to do with the digital skills capability audit?
The role of the digital skills capability audit is crucial within this process, as it helps determine the topics to choose by identifying the unknown unknowns.
- Armellini, A. and Jones, S. (2008) ‘Carpe Diem: seizing each day to foster change in e-learning design’, Reflecting Education, 4(1), pp. 17–29.
- Cho, M.-H. and Rathbun, G. (2013) ‘Implementing teacher-centred online teacher professional development (oTPD) programme in higher education: a case study’, Innovations in Education and Teaching International, 50(2), pp. 144–156.
- Hixon, E., Buckenmeyer, J., Barczyk, C., Feldman, L. and Zamojski, H. (2012) ‘Beyond the early adopters of online instruction: Motivating the reluctant majority’, The Internet and Higher Education, 15(2), pp. 102–107.
- JISC (2013) The Design Studio / Curriculum Design at the University of Ulster. Available at: http://jiscdesignstudio.pbworks.com/w/page/60261367/Curriculum%20Design%20at%20the%20University%20of%20Ulster (Accessed: 24 June 2015)
- Macdonald, J. and Poniatowska, B. (2011) ‘Designing the professional development of staff for teaching online: an OU (UK) case study’, Distance Education, 32(1), pp. 119–134.
- Owens, T. (2012) ‘Hitting the nail on the head: the importance of specific staff development for effective blended learning’, Innovations in Education and Teaching International, 49(4), pp. 389–400.
- Salmon, G. and Wright, P. (2014) ‘Transforming Future Teaching through “Carpe Diem” Learning Design’, Education Sciences, 4(1), pp. 52–63.
- Salmon, G. (no date) Carpe Diem: Access to Handbook. Available at: http://www.gillysalmon.com/carpe-diem.html (Accessed: 24 June 2015)
- Schneckenberg, D. (2009) ‘Understanding the real barriers to technology-enhanced innovation in higher education’, Educational Research, 51(4), pp. 411–424.