Tagged: DCSA

If we are serious about TEL, we need to develop staff capabilities as collaborative teams

Aims

The aims of this post are to use the aggregated findings from three staff digital capabilities audits at UCS to inform a more effective staff development model for technology enhanced learning, and to encourage you to undertake similar work within your own institution.

Background

The first question is: why did we introduce a staff digital capabilities audit? The answer revolves around identifying current staff digital capabilities so that we can introduce “targeted” staff development interventions. It was envisaged the data collected would add value at a number of levels:

  • offering a staff development tool to empower individuals to take ownership of their development
  • providing an aggregated view from a course / departmental team perspective to more effectively identify bespoke developmental requirements

For more background on the individual studies, read the posts: https://andyramsden.wordpress.com/tag/dcsa/

Data Collection

The discussion has been informed by three small data collection points. The questionnaires were collected from:

  1. Dept 1 (n=10), course team / department meeting: April 2015
  2. Dept 2 (n=10), course team / department meeting: May 2015
  3. Online – any staff at UCS (n=17): April – June 2015

It needs to be acknowledged that, given the small sample sizes and the data collection methods, this is not a statistically significant study. However, the outcomes do provide an insight, and give an opportunity to reflect on our current practice.

Findings

The analysis took all the results and identified the questions where the “negative / no” responses outnumbered the “positive / yes” responses. These are collated in Table 1, organised by group.
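To make that flagging rule concrete, here is a minimal sketch in Python. The question labels and response counts are illustrative placeholders, not the actual audit data.

```python
# Flagging rule: a question is flagged when its "no" responses
# outnumber its "yes" responses. Counts below are made up for
# illustration only.
responses = {
    "Used the Task tool (last 12 months)": {"yes": 3, "no": 7},
    "Used the quiz tool (last 12 months)": {"yes": 3, "no": 7},
    "Embedded a YouTube video": {"yes": 8, "no": 2},
}

flagged = [q for q, counts in responses.items() if counts["no"] > counts["yes"]]
print(flagged)  # the first two questions are flagged
```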

Table 1

Table 1 illustrates that there are a number of technologies and tools which staff haven’t used within the last 12 months. For some tools, this pattern is consistent across all three sample groups.

Table 2 applies a criteria matrix to rank the findings into priorities. The criteria are based on the relative levels of “no” compared to “yes” responses, and the extent to which these span the three sample groups.
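As a sketch of how such a criteria matrix might be applied (the post does not specify the exact weighting, so the ordering rule below is an assumption): rank the flagged questions first by how many of the three sample groups flagged them, then by the overall “no” to “yes” ratio. All figures are illustrative.

```python
# Hypothetical prioritisation: sort by group coverage, then by the
# "no" to "yes" ratio. All figures are illustrative, not audit data.
flagged = {
    "Used the quiz tool": {"groups": 3, "no": 24, "yes": 6},
    "Used the Task tool": {"groups": 3, "no": 21, "yes": 9},
    "Used the Blog tool": {"groups": 2, "no": 15, "yes": 5},
}

ranked = sorted(
    flagged.items(),
    key=lambda kv: (kv[1]["groups"], kv[1]["no"] / kv[1]["yes"]),
    reverse=True,
)
for question, stats in ranked:
    print(question, stats["groups"], round(stats["no"] / stats["yes"], 1))
```

Under this rule, a question flagged by all three groups with a high “no” to “yes” ratio would land at the top of Table 2.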

Table 2

Table 2 highlights that the collective experience and awareness across the institution of using certain tools is very low; in particular, the tasks, quiz and blog tools within Blackboard (LearnUCS). This lack of exposure to some tools raises concerns about a disconnect between the strategic and operational levels within the institution. For instance, our current Learning and Teaching Strategy prioritises the need to explore innovative and creative assessment and feedback models. Given this priority, the expected solution would include the use of objective testing for summative and formative assessments. However, looking at the outcomes of the staff digital capabilities audit, significant resource needs to be directed to developing staff experience of using the quiz tool before they can make connections within their own innovative practice. It also revisits the often discussed question: “how are lecturers supposed to make effective decisions around the technology mix within their teaching, learning and assessment models, if they don’t know what they don’t know?”

This raises the question: how might we develop staff digital capabilities?

Table 3 draws together the free-text responses around perceived barriers to adoption for the two face to face groups. Note that the online audit (self-diagnostic) does not include this set of questions.

Table 3

It is always difficult to draw meaningful messages from this type of analysis (especially given the very small sample size). The application of the 4Es framework (Collis et al., 2001) offers an insight: it facilitates questioning around who owns, or has responsibility for, taking these issues forward within the institution.

Table 4

The responses imply time is the most common perceived barrier. It is suggested that the prioritisation of time for lecturers to design, develop and deploy TEL-based initiatives is an institutional issue, informed by workloads, culture and expectations. However, there are clear opportunities at the course and departmental level to use the limited time more effectively.

One approach would be to develop a team-based staff development programme at the course or departmental level. The need to focus on the team perspective is based on the observation that the early and late majority adopters of an innovation typically represent around 70% of a population, with peer pressure and other outside forces being the key influences on this group’s decision to adopt the innovation (Hixon et al., 2012). Schneckenberg (2009) proposes that the real barriers to TEL in HEIs are linked to cultural and motivational factors within lecturers; in particular, conflicting demands from research and enterprise activities, which are perceived as more important than TEL-based teaching innovations. This suggests a staff development model which focusses predominantly on the individual will be ineffective and will only support early adopters.

An approach to tackle this would be a team-based, problem-based project approach, which includes the team setting their own problem and targets, evidencing success at other institutions, and designing and developing the TEL-enhanced learning model. The session design needs to reflect that lecturers will not spend time on tools unless they see value in using them; there is therefore more opportunity in exploring the jobs they need to do (Macdonald and Poniatowska, 2011). This is in line with the Carpe Diem learning design, which “drew from creative processes, agile development, and storyboarding” (Salmon and Wright, 2014:54). An enhancement to Carpe Diem would be to re-run it with the same discipline experts at regular intervals, developing enhancements drawn from the lessons learnt in implementing the innovations, in line with Gibbs’ reflective learning cycle.

This raises the question, who should be part of the team?

Owens (2012) observes the interdependence between technology and learning design: “Without a working knowledge of technologies, lecturers do not know what is possible … equally, without knowledge of the … pedagogical design of these technologies … online learning environments will remain an underutilised and ineffective resource” (Owens, 2012:398).

The team needs to include representation from discipline experts, e-learning developers, academic skills advisers, librarians, curriculum developers and students.

The final question is, what has this to do with the digital skills capability audit?

The role of the digital skills capability audit is crucial within the process, as it helps determine the topics to choose by identifying the unknown unknowns.

References

  • Armellini, A. and Jones, S. (2008) ‘Carpe Diem: seizing each day to foster change in e-learning design’, Reflecting Education, 4(1), pp. 17–29.
  • Cho, M.-H. and Rathbun, G. (2013) ‘Implementing teacher-centred online teacher professional development (oTPD) programme in higher education: a case study’, Innovations in Education and Teaching International, 50(2), pp. 144–156.
  • Hixon, E., Buckenmeyer, J., Barczyk, C., Feldman, L. and Zamojski, H. (2012) ‘Beyond the early adopters of online instruction: Motivating the reluctant majority’, The Internet and Higher Education, 15(2), pp. 102–107.
  • JISC (2013) The Design Studio / Curriculum Design at the University of Ulster. Available at: http://jiscdesignstudio.pbworks.com/w/page/60261367/Curriculum%20Design%20at%20the%20University%20of%20Ulster (Accessed: 24 June 2015)
  • Macdonald, J. and Poniatowska, B. (2011) ‘Designing the professional development of staff for teaching online: an OU (UK) case study’, Distance Education, 32(1), pp. 119–134.
  • Owens, T. (2012) ‘Hitting the nail on the head: the importance of specific staff development for effective blended learning’, Innovations in Education and Teaching International, 49(4), pp. 389–400.
  • Salmon, G. and Wright, P. (2014) ‘Transforming Future Teaching through “Carpe Diem” Learning Design’, Education Sciences, 4(1), pp. 52–63.
  • Salmon, G. (no date) Carpe Diem: Access to Handbook. Available at: http://www.gillysalmon.com/carpe-diem.html (Accessed: 24 June 2015)
  • Schneckenberg, D. (2009) ‘Understanding the real barriers to technology-enhanced innovation in higher education’, Educational Research, 51(4), pp. 411–424.

Digital Capabilities: What are the (tangible) outcomes for SBS?

We completed the first stage of the Digital Skills Audit for SBS at UCS. This covered 11 staff members. The initial broad outcomes to be fed back to the HoD and TEL Lead include:

  1. The key barriers to adopting TEL are staff time to explore the technology, and seeing an educational application where it enhances existing practice.
  2. We will timetable a number of bespoke staff development sessions on specific tools (wikis, blogs and clickers). These need to be driven by a discussion on educational application, especially within UK HE business education.
  3. The preference for marking assignments by hand is still high within this cohort (50% preferred to mark by hand). This is connected to hardware and room issues.

From our (Learning Services) perspective, it is clear we need to:

  1. Ensure all staff are aware of the online, in-course help documents (50% of respondents had not accessed them)
  2. Create a how-to guide on the why and how of using tasks effectively in modules (70% of respondents had not used the Task tool)
  3. Create a how-to guide on the why and how of using quizzes in modules (70% of respondents had not used the quiz tool)

The challenge of time and educational application could be addressed by exploring a more developmental staff development package: for instance, taking a small number of staff through half-day workshops on educational design within a TEL environment. These should be delivered within a problem-based learning design.

Digital Capabilities: What are the (tangible) outcomes?

As the dust settles on the first phase of developing the digital capabilities of a course team (we’ve completed the audit), you might ask: what are the tangible outcomes?

I’d suggest the short term outcomes are as follows:

  1. the development of a course team specific support page (see video below)
  2. the implementation of a simple communication plan
  3. the auditor and team leader agree staff development priorities based on the audit analysis
  4. the auditor emails all participants concerning the results, and where to take this next in terms of ongoing support (face to face, and online)
  5. the appropriate member of the TEL staff development team agrees to:
    • be based in their course team office at certain times of the week
    • collect stories of TEL use and share these on the support page (one per month)
    • attend course team meetings
    • set up a regular (quarterly) review with the course team leader about progress and opportunities
    • maintain the online support page, including creating / updating guides, writing appropriate discussion papers, adding stories, etc.
    • develop an evaluation framework to be able to answer the question: has it worked?

Course Team Specific Support Page

The following video is a proof of concept to gather feedback and ideas from staff on what they might need as a “one stop shop”. The expectation is that the worked-up version will be built using a different tool.

Image: With Thanks (free to use and share): https://farm4.staticflickr.com/3855/14442547509_a377f38679_o.png

Digital capabilities staff audit questionnaire: initial observations from ECS

The following observations are based on the first deployment of the digital capabilities audit with a course team. The paper questionnaire was completed by 10 lecturers, and took about 10 minutes.

The audit is available from:

The initial observations / feedback suggested a number of enhancements, including:

Data Management

Include on the document a data management statement and an overview of the objectives of the questionnaire. For instance: the data will be kept for 16 months; aggregated (anonymised) data will be published via the Elevate blog, used in presentations, etc.

Post Analysis

Individual emails (within 2 weeks)

  • direct individuals to support material based on their answers
  • email people about forthcoming staff development opportunities through the year
  • link to a web page which contains the aggregated results, help guides and a list of available local expertise

Course leader / HoD (within 1 week)

  • We will email the course leader the aggregated results, and arrange a follow-up to discuss staff development opportunities and where to go next

The following are some of my initial thoughts based on aggregate responses. So, what can we take from these results?

It was very pleasing to see most people have completed the “minimum standards” of LearnUCS use. Also, the majority (6 versus 3) prefer to mark online (even given the issues the team have had around support, hardware, and assessment types).

There is room for improvement around staff support and development:

  1. Only one respondent had accessed the online help which is included within all LearnUCS modules. We need to address this, as it was supposed to be help at the point of need.
  2. It was very pleasing to see 90% of respondents knew who to contact about enrolments. I would suggest we include a similar style of question about setting up online submission portals.
  3. We need to map our online guides to the questions / topics within the audit. For instance, 90% had not used the Task tool to provide guidance to students; therefore, we need to ensure there is a short getting-started FAQ which covers what the tool is, how it might be used to enhance teaching and learning, and how to use it. Topics need to include discovering copyright-free images, discussion boards, the quiz, blogs and wikis.

We need to provide bespoke sessions on

  • Classroom interaction using clickers (high demand for a session)
  • Effective uses of social media in teaching and learning (it was very interesting to see high responses in terms of use in personal life, including having captured and shared images / video; however, respondents are not using it in formal teaching and learning, and no-one is confident enough to support colleagues)

The barriers to adoption were strongly focused on time and hardware. The time aspect is something we need to work on closely with the course team. For instance, if the time is needed to design and implement, as opposed to becoming aware, then I’d suggest the use of development sprints with individuals or small groups on an actual course to complete large changes to the designs, create the materials, the quizzes, etc.

Finally, why do I suggest you take a similar approach?

I’d suggest this is a really useful process to commence with course teams and individuals. Importantly, it has a key role in starting and maintaining dialogues with individuals around their current digital capabilities, and in the development of SMART, personalised development plans. As with so many of these initiatives, the hardest step is the first. This is an excellent means of managing that first step (and acknowledging the “elephants in the corner of the room”).

I’d also recommend this initiative as a very good way of auditing your own online support materials (FAQs, guides and discussion papers). For instance, these questions are based upon our existing technologies and learning models within the institution; therefore, we should have accessible supporting online material to enable an individual to complete each task. This process has highlighted to me that we haven’t mapped support material to the questions: we started it, but have not completed it.

Once the mapping is in place (mapped to our Getting Started Guides, LearnUCS Materials, discussion papers, and vendor provided guides) we can roll out an online self-diagnostic.

Developing staff digital literacies for TEL: The Audit Questionnaire

This is Phase 3 of our emerging programme around developing digital capabilities in staff (see the earlier post for background).

This is a first iteration of the audit questionnaire. The following is an outcome of a discussion with the course leader around their current and future learning, teaching and assessment models. The intention is that this will be completed by staff during a course team meeting, so we have a window of about 15 minutes. After that, we’ll analyse the data, report (aggregated) results back to the course leader, and follow up with some individuals for examples of use.

The intention of the audit is to identify skill levels, and steer the design of the staff development model. The debrief with the course leader will identify areas for whole course team development sessions, and our individual support provision.

It will be really interesting to see how the audit questionnaire changes between different course teams or departments.

The Digital Literacy Capabilities Audit questions are:

1. Can you complete the following tasks in LearnUCS?

1.x.1: Who would you contact about enrolling students / staff / external examiners on your modules? – Module Leader / Course Administrator / Elevate Team / Other
1.x.2: Have you accessed the online help available in every LearnUCS course menu (LearnUCS Staff Support)? – Yes / No
1.x.3: Do you regularly use the Announcements tool to keep students updated on course news? – Yes / No
1.x.4: Do you regularly use the Task tool to provide guidance to students on what to complete by when? – Yes / No
1.x.5: Do you use date controls or adaptive release to control access to information? – Yes / No
1.x.6: Can you embed a static image into a content item? – Yes / No
1.x.7: Can you embed a YouTube video into a content item? – Yes / No
1.x.8: Do you know about copyright-free image banks / online resources? – Yes / No
1.x.9: Do you use advanced search in Google Images / YouTube to find copyright-cleared material? – Yes / No
1.x.10: Can you create a discussion board forum in LearnUCS? – Yes / No
1.x.11: Have you used discussion boards in your teaching (during the last 12 months)? – Yes / No
1.x.12: Would you like to use a discussion board in your teaching? – Yes / No
1.x.13: Have you used the LearnUCS inline grading tool to mark student work online (during the last 12 months)? – Yes / No
1.x.14: Have you printed student work and annotated / marked by hand (during the last 12 months)? – Yes / No
1.x.15: If you have marked work online, do you tend to attach a file in the feedback? – Yes / No
1.x.16: Would you prefer to mark by hand (print and annotate the script) rather than marking online (onscreen)? – Yes / No
1.x.17: Have you used the LearnUCS test (quiz) tool (during the last 12 months)? – Yes / No
1.x.18: If you have used the LearnUCS test (quiz) tool, is it for – Formative assessment / Summative assessment / Both
1.x.19: Have you created groups in LearnUCS (during the last 12 months)? – Yes / No
1.x.20: Have you used the Wiki tool in LearnUCS for group work (during the last 12 months)? – Yes / No
1.x.21: Have you required students to use the Blog tool in LearnUCS (during the last 12 months)? – Yes / No

1.1. Clickers
1.1.1: Do you use clickers in your face to face teaching sessions? – Yes / No
1.1.2: Have you used the TurningPoint software in PowerPoint to create clicker questions (during the last 12 months)? – Yes / No
1.1.3: Would you like to use a classroom voting tool to gather input from students in face to face teaching? – Yes / No

1.2. OMR (bubble sheets)
1.2.1: Have you worked with the Elevate Team to create a multiple choice exam using the bubble sheet (OMR) software? – Yes / No

1.3. Social Media in TL&A
1.3.1: Do you encourage students to use Twitter to discover and share information as part of your teaching? – Yes / No
1.3.2: Which of the following do you use in your work or personal life? – Twitter / Facebook / Google Docs & Slides / Pinterest / YouTube
1.3.3: In the last 12 months, have you taken a photo or video on your phone (tablet) and emailed / shared it with a friend? – Yes / No

2. Which of the following would you be happy to help colleagues (Course Team or Department) to use?

THESE ARE THE ONLY RESPONSES WHICH ARE ASSOCIATED WITH YOU IN THE REPORT

2.1: Using the clickers (creating questions, using in class) – Yes / No
2.2: Using the LearnUCS Discussion Board – Yes / No
2.3: Marking online using the LearnUCS inline grading tool – Yes / No
2.4: General use of LearnUCS (managing menus, creating folders, adding content, including multimedia) – Yes / No
2.5: Advice on using the OMR (bubble sheets) for exams – Yes / No

3. What other tools have you used?

4. What tools would you like to know about?

5. What do you consider are the three main factors which act as a barrier to your adoption of Technology Enhanced Learning?

Developing staff digital literacies for TEL: We need to go way beyond the audit

We are piloting a number of digital literacy audits for staff within Departments and course teams at UCS. The aims are to identify current skill and experience levels within the academic teams, map these to the emerging vision of technology enhanced learning within the department or course team, and operationalise these through the appropriate support and development packages.

The process involves teasing out a number of learning scenarios, based on the learning, teaching and assessment philosophies and the vision statements from the Senior Management Teams and/or Course Leaders. This will enable the creation of exemplars and getting-started guides, and inform the audit questionnaire. The analysis will inform the design of the required staff development programme.

As with most change programmes, it’s loosely based around Kotter’s 8 Steps, where the evaluation of impact is associated with Step 7 (build on the change) and Step 8 (anchor the change in corporate culture).

Reflecting on the likely effectiveness of the plan, the digital literacy audit and staff development programme are relatively easily deployed. However, they are unlikely to guarantee a longer term impact through a change in practice. This will require a cultural change within the staff group towards a learning organisation. The primary focus of a learning organisation is valuing, managing and enhancing the development of its individual employees in order to ensure its continuous transformation, enhanced innovation and competitiveness (Scarbrough et al., 1998).

The following evidence justifies the previous statement.

  • Drucker (1995:176) suggested that within an organisation, knowledge is often specialised to individuals. There therefore needs to be a process to turn potential into performance; otherwise most of the available knowledge will not become productive, and will remain as information.
  • Matley (2000:204) proposes that only fragments of the knowledge gained through individual (informal) learning are actually recorded or disseminated, either formally or informally, for corporate use.
  • Moilanen, Ostbye and Woll (2014) identified that the absorptive capacity (AC) of a team is a significant mediator in transferring external knowledge into higher innovation performance, where AC is measured by learning activities, level of educational attainment and knowledge management processes.
  • Lee, Tsai and Amjadi (2012:35) deployed a learning set approach with defined members, time-based activities, required direct participation and an evaluation element, to effectively create and share knowledge amongst workers.

The challenge for us is to facilitate the development of a learning organisation within academic teams whose members are seldom in the same location at the same time, and who tend to work in small teams or on their own.

To facilitate the creation and sharing of knowledge within a dispersed team (temporally rather than spatially), we will need to:

  1. publish a directory of expertise based on common TEL tasks which a person is confident teaching to others. This will be task focussed, and will allow team members to edit the directory as they gain more skills.
  2. improve the effectiveness and efficiency of online knowledge management (how-to guides, FAQs, useful links) and channels of communication.
  3. design the staff development sessions (design sprints) around specific, authentic tasks, within a problem-based scenario approach. The outcomes (how-to guides, etc.) will be included within the knowledge bank.
  4. identify a person who is responsible for maintaining and updating the online knowledge resource. A standing item will be added to all departmental learning and teaching themed meetings to discuss the development of the knowledge store, arrange future staff development sessions and help the broader team celebrate successes.

A challenge will be the software for the knowledge bank: it needs to be a very simple authoring tool, a low threshold technology that is easy and intuitive to author and contribute to, while also being integrated with single sign-on. If single sign-on weren’t an issue, a Google Community would be ideal.

This also needs to be built in as an ongoing commitment, where we use programme validations / approvals as the key opportunity for major TEL curriculum developments.

Overview of the process

Phase 1: Discussion with Course Leader or HoD
Phase 2: Create the audit questionnaire
Phase 3: Capture the data, 1-2-1s
Phase 4: Develop staff development programme, create guides, create directory of expertise, create blog (online knowledge management)
Phase 5: Implementation

Phase 1: Discussion with Course Leader or HoD

  1. Is there a vision for TEL within the learning, teaching and assessment model? If yes, what is it?
  2. What sort of learning and assessment tasks are students likely to need to complete across modules?
  3. Have you a minimum expectation in terms of the use of the VLE, e-Portfolios, Library Systems, Multimedia and Classroom technologies (face to face)?
  4. Where next?
    • Names of people?
    • Dates for data collection?
    • Who will provide the background information they will need?