What impact will TEF have on TEL?

This is a cut & paste from my LinkedIn account where I tend to publish my posts. However, given the topic I’d be interested in gathering views and thoughts.

Question: What impact will UK’s Teaching Excellence Framework (TEF) have on Technology Enhanced Learning (TEL) in UK HEIs?

A colleague of mine asked if I could provide some input on this question, and I said, “I’ve a three hour train ride home, so I’ll put the thinking hat on”. The following is the outcome of that long, lonely train ride home. As many in this area will know, there is no simple answer to this, and there is no evidence to draw upon. Therefore, I’d love to read / hear what other people think the likely impact will be. Don’t hold back, get those comments flying ….

Key takeaway: It will have a positive impact on the adoption of TEL; however, the determinants of effective adoption of TEL within an institution are varied. In the initial period, TEF will highlight the need for more effective embedding of TEL, and I would strongly expect a number of institutions to use TEF as the driver to embed TEL more effectively.

The three core elements of TEF (NSS, Retention & Employability) are strongly influenced by the use of TEL when combined with improved monitoring and intervention strategies. The impact of TEF on TEL is going to be very mixed, and we’d expect many institutions to undergo little change. However, there will be a number where TEF will drive a re-think of the role of TEL, and TEL will become an enabler for improved TEF ratings.

TEF provides a rationale and urgency for an institution to re-visit the effective use of TEL within its wider innovative teaching and learning agenda. The effective embedding of TEL within learning models has the potential to become a differentiator between institutions, and to provide a mechanism by which an institution can improve its TEF rating. For instance, the simple scenario of delivering blended (flipped) learning to enhance large group, face to face teaching should improve NSS Assessment & Feedback ratings, and provide more real-time data to inform personal tutor and intervention strategies (a positive impact on retention and attainment), both of which will improve the core TEF components. In addition, a TEL focussed learning and teaching model will provide an easier mechanism for institutions to surface excellent practice to include within their Institutional statements.

As stated, there is likely to be a positive correlation between changing TEF performance and TEL adoption. However, institution-wide academic adoption of TEL is strongly influenced by a number of factors, including institutional culture, commitment and robust infrastructure, as well as academic and student support, development and engagement models. Therefore, for some institutions the likely impact on broad adoption of TEL will be low due to wider institutional barriers and challenges. The likelihood is that institutions will start to set clearer academic adoption KPIs (both in terms of level and type of adoption) to align with their emerging “teaching USP”, and be much more strategic in aligning their limited curriculum support and development resources (teams) to ensure maximum impact. Therefore, it is likely TEF will drive a re-think in how TEL is supported and developed within an institution.

The institutional requirements for TEL are also likely to change over time in response to TEF. For instance, an emerging institutional requirement is the need to improve reporting and monitoring, which will enable institutions to identify opportunities to improve TEF performance through more proactive interventions. This is likely to follow a similar pattern to the institutional response to the introduction of the National Student Survey. One consequence of introducing the NSS question set for final year students has been that most institutions now include the same questions within their internal surveys for pre-final year students and, depending on the outcomes, departments may introduce action planning to improve the experience of the student cohort as they progress through the final year. The expected outcome is improved NSS scores. This required institutions to implement more robust and standardised reporting and monitoring.

The size of the impact will depend strongly on Senior Managers’ ability to drive institutional change towards more data-informed decision making, in response to the need to align enhancement programmes, i.e., curriculum redesign or better TEL support. The need for data-informed decision making around TEF is likely to drive two related changes in TEL: firstly, consolidation on a core set of e-learning tools which will provide better quality data sets through more sophisticated data integrations; and secondly, improved reporting tools which can surface reports from those core e-learning tools.

Will the impact of TEF on TEL be large and quick? I would expect not initially, due to the resistance to change being relatively high in the sector given the nature of TEF’s current implementation and relative immaturity (only on phase 2). I am not sensing a significant level of dissatisfaction among staff with the current situation, nor a strong business case and vision from the sector. The expectation is this will change as TEF becomes more established, and the rewards for a strong TEF rating start to emerge.


How can you drive Learning Analytics within your Institution? #blackboardEMEA

The question “how can you drive Learning Analytics within your Institution” was the one I set myself to try to answer when I presented at the recent Gregynog Colloquium. The link to the slides illustrates how I tried to help the audience find an answer. The audience were IT Services staff from across various Welsh Higher Education Institutions.

The approach was to (i) make them aware of the cross-institutional nature of Learning Analytics, and therefore that all stakeholders have a role, (ii) touch on how we are all change agents, and (iii) provide some practical steps to promoting the change.

As you’ll see from the slide deck, I drew upon a methodology we use to develop our use cases, and encouraged them to reflect on how their role fits within these use cases. The final step was to raise awareness of the factors which determine institutional readiness.

Gregynog Blackboard Analytics June 16:

Achieving institutional adoption. ITTHE Session

The attached slide deck (see below) was used at a recent presentation (ITTHE 2016 Conference, Kadir Has University, Istanbul) focussing on achieving institution-wide adoption of learning technology.

The key message was that institution-wide adoption of learning technology has little to do with the technology itself, and a lot to do with managing institutional change.

The audience was predominantly lecturers (faculty members) with a sprinkling of Senior Managers and those in professional services. Given this audience, the intended aims were to ensure awareness of Blackboard’s ebook – 6 characteristics to increase technology adoption – and to provide an insight into the required alignment between leadership, institutional commitment & investment and faculty training models for institution-wide adoption. The session included a number of activities to allow participants the opportunity to reflect on where their institution currently resides on an adoption matrix.

The slides (PDF) are available from: ITTHE Istanbul Adoption Jun 16 final PDF

A wider observation from the day was significant interest around the effectiveness of flipped classroom learning models. A number of the research questions which presenters were addressing included:

  • Has the flipped classroom increased students’ engagement with the course?
  • Do students actually engage with the materials before class? What motivates them?
  • Do students prefer the flipped classroom approach compared to the traditional lecture?
  • What are the least favourite elements of a flipped classroom?
  • What problems do students encounter during the flipped classroom?
  • What factors reduce student motivation in the flipped classroom?
  • How would students design a flipped classroom activity?

Audience Task: Grade Journey Presentation #bbtlc16

As you may be aware, Stephen Bryne and I are presenting on “Optimising the Online Feedback and Grades Journey Experience” on Wednesday at the Blackboard TLC in Groningen.

The advertised session aims are:

  • What is the Grade Journey?
  • How will it help you achieve your strategic aims?
  • What lessons can we learn when implementing Grade Journey?
  • How might Blackboard help?

However, the unadvertised session aim is to encourage those attending to write an elevator pitch for a DVC/PVC/Rector or Provost in response to …

A DVC asked the following question in a recent Education Committee Meeting:

Given all our pressing priorities, why should the institution invest the resource and effort to implement an enhanced grade journey?

Can you develop an elevator pitch to suggest … why?

You can write, video or record these wherever you like – just share them using the tags: #bbtlc16 #gjelevator – for more info on elevator pitches, see


What is the grade journey? #bbtlc16

The intention of this post is to answer the question, what is the grade journey?

This is the second post in a series which are based on our presentation at Blackboard’s Teaching and Learning Conference in Groningen.

The short answer would be: the Grades Journey enables an institution to manage the exchange of grade-related data between Blackboard Learn and any Student Information System more effectively and efficiently.

This definition illustrates that the Grades Journey sits within a broader process, where it is a small part of a much wider solution. For instance, the grade journey would be part of a wider set of electronic management of assessment solutions:

  • Ensure there is a method of submitting coursework electronically through a single interface.
  • Enable academics to mark work and to annotate comments on students’ electronic scripts.
  • Run electronically submitted coursework through plagiarism detection (SafeAssign), whether all coursework, individual pieces or batches.
  • Enable rubrics and define metrics to assure the quality of marking.
  • Improve students’ understanding of their feedback and mark allocations, with a means to understand how they can improve their mark for the next piece of summative coursework.

Figure 1 illustrates the end to end process, from the initial creation and publication of the assessment task driven by information within the Student Record System, to the publication of the final grade from the Student Record System.

Figure 1: Grade Journey as an end to end process


Those who are aware of the manually intensive processes for e-submission, e-marking and e-return will be familiar with them often being error-prone, time consuming and involving many manual integrations to enable effective synchronisation of assessment data. Against this backdrop, the grade journey synchronisation offers a number of key enhancements:

  • Automated grade column creation based on records within your Student Information System. However, there is flexibility in terms of which columns are displayed, and which are associated with gradable content items (assignments, tests etc.).
  • Marks (grades) approval and release. There is flexibility in determining when you release grades and your approval process. These will align with your institutional approval workflows. There is flexibility to enable you to deploy a disaggregated approval process through role settings for authorized users.
  • Automated grade exchange from Blackboard to your SIS, with the functionality to allow you to lock down the grade centre as read-only once the data has been synchronised.
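The flow described in these three enhancements can be sketched in code. This is a minimal, hypothetical illustration only – none of the class, function or role names below come from the actual Blackboard Learn or SIS APIs; a real integration would use the vendors’ own interfaces.

```python
# Hypothetical sketch of the grade journey synchronisation described above:
# column creation from SIS records, role-gated approval/release, and
# export back to the SIS followed by locking the column read-only.
from dataclasses import dataclass, field

@dataclass
class GradeColumn:
    assessment_id: str      # key from the Student Information System record
    name: str
    released: bool = False  # grades hidden from students until approved
    locked: bool = False    # set read-only after sync back to the SIS
    grades: dict = field(default_factory=dict)  # student_id -> mark

def create_columns(sis_assessments):
    """Automated column creation driven by SIS assessment records."""
    return {a["id"]: GradeColumn(a["id"], a["title"]) for a in sis_assessments}

def approve_and_release(column, approver_role):
    """Release grades only via the institution's (role-based) approval workflow."""
    if approver_role != "exam_board":  # illustrative disaggregated approval step
        raise PermissionError("not authorised to release grades")
    column.released = True

def export_to_sis(column):
    """Build the payload to push back to the SIS, then lock the column."""
    payload = [{"assessment": column.assessment_id, "student": s, "mark": m}
               for s, m in column.grades.items()]
    column.locked = True  # grade centre becomes read-only after synchronisation
    return payload
```

The point of the sketch is the ordering: columns are created from SIS data, marks are only released through an approval step, and the column is locked once the data has gone back to the SIS.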

So, where might you start to think through how the grade journey applies within your institutional context? An important first step is to map out what you are currently doing, and who is responsible for the different parts of the grade journey end to end process. A good place to start is to work through the following visualisation (from Sheffield Hallam University), identifying what each section means within your institutional context, and who is responsible.

Figure 2: Assessment Lifecycle (Sheffield Hallam University)


Source: http://repository.jisc.ac.uk/6287/1/SHU_Assessment_Journey_Programme_Overall_Lifecycle.pdf

How does the Grade Journey help you achieve your strategic goals? #bbtlc16

I’m off to the Blackboard Teaching & Learning Conference (as you do), and I’m presenting on “optimizing the online feedback and grades journey experience”. As part of the positioning piece I thought it would be useful to share what I’ll be talking about around a Q&A. The first question is … how does the grade journey help you achieve your strategic goals?

If we define the Grade Journey as an end to end process for enhancing the efficiency and effectiveness of exchanging grade-related data between Blackboard Learn and an institution’s Student Information System, then the question must be: how does this granular, strategic design focus map back to the institution achieving its strategic aims?

This can be answered through a number of different perspectives.

Firstly, by placing the Grade Journey within the broader context of an institution moving from vision and strategy to implementation. This is illustrated in Figure 1, which shows a potential process by which the institutional strategy is achieved through a Grade Journey project.

Figure 1: Grade Journey (Strategic Design) within wider context


Assume an institutional assessment and feedback review identifies the following key requirement: “students will experience flexible and innovative approaches to learning and assessment informed by contemporary pedagogical practice”, with a priority being “to promote quality in all aspects of assessment and feedback”. Through a Strategic Review and Planning engagement, a number of short and medium term operational objectives are identified to deliver these strategic requirements.

One of these, in the short term, is implementing a Grade Journey project as part of a wider electronic management of assessment change programme.

From the institutional perspective, a number of cost (time) savings are expected:

  • Realise cost savings through introduction of more efficient administrative workflow
  • Remove duplication of effort by using a single system for marking and submission of marks
  • Enable a consistent approach for all staff entering marks, and increase the opportunity for greater use of technology for feedback
  • Decrease the amount of time spent by academics on the administration of marks
  • Improve potential for errors to be identified and corrected before board processing by providing early and sustained visibility of provisional marks for both staff and students

Outside of these efficiency gains, there are likely to be a number of enhancement gains associated with the standardisation process, which will also deliver a number of wider strategic aims. These include:

  • Assessment criteria can be presented explicitly via an interactive rubric with component marks and feedback
  • Enriched feedback through margin notes, summative comments and rubrics and increased use of a wider range of assessment types
  • Provide academics with greater control over the assessment process through more visible monitoring
  • Diverse and wider range of assessment methods which can be embedded into the teaching and learning journey – include journals, wikis and automated testing
  • Feedback can be more easily provided in a variety of formats (comments box, attached document, annotation, audio, video, rubrics)
  • Feedback tools are embedded as part of the process, to allow more detailed feedback against assessment criteria

The next question is: in reality, have significant efficiency gains been realised? The JISC EMA Programme identified that they have, with significant potential resource savings (https://www.jisc.ac.uk/guides/electronic-assessment-management).

  • At Queen’s University Belfast the School of English moved to e-submission and marking which saved 20 working days per year in administrative staff time (in a school with c.900 students).
  • At the University of Huddersfield a time and motion study showed that an administrator dealing with 1,620 students saved 137 hours per year or 3.7 weeks based on a 37 hour working week.
  • At the University of Dundee time saved through e-submission and e-tutoring works out at around 20 mins per assignment submitted. The centralised e-tutoring approach at the University of Dundee has also generated further efficiencies in that tutor time spent responding to emails is 60 minutes per week as opposed to 176 minutes on a similar programme that does not employ the system.

The question to ask yourself is, what are your strategic goals around Assessment & Feedback, and how well do they align to the direct and indirect benefits associated with the Grade Journey?

The intention of a later post is to answer: what is the grade journey?

Notes from JISC’s 5th Learning Analytics Network Meeting (UEL)

The following are a few notes I made when attending the JISC 5th Learning Analytics Network Meeting at UEL on the 3rd Feb.

The primary reason for me to be there was to sit on a panel session on reviewing the institutional readiness discovery assessments which Blackboard have been commissioned to support (http://bit.ly/LADiscovery). My notes for specific questions are below;

Can you outline briefly the process you’ve gone through with each institution?

The discovery exercise is focussed on the four “pillars of readiness”: culture, people, processes and technology infrastructure.

The assessment is made against 20-plus evidence statements which are mapped to a number of rubrics. The outcome is reported by pillar and category. We categorise as Ready; Ready with recommendations; or Not Ready with recommendations.

Data is collected through a number of different routes – pre-onsite (questionnaire, document review) and onsite (forum, interviews/open groups, workshops).

The second deliverable is an indicative feature set and data improvement cycle. This is informed by the topic / theme. The aim is to give an indication of what the indicators might be, and the pilot design.
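As a rough illustration of how a rubric-based readiness assessment of this shape might be scored, the sketch below averages evidence-statement scores per pillar and maps the result to the three categories. The scoring scale and thresholds are invented for illustration; the actual discovery rubrics are not reproduced in this post.

```python
# Illustrative readiness scoring sketch. Thresholds and the 0-1 scale are
# assumptions, not the real rubric; only the pillar and category names
# come from the post.
PILLARS = ("culture", "people", "processes", "technology infrastructure")

def categorise(score):
    """Map a pillar's average rubric score (0-1) to a readiness category."""
    if score >= 0.8:
        return "Ready"
    if score >= 0.5:
        return "Ready with recommendations"
    return "Not Ready with recommendations"

def assess(evidence):
    """evidence: {pillar: [rubric scores for its evidence statements]}."""
    return {pillar: categorise(sum(scores) / len(scores))
            for pillar, scores in evidence.items() if scores}
```

The useful property of this shape is that the output is per pillar, so an institution can be "Ready" on infrastructure while still "Not Ready" on culture.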

How much similarity have you found across the institutions you’ve visited so far?

The context: I was expecting there to be only a few similarities across the five institutions, because of the different types of institution (Russell Group, post-1992), plus a very small sample.

However, I have found a number of similarities:

  • Strong vision / belief in the importance of learning analytics from senior managers around enhancing the student experience
  • Interest in enhanced reporting and dashboards, compared to predictive analytics
  • Common themes around progression & attainment
  • The agent (vehicle) of change is primarily the personal tutor mediating the data / intervention, not automatic or large-scale unmediated student-facing dashboards

Overall what are the main areas which institutions need to tackle to move forward?

Given the small sample, it is very difficult not to generalise – I have tried to organise these by theme.

The focus is on “big challenges”; I selected three, as I view technology infrastructure as easier to overcome.

Culture >> need to widen and deepen the conversation with staff – what is learning analytics? what does it look like? how can it improve learning and teaching at the institution?

Process >> Exploring interventions for retention / attainment – how are you going to create frequent, actionable data? A key question is: will current curriculum development processes & staff development models generate what is needed at the programme level?

People >> capacity – looking at workflows for LA with interventions, this makes for a very large, cross-institutional project. All institutions are undergoing significant change programmes, so in the short term the issue is capacity to handle more change. In the longer term, this links into culture and the need to reduce resistance to change across all stakeholders.

Notes from the day

JISC Update (Paul & Michael)

  • Student app will be available in App stores – invite only by April
  • Code of practice, need to tie this in, and listen to the podcasts (5 in total, rolled out individually – first next week) – http://analytics.jiscinvolve.org/wp/

Gary: UEL

What are they are doing

  • Student engagement for personal tutors – developed spring 2015
  • Developing a student engagement metric – working up a model – enhancement on the student engagement for personal tutors – this will provide the student with an indicator of their own level of engagement
  • Developing a student facing app – includes position in terms of student engagement compared to peers, also visualization of distance to travel

Their research

  • Evidence of a very strong correlation between attendance and average module marks.
  • Use the data / research to inform the metric / importance of the various indicators, i.e., use their own data to identify significant factors
  • Identifying a group of students where interventions will have the biggest effect. Sounds similar to work from John Whitmer
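For anyone wanting to replicate this style of analysis on their own attendance and marks data, a Pearson correlation is straightforward to compute with the standard library alone. The data below is fabricated purely for illustration; it is not UEL’s data.

```python
# Sketch of the kind of analysis described above: correlating attendance
# with average module marks. The sample data is invented.
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    varx = sum((x - mx) ** 2 for x in xs)
    vary = sum((y - my) ** 2 for y in ys)
    return cov / (varx * vary) ** 0.5

attendance = [95, 80, 60, 40, 20]   # % of sessions attended (fabricated)
avg_mark   = [72, 65, 55, 48, 35]   # average module mark (fabricated)
r = pearson(attendance, avg_mark)   # close to 1 for this fabricated data
```

Of course, a strong correlation on its own does not establish that attendance causes better marks; as the post notes, institutions use their own data to weight the indicators in an engagement metric.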

Micro Projects

Open University – Student workload tool. The problem: how to manage the workload for online learners, especially workload distribution. Observed workload distribution seems to be negatively correlated with student outcomes. So need to ensure workload is even. Use activity taxonomy to inform a web based workload assessment tool. Road ahead … hosted version, and download version. Should be available within the next month – Action :: follow up at end of March 2016 – this would be really useful to have in the back pocket for institutions when onsite and program development

Edinburgh – cognitive presence coding, supporting traditional DE/OL. Lots of scary stuff around coding of cognitive presence. It would allow easier identification and intervention within MOOCs and online courses. Potential down the line to provide visualisation and monitoring of engaged learners. Watch this space …

University of Greenwich – Survey data as an indicator of learning outcomes? They are trying to leverage more out of the data they collect, following Sheffield Hallam’s work in this area. Greenwich’s post: http://analytics.jiscinvolve.org/wp/2016/01/30/guest-post-can-survey-data-be-used-as-an-indicator-of-learning-outcomes/