Notes from JISC’s 5th Learning Analytics Network Meeting (UEL)

The following are a few notes I made while attending the JISC 5th Learning Analytics Network Meeting at UEL on 3rd February.

My primary reason for being there was to sit on a panel session reviewing the institutional readiness discovery assessments, which Blackboard have been commissioned to support (http://bit.ly/LADiscovery). My notes for the specific questions are below:

Can you outline briefly the process you’ve gone through with each institution?

The discovery exercise is focussed on the four “pillars of readiness”: culture; people; processes & terminology; and infrastructure.

The assessment is made against 20-plus evidence statements, which are mapped to a number of rubrics. The outcome is reported by pillar and by category, and each is categorised as Ready, Ready with recommendations, or Not Ready with recommendations.
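As a rough illustration of how that roll-up works (the statements, scores and thresholds below are hypothetical, not the actual Blackboard rubrics), the mapping from evidence statements to a readiness category per pillar might be sketched like this:

```python
# Hypothetical sketch: map scored evidence statements to pillars and roll them
# up into one of the three readiness categories. Statements, scores and
# thresholds are illustrative only, not the real rubrics.

PILLARS = ["culture", "people", "processes & terminology", "infrastructure"]

# Each evidence statement is scored 0-2 against a rubric and tagged with a pillar.
evidence = [
    {"statement": "Senior sponsorship for learning analytics", "pillar": "culture", "score": 2},
    {"statement": "Defined intervention workflows", "pillar": "processes & terminology", "score": 1},
    {"statement": "Central data warehouse in place", "pillar": "infrastructure", "score": 0},
    # ... 20+ statements in the real exercise
]

def categorise(avg_score: float) -> str:
    """Collapse an average rubric score into one of the three readiness outcomes."""
    if avg_score >= 1.5:
        return "Ready"
    if avg_score >= 0.75:
        return "Ready with recommendations"
    return "Not Ready with recommendations"

def readiness_by_pillar(evidence):
    result = {}
    for pillar in PILLARS:
        scores = [e["score"] for e in evidence if e["pillar"] == pillar]
        if scores:
            result[pillar] = categorise(sum(scores) / len(scores))
    return result

print(readiness_by_pillar(evidence))
```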

Data is collected through a number of different routes – pre-onsite (questionnaire, document review) and onsite (forum, interviews/open groups, workshops).

The second deliverable is an indicative feature set and a data improvement cycle, informed by the institution’s chosen topic / theme. The aim is to give an indication of what the indicators might be and to shape the pilot design.

How much similarity have you found across the institutions you’ve visited so far?

The context >> I was expecting to find only a few similarities across the five institutions, because they are different types of institution (Russell Group, post-1992) and the sample is very small.

However, I have found a number of similarities:

  • Strong vision / belief among senior managers in the importance of learning analytics for enhancing the student experience
  • Greater interest in enhanced reporting and dashboards than in predictive analytics
  • Common themes around progression & attainment
  • The agent (vehicle) of change is primarily the personal tutor mediating the data / intervention, not an automated, large-scale, unmediated student-facing dashboard

Overall what are the main areas which institutions need to tackle to move forward?

Given the small sample, it is very difficult not to generalise – I have tried to organise these by theme.

The focus is on the “big challenges” – I have selected three, and I view technology infrastructure as the easier one to overcome.

Culture >> need to widen and deepen the conversation with staff – what is learning analytics? what does it look like? how can it improve learning and teaching at the institution?

Process >> Exploring interventions for retention / attainment – how are you going to create frequent, actionable data? The key question is whether current curriculum development processes & staff development models will generate what is needed at programme level.

People >> capacity – once you look at the workflows for learning analytics with interventions, this becomes a very large, cross-institutional project. All of the institutions are already undergoing significant change programmes, so in the short term the issue is capacity to handle more change. In the longer term this links into culture and the need to reduce resistance to change across all stakeholders.

Notes from the day

JISC Update (Paul & Michael)

  • Student app will be available in the app stores (invite only) by April
  • Code of practice – need to tie this in, and listen to the podcasts (5 in total, released individually – the first next week) – http://analytics.jiscinvolve.org/wp/

Gary: UEL

What they are doing

  • Student engagement for personal tutors – developed spring 2015
  • Developing a student engagement metric – working up a model, building on the student engagement work for personal tutors – this will provide the student with an indicator of their own level of engagement (see the sketch after this list)
  • Developing a student-facing app – includes the student’s position in terms of engagement compared to peers, plus a visualisation of the distance to travel
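I can’t speak for the internals of UEL’s model, but a minimal sketch of the general idea – a weighted engagement score plus a percentile position against peers – might look like the following (indicator names and weights are entirely made up):

```python
# Hypothetical sketch of a student engagement indicator: a weighted sum of
# normalised activity indicators, then a percentile rank against the peer
# group. Indicator names and weights are illustrative, not UEL's model.

WEIGHTS = {"attendance": 0.5, "vle_logins": 0.3, "library_visits": 0.2}

def engagement_score(indicators: dict) -> float:
    """Weighted combination of indicators normalised to the 0-1 range."""
    return sum(WEIGHTS[k] * indicators.get(k, 0.0) for k in WEIGHTS)

def percentile_rank(score: float, cohort_scores: list) -> float:
    """Share of the peer group with a score at or below this student's."""
    below = sum(1 for s in cohort_scores if s <= score)
    return 100.0 * below / len(cohort_scores)

cohort = [engagement_score(s) for s in (
    {"attendance": 0.9, "vle_logins": 0.7, "library_visits": 0.4},
    {"attendance": 0.5, "vle_logins": 0.2, "library_visits": 0.1},
    {"attendance": 0.8, "vle_logins": 0.9, "library_visits": 0.6},
)]
me = engagement_score({"attendance": 0.7, "vle_logins": 0.6, "library_visits": 0.3})
print(f"score={me:.2f}, percentile={percentile_rank(me, cohort):.0f}")
```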

Their research

  • Evidence of a very strong correlation between attendance and average module marks (see the correlation sketch after this list)
  • Use the data / research to inform the metric and the weighting of the various indicators, i.e., use the institution’s own data to identify significant factors
  • Identifying the group of students for whom interventions will have the biggest effect – sounds similar to work by John Whitmer
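For anyone wanting to try the same sanity check on their own data, the correlation itself is a one-liner – the numbers below are fabricated purely for illustration:

```python
# Sketch of the kind of check UEL describe: correlating attendance with
# average module marks. The data below are made up for illustration.
from statistics import correlation  # Python 3.10+

attendance_pct = [95, 80, 60, 40, 88, 72, 55, 30]
avg_module_mark = [72, 65, 58, 45, 70, 62, 50, 40]

r = correlation(attendance_pct, avg_module_mark)
print(f"Pearson r = {r:.2f}")  # strongly positive on this toy data
```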

Micro Projects

Open University – Student workload tool. The problem: how to manage workload for online learners, especially its distribution across the course – an uneven workload distribution appears to be negatively correlated with student outcomes, so there is a need to ensure workload is spread evenly. They use an activity taxonomy to inform a web-based workload assessment tool. Road ahead … a hosted version and a download version, which should be available within the next month. Action :: follow up at end of March 2016 – this would be really useful to have in the back pocket for institutions when onsite and for programme development.
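A minimal sketch of how unevenness of workload might be quantified from an activity taxonomy (the activity types and hour estimates here are my own guesses, not the OU tool’s):

```python
# Hypothetical sketch: estimate weekly workload from an activity taxonomy and
# quantify how unevenly it is distributed (coefficient of variation).
# Activity types and hour estimates are illustrative only.
from statistics import mean, pstdev

HOURS_PER_ACTIVITY = {"reading": 2.0, "forum": 1.0, "assignment": 6.0, "video": 0.5}

# Planned activities per study week.
weeks = [
    ["reading", "video", "forum"],
    ["reading", "reading", "video"],
    ["assignment", "forum"],
    ["reading", "assignment", "assignment"],
]

weekly_hours = [sum(HOURS_PER_ACTIVITY[a] for a in wk) for wk in weeks]
cv = pstdev(weekly_hours) / mean(weekly_hours)  # 0 means perfectly even

print(weekly_hours, f"unevenness (CV) = {cv:.2f}")
```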

Edinburgh – cognitive presence coding – supporting traditional DE/OL. Lots of scary stuff around the coding of cognitive presence. It would allow easier identification and intervention within MOOCs and online courses. Potential down the line to provide visualisation and monitoring of engaged learners. Watch this space …
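I don’t know the details of Edinburgh’s method, but automated cognitive presence coding is often framed as text classification of discussion posts into the Community of Inquiry phases; a toy sketch of that framing (fabricated posts and labels, scikit-learn assumed):

```python
# Toy sketch of coding discussion posts for cognitive presence phases as a
# text classification problem. Posts and labels are fabricated examples;
# this is not Edinburgh's actual method.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

posts = [
    "I don't understand why the model fails here, can anyone explain?",
    "Maybe it's related to the sample size, I found a paper suggesting that.",
    "Putting the readings together, the failure comes from biased sampling.",
    "We tested the fix on new data and it resolved the problem.",
]
labels = ["triggering", "exploration", "integration", "resolution"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(posts, labels)

print(model.predict(["Has anyone else hit this error, and why does it happen?"]))
```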

University of Greenwich – survey data as an indicator of learning outcomes? They are trying to leverage more out of the data they already collect, following Sheffield Hallam’s work in this area. Greenwich post: http://analytics.jiscinvolve.org/wp/2016/01/30/guest-post-can-survey-data-be-used-as-an-indicator-of-learning-outcomes/
