How can you drive Learning Analytics within your Institution? #blackboardEMEA

The question “how can you drive Learning Analytics within your Institution?” was the one I set myself to answer when I presented at the recent Gregynog Colloquium. The linked slides illustrate how I tried to help the audience find an answer. The audience were IT Services staff from across various Welsh Higher Education Institutions.

The approach was to (i) make them aware of the cross-institutional nature of Learning Analytics, and that all stakeholders therefore have a role; (ii) touch on how we are all change agents; and (iii) provide some practical steps for promoting the change.

As you’ll see from the slide deck, I drew upon a methodology we use to develop our use cases, and encouraged them to reflect on how their role fits within these use cases. The final step was to raise awareness of the factors which determine institutional readiness.

Gregynog Blackboard Analytics June 16:

Achieving institutional adoption: ITTHE Session

The attached slide deck (see below) was used at a recent presentation (ITTHE 2016 Conference, Kadir Has University, Istanbul) focussing on achieving institution-wide adoption of Learning Technology.

The key message was that institution-wide adoption of learning technology has little to do with the technology itself, and a lot to do with managing institutional change.

The audience was predominantly lecturers (faculty members), with a sprinkling of senior managers and professional services staff. Given this audience, the intended aims were to raise awareness of Blackboard’s ebook – 6 characteristics to increase technology adoption – and to provide an insight into the alignment required between leadership, institutional commitment & investment, and faculty training models for institution-wide adoption. The session included a number of activities to give participants the opportunity to reflect on where their institution currently resides on an adoption matrix.

The slides (PDF) are available from: ITTHE Istanbul Adoption Jun 16 final PDF

A wider observation from the day was the significant interest in the effectiveness of flipped classroom learning models. The research questions which presenters were addressing included:

  • Has the flipped classroom increased students’ engagement with the course?
  • Do students actually engage with the materials before class? What motivates them?
  • Do students prefer the flipped classroom approach compared to the traditional lecture?
  • What are the least favourite elements of a flipped classroom?
  • What problems do students encounter during the flipped classroom?
  • What factors reduce student motivation in the flipped classroom?
  • How would students design a flipped classroom activity?

Audience Task: Grade Journey Presentation #bbtlc16

As you may be aware, Stephen Bryne and I are presenting on “Optimising the Online Feedback and Grades Journey Experience” on Wednesday at the Blackboard TLC in Groningen.

The advertised session aims are:

  • What is the Grade Journey?
  • How will it help you achieve your strategic aims?
  • What lessons can we learn when implementing Grade Journey?
  • How might Blackboard help?

However, the unadvertised session aim is to encourage those attending to write an elevator pitch for a DVC/PVC/Rector or Provost in response to …

A DVC asked the following question in a recent Education Committee Meeting:

Given all our pressing priorities, why should the institution invest the resource and effort to implement an enhanced grade journey?

Can you develop an elevator pitch to suggest … why?

You can write, video or record these wherever you like; just share them using the tags #bbtlc16 #gjelevator. For more info on elevator pitches, see


What is the grade journey? #bbtlc16

The intention of this post is to answer the question, what is the grade journey?

This is the second post in a series which are based on our presentation at Blackboard’s Teaching and Learning Conference in Groningen.

The short answer is that the Grades Journey enables an institution to manage the exchange of grade-related data between Blackboard Learn and any Student Information System more effectively and efficiently.

This definition shows that the Grades Journey sits within a broader process; it is a small part of a much wider solution. For instance, the Grades Journey would form part of a wider set of electronic management of assessment capabilities:

  • Ensure there is a method of submitting coursework electronically through a single interface.
  • Enable academics to mark work and to annotate comments on students’ electronic scripts.
  • Run electronically submitted coursework through plagiarism detection (SafeAssign), whether all of it, individual pieces or batches.
  • Enable rubrics & define metrics to assure the quality of marking.
  • Improve students’ understanding of their feedback and mark allocations, with a means to understand how they can improve their mark on the next piece of summative coursework.

Figure 1 illustrates the end-to-end process, from the initial creation and publication of the assessment task, driven by information within the Student Record System, to the publication of the final grade from the Student Record System.

Figure 1: Grade Journey as an end to end process


Those who are aware of the manually intensive processes for e-submission, e-marking and e-return will be familiar with them often being error-prone, time consuming and reliant on many manual steps to synchronise assessment data effectively. Against this backdrop, the Grades Journey synchronisation offers a number of key enhancements:

  • Automated grade column creation based on records within your Student Information System, with flexibility over which columns are displayed and how they are associated with gradable content items (assignments, tests, etc.).
  • Marks (grades) approval and release. There is flexibility in determining when you release grades and what your approval process is, so these can align with your institutional approval workflows, and you can deploy a disaggregated approval process through role settings for authorised users.
  • Automated grade exchange from Blackboard to your SIS, with the functionality to lock down the Grade Centre as read-only once the data has been synchronised (sketched below).
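
To make this concrete, below is a minimal Python sketch of the kind of column-creation and grade-return flow described above. The SIS and Learn client objects, their method names and the field names are all hypothetical illustrations, not Blackboard’s actual Grades Journey API.

```python
# Hypothetical sketch of the grade-synchronisation flow described above.
# The SIS/Learn client objects, method names and fields are illustrative
# assumptions, not Blackboard's actual Grades Journey API.
from dataclasses import dataclass

@dataclass
class AssessmentRecord:
    course_id: str      # course identifier shared by the SIS and Learn
    assessment_id: str  # SIS identifier for the assessment task
    title: str
    max_mark: int

def sync_grade_columns(sis, learn):
    """Create a grade column in Learn for each SIS assessment record."""
    for rec in sis.fetch_assessments():           # hypothetical SIS call
        learn.create_grade_column(                # hypothetical Learn call
            course_id=rec.course_id,
            name=rec.title,
            points_possible=rec.max_mark,
            external_id=rec.assessment_id,        # key for matching grades later
        )

def return_approved_grades(sis, learn, course_id):
    """Push approved grades back to the SIS, then lock the column."""
    for column in learn.approved_columns(course_id):
        sis.upload_grades(column.external_id, column.grades)
        learn.set_column_read_only(course_id, column.id)  # read-only after sync
```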

So, where might you start to think through how the Grades Journey applies within your institutional context? An important first step is to map out what you are currently doing, and who is responsible for the different parts of the end-to-end process. A good way to do this is to work through the following visualisation (from Sheffield Hallam University), identifying what each section means within your institutional context, and who is responsible.

Figure 2: Assessment Lifecycle (Sheffield Hallam University)



How does the Grade Journey help you achieve your strategic goals? #bbtlc16

I’m off to the Blackboard Teaching & Learning Conference (as you do), and I’m presenting on “Optimising the Online Feedback and Grades Journey Experience”. As part of the positioning piece, I thought it would be useful to share what I’ll be talking about through a Q&A. The first question is … how does the Grade Journey help you achieve your strategic goals?

If we define the Grade Journey as an end-to-end process for enhancing the efficiency and effectiveness of exchanging grade-related data between Blackboard Learn and a Student Information System, then the question must be: how does this granular, strategic design focus map back to the institution achieving its strategic aims?

This can be answered through a number of different perspectives.

Firstly, by placing the Grade Journey within the broader context of an institution moving from vision and strategy to implementation. This is illustrated in Figure 1, which shows a potential process by which the institutional strategy is achieved through a Grade Journey project.

Figure 1: Grade Journey (Strategic Design) within wider context


Assume an institutional assessment and feedback review identifies the following key requirement: “students will experience flexible and innovative approaches to learning and assessment informed by contemporary pedagogical practice”, with a priority being “to promote quality in all aspects of assessment and feedback”. Through a Strategic Review and Planning engagement, a number of short and medium term operational objectives are identified to deliver these strategic requirements.

One of these, in the short term, is implementing a Grade Journey project as part of a wider electronic management of assessment change programme.

From the institutional perspective a number of cost (time) savings are expected:

  • Realise cost savings through introduction of more efficient administrative workflow
  • Remove duplication of effort by using a single system for marking and submission of marks
  • Enable a consistent approach for all staff entering marks, and increase the opportunity for greater use of technology for feedback
  • Decrease the amount of time spent by academics on the administration of marks
  • Improve potential for errors to be identified and corrected before board processing by providing early and sustained visibility of provisional marks for both staff and students

Beyond these efficiency gains, there are likely to be a number of enhancement gains associated with the standardisation process, which will also deliver a number of wider strategic aims. These include:

  • Assessment criteria can be presented explicitly via an interactive rubric with component marks and feedback
  • Enriched feedback through margin notes, summative comments and rubrics and increased use of a wider range of assessment types
  • Provide academics with greater control over the assessment process through more visible monitoring
  • A diverse and wider range of assessment methods which can be embedded into the teaching and learning journey – including journals, wikis and automated testing
  • Feedback can be more easily provided in a variety of formats (comments box, attached document, annotation, audio, video, rubrics)
  • Feedback tools are embedded as part of the process, to allow more detailed feedback against assessment criteria

The next question is: in reality, have significant efficiency gains been realised? The JISC EMA Programme found that they have, identifying significant potential resource savings (a rough scaling of these figures is sketched after the examples):

  • At Queen’s University Belfast the School of English moved to e-submission and marking which saved 20 working days per year in administrative staff time (in a school with c.900 students).
  • At the University of Huddersfield a time and motion study showed that an administrator dealing with 1,620 students saved 137 hours per year or 3.7 weeks based on a 37 hour working week.
  • At the University of Dundee, time saved through e-submission and e-tutoring works out at around 20 minutes per assignment submitted. The centralised e-tutoring approach has also generated further efficiencies: tutor time spent responding to emails is 60 minutes per week, as opposed to 176 minutes on a similar programme that does not employ the system.
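
As a rough sense-check on scale, the sketch below applies the Dundee figure of around 20 minutes saved per assignment to an assumed cohort; the cohort size and assessment load are illustrative assumptions, not figures from the JISC case studies.

```python
# Back-of-the-envelope estimate of e-submission time savings. Only the
# 20-minutes-per-assignment figure comes from the JISC EMA case studies;
# the cohort size and assessment load are illustrative assumptions.
MINUTES_SAVED_PER_ASSIGNMENT = 20   # Dundee figure, from the list above
STUDENTS = 900                      # assumed cohort (cf. the QUB school)
ASSIGNMENTS_PER_STUDENT = 6         # assumed assessments per student per year
WORKING_WEEK_HOURS = 37             # as in the Huddersfield study

hours_saved = STUDENTS * ASSIGNMENTS_PER_STUDENT * MINUTES_SAVED_PER_ASSIGNMENT / 60
weeks_saved = hours_saved / WORKING_WEEK_HOURS
print(f"~{hours_saved:.0f} hours, or {weeks_saved:.1f} working weeks per year")
# -> ~1800 hours, or 48.6 working weeks per year
```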

The question to ask yourself is, what are your strategic goals around Assessment & Feedback, and how well do they align to the direct and indirect benefits associated with the Grade Journey?

The intention of a later post is to answer: what is the grade journey?

Notes from JISC’s 5th Learning Analytics Network Meeting (UEL)

The following are a few notes I made while attending JISC’s 5th Learning Analytics Network Meeting at UEL on 3rd February.

The primary reason for me to be there was to sit on a panel session reviewing the institutional readiness discovery assessments which Blackboard have been commissioned to support. My notes for specific questions are below:

Can you outline briefly the process you’ve gone through with each institution?

The discovery exercise is focussed on the four “pillars of readiness”: culture, people, processes & terminology, and infrastructure.

The assessment is made against 20-plus evidence statements, which are mapped to a number of rubrics. The outcome is reported by pillar and category, and each is categorised as Ready, Ready with recommendations, or Not Ready with recommendations.
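
For illustration, here is a minimal sketch of how scored evidence statements might roll up into a per-pillar verdict; the scores, scale and thresholds below are invented for illustration and are not the actual rubrics used in the assessments.

```python
# Illustrative roll-up of evidence-statement scores into per-pillar verdicts.
# The scores (0-3 scale) and thresholds are invented for illustration; they
# are not the actual rubrics used in the readiness assessments.
from statistics import mean

evidence_scores = {
    "culture":                 [3, 2, 3],
    "people":                  [2, 1, 2],
    "processes & terminology": [1, 1, 2],
    "infrastructure":          [3, 3, 2],
}

def verdict(scores):
    avg = mean(scores)
    if avg >= 2.5:
        return "Ready"
    if avg >= 1.5:
        return "Ready with recommendations"
    return "Not Ready with recommendations"

for pillar, scores in evidence_scores.items():
    print(f"{pillar}: {verdict(scores)}")
```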

Data is collected through a number of different routes – pre-onsite (questionnaire, document review) and onsite (forum, interviews/open groups, workshops).

The second deliverable is an indicative feature set and data improvement cycle, informed by the topic/theme. The aim is to give an indication of what the indicators might be, and of the pilot design.

How much similarity have you found across the institutions you’ve visited so far?

For context: I was expecting to find only a few similarities across the five institutions, given the different types of institutions involved (Russell Group, post-1992) and the very small sample.

However, I have found a number of similarities:

  • A strong vision of, and belief in, the importance of learning analytics among senior managers, centred on enhancing the student experience
  • Interest in enhanced reporting and dashboards rather than predictive analytics
  • Common themes around progression & attainment
  • The agent (vehicle) of change is primarily the personal tutor mediating the data / intervention, not an automatic or large-scale unmediated student-facing dashboard

Overall what are the main areas which institutions need to tackle to move forward?

Given the small sample, it is very difficult not to generalise; I have tried to organise these by theme.

The focus is on the “big challenges”; I selected three, as I view the technology infrastructure as easier to overcome.

Culture >> the conversation with staff needs to widen and deepen: what is learning analytics? What does it look like? How can it improve learning and teaching at the institution?

Process >> Exploring interventions for retention / attainment – how are you going to create frequent, actionable data? A key question is whether current curriculum development processes & staff development models will generate what is needed at the programme level.

People >> Capacity – once you map out the workflows for learning analytics with interventions, this becomes a very large, cross-institutional project. All the institutions are undergoing significant change programmes, so in the short term the issue is capacity to handle more change. In the longer term, this links into culture and the need to reduce resistance to change across all stakeholders.

Notes from the day

JISC Update (Paul & Michael)

  • The student app will be available in the app stores (invite only) by April
  • Code of practice: we need to tie this in, and listen to the podcasts (5 in total, rolled out individually; the first is next week)

Gary: UEL

What they are doing

  • Student engagement for personal tutors – developed spring 2015
  • Developing a student engagement metric – working up a model as an enhancement of the student engagement for personal tutors work; this will provide students with an indicator of their own level of engagement
  • Developing a student-facing app – includes a student’s position in terms of engagement compared to peers, plus a visualisation of the distance to travel (sketched below)
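
As a sketch of the “position compared to peers” idea, the snippet below computes a simple percentile rank over cohort engagement scores; the scores and the percentile approach are my assumptions, not UEL’s actual metric.

```python
# Sketch of a "position compared to peers" indicator as a percentile rank.
# The engagement scores and the approach are assumptions, not UEL's metric.
from bisect import bisect_left

def peer_percentile(student_score: float, cohort_scores: list[float]) -> float:
    """Share of the cohort scoring strictly below the student (0-100)."""
    ranked = sorted(cohort_scores)
    return 100 * bisect_left(ranked, student_score) / len(ranked)

cohort = [12, 30, 45, 45, 58, 60, 72, 80, 91, 95]  # synthetic scores
print(peer_percentile(60, cohort))  # 50.0 -> higher than half of peers
```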

Their research

  • Evidence of a very strong correlation between attendance and average module marks (see the sketch after this list)
  • Using the data / research to inform the metric and the importance of the various indicators, i.e., using their own data to identify significant factors
  • Identifying a group of students where interventions will have the biggest effect. Sounds similar to work from John Whitmer
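
For illustration, the attendance/marks analysis can be sketched in a few lines; the data below is synthetic, and only the technique (a Pearson correlation) is implied by the notes.

```python
# Synthetic illustration of correlating attendance with module marks.
# The data is invented; only the technique (Pearson r) reflects the notes.
from statistics import correlation  # Python 3.10+

attendance_pct = [55, 60, 70, 75, 80, 85, 90, 95]
module_mark    = [42, 48, 55, 58, 62, 66, 71, 74]

r = correlation(attendance_pct, module_mark)
print(f"Pearson r = {r:.2f}")  # close to 1.0 for this synthetic data
```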

Micro Projects

Open University – student workload tool. The problem: how to manage the workload for online learners, especially workload distribution; an uneven workload distribution appears to be negatively correlated with student outcomes, so the workload needs to be kept even. They use an activity taxonomy to inform a web-based workload assessment tool. Road ahead … a hosted version and a download version, which should be available within the next month. Action :: follow up at the end of March 2016 – this would be really useful to have in the back pocket for institutions when onsite and during programme development.
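
In the spirit of the OU tool, here is a hypothetical sketch of a taxonomy-driven workload estimate; the activity categories and time rates are assumptions, not the OU’s actual taxonomy.

```python
# Hypothetical taxonomy-driven workload estimate per study week. The
# categories and per-unit rates are assumptions, not the OU's taxonomy.
MINUTES_PER_UNIT = {
    "reading_words":  1 / 300,  # assume ~300 words read per minute
    "video_minutes":  1.0,
    "forum_posts":    15.0,
    "quiz_questions": 2.0,
}

def week_workload(activities: dict[str, float]) -> float:
    """Estimated study minutes for one week of activities."""
    return sum(MINUTES_PER_UNIT[kind] * amount for kind, amount in activities.items())

weeks = [
    {"reading_words": 6000, "video_minutes": 30},
    {"reading_words": 3000, "forum_posts": 2, "quiz_questions": 10},
]
print([week_workload(w) for w in weeks])  # [50.0, 60.0] -> flag uneven weeks
```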

Edinburgh – cognitive presence coding, supporting traditional DE/OL. Lots of scary stuff around the coding of cognitive presence. This would allow easier identification and intervention within MOOCs and online courses, with the potential down the line to provide visualisation and monitoring of engaged learners. Watch this space …

University of Greenwich – survey data as an indicator of learning outcomes? They are trying to leverage more out of the data they collect, following Sheffield Hallam’s work in this area. Greenwich stuff <<< >>>

Workshop Outputs: Blackboard User Group: Durham University #durbbu

The following is a transcript of the tasks we ran in the workshop at the Blackboard User Group hosted by Durham University (6th Jan). Malcom, please share ….

The background to the workshop tasks was institutional readiness for Learning Analytics. The tasks were a little rushed due to timing issues (not my fault :-)).

Thanks to the attendees for participating, and I’ll leave it to them to interpret the outcomes.

The aim of the first task was to think about the attributes, qualities and characteristics of an engaged learner, and then to explore the likely indicators institutions might use to measure engagement. The exercise fits into Strategic Review / Planning, as we seek institutional consensus around terminology and understanding.

>> Could you describe the qualities, characteristics and attributes of an engaged learner at your institution?

• Attends all types of timetabled sessions, takes part in assessment activities, performs to their capability, and makes progress.
• Enthusiasm, interest in their subject, and willingness to share in the whole experience
• An actively learning student (high attendance in teaching and learning), with regular contact, successful outcomes, and who is happy
• Accessing reading material (in the VLE course library etc.), watching lecture recordings, submitting work on time, completing formative assessment, engaging in tutorials, leading in group work & discussions, and joining clubs
• Attends sessions, uses available resources, is intrinsically motivated, participates in discussions and peer groups, gets desired results.
• Participates in sessions and activities, collaborates and communicates; asks questions, is inquisitive and uses feedback to improve work

>> Based on your description, how would you measure student engagement? What are the top five indicators of an engaged student?

Group 1

1. Attendance
2. Assessment – completion
3. Assessment – grades

Group 2

1. Completion of assessments
2. Extra curricular activities, including external commitments
3. Self declared data

Group 3

1. Attendance
2. Submissions of assignments
3. Quality of communication
4. Pass / fail of outcomes
5. Happiness
6. Support requested / received

Group 4

1. VLE data
2. Door access data
3. SIS data

Group 5

1. Attendance
2. Submission
3. Contribution to online and/or face to face sessions
4. Access to resources (library and online)
5. Evidence of critical thinking

Group 6

1. Library usage
2. Attendance
3. Grade history
4. Wifi tracking
5. Course triggers

Group 7

1. Attendance
2. Activity completion (formative assessment)
3. Activity on the course (discussion boards, forums)
4. Achieving desired grades

The “elevator pitch to a PVC within your institution on the benefits of Learning Analytics within teaching, learning and assessment” task was not so easy to analyse (or indeed to write up!!! – as I mentioned in the session, but would anyone listen to me). An emerging theme was to pitch the concept that we have lots of data, so what questions do you want answered?

One group did take a slightly different approach:

Healthy Body, Healthy Mind: use on-campus purchases of food and drink (including alcohol) and compare these to academic performance.