Visualisation of Learning Analytics V2

Having talked it through with a few people, and tried to use it as a discussion aid at a recent event, I’ve tweaked my visualisation of learning analytics slightly, and included the new version below.

The tweaks came about because I wanted to illustrate not just the technology infrastructure an institution is likely to need, but also the people likely to be involved in the process, with an indication of their likely skill sets.

It illustrates just how much learning analytics touches the majority of the institution. It also allows us to look at the visualisation and ask whether we recognise these roles within our own institutions.

Note – this uses only a subset of potential data sources and feedback loops (to senior managers, etc.).

People & Technology Infrastructure – Generic

What might the Learning Analytics process look like from my perspective?

The starting point for this post is firstly to admit I’m not technical, and secondly to admit I’m no expert on the nuances of Learning Analytics. However, I am asked about it quite regularly, and I do try to explain what is required from various perspectives. The question is, do I help? I’m not as effective as I’d like to be 🙂 I feel I need a graphical means of enabling the people around the table to have a meaningful conversation.

The following is an attempt to visualise conversations I’ve had with colleagues, for people who aren’t immersed in learning analytics. I’d love feedback …

The context of use is to start a conversation around a feature set an institution might use as part of a pilot study. Therefore, I’d talk through a general use case, and start to identify the likely indicators used to gauge the degree of student engagement. For instance, in a previous post on this blog, I’ve discussed a use of learning analytics to nudge curriculum design to generate timely, actionable data. A use case might be around retention and the need for enhanced reporting on educational data to action appropriate intervention strategies. The proxies for the level of student engagement could be a combination of educational data: LMS/VLE (formative assessments – quizzes, discussion boards, and login data), SIS (summative assessment record), lecture attendance, and engagement with classroom technologies (clickers).

An indicative feature set for an institution to enable this is visualised below.

Indicative Feature Set – Overview


  • Extract, Transform and Load (ETL) refers to a process in database usage, and especially in data warehousing, that extracts data from homogeneous or heterogeneous data sources, transforms it into the proper format or structure for querying and analysis, and loads it into the target store.
  • OLAP is an acronym for Online Analytical Processing. OLAP performs multidimensional analysis of business data and provides the capability for complex calculations, trend analysis, and sophisticated data modeling.
  • SIS is a student information system.
  • LMS / VLE is a learning management system / virtual learning environment.
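For non-technical colleagues, the ETL step is often easiest to grasp as a tiny worked example. The following sketch uses entirely made-up student records and field names, and an in-memory SQLite table standing in for the warehouse; it is an illustration of the three ETL stages, not a real institutional pipeline.

```python
import sqlite3

# Extract: raw rows as they might arrive from a (hypothetical) LMS export,
# with everything as text.
raw_events = [
    {"student": "s001", "logins": "14", "quiz_avg": "62.5"},
    {"student": "s002", "logins": "3",  "quiz_avg": "41.0"},
]

# Transform: cast text fields into proper types for querying and analysis.
rows = [(r["student"], int(r["logins"]), float(r["quiz_avg"])) for r in raw_events]

# Load: store in a queryable structure (here an in-memory SQLite table).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE engagement (student TEXT, logins INTEGER, quiz_avg REAL)")
conn.executemany("INSERT INTO engagement VALUES (?, ?, ?)", rows)

# Once loaded, simple engagement queries become possible,
# e.g. students with low login counts.
for row in conn.execute("SELECT student, logins FROM engagement WHERE logins < 5"):
    print(row)
```

In a real institution the extract would come from LMS/VLE and SIS exports, and the load target would be a proper warehouse feeding the OLAP and reporting layers.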

I’d suggest a visualisation approach is a really useful way for lay people to move the conversation between the bigger picture of interlinked determinants – culture, people and processes – and the required technical infrastructure.

It also provides a framework to drive out the questions which would need answering within a learning analytics pilot. This is done through exploring the big picture, and how the individual components function. For instance:

  1. What does Learning Analytics mean within your context?
  2. What are the specific questions you’d like answering?
  3. What are the data points and indicators you need to answer these questions?
  4. Where does that data reside? Where and how would it be best stored?
  5. Who can access the data? What is the reporting process?
  6. Is your current curriculum development model generating consistent, actionable learning data?
  7. What are the most appropriate intervention strategies and work flows?
  8. What are the ethical considerations around this use of Learning Analytics?
  9. What changes, if any, do you need to consider in policy and procedures for this use of Learning Analytics?
  10. What is the required evaluation framework and enhancement cycle for Learning Analytics?

Plus many, many more.  So, what should I add to this to make it work for you?


Competency based learning: have you any examples outside health studies?

I’ll be off on my travels soon to present at a Teaching and Learning Forum. The topic they’d like me to focus on is competency based learning. The audience is very diverse: senior managers, academics, support teams (curriculum design), and sys admins. So I’m thinking the following should resonate in part with all of them (40 minutes listening to me ranting about a topic close to my heart, but not theirs, might be a tall order for some in the audience).

However, what I do need are some research findings and case studies which are not from medical / health discipline areas. For instance, is competency based learning being used to teach undergraduate social scientists?

So if you know of any materials, links please ping them to me – thanks 🙂

The Abstract:

Competency based learning has a rich history in medical and health education. The key characteristic is student progression based on demonstrating proficiency and/or mastery of specific skills or abilities, as measured through assessments. This contrasts with the more traditional fixed-time learning design (a semester or year). Competency based learning advocates suggest student engagement is higher through the creation of personalised learning pathways tailored to each student’s unique needs.

Recent innovations in Technology Enhanced Learning have reduced the barriers to designing, implementing and monitoring a competency based learning model. For instance, the ease with which Goals and Achievements are set within Blackboard Learn, and enhanced reporting through Learning Analytics. Therefore, a number of perceived barriers have disappeared, which enables us to re-visit the possibilities of this learning model.

The aim of this presentation is to discuss the potential of adopting competency based learning within the context of the individual faculty member, and the wider institution. This will be achieved through answering five questions:

1. What are the characteristics of Competency Based Education (Learning)?

2. In what educational contexts has it been used effectively?

3. How might Competency Based Learning transfer to other disciplines?

4. From a Faculty perspective, what do the technology enhanced learning activities look like? What is good practice around how we design these activities?

5. From a Senior Manager’s perspective, how might we scale up Competency Based Education across the institution?



Using Learning Analytics to “nudge” students to take more ownership over their learning

Becoming an independent or self regulated learner is an important part of becoming a more effective learner. This type of learner is “distinguished by their responsiveness to feedback regarding the effectiveness of their learning, and by their self perceptions of academic accomplishment” (Zimmerman, 1990:14). It requires not only the learner taking responsibility for their weaknesses, but also Faculty giving them the opportunity to identify, correct and improve upon these weaknesses (Fritz, 2013).

The following discussion has two aims. Firstly, to outline how a Faculty member can redesign their learning activities around tools within the virtual learning environment to provide richer feedback and trigger dialogue amongst students, with the intention of nudging them to become more effective self regulated learners. Secondly, to outline the case for Senior Managers to drive this redesign process to enable the institution to take better advantage of learning analytics and data driven decision making.

Davenport et al. (2000) define learning analytics as “the application of analytic techniques to analyze educational data, including data about the learner and teacher activities, to identify patterns of behaviors and provide actionable information to improve learning and learning related activities”. An important aspect of this definition is the concept of the data being “actionable” by the learner, teacher or another stakeholder group.

To gather data and take action requires the learning model to move away from the orthodox approach of a few high-stakes summative assessments (typically one essay and an unseen exam) towards one which provides more frequent feedback opportunities and learning loops for the student, but remains sustainable and scalable for the Faculty member. The problem with the orthodox design is that it cannot easily develop self regulated learners because there is too little reliable and actionable data.

Therefore, if frequent actionable data is required, the re-designed approach should include using the VLE quiz engine, the submission of short online writing tasks (with defined marking criteria) and potentially classroom voting technologies. In this case the learning activity will generate more feedback and enhanced reporting for the individual learner. For instance, if you deployed a number of five-question online tests to be completed by all students, this would generate significant amounts of actionable data without significant work. Also, by using a variety of different question types you’ll easily be able to tease out higher order learning skills. Question types might include Likert scale (to what extent do you agree …) and short free text responses (in no more than 150 words). The individual can access their score and feedback online, and compare themselves to the average grade. The Faculty member can dedicate a proportion of the next face to face teaching session to providing additional feedback on the questions. This approach aligns with all seven principles of good feedback practice (Nicol & Macfarlane-Dick, 2006).
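The kind of enhanced reporting a five-question quiz could feed is easy to sketch. The names, scores and follow-up threshold below are all illustrative, not taken from any real VLE export.

```python
from statistics import mean

# Illustrative scores from a five-question online test (marks out of 5).
scores = {"amy": 4, "ben": 2, "cara": 5, "dan": 3, "ela": 1}

class_avg = mean(scores.values())

# Each learner sees their own score relative to the class average;
# the tutor sees who might need a nudge before the next face-to-face session.
for student, score in sorted(scores.items()):
    position = "above" if score > class_avg else "at or below"
    flag = "  <- follow up" if score <= 2 else ""
    print(f"{student}: {score}/5 ({position} the average of {class_avg:.1f}){flag}")
```

Even this trivial report gives the learner a comparison point and the tutor an intervention list, which is the “actionable” part of the definition above.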

So, why should Senior Managers encourage Faculty to enhance their technology based learning designs? The simple answer is, from the institutional perspective, the use of the VLE to generate and share data underpins the effective adoption of learning analytics. These re-designs by individual Faculty will act “as a series of small [steps] designed to gain experience and make the case that data based decisions have enhanced value” (Bichsel, 2012:26). For this to work, the institution needs to be aware that Learning Analytics does not require perfect or near perfect data, and that the institution needs to walk before it can run. Therefore, a number of small scale pilots around enhanced reporting and effective intervention strategies will enable the institution to improve its readiness for Learning Analytics. The pilots would instigate conversations around institutional culture, people and processes, as well as technology infrastructure. This will add more value than exploring correlations in large data sets between VLE access data (as a proxy for student engagement) and final grade performance, as often discussed in the literature.

Where next for you? If you are Faculty, a good starting point would be to explore the functionality and possibilities of the online quiz tool and grade center. If you are a Senior Manager, a good starting point would be to explore effective academic adoption models within the context of your institution.


Bichsel, J. (2012) ECAR Study of Analytics in Higher Education, available at: (Accessed October 2015)

Fritz, J. (2013) Using Analytics at UMBC, EDUCAUSE Center for Applied Research, Research Bulletin, available at

Nicol, D. and Macfarlane-Dick, D. (2006) ‘Formative assessment and self-regulated learning: A model and seven principles of good feedback practice’, Studies in Higher Education, 31(2), 199-218

Nussbaumer, A., Hillemann, E-C., Gutl, C. and Albert, D. (2015) ‘Competence-based Service for Supporting Self-Regulated Learning in Virtual Environments’, Journal of Learning Analytics, 2(1), 101-133

Zimmerman, B. (1990) ‘Self-Regulated Learning and Academic Achievement: An Overview’, Educational Psychologist, 25(1), 3-17

Why read “Using Analytics at UMBC”?

I really enjoyed reading the EDUCAUSE Research Bulletin, Using Analytics at UMBC (Fritz, 2013), and wondered how I might get others to read it. I also thought, umm … should I really be enjoying reading an article on Learning Analytics (!!!)

Anyway, the question is, if I’ve read it (a person looking for a rationale to use Learning Analytics, who has worked a long time in UK Higher Education, and marked far too many student assignments), how might I encourage you to devote twenty precious minutes to reading it?

I’d suggest that, from a quality assurance perspective, it mirrors many discussions in the literature. So, what is being suggested sits within emerging good practice.

However, more importantly, I really enjoyed the narrative, which gives a sense of discovery and a cycle of improvement as they moved from relatively unsophisticated approaches to using analytics in ways they’d not originally thought about. In fact, when they started they didn’t have a perfect analysis or even a prediction they could test. However, the unfolding story creates an obvious set of phases or stages, which encourages me to transfer this model to the institutions where I’ve worked.

If you are not sure what to predict, a good starting point would be to see if you can replicate their finding of a positive correlation between LMS (VLE) access stats (clicks, time online, etc.) and end of semester / year grades.

The above pilot would create the tangible conversations and outcomes you’d need to have within your institution. For instance, UMBC applied enhanced reporting (a dashboard) and self-directed intervention strategies to “nudge” students to take ownership over their learning. This raised questions around ethics, the processes and policies around interventions, and the need to improve staff and curriculum development to ensure there is accurate and timely data to action.

I would also encourage you to read it as they used Learning Analytics to discover examples of effective practice, which had long term impacts on student learning and performance in other modules. I often reflected that my knowledge network around which people were using innovative TEL within the institution was very much like an iceberg. The way knowledge was shared, I saw a fraction of what was going on in those hundreds of silos. Lots of effective and innovative practice was going on under the waterline. However, at UMBC they used Learning Analytics to open up the silos and expose the part of the iceberg under the waterline. The outcome is the possibility of improving TEL practice as we more effectively share our understanding of the educational effectiveness of TEL activities. If that isn’t a great (and appropriate) use of learning analytics (and a key reason to read the article), I’m not sure what is.

I hope you enjoy the article as much as I did ….



MoodleMoot Italy: Learning analytics is only as good as the learning design ….

Thanks to those who commented.

I just thought I’d share the final presentation and the PDF for the proceedings.

I also wrote a short context blog post from which I created the presentation. This might help people fill the gaps between the slides.


Sanity check: a presentation idea for a moodlemoot – would it work for you?

I have been set one of those nice challenges (or opportunities): to present at a forthcoming MoodleMoot on a trend in UK HEIs, which needs to be pitched at the strategic / institutional level, and I’ve got 20 mins (including questions). However, I appreciate the room is likely to be full of academics and teachers – the wrong audience for a typical strategic pitch. The following outlines what I’m going to focus on; any thoughts would be greatly appreciated. Role play: you are an academic – would this work for you?

I’m going to discuss some emerging trends around learning analytics.

At the global scale, learning analytics is a rapidly growing area of interest. As a term it covers a range of concepts and applications; consequently, there is no single definition. “Analytics is the use of data, statistical analysis, and explanatory and predictive models to gain insights and act on complex issues.” (Bichsel, 2012:6). Cooper (2012:3) describes analytics as “the process of developing actionable insights through problem definition and the application of statistical models and analysis against existing and/or simulated future data”. The goal is to “build better pedagogies, empower students to take an active part in their learning, target at risk student populations, and assess factors affecting completion and student success” (Johnson et al., 2015:12).

The presentation aims to answer the following questions:

What do we mean by Learning Analytics and why should we care?

  • propose a working definition for this session
  • examples / case studies of success and impact – link to institutional and individual perspectives

Where is the current focus around Learning Analytics for UK HEIs?

  • draw upon the findings of the recent JISC Report
  • provide one case study in more detail

What determines an institution’s readiness for learning analytics?

  • propose a set of lenses to assess readiness: includes appropriate policy, processes, systems and people
  • literature review to support / highlight key determinants of these lenses
  • make reference to this not simply being a tool solution – “x-ray” is great, but why have it; data quality, ethical issues, etc.

Why does academic adoption of e-assessment designs underpin Learning Analytics?

  • explore a scenario for the “people lens”
  • focus on how a redesign of the assessment model will create good quality, useable data for the institution to build the foundations for effective use of learning analytics (see slides)
  • make reference to them enhancing their teaching materials, i.e., learning design informed by analytics

Bichsel, J. (2012) Analytics in Higher Education: Benefits, Barriers, Progress, and Recommendations (Research Report). Louisville, CO: EDUCAUSE Center for Applied Research, August 2012, available from

Johnson, L., Adams Becker, S., Estrada, V., and Freeman, A. (2015). NMC Horizon Report: 2015 Higher Education Edition. Austin, Texas: The New Media Consortium (Available from:

Learner Analytics: I’m getting more than your toes wet

The “dip your toe in the water” idiom describes starting something cautiously because you are not sure whether it will work. This resonates with my journey so far into Learning Analytics. I’m still struggling to get a clear understanding of what people are currently using Learning Analytics for, and the role of the academic (faculty) within the process. My perception is that it is still early days in its development, while the early success stories of effective deployment have involved broad engagement from all parties across an institution. Therefore, effective deployment of analytics requires more than a solitary academic lead.

Two recent tasks have reinforced this view. Firstly, reading Learning Analytics: The current state of play in UK higher and further education, edited by Niall Sclater (JISC). This excellent piece explores a number of interviews across the UK sector. For me the emerging messages included: the need for strong leadership from the top of the institution, the variety of uses (not just focussed on indicators of engagement and students at risk), and the ethical dimensions. The drivers included: “most institutions mention a desire to enhance the student learning experience in various ways such as improving achievement and reducing the number of resits, providing better feedback, and empowering students to become more reflective learners. Some institutions have significant issues with retention and see learning analytics as a way to identify students at risk of dropout; for others retention is not a significant problem. Providing students themselves with better information on their progress is also mentioned as being an important driver.” (Sclater, 2014:4)

The question of the usefulness of analytics is strongly linked to the timing of intervention. This idea was fronted in the keynote at the Blackboard DevCon15 conference by Ryan Baker, Predicting Learner Outcomes with Learning Analytics. His discussion acknowledged the growth in supporting literature around learning analytics being able to predict outcomes. However, his focus was on the question: what is the point if you are not going to intervene at the appropriate times to help and support students in changing their behaviour?

Ryan’s presentation also helped tackle the question of effective deployment of analytics. A tweet during the session captured a strong opinion amongst many: “this is too much work to ask my faculty [lecturers]. Data mining tools need to be easier”. However, the message by the end of the presentation was that this is a collaborative effort between the academic teams and professional services, in particular knowledge experts in data mining tools and dashboard technologies. So, to answer the question raised in the tweet: the faculty [lecturer] should not be expected to find the time to master the data mining tools; they should work with wider experts within the institution to ensure the appropriate indicators are valid and being collected at the right times, and assist in the interpretation and timely intervention.

Some of Ryan’s references include:

Baker, R. and Siemens, G. (2014) ‘Educational Data Mining and Learning Analytics’, The Cambridge Handbook of the Learning Sciences, pp. 253–272. doi: 10.1017/cbo9781139519526.016

Barber, R. and Sharkey, M. (2012) ‘Course correction: Using Analytics to Predict Course Success’, Proceedings of the 2nd International Conference on Learning Analytics and Knowledge – LAK ’12, doi: 10.1145/2330601.2330664

Koedinger, K. R., Corbett, A. T. and Perfetti, C. (2012) ‘The Knowledge-Learning-Instruction Framework: Bridging the Science-Practice Chasm to Enhance Robust Student Learning’, Cognitive Science, 36(5), pp. 757–798. doi: 10.1111/j.1551-6709.2012.01245.x

If we are serious about TEL, we need to develop staff capabilities as collaborative teams


The aims of this post are to use the aggregated findings from three staff digital capabilities audits at UCS to inform a more effective staff development model for technology enabled learning, and to encourage you to undertake similar work within your own institutions.


The first question is: why did we introduce a staff digital capabilities audit? The answer revolves around trying to identify current staff digital capabilities in order to introduce “targeted” staff development interventions. It was envisaged the data collected would add value across a number of levels:

  • offering a staff development tool to empower individuals to take ownership of their development
  • providing an aggregated view from a course / departmental team perspective to more effectively identify bespoke developmental requirements

For more background on the individual studies, read the posts:

Data Collection

The discussion has been informed by three small data collection points. The questionnaires were collected from:

  • Course Team & Department Meetings:
    1. Dept 1 (n=10): April 2015
    2. Dept 2 (n=10): May 2015
  • Online – any staff at UCS (n=17): April – June 2015

It needs to be acknowledged that, given the small sample sizes and data collection methods, this is not a statistically significant study. However, the outcomes do provide an insight, and give an opportunity to reflect on our current practice.


The data gathering process was to take all the results and identify the questions where the “negative / no” responses were greater than the “positive / yes” responses. These are collated in Table 1, and organised by group.

Table 1


Table 1 illustrates there are a number of technologies and tools which staff haven’t used within the last 12 months. For some tools, there is a strong pattern across all sample groups.

Table 2 applies a criteria matrix to rank the findings into priorities. The criteria are based upon the relative levels of “no” compared to “yes” responses, and the extent to which they cover the three groups.

Table 2


Table 2 highlights that the collective experience and awareness across the institution of using certain tools is very low – in particular, the tasks, quiz and blog tools within Blackboard (LearnUCS). This lack of exposure to some tools raises concerns of a disconnect between the strategic and operational levels within the institution. For instance, our current Learning and Teaching Strategy prioritises the need to explore innovative and creative assessment and feedback models. Given this priority, the expected solution would include the use of objective testing for summative and formative assessments. However, looking at the outcomes of the staff digital capabilities audit, significant resource needs to be directed to developing staff experience around using the quiz tool before they can make connections within their own innovative practice. It also revisits the often discussed topic: “how are lecturers supposed to make effective decisions around the technology mix within their teaching, learning and assessment models, if they don’t know what they don’t know?”
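The criteria matrix behind Table 2 can be sketched as a small ranking function. The question labels and yes/no counts below are made up purely to illustrate the method of ranking by how many groups answered mostly “no”, then by the overall no:yes ratio; they are not the audit’s actual figures.

```python
# Illustrative audit responses: per question, (yes, no) counts for each group.
# (Question labels and counts are invented for this sketch.)
audit = {
    "Used the quiz tool": {"Dept 1": (2, 8), "Dept 2": (3, 7), "Online": (5, 12)},
    "Used the blog tool": {"Dept 1": (3, 7), "Dept 2": (2, 8), "Online": (8, 9)},
    "Used announcements": {"Dept 1": (9, 1), "Dept 2": (8, 2), "Online": (14, 3)},
}

def priority(groups):
    """Rank by the number of groups answering mostly 'no',
    then by the overall no:yes ratio across all groups."""
    mostly_no = sum(1 for yes, no in groups.values() if no > yes)
    total_yes = sum(yes for yes, _ in groups.values())
    total_no = sum(no for _, no in groups.values())
    return (mostly_no, total_no / max(total_yes, 1))

# Highest-priority development targets first.
for question, groups in sorted(audit.items(), key=lambda kv: priority(kv[1]), reverse=True):
    mostly_no, ratio = priority(groups)
    print(f"{question}: mostly-no in {mostly_no}/3 groups, no:yes ratio {ratio:.2f}")
```

The same two criteria (breadth across groups, then relative level of “no” responses) drive the prioritisation described for Table 2.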

This raises the question, “how might we develop staff digital capabilities?”

Table 3 draws together the free text responses around perceived barriers to adoption for the two face to face groups. Note the online audit (self-diagnostic) does not include this set of questions.

Table 3:


It is always difficult to draw meaningful messages from this type of analysis (especially given the very small sample size). The application of the 4Es framework by Collis et al. (2001) offers an insight: it facilitates questioning around who owns, or has responsibility for taking, these issues forward within the institution.

Table 4:


The responses imply time is the most common perceived barrier. It is suggested that the prioritisation of time for lecturers to design, develop and deploy TEL based initiatives is an institutional issue, informed by workloads, culture and expectations. However, there are clear opportunities at the course and departmental level to use the limited time more effectively.

One approach would be to develop a team based staff development programme at the course or departmental level. The need to focus on a team perspective is based upon the observation that the early and late majority adopters of an innovation typically represent 70% of a population, with peer pressure and other outside forces being the key influences on this group’s decision to adopt the innovation (Hixon et al., 2012). Schneckenberg (2009) proposes that the real barriers to TEL in HEIs are linked to cultural and motivational factors within lecturers – in particular, conflicting demands from research and enterprise activities, which are perceived as more important than TEL based teaching innovations. This suggests a staff development model which focusses predominantly on the individual will be ineffective and only support early adopters.

An approach to tackle the previous observation would be a team based, problem based project approach, which includes teams setting their own problems and targets, evidencing success at other institutions, and designing and developing the TEL enhanced learning model. The session design needs to reflect that lecturers will not spend time on tools unless they see value in using them; therefore, there will be more traction in exploring the jobs they need to do (Macdonald and Poniatowska, 2011). This is in line with the Carpe Diem learning design, which “drew from creative processes, agile development, and storyboarding” (Salmon and Wright, 2014:54). An enhancement on Carpe Diem would be to re-run the process with the same discipline experts at regular intervals, developing enhancements drawn from the lessons learnt from implementing the innovations. This is in line with Gibbs’ learning cycle framework.

This raises the question, who should be part of the team?

Owens (2012) observes the dependence between technology and the learning design: “Without a working knowledge of technologies, lecturers do not know what is possible … equally, without knowledge of the … pedagogical design of these technologies … online learning environments will remain an underutilised and ineffective resource” (Owens, 2012:398).

The team needs to include representation from discipline experts, e-learning developers, academic skills advisers, librarians, curriculum developers and students.

The final question is, what has this to do with the digital skills capability audit?

The role of the digital skills capability audit is crucial within the process, as it helps determine the topics to choose by identifying the unknown unknowns.


  • Armellini, A. and Jones, S. (2008) ‘Carpe Diem: seizing each day to foster change in e-learning design’, Reflecting Education, 4(1), pp. 17–29.
  • Cho, M.-H. and Rathbun, G. (2013) ‘Implementing teacher-centred online teacher professional development (oTPD) programme in higher education: a case study’, Innovations in Education and Teaching International, 50(2), pp. 144–156.
  • Hixon, E., Buckenmeyer, J., Barczyk, C., Feldman, L. and Zamojski, H. (2012) ‘Beyond the early adopters of online instruction: Motivating the reluctant majority’, The Internet and Higher Education, 15(2), pp. 102–107.
  • JISC (2013) The Design Studio / Curriculum Design at the University of Ulster. Available at: (Accessed: 24 June 2015)
  • Macdonald, J. and Poniatowska, B. (2011) ‘Designing the professional development of staff for teaching online: an OU (UK) case study’, Distance Education, 32(1), pp. 119–134.
  • Owens, T. (2012) ‘Hitting the nail on the head: the importance of specific staff development for effective blended learning’, Innovations in Education and Teaching International, 49(4), pp. 389–400.
  • Salmon, G. and Wright, P. (2014) ‘Transforming Future Teaching through “Carpe Diem” Learning Design’, Education Sciences, 4(1), pp. 52–63.
  • Salmon, G. (no date) Carpe Diem: Access to Handbook. Available at: (Accessed: 24 June 2015)
  • Schneckenberg, D. (2009) ‘Understanding the real barriers to technology-enhanced innovation in higher education’, Educational Research, 51(4), pp. 411–424.

Google Tag Manager MOOC: Why oh why …

I’ve just signed up for the Google Tag Manager Fundamentals course. The reason being, I’m rather interested in the continuing evolution of the Google educational platform and its associated learning design. It was over a year ago that I completed my last Google MOOC. My previous successful strategy was based more on trial and error with respect to the assessment component: I’d literally sit with two tabs open and work backwards from the assessment, reading all the transcripts, and submit.

This time I’m really interested to see how the learning platform has progressed, and whether the learning design has shifted from knowledge based MCQs. I’ll note my thoughts as I progress through the course, and get back to ….