Learning Analytics: So, what is an effective intervention?

I just read some really interesting work from San Diego State University on triggers and intervention strategies. It was forwarded to me by John Whitmore at Blackboard (thanks John). The work was presented at LAK 2015 in a report by Dodge, Whitmore, and Frazee (2015), but I can’t find a public link to the SDSU report. So, sorry … you’ll need to hunt it down yourself 🙂

Their work is based on a number of large courses run over a long period of time. As I read it, the pilot introduced weekly trigger events throughout the semester. The combined triggers included:

  • Classroom voting / clickers – if a student recorded no score for a session, they were treated as not having attended and a trigger was activated
  • Regular Blackboard quizzes – if a student didn’t complete one, a trigger was activated
  • Blackboard logins – if a student had not logged into Blackboard for over a week, a trigger was activated
  • Scores / grades on course tests / assignments – low scores on assignments and tests activated a trigger
  • Cumulative average grade – a trigger was activated if a student’s cumulative average was just below passing as they neared the end of the semester

The work drew a number of important findings:

1) the deployment of regular (weekly) triggers through a range of technologies and meaningful learning tasks is achievable (I know this, but I am often told it is unlikely to happen given institutional culture). Therefore, if you can drive the required changes in curriculum design across a programme, the learning experience for certain students can be improved.

2) these triggers are significant predictors of the likelihood of passing the course. They found that none of the students who activated only one trigger needed to repeat the course, while almost all of the students who activated ten or more triggers in the semester needed to retake it. Therefore, the number of trigger events is a good indicator of who needs support.

3) intervention strategies based on emails, including links to multimedia-based support materials etc., had no statistically significant impact on the behavior of those who received them. They conclude, “while the triggers may indicate a problem, students may not have the skills to address the cause of the problem despite awareness and the will to do so”. The article suggests a more human intervention strategy based on supplemental instruction by peer (student) mentors. The interesting challenge for SDSU was achieving impact with tight resources: with student intakes of over 400 on the course, scalability is critical.

These outcomes resonate with some of the discussions I have with UK HEIs around effective intervention strategies. They also support the position held by many that a mediated intervention is needed, as opposed to large-scale student dashboards and auto-generated emails. To probe this position further, the question I’m interested in answering is: is there any evidence that student-facing dashboards have a demotivating impact on the learner?

