The “dip your toe in the water” idiom describes starting something cautiously because you are unsure whether it will work. This resonates with my journey so far into Learning Analytics. I am still struggling to get a clear understanding of what people are currently using Learning Analytics for, and of the role of the academic (faculty) within the process. My perception is that it is still early days in its development, yet the early success stories of effective deployment of analytics have involved broad engagement from all parties across an institution. Effective deployment of analytics therefore requires more than a solitary academic lead.
Two recent tasks have reinforced this view. The first was reading Learning Analytics: The current state of play in UK higher and further education, edited by Niall Sclater (JISC – http://repository.jisc.ac.uk/5657/1/Learning_analytics_report.pdf). This excellent piece draws on a number of interviews across the UK sector. For me, the emerging messages included the need for strong leadership from the top of the institution, the variety of uses (not just indicators of engagement and students at risk), and the ethical dimensions. On the drivers: “most institutions mention a desire to enhance the student learning experience in various ways such as improving achievement and reducing the number of resits, providing better feedback, and empowering students to become more reflective learners. Some institutions have significant issues with retention and see learning analytics as a way to identify students at risk of dropout; for others retention is not a significant problem. Providing students themselves with better information on their progress is also mentioned as being an important driver.” (Sclater (2014:4))
The usefulness of analytics is strongly linked to the timing of intervention. This idea was foregrounded in Ryan Baker’s keynote at the Blackboard DevCon15 conference, Predicting Learner Outcomes with Learning Analytics. His discussion acknowledged the growing body of literature showing that learning analytics can predict outcomes. However, his focus was on a sharper question: what is the point of prediction if you do not intervene at the appropriate times to help and support students in changing their behaviour?
Ryan’s presentation also helped tackle the question of effective deployment of analytics. A tweet during the session captured a strong opinion among many: “this is too much work to ask my faculty [lecturers]. Data mining tools need to be easier”. However, the message by the end of the presentation was that this is a collaborative effort between academic teams and professional services, in particular knowledge experts in data mining tools and dashboard technologies. So, to answer the question raised in the tweet: the faculty [lecturer] should not be expected to find the time to master the data mining tools. They should work with wider experts within the institution to ensure the appropriate indicators are valid and collected at the right times, and assist in the interpretation and timely intervention.
Some of Ryan’s references include:
Baker, R. and Siemens, G. (2014) ‘Educational Data Mining and Learning Analytics’, The Cambridge Handbook of the Learning Sciences, pp. 253–272. doi: 10.1017/cbo9781139519526.016
Barber, R. and Sharkey, M. (2012) ‘Course Correction: Using Analytics to Predict Course Success’, Proceedings of the 2nd International Conference on Learning Analytics and Knowledge – LAK ’12, doi: 10.1145/2330601.2330664
Koedinger, K. R., Corbett, A. T. and Perfetti, C. (2012) ‘The Knowledge-Learning-Instruction Framework: Bridging the Science-Practice Chasm to Enhance Robust Student Learning’, Cognitive Science, 36(5), pp. 757–798. doi: 10.1111/j.1551-6709.2012.01245.x