Using Learning Analytics to “nudge” students to take more ownership over their learning

Becoming an independent or self-regulated learner is an important part of becoming a more effective learner. Such learners are “distinguished by their responsiveness to feedback regarding the effectiveness of their learning, and by their self perceptions of academic accomplishment” (Zimmerman, 1990: 14). This requires not only that the learner takes responsibility for their weaknesses, but also that Faculty give them the opportunity to identify, correct and improve upon those weaknesses (Fritz, 2013).

The following discussion has two aims. The first is to outline how a Faculty member can redesign their learning activities around tools within the virtual learning environment (VLE) to provide richer feedback and trigger dialogue amongst students, with the intention of nudging them to become more effective self-regulated learners. The second is to make the case for Senior Managers to drive this redesign process, enabling the institution to take better advantage of learning analytics and data-driven decision making.

Davenport et al. (2000) define learning analytics as “the application of analytic techniques to analyze educational data, including data about the learner and teacher activities, to identify patterns of behaviors and provide actionable information to improve learning and learning related activities”. An important aspect of this definition is that the data should be “actionable”, whether by the learner, the teacher or another stakeholder group.

To gather data and take action requires the learning model to move away from the orthodox approach of a few high-stakes summative assessments (typically one essay and an unseen exam) towards one which provides more frequent feedback opportunities and learning loops for the student, while remaining sustainable and scalable for the Faculty member. The problem with the orthodox design is that it cannot easily develop self-regulated learners because it generates too little reliable and actionable data. If frequent actionable data is required, the redesigned approach could include the VLE quiz engine, the submission of short online writing tasks (with defined marking criteria) and potentially classroom voting technologies. In this case the learning activity generates more feedback and enhanced reporting for the individual learner. For instance, deploying a number of five-question online tests to be completed by all students would generate significant amounts of actionable data without significant work. By using a variety of question types, such as Likert scales (to what extent do you agree …) and short free-text responses (in no more than 150 words), you can also tease out higher-order learning skills. The individual can access their score and feedback online and compare themselves to the average grade, while the Faculty member can dedicate a proportion of the next face-to-face teaching session to providing additional feedback on the questions, as in the sketch below. This approach aligns with all seven principles of good feedback practice (Nicol and Macfarlane-Dick, 2006).
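To make the reporting concrete, here is a minimal sketch of the kind of “score against the class average” comparison described above. It assumes the quiz results have been exported from the VLE grade center as a CSV file with hypothetical columns student_id and q1 to q5 (one mark per question); the file name and column names are illustrative rather than any particular VLE’s export format.

```python
import csv
from statistics import mean

# Hypothetical export from the VLE grade center: one row per student,
# with columns student_id and q1 ... q5 (each question marked 0 or 1).
with open("quiz_export.csv", newline="") as f:
    rows = list(csv.DictReader(f))

# Total each student's score across the five questions.
scores = {row["student_id"]: sum(int(row[f"q{i}"]) for i in range(1, 6))
          for row in rows}

class_average = mean(scores.values())

# Per-student report: own score against the class average, flagging anyone
# well below it as a candidate for follow-up in the next teaching session.
for student, score in sorted(scores.items()):
    flag = "  <- consider follow-up" if score < class_average - 1 else ""
    print(f"{student}: {score}/5 (class average {class_average:.1f}){flag}")
```

Even something this simple gives both parties something to act on: the student sees where they sit relative to the cohort, and the Faculty member has a short list of people and questions to revisit in class.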

So why should Senior Managers encourage Faculty to enhance their technology-based learning designs? The simple answer is that, from the institutional perspective, the use of the VLE to generate and share data underpins the effective adoption of learning analytics. These redesigns by individual Faculty will act “as a series of small [steps] designed to gain experience and make the case that data based decisions have enhanced value” (Bichsel, 2012: 26). For this to work, the institution needs to be aware that learning analytics does not require perfect or near-perfect data, and that it needs to walk before it can run. A number of small-scale pilots around enhanced reporting and effective intervention strategies will therefore enable the institution to improve its readiness for learning analytics. The pilots would instigate conversations around institutional culture, people and processes, as well as technology infrastructure. This will add more value than the approach often discussed in the literature of exploring correlations in large data sets between VLE access data (as a proxy for student engagement) and final grade performance.
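For contrast, the kind of large-data-set correlation referred to above amounts to a few lines of analysis. The hedged sketch below assumes a hypothetical institutional extract with columns for VLE login counts and final module grades; the file and column names are illustrative.

```python
import pandas as pd

# Hypothetical institutional extract: one row per student, with a count of
# VLE logins over the term and the final module grade as a percentage.
df = pd.read_csv("vle_engagement.csv")  # columns: student_id, vle_logins, final_grade

# Pearson correlation between VLE access (a proxy for engagement) and final grade.
r = df["vle_logins"].corr(df["final_grade"])
print(f"Correlation between VLE logins and final grade: {r:.2f}")
```

Such a figure may be interesting, but on its own it is not actionable for an individual learner or teacher, which is why the small pilots described above are the better place to start.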

Where next for you? If you are a Faculty member, a good starting point would be to explore the functionality and possibilities of the online quiz tool and grade center. If you are a Senior Manager, a good starting point would be to explore effective academic adoption models within the context of your institution.

————————————————

Bichsel, J. (2012) ECAR Study of Analytics in Higher Education. Available at: http://www.educause.edu/library/resources/2012-ecar-study-analytics-higher-education (Accessed October 2015)

Fritz, J. (2013) Using Analytics at UMBC. EDUCAUSE Center for Applied Research, Research Bulletin. Available at: https://net.educause.edu/ir/library/pdf/ERB1304.pdf

Nicol, D. and Macfarlane-Dick, D. (2006) ‘Formative assessment and self-regulated learning: A model and seven principles of good feedback practice’, Studies in Higher Education, 31(2), 199-218

Nussbaumer, A., Hillemann, E-C., Gütl, C. and Albert, D. (2015) ‘Competence-based Service for Supporting Self-Regulated Learning in Virtual Environments’, Journal of Learning Analytics, 2(1), 101-133

Zimmerman, B. (1990) ‘Self-Regulated Learning and Academic Achievement: An Overview’, Educational Psychologist, 25(1), 3-17
