Learning Analytics: Research, practice and ethics

Reflections from LACE SoLAR Flare October 9th, 2015 at the Open University
Many (too many) years ago, as part of my Postgraduate Certificate in University Teaching, I recorded student attendance across a module alongside the final mark each student achieved. This produced a very nice correlation, which I showed to subsequent cohorts of students. I like to think of this as an early example of learning analytics. Although it was a fairly basic piece of action research, it captured three important questions that should be asked in any learning analytics project:

• What are you measuring?
• Why are you measuring it?
• What are you going to do with the data?

What? In this example I was measuring attendance, and the data was reasonably reliable. In the days before electronic attendance monitoring, a “sign-in sheet” was passed around the class. Notwithstanding some possible deception on the part of students, I think it was a reasonably accurate data set.
Why? I don’t recall giving this much thought at the time, but I probably assumed that attendance would be a reasonable predictor of student engagement and retention.
What do you do with the data? I shared the data with subsequent students, in the hope that it would encourage better attendance. Unfortunately I can’t remember whether it had an impact on attendance; as I said, it was long ago. On reflection, this was the critical piece of the jigsaw for effective use of learning analytics: we need to know whether we are having an impact on what our students do. This can be a direct impact on students (did attendance rates go up in subsequent years?) or an indirect one (did I reflect on how I could make my classes more engaging and boost attendance?).
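If you wanted to reproduce this sort of quick-and-dirty analysis today, a few lines of Python would do it. The figures below are invented for illustration (my original data is long gone):

```python
# Minimal sketch: correlating attendance with final marks.
# The attendance fractions and marks are invented, not my original data.
from scipy.stats import pearsonr

attendance = [0.95, 0.80, 0.60, 0.90, 0.40, 0.75]  # fraction of sessions attended
final_mark = [72, 65, 48, 70, 35, 58]              # final module mark (%)

r, p = pearsonr(attendance, final_mark)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```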

I recognise that there were flaws in this analytical process. Attendance is only a proxy for engagement: how many students were dozing in the back row, or discreetly reading the sports page? Secondly, there was a significant time delay between measurement, action and evaluation of impact.

In the current LA environment, where much of student learning happens online, students leave a digital trace of their learning activity, and the data collected can be much more nuanced and potentially insightful. We can measure which parts of online material are downloaded, which students engage in online discussions, how long students spend watching videos, and which library books and papers are downloaded. Of course, evidence of downloading a paper is not evidence of reading it, although, rather scarily, the latest developments in LA are incorporating eye-tracking devices and facial recognition software in an attempt to measure attention!

The ability to track students’ online engagement, and to combine this with other metrics and demographic data, provides opportunities for acting in real time. Researchers at the OU have identified 30 factors that can be used to predict which students are ‘at risk’ of dropping out. It would be interesting to compare the accuracy of this model with the predictions from my attendance measurements!
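The OU’s 30 factors weren’t listed on the day, but to give a flavour of how engagement traces might feed an ‘at risk’ prediction, here is a hedged sketch using scikit-learn. The features and data are entirely hypothetical, and this is emphatically not the OU’s model:

```python
# Hypothetical sketch of an 'at risk' classifier built from engagement traces.
# Features and training data are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [logins per week, forum posts, video minutes, files downloaded]
X_train = np.array([
    [5, 3, 120, 8],
    [1, 0,  10, 1],
    [4, 2,  90, 6],
    [0, 0,   5, 0],
    [3, 1,  60, 4],
    [1, 0,  15, 2],
])
y_train = np.array([0, 1, 0, 1, 0, 1])  # 1 = student dropped out

model = LogisticRegression().fit(X_train, y_train)

# Flag current students by predicted dropout probability
X_now = np.array([[2, 0, 20, 1], [5, 4, 100, 9]])
for student_id, p in zip(["S1", "S2"], model.predict_proba(X_now)[:, 1]):
    print(f"{student_id}: dropout risk {p:.2f}")
```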

Bart Rienties, Reader in Learning Analytics at the Open University (@DrBartRienties), has been carrying out some really interesting work classifying the different parts of a course (assimilation, construction, peer-to-peer interaction, finding information, assessment, and so on) and measuring the impact of each on student satisfaction and student success. The parts of a course which lead to the greatest satisfaction are not the same as those that lead to student success. I think we always knew this was the case, but it still provides important evidence for those analysing National Student Survey (NSS) data and developing metrics for the Teaching Excellence Framework (TEF). The real value of this type of analysis, however, is the ability to provide feedback to the tutor and the individual student in real time. This allows early intervention by the tutor and/or the student.
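As a rough illustration of the kind of analysis involved, one might correlate the hours a module’s learning design allocates to each activity type with satisfaction scores and pass rates. The sketch below is my own invention, with made-up module data; it is not Rienties’ method or results:

```python
# Sketch: correlating learning-design activity mix with outcomes.
# Module-level figures are invented; the activity categories follow the post.
import pandas as pd

modules = pd.DataFrame({
    "assimilative_hours":     [40, 60, 30, 55, 45],
    "peer_interaction_hours": [10,  2, 20,  5, 12],
    "satisfaction":           [4.2, 4.5, 3.8, 4.4, 4.0],   # survey score out of 5
    "pass_rate":              [0.70, 0.62, 0.81, 0.66, 0.74],
})

# Correlate each design feature with each outcome separately
for feature in ["assimilative_hours", "peer_interaction_hours"]:
    for outcome in ["satisfaction", "pass_rate"]:
        r = modules[feature].corr(modules[outcome])
        print(f"{feature} vs {outcome}: r = {r:.2f}")
```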

Lightning presentations
The lightning presentations (two minutes and one slide per speaker) gave a great insight into the different ways learning analytics are being used across education, from primary schools to supporting immigrants, as well as in formal accredited and open learning. The slides are available on SlideShare, but I have highlighted below some of the projects that I found particularly interesting:

Doug Clow (@dougclow) is leading work building an evidence hub, seeking to find evidence for the following four propositions:
• LA improves learning
• LA improves learning support and teaching
• LA has been widely taken up and used at scale
• LA are used in an ethical way

If you have evidence for or against these propositions, whether from schools, universities, or work-based or informal learning, you can add it to the hub at evidence.laceproject.eu. Currently the evidence on LA being used ethically is mixed, and I will certainly be interested to see how the evidence on this and the other propositions develops.
Another inventory project presented on the day was “Learning analytics for European Educational Policy” (LAEP); the team are interested in collecting evidence on:
• Learning analytics tools
• Learning analytics practices
• Learning analytics policy documents
More information is available at their blog: https://laepanalytics.wordpress.com

Jenna Mittelmeier (Open University, @JLMittelmeier) had just embarked on her PhD using social network analytics combined with learning analytics to identify factors affecting equal participation in group work, particularly in diverse classrooms. Adriana Wilde (University of Southampton, @AdrianaGWilde) is exploring the measurable factors for learning success that are common to MOOCs and face-to-face instruction. I will be keeping an eye out for the outcomes of both of these projects.
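To give a sense of what social network analytics can offer here, the sketch below builds a reply network from forum posts and computes a simple Gini coefficient over each student’s contributions as a rough measure of (un)equal participation. The data and the measure are my own illustration, not Jenna’s methodology:

```python
# Sketch: quantifying how equally a group participates in discussion,
# using a reply network and a Gini coefficient. Data is invented.
import networkx as nx

# Directed edge (a, b) means student a replied to a post by student b
replies = [("ana", "ben"), ("ana", "chi"), ("ben", "ana"),
           ("ana", "ben"), ("chi", "ana"), ("ana", "dev")]
G = nx.MultiDiGraph(replies)

# Out-degree = number of replies each student made
contributions = [G.out_degree(n) for n in G.nodes]

def gini(values):
    """0 = perfectly equal participation; near 1 = one person dominates."""
    values = sorted(values)
    n = len(values)
    weighted_sum = sum((i + 1) * v for i, v in enumerate(values))
    return (2 * weighted_sum) / (n * sum(values)) - (n + 1) / n

print(f"Participation Gini coefficient: {gini(contributions):.2f}")
```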

Finally, learning analytics raises important ethical questions, and the OU have been leading the way in developing principles for the ethical use of data for learning analytics. Sharon Slade (@SharonSlade) is leading research into students’ attitudes to online privacy, engaging students more actively in how their data is collected, analysed and stored, and who has access to it. This work also starts to recognise the dangers of using data to label or stereotype students.

These are exciting times. Whilst most of the focus of learning analytics in tertiary education has been on identifying students at risk of failing or dropping out, there is clear potential for tutors to develop and extend their content to stretch the most able students, and to develop the elements of our programmes that have the greatest impact on students’ success and satisfaction.

Well my embedded eye tracking device indicates that you are bored reading this now, so time to sign off.
