I mentioned previously that I’m hosting a series of seminars at the Open University, with the intention of showcasing our own expertise (internally and externally) and also getting us as an institution to engage with current ed tech developments.
Last Thursday my colleague Prof Bart Rienties looked at analytics – not so much real-time analytics, but rather digging into large data sets across multiple courses to explore what it is that students actually do. This is always difficult to ascertain, but particularly so for a distance education university, where you don’t get to see what students are doing. We ask them, of course, through surveys, and these are useful but not always reliable – people often forget what they actually did, over- or under-estimate the time spent on certain activities, and sometimes give the answers they think you want to hear. There are problems with analysing data sets too, as you are sometimes inferring what the data means in terms of student behaviour (eg is spending a long time in the VLE a sign of engagement, or of struggling to understand?), but combined with other tools it provides some fascinating insights.
Bart framed his talk around ‘6 myths’ we have regarding OU student behaviour:
- OU students love to work together
- Student satisfaction is positively related to success
- Student performance improves over time (eg from first to third level courses)
- The grades students achieve are mostly related to what they do
- Student engagement in the VLE is mostly determined by the student
- Most OU students follow the schedule when studying
The answer to all of these is “No” (or at least “Not that much”). You can see the talk below, where Bart expands on them – it’s not my place to expand on his work here. But I’ve been doing a lot of thinking about these findings since the talk. Initially they might be deemed a concern, but perhaps they also highlight positive behaviour. For example, student engagement in the VLE is largely determined by what we set out in the course (no 5). That seems like an effective outcome of good learning design, particularly for distance education students. Similarly, while students don’t slavishly follow the course schedule (no 6), with many studying ahead of or just behind the calendar, this can be framed as part of the flexible (and accessible) design. Some online courses from other providers are very strict – you have to do task X on Monday, contribute to the discussion on Wednesday, complete the quiz on Friday, etc. For our students such a regimented approach would not work: many know they will be unavailable at certain times, for instance, and so study ahead.
And so on for each of these. What this highlights for me is that the work Bart and his colleagues do at the OU is essential in getting us to ask the right questions. I think we are all guilty of making assumptions about student behaviour. The kind of analysis undertaken here is not seeking to automate any process or remove people from the system (often the fear with an analytical approach), but rather encourages us to dig deeper into what is happening in the very real lives of our students. And from this we can design better courses, support mechanisms, assessment, and so on.
Here is Bart’s talk: