Via a Martin Weller blogpost (Learning design – the long haul of institutional change) comes the phrase:
we now have a uniform design process across the university, and are one of the world leaders in this approach. It has allowed us to then match analytics against designs, and to develop a common language and representation.
I asked him for examples of “match[ing] analytics against designs” and got a pointer back to work by Bart Rienties et al., which I guess I should have been following (I have reached out to various folk in IET over the years but never really got anywhere…).
Here’s an example, quickly found, in Linking students’ timing of engagement to learning design and academic performance, Nguyen, Quan; Huptych, Michal and Rienties, Bart (2018), Proceedings of the 8th International Conference on Learning Analytics and Knowledge, ACM, New York, pp. 141–150:
2.3 VLE engagement
The second dataset consisted of clickstream data of individual learners from the VLE and was retrieved using SAS Enterprise 9.4. The data were captured from four weeks before the start of the module until four weeks after the end of the module. Learning activities were planned over 30 weeks. Data were gathered in two semesters (Fall 2015 and Fall 2016) in order to validate the findings from two independent implementations. First, we would like to mention that the student behaviour record includes all students’ VLE activity. In other words, “the spent time” is determined as the time between any two clicks of a student, regardless a course and a type of the VLE activity. Further, not each click can be associated with studying time; for instance, there are clicks related to downloading of some material. We have this information about an action type which is connected with the click. Thus, we can determinate that a click with the connected action “download” was not included in the spent time of student in the analysis. Nonetheless, we can assume that the time of a click with the connected action “view” is associated with the time of learning of a study material for which the click is logged.
To compare the LD with the actual student behaviour, time spent on task was calculated as the duration between clicks. As pointed out by previous research [17], this metric could be problematic due to (1) the inability to differentiate between active time and non-active time (students leave the respective web page open and go for a coffee), and (2) the last click of the day is followed by a click next day, which makes the duration excessively long. Any attempt to set an arbitrary cut-off value would pose a threat in underestimating or overestimating of the actual engagement time.
Taking into account the context and LD of a module could produce a more informed cut-off value. Ideally, this cutoff value should be tailored to the design and context of each individual activity. For example, the cut-off value should be different between a 20 minutes activity and a 1-hour activity. While this study does not fully address the aforementioned problems, it leveraged the design of learning activities (discussion between researchers and designers) to set a cut-off value at 1 hour for all activity (e.g. any activity goes beyond 1 hour will be set as 1 hour).
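To make the calculation concrete for my own benefit, something like the following would do it. This is just a sketch of the approach the paper describes, not the authors’ actual SAS pipeline: the record shape, the field names and the “view”/“download” action labels are my assumptions about what the logged data might look like.

```typescript
// Hypothetical click record; the fields and action labels are my assumptions,
// not the paper's actual data schema.
type ClickRecord = { student: string; timestamp: number; action: "view" | "download" };

const CUTOFF_MS = 60 * 60 * 1000; // one-hour cap, as chosen in the paper

// Estimate per-student time-on-task as the gap between successive clicks,
// ignoring "download" clicks and capping any single gap at one hour.
function timeOnTaskMs(clicks: ClickRecord[]): Map<string, number> {
  const byStudent = new Map<string, ClickRecord[]>();
  for (const c of clicks) {
    const list = byStudent.get(c.student) ?? [];
    list.push(c);
    byStudent.set(c.student, list);
  }

  const totals = new Map<string, number>();
  for (const [student, records] of byStudent) {
    records.sort((a, b) => a.timestamp - b.timestamp);
    let total = 0;
    for (let i = 0; i < records.length - 1; i++) {
      if (records[i].action === "download") continue; // downloads don't count as study time
      const gap = records[i + 1].timestamp - records[i].timestamp;
      total += Math.min(gap, CUTOFF_MS); // cap long gaps (coffee breaks, overnight) at one hour
    }
    totals.set(student, total);
  }
  return totals;
}

// e.g. timeOnTaskMs(vleLog).get("A123456") would give an estimate in milliseconds
```

Capping long gaps at the cut-off rather than discarding them keeps those clicks in the totals while limiting how much a browser tab left open overnight can inflate a student’s apparent study time.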
So there is work going on, and it looks related to some of the approaches I’d like to be able to draw on to review (at least) the first year of presentation of a new course, and I should apologise for that oversight. I probably should make more of an effort to attend internal research events (I used to…) and track their research outputs, rather than digging my own cesspit of vitriol and bile. (I guess I don’t do much to help build the “friendly” course teams that Martin mentioned as being part of this effort…)
I guess I need to try to find better, more constructive ways of reaching out to folk in the OU.
PS By the by, the title of another paper, “A multi-modal study into students’ timing and learning regulation: time is ticking”, reminds me of a thing I’d built but wasn’t allowed to deploy in my first engagement with HTML-delivered learning materials, the T396 eSG (electronic study guide; see Keeping a Distance-Education Course Current Through eLearning and Contextual Assessment), where I used client-side Javascript to pop up a widget if you’d spent too long on an eSG page and ask how you were doing (something like the sketch below). (I think I also experimented with tracking time over a whole study session; the eSG was frame-based.) We dropped it on the grounds that it would probably be unreliable, and would almost certainly be irritating, as per Clippy. (I still think it would have been interesting to iterate on a couple of times, though…) We also had an experimental WAP site containing micro-info about the course, TMA submission dates and so on. Anyone else remember WAP?!
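For what it’s worth, the eSG nudge was conceptually no more than something like this, reconstructed from memory in modern terms rather than the original frame-based code; the twenty-minute threshold and the wording of the prompt are invented here:

```typescript
// A from-memory sketch of the "you've been on this page a while" nudge;
// the threshold and the prompt wording are invented, not the original eSG values.
const EXPECTED_MINUTES = 20;

window.setTimeout(() => {
  const stillOk = window.confirm(
    `You've been on this page for over ${EXPECTED_MINUTES} minutes. Still making progress?`
  );
  if (!stillOk) {
    // A real version might log a "struggling" signal or point at the study planner.
    window.alert("Maybe take a break, or check what the study guide expects for this section.");
  }
}, EXPECTED_MINUTES * 60 * 1000);
```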