Greasing the wheels of progress
For the past eight years we have been handling assignments submitted online through a locally developed system, EASE (Electronic Assignment Submission Environment), which was originally planned in response to perceived deficiencies in WebCT Vista, our LMS from 2003 until around 2008. By the time EASE was ready we were already in transition to Moodle as our LMS, but EASE offered enough that most classes used it in preference to the native Moodle submission system, which remained available to those who wanted it.
This semester, as part of the Grand Unification Theory of Everything that decreed consistency via a single interface, EASE was suddenly deprecated in favour of a revamped and slightly adapted Moodle module that was expected to offer facilities equivalent to EASE. The new system actually includes some facilities that are better than anything EASE offered. Most notable is the facility to zip up a set of feedback files, upload the archive, and have the files distributed to the relevant students. That is vastly preferable to selecting and uploading a file individually for each of up to several hundred students.
Unfortunately those responsible for promoting the change inexplicably chose not to publicise such benefits and instead simply told people that the familiar system, EASE, would no longer be used. There was no real training offered and very limited documentation. The response from staff required to make the transition was less than universally positive, and the dissatisfaction is probably intensifying now that we are near mid-semester and most courses are dealing with assignment submissions.
Now that there is more data in the system and it is being used more frequently, some processes appear to have slowed, and pages that report substantial amounts of data about marked submissions are not presenting as they should because some processes are timing out. Those issues, added to the lack of training and documentation and an interface that confounds even experienced, techno-literate academics, make for a system that is becoming unpopular with users. It is a mystery that those who were funded over a year or more to prepare the system did not think to tweak the interface to make it more user friendly, or to add simple features that would have increased its utility for academics.
I’ve enjoyed coding/programming for more than 40 years, though I’ve seldom done enough to claim any real facility or fluency. Still, I like a challenge and am often looking for a reason to tinker with code. A recent conversation with a colleague, David Jones, included discussion of the need to compare results from different markers. My solution, when using EASE and more recently Moodle, has been to get the data into a spreadsheet and use some simple (for me) formulae to produce basic statistics and distributions for comparison. My first thought for enabling others less familiar with spreadsheets to do the same was that we might develop a spreadsheet template they could use. David’s suggestion, more befitting a digital renovator, was that the facility should be built into the system or, until that happened, added using the approach he had previously used with Greasemonkey to enhance another institutional system.
That was enough to get me thinking, and the distraction of coding to manipulate data seemed vastly more appealing than the marking of assignments that would generate the raw data. I did do some substantial coding (2000+ lines) in my doctoral project, but that was 15 years ago, and what little I’ve done since has mostly been as an adjunct to AppleScript over institutional web systems. Hence this took far longer than it would for somebody fluent in code, but I’m pleased that I got it to work at all.
My markerStats script, which runs in Greasemonkey over Firefox, works on the Moodle assignment page that lists submissions. It extracts the names of markers and the marks awarded, then calculates means and standard deviations of marks, both overall and for each marker. It then formats those statistics in a table and injects that into the page. The table is not as well formatted as I’d like, but it serves the purpose of enabling easy comparison of marks by marker. Perhaps I’ll improve it over time, or somebody else might do that. In a rational world something similar would be built into the system itself.
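For anyone curious about the mechanics, here is a minimal sketch of the idea rather than the script itself. The selectors for the rows, marker cells and grade cells are hypothetical placeholders; the markup of the Moodle submissions table varies with version and theme, so anyone adapting this would need to inspect the actual page and adjust them to match.

```javascript
// ==UserScript==
// @name        markerStats (sketch)
// @namespace   http://example.org/markerstats
// @include     */mod/assign/view.php*
// @grant       none
// ==/UserScript==

// Hypothetical selectors: adjust to whatever the browser's DOM
// inspector shows for the actual submissions table.
var ROW_SELECTOR    = 'table.generaltable tbody tr';
var MARKER_SELECTOR = 'td.marker'; // assumed cell holding the marker's name
var GRADE_SELECTOR  = 'td.grade';  // assumed cell holding the numeric mark

function meanAndSd(values) {
    var n = values.length;
    var mean = values.reduce(function (a, b) { return a + b; }, 0) / n;
    var sumSq = values.reduce(function (a, v) {
        return a + (v - mean) * (v - mean);
    }, 0);
    // Sample standard deviation (divide by n - 1), as a spreadsheet's STDEV would.
    var sd = n > 1 ? Math.sqrt(sumSq / (n - 1)) : 0;
    return { n: n, mean: mean, sd: sd };
}

function statsRow(label, s) {
    return '<tr><td>' + label + '</td><td>' + s.n + '</td><td>' +
           s.mean.toFixed(2) + '</td><td>' + s.sd.toFixed(2) + '</td></tr>';
}

// Collect marks grouped by marker, plus an overall list.
var byMarker = {};
var allMarks = [];
var rows = document.querySelectorAll(ROW_SELECTOR);
for (var i = 0; i < rows.length; i++) {
    var markerCell = rows[i].querySelector(MARKER_SELECTOR);
    var gradeCell  = rows[i].querySelector(GRADE_SELECTOR);
    if (!markerCell || !gradeCell) continue;
    var mark = parseFloat(gradeCell.textContent);
    if (isNaN(mark)) continue; // skip unmarked submissions
    var marker = markerCell.textContent.trim() || '(no marker)';
    (byMarker[marker] = byMarker[marker] || []).push(mark);
    allMarks.push(mark);
}

// Build a simple statistics table and inject it above the submissions list.
if (allMarks.length > 0) {
    var html = '<table border="1"><tr><th>Marker</th><th>n</th>' +
               '<th>Mean</th><th>SD</th></tr>';
    Object.keys(byMarker).sort().forEach(function (marker) {
        html += statsRow(marker, meanAndSd(byMarker[marker]));
    });
    html += statsRow('Overall', meanAndSd(allMarks)) + '</table>';

    var target = document.querySelector('table.generaltable');
    var div = document.createElement('div');
    div.innerHTML = html;
    target.parentNode.insertBefore(div, target);
}
```

The statistics themselves are trivial; almost all the effort in a script like this goes into finding selectors that reliably pick out the marker and grade cells. The sketch uses the sample standard deviation (dividing by n - 1), matching what a spreadsheet's STDEV function reports.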
We are currently engaged in an exercise around the institutional Educational Experience Plan. Part of that is to consider how we might facilitate processes that discover users’ needs, or innovations that others have introduced, and match those to the skills needed for wider implementation and adoption. Simple projects like this one may point to one way of evolving the systems through informal and agile processes, rather than through major projects that consume time and resources without delivering what users really need or want.