Down and up the greasy pole
It seems there is no constant but change. I was feeling pleased with myself over my success with a Greasemonkey script to produce simple statistics for comparing results from multiple markers in the Moodle assignment system. My infrequent forays into coding had made that process a bit like climbing a greasy pole, but I’d eventually reached the top with a working script.
It was during that process of development, or shortly after, that I noticed that more complex pages in the assignment module were failing to build and display correctly. Initially that was visible for my first assignment, which had marks and attached feedback files for about 170 students. That page was taking a long time to appear and, when it did, it had just the table of results without any of the regular styling or page navigation. Pages without results, or with fewer students per page, continued to load correctly, but it was no longer possible to generate statistics across the whole set of markers and results for a larger course.
When I reported my observation to the systems folk I discovered I was not the first to experience the problem, which had appeared first for larger courses. The fix, a couple of days later, was to remove the ‘All’ option from the popup selector on the page so that a maximum of 100 results could be viewed at a time. As a consequence, my script was no longer a one-step solution for classes larger than 100. It could still be used by viewing results for one marker at a time and recording the means and standard deviations for offline comparison. Not many markers attempt more than 100 pieces in a single assessment, so that limitation was unlikely to be a problem.
Nevertheless, I like a tidy solution to my problems and, having climbed up the greasy pole, I was not happy about sliding a good way back down. The challenge was on. It’s taken a few evenings over the past week but I’m pleased to say that version 2.0 of my markerStats script is now available. There were a few challenges along the way and I learned some more about JavaScript and the DOM that I’ll likely forget if I don’t get more practice soon.
The original script worked on data already present in the table that made up most of the page. That was relatively quick and simple. The updated version needs to deal with the cases where that is still true – fewer than 100 students and all on the same page – and with cases that might have several hundred students across a number of pages determined by the total number of students and the number (10, 20, 50, or 100) displayed on a single page. Those conditions are easily distinguished by the absence or presence of a div with class paging.
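The branch between those two conditions can be sketched as below. The div with class paging is the marker the post describes; the function takes the document as a parameter only so the sketch can run outside a browser, and the fake document in the test is purely illustrative.

```javascript
// Decide which path the script should take: true when Moodle has split
// the results across several pages (a div.paging block is present),
// false when everything is already in the one table.
function isPaged(doc) {
  return doc.querySelector('div.paging') !== null;
}
// In the userscript this would be called as isPaged(document).
```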
With the necessary data already present in the page it makes sense to simply calculate and display the statistics. That happens quickly in the background and does not adversely affect the user experience. With the data spread across multiple pages that is no longer possible, and retrieving and assembling the data can take some time – that delay was what caused the problem that resulted in the ‘All’ option being removed. The obvious solution was to have the calculation done on request rather than as a matter of course, so the first step was to replace the display of results with a button to request calculation and display. Placing a button in the page is easy, but the script it needs lives in Greasemonkey rather than in the page. That prompted my first lesson on the use of a listener in JavaScript.
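The wiring that lesson covers looks roughly like this. In the userscript, btn would come from document.createElement('button') and be inserted into the page; here a bare EventTarget (available in both browsers and Node) stands in so the listener pattern can run anywhere, and the handler body is a placeholder.

```javascript
// Join a page element to Greasemonkey code with addEventListener:
// the button lives in the page, the handler lives in the script.
const btn = new EventTarget(); // stand-in for a real <button>
let requested = 0;
btn.addEventListener('click', function onRequestStats() {
  requested += 1; // in the real script: fetch the pages, then compute stats
});
// A user click would dispatch this event in the browser.
btn.dispatchEvent(new Event('click'));
```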
Collecting data across multiple pages required the number of pages and a systematic way of generating the relevant URLs. The number of links in the paging div can vary depending on which page (first, last, or in between) is being viewed but it was easy enough to extract the number from the pattern.
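A sketch of that step: the numbered links in the paging div sit alongside “Next”/“Previous” text, so non-numeric entries are filtered out before taking the maximum, and the URL list is then generated systematically. The page query parameter and zero-based numbering are assumptions about the URL scheme, not confirmed details.

```javascript
// Find the highest page number among the paging div's link texts.
function lastPageNumber(linkTexts) {
  const nums = linkTexts.map(Number).filter(n => Number.isFinite(n));
  return Math.max(...nums);
}

// Build one URL per page from a base URL (zero-based pages assumed).
function pageUrls(baseUrl, pageCount) {
  const urls = [];
  for (let p = 0; p < pageCount; p += 1) {
    urls.push(baseUrl + '&page=' + p);
  }
  return urls;
}
```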
Working through the pages by loading them in the window was likely to be ugly because it would require taking control from the user. Loading in the background was accomplished using XMLHttpRequest, requiring more learning about JavaScript.
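A minimal sketch of that background load, assuming a plain callback style; the function only defines the request, so nothing runs until it is called from the browser.

```javascript
// Fetch one result page in the background without disturbing the user.
function fetchPage(url, onDone) {
  const xhr = new XMLHttpRequest();
  xhr.open('GET', url, true); // true = asynchronous, so the UI stays live
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      onDone(xhr.responseText); // raw HTML of the fetched page
    }
  };
  xhr.send();
}
```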
Acquiring the content of multiple pages one at a time was a start but then that material needed to be parsed and the required data extracted. The solution was ultimately simpler than I thought it might be. It is possible to create a document fragment that is not linked into the DOM and is therefore not visible in the page but otherwise behaves like regular DOM so that selectors work. It was simple enough to load the first page into a fragment and extract the table from it. I agonised over how to merge multiple tables, imagining solutions that might require appending blocks of HTML and removing unnecessary code or transferring one row at a time from one table to another. In the end I found that, having created my table from the first page loaded into the fragment, I could iteratively replace the fragment contents with a new page, select just the tbody from that and append it to the table I was building. HTML is happy to have tables with multiple tbodies and will treat them like any other table for purposes of extracting data.
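The merging trick can be sketched as below. The post describes a detached document fragment; a detached element is used here because innerHTML is not available on a bare DocumentFragment, and it behaves the same way for selector purposes. Passing doc in, rather than using the global document, is only to keep the sketch runnable outside a browser.

```javascript
// Parse one fetched page off-screen, pull out its tbody, and append it
// to the table being assembled. A table happily carries several tbody
// elements and treats them like any other rows when extracting data.
function appendResults(doc, table, pageHtml) {
  const holder = doc.createElement('div'); // detached, never in the page
  holder.innerHTML = pageHtml;             // selectors still work on it
  const tbody = holder.querySelector('table tbody');
  if (tbody) {
    table.appendChild(tbody);
  }
}
```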
With the data from multiple pages secure in a single table I was able to use the code I’d already developed for calculating and displaying the statistics. That can take some time, so there is a need for a progress indicator. I did a quick and dirty job on that by arranging for the text displayed on the button to count down as the pages are loaded and processed in the background.
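That quick-and-dirty countdown might look like this. Only the button’s textContent is touched, so a plain object can stand in for the button outside the browser; the label wording is illustrative.

```javascript
// Update the button's label as each page finishes processing, and
// restore it once the countdown reaches zero.
function showProgress(btn, pagesLeft) {
  btn.textContent = pagesLeft > 0
    ? 'Calculating... ' + pagesLeft + ' pages to go'
    : 'Calculate statistics';
}
```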
One last issue had been solved earlier. My classes used for testing had few markers but David Jones had mentioned that in his class with more markers the div containing my display table had overflowed down the page, obscuring part of the data in the page. His suggested solution was to put the statistics in a popup window. I may yet do that but it would require work on styling the table that I can get for nothing in the page by using existing styles and it might run foul of users with popup blockers. Instead I limited the height of the div where the table is placed (taller when the space from the paging block permits) and set it to scroll on overflow. That’s not perfect but it works.
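The containment fix amounts to two style properties on the statistics div. The pixel values below are illustrative stand-ins, not the script’s actual numbers; the point is the max-height cap plus overflow scrolling.

```javascript
// Cap the height of the stats div and let it scroll when many markers
// would otherwise push the table down the page. Allow a taller cap
// when no paging block is competing for space (values are examples).
function constrainStatsDiv(div, hasPaging) {
  div.style.maxHeight = (hasPaging ? 200 : 300) + 'px';
  div.style.overflowY = 'auto'; // scrollbar appears only on overflow
}
```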
I’m back at the top of the greasy pole and waiting for the next institutional change to induce backsliding.