Assessing some options

A bit more than a week into the new year and the beginning of semester is inching closer. It’s time to get some work done on the courses I’ll be teaching. Content will need to be updated, but most times I begin by thinking about assessment and how it will enable students to demonstrate (or not) their learning.

The larger of the two courses I’ll be teaching is EDP4130 Technology Curriculum and Pedagogy, with 130 students enrolled at present: 55 spread across three campuses (16, 9 and 30) and the balance online. All students will have access to the online material and, based on past experience, attendance at classes will vary according to students’ other commitments. I’ll be dealing personally with the online group and the 16 (so far) on campus. A colleague on another campus will deal with the 30, and the 9 will be serviced by a casual staff member. We will need assessments that can be managed through the LMS (Moodle) by that mixed group of students and staff.

It does not really help that institutional procedures required the general structure of the assessment (number of items, weights, relationship to objectives and AITSL standards, and due dates) to be specified in June-July last year. At that point the dust had not settled on the previous offer of the course and there had been little or no time to reflect on what worked or what should be changed and how. As a consequence, the pattern of assessment is very similar to what I have used previously, with some adjustments to weights. An opportunity to consider what happened last time and to work on the design of the assessment, while retaining more flexibility, would be much preferable.

In addition to a professional experience (weighted 1% but students must pass it to complete the course) there are 4 items of assessment as follows: Task 1, 20%, Week 2; Task 2, 20%, Week 5; Assignment, 45%, Week 14; Quiz, 14%, Week 15. The quiz is handled online through the Moodle system and is intended to encourage students to engage with the course content as presented in lectures, activities, and readings. The details of the other items are yet to be completed. That’s my next task.

I’ve taught this course since 2011 and taught a similar predecessor course from 2002 until 2007. Over that time I’ve used a WebQuest as a learning activity to engage students with an issue associated with the application of technologies, and I’ve written about it previously. Most times the WebQuest has figured in the assessment pattern, and I’ve maintained that connection to encourage engagement with the activities. When there was no connection to assessment, engagement reduced in favour of activity that counted. For the past few years the subject of the WebQuest has been Coal Seam Gas in the local region. I spent some time last week refreshing the content of the WebQuest and adjusting its structure based on experience in recent years. I simplified the marking guide and updated the actual web material, using a Bootstrap template to make it responsive for students who might access it on devices other than a desktop computer. The CSG WebQuest should be ready to go as Task 2.

When EDP4130 first ran in 2011 I used a major assessment item based on one that I’d used in the predecessor course. Students were required to work in large groups to develop curriculum materials for teaching technologies education. That was generally successful. Students did good work and the requirement to share their work with the cohort meant that they completed the course with access to a collection of teaching resources. In 2013 I changed that task and developed one in which students curated existing resources. That was in response to the increasing availability of resources that made it more likely that teachers could adapt existing resources (if they could find them) rather than create their own. Curating resources and sharing those curations with the cohort gave students access to a variety of resources they could adopt or adapt for teaching.

Although the concept of the curation task seemed to work reasonably well with students, the actual implementation was a bit of a tangle. It was one part of a multi-part portfolio assignment with an atomised approach to the marking guide. Students found the complexity of the task difficult to grasp, and some seemed to become lost in the multiple linked pages that provided instruction and support. The atomised marking guide separated criteria (Howland, Jonassen & Marra, 2012), resulting in a large number of judgements to be made. When that was paired with the complex, and sometimes scattered, submissions from students, the workload for marking was much more intense than I anticipated. That needed to change, and in thinking about the necessary changes I’ve found some reading that nudged me in the other direction, toward fewer and more complex criteria that allow scope for professional judgement (Sadler, 2014). That thinking is reflected in the rubric constructed for the WebQuest.

Other influences on my thinking about how to adapt the assessment in EDP4130 include the commitment of USQ in its strategic plan to personalised learning, the focus in the Australian Curriculum: Technologies on creating preferred futures and design thinking, my own thinking about the need to practise what we preach (eat our own dog food), and some reading and writing I did during 2014 about project based learning which resonates with the inquiry and ‘maker’ emphasis appropriate to technologies education. Alongside those ideas I remain committed to the idea of students sharing their work with others in the cohort and more widely as has been done with the curriculum resources and curation work in the past. Hence my task becomes one of wrestling the remaining assessment pieces (Task 1 and Assignment) into a form that embodies as much as possible of those influences.

My thinking so far is to challenge students to design and complete a project, individually or in groups, in which they both learn about a less familiar aspect of the curriculum and develop a resource that would be useful for teaching that material. In addition to an expectation that they should share their resource with the cohort, and the wider profession, they would also be engaged in formal peer review of work within the cohort. That aspect should improve the overall quality of the materials developed by providing feedback prior to submission and allowing students to see the standard of work produced by peers. The first assessment, Task 1, would be developed as a design task, possibly using a design brief, in which students would review the curriculum, select aspects where they need to develop their capabilities, and design their project with an explicit rationale. The major assignment would then incorporate the projects, peer review, and reflection on the learning.

My challenge now is to specify those tasks in a way that will make sense to students and develop appropriate criteria and marking guides. Let’s see how that goes.


Howland, J. L., Jonassen, D., & Marra, R. M. (2012). Meaningful Learning with Technology (4th ed.). Boston: Allyn & Bacon.

Sadler, D. R. (2014). The futility of attempting to codify academic achievement standards. Higher Education, 67(3), 273-288. doi: 10.1007/s10734-013-9649-1

3 Responses

  1. I’m going to start a similar process with EDC3100 next week. Just a random thought based on something I’ve been thinking about that’s related to the SITE paper.

    The Australian Curriculum and its content descriptors are fairly central to what you’re talking about above, in particular the project. But there’s really no capacity in any of the institutional systems to make those first-class objects in the learning process: e.g. have the Moodle editor recognise and automatically link content descriptors to the Oz Curriculum page, have any unit/lesson plan template check whether content descriptors are appropriate and aligned with learning experiences, provide methods for students to know who is working on which content descriptors, etc.

    i.e. the tools we use could offer more affordances for students to do stuff with these content descriptors.
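The auto-linking idea above could be sketched as a simple text filter. This is a hypothetical illustration in Python, not an existing Moodle feature: the descriptor-code pattern (codes like ACTDEK001 from the Australian Curriculum: Technologies) and the URL template are assumptions, and a real implementation would belong in a Moodle filter plugin.

```python
import re

# Assumed pattern for Australian Curriculum content descriptor codes,
# e.g. "ACTDEK001" or "ACTDEP005". The exact code format is an assumption.
DESCRIPTOR_RE = re.compile(r"\b(AC[A-Z]{3,5}\d{3})\b")

# Hypothetical URL template for the Australian Curriculum site; a real
# filter would use whatever lookup URL the site actually supports.
URL_TEMPLATE = "https://www.australiancurriculum.edu.au/Search?q={code}"

def link_descriptors(html: str) -> str:
    """Wrap any recognised content descriptor code in an <a> element."""
    def repl(match: re.Match) -> str:
        code = match.group(1)
        return f'<a href="{URL_TEMPLATE.format(code=code)}">{code}</a>'
    return DESCRIPTOR_RE.sub(repl, html)

print(link_descriptors("This activity addresses ACTDEK001 and ACTDEP005."))
```

Run over course text as it is rendered, a filter like this would make every mention of a descriptor a live link back to the curriculum, one small step toward treating descriptors as first-class objects.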

    The same could perhaps be observed about the AITSL standards and even the learning outcomes for courses.

    I’ve been wondering how this might be addressed and whether it’s worth the effort.
    No answer yet.

    • I’m thinking that the task I assign to students will require them to select a content description as the basis for developing some publicly available resource that could be used to support learning. The resource could be a conventional unit plan, a curated collection of resources as they have done in recent years, a Scootle learning path, or something else.
      That would leave me an option to build a database linking resources contributed by students to content descriptions. That might be useful.
      I think that future developments will require us to identify AITSL standards more tightly in the context of assessment. That’s there in the specifications being developed for the MOLT and that trend seems more likely to expand than contract.
      We probably need to be looking at curriculum mapping tools to see if they might help. I’m not sure the facilities for that in Moodle are up to scratch yet.
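The database mooted above, linking resources contributed by students to content descriptions, could be very small. A minimal sketch using Python’s standard sqlite3 module follows; the table layout, column names, and example rows are all illustrative assumptions, not part of any existing system.

```python
import sqlite3

# One table linking each student-contributed resource to the content
# description it targets. All names here are illustrative assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE resource (
        id INTEGER PRIMARY KEY,
        title TEXT NOT NULL,
        url TEXT NOT NULL,
        descriptor_code TEXT NOT NULL  -- e.g. an AC code such as ACTDEK001
    )
""")

# Example rows standing in for student submissions.
conn.executemany(
    "INSERT INTO resource (title, url, descriptor_code) VALUES (?, ?, ?)",
    [
        ("Unit plan: simple machines", "http://example.edu/unit1", "ACTDEK001"),
        ("Curated links: design process", "http://example.edu/curation2", "ACTDEP005"),
        ("Scootle learning path", "http://example.edu/path3", "ACTDEK001"),
    ],
)

# Which resources cover a given content description?
rows = conn.execute(
    "SELECT title FROM resource WHERE descriptor_code = ? ORDER BY title",
    ("ACTDEK001",),
).fetchall()
for (title,) in rows:
    print(title)
```

Even this simple shape would let students (and staff) see which content descriptions are well covered and which have attracted no resources at all.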

  2. It’s the MOLT courses that have me thinking about AITSL etc. In particular, about how to make visible to both students and teaching staff how well (or not) individual activities in courses align with the standards (or learning outcomes). Which, as you point out, is what curriculum mapping does. But integrating that into the actual learning process is something I think could be useful. Last I checked – a few years ago – Moodle didn’t support this well, though some people were working on it. I also wonder how the badge stuff might link to some of this. Badges for AITSL standards? Or at least USQ versions thereof?