Applying a framework to evaluate assignment marking software: a case study on Lightwork

Eva Heinrich* and John Milne

Massey University, Palmerston North, New Zealand

(Received 27 October 2010; final version received 18 August 2011; published: 27 March 2012)

Abstract

This article presents the findings of a qualitative evaluation of the effect of a specialised software tool on the efficiency and quality of assignment marking. The software, Lightwork, works in conjunction with the Moodle learning management system and provides support through marking rubrics and marker allocations. To enable the evaluation, a framework was developed, based on an extensive literature review and on interviews with academics in tertiary settings. The framework introduces key factors that are crucial to educationally sound and efficient assignment marking. The use of Lightwork is compared to the prior experiences of participants, who had used either electronic or paper-based approaches. The findings are analysed using the framework. The study indicates that Lightwork is well suited to supporting efficient, high-quality assignment marking. It is suggested that the evaluation framework can be used for future studies in this area.

Keywords: assignments; assessment; evaluation; Lightwork; case study; marking; quality; efficiency; e-learning

*Corresponding author. Email: e.heinrich@massey.ac.nz

RLT 2012. © 2012 E. Heinrich and J. Milne. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons “Attribution 3.0 Unported (CC BY 3.0)” license (http://creativecommons.org/licenses/by/3.0/) permitting use, reuse, distribution and transmission, and reproduction in any medium, provided the original work is properly cited.

Citation: Research in Learning Technology 2012, 20: 16152 - DOI: 10.3402/rlt.v20i0.16152

Introduction

Assessment via assignments is common practice in tertiary education. Students create work that is assessed both summatively and formatively by teachers and markers. Marking encompasses a wide range of tasks, from administrative steps to the application of pedagogical principles. For a number of years, Learning Management Systems (LMS) have provided rudimentary support for assignment marking. This support falls short both in handling practical administration tasks and in embodying educational theory. In response, a new software application called Lightwork (http://lightworkmarking.org) has been developed.

Lightwork is an open source application that complements the assignment functionality already provided by the Moodle LMS (http://moodle.org). It is installed on the user's personal computer (Windows, Macintosh or Linux), and its operations are tightly integrated with Moodle's assignment drop box and grade book. The teacher defines the assignment specifications in Moodle; students submit their work to Moodle and receive their marking results via Moodle. Lightwork takes information on courses and student lists from Moodle. The organisation of marking, the marking itself and the quality control of marking are conducted in Lightwork (Figure 1). Lightwork supports marking teams by capturing who is marking whom, as the basis for progress checking and quality control. Among the reasons for developing Lightwork as a separate application instead of as a web-based plug-in were the desire of academics to mark offline and the need to handle, and potentially annotate, student work in a variety of application formats.

Figure 1. Screenshot of Lightwork showing a marking sheet, using a mock-up course.

The decision to interface Lightwork with Moodle was based on the popularity of Moodle (well above 50,000 registered installations in June 2011, according to http://moodle.org/stats/) and its open source licence. The level of support Moodle offers for assignments is comparable to that of other widely used LMS; the functionality required (and provided by Lightwork) to complement existing Moodle functionality would conceptually be very similar for other LMS.
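
To make this division of labour concrete, the following sketch expresses the data flow just described in Java. It is a minimal illustration only: the type and method names (MoodleService, fetchEnrolledStudents and so on) are invented for this article and are not Lightwork's actual API.

```java
// Illustrative sketch of the Moodle/Lightwork division of labour described
// above. All names are invented for illustration; Lightwork's real
// communication with Moodle is not shown here.
import java.util.List;

record Student(int id, String name) {}
record Submission(int studentId, String filePath) {}
record MarkedSheet(int studentId, double mark, String feedbackPath) {}

interface MoodleService {
    // Course set-up and submission data are read from Moodle ...
    List<Student> fetchEnrolledStudents(int courseId);
    List<Submission> downloadSubmissions(int assignmentId);
    // ... and marks plus feedback documents are written back to it.
    void uploadResult(int assignmentId, MarkedSheet sheet);
}

class MarkingSession {
    private final MoodleService moodle;

    MarkingSession(MoodleService moodle) { this.moodle = moodle; }

    // Pull the data needed for offline marking, mark locally, push results back.
    void synchronise(int courseId, int assignmentId) {
        List<Student> students = moodle.fetchEnrolledStudents(courseId);
        List<Submission> work = moodle.downloadSubmissions(assignmentId);
        for (MarkedSheet sheet : markLocally(students, work)) {
            moodle.uploadResult(assignmentId, sheet);
        }
    }

    // Placeholder: the marking itself is interactive work done in Lightwork.
    private List<MarkedSheet> markLocally(List<Student> students, List<Submission> work) {
        return List.of();
    }
}
```

The design point reflected in this interface is that Moodle remains the system of record for enrolments, submissions and released results, while all marking activity happens locally on the marker's computer, which is what makes offline marking possible.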

This article presents a framework for structuring the evaluation of assignment marking tools and then uses this framework to report on the evaluation of Lightwork. Particular attention is paid to the effects on efficiency and quality of assignment marking. This work is part of a wider research effort that looks at how academics can be supported in conducting assessment with assignments that facilitates student learning and is manageable within the constraints of the tertiary sector. A strength of the framework is that it builds both on education theories and on experiences from practice. It utilises the significant knowledge available around assessment and assignment marking to provide focus to an evaluation of marking approaches and tools.

Key members of the Lightwork team, who are experienced academics and e-learning researchers, conducted the Lightwork case study. While great care has been taken to remain objective, this was a designer-led rather than an independent evaluation. Further independent studies would be valuable to explore how Lightwork could be used in different contexts.

Evaluation framework

A comprehensive review of the literature was conducted to establish the nature of the support that an e-learning solution for assignment marking needs to provide (Heinrich et al. 2007). This was complemented with research conducted in 2006 involving 90 lecturers in tertiary education in New Zealand (Milne et al. 2007), who were asked about their practical needs for assignment marking. From there, key issues were identified that could be addressed by a software solution. These key issues (Table 1) form the framework for the evaluation. The following sections outline the importance of each element of the framework and the focus of the evaluation.


Table 1. Evaluation framework for marking and management of assignments.
Key issue | Focus of evaluation
Marking rubrics | What are the issues around devising marking rubrics?
Marking feedback | What are the quality and efficiency issues of providing marking feedback?
Administration of student submissions and marking results | What impact does Lightwork have on traditionally time-consuming tasks (as good pedagogy cannot be implemented if practicalities are not taken care of)?
Marking consistency and reliability | If and how does Lightwork facilitate achieving consistency and reliability?
Reflection on marking process and outcomes | Does using Lightwork facilitate reflection by teachers?
Additional aspects around electronic support | What are marker experiences of using Lightwork with regard to pedagogy and practicalities?

Support for marking rubrics

The importance of marking rubrics is well established in the literature (Gronlund 2006; Hanna and Dettmer 2004; Lambert and Lines 2000; Linn and Miller 2005; Nitko 2004). Rubrics support learning and instruction by making expectations and criteria explicit (Jonsson and Svingby 2007). Rubrics should be developed together with assignment tasks and should be made available alongside them. Rubrics help students understand the targeted learning and quality standards (Reddy and Andrade 2010). They help markers focus on assessing the targeted learning outcomes and prevent them from being side-tracked by issues such as presentation. Scoring with a rubric is likely to be more reliable than scoring without one (Jonsson and Svingby 2007). The 2006 interviews confirmed that lecturers use marking rubrics. This evaluation looked at how rubrics are supported.

Support for providing feedback aligned to marking rubrics

Every student should receive feedback outlining the strengths and weaknesses of their work (Linn and Miller 2005; Nitko 2004). Individualised feedback is required, regardless of the type of marking rubric used (Nitko 2004). Marking is a time-consuming task (Linn and Miller 2005) that can be made more efficient with the provision of comment banks (McLachlan-Smith and Irons 1998). Students highly appreciate feedback on their work (Margrain et al. 2009), yet often regard the feedback they receive as too vague (Campton and Young 2005). They request overall guidance as well as specific comments placed directly into their work (Orsmond, Merry, and Reiling 2005). The 2006 interviews showed that time is seen as the biggest obstacle to providing high quality feedback. They also indicated that teachers tend to give more feedback when supported by electronic systems, as doing so is more efficient. This evaluation examined how feedback can be aligned to marking rubrics and what support is available for creating high quality feedback efficiently.

Support for administration of student submissions and marking results

While administration issues are seldom discussed in the education literature, they were prominent in the 2006 interviews with lecturers. Lecturers wanted to reduce the time spent on administration in order to focus on what contributes to learning. Current electronic systems help but need improvement, for example by eliminating manual steps such as downloading, unpacking, renaming and converting files, and by replacing manual marker allocation and manual distribution of student work and marking results. Keeping results in electronic form is useful, as it simplifies record keeping and allows for analysis; yet there is concern about the current need for double-handling between systems. This evaluation analysed the support provided for administration issues.

Support for achieving marking consistency and reliability

Students are concerned about the fairness of assessment (Nesbit and Burton 2006). Achieving reliability is challenging (Linn and Miller 2005; Nitko 2004), and moderation is critical to successful marking (Brown 2009; Gronlund 2006). The use of appropriate rubrics enhances the reliability of marking (Gronlund 2006). The 2006 interviews suggested that having assignments available electronically facilitates moderation and quality control. Teachers tend to feel more confident in their marking and assessment processes when electronic tools support reliability and transparency. This evaluation checked whether features are provided that support the consistency and reliability of marking.

Support for reflection on marking processes and outcomes

The analysis of all feedback provided to students identifies strengths and weaknesses across the class and can guide further teaching (Nitko 2004). Teachers can and should learn from this feedback (Hattie 2009). The 2006 interviews showed that teachers who keep historic records of marked assignments see benefits for the marking and moderation of subsequent classes. Lecturers reflect on assignment experiences to inform future teaching, and doing so is aided by electronic assignments and marking. This evaluation examined whether reflection is supported.

Additional aspects around electronic support

Many additional aspects of electronic tool support in general, and assessment support more specifically, are addressed in the literature. Electronic systems should offer a complete approach, covering the actual marking process in addition to management aspects (Jones et al. 2005). Electronic systems can positively influence the links between assessment and teaching and learning (Buzzetto-More and Alade 2006) and can increase the awareness of staff regarding assessment approaches (Aller et al. 2005). They can contribute to staff development by facilitating the sharing of marking comments among colleagues (McKenzie 2004). Staff members need strong support networks to aid the adoption of systems (Freeman and McKenzie 2002). Lecturers in the 2006 interviews emphasised that they need strong institutional support in selecting electronic systems and in learning to use them effectively. This evaluation assessed whether the system supports both practical issues and pedagogy.

Methodology

A total of 22 semi-structured interviews were conducted with participants from four New Zealand tertiary institutions. Appendix A provides the schedule of questions used to guide the interviews. Fifteen participants were teachers in charge of the assignments, two participants had the role of providing administrative and marking support, and five participants were employed as assistants to mark assignments under guidance.

An invitation to participate in the evaluation study was sent to known Lightwork users at the participating institutions. All individuals who agreed to participate were interviewed, with the exception of two individuals who were unable to participate in the timeframe available. Massey University's Ethics Committee approved the evaluation project, and all participants’ institutions gave permission to interview staff.

The interviews were audio recorded and transcribed. The analysis of the transcripts was guided by the criteria of current context, previous methods of marking, and perceptions of efficiency and marking quality. First, the task and marking contexts of the participants were considered. Data were extracted relating to the subject areas of the assignments, the class sizes, the type of student work requested and the file formats submitted. The composition of the marking teams was examined in terms of the number of team members and their roles in the marking process. Next, the marking approaches practised before using Lightwork were analysed, as knowledge of these was essential for interpreting responses on the effect of using Lightwork. Information on the approaches was extracted and sorted into categories, and the shortcomings and advantages of these pre-Lightwork approaches were summarised. The focus of the analysis then turned to the criteria of the evaluation framework. Any statements had to be set against the specific assignment circumstances of the participants and, in particular, their reference points of paper-based or electronic assessment.

Results

Those interviewed taught in the following subject areas: accountancy, computer science, education, healthcare, information systems, information technology, languages, law, management, physiology, psychology, sociology and sports sciences. Students submitted work in the form of essays, reports, presentation material, calculations and computer programs. The file types of student work included word processing documents, PDF documents, presentation slides, spreadsheets and compressed programming project file collections. The assignments investigated belonged to courses contributing to certificate, diploma and university preparation levels, as well as undergraduate and postgraduate degree programmes. Overall, assignments for 17 courses with class sizes between 10 and 1000 students were discussed. In some cases, Lightwork had been used for several assignments of the same course.

The experience of using Lightwork was analysed in the context of the approaches used previously by the participants. Some lecturers had already used Moodle for electronic assignment submission; for them, the transition focused on learning the Lightwork application. Other lecturers had not yet worked with electronic assignment submission; for them, both the Moodle assignment module and Lightwork were new.

The following sections link the findings of the evaluation to the evaluation framework identified earlier.

Findings related to marking rubrics

Participants confirmed that Lightwork provides the structures for creating marking rubrics and found that creating rubrics in an electronic tool requires more precision than just sketching out a rubric on a piece of paper. It seems that using the tool influenced the thinking of lecturers:

L8: “The fact that you have to state what it is and its funny because I mean I have always done that, but probably more fudged, I think it's made me, I think it's made me more explicit.”

Participants stated that structuring a marking rubric requires considerable thought and is best done in conjunction with designing the assignment task, with the rubric structure following the structure of the assignment. Participants spoke about the challenges they experienced when constructing their rubrics. There is a tension between not wanting to restrict too much and still having enough structure for marking, especially in the context of assignments at higher conceptual levels or with a high degree of freedom in content or creativity.

More research will be required to explore issues around marking rubric creation, in particular to distinguish more clearly between the effects caused by an electronic tool and the questions that arise conceptually around rubric creation in general. The indications from this case study are that Lightwork had a positive impact. It allowed all participants to create rubrics in the way they wanted. The mechanics of the software ensured that rubric creation was completed before marking started. Using Lightwork also triggered reflection on the usefulness of the rubrics used.
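
As an illustration of the structure an electronic tool imposes on rubric creation, the sketch below models a rubric as a set of criteria, each with marking levels and optional instructions to markers. The names are assumptions made for illustration and do not reflect Lightwork's internal data model.

```java
// Illustrative model of an electronic marking rubric (invented names,
// not Lightwork's internal data model).
import java.util.List;

record Level(String label, double marks, String descriptor) {}

record Criterion(String name, String markerInstructions, List<Level> levels) {}

record Rubric(String assignmentTitle, List<Criterion> criteria) {
    // An electronic tool can enforce that every criterion is fully
    // specified before marking starts, unlike a sketch on paper.
    boolean readyForMarking() {
        return !criteria.isEmpty()
            && criteria.stream().allMatch(c -> !c.levels().isEmpty());
    }
}
```

A completeness check such as readyForMarking() mirrors the observation above: the mechanics of the software require the rubric to be explicit and complete before marking starts, in a way a paper sketch does not.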

Findings related to marking feedback

The rubric structure that is mirrored in the marking sheets Lightwork creates for each student helped to focus marking. The ability to create comment banks was regarded as very useful for providing valuable feedback efficiently. These comment banks can contain extensive explanations that can be quickly inserted by teaching assistants. This led to more feedback than would have been given if all comments had to be typed for each student individually:

L11: “… I think they ended up getting more feedback and probably more useful feedback.”

Using the prepared comments reduced the number of errors contained in the feedback to students. The support provided by comment banks can be particularly helpful to marking assistants and students working and studying in a second language. Complete feedback reassured students that markers had considered all aspects of their work, and consequently led to fewer enquiries after marking had been released.

It seems clear that Lightwork had a positive impact on providing feedback to students. The software made it faster to provide feedback than was possible with the tools or approaches used previously by the participants, resulting in feedback that was more complete and comprehensive. The organisation provided by the marking rubrics was seen as valuable for the structure and transparency of the feedback given to students.
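
To make the comment-bank mechanism concrete, here is a minimal sketch in which a marker inserts a prepared, quality-checked explanation by a short key instead of retyping it. The class and method names are invented for illustration and are not Lightwork's actual code.

```java
// Sketch of a comment bank: prepared explanations inserted by key rather
// than retyped, so wording and spelling stay identical for every student.
// Invented names; not Lightwork's actual implementation.
import java.util.LinkedHashMap;
import java.util.Map;

class CommentBank {
    private final Map<String, String> comments = new LinkedHashMap<>();

    void add(String key, String fullExplanation) {
        comments.put(key, fullExplanation);
    }

    // Returns the prepared explanation, or an empty string for an unknown key.
    String insert(String key) {
        return comments.getOrDefault(key, "");
    }
}
```

An entry is added once, for example bank.add("refs", "Sources cited in the text are missing from the reference list."), and can then be inserted into any number of marking sheets. Identical wording for every student is what produced both the time savings and the reduction in feedback errors reported above.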

Findings related to handling administration of student submissions and marking results

Participants appreciated that Lightwork keeps all the information required for marking in one place. This saved time locating documents and other details during marking. Manual copy-and-paste steps were eliminated, which again saved time and lowered the risk of making mistakes.

Participants agreed that allocating student work to markers could be done easily and efficiently in Lightwork. Having marker allocations and the marking status for each student within easy reach made it feasible to check on marking progress, contributing to faster return times. Participants perceived that more quality checking was likely to happen because of the ease with which it could be done.

Participants appreciated how Lightwork simplifies the return of marking to students by uploading marks, marking sheets and annotated student work:

T1: “… before we used to have to upload each individual mark sheet for each assignment that was marked. So we used to go into [Moodle] and we used to then browse to locate a mark sheet that we had input information on and saved in that student's name and then uploaded that and then saved that and then adjusted the mark as well. So that was quite time consuming and Lightwork completely removed the need to do all of that, so that was huge.”

The evaluation clearly showed that Lightwork helps with the administration issues around assignment marking. Participants confirmed that these administration issues are very important. Time saved can be invested into more productive aspects of marking. Ease of access to information enables quality control steps to be carried out and automation reduces the chance of human error.

Findings related to marking consistency and reliability

There was strong support among participants for the view that marking with Lightwork can improve the reliability of marking and the consistency between multiple markers. The features supporting this were the marking rubrics, the comment banks and the instructions to markers that can be integrated into the marking rubrics.

Lightwork provides the lecturers in charge with easy access to marking that has been completed by the teaching assistants. Lecturers appreciated this feature as it enabled quality checking. It also facilitated timely feedback to the teaching assistants, reassuring them when they were on the right track or pointing them in the right direction when required:

L8: “… I can pop in and see easily, locate the mark sheet, which I did on a regular basis ….”

Participants stated that Lightwork also had an effect on consistency in instances where there was just a single lecturer marking:

L13: “… it gives you immediately a good view of how you have marked other students.”

In Lightwork, the lecturer in charge of the assignment releases all marking to students; a teaching assistant with lower access rights cannot perform this task. This allows the lecturer to conduct quality checking and to determine the point in time at which all students get access to their marking results. This feature was appreciated.

In summary, participants gave a clear indication that Lightwork facilitates checking for consistency and reliability. The tools that Lightwork provides at this stage are fairly simple and are basically limited to providing structured, timely and flexible access to information concerned with the marking process and marking results. Many extensions to Lightwork are possible that would further improve support for this important area of assignment marking.

Findings related to marking reflection

Nearly all lecturers interviewed had many years of teaching and marking practice. Despite this level of experience, lecturers stated that using Lightwork encouraged them to review their practice:

L13: “… when I started using Lightwork I all of a sudden became aware of, oh well okay I've been doing this for some time now but maybe if I structured these things accordingly then that might help me a lot, … that actually worked.”

Lecturers reflected on how they moved forward from marking the first assignment of a course with Lightwork to marking the second, how in future they would be more specific in formulating the work requested from students, and how it would be better to have requirements and feedback more closely aligned. Using Lightwork had the effect of encouraging thinking beyond the current assignment to the bigger picture of assessment and course design.

Lightwork seemed to have had a strong effect in encouraging lecturers to reflect on their assignment practices. This is positive, as reflection is an important step in any improvement process.

Findings related to additional aspects around electronic support

Lightwork satisfied the participants' requirements for assignment marking. While additional features, for example audio comments in addition to typed ones, would be welcomed, all core steps of the assignment marking process are covered.

As Lightwork eliminates most of the manual handling steps related to marking, it lowers the risk of making mistakes. Lightwork encourages much more detailed recording of marking than many lecturers would do otherwise. This was seen as an advantage when there are enquiries about the marking:

L11: “… last year we got questions around would you please clarify why this happened and this year I think they seemed to feel much more comfortable with why we gave them the mark we did.”

Participants acknowledged that Lightwork can be a catalyst for reviewing and changing assessment practices. They also rightly pointed out that software tools by themselves cannot improve marking and that success will always depend on the person using the tools:

L13: “So you know from that perspective looking at it overall I'd say that it probably is not a blanket statement that Lightwork will make feedback better, it totally depends on who is doing the feedback.”

Lightwork had only just been developed when the participants agreed to trial it. Some of the participants moved directly from paper-based assignment submission to the use of Moodle and Lightwork. Several participants commented on the importance of having good support available to them.

Discussion

The participants in this study showed a solid understanding of what constitutes good marking and of how important marking, and assessment more generally, is to student learning. The exploration of using Lightwork naturally led to comparisons with the tools and approaches used previously by the participants. These comparisons confirmed findings from the literature and from the 2006 study that neither manual, paper-based approaches nor combinations of current LMS with generic tools provide sufficient support to academics in tertiary settings. Consequently, carrying out all the steps required for providing valuable formative feedback and for monitoring the quality of marking is very time consuming and not supported conceptually. This leads to academics either spending large amounts of time on the manual execution of steps or not performing these steps to the level suggested by the education literature.

An LMS like Moodle is good at handling some aspects, such as student assignment submission and the provision of access to marking results. The case study shows that a tool like Lightwork, designed on the basis of assessment theories, can fill the gap that otherwise exists in marking support for academics and their marking teams. Of specific importance is the impact on formative feedback, with case study participants confirming that Lightwork helped them provide more complete feedback in better-structured form.

As well-designed software should, Lightwork supports users in carrying out educationally sound practices. Lightwork supports the important steps recommended in the literature for assignment marking. As such, the tool fits in with established good practice, instead of forcing the user to adapt their practice to the tool:

L2: “It is a lot of the practices we do already; it is just consolidated so it talks back to that efficiency and to maintaining consistent quality between different markers”

In addition, using Lightwork has had the welcome effect of encouraging reflection. Table 2 summarises the key findings linked to the criteria of the evaluation framework.


Table 2. Key findings.
Key issue | Focus of evaluation | Key findings
Marking rubrics | What are the issues around devising marking rubrics? | Lightwork allowed the construction of marking rubrics. Constructing rubrics in Lightwork helped identify conceptual challenges in designing suitable rubrics, which could then be discussed and addressed.
Marking feedback | What are the quality and efficiency issues of providing marking feedback? | Compared to tools used previously by participants, Lightwork makes providing feedback more efficient, is likely to result in more complete feedback, and encourages markers to give positive feedback in addition to constructive feedback. Quality and efficiency of providing feedback are tightly linked and cannot be looked at in isolation.
Administration of student submissions and marking results | What impact does Lightwork have on traditionally time-consuming tasks (as good pedagogy cannot be implemented if practicalities are not taken care of)? | Lightwork successfully automates mundane and repetitive tasks. This saves time and reduces human error.
Marking consistency and reliability | If and how does Lightwork facilitate achieving consistency and reliability? | Lightwork successfully addresses consistency and reliability issues by linking the marking rubrics with the student feedback sheets, and by making marking data accessible in a structured and efficient way, thus lowering the barriers to carrying out quality assurance steps.
Reflection on marking process and outcomes | Does using Lightwork facilitate reflection by teachers? | The use of Lightwork triggers reflection on the marking process, as well as on marking and assignment task specification. The reason seems to be that using software (instead of paper-based approaches) brings more structure and enforces consistent application.
Additional aspects around electronic support | What are marker experiences of using Lightwork with regard to pedagogy and practicalities? | Lightwork successfully addresses two important areas: it incorporates the pedagogically important concepts (marking rubrics, feedback sheets, comment banks and marker allocations), and it addresses practicalities, saving time that can be better invested in pedagogically valuable tasks. As with any software tool, the capacity of Lightwork to improve assignment marking rests on the abilities of its users.

Conclusions

The case study indicates that Lightwork has a positive effect on the efficiency and quality of assignment marking. The participants, who showed significant knowledge of the marking practices recommended in the literature, stated that Lightwork provides efficient support for good marking approaches. The evaluation framework outlined in this article has not only given structure to these findings but can also be used to evaluate other electronic tools for assignment assessment, the results of which could then be compared with the Lightwork results.

The case study collected qualitative data that captured the richness of participants' experiences. These experiences varied greatly, depending on participants' personal frameworks for dealing with assignments and on the previous approaches with which they compared Lightwork. The qualitative data shed light on how the participants used Lightwork and the impact it had on their practice. Future studies could complement these data with quantitative measures; this was not the purpose of the present study.

Participants used an early version of Lightwork in the case study. Lightwork continues to evolve, and comments by participants pointed to general challenges in the design of such software. An application needs to facilitate good educational practice without restricting the user: it should cater for a variety of circumstances without becoming too complex through offering too many options.

The participants in this case study were early adopters who were highly knowledgeable in assignment marking practices and highly motivated. Institution-wide implementation of an application like Lightwork could reveal different issues, as academic participants would be more diverse in their motivation and in their knowledge of assessment practices. Excellent support structures at both technical and pedagogical levels will be required to enable the software to be used in support of effective assessment practices.

In conclusion, it is important to acknowledge that this research did not test the validity of the marking rubrics developed by case study participants. The focus of the study was on the application of these rubrics in reliable and efficient ways. Further research, in conjunction with Lightwork or in more general terms, should be conducted to examine the validity of marking rubrics.

References

Aller, B. et al. (2005) ‘WeBAL: a web-based assessment library to enhance teaching and learning in engineering’, IEEE Transactions on Education, vol. 48, no. 4, pp. 764–771.

Brown, G. (2009) ‘The reliability of essay scores: the necessity of rubrics and moderation’, in Tertiary Assessment & Higher Education Student Outcomes: Policy, Practice & Research, eds L. Meyer et al., Ako Aotearoa, Wellington, New Zealand, pp. 43–50.

Buzzetto-More, N. A. & Alade, A. J. (2006) ‘Best practices in e-assessment’, Journal of Information Technology Education, vol. 5, pp. 251–269.

Campton, P. & Young, J. (2005) ‘Please sir, may I have some more? A comparative study on student satisfaction with assessment feedback methods in an undergraduate unit’, paper presented at Balance, Fidelity, Mobility: Maintaining the Momentum?, the 22nd Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education (ASCILITE), Brisbane, Australia.

Freeman, M. & McKenzie, J. (2002) ‘SPARK, a confidential web-based template for self and peer assessment of student teamwork: benefits of evaluating across different subjects’, British Journal of Educational Technology, vol. 33, no. 5, pp. 551–569.

Gronlund, N. E. (2006) Assessment of Student Achievement, Pearson, Boston.

Hanna, G. S. & Dettmer, P. A. (2004) Assessment for Effective Teaching Using Context-Adaptive Planning, Pearson, New York.

Hattie, J. (2009) ‘The Black Box of Tertiary Assessment: An Impending Revolution’, in Tertiary Assessment & Higher Education Student Outcomes: Policy, Practice & Research, eds L. Meyer et al., Ako Aotearoa, Wellington, New Zealand, pp. 259–275.

Heinrich, E. et al. (2007) ‘Literature review on the use of e-learning tools for formative essay-type assessment’, [online] Available at: http://etools.massey.ac.nz/research.htm, Retrieved on 6 September 2010.

Jones, D. et al. (2005) ‘What makes ICT implementation successful: a case study of online assignment submission’, paper presented at the Open and Distance Learning Association of Australia (ODLAA) Conference, 9–11 November, University of South Australia.

Jonsson, A. & Svingby, G. (2007) ‘The use of scoring rubrics: reliability, validity and educational consequences’, Educational Research Review, vol. 2, pp. 130–144.

Lambert, D. & Lines, D. (2000) Understanding Assessment: Purposes, Perceptions, Practice, TJ International, Padstow.

Linn, R. L. & Miller, M. D. (2005) Measurement and Assessment in Teaching, Pearson Merrill Prentice Hall, Columbus.

Margrain, V. et al. (2009) ‘“Ka pai–well done”: student teacher perceptions of assessment feedback in distance learning’, in Tertiary Assessment & Higher Education Student Outcomes: Policy, Practice & Research, eds L. Meyer et al., Ako Aotearoa, Wellington, New Zealand, pp. 129–139.

McKenzie, S. (2004) ‘Assessing quality of feedback in online marking databases: an opportunity for academic professional development or just Big Brother?’, in Beyond the Comfort Zone: Proceedings of the 21st ASCILITE Conference, Perth.

McLachlan-Smith, C. & Irons, B. (1998) Ideas to Share: Examples of Successful Extramural Study Guide Design, Massey University, Palmerston North.

Milne, J. et al. (2007) ‘Survey report on the use of e-learning tools for formative essay-type assessment’, [online] Available at: http://etools.massey.ac.nz/research.htm, Retrieved on 6 September 2010.

Nesbit, P. & Burton, S. (2006) ‘Student justice perceptions following assignment feedback’, Assessment & Evaluation in Higher Education, vol. 31, no. 6, pp. 655–670.

Nitko, A. J. (2004) Educational Assessment of Students, 4th edn, Pearson Education, Upper Saddle River, NJ.

Orsmond, P., Merry, S. & Reiling, K. (2005) ‘Biology students’ utilization of tutors’ formative feedback: a qualitative interview study’, Assessment & Evaluation in Higher Education, vol. 30, no. 4, pp. 369–386.

Reddy, Y. & Andrade, H. (2010) ‘A review of rubric use in higher education’, Assessment & Evaluation in Higher Education, vol. 35, no. 4, pp. 435–448. doi: 10.1080/02602930902862859.

Appendix A: Interview questions for Lightwork users

Assignment context

What was the subject area of the assignment?

How many students were in the course?

What was the level of the course?

What type of work did the students submit (essay, report, calculation, …)?

How many files of what file types did the students submit?

Did students submit in groups?

        If yes, how many submitted together?

        If yes, did all team members receive the same mark and feedback?

Marking context

Did you mark by yourself?

Did you have a marking team?

        If yes, how was this team composed? (markers, admin staff, …)

Did you/your markers annotate student work directly?

        If yes, in which program did you do this? (Word track changes, PDF annotation, …)

Efficiency of marking

Do you find that using Lightwork has helped with the efficiency of the assignment marking?

        If yes, in which ways?

Quality of marking

Do you find that using Lightwork has helped with the quality of the assignment marking?

        If yes, in which ways?

Feedback from marking team members (if applicable)

Have you had any feedback from your marking team members on marking with Lightwork?

        If yes, what kind of feedback?

Feedback from students

Have you had any feedback from your students in regard to the assignment marking?

        If yes, what kind of feedback?

Knowledge about assessment

Has working with Lightwork had an effect on your knowledge about assessment?

        If yes, what kind?

Closing

Would you provide us with a copy of your marking rubric?

        Could I have a copy of your comments to markers and your frequently used comments?

        Would you allow me to publish your marking rubric in a document/website to help others with the development of marking rubrics?

        Why did you structure or write your marking rubric in this way?

        What worked well about it?

        What did not work so well?

        What would you change for next time?

        Would you provide us with some examples of marking sheets and annotated student work?

        Do you have any comments?