ORIGINAL RESEARCH ARTICLE

Online submission, feedback and grading of assessment: what do academic staff really think?

Emma Mayhewa*, Vicki Holmesb, Madeleine Daviesc and Yota Dimitriadid

aDepartment of Politics and International Relations, University of Reading, Berkshire, UK;
bCentre for Quality Support and Development, University of Reading, Berkshire, UK;
cDepartment of English Literature, University of Reading, Berkshire, UK;
dInstitute of Education, University of Reading, Berkshire, UK

(Received: 12 May 2020; Revised: 4 December 2021; Accepted: 22 January 2022; Published: 29 April 2022)

The move to institution-wide adoption of online submission, feedback and grading is increasing significantly within the Higher Education sector. This transition is predominantly driven by the need to improve the student assessment experience, but some institutions now also cite the need to improve the staff assessment experience. Existing studies, however, provide seemingly contradictory evidence surrounding this online marking experience. This article adopts a mixed methods approach to explore academic staff perceptions of the assessment experience within a UK-based institution following the adoption of online submission, feedback and grading during 2017–2018. It finds that although the majority of colleagues prefer to mark and provide feedback online, the process of marking electronically is highly individual. Online marking is not just a single practice but a set of varied, rich approaches, influenced by individual marker perceptions, preferences and previous experiences, and is often highly emotive. Changes to existing marking practices are seen simultaneously as both challenging and liberating by cohorts of markers. Drawing on the results of a detailed staff survey, this article identifies seven themes that influence that experience. These findings have significant implications for how institutions manage change during large-scale adoption of online marking.

Keywords: Online assessment; electronic assessment management; change management; assessment and feedback

*Corresponding author. Email: e.a.mayhew@reading.ac.uk

Research in Learning Technology 2022. © 2022 Emma Mayhew et al. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.

Citation: Research in Learning Technology 2022, 30: 2458 - http://dx.doi.org/10.25304/rlt.v30.2458

Introduction

The move away from clusters of online submission, marking and feedback of assessment towards institution-wide adoption is generating increased attention across the Higher Education sector (Ferrell 2014; Law 2018; Mayhew 2018; Verges Bausili 2017). The majority of providers are now actively exploring or implementing change in these areas. In 2014, a JISC-sponsored report outlined the findings of an online survey based on 90 responses from 70 institutions. Ninety-seven per cent of responders confirmed that their institutions were actively looking at online submission. Ninety-six per cent were undertaking work around online feedback. Eighty-nine per cent were exploring or working towards online marking (Ferrell 2014, p. 10). These are notable increases in comparison with previous years. The 2018 UCISA survey, based on 108 responses, found that Electronic Management of Assessment (EMA) was identified as the most challenging technology issue, ranked first in terms of new demand on Technology Enhanced Learning (TEL) teams (UCISA 2018). These trends represent what Verges Bausili (2017, para. 1) identifies as ‘a gradual institutionalisation of e-submission and e-marking technologies in UK higher education’. This ‘institutionalisation’ has been driven, in part, by the pressure to improve the student assessment experience, enhance learning and support student feedback satisfaction rates, which remain stubbornly low in comparison with other aspects of teaching and learning provision (Office for Students 2018).

Although student experience is a dominant driver for online marking, some institutions have also cited improvements to the staff assessment experience (Irwin, Childs, and Hepplestone 2016; University of Reading 2018). The relatively limited existing literature outlines a range of practical and pedagogic benefits for staff including less paper handling (University of Glamorgan 2012), reduced paper use (Ellis and Reynolds 2013; Rankin and Demetre 2012), less storage and increased feedback legibility (Rankin and Demetre 2012), easier management of marking (Ellis and Reynolds 2013), more space for comments and marking from any location (University of Glamorgan 2012), the functionality of marking tools including similarity reports (Buckley and Cowap 2013), rubrics and in-text comments (Ellis and Reynolds 2013), QuickMarks (Buckley and Cowap 2013; Djordjevic and Milward 2012; Ellis and Reynolds 2013; Rankin and Demetre 2012), faster marking for some assessments (Buckley and Cowap 2013) and the opportunity to start new conversations about assessment practices (Ellis and Reynolds 2013).

At the same time, a number of staff challenges are identified. In some institutions, academic staff have described online marking as more tiring than paper-based marking (a view reported by 67% of staff surveyed in Rankin and Demetre 2012) and as raising potential health and safety concerns (Howe 2013). There have also been technical challenges that have led to reduced confidence in existing marking platforms; the uptake of marking features has been slow (Buckley and Cowap 2013), and systems have been sluggish, less intuitive or more suitable for particular disciplines. It has been unclear how moderation might take place (Buckley and Cowap 2013), which papers have been marked or how to access specific word counts. In addition, markers have raised concerns that feedback might become ‘mechanical’ and depersonalised (Djordjevic and Milward 2012, p. 29) or that online submission and feedback could remove a potentially important staff–student contact point, with implications for pastoral care (University of Keele n.d.).

Solid patterns, however, are difficult to identify. The existing literature presents conflicting views about the staff experience of online marking. For example, staff surveyed at the University of Greenwich who used online marking felt that they were able to provide higher quality feedback online (Rankin and Demetre 2012), whereas Humanities staff surveyed at Bath Spa University felt that online marking may not improve quality (Adams, Meyer, and Anderson 2011). Staff at the University of Exeter identified increased screen time as a significant concern (Djordjevic and Milward 2012), whereas staff at the University of Huddersfield reported that increased screen time had no impact or expressed a preference for screen reading over paper (Ellis and Reynolds 2013). Over three-quarters of the 37 surveyed staff at the University of Greenwich found that marking online takes more time (Rankin and Demetre 2012), whereas the majority of 11 cross-disciplinary staff at the University of Huddersfield found that they marked more quickly and more efficiently online (Ellis and Reynolds 2013).

To date, there has been little exploration of this seemingly contradictory body of evidence, nor an explanation of why online marking should produce such conflicting responses. It is, however, crucial for institutions to understand the transition to online marking and to remain sensitive to the ongoing academic staff experience in order to identify, realise and evidence the benefits of change within this stakeholder group. Academics play a major role in implementing institutional assessment change and have traditionally enjoyed high levels of marking autonomy. Institutions need to understand marker perceptions in order to identify the conditions or circumstances likely to contribute to adoption and to a more positive experience.

In addition, there is very little research published after 2013 exploring the staff experience of online marking even though marking tools have developed significantly since that time in terms of functionality and the broader user experience.

This article contributes to institutional understanding by addressing the research question: what is the impact of the introduction and use of online submission, feedback and grading on the marker experience? This main research question was divided into three sub-questions: What are the main benefits and challenges associated with online assessment? Why might there be conflicting responses? How can the online submission, feedback and marking experience be improved for staff?

As part of a large, institution-wide EMA Programme, three schools at a medium-sized UK pre-92 institution were the first to transition from a variety of marking approaches to almost entirely online submission, feedback and grading during 2017–2018. Markers accessed assessment via Blackboard Grade Centre, then used either GradeMark/Turnitin Feedback Studio or Blackboard Inline Grading as a marking tool.

Support was provided in multiple formats for administrative and academic staff including hands-on sessions exploring new processes and practice, drop-in sessions, one-to-one sessions and online materials. This support was provided by the central Technology Enhanced Learning team, with an academic partner in each school providing local advice.

Method

The project was reviewed by a University Research Ethics Committee and gained ethical approval. The research team included three mid-career academics from different disciplines with extensive experience of marking offline and varied experiences of marking online and a senior technology enhanced learning lead. The team had active involvement in the development and implementation of the EMA Programme.

The first author researched the literature around EMA change management processes and shared the search protocols with the other three authors. Each author reviewed the documents from the literature independently, and then all four authors worked together to agree on the previously identified benefits, challenges and gaps in the literature. The key emerging themes informed the development of the research questions. The discussion also allowed the four authors to consider their values, personal interests and possible biases (Twining et al. 2017) and to agree the process they would follow to support the trustworthiness of the research.

The team wanted to explore participants’ views of the transition to online marking, which aligned with the four authors’ social constructivist epistemological background. The authors believed that discursive practices support the construction of knowledge (Denzin and Lincoln 2017). However, as the EMA project was a university-wide intervention, an anonymous questionnaire was felt to be a more ethically appropriate data collection tool: it protected colleagues’ identities, allowed them to express their views more freely and engaged larger numbers of participants from the three schools. The 20-item survey included both multiple-choice and open-ended questions to capture participants’ experiences.

The multiple-choice questions focused on establishing basic information about each participant’s school, background experience, and the marking tools and functionality that they used. The survey also included further multiple-choice questions asking colleagues to identify any benefits from a range of statements, the impact on practice including access and speed of marking, and the overall preference for offline or online marking. The vast majority of the survey questions, however, were open-ended, free text questions, aiming to encourage individual comments and reflections.

This concurrent mixed methods approach supported addressing the research questions and helped to triangulate the data on the reception, take-up of and action around online marking. The study adopted a pragmatic paradigm, consistent with mixed methods approaches, as it focused on ‘why’ and ‘how’, reflecting on action and doing.

The wording, style and focus of the questions were further refined following a questionnaire pilot. The updated version was circulated online and in hard copy to academic colleagues in the 2018 summer term. One of the authors was a member of one of the participating schools and was not involved in distributing the questionnaires at that school.

A thematic analysis approach, driven by themes from the literature, was applied to the data. The authors did not use qualitative data analysis software; instead, they exported the data to a spreadsheet and coded it manually, following Braun and Clarke’s (2006) six-step thematic analysis. All researchers had access to the data and familiarised themselves with the questionnaire entries. Author 2 first reviewed the data, generating and refining themes through manual coding and colour coding until saturation was reached. This codebook was then reviewed, along with a sample of comments, by Authors 1 and 3. At this stage, the coding was further reviewed and refined, staff well-being emerged as an additional theme, and the relevant narrative from the data to be included was agreed. Author 4 then reviewed the thematic and sub-thematic map before the team produced the report, relating the analysis back to the literature and the study’s research questions.
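The coding itself was done by hand, but the bookkeeping step of tallying agreed codes from a spreadsheet export is straightforward to illustrate. The sketch below is purely illustrative and not the authors’ tooling: the file name and the ‘theme’ column heading are hypothetical.

```python
import csv
from collections import Counter

# Illustrative only: the study coded responses manually in a spreadsheet.
# Assumes a hypothetical export "responses.csv" with one row per coded
# comment and a "theme" column holding the agreed code for that comment.
theme_counts = Counter()
with open("responses.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        theme = (row.get("theme") or "").strip().lower()
        if theme:
            theme_counts[theme] += 1

# A frequency table like this helps sanity-check an emerging thematic map.
for theme, n in theme_counts.most_common():
    print(f"{theme}: {n}")
```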

Results and discussion

In total, 47 responses to the questionnaire were received, representing an approximate response rate of 27%. While this figure is lower than anticipated, the respondents represented a cross section of academics. In terms of digital literacy, 4% of the 46 responders who answered this question rated themselves as having low confidence, 61% as average and 35% as highly confident. In terms of experience, 2% of the 46 responders who answered this question had marked online once in the last 12 months, 41% had marked online twice or more in the last 12 months and 57% had marked online both in the last 12 months and in previous years. Measures suggested in the literature (Nulty 2008) were put in place to improve the return rate: senior colleagues from each school distributed the questionnaires, sent reminders, extended the duration of the survey and highlighted that responses would inform the further roll-out of online marking. These actions increased the response rate.
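The survey population is not stated, but the reported figures allow a rough back-of-envelope estimate of it; the snippet below is a sketch of that arithmetic only, assuming the approximate 27% rate is taken at face value.

```python
# Rough estimate of the invited population (not stated in the article),
# derived from 47 responses at an approximate 27% response rate.
responses = 47
response_rate = 0.27
invited = responses / response_rate
print(round(invited))  # roughly 174 staff
```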

The majority of survey responders expressed strong satisfaction with the overall experience of marking online. Seventy-five per cent of the 40 who responded to the question preferred online marking and feedback, with the remaining 25% having no strong preference for either. This is a more positive response in comparison with earlier studies; Rankin and Demetre’s (2012) research shows just over a third of responders preferred to mark online, and Djordjevic and Milward (2012) found that, overall, the marker experience was worse than expected. As one participant involved in this study states, however, it is ‘not as simple as which one do I prefer’, as individuals identified both positive and negative aspects of their experience of online marking and feedback, regardless of their overall preference.

The survey asked responders to identify any key benefits of online submission, feedback and grading. They were able to choose from a list of statements associated with practical functionality, efficiency and pedagogy; 43 out of 47 responders identified at least one key benefit. The percentage of responders who highlighted each statement as a ‘main’ benefit is shown in Figure 1. Reduced paper handling, remote access and the use/reuse of QuickMarks (a bank of frequently used comments that can be reused when marking) were most consistently identified, but others reported more challenging experiences, such as difficulty navigating around documents. Just as there are seemingly conflicting views expressed within the existing literature, there are seemingly conflicting views within these survey results. This reflects a complex environment in which significant variations in individual experiences, shaped by personal perceptions and preferences, are evident.

Figure 1. Main benefits identified by responders.

In order to enhance our understanding of seemingly contradictory views and highly individual responses to change, this article identifies and explores seven key themes influential to the staff experience. The subsequent discussion argues that institutions should be cognisant of each theme and remain aware of the complexity of academic staff responses to changes surrounding practice, in order to effectively and efficiently support the delivery of change.

Reduced paper handling

The significant reduction in paper handling was identified as an important benefit by 57% of markers, as shown in Figure 1. In follow-on open-ended questions asking responders to explain their responses, some colleagues reported reduced anxiety about losing or destroying scripts: ‘I do not miss carrying piles of essays and worrying about losing scripts’, ‘I do not miss worrying about spilling coffee/tea on them or destroying them in any way’. Others state that saving paper brings significant relief from ‘the guilt that I’ve printed half a tree’. One responder enjoyed the freedom from staring at piles of paper and noted the positive impact on students of not spending money and time printing and submitting hard copies.

In contrast, responders also identified some unexpected anxieties surrounding the loss of paper handling and the resulting impact on familiar and effective marking practices. Over a quarter of respondents commented at some point in their open-ended replies on the physical nature of marking and interaction with papers. Some responders simply expressed a preference for hard copies, reflecting nostalgically on the merits of pen and paper and the ability to ‘scribble comments and annotations’. Three colleagues specifically mourn the loss of ‘the tactile interaction with the assignment’ and the movement of their hand on the paper. This psycho-physical response is also driven by the flexibility of hard copy annotation, such as the ability to draw circles around parts of the text.

Others found it easier to read paper copies and miss being able to gain an overview of the whole piece. One responder, when asked to identify any negative impacts in a free text field, noted that they like to spread work out on a table to see several pages at once. Another likes to gain a ‘feel’ for the work by flipping between or cross-checking sections, references or appendices. Other responders highlight a sense of disorientation, having lost the ability to spatially locate themselves within the assignment and navigate their way through it. Some colleagues mourn the physical aspects of hard copy when marking a whole set of assignments, both from an organisational perspective (being able to physically stack up assignments to help grade across the scripts) and from a motivational perspective (missing the ‘tangible sense of achievement when you move a script to the “done” pile’).

This loss of hard copy marking is often linked to the sense that something ‘personal’ has been lost. Although some research suggests students do not see online marking as negatively impacting personal relationships with tutors (University of Manchester Humanities 2013), three responders express concerns that feedback might become depersonalised. One notes how they missed being able to annotate scripts quickly with comments that ‘are personal to the student’. When asked if any additional support would be helpful, another asks for further guidance on feedback because ‘electronic systems are inhuman’. Both reveal that while some find reduced paper handling liberating, others retain a sense of nostalgia for the ‘personal’ implications of pen and paper and remain concerned that digitalisation has caused the loss of valued connections.

Confidence in new online spaces

The transition from physical marking to online marking involves working, literally, in a new space. There is a significant overhead in understanding and engaging with an unfamiliar space, which can be disorientating; for example, colleagues need to identify new ways of monitoring what has been marked or of navigating specific assignments. Challenges can be exacerbated by sub-optimal functionality in the digital marking tools and by systems not always performing as they should, presenting barriers to use. In this survey, when asked an open question about any challenges experienced using online marking, responders highlight a lag in the click response, an occasional failure to record audio feedback and the inability to obtain an accurate word count without downloading each file. One user found it difficult to check citations, cross out text or insert comments at the right point. Others reported that they could not tick more than one rubric square, that they were logged out periodically or that they could not open a bibliography in a separate window. There are challenges involving the use and functionality of marking tools, given the breadth of assessment work and marker preferences and expectations, which, if resolved, would support an improved marker experience. In addition, it takes time for staff to become familiar with the system and identify how to customise it (e.g. producing bespoke QuickMarks).

Several staff identified ways to address their lack of IT confidence and to ease levels of discomfort with an unfamiliar system, either through additional training or by designing workarounds. However, when asked about any impact on the speed of marking, one colleague admitted that their anxiety around losing comments had led them to write all of their feedback separately before copying it into the marking tools, causing significant duplication of effort. Concerns about the reliability of electronic systems are magnified by the ‘mission critical’ nature of feedback provision, so crucial to student success and satisfaction.

While training and support can be provided, it is not a substitute for practice and a sense of familiarity produced by experience. For example, when asked to comment on the impact on the speed of marking over time, one responder said ‘I have already become more comfortable doing it having had a bit of practice’ and when another was asked whether anything had surprised them using online marking, they said, ‘I have been surprised by how streamlined and straightforward it is – once you know what you’re doing!’. For others, there is an additional recognition that unfamiliarity can in itself be positive. When asked to compare the overall experience of online and hard copy marking, one respondent described marking online as a ‘novelty’.

When others were asked what surprised them, they highlighted their own ability to adapt: ‘It was much easier than I had thought it would be’; ‘I was reluctant, but actually there are benefits’; ‘I was surprised it was so much faster after I got used to it’. Many staff are keen to adapt their previous marking approaches and explore the new opportunities. They appear ready to acknowledge that this is a new and as yet unfamiliar way of working and that becoming more familiar and more practised can improve their experience.

Addressing access

A frequently cited benefit of marking online is flexible access, and this is supported by survey responders; 55% identify remote, instant access to assessment as a key benefit and 51% identify access to marks and feedback as an important feature, as shown in Figure 1. These benefits, however, are dependent on a reliable internet connection and reliable online marking tools and services. Eleven respondents report that online marking, far from increasing access, actually constrains the choice of working location. Feedback identifies the need for a broader understanding of location that goes beyond marking at home or in the office. When asked open questions about potential challenges of online marking, and about whether there was anything about hard copy marking they missed, staff reported missing the ability to mark in the garden, on the train, on a plane, in the car, ‘whilst watching my kids’ sports lessons’ or while away from IT equipment. Access constraints, when combined with other requirements such as feedback turnaround times, can compound pressure on staff. When asked to comment on whether or not the support provided was timely, one responder reflects on the lack of iPad availability and comments:

You cannot underestimate the stress that is placed on academics to provide high quality feedback within the 15-day-turnaround, and although [online marking] helps with the high-quality aspect, in my experience it has hindered the turnaround time by forcing me to work only at work.

This comment raises additional questions around the standard provision to markers of portable IT equipment (principally laptops) to enable them to mark either on campus or at home, especially when the workload during peak marking periods exceeds normal working hours.

Perhaps even more significantly, however, the comment focuses attention on an underlying issue involving academic workload in an environment where institutional demands, rapid change, student numbers, student needs and student feedback via the NSS and other surveys are all increasing pressures. The statement ‘forcing me to work only at work’ suggests that this respondent completes much of their work at home and in other environments; the hours worked within a school may be augmented by hours worked in the early mornings, in the evenings and at weekends. Workload may vary as a result of contractual variations (sessional, part-time or full-time), seniority (junior lecturer to professor), the split between teaching-intensive and teaching/research roles, and family responsibilities. These hours will be split between work and home and, although it may not be ideal in terms of feedback quality, marking will often occupy small ‘gaps’ in time (e.g. ‘whilst watching my kids’ sports lessons’) to at least keep it moving forwards. The move to online marking and feedback needs to be able to respond to these variable work patterns, which are necessitated by broad-based institutional issues across the sector.

For some staff, the lack of availability of additional equipment to support flexible marking and the inability to replicate previous marking practice may lead to a less positive perception. Providing additional devices can help to increase flexibility of marking location and improve satisfaction. It is unlikely, however, that the same solution would suit all colleagues, and academic workload remains a significant determinant of the staff experience of online marking. The move towards online marking potentially shines a strong light on workload creep and suggests that a serious conversation about workload is triggered by the move into this new domain.

Staff well-being

Reported concerns about accessing assessment and marking tools while away from normal working locations, or between other commitments, raise broader questions around staff well-being. When asked if there is anything that they miss, or do not miss, about hard copy marking, one colleague reports physical relief that they no longer develop cramp in their hand from writing long sections of feedback. However, in other open-text questions about their experiences, 14 others highlight health and well-being concerns: the impact of additional screen time on their eyes, causing increased fatigue; the impact on their hands of additional time spent using a mouse; and the potential postural damage caused by long periods sitting at a desk. Some responders feel that marking paper copies allowed them to take more breaks and to sit in different locations.

The conflict in respondents’ experiences is produced by different individuals’ needs and concerns, and any move to a new system will reveal similar gaps in perception as well as a range of different insights. The institutional move to online marking and feedback cannot account for each individual viewpoint, nor should it try to do so. It must, however, draw on the common threads surfaced by feedback surveys so that, instead of trying to rationalise diverse needs, it proves capable of speaking to consistent anxieties. As with issues relating to workload, comments relating to the physical impact of online marking need to be analysed in terms of the patterns they reveal; in this case, responses to the survey indicate some benefits of moving away from paper feedback but also widespread anxiety about its long-term implications for visual health. Ensuring that organisational units such as Occupational Health are cognisant of changes in technological and academic practice is important in monitoring long-term health implications.

Managing changing marking practices

Survey responders highlight considerable variation in existing feedback approaches and in the degree to which they feel that new marking tool functionality impacts the quality and quantity of feedback provided. Although one respondent describes online marking in free text comments as ‘just a different way of doing the same thing’, when asked in a multiple choice question whether there had been any positive impact, 73% of those responding reported a positive impact on their marking and feedback practices. This suggests that online marking enables or encourages markers to do different things. This may include achieving greater marking consistency and improving the clarity of feedback through the use of rubrics and QuickMarks.

QuickMarks were highlighted by responders as being of particular significance. When asked in a multiple choice question which marking tool features they had used, 85% of responders report using QuickMarks, and several note the tool’s substantial impact in terms of both efficiency and pedagogy. The use and reuse of QuickMarks is identified as the most important benefit of online assessment within the survey (Figure 1). When expanding on these benefits, responders said that they appreciate the ability to embed hyperlinks to online resources within QuickMark comments in order to provide precise, useful feedback to students. Of particular note has been the adoption of a discipline-specific set of QuickMarks in one school, which one responder finds ‘helps me to consider a wider range of aspects than I might otherwise have done’. Other responses also speak positively about using a set of marks designed in a series of meetings where appropriate tone, language and content had been agreed. This saved time but, more importantly, it ensured a consistent assessment process for students and paid due attention to the pedagogy underpinning assessment practice.

The benefits associated with the use of marking tool functionality are derived not only from their actual use but also, as Ellis and Reynolds (2013) found, as a trigger for conversations between colleagues, particularly given that markers now have easier access to each other’s feedback. These conversations have led to further questions about marking and feedback consistency, a call for marking exemplars and broad agreement surrounding the appropriate type and quantity of feedback.

Anxieties were expressed, however, about the impact of online marking on teaching practice, for example, around the possible tension between ‘efficiency’ and ‘quality’ of feedback. When asked to provide comments about any positive or negative impacts on assessment and feedback practices, six colleagues felt that they now give ‘better’, ‘richer’, ‘more detailed’ and ‘clearer’ feedback, and five others felt that marking consistency between academics had been enhanced. Others, however, raise concerns. Responders report that marking had become ‘more generalised and I am letting a lot of things go (typos, etc.) because it is so fiddly to mark’, ‘I am giving less and less feedback’ and, when asked about their general experience, one responder said, ‘I can give very fast feedback online, but the quality is less good. Giving equivalent feedback to the stuff I was doing with hard copy would take me too long’. These comments raise issues surrounding speed of marking and the functionality of existing tools. They also highlight how some staff view online marking as a straightforward transfer of their existing marking habits into the online environment, whereas others make use of the new opportunities that online marking offers (‘If I can be smarter with rubrics and QuickMarks it will be beneficial’). Once again, the conflict in questionnaire responses reveals a pattern that suggests a need to support some colleagues in developing an understanding of the functionality of online marking tools in order to support new practices.

The speed of marking

Speed of marking is a persistent theme in the literature exploring the transition to online marking (Ellis and Reynolds 2013; Rankin and Demetre 2012). This study included a multiple choice question asking responders whether, according to their own perceptions, they found it slower, the same or quicker to mark online compared with hard copy marking. The results reflect a broad variation of views as shown in Figure 2.

Figure 2. The impact of online access and online marking on efficiency (number of responders to this question = 45).

Responses are spread across different user groups, with no clear correlation between those reporting quicker access and marking and those reporting high confidence with technology or previous use of online systems. Staff identified multiple reasons for their rating in free text responses. Of the 17 responders who report time savings, eight mention the ability to reuse comments; of the 11 responders who do not report time savings, however, five state that, although they can reuse comments, inserting text is ‘clunky’ and time-consuming. Of those who find the process faster, four cite quicker and easier access to assessment; of those who find the process slower, some report that they ‘put off’ marking until they have reliable internet access. Three colleagues who find the process more efficient explain that they are faster at typing than writing, but two colleagues who find that the process takes more time complain that they are slower at typing than writing. Others find the process quicker because they can see who has submitted, find it easier to look back through assignments to compare them, and find it quicker to read on screen. In contrast, the slower group reports that the interface is ‘clunky’, that it is harder to move around assignments, that it is slower to read on screen and that eye strain reduces marking speed.

For some staff, this scenario is unlikely to improve over time, as a follow-on multiple choice question suggests: of the group of 11 colleagues who consider the online process to be slower, seven do not think that the time they spend marking will reduce. It is worth noting, however, that half of this group nevertheless say that they ‘prefer’ online marking when asked whether, having considered their overall experience, they prefer online or hard copy marking. The benefits of marking online seem to outweigh any concerns around the speed of marking. Overall, of the 45 responders to this question, just over half feel that the time they spend marking and giving feedback will reduce. This appears to be attributable in part to practice and familiarity (‘I was surprised it was so much faster after I’d got used to it’), technical skills, an ability to adapt (‘It will become quicker because I will modify my marking practices to work around the strengths and weaknesses of the system’), a desire to make the most of the opportunities offered (‘After I have identified common issues and then written QuickMarks for an assessment the feedback process gets quicker’), perceptions of the technology, access to equipment and previous marking experiences (which are in themselves diverse).

The variation across the quantitative and qualitative data suggests that views about the speed of online marking are heavily influenced by multiple factors. This indicates that the introduction of online marking and feedback needs to acknowledge the holistic nature of these factors in contributing to each individual’s experience, rather than regarding them as disconnected issues to be addressed separately. Although more intensive, working with individuals to understand their context, and seeking to identify and address the factors that have the most negative impact for them, is likely to create a more positive outcome. As before, however, any move to a new system will not be able to accommodate every individual’s preferences, although supporting individuals to manage the system in terms of their own practices and needs encourages as smooth and sympathetic a transition as possible.

Adopting and embedding new practices

Eighty-three per cent of respondents made use of the support and training available, with 63% rating this as Excellent or Good. Respondents found guided ‘hands-on’ learning and use of scenarios valuable in acquiring new knowledge and skills, and in being able to practise and apply these skills to real-world experience. Comments that the training was ‘comprehensive’ but at the same time ‘quick and at the right level’ show that training activities need to be well-pitched, focused and time efficient in order to be of value.

The survey responses also showed that giving voice to individual perspectives within a large-scale change management project was critical in supporting each person to contextualise new approaches within their existing practice. Building in time for discussion within large training sessions, providing 1:1 drop-ins (‘it was extremely helpful to be able to ask specific questions in the training session’) and being approachable and responsive (‘he never made me feel as if it’s my fault for needing to ask for help’) were cited as positive contributors to the adoption of new practice.

Our experience demonstrates the value of support and development activities for academic staff in successful adoption, by enabling staff to develop new practices in meaningful and personalised ways. These activities helped to reduce barriers when transitioning from the familiar to the unfamiliar and to engender a positive learning environment in which colleagues reported an increased confidence and self-belief.

Conclusion

This article has presented the findings of a staff survey designed to understand the marker experience of transition to, and ongoing use of, online submission, feedback and grading at a UK-based institution. Although the majority of colleagues prefer to mark and provide feedback online, citing a broad range of benefits, there are significant variations in individual perceptions, preferences and experiences across seven key themes: the physicality of paper marking, the adoption of new online spaces, access, well-being, marking practices, speed and support. This variation indicates that how people mark and the practices they use are hugely influential; online marking is not a unified practice but a set of varied and rich approaches, heavily influenced by previous experiences. The interplay between the different factors suggests that it is an individual’s combined experience and views of these factors that influence their responses and attitudes to online marking. As a result, changes to existing practices are seen simultaneously as both challenging and liberating by cohorts of markers. This explains the seemingly contradictory evidence about online marking both in the existing literature and in the results of the survey presented in this study.

The authors acknowledge that their active role in the institutional EMA project dictated the data collection approach and limited the opportunity for follow-up and in-depth interviews. As a result, open-ended questions were included in the survey as a way to elicit more detailed responses. The study reports on a purposeful but relatively small number of participants. The findings may not be generalisable across all Higher Education institutions, although the peer checking undertaken throughout the data analysis process supports the credibility of the findings and the transferability of the key themes to other Higher Education settings. This article argues that Higher Education institutions embarking on large-scale transition towards online marking should be cognisant of each of the seven themes and be mindful of the conditions or circumstances that are likely to contribute to a more positive experience. In particular, institutions would be well advised to acknowledge the individuality of the marker; recognise the importance of building familiarity for users negotiating a learning curve; create an evidence base to demonstrate the new opportunities offered by online marking; encourage the adaptation of practice rather than the replication of offline practices; and provide ongoing training at the point of need together with suitable equipment to access marking systems. These steps can ease the path to adoption as well as mitigate the impact of working online on staff well-being. Technology can be a positive and a negative disruptor. Recognising that the move from offline to online marking is not a single institutional change process but thousands of individual change processes, and being mindful of this complexity, is likely to lead to a more nuanced and effective approach to change.

References

Adams, J., Meyer, P. & Anderson, R. (2011) ‘E-feedback for better learning and experience’, International Conference on Teaching & Learning in Higher Education (ICTLHE), [online] Available at: http://www.academia.edu/1194839/E-feedback_for_Better_Learning_and_Experience
Braun, V. & Clarke, V. (2006) ‘Using thematic analysis in psychology’, Qualitative Research in Psychology, vol. 3, no. 2, pp. 77–101. doi: 10.1191/1478088706qp063oa
Buckley, E. & Cowap, L. (2013) ‘An evaluation of the use of Turnitin for electronic submission and marking and as a formative feedback tool from an educator’s perspective’, British Journal of Educational Technology, vol. 44, no. 4, pp. 562–570. doi: 10.1111/bjet.12054
Denzin, N. K. & Lincoln, Y. S. (2017) The SAGE Handbook of Qualitative Research, 5th edn, Thousand Oaks: SAGE.
Djordjevic, A. & Milward, S. (2012) OCME: Online Course Management Evaluation. Report, JISC, Bristol, [online] Available at: https://as.exeter.ac.uk/media/level1/academicserviceswebsite/aboutus/biss/iws/documents/OCMEFinalReportv1.pdf
Ellis, C. & Reynolds, C. (2013) EBEAM Final Report, [online] Available at: https://ipark.hud.ac.uk/sites/default/files/EBEAM_Project_report_compressed.pdf
Ferrell, G. (2014) Electronic Management of Assessment (EMA): A Landscape Review. Report, JISC, Bristol, [online] Available at: http://www.eunis.org/wp-content/uploads/2015/05/EMA_REPORT.pdf
Howe, R. (2013) SaGE Survey 2013, Report, University of Northampton, UK, [online] Available at: http://blogs.northampton.ac.uk/sage/2013/05/05/sage-survey-findings-and-moving-forward/
Irwin, B., Childs, J. & Hepplestone, S. (2016) Assessment Journey: A Programme to Provide a Seamless and Improved Assessment Experience for Staff and Students, Report, Sheffield Hallam University, [online] Available at: http://shura.shu.ac.uk/12116/2/Irwin%20Assessment%20journey.pdf
Law, J. (2018) ‘Jury “is still out” on online marking’, Times Higher Education, March 24, [online] Available at: https://www.timeshighereducation.com/blog/jury-still-out-online-marking
Mayhew, E. (2018) ‘Implementing electronic management of assessment: Four key barriers faced by higher education providers moving to online submission and feedback’, Research in Learning Technology, vol. 26, article no. 2083. doi: 10.25304/rlt.v26.2083
Nulty, D. (2008) ‘The adequacy of response rates to online and paper surveys: what can be done?’ Assessment & Evaluation in Higher Education, vol. 33, no. 3, pp. 301–314. doi: 10.1080/02602930701293231
Office for Students (2018) NSS Summary Data, [online] Available at: https://www.officeforstudents.org.uk/advice-and-guidance/student-information-and-data/national-student-survey-nss/get-the-nss-data/
Rankin, S. & Demetre, J. (2012) ‘The experience of online marking and the future development of online marking practice’, Compass: Journal of Learning and Teaching, vol. 3, no. 6. doi: 10.21100/compass.v3i6.155
Twining, P., et al., (2017) ‘Some guidance on conducting and reporting qualitative studies’, Computers & Education, vol. 106, pp. A1–A9. doi: 10.1016/j.compedu.2016.12.002
UCISA (2018) Report on Technology Enhanced Learning Survey, [online] Available at: https://www.ucisa.ac.uk/bestpractice/surveys/tel/TEL_survey_report_2018
University of Glamorgan (2012) Evaluation of Assessment Diaries and GradeMark at the University of Glamorgan. Final Report, [online] Available at: http://jiscdesignstudio.pbworks.com/w/file/67927168/JISC_Assessment_and_Feedback_-_Glamorgan_Final_Project_Report_post_JISC_feedback_Nov_2012.docx
University of Keele (n.d.) STAF Project Final Report, [online] Available at: https://www.webarchive.org.uk/wayback/archive/20140614073219/http://www.jisc.ac.uk/whatwedo/programmes/bcap/keele.aspx
University of Manchester Humanities (2013) eAssignment Progress Report 2012–2013, [online] Available at: http://www.humanities.manchester.ac.uk/tandl/policyandprocedure/documents/eAssessment_Project_Report_2013_14_May2014_v.2.2.pdf
University of Reading (2018) About the Programme: EMA Programme, [online] Available at: https://sites.reading.ac.uk/ema/about-the-ema-programme/
Verges Bausili, A. (2017) ‘From piloting e-submission to electronic management of assessment (EMA): Mapping grading journeys’, British Journal of Educational Technology, vol. 49, no. 3, pp. 463–478. doi: 10.1111/bjet.12547