ORIGINAL RESEARCH ARTICLE

Implementing electronic management of assessment: four key barriers faced by higher education providers moving to online submission and feedback

Emma Mayhew*

Department of Politics and International Relations, University of Reading, Reading, Berkshire, United Kingdom

(Received 4 May 2018; final version received 17 October 2018; Published 12 December 2018)

Abstract

Adoption of online submission and feedback for formative and summative assessment is increasing significantly across the higher education sector. The majority of institutions in the UK have now identified themselves as moving away from pocketed, disparate use towards embedding institution-wide online assessment practices. Providers are driven by a range of benefits for staff, students and the broader institution. Research has started to explore the impact of change, but there has been very little sector-wide analysis exploring the challenges faced by institutions moving to adopt online submission and feedback. This paper adopts a qualitative approach to explore barriers faced by providers – barriers that have the potential to prevent, delay or limit the benefits to be derived by institutions currently approaching or undertaking change. It outlines the results of an extensive literature review, which highlights four key challenges surrounding change design, stakeholder management, policy and process as well as technical integration. This article argues that providers intending to implement institution-wide change in the future should be cognisant of these barriers, and those currently undertaking change should be cognisant of the experience of others to inform their own good practice, policy and pedagogy.

Keywords: EMA; e-assessment; technology-enhanced learning

*Corresponding author. Email: e.a.mayhew@reading.ac.uk

Research in Learning Technology 2018. © 2018 E. Mayhew. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.

Citation: Research in Learning Technology 2018, 26: 2083 - http://dx.doi.org/10.25304/rlt.v26.2083

Introduction

As a potentially transformative, major change process, the move towards institution-wide adoption of online assessment is attracting considerable attention among higher education institutions. The sector has witnessed a significant shift away from pocketed, isolated use of online submission, feedback and grading for all forms of formative and summative assessment toward the adoption of institutional approaches.

In 2014, a report based on responses from 70 institutions and sponsored by Jisc (the UK higher education sector support and advisory body on digital technology) found that 97% of responders said that their institutions had, or were looking at, online submission. Ninety-six percent already enjoyed or were investigating moving toward online feedback. Eighty-nine percent used or were exploring online marking (Ferrell 2014a, p. 10). This is a significant increase in comparison to previous years. Similarly, the 2016 Universities and Colleges Information Systems Association Survey of Technology Enhanced Learning, based on responses from 110 providers, found usage of e-submission tools had increased from 85% across institutions in 2014 to 93% in 2016. In 2014, 71% of institutions were supporting e-assessment tools, but in 2016 this had increased to 85% for summative assessment (Walker et al. 2016, p. 34). The 2016 annual Heads of eLearning Forum (HeLF) survey on Electronic Management of Assessment (EMA), based on responses from 53 institutions, reported that 41% of responders estimated that online submission, as the only form of submission, was the most common form of practice across their institution (Newland and Martin 2016, p. 10). These trends represent what Vergés Bausili (2018, para. 1) identifies as ‘a gradual institutionalisation of e-submission and e-marking technologies in UK higher education’.

This change, and the increasing pace of change, is occurring in response to the prospect of significant benefits for institutions. These benefits are mainly focused on improving teaching and learning provision – delivering a consistent and improved student assessment experience (Brunel University n.d.; Glover et al. 2015; University of Bristol 2018); supporting student engagement and enabling richer feedback (University of Aberystwyth n.d.); improving satisfaction, standardisation and consistency (Brunel University n.d.); meeting expectations (Stödberg 2012; University of Hertfordshire 2013); improving broader assessment design (Farrell and Rushby 2016); improving legibility and accessibility (University of Northampton 2013); reducing travel and printing (University of Sussex n.d.); and improving secure storage (University of Edinburgh 2015). Change is also driven by the need to improve the staff assessment experience – making marking easier, reducing the administrative burden of assessment on professional colleagues (University of Reading 2017), managing increasing student numbers and maintaining a comparable position with competitors.

Although the majority of institutions are responding to these drivers and have identified themselves as moving away from pocketed or disparate use, scaling up and embedding online assessment is still complex and challenging. Just as online submission, feedback and grading multiply throughout the sector, so do their complexities and challenges. These experiences are not unique but instead mirror some of the significant barriers that exist in the adoption of technology in general throughout the sector, documented within the existing literature (Birch and Burnett 2009; Latif 2017; Marshall 2016; Salmon 2016; Schneckenberg 2009).

Despite the scale of movement across the sector, there has been very little attempt, within the existing literature, to draw together contemporary institutional experiences of introducing online submission and feedback of assessment and to identify barriers to change – factors preventing and delaying change as well as those that are limiting the benefits of change. There are a number of important outputs from the EMA Jisc Project (Jisc 2016a) and pockets of published institutional material. Much of the peer-reviewed research offers a broad overview of the benefits associated with technology and assessment, focuses narrowly on specific technologies or particular types of assessment, such as digital exams, or is context specific. This is important work, but given the rapid changes within the sector surrounding the shift from offline to online assessment, there is a need to identify and analyse the broad range of existing publications in this field. Disseminating institutional learning is crucial to help ensure that other providers are able to quickly and fully realise the benefits of online assessment, improving both the student and staff assessment experience.

This article provides a contemporary overview of key barriers that have the potential to prevent, delay or limit the benefits to be derived by providers engaging in major transformative change. This paper begins by outlining the approach to the review and evaluation of existing literature before moving on to consider four key areas of challenge drawn from published work. This article argues that providers intending to implement institution-wide change in the future should be cognisant of these barriers, and those currently undertaking change should be cognisant of the experience of others, in order to inform their own good practice, policy and pedagogy.

Approach

To explore the experiences of providers and, in particular, to identify the barriers to the adoption of online assessment and the realisation of full benefits, this research adopts a qualitative approach to survey the current online assessment landscape. The start set drew on six major institutional project reports featured within the Jisc EMA programme. These were produced by the University of Huddersfield, the University of Exeter, the University of Hertfordshire, Keele University, Manchester Metropolitan University and Queen’s University Belfast (Ellis and Reynolds 2013; Djordjevic and Milward 2012; University of Hertfordshire 2013; University of Keele n.d.; Manchester Metropolitan University 2014; Queen’s University Belfast 2014). These reports were chosen because of their focus on online submission, marking and feedback. From this, a non-discriminative snowball method was adopted using both backward and forward survey work. Reference lists within the start set were used to identify new projects and papers that fitted the criteria – papers written in the last 10 years (to ensure continued technical relevance), sufficiently focused on online assessment and written in English. This captured a broad range of content including peer-reviewed journal articles and non-peer-reviewed material – surveys, conference proceedings, project reports, blog posts and university websites exploring policy, process and guidance. Using forward surveying with Google Scholar, relevant literature that cited items within the start set was identified. Once no new papers were found, the review was extended to explore specific authors and specific conferences. In addition, searches were undertaken in journals with relevant aims and scope publishing frequently in the broad area – principally Research in Learning Technology, the British Journal of Educational Technology and Assessment and Evaluation in Higher Education. Material in other journals was also identified using Google Scholar. Articles were selected using keyword searches within a specific 2008–2018 date range and reviewed for relevance to identify those specifically focused on online assessment within higher education. Additional literature was identified through a series of informal discussions with colleagues involved in online assessment projects across the sector from July 2016 to May 2018, including learning technologists and project managers. This amounted to a broad range of project content produced by 29 different UK Higher Education (HE) institutions and 67 additional peer-reviewed articles and reports. Transcripts were created, where necessary, before relevant content was manually organised into groups. These groups were then analysed, collated and combined until key themes emerged, which were then reviewed and refined. This research focuses on publicly available information and, in this sense, is limited to those institutions that have published material relating to their change experience. There may be additional challenges or alternative experiences of change that are not captured here.

Within the literature available, themes emerged clustered around four key areas – change design, stakeholder management, policy and process as well as technical integration. Each will be considered in turn.

Challenges among institutions moving toward online assessment

Designing a change strategy

A core theme identified within the literature is that significant institutional change requires significant institutional planning across a range of key areas. This includes the level of mandatory change, fit with other strategically important projects or functions and the approach to staged change design.

In terms of mandatory change, a number of approaches have been adopted. The University of Northampton has taken a more directive approach. Following earlier piloting, the university’s position, 2013–2014, was that all assignments should be submitted and marked online unless they met strict exclusion criteria (Howe 2014). Others have taken a different approach. Ellis and Reynolds (2013, p. 14) draw on their experiences at the University of Huddersfield, 2011–2013. They emphasise the benefits of adopting a non-directive approach to change management, providing a degree of agency to academic colleagues, at least until a culture of online assessment has been widely embedded. Encouraging this type of organic change relies on the provision of sensitive, sufficient support, evidenced change from early adopters operating as ‘change agents’, incentives and pressure from student demand. Top-down directive imposition may be more likely to incite greater resistance in some institutions, unless strongly aligned to the broader institutional culture (Ferrell 2014a, p. 17), particularly given typically high levels of academic staff autonomy.

Institutions have also tended to be mindful of fit with other projects and functions because this has an impact on the availability of resources or capacity to coordinate with other supporting elements of change. Larger programmes have wanted to ensure that online assessment projects sit alongside pedagogic or technical elements to realise maximum benefits. Sheffield Hallam’s Assessment Journey Programme incorporated two projects – one focused on online management of assessment and another focused on pedagogical aspects of assessment design (Irwin, Childs, and Hepplestone 2016). The University of Reading’s online management of assessment project is situated within a wider EMA programme that includes significant IT development (University of Reading 2017). By drawing in additional supporting elements of change into one programme, broader activities that help to deliver the benefits of online assessment are seen as more likely to be delivered.

In terms of phased change, the University of Huddersfield adopted a staged approach, spanning 5 years, starting with implementation for first-year undergraduate students and then the remaining years (Ellis and Reynolds 2013, p. 36). Similarly Middlesex University focused on first-year students at the start of their phased e-assessment project, which began in 2010 (Gallacher et al. 2014, p. 1). The University of Sussex also started with first-year undergraduate student submissions of suitable written work in 2014 (University of Sussex 2014). Aberystwyth University has adopted a slightly different strategy – the widespread adoption of online submission and then optional online marking and feedback at a later date (University of Aberystwyth 2016). The University of Reading’s institution-wide EMA programme has made use of Early Adopter pilot schools ahead of fuller roll-out in 2018–2019 (University of Reading 2017). These staged approaches allow space for organic change driven by enthusiasts but they also allow institutions to better understand the technical, policy, process and pedagogical requirements at an early stage and address issues ahead of broader roll-out.

Drawing further on the need to understand the requirements of academic colleagues, Ferrell (2014b) highlights conclusions drawn by Keele that it has taken some time for academics to alter the kind of practices they have engaged in for years, a warning against a rushed approach to implementation. Similarly, one respondent to the 2014 HeLF survey commented, ‘Never underestimate the effort involved with winning hearts and minds of colleagues’ (Newland, Martin, and Ringan 2014, p. 2). Stakeholders are understandably wary of changing long-term practices given such high stakes. This is particularly important given that assessment is ‘mission critical’ (Newland, Martin, and Ringan 2014, p. 4) to any university because of its fundamental role in student learning, attainment and satisfaction.

In addition, a slower, staged approach seems particularly important for major online assessment projects that have occurred during or after significant institutional restructuring. Projects may be operating in an environment where stakeholders are more wary, apathetic to change, risk averse or under pressure. Djordjevic and Milward (2012, p. 20) highlight that pressures surrounding a broader restructuring project at the University of Exeter impacted staff involved in their own Online Course Management Evaluation (OCME) project.

In order to avoid delaying or limiting the benefits to be derived from online assessment, institutions have tended to be mindful of the nature, scope and pace of change. They have had to consider how their approach to directive change fits with a particular institutional culture, how the scope of any project might help to deliver other enabling elements, how the form of staged change might fit with current institutional capacity and how to adopt the right pace of change within that institution to maximise learning, especially given the impact on stakeholders – a broader issue to which this article will now turn.

Managing institutional stakeholders

A key recommendation seen in a series of project reports, such as the University of Exeter’s OCME project (Djordjevic and Milward 2012, p. 20) and Keele University’s Supporting staff in the use of Technology for Assessing and giving Feedback Project (STAF) (University of Keele n.d., p. 5), is that while technical solutions are important, it is critical to engage in meaningful stakeholder consultation, particularly in the earlier stages of a major project. Exeter’s OCME project concluded that, ‘While focus on the technical solution is important, the project should initially focus on the people and their perceptions and fears. Only when the stakeholders are engaged should the technology be given serious consideration’ (Djordjevic and Milward 2012, p. 1).

Queen’s University Belfast has demonstrated this kind of sensitivity to the psychology of key stakeholders – their experiences, drivers, expectations and requirements. Their e-AFFECT project adopted a methodology of ‘Appreciative Inquiry’ to support the broader use of technology in assessment involving a non-judgemental review of current practice and collaborative forward planning with stakeholders (Queen’s University Belfast 2014, p. 1). The project team created a positive and supportive environment where colleagues were asked to reflect on what works well and build on this by suggesting and trialling new ideas, moving away from a sense that colleagues are being ‘told what is wrong and how to fix it’ (Queen’s University Belfast 2014, p. 37).

Projects across the sector have encouraged similar positive engagement using a range of approaches from programme focus groups, interviews, surveys and breakfast meetings to show and tell events. Programmes have also worked hard to maintain channels of communication using websites, reports, blogs, postcards, promotional videos, explanatory and update screencasts, webinars, external speaker series, symposiums, lunch and learn events, newsletters, quarterly bulletins and adverts around campus. Others emphasise the importance of visibly demonstrating senior management support (Djordjevic and Milward 2012, p. 21). Queen’s University created a short video summarising their e-AFFECT Project, introduced by senior management (Queen’s University Belfast 2012).

In terms of ongoing project communication and support, Sheffield Hallam’s Assessment Journey Programme is seen by many as defining best practice. Organised around the Manchester Metropolitan and Jisc ‘Assessment Journey’ concept (Jisc 2016b), the team have created a staff-facing website drawing together information on principles, policy, processes, case studies and videos. A similar student-facing site has been launched to support student engagement, adoption and consistent practice (Sheffield Hallam University n.d.).

Others have been careful to demonstrate broad stakeholder support by drawing academics, professional staff and students into project governance or the project team itself. The University of Reading has created three paid student or graduate partner roles within the EMA programme team, as well as seven funded academic partner roles, and has drawn a broad range of additional academic and professional staff into four work stream boards, a steering group advising on pedagogical impact and the programme board itself (University of Reading 2017). This kind of practice reflects the importance of moving away from a top-down approach and stressing collaborative change management. This also helps to ensure that changes occurring during the programme are embedded into ‘business as usual’.

For some, stakeholder management has been further supported by evidencing benefit claims. The University of Huddersfield and the University of Exeter have spent time gathering evidence using a range of methodologies to explore the claim that engagement with online assessment will significantly improve the student, professional staff and academic experience of assessment and feedback (Djordjevic and Milward 2012; Ellis and Reynolds 2013).

Evidence-based change is important for all stakeholders but particularly for academic colleagues. The literature suggests that significant effort has been focused on this specific stakeholder group. A 2014 Jisc survey found that academic staff resistance presented the most significant challenge to implementation. Just over 80% of respondents reported that resistance was problematic in some way (Ferrell 2014a, p. 16). Key concerns include the pedagogical impact, disciplinary difference, workload, eye strain, broader health and safety issues, IT support, staff and student digital literacy and reductions in personal contact with students. Some partial solutions have been offered. Derby offers a ‘print to mark’ scheme for staff with declared disabilities (University of Derby n.d.-a). Others have re-evaluated their institutional equipment strategy. Sheffield Hallam now offers some academic staff second or larger screens, as well as laptops with docking stations, and runs a mobile device loan service (Sheffield Hallam University n.d.). York St John’s Business School trial also strongly recommended the provision of second monitors in order to help address eye strain and increase efficient working practices (Swift and Dransfield 2011), while Derby allowed all academic staff to request a 22” widescreen monitor (University of Derby n.d.-b). Other concerns relate to systems reliability and the capacity of existing technology to meet current UK requirements (Ferrell 2014a, p. 17), particularly surrounding double marking, anonymity, moderation and the ability to manage grades and substantive feedback separately. Some colleagues remain concerned that assessment technology is driving pedagogy, restricting current practice or future assessment creativity. For others, difficulties remain but positive, evidenced impact, and the enthusiasm of existing academic users, have gone some way to address resistance and feed into organic increases in usage.

Students have tended to be highly supportive, and this has been helpful for many institutions in terms of supporting change. The HeLF EMA survey (2013) reported that institutional responders rated the overwhelming majority of all student stakeholders as responding positively to online submission (Newland et al. 2013). Other institutional surveys, like those undertaken at the University of Huddersfield, have reported equally high levels of student satisfaction and expectation surrounding online assessment (Ellis and Reynolds 2013, p. 18). This was replicated in a small-scale pilot study at the University of Exeter, which found that the overwhelming majority of students were satisfied with their online assessment experience but remained less satisfied with the actual quality of online feedback (Djordjevic and Milward 2012, p. 28). This highlights the value of supporting academic colleagues to use the full functionality of new marking tools and of running online assessment change programmes that incorporate or run alongside projects designed to improve assessment and feedback in general, such as Sheffield Hallam’s Assessment Journey Programme. The business case states that the programme ‘should consider the pedagogic perspective of assessment design and delivery as well as the system and process elements of assessment change … To be successful, the balance between these two strands is a key consideration’ (Irwin, Childs, and Hepplestone 2016, p. 8). The shift from offline to online submission and feedback is only part of the student assessment experience.

Managing staff and student stakeholders well is fundamental for successful, transformative change. For institutions, this means the adoption of effective and sympathetic engagement approaches and meaningful consultation using a wide variety of communication strategies, as well as the creation of an evidence base to win ‘hearts and minds’ and feed into incremental cultural change. This is often time- and resource-intensive work. These are the same issues faced by providers tackling a third key challenge – the identification and management of changes to process and policy.

Managing process and policy change

In terms of process, institutions have rarely adopted a single, standard process, outlining who does what, when and in which order, to guide submission and feedback. Different departments and faculties often adopt their own practice. Ferrell found that 85% of institutional respondents reported some form of local variation in the adoption of institution-wide processes (Ferrell 2014a, p. 14). This variation is challenging – complexity can overstretch administrative staff and IT teams, encourage workarounds, increase duplicated efforts, hamper attempts to automate simple tasks, lead to variations in the student experience and confusion within the student body.

Most institutions have a decentralised, federal structure and so often devolve responsibility for process to local levels. Alternatively, where there is centrally established policy and process, local interpretation is variable. This may not have been apparent previously – Lonsdale (2017) has described the e-assessment journey within the School of Nursing and Midwifery at Keele University as one in which process and responsibility have been put under a large magnifying glass.

Aside from variable interpretation and implementation, institutions can also be managing different assessment tools (University of Bradford n.d.). The market in commercially available products is relatively narrow – in 2014, Ferrell found that nearly half of all institutions were using either Strategic Information Technology Services (SITS)/Blackboard/Turnitin or SITS/Moodle/Turnitin. However, a number of institutions have adopted more than one assessment tool, adding complexity. Only 16% of institutions reported having a ‘highly standardised’ approach to EMA tools. Fifty-four percent had one preferred approach with additional variations available. Twenty-eight percent of institutions reported significant variation (Ferrell 2014a, p. 12).

While usage of different marking tools requires multiple business processes, institutions also require variation within those processes to manage different types of assessment. Some forms, such as artwork, performance pieces, films, animations, audio recordings or oral presentations, require careful management within broader process maps. A number of institutions continue to report difficulties with the limited functionality of some systems (Ferrell 2014a, pp. 27–33), including their ability to cope with second and blind marking, anonymity, offline marking, peer assessment, group work, file size, scientific notation, non-essay-based submissions, moderation workflows, separate release of marks following the provision of feedback and auditing changes to marks throughout the process. These may require additional workarounds. The need to manage workarounds because of limits to the functionality of some marking tools complicates processes but can also, in itself, be a challenge to staff engagement with those tools.

More broadly, disciplinary differences can also justify variance in process. As Keele University’s STAF project noted, ‘Recommendations that were acceptable, or even already in place, in some academic areas were unacceptable in others’ (University of Keele n.d., p. 4). Institutions tend to recognise this variance, including the University of Exeter, which accepts ‘the need for flexibility within colleges and departments to meet individual pedagogical needs’ (Djordjevic and Milward 2012, p. 20).

For these reasons, very few institutions have adopted a ‘one size fits all’ approach based on only one process but have instead adopted a small range of processes, as reflected in Keele’s STAF project (Ferrell 2014b, para. 5).
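
To make the contrast between a single mandated process and unconstrained local variation concrete, the sketch below shows one way a provider might encode a small set of approved workflows and surface, rather than hide, any local deviation from them. This is a minimal illustration only; the workflow names, steps and assessment types are hypothetical and are not drawn from any institution discussed above.

```python
# Illustrative sketch only: encoding a small, approved set of
# submission-and-feedback workflows so that local variation is visible
# rather than hidden. All names and steps below are hypothetical examples.

APPROVED_WORKFLOWS = {
    "standard_essay": [
        "online_submission", "similarity_check", "anonymous_first_marking",
        "moderation", "feedback_release", "mark_release",
    ],
    "double_marked_dissertation": [
        "online_submission", "similarity_check", "first_marking",
        "blind_second_marking", "reconciliation", "feedback_release",
        "mark_release",
    ],
    "non_text_artefact": [
        "offline_or_media_submission", "first_marking", "moderation",
        "feedback_release", "mark_release",
    ],
}


def validate_workflow(name: str, steps: list[str]) -> list[str]:
    """Return the differences between a locally proposed workflow and the
    named approved variant, so deviations can be discussed rather than go unnoticed."""
    approved = APPROVED_WORKFLOWS.get(name)
    if approved is None:
        return [f"'{name}' is not an approved workflow"]
    missing = [s for s in approved if s not in steps]
    extra = [s for s in steps if s not in approved]
    return [f"missing step: {s}" for s in missing] + [f"extra step: {s}" for s in extra]


if __name__ == "__main__":
    # A department proposing to skip moderation for a standard essay
    # would have the deviation surfaced for review.
    print(validate_workflow("standard_essay", [
        "online_submission", "similarity_check",
        "anonymous_first_marking", "feedback_release", "mark_release",
    ]))
```

Keeping the approved variants in one place preserves the local flexibility that disciplinary differences demand while giving administrative staff and IT teams the visibility that uncharted variation removes.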

Alongside process, policy is also challenged by the move from offline to online assessment. This transition often forces a review of the current application of policy. It may expose non-compliance, highlight the need for revision and expansion of existing policy and necessitate the creation of new policy requiring institution-wide dissemination and implementation. HeLF data does show a significant increase in the number of universities reporting the adoption of institution-wide policy related specifically to online assessment. In 2016, 64% of responders to the HeLF Survey confirmed that their institutions had an institution-wide policy or set of protocols on e-submission, compared to 24% in the 2013 survey (Newland and Martin 2016, p. 5; Newland et al. 2013, p. 3).

Existing policy assuming offline, hard copy submission and feedback is unlikely to provide sufficient guidance for colleagues to respond to a broad range of scenarios. These might include the submission of files containing a virus, inaccessible files or large-scale systems failure. This has prompted providers, like the University of Manchester, the University of Reading and the Bloomsbury Colleges, to develop new, detailed guidance (University of Manchester 2014; University of Reading 2018; Bloomsbury Learning Environment 2017). The transition might trigger more than additional policy. As part of the e-AFFECT project, Queen’s University developed a set of educational principles for assessment and feedback (Queen’s University Belfast 2014, p. 3). Although important to ensure consistency and transparency, work on policy and underpinning principles is likely to be complex and resource-intensive, involving a number of stakeholders and key committees.

The move to online assessment represents a clear opportunity for institutions to understand, chart and even start to address variations in process to move toward agreement in terms of when and how different marking tools should be used, how different types of assessment should be handled and how any workarounds should be managed. This move also represents an opportunity to review existing policy, interpretation and compliance. However, if institutions are to fully benefit from new approaches to policy and process, they must also start to address the complex and difficult challenge of key systems integration, that is, the integration of the institutional virtual learning environment (VLE) and student records systems (SRS).

The challenge of systems integration

Ensuring the seamless transfer of assessment data from one system to another, by achieving a level of systems integration using single data entry and automated transfers between the VLE and SRS, is crucial if institutions are to take full advantage of new assessment processes, particularly online marking. It is crucial if institutions are to move away from the use of spreadsheets and manual data input, to realise greater efficiencies in the marks journey process, to address difficulties such as rounding variation, to enhance institutional assessment reporting capacity and to make processes more scalable in the face of greater student numbers.

Despite these benefits, exchanging data between systems remains highly problematic. Most institutions use separate VLEs such as Moodle, Blackboard Learn or FutureLearn, integrated tools for marking and feedback and an SRS such as SITS, provided by Tribal. Integration is difficult because it necessitates the systemisation of both policy and process surrounding mark calculations; it reduces flexibility in the interpretation of process, particularly surrounding the imposition of penalties; and it exposes difficulties surrounding variable data quality and interpretations of policy. Integration requires consistency of process even when an institution might not have experienced consistent implementation of policy. In addition, integration relies on an ongoing and consistent conversation between the VLE and SRS. When marks are delayed by academic misconduct or extenuating circumstances, when marking calculations vary or when student enrolment on modules changes, this breaks the basic data share conversation. Very common scenarios within higher education add a level of complexity that then must be built into integration. Additional difficulties surround timing technical releases within the constraints of the academic cycle and inexperience managing complex IT change within institutions. The University of Exeter have highlighted their own inexperience in managing bespoke software development to meet user requirements, for example (Djordjevic and Milward 2012, p. 21).
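
The ‘broken conversation’ scenarios described above can be illustrated with a minimal sketch of the decision an automated transfer step has to make: pass a mark straight through to the SRS, or drop it out of the automated flow for manual handling. The record fields, hold types and checks below are hypothetical assumptions for illustration; real Moodle or Blackboard to SITS integrations rely on vendor-specific interfaces not shown here.

```python
# Minimal sketch, not a real VLE/SRS integration: marks are only passed to
# the SRS automatically when no holds or mismatches apply; everything else
# drops out of the automated flow for manual handling.

from dataclasses import dataclass, field


@dataclass
class MarkRecord:
    student_id: str
    module_code: str
    raw_mark: float
    holds: list[str] = field(default_factory=list)  # e.g. "academic_misconduct", "extenuating_circumstances"


def ready_for_transfer(record: MarkRecord, enrolled: set[tuple[str, str]]) -> tuple[bool, str]:
    """Decide whether a mark can be transferred to the SRS automatically."""
    if record.holds:
        return False, f"held back: {', '.join(record.holds)}"
    if (record.student_id, record.module_code) not in enrolled:
        return False, "enrolment mismatch between VLE and SRS"
    if not 0 <= record.raw_mark <= 100:
        return False, "mark outside expected 0-100 range"
    return True, "ok"


if __name__ == "__main__":
    enrolled = {("s123", "PO101")}  # hypothetical SRS enrolment data
    records = [
        MarkRecord("s123", "PO101", 67.5),
        MarkRecord("s456", "PO101", 58.0),  # not enrolled according to the SRS
        MarkRecord("s789", "PO101", 72.0, holds=["extenuating_circumstances"]),
    ]
    for r in records:
        ok, reason = ready_for_transfer(r, enrolled)
        print(r.student_id, "->", "transfer" if ok else f"manual review ({reason})")
```

Even in this toy form, the exception paths outnumber the happy path, which is precisely why integration forces institutions to systematise policy and process before any code is written.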

These difficulties may go some way to explain why only 8% of respondents to the HeLF 2016 survey reported that their institution had achieved a level of systems integration that would allow assessment records to be automatically created in the VLE from the SRS. Even fewer participants reported that their institution had achieved the transfer of marks back from the VLE to their SRS, although most confirmed that increased systems integration was an area under development or consideration (Newland and Martin 2016). One of the most developed examples is SOAS, where grades entered into Turnitin Feedback Studio within Moodle are transferred to the SRS using a plug-in, developed in house, to automate bulk transfer. This was designed to reduce academic, administrative and IT support workloads and enhance quality assurance (O’Sullivan 2016). In 2016, Bedford was exporting marks from Blackboard Grade Centre into a CSV file, which would then be uploaded to SITS, but was running a project to automate and streamline data transfer by August 2017 (University of Bedford n.d.). However, progress is variable. Most recently, in June 2016, Sheffield Hallam announced that their attempts to find a technical solution addressing single mark entry via Blackboard/SITS integration had been significantly delayed. Instead the project team has had to re-evaluate and improve existing mark entry processes. Although providers might aspire to achieve single mark entry, the complexities involved may mean that a more effective use of resources is to rely on manual input to move data from the VLE to the SRS and instead focus on improving this process. Systems integration leading to seamless, automatic assessment data transfer largely remains, as Ferrell has described, ‘a holy grail’ for providers (Ferrell 2014a, p. 5) and may actually demand a level of complexity that is currently not financially viable in comparison to the effort required to support the manual transfer of data from the VLE to the SRS.
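
Where full integration is judged not to be worth the effort, the interim route described above – exporting marks from the VLE gradebook and re-uploading them to the SRS – can itself be partially scripted. The sketch below is a hedged illustration only: the column names and output layout are assumptions for the example, not the actual Blackboard Grade Centre export or SITS import formats.

```python
# Illustrative sketch only, not the actual Blackboard or SITS formats:
# reshapes a hypothetical VLE gradebook CSV export into a simpler upload
# file of the kind an SRS bulk-import routine might accept.

import csv


def reshape_gradebook(vle_export_path: str, srs_upload_path: str, module_code: str) -> int:
    """Read a VLE gradebook export and write a minimal SRS upload file.
    Returns the number of student rows written."""
    written = 0
    with open(vle_export_path, newline="", encoding="utf-8") as src, \
         open(srs_upload_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        writer = csv.writer(dst)
        writer.writerow(["student_id", "module_code", "mark"])
        for row in reader:
            mark = row.get("Final Mark", "").strip()
            if not mark:  # skip students with no mark released yet
                continue
            writer.writerow([row["Username"].strip(), module_code, mark])
            written += 1
    return written


if __name__ == "__main__":
    # Create a tiny example export so the sketch runs end to end;
    # in practice the export would come from the VLE's grade centre
    # and the output would be checked before any upload to the SRS.
    with open("gradebook_export.csv", "w", newline="", encoding="utf-8") as f:
        csv.writer(f).writerows([
            ["Username", "Last Name", "Final Mark"],
            ["s123", "Smith", "67.5"],
            ["s456", "Jones", ""],  # no mark released yet
        ])
    count = reshape_gradebook("gradebook_export.csv", "srs_upload.csv", "PO101")
    print(f"{count} marks prepared for upload")
```

A semi-automated step of this kind does not remove manual checking, but it reduces re-keying and makes the marks journey easier to audit while fuller integration remains out of reach.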

Summary

The purpose of this article was to outline the findings of a literature review of key project reports and relevant material focusing on the transition from offline to online submission and feedback within the UK higher education sector. Providers are rapidly moving away from isolated and pocketed use towards institutional adoption and the normalisation of online assessment practices. This article argues that although the range of deliverable benefits to academic and professional staff and students is significant, scaling up and embedding institutional programmes remains highly challenging. In particular, designing an effective staged change strategy with the right scope, and deciding between directive and non-directive approaches to adoption, is difficult. Broader institutional cultures and prior experiences of change will also impact this change process. Stakeholder management is complex. It requires sensitive understanding of institutional cultures and, in particular, consistent and meaningful engagement with key stakeholders. Reviewing and revising existing policy and process is challenging and time-consuming, especially given differences in interpretation, differing pedagogical needs and the frequent use of different marking tools. Achieving sufficient levels of systems integration in order for providers to benefit fully from automated transfer of assessment data remains technically difficult and problematic in terms of systematising existing process and policy.

This article finds that a number of institutions have been able to address some of these concerns, leaving some in sector-leading positions and able to benefit from successful change. However, the institutionalisation of online submission and feedback remains complex, demanding and, given the sector-wide scale of change in this area, represents a significant and immediate challenge for many providers within the sector.

References

Birch, D. & Burnett, B. (2009) ‘Bringing academics on board: encouraging institution-wide diffusion of e-learning environments’, Australasian Journal of Educational Technology, vol. 25, no. 1.

Bloomsbury Learning Environment. (2017) Procedures for Systems Failure in Relation to Online Submission, [online] Available at: https://moodle.ble.ac.uk/course/view.php?id=153

Brunel University. (n.d.) About Digital Assessment @Brunel, [online] Available at: https://www.brunel.ac.uk/about/education-innovation/Digital-Assessment-Brunel/About-Digital-Assessment-Brunel

Djordjevic, A. & Milward, S. (2012) OCME: Online Course Management Evaluation, [online] Available at: https://as.exeter.ac.uk/media/level1/academicserviceswebsite/aboutus/biss/iws/documents/OCMEFinalReportv1.pdf

Ellis, C. & Reynolds, C. (2013) EBEAM Final Report, [online] Available at: http://jiscdesignstudio.pbworks.com/w/file/fetch/66830875/EBEAM%20Project%20report.pdf

Farrell, T. & Rushby, N. (2016) ‘Assessment and learning technologies: an overview’, British Journal of Educational Technology, vol. 47, no. 1, pp. 106–120. doi:10.1111/bjet.12348.

Ferrell, G. (2014a) Electronic Management of Assessment (EMA): A Landscape Review, [online] Available at: http://www.eunis.org/wp-content/uploads/2015/05/EMA_REPORT.pdf

Ferrell, G. (2014b) Technology Supporting Assessment and Feedback at Keele, [online] Available at: https://ema.jiscinvolve.org/wp/2014/08/06/technology-supporting-assessment-and-feedback-at-keele/

Gallacher, D., et al., (2014) Assessing with e-Ase, [online] Available at: https://eprints.mdx.ac.uk/12213/1/Assessing%20With%20eAse.pdf

Glover, I., et al., (2015) ‘Making connections: technological interventions to support students in using, and tutors in creating, assessment feedback’, Research in Learning Technology, vol. 23. [online] Available at: https://journal.alt.ac.uk/index.php/rlt/article/view/1665

Howe, R. (2014) Requesting Exemptions for 2014/2015 Academic Year, University of Northampton, [online] Available at: http://blogs.northampton.ac.uk/sage/

Irwin, B., Childs, J. & Hepplestone, S. (2016) ‘Assessment journey: a programme to provide a seamless and improved assessment experience for staff and students’, [online] Available at: http://shura.shu.ac.uk/12116/2/Irwin%20Assessment%20journey.pdf

Jisc. (2016a) ‘Electronic management of assessment’, [online] Available at: https://www.jisc.ac.uk/rd/projects/electronic-management-of-assessment

Jisc. (2016b) ‘The assessment and feedback lifecycle’, [online] Available at: https://www.jisc.ac.uk/guides/transforming-assessment-and-feedback/lifecycle

Latif, F. (2017) ‘TELFest: an approach to encouraging the adoption of educational technologies’, Research in Learning Technology, vol. 25. [online] Available at: https://journal.alt.ac.uk/index.php/rlt/article/view/1869/html

Lonsdale, P. (2017) ‘New dogs, new tricks: an e-assessment journey’, PowerPoint presentation, Blackboard Users Conference, 5–6 January 2017, Durham, UK.

Manchester Metropolitan University. (2014) Traffic Project, [online] Available at: http://jiscdesignstudio.pbworks.com/w/page/50670987/TRAFFIC%20Project

Marshall, S. (2016) ‘Change, technology and higher education: are universities capable of organisational change?’, Research in Learning Technology, vol. 18, no. 3, pp. 179–192. doi:10.1080/09687769.2010.529107.

Newland, B., Martin, L. & Bird, A. (2012) ‘An overview of the current UK institutional use of online submission, marking and feedback’, [online] Available at: https://www.slideshare.net/barbaranewland/an-overview-of-esubmission?qid=130bc300-0e6c-4203-8aff-4f229034ce42&v=&b=&from_search=9

Newland, B., et al., (2013) HeLF Electronic Management of Assessment Survey Report 2013, [online] Available at: https://drive.google.com/file/d/0B8aF5QN3s_UDUHhyaGFPZWVDZDg/edit

Newland, B., Martin, L. & Ringan, N. (2014) ‘Electronic management of assessment – critical success factors in institutional change’, in Proceedings of EdMedia 2014 – World Conference on Educational Media and Technology, eds J. Viteli & M. Leikomaa, Association for the Advancement of Computing in Education (AACE), Tampere, Finland, pp. 200–203.

Newland, B. & Martin, L. (2016) Electronic Management of Assessment in UK HE 2016: A HeLF Survey Report, [online] Available at: https://drive.google.com/file/d/0Bz7E74T5Am22bXpIRmxxV0RyRWM/view

O’Sullivan, L. (2016) ‘BLE eAssessment and feedback technical development’, [online] Available at: https://docs.google.com/document/d/1WoKC1ig83zV5RJjsoO8pvoUziir25x6x_0F73Tk7AcI/edit

Queen’s University Belfast. (2012) ‘JISC QUB e-AFFECT’, [online] Available at: https://www.youtube.com/watch?time_continue=25&v=tUnLSGrtsSU

Queen’s University Belfast. (2014) E-Assessment and Feedback for Effective Course Transformation: Final Project Report 2014, [online] Available at: http://jiscdesignstudio.pbworks.com/w/page/50671059/e-AFFECT%20Project

Salmon, G. (2016) ‘Flying not flapping: a strategic framework for e-learning and pedagogical innovation in higher education institutions’, Research in Learning Technology, vol. 13, no. 3, pp. 201–218. https://doi.org/10.3402/rlt.v13i3.11218

Schneckenberg, D. (2009) ‘Understanding the real barriers to technology-enhanced innovation in higher education’, Educational Research, vol. 51, no. 4, pp. 411–424. doi:10.1080/00131880903354741.

Sheffield Hallam University. (2016) The Assessment 4 Students, [online] Available at: https://academic.shu.ac.uk/assessment4students/

Sheffield Hallam University. (n.d.) Assessment Essentials, The Assessment Journey Programme, [online] Available at: http://academic.shu.ac.uk/assessmentessentials/

Stödberg, U. (2012) ‘A research review of e-assessment’, Assessment and Evaluation in Higher Education, vol. 37, no. 5, pp. 591–604. doi:10.1080/02602938.2011.557496.

Swift, N. & Dransfield, M. (2011) Interim Project Report on E-Submission and Marking, York St John University, [online] Available at: https://www.yorksj.ac.uk/media/content-assets/academic-development/documents/Project-Phase-1-Report.pdf

University of Aberystwyth. (n.d.) E-Submission FAQs, [online] Available at: https://www.aber.ac.uk/en/media/departmental/learningteaching/en_esubmission_faqs.pdf

University of Aberystwyth. (2016) E-Submission FAQ-Feedback, [online] Available at: https://www.aber.ac.uk/en/academic/e-submission/2014_policy/

University of Bedford. (n.d.) Electronic Management of Assessment, [online] Available at: https://uoblearntech.wordpress.com/ema/

University of Bradford. (n.d.) ‘E-submission of coursework: options. Which tool should I use?’, [online] Available at: https://www.bradford.ac.uk/elearning/e-SubmissionOptions/page_02.htm

University of Bristol. (2018) ‘The benefits of EMA’, [online] Available at: https://www.ole.bris.ac.uk/bbcswebdav/courses/ap16686_Test_2016/EMAProject/index.html?_ga=2.256352075.1536759855.1524831946-567613791.1476434192#/id/co-05

University of Derby. (n.d.-a) Electronic Marking, [online] Available at: https://www.derby.ac.uk/services/centre-for-excellence-learning-teaching/learning-teaching-and-assessment/assessment-and-feedback/esubmission/electronic-marking/

University of Derby. (n.d.-b) Wellbeing, [online] Available at: https://www.derby.ac.uk/services/centre-for-excellence-learning-teaching/learning-teaching-and-assessment/assessment-and-feedback/esubmission/electronic-marking/well-being/

University of Edinburgh. (2015) Reasons to use GradeMark, [online] Available at: https://www.ed.ac.uk/information-services/learning-technology/assessment/grademark/introduction

University of Hertfordshire. (2013) ITEAM Evaluation Report, [online] Available at: http://jiscdesignstudio.pbworks.com/w/file/fetch/83308807/ITEAM%20AF%20Strand%20A%20Final%20Evaluation%20Report%20Updated%2002-03-14.pdf

University of Keele. (n.d.) STAF Project Final Report, [online] Available at: https://www.webarchive.org.uk/wayback/archive/20140614073219/http://www.jisc.ac.uk/whatwedo/programmes/bcap/keele.aspx

University of Manchester. (2014) ‘Faculty of humanities policy for online submission, plagiarism detection, marking and online feedback’, [online] Available at: http://www.humanities.manchester.ac.uk/tandl/documents/FinalpolicyonlinesubplagiarismdetectionmarkingonlinefeedbackFebruary14_000.pdf

University of Northampton. (2013) ‘SaGE so far…’, [online] Available at: http://blogs.northampton.ac.uk/sage/category/sage-radar/

University of Reading. (2017) About the Programme: EMA Programme, [online] Available at: https://sites.reading.ac.uk/ema/about-the-ema-programme/

University of Reading. (2018) ‘Online submission protocols’, in Section 6, Annex 3, Assessment Handbook, [online] Available at: http://www.reading.ac.uk/web/files/qualitysupport/6_Conduct_of_Assessment_withannexes.pdf

University of Sussex. (n.d.) Electronic Submission and Feedback, [online] Available at: http://www.sussex.ac.uk/adqe/standards/examsandassessment/esubmission

University of Sussex. (2014) ‘Giving teachers more time for teaching and students more time for learning’, [online] Available at: http://blogs.sussex.ac.uk/elearningteam/2014/11/25/reducing-paperwork-electronic-submission-for-all-first-year-students/

Vergés Bausili, A. (2018) ‘From piloting e-submission to electronic management of assessment (EMA): Mapping grading journeys’, British Journal of Educational Technology, vol. 49, pp. 463–478. doi:10.1111/bjet.12547.

Walker, R., et al., (2016) ‘Universities and colleges information systems association 2016 survey of technology enhanced learning for higher education in the UK’, [online] Available at: https://www.ucisa.ac.uk/-/media/Files/publications/surveys/TEL%20Survey%202016_Nov16