Improving marking effectiveness and feedback provision in an OSCE assessment using Microsoft Forms: A pilot study in Sport and Exercise Therapy

An objective structured clinical examination (OSCE) is recognised as a reliable but workload-intensive assessment method across the health sciences. Though a variety of digital marking tools have been employed to improve marking and feedback provision for OSCEs, many of these require specialist software or maintenance. This pilot study examines the development and trialling of Microsoft Forms as a marking and feedback instrument for an OSCE within a Sport and Exercise Therapy module. The study aims to assess whether a non-specialist digital tool, such as Microsoft Forms, might be able to overcome limitations in current assessment procedures and ultimately provide a more effective method for marking and feedback provision for an OSCE. Results from the OSCE examiners (N = 8) and students (N = 30) who participated in the pilot indicate that Microsoft Forms has the potential to provide a more effective experience for examiners and to improve feedback provision for students when compared with a paper-based marking tool. However, concerns around the form's ease-of-use may ultimately influence its adoption as a marking instrument over current paper-based methods.


Introduction
As with traditional studies in medicine or pharmacy, students of other allied health professions, such as musculoskeletal therapy or occupational therapy, are often assessed using practical, hands-on examinations that evaluate their mastery of clinical skills such as administering treatment or working with patients.
In the context of this pilot study, students on the BSc (Hons) Sport and Exercise Therapy course at a university in the UK are required to undertake an examination in which they are assessed on their ability to discuss the various components of subjective history taking, perform an objective assessment and administer safe and effective spinal mobilisations on a hypothetical patient presenting with a spinal injury.

Literature review
Objective structured clinical examinations have been used as an assessment tool in medical education globally since the 1970s, with broad agreement amongst educators that they are an authentic and reliable assessment tool (Harden et al., 1975; Rushforth, 2007). More recently, other allied health disciplines have also adopted the OSCE as a key assessment tool, specifically in the fields of nursing and pharmacy (Kristina & Wijoyo, 2019; Rushforth, 2007) and most recently in the training of physical therapy and musculoskeletal specialist professions, such as physiotherapists, occupational therapists and sports therapists (Snodgrass et al., 2014; Swift et al., 2016).
The OSCE typically involves the assessment of clinical skills using simulated patients (SPs) who have been briefed on their role in the assessment, covering skills such as patient communication, diagnosis and administration of therapy or treatment. Traditionally, OSCE examiners record students' ability to perform each skill using a paper-based form or rubric.
Several key challenges have been identified in the implementation of the traditional OSCE, despite its recognition as a reliable assessment strategy. Wardman et al. (2018) and Luimes and Labrecque (2018) point out that feedback from an OSCE is most effective for student learning when it is both personalised and delivered in a timely manner. Yet, many authors note that translating paper-based forms into discernible, individualised feedback is a lengthy process, often leaving students with limited actionable feedback several weeks after the assessment has taken place and reducing satisfaction (Ashby et al., 2016; Cham & Cochrane, 2020; Meskell et al., 2015; Snodgrass et al., 2014). Others cite the administrative workload on assessors when processing assessment documents across large cohorts of students (Cham & Cochrane, 2020) and risks such as transcription errors and data security when managing paper-based assessment forms (Judd et al., 2017; Meskell et al., 2015). Harrison et al. (2015) highlight that the delivery of comprehensive OSCE feedback is often limited to students who have failed the exam because of the workload it demands.
One suggested alternative to the traditional OSCE format is the introduction of an electronic marking system. For instance, Snodgrass et al. (2014) piloted the use of an iPad and licensed assessment-support software during an OSCE for students of physiotherapy and occupational therapy, and found that the electronic marking tool improved examiners' perceptions of providing equitable student feedback and reduced administration time post-examination. Judd et al. (2017) reported similar findings using bespoke marking software on an iPad. Several authors reported that OSCE examiners tended to prefer an electronic marking tool over the traditional paper-based form (Judd et al., 2017; Meskell et al., 2015; Swift et al., 2016). Swift et al. (2016) also reported that the use of an electronic form resulted in less fatigue amongst examiners.
Moreover, Cham and Cochrane (2020) reported that student satisfaction improved significantly in their study of an iPad-based alternative to OSCE marking, owing to the quality of individualised feedback and the speed at which it was received. These findings are echoed by Daniels et al. (2019), whose participants commented positively on the potential impact of receiving immediate feedback via a tablet-based OSCE marking tool on their development as student-clinicians.
Based on the aforementioned literature, the introduction of an electronic marking system for OSCE assessment may seem an obvious improvement upon a paper-based marking form. However, Bennett et al. (2017, p. 679) found that academics are often reluctant to adopt a digital assessment solution because of limited support for its use or development, stating that logistical and developmental limitations of tools, along with a lack of support, often led to unwanted compromises when using digital technologies for assessment, or even to the abandonment of initiatives entirely. These sentiments are reflected in frameworks such as the Technology Acceptance Model, or TAM (Davis, 1989, 1993), which posits 'perceived ease-of-use' and 'perceived usefulness' as factors that greatly influence the adoption of new technologies, and Van Der Vleuten's (1996) Utility Formula, which proposes that characteristics such as practicality and reliability can impact the development of new assessment methods in health sciences education.
In the cases of Snodgrass et al. (2014) and Judd et al. (2017), the development of such electronic solutions was supported by an external service provider and an internal applications developer, respectively. Meskell et al. (2015) reported that examiners required bespoke training to use the electronic OSCE system in their study, whilst Cham and Cochrane (2020) discuss how the development of an electronic OSCE tool for optometry studies took more than 1 year to implement. Such investments are inevitably costly for universities or courses wishing to develop a new electronic marking tool, requiring either a third-party marking application or a specialist in-house developer.

Research aim
This pilot study aimed to identify an alternative solution to address these constraints using existing tools within the suite of learning technologies already available at the participating university, specifically Moodle-based plugins or Microsoft applications, which would require less extensive development and would take advantage of the existing digital capabilities of examiners. The trial and development of this alternative assessment tool were commissioned by the university's BSc (Hons) Sport and Exercise Therapy course team to address the aforementioned limitations of their marking procedures and feedback provision for an OSCE.
The analysis of feedback gathered during the pilot considers the characteristics a tool would need in order to provide: a more effective marking experience for examiners, more comprehensive and contextualised feedback for students, a sustainable solution that cuts paper waste, and a digital means of storing marking forms that improves data security.

Digital marking form development
First, the research team, consisting of experienced musculoskeletal clinicians and lecturers in Sport and Exercise Therapy, scrutinised the original paper-based marking form and exam structure, alongside the assessment criteria and rubric, to determine which aspects of the paper form needed to be translated into a digital format. The intention was to ensure that all elements of the original OSCE could be represented in a potential digital version.
For instance, the team identified those elements of the OSCE that were standardised for all students, such as the assessment of students' breadth and depth of knowledge of the red-flag conditions and symptoms relevant to spinal pathology, owing to its clinical safety implications. The rubric consisted of sections aligned to assessment criteria, with 16 grade boundaries ranging from A1 to F3. For the second part of the exam, students were assigned a nature of injury (pertaining to the joint, muscle or nerve) and spinal area (cervical, upper thoracic, lower thoracic or lumbar) at random. Clinical scenarios were allocated in proportion to the relative frequency of each injury in patients. Each type of injury had its own set of rubric criteria that needed to be incorporated into the digital form. The original marking form also allowed for hand-written comments to be made beside each section.
The OSCE required examiners to physically follow students and SPs through each stage of the assessment. Therefore, it was determined that a mobile-optimised tool would be preferred to allow examiners to mark via a tablet. Finally, feedback collected would have to be saved securely and made available privately to students in accordance with institutional policy.
Once the requirements for the design and administration of the digital alternative had been identified, the research team explored a variety of digital tools with the aim of replicating the experience, including a simple cloud-based document and marking rubrics in Moodle. The team limited their exploration to tools already available at the participating institution to minimise additional costs or training requirements. The characteristics of the TAM were also considered during the selection of a tool, as 'perceived ease-of-use' and 'perceived usefulness' would need to be achieved for any later adoption of the new marking approach (Davis, 1989, 1993).
Ultimately, Microsoft Forms was selected because of its variety of question types that could align with the rubric (for example, Likert scale, free text and audio input), its compatibility with tablet interfaces and its ability to save data securely (Microsoft, 2021). Additional benefits included the ability to output individual responses as PDFs, which could be shared with students as feedback via Moodle; the data analytics dashboard, which could provide an overview of responses for assessors; and the Microsoft Excel output, which could allow for marks to be calculated using formulae. With respect to the TAM, Microsoft Forms was already widely in use at the participating institution as a survey tool, thus fulfilling the requirement for 'ease-of-use' in principle (Davis, 1989, 1993).
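To illustrate how marks might be derived from the Excel export, the following minimal Python sketch averages an examiner's grade-boundary selections across rubric criteria. The shortened scale, point values and function name are hypothetical assumptions for illustration only, not the module's actual marking scheme (the pilot's rubric used 16 boundaries ranging from A1 to F3):

```python
def mark_from_responses(responses, boundaries):
    """Average one student's grade-boundary selections across criteria.

    responses: boundary labels from the Forms Excel export, one per criterion.
    boundaries: the full grading scale in descending order (best grade first).
    Returns the boundary label nearest the averaged position on the scale.
    """
    # Map each boundary to a numeric score: the best grade gets the most points.
    points = {label: len(boundaries) - i for i, label in enumerate(boundaries)}
    avg = sum(points[r] for r in responses) / len(responses)
    # Convert the averaged score back to the nearest boundary label.
    return boundaries[len(boundaries) - round(avg)]


# Shortened illustrative scale; the pilot's form used 16 boundaries (A1-F3).
scale = ["A1", "A2", "B1", "B2", "C1", "C2"]
print(mark_from_responses(["A1", "A2", "A2"], scale))  # -> A2
```

In practice, the same logic could be applied as a spreadsheet formula or to each row of the Excel export (one row per examiner submission) ahead of moderation.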
The development phase lasted approximately four months. Each rubric criterion was created as a Likert scale question in the Microsoft Form. Likert scale questions were then grouped according to the distinct procedures or processes measured in the OSCE. Microsoft Forms allowed separate sections to be created so that each phase of the exam could be recorded before moving on to the next. Conditional outcome questions allowed examiners to identify which of the random allocations had been designated to the student, which then opened the appropriate rubric for that allocation.
Free-text input boxes were added at the end of each section for examiners to type short comments following each phase of the exam. Audio feedback and file upload were also included as optional questions at the end of each section so that, for example, an examiner could draw a diagram and upload it as part of the feedback. A final free-text section was added at the end of the exam for examiners to provide more generalised feedback in the form of strengths, areas for improvement and feed-forward recommendations.
Following the creation of the digital form, an internal risk assessment was conducted to ensure the protection of student data and to minimise the risk of data loss. Initial testing of Microsoft Forms on an iPad Pro (2018 version) found that data in the form would be cached in the event of a loss of Wi-Fi, the closing of a browser window and even the iPad powering off and on again, simulating a battery failure. The risk assessment therefore determined that Microsoft Forms would be a secure and viable means through which to log student assessment data and feedback.

Phase 1 testing
To test the administration of the digital form in practice, an initial trial was completed by four test examiners. Each tester was asked to complete all sections of the form in a timely manner, as if they were assessing a student. These initial trials of Microsoft Forms served as a proof of concept. A second round of testing subsequently took place in the laboratory where the exam would be hosted. The four testers reported increased difficulty in typing free-text comments on the iPad in a simulated therapy space. Portable keyboards and small movable tables were introduced to assist testers in completing the form whilst retaining the mobility required for the examination.

Phase 2 testing
Following the initial testing, the Microsoft Form was evaluated by the original four testers and four additional course examiners to confirm its alignment with the existing marking criteria. The digital form was then piloted by four of the aforementioned examiners, who marked alongside the traditional paper-based assessors during the official module OSCE. On the day of the pilot, five students volunteered to be marked using both the traditional and digital assessment forms. To reduce disruption to proceedings, the examiners piloting the digital marking form were limited to an observation-only role.
The Microsoft Form and traditional paper-based examiner transcripts were internally moderated for consistency of grading and feedback, as per institutional procedures for this level of exam.If required, any arising comments could be discussed between the moderating examiners: in the case of the pilot, no discrepancies were identified.
All module students were given the opportunity to contact the teaching team to receive their written feedback via the traditional paper form, whilst those five students marked with the digital form were sent their feedback by email after the release of all students' grades.

Participant feedback and analysis
To conclude the pilot, the eight examiners who evaluated Phase 2 and all students who undertook the assessment (N = 30) were invited to be surveyed regarding their opinions on digital feedback methods. Ethical clearance was sought and granted prior to gathering participant data. Invited participants were made aware of the purpose of this more formalised feedback procedure, and that participation in the project was optional. Responses were collected via an online survey with a series of open-ended questions, and all responses were anonymised. Questions were written so as to allow participants to identify their own themes, with the intention that these could be compared with those previously identified within the literature and inform developments for OSCE marking and feedback provision.
For examiners, survey questions asked them to consider the potential benefits and drawbacks of using the developed Microsoft Form, or a similar digital marking tool, to mark the OSCE assessment. A further question asked them to indicate what characteristics the form would need to be an effective marking tool. The four examiners who utilised Microsoft Forms in practice were additionally asked to describe their experiences of using the form to mark the OSCE exam and to relate their feedback specifically to their marking practice.
Students who undertook the assessment (N = 30) were also surveyed to gain their perspectives on the current method of receiving feedback and whether the process could be made more effective with the use of the digital form. Those students whose assessment was additionally marked using the Microsoft Form (N = 5) were asked to comment specifically on their experience of receiving digital feedback.
A thematic analysis of the survey responses was employed to identify common perspectives amongst the qualitative data, with the aim of determining which characteristics of a digital marking tool would be desirable for examiners, which aspects of marking using Microsoft Forms might influence its adoption over current marking and feedback procedures, and students' perspectives on the receipt of feedback. Braun and Clarke's (2006) approach served as the primary basis on which themes were categorised: first through an initial familiarisation with all responses, followed by the coding and categorisation of related statements. Themes were identified primarily through a deductive approach, as factors such as the quality, comprehensiveness and timeliness of feedback, the ease-of-use of the marking tool and developmental complexities had previously been indicated within the literature. Statements were then further sub-categorised into the benefits and drawbacks of employing the form amongst examiner-participants, and into current practice and perceived improvements of employing a digital feedback tool amongst student-participants, to reflect the challenges identified within the literature regarding traditional OSCE assessments. Sustainability was also included as an emergent theme identified by two participants.
Corresponding themes amongst the examiner and student feedback were highlighted to explore how the digital form might improve the effectiveness of OSCE marking and feedback provision, as per the research aim.

Examiner feedback
The thematic analysis revealed several common themes identified by examiners (N = 8) relating to both the potential benefits and drawbacks of using Microsoft Forms for OSCE marking. Potential benefits of the digital form include ease-of-use, improved efficiency, improved quality and clarity of feedback, and improved sustainability. Potential drawbacks include concerns around ease-of-use, limitations to feedback input, IT-related risks and developmental complexities (see Table 1).
Similar themes regarding the potential benefits of Microsoft Forms were reflected in the responses relating to the ideal characteristics of a digital marking tool. Participants identified characteristics consistent with the developed form, such as a 'clickable rubric for marks awarded', 'easily able to move through sections', '[aligned] with learning objectives' and 'voice recordable feedback'. Other preferred characteristics, such as 'the ability to write [with a stylus]', were not yet compatible with the developed version of the form.
For those examiners who utilised Microsoft Forms during the assessment, specific recommendations were documented. Notably, all four of the examiners mentioned the production of free-text feedback as an area of challenge, such as this statement from Examiner 3:

I found the form easy to use when selecting tick boxes and options, but a bit more time-consuming when using the free text options. Using a stylus to write directly onto the screen might have enhanced this experience.

This sentiment is echoed by Examiner 1:
Limited in what you can and can't do - I like to use doodles to aid descriptions particularly with handling modifications for mobilisations which is not an option here.
Examiner 2 mentioned the use of a portable keyboard as a potential solution:

Having a portable keyboard and small table to type comments was hugely beneficial in allowing me to produce free text comments quickly for each section.
Other recommendations included 'new examiners simply [observing] the exam in the first instance... before attempting to complete the assessment form alongside the exam' and 'More developments... to align the structure of the form to the narrative that experienced examiners might follow'. Finally, Examiner 2 commented specifically on the use of Microsoft Forms compared with marking tools available in Moodle:

The Microsoft Form is ideal for use on a portable device, whereas other marking tools that we have available are constrained within Moodle and optimised for desktop use.

Examiner concerns over ease of use (Table 1) included:
6.1 It needs to be user friendly so everyone can pick up the tool and use it.
6.2 The familiarisation time needed to get used to the marking tool.
6.3 Forms might need to be continuously revised before it can be usable in an easy and efficient fashion.
6.4 A bit more time-consuming when using the free text options.*
6.5 Difficult to come back to sections.*
6.6 Difficult to keep up with the speed of the [exam].*
6.7 Time it can take to type in free text with the keyboard ([an] iPad pen might have been more useful).*
When asked whether they would prefer to retain current methods of marking and feedback or progress to the digital form, six out of eight examiners responded in favour of the digital method. The two who did not recommend a switch both advised that a better system for recording free-text comments was needed within the Microsoft Form.

Student feedback
The thematic analysis of student responses (N = 30) revealed a similar categorisation of themes, which reflect the potential benefits of digital feedback methods identified by the examiner participants. When discussing their current methods of receiving feedback, key themes included the quality, timeliness and communication of feedback. When asked specifically about potential improvements to the current method, response themes centred on addressing limitations identified in current feedback practices (see Table 2).
All students surveyed were also introduced to the Microsoft Form developed for their OSCE assessment and asked whether the form might address some of the concerns raised in their feedback. Twenty-two out of thirty participants responded that they would prefer examiners to adopt Microsoft Forms, or a similar digital marking device, over the current method of requesting their paper-based feedback. Those who indicated that they preferred the paper-based method (N = 8) tended to respond positively to current feedback practices and did not give a specific reason for their hesitancy towards a digital alternative.
The participants who were additionally marked using Microsoft Forms (N = 5) were asked to comment on their digital feedback following the initial survey of all students. Interestingly, the same themes arose from these students' responses, which address the limitations of the paper-based feedback method (see Table 3).

Of those students who received a version of their feedback using the Microsoft Form, all responded that they would prefer the digital form to the current paper-based version of their OSCE feedback. Only two identified a potential drawback of using a digital device, with one commenting that 'sometimes technology does not work and could crash', whilst a second commented that '[the] teacher may not be able to write everything out, since you need to type and not write'.

Discussion
Although only a small number of examiner and student participants took part in this pilot, the results of the survey revealed a generally positive response towards the adoption of digital feedback methods for the OSCE exam. Notably, those who engaged specifically with Microsoft Forms reported that this tool for assessment and feedback would be able to address the concerns raised by all participants regarding the current marking and feedback practices for the OSCE assessment.
Many of these themes were also those highlighted within the literature, as anticipated through the deductive approach taken within the thematic analysis. Several authors commented on the importance of the timeliness of feedback (e.g. Ashby et al., 2016; Daniels et al., 2019; Snodgrass et al., 2014), a factor echoed by the student recommendations in Table 2 and addressed as a potential benefit of the OSCE Microsoft Form by examiners and students in Tables 1 and 3, respectively. Others note the importance of both timeliness and individualisation within OSCE feedback (Luimes & Labrecque, 2018; Wardman et al., 2018). Whilst the individualisation of feedback may not be improved exclusively by the use of the OSCE Microsoft Form, it may help to address the need for feedback to be both individualised and comprehensive, as noted in recommendations amongst all participant groups. Marking via the digital form may also help to reduce the administrative workload of compiling such comprehensive feedback, as noted by Cham and Cochrane (2020).
Specifically for examiner participants, one recurring theme was the ease-of-use of Microsoft Forms for marking as a key factor in its adoption and implementation as a primary assessment tool. Similar to the findings of Bennett et al. (2017), assessors expressed the need for a digital device to be 'easier' and 'quicker' to use for marking than their current paper-based method, regardless of its potential benefits for students. Such perspectives align with the TAM (Davis, 1989, 1993), which proposes that the primary determinant of the adoption of a new technological solution is its perceived ease-of-use, and to a lesser extent with the Utility Formula (Van Der Vleuten, 1996), which indicates practicality as a key factor in developing health-sciences assessments. This may help to explain why some examiners would be hesitant to adopt Microsoft Forms over the current method. Overcoming concerns around ease-of-use, particularly as they relate to the production of free-text comments (Snodgrass et al., 2014), may therefore be crucial in realising change.
Other limitations to note are those proposed by Bennett et al. (2017) and Meskell et al. (2015), who commented, respectively, on development time and bespoke training requirements as factors that may impact the adoption of a digital OSCE solution. Both themes are echoed within the potential drawbacks identified by examiners in Table 1. Given that all the examiners were able to review the form, even if they did not test it in practice, it is evident that they felt more development may be required, which may offset some of the potential benefits of using a more common office application for marking and feedback purposes.

Conclusions and recommendations
This study detailed the development of a Microsoft Form, which was piloted as an improved marking and feedback tool for an OSCE on a BSc (Hons) Sport and Exercise Therapy level-5 module. The purpose of the pilot was to develop and trial the use of a non-specialist tool to overcome the limitations of using a paper-based marking instrument within the OSCE. Microsoft Forms was selected because it did not require a specialist developer to maintain and was already in use at the participating institution, thus taking advantage of the existing digital capabilities of markers. Its variety of question types aligned with the assessment's existing marking rubric and format. The mobile-friendly interface could facilitate marking in an exam where the examiner may need to move along with the student and their simulated patient. Moreover, initial testing deemed the risks of losing participant data during the exam to be minimal, owing to the form's auto-save features and automatic data caching.
A final version of the Microsoft Form was trialled by examiners to secondarily mark students during a spinal mobilisations OSCE.Module examiners and students undertaking the module were also surveyed on their views of current assessment and feedback practices and whether a digital solution, such as Microsoft Forms, could improve the effectiveness of marking and feedback delivery.
Results revealed that perspectives towards a digital form as an alternative OSCE marking and feedback tool were generally positive amongst both student (N = 30) and examiner participants (N = 8), particularly as they related to potential improvements in the comprehensiveness, timeliness and individualisation of feedback. Additional benefits included a reduction in paper wastage and printing costs and improved data security. However, concerns indicated by examiner participants surrounding the form's ease-of-use, such as the need for improved recording of free-text comments and additional familiarity with the tool, may be crucial in influencing the team's adoption of the digital form in place of the paper instrument.
Arguably the most significant conclusion that can be drawn from this study is that 'perceived ease-of-use' continues to be a dominating influence in the adoption of digital tools, even amongst technologies that may already be familiar to users (Davis, 1989, 1993). One proposed solution is that improving examiners' confidence in using Microsoft Forms for this alternative purpose may improve perceptions of ease-of-use and subsequent utilisation as a marking and feedback tool (Joo et al., 2017). For instance, trialling the digital form for marking an exam with fewer components or criteria may allow examiners to gain confidence in using the tool, thereby improving its perceived ease-of-use. Hesitancies suggested by students around the receipt of digital feedback may also be alleviated through the greater confidence gained by examiners. As argued by Greener and Wakefield (2015), educators' confidence in utilising digital learning tools may have an important impact on students' engagement with those tools.
One clear limitation of this study is the relatively small number of participants, particularly those examiners who participated in trialling Microsoft Forms during a live exam.A greater number of test examiners would undoubtedly be able to provide a richer variety of feedback to inform the development of the digital form and to judge whether it could viably replace a paper marking form in all circumstances.Factors identified within the literature review, including efficiency, administrative workload, possible transcription errors and the comprehensiveness of feedback, might be more thoroughly evaluated with a greater sample of test-examiners utilising the form as primary or second markers whilst completing the formalised marking and feedback procedures of the OSCE.
Further developments to the Microsoft Form may also aim to address limitations identified by the participants. For instance, a bank of common feedback comments from which examiners could select several options may reduce the need to hastily type free-text feedback comments alongside each criterion, though this might create a risk of further generalising feedback.
It is evident that the utilisation of digital assessment tools has the potential to improve the effectiveness of marking and feedback provision, particularly in the context of an OSCE, where it is essential that students are evaluated based on their performance in a live simulated environment. The results of this pilot study indicate that Microsoft Forms does, indeed, have the potential to meet the criteria of an OSCE assessment instrument. Notably, Microsoft Forms' status as a non-specialist application means that there is potential for reduced development time and user training requirements once the application has been proven effective as an assessment tool in this context. This research demonstrates the value of exploring such non-specialist applications for use in Higher Education assessment, particularly as practical, simulated and authentic assessment becomes more prevalent (Sambell et al., 2019).

Table 1.
Examiner (N = 8) feedback on the adoption of a new digital assessment tool.

Table 2.
Student (N = 30) feedback on current assessment methods and a proposed improvement.
Talking through the grade in person would be beneficial.

Table 3.
Student (N = 5) feedback on receiving OSCE feedback using the developed Microsoft Form.
Clear writing, easy access and more organized.
3.3 Another benefit is that it is easier to send feedback digitally, and it is also clearer to understand than someone's handwriting.