MESH360: a framework for designing MMR-enhanced clinical simulations

Thomas Cochranea*, Stephen Aiellob, Stuart Cookc, Claudio Aguayoa and Norm Wilkinsonb

aCentre for Learning and Teaching, Auckland University of Technology, Auckland, New Zealand; bParamedicine, Auckland University of Technology, Auckland, New Zealand; cGarden City Aviation, Nelson, New Zealand

(Received: 4 November 2019; Revised: 13 December 2019; Accepted: 16 December 2019; Published: 13 February 2020)


This article evaluates the results of two prototype iterations of a design-based research project that explores the application of mobile mixed reality (MMR) to enhance critical care clinical health education simulation in Paramedicine. The project utilises MMR to introduce critical elements of patient and practitioner risk and stress into clinical simulation learning scenarios to create more authentic learning environments. Subjective participant feedback is triangulated against participant biometric data to validate the level of participant stress introduced to clinical simulation through the addition of MMR. Results show a positive impact on the learning experience for both novice and professional paramedic practitioners. The article highlights the development of implementation and data triangulation methodologies that can be utilised to enhance wider clinical simulation contexts than the original context of Paramedicine education. We argue that our collaborative transdisciplinary design team model provides a transferable framework for designing MMR-enhanced clinical simulation environments.

Keywords: immersive reality; biometrics; design-based research; critical care health education

This article is part of the special collection Mobile Mixed Reality Enhanced Learning edited by Thom Cochrane, James Birt, Helen Farley, Vickel Narayan and Fiona Smart.

*Corresponding author. Email: Thomas.cochrane@aut.ac.nz

Research in Learning Technology 2020. © 2020 Thomas Cochrane et al. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.

Citation: Research in Learning Technology 2020, 28: 2357 - http://dx.doi.org/10.25304/rlt.v28.2357


Since 2016 the authors have established a collaborative transdisciplinary project team (MESH360) that has explored the potential of mobile mixed reality (MMR) to enhance health education (https://www.researchgate.net/project/MESH360). The MESH360 team focuses upon preparing higher education health care students with the practice and critical diagnostic capabilities they will need as professional health practitioners in the 21st century. A design-based research methodology informs the design and refinement of low-cost MMR technologies to increase the authenticity of both low-fidelity and high-fidelity clinical simulation learning environments (Cochrane et al. 2009, 2016, 2018b). Through the initial scoping and literature analysis stage of the project, we identified a paucity of literature and longitudinal research that critically engages with the intersection of learning theory, the design of clinical simulation learning environments and the development of student critical thinking (Aguayo, Cochrane, and Narayan 2017; Stretton, Cochrane, and Narayan 2018). The specific context that this article explores is Paramedicine education. In this article, we detail the development of a framework that the MESH360 team employs in the implementation and data-gathering process to critically evaluate the first two prototype MMR-enhanced clinical simulation learning environments. Uniquely, we triangulate subjective participant feedback with biometric stress indicators to gain insights into the impact of the MMR-enhanced scenarios upon participant learning. The MESH360 collaborative transdisciplinary development team provides a model for supporting the implementation of MMR within a design-based research (DBR) framework in clinical health education simulations.

Literature review

In this section, we outline the body of knowledge surrounding the key concepts that underpin our research, including low-fidelity clinical simulation, high-fidelity clinical simulation, pedagogical frameworks, biometrics, MMR in clinical health education, design-based research and design principles.

Low-fidelity clinical simulation

Fidelity is a measure of how authentically a simulation activity replicates a real-world experience (Laschinger et al. 2008). Low-fidelity clinical simulation involves the use of clinical scenarios that are described through text, imagery or verbally. Textbooks are the classic low-fidelity way of describing and illustrating a clinical scenario. Augmented reality (AR) has seen a resurgence of interest in education with the development of technologies such as the Microsoft HoloLens (Leonard and Fitzgerald 2018) and has been used to enhance textbook experiences through image recognition-triggered playback of multimedia. However, developing authentic learning experiences for devices such as the HoloLens, which rely on high-end computing capacity, is still prohibitively expensive, and most current examples offer little more in the way of authentic learning affordances than an interactive textbook. In Paramedicine education practice, low-fidelity clinical simulation is used to introduce students to a high-fidelity clinical simulation exercise.

High-fidelity clinical simulation

High-fidelity clinical simulation typically involves a high level of learner interaction with an authentic learning environment. Examples include mixing AR and virtual reality (VR) technologies to simulate instrumental feedback for virtual practice of clinical procedures (Birt, Moore, and Cowling 2017) and the use of high-fidelity mannequins (Kaufman 2010). High-fidelity mannequins are capable of replicating human physiology and biometric data, and can simulate patient verbal responses to a student/practitioner via remote control and verbal questions from an expert observer in a co-located observation room, typically via a one-way window. However, the environment in which the mannequins are usually deployed during clinical simulation assessments is often sterile, non-authentic and low fidelity. Typically, mannequins are placed in environments that minimise learner and observer distractions, with expert observers placed behind one-way mirrors providing real-time learner feedback and assessing their practice. This does not authentically replicate the messiness and complexity of real-world critical care environments. While these high-fidelity simulation settings provide safe learning environments, they can be enhanced through the introduction of immersion affordances (possibilities offered by technological tools), for example, through mobile head-mounted display (HMD) VR and cave automatic virtual environment (CAVE) type display and audio feedback (Muhanna 2015), to better reflect the environmental stressors and interactions that guide professional practitioners in real-life critical care situations. The design of clinical education simulation experiences should be guided by appropriate pedagogical frameworks.

Pedagogical frameworks

In a 2018 systematic review of the literature (Stretton, Cochrane, and Narayan 2018), the authors identified a general lack of engagement with critical learning theory and with the development of higher-level critical thinking and creativity in the design of clinical simulation environments. The literature evidences that the design of clinical simulation learning environments is predominantly based around the development of learners’ core practice-based competencies. We argue that clinical simulation environments can be designed to focus upon ‘ontological pedagogies’ (Cochrane and Narayan 2014; Dall’Alba and Barnacle 2007), or the process of learners becoming professionals, rather than on knowledge transfer or competency alone. In response, we have sought to ground our research and learning design in the principles of heutagogy. Heutagogy (or self-determined learning) is a learning framework that emphasises the development of learner capabilities to navigate the unknown (Hase and Kenyon 2007). Our core graduate capabilities map closely to the key principles of heutagogy: learner agency, self-efficacy and capability, reflection and metacognition, and non-linear learning (Blaschke and Hase 2019). According to Blaschke and Hase (2019), heutagogy builds upon several learner-centred learning theories including social constructivism, connectivism and the neuroscience of learning. Heutagogy does not deprecate the role of the teacher in learning, but focuses upon a change in the roles of the teacher and the learner, whereby learners take a more active role in their learning and teachers become designers and facilitators of authentic learning experiences rather than deliverers of content. Heutagogy exists on a continuum of pedagogical practice and strategies from teacher directed to student determined: the PAH (Pedagogy–Andragogy–Heutagogy) continuum.
Thus, learners are scaffolded from a dependence upon the teacher towards developing the capabilities to navigate and problem-solve in new and unknown environments. Blaschke and Hase (2019) argue that digital technologies and social media can support the design of learning environments that leverage the principles of heutagogy. Measuring the impact of learning innovations is inherently difficult and often reliant upon subjective feedback from learners/participants. To add rigour to the evaluation of our learning innovations, we explored triangulating subjective participant feedback with biometric feedback data.

Biometrics

Physiological responses to simulated stress environments have been found to replicate those experienced in real life (Aguayo et al. 2018; Ming-Zher, Swenson, and Picard 2010). Wearable technologies are emerging as key tools for monitoring and measuring a user’s biometric health data (Edgerton 2019). Wearable technology ‘constitutes a non-invasive way to evaluate the level of engagement, subjective experience and differential biological responses of learners to educational content and environments’ (Cochrane et al. 2018a). Measuring learning, and the impact of the design of learning interventions, is notoriously subjective. However, we argue that measuring participant biometric stress indicators (such as heart rate, blood pressure and galvanic skin resistance, or sweat) provides tangible quantitative feedback on the level of stress resulting from the introduction of VR-enhanced simulation learning environments, and therefore on a direct impact upon participant learning (Cochrane et al. 2019). Biometric data can thus be utilised to triangulate subjective learner feedback from an MMR-enhanced experience (Cochrane et al. 2018a).
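The core of this triangulation idea can be sketched in a few lines of Python; the following is a minimal illustration with hypothetical sample data (the project's actual instrumentation and tracking software are not reproduced here). It computes the Pearson correlation between two synchronised biometric series, heart rate and GSR:

```python
import numpy as np

def pearson_correlation(series_a, series_b):
    """Pearson correlation between two equally sampled biometric series."""
    a = np.asarray(series_a, dtype=float)
    b = np.asarray(series_b, dtype=float)
    return float(np.corrcoef(a, b)[0, 1])

# Hypothetical synchronised samples (1 Hz): heart rate (bpm) and GSR (microsiemens)
heart_rate = [72, 74, 73, 80, 88, 92, 90, 84, 78, 75]
gsr = [2.1, 2.2, 2.2, 2.6, 3.1, 3.4, 3.3, 2.9, 2.5, 2.3]

r = pearson_correlation(heart_rate, gsr)
print(f"HR/GSR correlation: r = {r:.2f}")
```

A strong positive correlation between independent stress indicators is what lends the quantitative data its triangulating value against subjective feedback.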

MMR in clinical health education

MMR encompasses the technological spectrum from real-world experience to AR and VR, through to fully digitally immersive experiences (Milgram and Kishino 1994). Barr and Foster (2017, p. 121) note: ‘Although the benefits of simulation in medical and health education have been well researched, there is a paucity of research into how to deliver simulation using immersive media (IM) due to its recency as a strategy in paramedicine’. Foster (2017) summarises three key affordances of mobile VR in health (paramedic) education that facilitate authentic connections between higher education and professional practice:

Several contexts have been identified as being highly relevant to immersive learning environments, including clinical and critical care health, automation, high-risk environments, environments that are prohibitively costly to reproduce and educational environments that utilise simulation. These can be directly mapped to mixed or immersive reality learning environments. Figure 1 illustrates a mapping of the immersive reality continuum to a continuum of risk-level learning contexts, modified from Eames and Aguayo (2019).

Fig 1
Figure 1. Milgram and Kishino’s (1994) immersive reality continuum mapped to levels of learning environment risk.

To guide the design of our MMR scenarios for clinical health education, we chose design-based research as a foundational methodology.

Design-based research

As our research essentially builds upon the foundation of mobile learning (Cochrane 2013a, 2013b), we leverage the maturing body of mobile learning research to guide the design of our projects. Cook and Santos (2016) argue that DBR provides a state-of-the-art framework for designing authentic mobile learning environments. We have chosen the McKenney and Reeves model of design-based research (McKenney and Reeves 2019; Reeves, Herrington, and Oliver 2005). According to this model, the four iterative stages of DBR are analysis and exploration; design and construction; evaluation and reflection; and redesign and dissemination. The goal of DBR is the development of transferable design principles that can be utilised in contexts beyond that of the original research project and thus potentially generate a wide impact upon teaching and learning. For example, Narayan, Herrington and Cochrane (2019) describe the development of transferable design principles for utilising mobile devices to facilitate heutagogical learning.

Design principles

In a previous paper, we identified key design principles (DP) from the literature for designing authentic mobile learning and scaffolding innovative pedagogies (Cochrane et al. 2017), summarised here as five design principles:

These five design principles informed the iterative design and development of a variety of projects in health care education disciplines at the university (Cochrane et al. 2018b). However, while the design principles have guided project development at a high level, each context requires the development of a targeted research and implementation methodology. Through two iterations of the MESH360 MMR project, we have refined a research and implementation methodology that we believe can be used as a transferable framework across the health disciplines. In the rest of this article, we trace the development of this framework across two design cycles of MMR-enhanced simulation scenarios in Paramedicine.


In this section, we outline the key elements of our research methodology.

Research questions

Our over-arching research question is: ‘How can we design clinical simulation learning environments that are more authentic (than current practice), facilitate the development of higher order critical thinking and are cost-effective?’ The iterations of our design-based research project tackle these issues and build our expertise as a learning design team to address them.

Transdisciplinary design team

The MESH360 project team comprises a collaborative transdisciplinary team of researchers, development team leaders, practitioners, students and professionals (Table 1).

Table 1. Transdisciplinary design-based research project team.
Team members | Role in project
Academic advisors | Principal investigator and educational technologist
Digital development team | Co-principal investigator and immersive reality application development team
Paramedicine lecturers | Paramedic lecturers and core members of the MESH360 enhanced simulation project development; development of simulation environment
Embodied reports, Santiago, Chile | Biometric data researchers and tracking software development
Paramedic students and practitioners | Simulation participants: 30 student volunteers from years 1–3, 5 invited professional paramedics

DBR phases

We used McKenney and Reeves’ (2019) four-stage model of DBR (Figure 2) as a methodology for the project.

Fig 2
Figure 2. MESH360 design principles mapped to McKenney and Reeves’ (2019) DBR model.

DBR provides a structured, four-phase iterative framework (McKenney and Reeves 2019) for designing MMR-enhanced clinical simulation learning environments for health education (Cochrane et al. 2017). The four phases of our project and timeline are:

Phase 1: Analysis and exploration – 2016–2017: Identification of the critical pedagogical issues surrounding the design of immersive reality (XR) learning environments and exploration of supporting literature to identify initial design principles to address these issues (Cochrane et al. 2017).

Phase 2: Design and construction – 2017: Prototyping of the design of an XR learning environment and pedagogical intervention informed by the identified design principles (Cochrane et al. 2018a).

Phase 3: Evaluation and reflection – 2018: Evaluation of the initial prototype XR learning environment design through user feedback and refinement of the design principles.

Phase 2–3 Loop: Iterative redesign and re-evaluation of the prototype XR learning environment (2017–2019).

Phase 4: Theory building – 2019: Development of transferable design principles and dissemination of findings.

This article evaluates the first two iterations of the phase 2–3 loop of prototype design and evaluation, 2017–2018.

Implementation framework

We implemented several stages and strategies throughout each project to collect participant data for analysis on the impact of the MMR-enhanced clinical simulations (Table 2). This also involved on-going testing of rapidly developing new XR technologies. University ethics approval was sought prior to the initial data collection in 2017, and this was granted for a period of 3 years, followed by a modified ethics approval in 2018 to cover the new developments in the research project scope as the project moved from static scenario critique in 2017 to more advanced enhanced clinical simulation scenarios in 2018. Through these two project iterations, we developed an optimal implementation framework, as outlined in Table 2.

Table 2. Mobile XR-enhanced critical care simulation implementation framework.
Data collection activities | Implications for enhancing critical care clinical simulation
Recruitment of volunteer participants and ethics procedures | Participants invited via Facebook, Instagram and an announcement on the Learning Management System (LMS) to respond via email to a project email account. Respondents were then emailed a simulation booking, instructions, consent form and information sheet, in accordance with the MESH360 ethics approval from the university ethics committee (AUTEC 17/29)
Pre-survey to gather participant demographic and prior experience data | Anonymous coding of participants to map the impact of the VR-enhanced simulation
Biometric sensors worn on wrist by participants | Measure participant stress levels via GSR and HR
Pre-VR experience – head-mounted display (HMD) | Define baseline levels of stress
VR experience – HMD with eye-tracking, hotspot activation and user navigation of the scenario, wirelessly mirrored to a monitor for observation | Create an authentic simulation within a real context that approximates real-world stress and risk elements and exercises participant diagnostic skills via a multi-sensory VR experience
Post-VR initial diagnosis | Participant initial diagnosis based upon the VR simulation
Traditional clinical simulation treatment of high-fidelity mannequin | Participant treatment of the mannequin based upon the VR scenario
Post-clinical simulation diagnosis | Participant final diagnosis based upon the VR + clinical simulation
Post-simulation participant interview – videoed | Brief subjective participant feedback on the impact of the VR experience
Post-simulation participant survey | Purposive sampling of subjective participant feedback on the impact of the VR experience

This implementation framework will continue to be refined in future iterations of the project. The focus of each project iteration reported in this article was as follows:


Paramedics’ scope of practice, in pre-hospital and out-of-hospital environments, requires a comprehensive understanding and application of a range of clinical procedures. These procedures require paramedics to work autonomously or as part of multidisciplinary teams, and to take a multi-system approach to managing patients’ conditions, for example, when managing patients who have sustained multiple injuries following traumatic events such as road traffic collisions. Thus, we designed the first iteration of our mobile VR learning experience in 2017 to help Paramedicine students develop critical scene awareness, preparing them for the wide variety of real-world contexts in which they will practise.

Research question

How can mobile VR and biometric sensor technology be utilised to study Paramedicine students’ responses to analysing critical care environments?


The MESH360 project aimed to give paramedic students a 360° overview of a critical care scenario before entering the simulation suite, where they would then ‘treat’ a high-fidelity mannequin. The pre-simulation scenario was designed to allow the student a period of time to critically evaluate the scene, thereby allowing students to gather the information needed to make informed decisions during the simulation. Typically, when running simulations, the teacher provides scenario context information to the student verbally. Although this works, it can interfere with the students’ train of thought as they focus upon entering the simulation suite, and potentially interrupt their learning. By engaging in 360° VR, the student experiences an authentic environment that they can learn from by exploration, in much the same manner as an on-road paramedic does in real-life practice. A detailed outline of the development and deployment of the 2017 prototype is covered in a prior conference paper (Cochrane et al. 2018a), summarised here.

The VR environment was captured from an authentically staged scene using a wireless LG 360 camera to take a 360° photograph. The scene was a building site with multiple potential hazards, including chemicals, people hiding, and exposed electrical and water pipes. The image was peer reviewed to identify hazards and ensure the risks were clearly represented. The image was subsequently imported into the Seekbeak web-based VR platform, where each hazard was converted to an invisible individual ‘hotspot’. During the study, all participants viewed the VR scenario on a smartphone mounted in a Google Cardboard-compatible VR headset, via the Seekbeak platform.

The initial project prototype was tested in August 2017 with 45 undergraduate AUT Paramedicine students who volunteered to participate in a 1-min mobile VR simulation (Figure 3). Initial feedback and evaluation of the 2017 prototype identified the need to: redesign the embodied biometric instruments for ease of use and reliability; redesign the enhanced simulation environment through the development of multiple authentic scenarios linked through the VR experience; add new forms of user interaction; and explore higher definition HMDs. We addressed these issues in the design of the 2018 VR scenario, which moved from a static 360° scenario viewed via a Google Cardboard-compatible HMD to a virtual ambulance callout involving a combination of 360° video and static scenario exploration via an Oculus Go HMD.

Fig 3
Figure 3. 2017 Prototype deployment with students.


The redesigned prototype was tested in 2018 with 30 volunteer Paramedicine students across all 3 years of the degree, and volunteer professional paramedics.

Research questions

The 2018 iteration of the MESH360 Paramedicine project explored the following research questions:

How effective is immersive reality for authentically preparing tertiary paramedic students and upskilling workplace paramedic professionals to develop the critical decision-making capabilities they need to best respond to unfamiliar high-risk critical care incidents?

What are the key elements of an implementation framework that can guide the scalable development of accessible immersive reality learning environments that enhance critical care simulation training for authentic real-world high-risk first responder scenarios?


The 2018 MESH360 project aimed to give paramedic students a 360° overview of a critical care scenario before entering the simulation suite, where they would then ‘treat’ a high-fidelity mannequin. The pre-simulation scenario was designed to allow the student a period of time to critically evaluate the scene, thereby allowing students to gather the information needed to make informed decisions during the simulation. By engaging in 360° VR, the student experiences an authentic environment that they can learn from by exploration, in much the same manner as an on-road paramedic does. A detailed outline of the development and deployment of the 2018 prototype is covered in a prior conference paper (Cochrane et al. 2019), summarised here.

The 2018 VR experience consisted of a 4-min ‘Ambulance Ride’, created using 360° cameras, edited with WondaVR and combined with an interactive VR scenario built in Seekbeak. It was deployed as follows:

  1. Pre-VR and Water scene: presentation of a ‘calm’ scenario, a nature scene, to smoothly introduce participants to the VR experience (15 s);
  2. Amb Start: transition to a static 360° panorama of the back of the ambulance (45 s) to gain baseline biometric data via a heart rate monitor app on a smartwatch worn by the participant, followed by a transition to 360° video of the ambulance ride, including ambient sound (1 min);
  3. Job: presentation of the job description by radio call first, then by a text box providing more details, followed by a job update (cardiac arrest) increasing the complexity of the job (radio first, then text box);
  4. VR2A: arrival at the accident scene in a garage, with a patient for participants to explore;
  5. VR2B: close-up scene of the patient, with emergency care equipment laid out exactly as in the physical high-fidelity mannequin simulation suite.
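The staged sequence above can be represented as a simple ordered timeline. The sketch below is illustrative only: the durations for the last three segments are assumptions, chosen so that the total matches the stated 4-min experience. Cumulative start times like these could be used to line up biometric logs against scenario events.

```python
# Hypothetical sketch of the staged VR scenario as an ordered timeline,
# with segment durations (seconds). The first three durations come from the
# sequence described above; the last three are assumed for illustration.
segments = [
    ("Pre-VR water scene (calm intro)", 15),
    ("Amb Start: static 360 panorama (baseline HR)", 45),
    ("Amb Start: 360 video of ambulance ride", 60),
    ("Job: radio call, text box, cardiac arrest update", 30),  # assumed
    ("VR2A: arrival at garage accident scene", 60),            # assumed
    ("VR2B: close-up of patient and equipment", 30),           # assumed
]

elapsed = 0
timeline = []
for name, duration in segments:
    timeline.append((elapsed, name))  # record each segment's start time
    elapsed += duration

for start, name in timeline:
    print(f"{start:4d} s  {name}")
print(f"Total: {elapsed} s")
```

Keeping the scenario as data rather than hard-coding each transition also makes it straightforward to re-order or re-time segments between design iterations.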

The participant VR experience was wirelessly mirrored to a monitor for the research team to follow their progress and exploration of the VR scenario (Figure 4). Participant heart rate throughout the scenario was monitored by an observer checking the display of a smartwatch worn by the participant. Following the VR simulation experience, participants were asked to provide a preliminary diagnosis of the patient and were then ushered into the adjacent physical simulation suite with a high-fidelity mannequin and equipment to demonstrate treatment procedures while observed through a one-way window.
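One simple way to relate time-stamped scenario events to a heart rate trace like the one monitored here is to compare a short pre-event baseline against a post-event window. The following is a hypothetical sketch; the function name, data and window size are illustrative, not the project's analysis code.

```python
def hr_change_around_events(hr_samples, events, window=5):
    """hr_samples: list of (t_seconds, bpm); events: list of (t_seconds, label).

    Returns {label: (baseline_bpm, post_event_bpm)} comparing mean heart rate
    in the `window` seconds before each event with the `window` seconds after.
    """
    results = {}
    for t_event, label in events:
        before = [bpm for t, bpm in hr_samples if t_event - window <= t < t_event]
        after = [bpm for t, bpm in hr_samples if t_event <= t < t_event + window]
        if before and after:
            results[label] = (sum(before) / len(before), sum(after) / len(after))
    return results

# Hypothetical trace: steady 72 bpm, jumping to 88 bpm at a scenario event (t=30 s)
hr = [(t, 72) for t in range(0, 30)] + [(t, 88) for t in range(30, 40)]
events = [(30, "hazard hotspot activated")]
for label, (baseline, post) in hr_change_around_events(hr, events).items():
    print(f"{label}: baseline {baseline:.0f} bpm -> post-event {post:.0f} bpm")
```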

Fig 4
Figure 4. 2018 Prototype deployment with students.


In this section, we analyse the results of the 2017 and 2018 prototype design and evaluation of MMR-enhanced clinical simulation. For each iteration, we analyse participants’ pre- and post-survey responses, VR scenario eye-tracking heat maps and the development of biometric triangulation of subjective participant feedback.


2017 Participants pre-simulation survey

In the 2017 project iteration, we used SurveyMonkey to host anonymous participant surveys of prior experience (pre-scenario) and feedback (post-simulation). The surveys were linked via Quick Response (QR) codes for access on participants’ mobile devices. As we used a free SurveyMonkey account, we were limited to 10 questions per survey and therefore created two pre-simulation and two post-simulation surveys, for a total of 20 questions each pre- and post-simulation:

In summary, 67% of the participating students had never used an HMD before the project, and only one student had any significant prior experience with an HMD.

2017 Participant post-simulation survey

In summary, participating students’ feedback on the prototype experience was very positive: 96% gave positive feedback and 94% agreed they would like further use of MMR in their course; 92% agreed that MMR enhanced the quality of simulation-based learning; 94% agreed that the experience made them more aware of their critical environment; and 82% found the HMD easy to use. Participant feedback included:

Having a visual representation of a scene would be immensely beneficial as I find it hard to visual a scene when described to me. At the start of a scenario we have to ask lots of question about the scene which becomes tedious and takes away from the physical aspect of being part of a scenario. Having the VR will give you an accurate and consistent description of a scene instead of a subjective individual imagination of a scene. (Participant 1)

It was a bit disorientating at the start, forgot that there was room behind me but made for a very interactive experience in scene safety and danger identification. Much better than what is done in a classroom setting. (Participant 2)

I felt like I was at the scene and was completely different from what I have experienced before which made me want to interact with the scene more. (Participant 3)

Some participants found the quality of the HMD and the inability to adjust focal length initially distracting, but found the concept and overall experience valuable.

2017 Heat map and biometric feedback

The initial prototype biometric analysis indicated a clear correlation between critical incidents in the VR scenario and participant biometric stress responses. Figure 5 illustrates the data recorded by the mobile VR environment backend (Seekbeak) as a participant explored the scenario visually, showing the attention given to critical elements in the virtual environment via a gaze heat map.
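A gaze heat map of this kind can be approximated by binning gaze-direction samples over an equirectangular 360° image. The sketch below is a generic illustration, not Seekbeak's actual implementation, and the gaze samples are synthetic.

```python
import numpy as np

def gaze_heat_map(pitch_yaw_samples, bins=(18, 36)):
    """Bin gaze-direction samples (pitch, yaw in degrees) into a 2D histogram.

    Rows cover pitch -90..90, columns cover yaw -180..180, matching an
    equirectangular 360-degree image layout.
    """
    samples = np.asarray(pitch_yaw_samples, dtype=float)
    heat, _, _ = np.histogram2d(
        samples[:, 0], samples[:, 1],
        bins=bins, range=[[-90, 90], [-180, 180]],
    )
    return heat

# Synthetic gaze samples clustered on a hazard at roughly (pitch=-15, yaw=45)
rng = np.random.default_rng(42)
samples = np.column_stack([
    rng.normal(-15, 5, 200),  # pitch (degrees)
    rng.normal(45, 8, 200),   # yaw (degrees)
])
heat = gaze_heat_map(samples)
peak_row, peak_col = np.unravel_index(np.argmax(heat), heat.shape)
print(f"Hottest cell: row {peak_row}, col {peak_col} of {heat.shape}")
```

The hottest cells in such a histogram correspond to the regions of the scene that drew the most visual attention, which is the information a gaze heat map overlays on the 360° image.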

Fig 5
Figure 5. Example of Seekbeak gaze heat map data.

Figure 6 illustrates the biometric data recorded over time while a participant explored the virtual scenario wearing an HMD. The three traces show the overlap between skin conductance (galvanic skin resistance, GSR), heart rate and subjective feedback via the pedal, illustrating a correlation between the three.

Fig 6
Figure 6. Example of embodied feedback data (blue = skin conductivity, green = subjective pedal and red = heart rate).

Triangulated with time-stamped Seekbeak hotspot activation data, this provides a graphical correlation between participant biometric stress indicators and the identification of critical elements in the virtual scenario. Evaluation of the 2017 project iteration highlighted the following areas for redesign (Aguayo et al. 2018):


The second iteration of the project was informed by the feedback and lessons learnt from the first iteration in 2017. In the 2018 project iteration, we used Google Forms for the participants’ pre- and post-simulation prior experience and feedback surveys.

Participants pre-simulation survey

In summary, only 16% of 2018 participants had previously taken part in the 2017 project. Of the 30 (2018) participants, we received 23 responses to the pre-simulation survey; 48% of respondents were female and 52% male; 76% were Paramedicine students (21% were first-year students), while 24% were practising professional Paramedics. This enabled a comparison between feedback on the MMR-enhanced simulation experience from novices and experts. Only 16% of participants had more than 4 years of professional practice experience; 72% indicated that they believed traditional mannequin-based simulation scenario assessments were effective or highly effective in providing them with the information needed to treat a real-life patient. Exposure to mannequin-based simulation depended upon progression through the degree (for students) or regularity of training (for professionals); 57% of participants had never used a VR headset prior to the project, and only one participant had used a VR headset more than five times. Given their limited exposure to VR headsets, participants reported few prior negative effects from HMD VR usage. Only 13% of participants reported a visual impairment that they thought might create difficulty in viewing a small screen via the HMD.

Participant post-simulation survey

Participant feedback included:

The experience was interesting and I think it could be a promising form of education in the ambulance setting. The virtual reality aspect made me a little dizzy but was good for giving an accurate insight into the scene. More guidance before the scenario would have been helpful as I felt unsure of what exactly to start with. (Participant 1)

The experience created a realistic scene which made it an engaging environment to be in and fully be aware of the situation, and have a good understanding of the job and the scene. This made me want to experience virtual reality more when continuing my degree. (Participant 2)

On a personal level clinically I feel I missed a few things but I really enjoyed the experience. It felt as though you were in a real ambulance, arriving at a real scene, with a real patient which made treating the patient very realistic. (Participant 3)

Of the 30 (2018) participants, we received 19 responses to the post-simulation survey.

Post-simulation participant survey responses identified key trends in the impact of the MMR-enhanced clinical simulation. All participants reported that they found the VR experience immersive and authentic. While 10% of participants did not find the VR experience engaging, only one participant disagreed that VR simulation increases the quality of clinical simulation learning and should be used more in future training. Feedback from the post-survey indicated that participants saw the potential value added by VR-enhanced clinical simulation, but generally wanted a higher-fidelity VR experience that reduced the effects of vertigo. Suggestions for future improvements included more guidance leading into the VR experience and adding a virtual second crew member to the scenario, replicating the practice of working in teams in real critical care call-outs.

Heat map and biometric feedback

Figure 7 indicates that participants spent significant time exploring the 360 scene with the virtual patient, discovering clues for diagnosis and treatment procedures.

Figure 7. Eye-tracking heat map of VR accident scene.

In the 360 scenario with the high-fidelity mannequin surrounded by a selection of test equipment, the eye-tracking heat map (Figure 8) indicates that participants quickly scanned the available equipment resources as they decided upon an appropriate treatment procedure.

Figure 8. Eye-tracking heat map of VR pre-clinical simulation.
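As an aside on method, a gaze heat map of the kind shown in Figures 7 and 8 can be approximated by binning fixation dwell time over the equirectangular 360 image, so dwell-heavy regions (the virtual patient, the test equipment) show up as hot cells. The sketch below is purely illustrative: Seekbeak supplies its own heat map analytics, and the coordinate scheme, grid size, function name and sample fixations here are assumptions.

```python
import numpy as np

def gaze_heatmap(fixations, width=360, height=180, bins=(18, 36)):
    """fixations: list of (yaw_deg in [0, 360), pitch_deg in [-90, 90],
    dwell_seconds). Returns a bins-shaped array of total dwell time
    accumulated in each cell of the 360 image."""
    grid = np.zeros(bins)
    rows, cols = bins
    for yaw, pitch, dwell in fixations:
        r = min(int((pitch + 90) / height * rows), rows - 1)
        c = min(int(yaw / width * cols), cols - 1)
        grid[r, c] += dwell
    return grid

# Invented sample: two fixations near the scene centre, one off to the side.
fix = [(180, 0, 2.5), (182, -3, 1.5), (40, -10, 0.5)]
hm = gaze_heatmap(fix)
print(hm.max())  # → 2.5 (the longest single-cell dwell)
```

A rendering step (e.g. Gaussian smoothing and an alpha overlay on the 360 still) would turn this grid into the coloured heat maps shown in the figures.
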

Figure 9 shows a selection of participant heart rate data (Y axis = BPM) measured via a smartwatch, with readings observed at eight timestamps throughout the VR scenario as described in Table 2 (X axis = Pre-VR, Water scene, Ambulance start, Job notification while in ambulance, Cardiac arrest update while in ambulance, VR garage scene 2A, VR simulation room scene 2B, and post physical clinical simulation with the high-fidelity mannequin). The lowest four results are from participants who were professional Paramedics (indicated by P), while the rest were Paramedic students (S), including two first-year students (indicated by Y1). A baseline heart rate for each participant was determined during the calming forest 'water scene' experience at the beginning of the VR scenario. As indicated by heart rate changes, stress levels tended to be highest in first-year students, and all student stress levels significantly increased when the job call came through in the VR scenario. There was one Y1 student 'anomaly' with the lowest student HR readings; however, this was from a student who was also a professional athlete. Conversely, practicing paramedics' stress levels were highest pre-simulation and tended to decrease as the VR scenario progressed. This biometric data aligned with the subjective feedback trends received from the participants through the post-simulation survey and post-simulation short video interviews.

Figure 9. Example of participant heart rate data during VR simulation.
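The baseline comparison described above amounts to a small calculation: each reading is expressed as a delta from the participant's individual 'water scene' baseline, so positive values indicate elevated stress. The stage labels below follow Figure 9 (abbreviated), but the function name and BPM values are invented for illustration.

```python
# Scenario stages, abbreviated from the Figure 9 X axis.
STAGES = ['Pre-VR', 'Water scene', 'Ambulance start', 'Job notification',
          'Cardiac arrest update', 'Garage scene 2A',
          'Simulation room 2B', 'Post physical simulation']

def stress_profile(readings):
    """readings: dict of stage -> BPM for one participant. Returns
    stage -> delta (BPM) from that participant's 'water scene' baseline."""
    baseline = readings['Water scene']
    return {stage: readings[stage] - baseline for stage in STAGES}

# Hypothetical first-year student: a marked HR rise at the job notification.
student_y1 = {'Pre-VR': 82, 'Water scene': 78, 'Ambulance start': 85,
              'Job notification': 102, 'Cardiac arrest update': 108,
              'Garage scene 2A': 104, 'Simulation room 2B': 99,
              'Post physical simulation': 96}
profile = stress_profile(student_y1)
print(profile['Job notification'])  # → 24 (BPM above baseline)
```

Using each participant's own baseline, rather than a population average, is what makes readings comparable across participants with very different resting heart rates (such as the professional-athlete 'anomaly' noted above).
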

Participant interviews

In 2018, we added short interviews of each participant immediately post-simulation, recorded via smartphone video for later analysis. In these short interviews, all participants agreed that the VR experience enhanced the traditional simulation and helped inform their patient diagnosis. Participants were asked three questions, and their responses are summarised below.

Participants commented that the VR experience increased the authenticity of the pre-clinical simulation briefing and generally helped prepare them for the physical clinical simulation better than the traditional verbal briefing. Most participants felt that the VR experience introduced higher levels of pre-clinical simulation stress than a traditional verbal briefing and better represented real-world practice. All participants agreed that it would be beneficial to make more use of VR simulation in future paramedic training scenarios. Suggestions for future improvement included being able to reproduce the VR scenario on the walls of the physical simulation room – like a Star Trek ‘Holodeck’ experience.


Here we explore the common emergent themes regarding the implementation of our design principles across the first two iterations of the MMR-enhanced clinical simulation project.

Revisiting the research questions

While each project iteration focused upon a specific research question linked to a predetermined pedagogical goal, both iterations are linked by a common research methodology (McKenney and Reeves 2019) and an overarching research question: How can we design clinical simulation learning environments that are more authentic (than current practice), facilitate the development of higher-order critical thinking and are cost-effective?

Thematic analysis of subjective participant data (Tobias and Duffy 2009; Vasilevski and Birt 2019) revealed that the most positive feedback came from the second-year students and the professional paramedics; both groups were the most enthusiastic about the value the VR-enhanced simulation added to their learning. This indicates a 'sweet spot' for the impact of VR-enhanced simulation between inexperienced novices and third-year students who are already highly experienced with traditional clinical simulation techniques. Practicing paramedics believed that VR provided a more authentic training experience than their prior educational experiences (Cochrane et al. 2018a).

Refining the design framework

In this section, we outline the refinement of our developing design framework for integrating MMR into the health education curriculum in light of the impact of the first two iterations. Similar to Luckin (2008), the design framework (Table 2) identifies a supporting MMR ecology of resources (Figure 2). The framework represents the outcome of reflection upon the two prototype implementation cycles: the key lessons from the first data collection event (August 2017) and the considerations in preparation for the second biofeedback data collection event in late July 2018. We categorised these under four main areas: methodology enhancement, participant data, pre-MR experience and during the MR experience.

Methodology enhancement

Participant data

Pre-MR experience

During the MR experience

The implementation framework summarised in Table 2 will therefore be revised and updated to respond to this feedback in the 2019 project iteration and will be reported in subsequent articles from the authors.

Transferable design principles

The key contribution of this study iteration was the development of an implementation framework (Table 2) that effectively comprises a set of transferable design principles (McKenney and Reeves 2019). We envision that the same methodology can be modified and implemented to develop a broader suite of critical care clinical simulation scenarios, and applied in other educational contexts and faculties across the university.

Limitations and future directions

The main limitation of our MMR development methodology is its reliance upon higher education practitioners who have the desire and time to explore innovation in their practice through the integration of new technologies into their curricula. These practitioners are supported through a design team that includes academic advisors and an MMR development team (Table 1). In the future, we hope to expand the project into all seven health disciplines in the university, which will require the identification and support of key practitioners in each of the remaining five disciplines. Converting interest into commitment is the main constraint of our approach (Porter and Graham 2016), but the benefit is the sense of ownership and empowerment that practitioners gain as a result.

The 2019 iteration of the project aims to better integrate the flow of the learning experience between the VR pre-simulation and the actual clinical mannequin-based simulation. This third iteration of the MESH360 project will refine the design principles and implementation framework established in the first two iterations of the DBR project.


In this article, we have discussed the development of a design-based research methodology and implementation framework (Table 2) to guide the design and implementation of MMR-enhanced clinical simulation within the health education curriculum. We have found that VR-enhanced clinical simulation creates a more authentic simulation experience, particularly for novice practitioners. We have also found that triangulating subjective participant feedback with participant biometric stress indicators provides evidence of the effectiveness of the VR-enhanced learning experience. Refinement of the implementation framework will help us maximise the authenticity of clinical simulation training.


Aguayo, C., Cochrane, T. & Narayan, V. (2017) ‘Key themes in mobile learning: prospects for learner-generated learning through AR and VR’, Australasian Journal of Educational Technology, vol. 33, pp. 27–40. doi: 10.14742/ajet.3671

Aguayo, C., et al., (2018) ‘Embodied reports in paramedicine mixed reality learning’, Research in Learning Technology, vol. 26. doi: 10.25304/rlt.v26.2150

Bannan, B., Cook, J. & Pachler, N. (2015) ‘Reconceptualizing design research in the age of mobile learning’, Interactive Learning Environments, vol. 24, pp. 1–16. doi: 10.1080/10494820.2015.1018911

Barr, N. & Foster, J.-A. (2017) ‘Using consensus group methods to improve simulation using immersive media in paramedicine’, Journal of Paramedic Practice, vol. 9, pp. 121–125. doi: 10.12968/jpar.2017.9.3.121

Birt, J., Moore, E. & Cowling, M. (2017) ‘Improving paramedic distance education through mobile mixed reality simulation’, Australasian Journal of Educational Technology (AJET), vol. 33, pp. 69–83. doi: 10.14742/ajet.3596

Blaschke, L. & Hase, S. (2015) ‘Heutagogy, technology, and lifelong learning for professional and part-time learners’, in Transformative Perspectives and Processes in Higher Education, eds A. Dailey-Hebert & K. S. Dennis, Springer International Publishing, Switzerland, pp. 75–94.

Blaschke, L. M. & Hase, S. (2019) ‘Heutagogy and digital media networks: setting students on the path to lifelong learning’, Pacific Journal of Technology Enhanced Learning, vol. 1, pp. 1–14. doi: 10.24135/pjtel.v1i1.1

Burden, K. & Kearney, M. (2016) ‘Conceptualising authentic mobile learning’, in Mobile Learning Design: Theories and Application, eds D. Churchill, J. Lu, K. F. T. Chiu & B. Fox, Springer, Singapore. pp. 27–42. doi: 10.1007/978-981-10-0027-0_2

Cochrane, T. (2013a) ‘Mlearning as a catalyst for pedagogical change’, in Handbook of Mobile Learning, eds Z. Berge & L. Muilenburg, Routledge, New York, pp. 247–258. doi: 10.4324/9780203118764.ch22

Cochrane, T. (2013b) ‘A summary and critique of mlearning research and practice’, in Z. Berge & L. Muilenburg, Handbook of Mobile Learning, Routledge, New York, pp. 24–34.

Cochrane, T. (2014) ‘Critical success factors for transforming pedagogy with mobile Web 2.0’, British Journal of Educational Technology, vol. 45, pp. 65–82. doi: 10.1111/j.1467-8535.2012.01384.x

Cochrane, T., et al., (2019) ‘Developing a mobile immersive reality framework for enhanced simulation training: MESH360’, in ASCILITE 2019: 36th International Conference on Innovation, Practice and Research in the Use of Educational Technologies in Tertiary Education, eds S. C. Y. Wei, C. K. Mun & A. Alphonso, 2–5 December 2019, ASCILITE, Singapore University of Social Sciences (SUSS), Singapore, pp. 389–394.

Cochrane, T., et al., (2018a) ‘Designing immersive mobile learning mixed reality for paramedic education’, in 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE), IEEE, University of Wollongong, Wollongong, Australia, pp. 80–85.

Cochrane, T., et al., (2017) ‘A DBR framework for designing mobile virtual reality learning environments’, Australasian Journal of Educational Technology (AJET), vol. 33, pp. 54–68. doi: 10.14742/ajet.3613

Cochrane, T., et al., (2016) ‘Designing virtual reality environments for paramedic education: MESH360’, in Show Me The Learning. Proceedings ASCILITE 2016 Adelaide, eds S. Barker, S. Dawson, A. Pardo & C. Colvin, 28–30 November 2016, Ascilite, University of South Australia, Adelaide, Australia, pp. 125–135.

Cochrane, T. & Narayan, V. (2014) ‘Cultivating creative approaches to learning’, in Experiences in Self-Determined Learning, eds L. M. Blaschke, C. Kenyon & S. Hase, CreateSpace Independent Publishing Platform, pp. 149–170.

Cochrane, T. & Narayan, V. (2016) ‘Principles of modeling COPs for pedagogical change: lessons learnt from practice 2006 to 2014’, in Implementing Communities of Practice in Higher Education: Dreamers and Schemers, eds J. Mcdonald & A. Cater-Steel, Springer, Singapore, pp. 619–643.

Cochrane, T. & Narayan, V. (2017) ‘Design considerations for mobile learning’, in Instructional-Design Theories and Models, eds C. Reigeluth, B. J. Beatty & R. Myers, Routledge, New York.

Cochrane, T., et al., (2018b) ‘Authentic interprofessional health education scenarios using mobile VR’, Research in Learning Technology, vol. 26, p. 2130. doi: 10.25304/rlt.v26.2130

Cook, J. & Santos, P. (2016) ‘Three phases of mobile learning state of the art and case of mobile help seeking tool for the health care sector’, in Mobile Learning Design, eds D. Churchill, J. Lu, T. K. F. Chiu & B. Fox, Springer, Singapore, pp. 315–333. doi: 10.1007/978-981-10-0027-0_19

Dall’Alba, G. & Barnacle, R. (2007) ‘An ontological turn for higher education’, Studies in Higher Education, vol. 32, pp. 679–691. doi: 10.1080/03075070701685130

Eames, C. & Aguayo, C. (2019) Using Mobile Learning in Free-Choice Educational Settings to Enhance Ecological Literacy [Online], Teaching and Learning Research Initiative, Available at: http://www.tlri.org.nz/tlri-research/research-progress/cross-sector/using-mobile-learning-free-choice-educational-settings.

Edgerton, J. (2019) ‘Wearable technology and intermittent health care monitoring: the wave is here, the tsunami is coming’, The Journal of Thoracic and Cardiovascular Surgery, vol. 157, pp. 244–245. doi: 10.1016/j.jtcvs.2018.07.060

Foster, J.-A. (2017) ACODE Award 2017 [Online], University of the Sunshine Coast, Australia. Available at: https://youtu.be/r16K8WCHZhM

Hase, S. (2014) ‘An introduction to self-determined learning (Heutagogy)’, in Experiences in Self-Determined Learning, eds L. M. Blaschke, C. Kenyon & S. Hase, CreateSpace Independent Publishing Platform, Seattle, pp. 1–9.

Hase, S. & Kenyon, C. (2007) ‘Heutagogy: a child of complexity theory’, Complicity: an International Journal of Complexity and Education, vol. 4, pp. 111–118. doi: 10.29173/cmplct8766

Hellhammer, D. H., Wüst, S. & Kudielka, B. M. (2009) ‘Salivary cortisol as a biomarker in stress research’, Psychoneuroendocrinology, vol. 34, pp. 163–171. doi: 10.1016/j.psyneuen.2008.10.026

Kaufman, D. (2010) ‘Simulation in health professional education’, in Educational Gameplay and Simulation Environments: Case Studies and Lessons Learned, eds D. Kaufman & L. Sauvé, IGI Global, Hershey, PA, USA, pp. 51–67. doi: 10.4018/978-1-61520-731-2.ch003

Kearney, M., et al., (2012) ‘Viewing mobile learning from a pedagogical perspective’, Research in Learning Technology, vol. 20, pp. 1–17. doi: 10.3402/rlt.v20i0.14406

Laschinger, S., et al., (2008) ‘Effectiveness of simulation on health profession students' knowledge, skills, confidence and satisfaction’, International Journal of Evidence-Based Healthcare, vol. 6, pp. 278–302. doi: 10.1111/j.1744-1609.2008.00108.x

Leonard, S. & Fitzgerald, R. (2018) ‘Holographic learning: a mixed reality trial of Microsoft HoloLens in an Australian secondary school’, Research in Learning Technology, vol. 26.

Luckin, R. (2008) ‘The learner centric ecology of resources: a framework for using technology to scaffold learning’, Computers & Education, vol. 50, pp. 449–462. doi: 10.1016/j.compedu.2007.09.018

McKenney, S. & Reeves, T. (2019) Conducting Educational Design Research, Routledge, London.

Milgram, P. & Kishino, F. (1994) ‘A taxonomy of mixed reality visual displays’, IEICE TRANSACTIONS on Information and Systems, vol. 77, pp. 1321–1329. https://search.ieice.org/bin/summary.php?id=e77-d_12_1321

Ming-Zher, P., Swenson, N. C. & Picard, R. W. (2010) ‘A wearable sensor for unobtrusive, long-term assessment of electrodermal activity’, IEEE Transactions on Biomedical Engineering, vol. 57, pp. 1243–1252. doi: 10.1109/TBME.2009.2038487

Muhanna, M. A. (2015) ‘Virtual reality and the CAVE: taxonomy, interaction challenges and research directions’, Journal of King Saud University – Computer and Information Sciences, vol. 27, pp. 344–361. doi: 10.1016/j.jksuci.2014.03.023

Narayan, V., Herrington, J. & Cochrane, T. (2019) ‘Design principles for heutagogic learning: implementing student-determined learning with mobile and social media tools’, Australasian Journal of Educational Technology (AJET), vol. 35. doi: 10.14742/ajet.3941

OECD. (2015) Students, Computers and Learning, OECD Publishing, Paris, Pisa.

Porter, W. W. & Graham, C. R. (2016) ‘Institutional drivers and barriers to faculty adoption of blended learning in higher education’, British Journal of Educational Technology, vol. 47, pp. 748–762. doi: 10.1111/bjet.12269

Reeves, T., Herrington, J. & Oliver, R. (2005) ‘Design research: a socially responsible approach to instructional technology research in higher education’, Journal of Computing in Higher Education, vol. 16, pp. 97–116. doi: 10.1007/BF02961476

Stretton, T., Cochrane, T. & Narayan, V. (2018) ‘Exploring mobile mixed reality in healthcare higher education: a systematic review’, Research in Learning Technology, vol. 26, p. 2131. doi: 10.25304/rlt.v26.2131

Tobias, S. & Duffy, T. (eds.) (2009) Constructivist Instruction: Success or Failure?, Routledge, London.

Vasilevski, N. & Birt, J. (2019) ‘Analysing construction student experiences of mobile mixed reality enhanced learning in virtual and augmented reality environments’, Research in Learning Technology, vol. 27. doi: 10.25304/rlt.v28.2329