ORIGINAL RESEARCH ARTICLE
Daniel Clark*
E-Learning Team, Rutherford College, University of Kent, Canterbury, UK
(Received: 10 March 2022; Revised: 13 May 2022; Accepted: 25 May 2022; Published: 27 June 2022)
This study presents an evaluation of an online game-based student access, retention, progression and attainment (ARPA) initiative at the University of Kent. The initiative, a narrative-based simulation of a condensed student journey from pre-enrolment to graduation, is designed to prepare and support students in their transition to and participation in Higher Education. Student retention continues to be a perennial issue across the Higher Education sector, and studies have indicated that the more knowledgeable and informed students are about their university environment, the less likely they are to leave before completing their studies. Many institutions have developed interventions with the express purpose of addressing these concerns. Recognising the contextualised and subjective nature of such interventions, a realist evaluative framework was adopted to better understand the initiative under scrutiny, asking what works, for whom and in what circumstances. Participant interviews were utilised to assess the efficacy of the initiative in supporting students and in helping them to navigate often unfamiliar institutional cultures, practices and expectations. A revised programme theory is presented, enabling deeper insight into the merit of the initiative and its overall worth as a mechanism for change within the ARPA paradigm.
Keywords: Higher Education; gamification; retention; student success
*Corresponding author. Email: d.r.clark@kent.ac.uk
Research in Learning Technology 2022. © 2022 D. Clark. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.
Citation: Research in Learning Technology 2022, 30: 2782 - http://dx.doi.org/10.25304/rlt.v30.2782
Introduced in 2019, the University of Kent’s One Hour Degree (OHD) is an online narrative-based simulation game of a condensed undergraduate student journey from pre-enrolment to graduation. The OHD provides a safe space for students to engage in realistic university-related scenarios, encouraging independent decision-making and risk-taking, and was conceived as part of a suite of access, retention, progression and attainment (ARPA) activities delivered by Kent’s Student Success team. OHD participants are presented with academic and personal choices, representing over 100 million unique pathways with both positive and negative outcomes; participants score knowledge and wellbeing points throughout that dictate their final ‘degree’ outcome. The OHD was designed to address issues affecting the transition to and progression within university study and was developed in response to concerns about the availability of information for new students (University of Kent 2019b). Kent’s Student Success team identified recurring themes concerning the availability of information; students commented that they felt ill-informed and unprepared for study, citing a lack of knowledge of where to seek support, low confidence in making reasoned decisions concerning their time and activities, and an unfamiliarity with the university environment (University of Kent 2019a, p. 1). Students made frequent mistakes and, lacking knowledge of the repercussions, experienced detrimental impacts later on (University of Kent 2019a, p. 3). Some students noted that they had no prior knowledge of Higher Education (HE), and this further compounded their poor decision-making and reluctance to seek help, putting them at risk of dropping out (University of Kent 2019a, p. 1).
The Office for Students (OfS) has set long-term access and retention targets and highlighted the role Higher Education Providers (HEPs) play, providing information to students, enabling them to prepare for, and succeed in, university study (Office for Students 2018, 2020). HEPs have sought to redress these issues through the Student Success movement, an umbrella of activities in flexible learning, employability, internationalisation, assessment and ARPA (Advance HE 2019). Whilst our understanding of retention has been enhanced by recent research, the persistence of non-completion indicates that this remains a prevalent issue (HEPI 2021; HESA 2021), particularly when considering the efforts of various programmes aimed at reducing it. Attention must, therefore, turn to the efficacy of ARPA initiatives themselves.
Realist evaluation has gained credence as a methodological framework for evaluating complex initiatives such as the OHD (see Dytham and Hughes 2017; Formby, Woodhouse, and Basham 2020; Ryan 2020; Pickering 2021). Realist evaluation affords a degree of pragmatism: it eschews simplistic notions of programme success or failure and instead seeks to uncover the conditionalities and contextualities of a programme, exploring what worked for whom, in what circumstances and why (Pawson and Tilley 1997). The purpose of this study is to evaluate participant experiences of the OHD within this framework.
There are three research questions: (1) Do students feel better informed, and do they have increased awareness of the demands of university study, because of the OHD? (2) Does engagement with the OHD make students feel more confident in making decisions? (3) What aspects of the OHD work, for whom, in what circumstances and why?
Studies have indicated that activities undertaken before and during the first weeks of a student’s first year of study are crucial in preparing and integrating students into the cultures and expectations of university life (Cook and Rushton 2009; Lowe and Cook 2003; Roberts et al. 2003; Yorke and Longden 2008). ARPA initiatives have focussed on the correlation between successful pre-entry activities and retention (Billing 1997; Currant and Keenan 2009; Pennington et al. 2018), and many HEPs have developed online activities to exploit this correlation; the OHD being an example of this.
Research has focussed on the capacity for such interventions to develop confidence of where to seek help (Crosling, Heagney, and Thomas 2009; Currant and Keenan 2009; Lefever and Currant 2010; O’Donnell, Kean, and Stevens 2016; Tchen et al. 2018; Turner et al. 2017; Webb and Cotton 2018), with evidence suggesting that the more informed students are, the less likely they are to leave university before completion (Brooman and Darwent 2014; Laing, Robinson, and Johnston 2005; Money et al. 2016).
The OHD acts to ‘spread the load’, enabling preparation ahead of study (Keenan 2009) and elucidating the ‘jargon’ of HE, where orientation into the ‘codified structures of the university’ (Currant and Keenan 2009, p. 3) is crucial in enabling students to adapt into the culture of an institution and to develop an enhanced sense of belonging (Turner et al. 2017, p. 811).
There is a paucity of longitudinal studies in this area, and only a limited number have evaluated the efficacy of online game-based initiatives (Buzzo and Phelps 2016; Hamshire, Whitton, and Whitton 2012; Krause and Williams 2015; Piatt 2009; Whitton et al. 2014). Similar initiatives have combined game-based elements; however, the OHD is novel as it is entirely game-based (University of Kent 2019a).
The OHD is situated amongst literature extolling game-based activities for their immersive qualities (Carenys and Moya 2016; Sailer et al. 2017; University of Kent 2019a), including their ability to coalesce real life into an ‘alternative reality’ (Whitton et al. 2014; Zahedi et al. 2021). By encountering realistic scenarios, participants hone their critical decision-making capabilities in a ‘safe’ environment, in which they have a tangible influence over outcomes (Moseley 2008; University of Kent 2019a).
Research indicates that online activities can improve a student’s capacity to transition to and participate in HE (Currant and Keenan 2009; Knox 2005; Turner et al. 2017); however, further research is needed to evaluate the efficacy of game-based initiatives themselves (Piatt 2009; Whitton et al. 2014). Existing studies lack depth: they do not adequately address the complexities associated with such initiatives and rely on overly simplistic measurements of success or failure. Given that non-continuation remains a persistent issue, there is a need to dive deeper and to understand the contextualities and the ‘mechanics of explanation’ (Pawson and Tilley 1997, p. 55).
Realist evaluation offers this deeper view; it enables the evaluator to examine the underlying mechanisms – the elements that ‘intervene between the delivery of a programme and the occurrence of outcomes’ (Weiss 1997, p. 46) – to elucidate the ‘black box’ of intervention, focussing not on the effects of an intervention, but the conditions in which those effects are produced (Astbury and Leeuw 2010). Therefore, realist evaluation affords a more nuanced understanding; it is cognisant of the breadth of the OHD and the multiplicity of its participants; thus, this study presents a valuable contribution to a developing field.
Realist evaluation is a theory-driven approach and is informed by the realist evaluation cycle (Pawson and Tilley 1997). A suitable programme theory is used to articulate and hypothesise the context-mechanism-outcome (CMO) configurations that underpin how a programme’s activities are thought to produce outcomes.
The OHD was developed in the absence of a programme theory, but it was apparent that there was a need for a dedicated theory to articulate how the OHD was expected to work. Drawing on appropriate guidance (Funnell and Rogers 2011; Harries, Hodgson, and Noble 2014; Knowlton and Phillips 2012) and existing literature, a programme theory was developed in consultation with the Student Success team (see Figure 1) and further broken down into its hypothesised CMO configurations (see Table 1).
Figure 1. OHD programme theory presented through a logic model.
Realist evaluation is nonprescriptive in its approach to data collection. Semi-structured interviews were selected to capture qualitative data concerning experiences of the OHD. Interviews are highly effective at highlighting contexts and mechanisms that produce variable outcomes (Dalkin et al. 2012; Manzano 2016) and at revealing the complexities of a given programme or intervention.
The OHD is completed anonymously; however – upon completion – every participant is given the opportunity to enter their email to be contacted for further feedback. This arrangement pre-dates this evaluation; however, it was used as a mechanism to recruit prospective participants here. To date, 292 students have completed the OHD, of which 42 provided their email.
Empirical research has indicated that the first 5–10 interviews yield the majority of new information within a dataset, with little new information emerging as the sample size increases (Francis et al. 2010; Guest, Bunce, and Johnson 2006; Morgan et al. 2002). Given the timing of the study (end of the spring term), the researcher anticipated a low rate of return, and therefore, all 42 students were invited to interview – 18 students responded and were interviewed.
Ethical clearance was obtained, and interviews were conducted online. Interview design focussed on eliciting data based on propositions concerning the efficacy of the OHD (Manzano 2016). By adopting the teacher-learner cycle approach (Pawson and Tilley 1997), participants were introduced to the programme theory (or elements of it), allowing them to accept, reject or refine theories contingent to their own experiences. Interviews sought to capture programme ‘stories’ (Pawson and Tilley 1997) in order to test the hypothesised CMO configurations.
The interviewees consisted of 13 males and five females; all were registered students at the University of Kent. Fifteen were Home/EU students, and three were Overseas students. They were in various stages of study; nine were in Stage 1 (2020 entry), eight in Stage 2 (2019 entry) and one in Stage 3 (2018 entry); all but two had completed the OHD during their first year of study. Secondary data were derived from a pre-existing survey made available to all OHD participants by the Student Success team in early 2020.
Consistent with the realist approach, analysis focussed on identifying and refining CMO patterns across the dataset. Interviews were coded following a theory-driven inductive approach using NVivo (Manzano 2016). All data were de-identified. Realist analysis is non-sequential and iterative, and therefore, consistent with Emmel’s iterative approach (2013), analysis commenced in parallel with data collection, affording a degree of agility and enabling emergent CMO configurations to be tested and refined in subsequent interviews (Manzano 2016; Pawson 2013).
Interview transcripts were re-read in order for the researcher to develop ‘hunches’ about how the OHD works (Dalkin et al. 2021). Top-level thematic codes were applied to each ‘hunch’. A more detailed analysis was then undertaken. Elements of the dataset relating to impacts or outputs were coded as outcomes. Once outcome patterns had been identified and disaggregated, the generative mechanism for each outcome (where available) was identified and coded as such. Similarly, data relating to the conditionality or situatedness of a participant were identified and coded as context. Finally, the individual C, M and O coded themes were drawn together to form refined CMO configurations, which could then be compared with the initial programme theory.
The interview data provided insight into the students’ experiences of the OHD, and, consistent with the realist approach, a summary of the CMO themes uncovered is presented in Table 2. The quotations included below are representative of each area of scrutiny. ‘S’ represents the student participants 1–18.
The following themes relate to the contextual constraints or contingencies that frame experiences of the OHD in relation to the research questions.
S4: It felt a little alien. That might be because I’m a mature student, but it felt like it was designed for traditional students, you know, those younger than me. I largely switched off at that point.
S7: I’m not massively social. I’m shy, so it was hard to relate.
S9: My experience of university is always going to be different to someone living on campus.
Students indicated that the narrative of the OHD was predicated on that of a ‘traditional’ student, and some did not identify with this persona.
S12: I did it just before I came to Kent…It did help with the transition because I had a list of stuff that was likely to happen, like induction week, getting my student card, meeting my advisor. When it came up, I was like oh yeah, I remember this from the game. So that was good.
S16: I think it was too late for me. I’d been here so long it was almost pointless. It didn’t really tell me anything I didn’t know.
The data indicate that early engagement with the OHD influences its efficacy; perceived benefit was lower among those who undertook the initiative later.
S17: It was in an email about student support. I mean I clicked the link, but it was like ‘what is this?’ I just didn’t know what it was for. I only really finished the game because it said something about getting employability points [a Kent-based reward scheme] if you complete it.
S11: [the School’s Student Success lecturer] showed it to me in a tutorial. Because I was struggling, she explained why it would be good for me, especially the bits in the first year. When I actually started playing it, it sort of clicked, that these stresses were normal.
S15: Our student success lecturer went through it at the start of a lecture and that was useful. She checked with us a week later to see how we got on.
The data indicate that perception of the OHD was influenced by its introduction. In cases where students had found out about the OHD via email or happened upon it, there was an element of confusion concerning its purpose. There was a greater degree of comprehension when it was introduced in a structured way by a knowledgeable practitioner.
S2: I just sort of clicked the responses they expected me to click so I never really made any mistakes.
S6: I actually went out of my way to make the wrong choices. Well, I wanted to see what the consequences were. I think I wanted to see if he’d get kicked out of uni. In a way, it was sort of interesting to see how a couple of wrong decisions can make a big difference.
Appetite for risk-taking influenced how students engaged with the narrative; some students actively followed what they considered to be the expected pathway.
S3: I knew very little about uni so it helped
S13: I chatted to my friend from work. He was in his second year at *redacted* I think when I was applying for uni. I just sort of asked him whether he enjoyed it.
When participants were asked whether they spoke to family or friends about university prior to enrolment, the responses indicated that those with no familial experiences of HE found that the OHD exposed them to the activities, expectations and norms of university life, whereas others were able to obtain ‘hot’ knowledge from family members and friends.
S1: Way more than one hour!
S3: It took longer than I expected
In both the interview data and the secondary pre-existing survey data, there were indications that the OHD was a time-consuming activity, and that its title is misleading.
Mechanisms represent a combination of resources delivered by an intervention and the stakeholders’ reasoned responses to those resources (Dalkin et al. 2015; Pawson and Tilley 1997) framed in relation to the research questions.
S7: It was just unrealistic. It’s just a false environment. Not all students have difficulty choosing study over their social life, some students are just quiet.
Some students felt that the scenarios were unrealistic compared to the actualities of lived experience. This impacted upon the OHD’s ability to give students access to authentic situations.
S8: It was useful knowing who to ask if I encountered problems.
S3: I stuck with it because each level sort of introduces something new that’ll help you.
S14: I guess I just didn’t realise it was important.
The data indicated that where there was a perceived benefit to engaging with the OHD, students were more likely to spend the time required to complete the game.
S10: When you make the right decision, it’s like ‘well done, here’s your points’. So, I was like, oh, I better start clicking the opposite of what I think in case I’m missing something useful.
Some students commented that it felt unusual to learn about where to seek help by making mistakes. One noted that they had to make ‘tactical errors’ for fear of missing out on ‘hot’ knowledge afforded only to those making the wrong choice, rather than the right one.
S3: It’s better than reading a website. I guess when you’ve done it in the game, you’ll remember it for when it happens in real life.
S1: It was kinda addictive.
S18: Why do it in a game when I could easily do it for real.
When asked how a game-based initiative might help students to be more informed, develop confidence and manage expectations, students commented that the gamified experience was novel, and that this made them curious about the different pathways in the game. The game-based experience itself instilled a sense of wanting to ‘get to the next level’. Conversely, some students felt that a game was not an appropriate means of inducting students into university.
Some outcomes were broadly in line with those hypothesised in the initial programme theory; however, there also emerged a set of unintended outcomes based on additional context-mechanism linkages.
S18: Maybe break it down into three games – one per year. I’m not going to remember the stuff about doing a dissertation or final exams.
S7: Because I was bored, I sort of clicked through.
Data indicate that whilst all those interviewed ‘completed’ the game, many paid less attention to the narrative in the latter parts of the game. Levels of interest waned once the narrative moved beyond a student’s frame of reference (e.g. the first year).
S4: I did find the study skills stuff useful, it gave me a good idea of who to see if I needed extra help.
S6: I guess money is a big deal. Yeah, I don’t remember seeing much about that in the game.
Participants indicated that the OHD raised awareness of where to seek support; however, this is limited to study support, with some commenting they were unsure where to seek help for issues relating to wellbeing and finance. The secondary data support this; there are numerous references to the absence of non-academic support, such as finance:
‘no mention of money’
‘needs more info on loans and fees’
S5: The stuff about exams and progressing to year two and three was good though…like, recognising that what you do now impacts something later on. Like, if I put the effort in now, it’ll make things easier in year two.
S1: It was interesting to see what’s in store – a sneak preview.
Students indicated that the OHD helped them to feel better informed of the ‘long game’ of university study, with some commenting that it helped them to visualise a ‘road map’ of progression.
S2: You can be confident, but still make the wrong choice.
S16: Life isn’t always like that though… some stuff you can predict, like exams being stressful, but some stuff isn’t predictable. You just have to react to that stuff when it comes. I don’t know how you can prepare for that.
Relative to the second research question, students perceived little to no change in the confidence of their decision-making.
Pawson and Tilley (1997) describe the winners, losers, pros and cons of an intervention, noting that evaluators should anticipate such variation within their findings. The data reject elements of the initial programme theory whilst supporting other aspects of it. The secondary data support the view that experiences of the OHD have been varied.
Where students failed to identify with the narrative (C1) and where the game’s scenarios were unrealistic (M1), programme mechanisms failed to ‘fire’ for some because the contextual conditions were not conducive. Equally, where the OHD was undertaken later (C2) and where there was lower perceived benefit to participation (M2), there was a reduced incidence of intended outcomes (O2 and O3). Timing (C2 and C6) and means of introduction (C3) influenced perception of the OHD (M2), and this configuration framed engagement with the narrative (O1) and awareness of where to seek help (O2). The game’s central tenet of ‘mistake making’ was contingent upon reasoning, with ‘hot’ knowledge only accessible to those whose conceptualisation of mistakes (M3) aligned with the initial programme theory. Appreciation of the ‘roadmap’ of the student journey (O3) was an unanticipated outcome.
The results from this study demonstrate the complexities of applying online game-based initiatives within the field of student ARPA.
Research has cited online game-based activities as having an emancipatory effect on the learning environment (Martin and Benton 2017; Sailer et al. 2017; Zahedi et al. 2021), transplanting participants into an alternate reality to create immersive learning experiences (Whitton et al. 2014). However, this study has shown there to be an added layer of complexity when applied to the ARPA arena; a student’s capacity to identify with the gamified environment is crucial in triggering the required mechanisms that give rise to the desired outcomes. Those who saw the greatest benefit of the OHD were those able to identify with its narrative; so, for mature students, commuting students and others who could not relate to the scenarios and characters of the OHD, the perceived benefits diminished.
Because of its narrative, therefore, the OHD risks replicating the exclusionary influences that it is designed to eradicate. This is consonant with the concepts of habitus and field (Bourdieu 1977) that have been applied in recent ARPA literature (see Burke 2012; Crozier and Reay 2008; Pickering 2021; Reay, David, and Ball 2005). Fields, in this sense, can be defined as ‘mutually supporting combinations of intellectual discourses and social institutions’ (Robbins 1993, p. 151); as such, if an individual’s habitus encounters an unfamiliar field, it can result in feelings of ‘disquiet, ambivalence, insecurity and uncertainty’ (Reay, David, and Ball 2005, p. 28), as exemplified by the reaction of some to the narrative.
Unlike other (time-limited) online transition activities (Crosling, Heagney, and Thomas 2009; Currant and Keenan 2009; Lefever and Currant 2010; O’Donnell, Kean, and Stevens 2016; Tchen et al. 2018; Turner et al. 2017; Webb and Cotton 2018), the OHD spans the entirety of the student journey, reflecting Crafter and Maunder’s (2012) view of transition as a continuum. However, this study has demonstrated that levels of interest waned once the narrative moved beyond a student’s frame of reference (e.g. the first year). Whilst some indicated that the OHD helped them to visualise the ‘student journey’, this outcome is likely to be ephemeral in that it is, for most, an abstract concept.
Notwithstanding this, the study has shown that the OHD was more effective when introduced in a structured manner, as this helped to convey its perceived benefits. The presence of a practitioner was transformative as it enabled participants to surpass the threshold concept of the OHD’s purpose. Other such studies have highlighted the flexibility afforded when participants engage ‘organically’ in their own time (Keenan 2009; O’Donnell, Kean, and Stevens 2016); however, the OHD’s complexity warrants structure. Where this structure was provided, students were more motivated to engage and felt better informed as a result. To invoke this outcome, the mechanism of ‘perceived value of participation’ needed to be triggered; this is crucial because, unlike other activities that have a clear and obvious goal (e.g. an assessment mark), the benefits of engaging may not be immediately apparent. It remains unclear whether this translated into students feeling more confident in making decisions. Decision-making is inherently tied to risk (Davies and Williams 2001), and attitudes to risk are highly contextualised. Risk taking is cited as a benefit of gamification (Kapp 2013; Sailer et al. 2017); however, as shown in this study, a student’s predisposition to risk impacted on decision-making within the OHD. This is a flaw in the OHD’s design, as its ethos is predicated on risk taking and learning through error (University of Kent 2019a), and, for some, this mechanism did not ‘fire’.
This study has two key limitations. First, there is a longitudinal gap; whilst some second/third-year students were interviewed, there is a need for additional data concerning the impact on the full student lifecycle. It is not yet possible to capture these data; however, understanding how the OHD maps against the actualities of the complete student journey is crucial. Second, this study elicited data from those who completed the OHD; it does not include those who did not complete or engage at all. This fell outside of the study’s scope; however, it is an obvious area for further scrutiny. The participants here are, largely, already motivated and engaged with their studies. It can generally be assumed that those who will benefit the most from initiatives such as this are also those hardest to reach.
From the results of this evaluation, a refined programme theory is presented in Figure 2 and based on the OHD in its current state of operation.
Figure 2. A refined programme theory for the OHD.
This study was one of the first of its kind to apply a realist framework to the evaluation of an online game-based ARPA initiative. The realist framework has enabled this study to delve deeper into its mechanics, allowing us to explore what it is about such programmes that work for whom and in what circumstances. The findings of this study are relevant on both a local level and more broadly for those looking to implement similar initiatives.
Advance HE. (2019) Essential Frameworks for Enhancing Student Success. York: Advance HE. Available at: https://www.advance-he.ac.uk/sites/default/files/2020-05/Enhancing%20Student%20Success%20in%20Higher%20Education%20Framework.pdf
Astbury, B. & Leeuw, F. L. (2010) ‘Unpacking black boxes: mechanisms and theory building in evaluation’, American Journal of Evaluation, vol. 31, no. 3, pp. 363–381. doi: 10.1177/1098214010371972
Billing, D. (1997) ‘Induction of new students to higher education’, Innovations in Education and Training International, vol. 34, no. 2, pp. 125–134. doi: 10.1080/1355800970340208
Bourdieu, P. (1977) Reproduction in Education, Society and Culture, SAGE Publications, London.
Brooman, S. & Darwent, S. (2014) ‘Measuring the beginning: a quantitative study of the transition to higher education’, Studies in Higher Education, vol. 39, no. 9, pp. 1523–1541. doi: 10.1080/03075079.2013.801428
Burke, P. J. (2012) The Right to Higher Education – Beyond Widening Participation, Routledge, London.
Buzzo, D. & Phelps, P. (2016) ‘JourneyMap: Visualising the time-bound student journey’, in EVA London 2016: Electronic Visualisation and the Arts (British Computer Society), eds N. Lambert, J. Bowen & G. Diprose, London, 12–15 July, pp. 178–185.
Carenys, J. & Moya, S. (2016) ‘Digital game-based learning in accounting and business education’, Accounting Education, vol. 25, no. 6, pp. 598–651. doi: 10.1080/09639284.2016.1241951
Cook, T. & Rushton, B. (2009) How to Recruit and Retain Higher Education Students – A Handbook of Good Practice, Taylor & Francis, London.
Crafter, S. & Maunder, R. (2012) ‘Understanding transitions using a sociocultural framework’, Educational and Child Psychology, vol. 29, no. 1, pp. 10–18.
Crosling, G., Heagney, M. & Thomas, L. (2009) ‘Improving student retention in higher education: improving teaching and learning’, The Australian Universities’ Review, vol. 51, no. 2, pp. 9–18. doi: 10.4324/9780203935453
Crozier, G. & Reay, D. (2008) ‘The socio cultural and learning experiences of working class students in higher education’, Teaching and Learning Research Briefing, (44) June. Teaching and Learning Research Programme, London.
Currant, B. & Keenan, C. (2009) ‘Evaluating systematic transition to higher education’, The Brookes Ejournal of Learning and Teaching, vol. 2, no. 4, pp. 1–12.
Dalkin, S., et al., (2021) ‘Using computer assisted qualitative data analysis software (CAQDAS; NVivo) to assist in the complex process of realist theory generation, refinement and testing’, International Journal of Social Research Methodology, vol. 24, no. 1, pp. 123–134. doi: 10.1080/13645579.2020.1803528
Dalkin, S. M., et al., (2012) ‘Understanding integrated care pathways in palliative care using realist evaluation: a mixed methods study protocol’, BMJ Open, vol. 2, no. 4. doi: 10.1136/bmjopen-2012-001533
Dalkin, S. M., et al., (2015) ‘What’s in a mechanism? Development of a key concept in realist evaluation’, Implementation Science, vol. 10, no. 1, p. 49. doi: 10.1186/s13012-015-0237-x
Davies, P. & Williams, J. (2001) ‘For me or not for me? Fragility and risk in mature students’ decision-making’, Higher Education Quarterly, vol. 55, no. 2, pp. 185–203. doi: 10.1111/1468-2273.00182
Dytham, S. & Hughes, C. (2017) Widening Participation Research and Evaluation: Where Are We Now?, University of Warwick, Coventry, UK, p. 8.
Emmel, N. (2013) Sampling and Choosing Cases in Qualitative Research: A Realist Approach, SAGE, London.
Formby, A., Woodhouse, A. & Basham, J. (2020) ‘Reframing widening participation towards the community: a realist evaluation’, Widening Participation and Lifelong Learning, vol. 22, no. 2, pp. 184–204. doi: 10.5456/WPLL.22.2.184
Francis, J. J., et al., (2010) ‘What is an adequate sample size? Operationalising data saturation for theory-based interview studies’, Psychology & Health, vol. 25, no. 10, pp. 1229–1245. doi: 10.1080/08870440903194015
Funnell, S. C. & Rogers, P. J. (2011) Purposeful Program Theory: Effective Use of Theories of Change and Logic Models, Wiley, New York.
Guest, G., Bunce, A. & Johnson, L. (2006) ‘How many interviews are enough? An experiment with data saturation and variability’, Field Methods, vol. 18, no. 1, pp. 59–82. doi: 10.1177/1525822X05279903
Hamshire, C., Whitton, N. & Whitton, P. (2012) ‘Staying the course – a game to facilitate students’ transitions to higher education’, in 6th European Conference on Games Based Learning, ed P. Felicia, Cork, Ireland, 4–5 October, pp. 624–630.
Harries, E., Hodgson, L. & Noble, J. (2014) Creating Your Theory of Change, NPC, London.
HEPI. (2021) A Short Guide to Non-Continuation in UK Universities, Higher Education Policy Institute, Oxford.
HESA. (2021) Non-Continuation: UK Performance Indicators | HESA. Available at: https://www.hesa.ac.uk/data-and-analysis/performance-indicators/non-continuation
Kapp, K. M. (2013) The Gamification of Learning and Instruction Fieldbook, John Wiley & Sons, New Jersey.
Keenan, C. (2009) ‘Stepping Stones 2HE: fresh thinking for introducing PDP to freshers’, in Enhancing Student-Centred Learning in Business and Management Hospitality Leisure Sport Tourism, eds J. Buswell & N. Becket, Threshold Press, Newbury, pp. 169–178.
Knowlton, L. W. & Phillips, C. C. (2012) The Logic Model Guidebook: Better Strategies for Great Results, SAGE, London.
Knox, H. (2005) ‘Making the transition from further to higher education: the impact of a preparatory module on retention, progression and performance’, Journal of Further and Higher Education, vol. 29, no. 2, pp. 103–110. doi: 10.1080/03098770500103135
Krause, M. & Williams, J. (2015) ‘A playful game changer: fostering student retention in online education with social gamification’, in Second ACM Conference on Learning @ Scale, ed G. Kiczales, Association for Computing Machinery, New York, 14–18 March, pp. 95–102. doi: 10.1145/2724660.2724665
Laing, C., Robinson, A. & Johnston, V. (2005) ‘Managing the transition into higher education: an on-line spiral induction programme’, Active Learning in Higher Education, vol. 6, no. 3, pp. 243–255. doi: 10.1177/1469787405059575
Lefever, R. & Currant, B. (2010) ‘How can technology be used to improve the learner experience at points of transition?’, Higher Education Academy, York, pp. 1–90.
Lowe, H. & Cook, A. (2003) ‘Mind the gap: are students prepared for higher education?’, Journal of Further and Higher Education, vol. 27, no. 1, pp. 53–76. doi: 10.1080/03098770305629
Manzano, A. (2016) ‘The craft of interviewing in realist evaluation’, Evaluation, vol. 22, no. 3, pp. 342–360. doi: 10.1177/1356389016638615
Martin, C. & Benton, T. (2017) ‘Character creation: gamification and identity’, Teaching Media Quarterly, vol. 5, no. 2, pp. 1–8.
Money, J., et al., (2016) ‘Co-creating a blended learning curriculum in transition to higher education: a student viewpoint’, Creative Education, vol. 7, pp. 1205–1213. doi: 10.4236/ce.2016.79126
Morgan, M., et al., (2002) Risk Communication: A Mental Models Approach, Cambridge University Press, New York.
Moseley, A. (2008) ‘An alternate reality for higher education? Lessons to be learned from online games’, in ALT-C, ed S. Schmoller, Leeds, 9–11 September, pp. 92–95.
O’Donnell, P., Kean, M. & Stevens, G. (2016) Student Transition in Higher Education – Concepts, Theories and Practices, Higher Education Academy, York.
Office for Students. (2018) Providing Information, Advice and Guidance for Students – Our Research. Available at: https://www.officeforstudents.org.uk/advice-and-guidance/student-information-and-data/providing-information-advice-and-guidance-for-students/our-research/
Office for Students. (2020) Information, Advice and Guidance for Prospective Students. Available at: https://www.officeforstudents.org.uk/publications/coronavirus-briefing-note-information-advice-and-guidance-for-prospective-students/
Pawson, R. (2013) The Science of Evaluation – A Realist Manifesto, SAGE, London.
Pawson, R. & Tilley, N. (1997) Realistic Evaluation, SAGE, London.
Pennington, C. R., et al., (2018) ‘Transitioning in higher education: an exploration of psychological and contextual factors affecting student satisfaction’, Journal of Further and Higher Education, vol. 42, no. 5, pp. 596–607. doi: 10.1080/0309877X.2017.1302563
Piatt, K. (2009) ‘Using alternate reality games to support first year induction with ELGG’, Campus-Wide Information Systems, vol. 26, no. 4, pp. 313–322. doi: 10.1108/10650740910984646
Pickering, N. (2021) ‘Enabling equality of access in higher education for underrepresented groups: a realist “small step” approach to evaluating widening participation’, Research in Post-Compulsory Education, vol. 26, no. 1, pp. 111–130. doi: 10.1080/13596748.2021.1873410
Reay, D., David, M. E. & Ball, S. (2005) Degrees of Choice: Social Class, Race and Gender in Higher Education, Trentham Books Ltd, Stoke on Trent, UK.
Robbins, D. (1993) ‘The practical importance of Bourdieu’s analyses of Higher Education’, Studies in Higher Education, vol. 18, no. 2, pp. 151–163. doi: 10.1080/03075079312331382339
Roberts, C., et al., (2003) ‘Supporting student “success”: what can we learn from the persisters?’, in Education in a Changing Environment: Inaugural Learning and Teaching Research Conference, ed L. Anderson, University of Salford, UK, 17–18 September, pp. 45–53.
Ryan, F. (2020) ‘A virtual law clinic: a realist evaluation of what works for whom, why, how and in what circumstances?’, The Law Teacher, vol. 54, no. 2, pp. 237–248. doi: 10.1080/03069400.2019.1651550
Sailer, M., et al., (2017) ‘How gamification motivates: an experimental study of the effects of specific game design elements on psychological need satisfaction’, Computers in Human Behavior, vol. 69, pp. 371–380. doi: 10.1016/j.chb.2016.12.033
Tchen, P., et al., (2018) ‘Bridging the gap: an evaluation of self-paced online transition modules for advanced pharmacy practice experience students’, Currents in Pharmacy Teaching and Learning, vol. 10, no. 10, pp. 1375–1383. doi: 10.1016/j.cptl.2018.07.006
Turner, R., et al., (2017) ‘Easing the transition of first year undergraduates through an immersive induction module’, Teaching in Higher Education, vol. 22, no. 7, pp. 805–821. doi: 10.1080/13562517.2017.1301906
University of Kent. (2019a) ‘One Hour Degree proposal’. Internal University of Kent paper. Unpublished.
University of Kent. (2019b) Student Success Project – The One Hour Degree. Available at: https://www.kent.ac.uk/studentsuccess/onehourdegree.html
Webb, O. J. & Cotton, D. R. E. (2018) ‘Early withdrawal from higher education: a focus on academic experiences’, Teaching in Higher Education, vol. 23, no. 7, pp. 835–852. doi: 10.1080/13562517.2018.1437130
Weiss, C. (1997) ‘Theory-based evaluation: past, present, and future’, New Directions for Evaluation, vol. 1997, no. 76, pp. 41–55. doi: 10.1002/ev.1086
Whitton, N., et al., (2014) ‘Alternate reality games as learning environments for student induction’, Interactive Learning Environments, vol. 22, no. 3, pp. 243–252. doi: 10.1080/10494820.2011.641683
Yorke, M. & Longden, B. (2008) The First-Year Experience of Higher Education in the UK – Final Report, Higher Education Academy, York.
Zahedi, L., et al., (2021) ‘Gamification in education: a mixed-methods study of gender on computer science students’ academic performance and identity development’, Journal of Computing in Higher Education, vol. 33, pp. 441–474. doi: 10.1007/s12528-021-09271-5