ORIGINAL RESEARCH ARTICLE
Sarah Corneliusa*, Colin Calderb and Peter Mtikaa
aSchool of Education, University of Aberdeen, MacRobert Building, King’s College, Aberdeen, AB24 5UA;
bCentre for Academic Development, University of Aberdeen, Regent Building, King’s College, Aberdeen, AB24 3FX
(Received: 14 June 2018; Final version received: 17 January 2019; Published: 13 February 2019)
The use of Massive Open Online Courses (MOOCs) in blended learning contexts is becoming increasingly common, but relatively little is known about the experiences of on-campus learners taking MOOCs. This article reports research that explored the experiences of on-campus learners taking a blended course which included a MOOC. Use of the UK Engagement Survey (UKES) provided a focus on engagement and permitted comparisons with a wider cohort of on-campus learners. Findings show no differences between learners on the blended course and the wider cohort of on-campus learners for some aspects of engagement. However, learners on the blended course were more engaged than the wider cohort on specific aspects measured by UKES, including those which appear related to social learning. Evidence from a small number of interviews is used to explore the issues raised and, informed by the Community of Inquiry framework, factors which influence blended learners’ engagement with the MOOC are discussed. Some of the findings support the call for amendments to the Community of Inquiry framework for MOOC contexts and provide evidence of issues related to social and teaching presence that may need additional consideration.
Keywords: community of inquiry, learners’ experiences, engagement surveys, MOOC, blended learning
*Corresponding author. Email: s.cornelius@abdn.ac.uk
Research in Learning Technology 2019. © 2019 S. Cornelius et al. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.
Citation: Research in Learning Technology 2019, 27: 2097 - http://dx.doi.org/10.25304/rlt.v27.2097
Massive Open Online Courses (MOOCs) have created new opportunities for learning globally and have been taken by millions of distributed learners. They also have value in on-campus provision, and there are many examples of MOOCs being blended into on-campus courses (e.g. Bruff et al. 2013; Ghadiri et al. 2013; Jaffer, Govender, and Brown 2017; Yousef et al. 2015). At the University of Aberdeen, students on a blended course (referred to by the course code SX1519) joined a global cohort studying on the FutureLearn MOOC, ‘Africa: sustainable development for all?’. This article examines learners’ experiences on the blended course, focusing on learner engagement. The aim of the study was to gain insights from learners to enhance course design and delivery.
Reported benefits of blended learning with MOOCs include enhanced learning experiences, improved student outcomes and reduced costs (Israel 2015). For on-campus students, MOOCs can provide digital experiences and exposure to global perspectives. However, there can be challenges. Practical issues such as timing courses around third-party MOOC delivery and integration of on-campus and MOOC learning platforms have been reported as difficulties (Israel 2015). Limited engagement of on-campus students in MOOC discussions has also been reported (see, e.g., Caulfield, Collier, and Halawa 2013), and face-to-face contact appears to remain important in on-campus scenarios, as Jaffer, Govender, and Brown (2017) found for postgraduate students on a ‘wrapped’ MOOC course.
Research has been undertaken to evaluate engagement within MOOCs. For example, Milligan, Littlejohn, and Margaryan (2013) developed a custom instrument drawn from a range of pre-existing self-regulated learning instruments to describe learner profiles and subsequently categorised participants as Active, Lurker or Passive. The data collected by MOOC platforms also provide detailed footprints of learners’ actions. Aspects such as time spent on task, activities completed and test scores are collected routinely and can also contribute to an understanding of engagement. Data mining was used by Kahan, Soffer, and Nachmias (2017) to investigate participant behaviour on a biology MOOC. Drawing on data such as the number of video downloads and views, threads opened, comments submitted and exams completed, learners were categorised into seven types ranging from tasters (64.8%) to social engagers (0.6%). Whilst such categories provide some insight into the actions taken by learners, it should be acknowledged that on their own they do not reveal when activities have led to meaningful learning (Sinclair and Kalvala 2016).
It should also be acknowledged that there is a lack of conceptual clarity over the term ‘student engagement’ (Buckley 2014a), and there has been considerable interest in broad concepts and theoretical underpinnings (see, e.g., Macfarlane and Tomlinson 2017). Buckley (2014a) noted that student engagement can be examined to benefit both political and pedagogical ends. At a fundamental level, he suggested that engagement has been ‘taken to be a process that leads to effective learning’ (2014a, p. 4), which involves individuals, institutions or both working together. In Scotland, the quality of the higher education learning experience is informed by the Student Engagement Framework (https://www.sparqs.ac.uk/upfiles/SEFScotland.pdf). This outlines key elements and features of student engagement, which cover individual, institutional, political and pedagogical issues. Amongst the framework’s six features of effective student engagement is ‘students engaging in their own learning’. It is principally at this level of pedagogic engagement that we focus in this article, examining aspects such as behaviour and barriers to learning. However, we acknowledge that elements of the wider context within which students study play a role in their engagement. This understanding allows the critical application of standard survey tools to explore the experiences of learners on a blended MOOC alongside those of a wider cohort of on-campus learners.
Trowler (2010) considered the methods and instruments developed to measure student engagement. In many respects, these instruments have become the proxy for our concepts of engagement, as exemplified by the widely deployed North American National Survey of Student Engagement (NSSE) developed by Indiana University. Sinclair and Kalvala (2016) suggested that the NSSE focuses on behavioural level dimensions of student engagement, and although they are cautious about the interpretation of results from this and other surveys, they also cited the work by Pascarella, Seifert, and Blaich (2010) which suggested that engagement surveys can indicate learning gain. The UK Engagement Survey (UKES) used in this study draws on a subset of NSSE questions. It was piloted by the Higher Education Academy (HEA) in 2013 and 2014 and was used by subscribing UK Higher Education institutions from 2015 (Howson and Buckley 2016). Factor analysis and scale reliability testing have found that UKES measures distinct but related dimensions of students’ engagement with their studies (Buckley 2014b).
UKES is designed to measure engagement across eight scales and collects additional behavioural data about how students spend their time. During the pilot phases of UKES, Wintrup, Wakefield, and Davis (2015) used the survey to compare the engagement of students on two MOOCs. Their study, noting that the UKES constructs appeared to be meaningful to learners, found evidence of frequent, high levels of engagement for many learners. Differences between the MOOCs under investigation were also noted, and it was suggested that MOOC design and teaching approaches may impact engagement. Wintrup, Wakefield, and Davis (2015) also compared findings with UKES data obtained across their university in the previous year. Whilst acknowledging the difficulties in making comparisons when survey questions had changed, this revealed, for instance, that social, interactive learning was higher in face-to-face programmes. This work prompted our interest in applying UKES to the blended course SX1519. The findings contribute to the growing body of empirical evidence of learners’ experiences with MOOCs and to the range of contexts to which UKES has been applied.
The theoretical framework for this work is the Community of Inquiry (Garrison, Anderson, and Archer 1999), a widely used pedagogic model which has been validated for the MOOC context (Kovanović et al. 2018). It has also been applied to blended learning settings (e.g. Akyol and Garrison 2011; Wicks et al. 2015), used to measure student engagement (Damm 2016) and to analyse successful MOOCs (Cohen and Holstein 2018). The model articulates three dimensions (cognitive, social and teaching presences) which shape learners’ experiences of enquiry-based learning within a learning community. This model fits well with the pedagogy of SX1519, which is outlined later in this article.
However, some concerns about the applicability of the Community of Inquiry framework to MOOCs have been aired. Gašević et al. (2014) called for research into new theoretical underpinnings, noting that the large cohort size and short duration of a typical MOOC may impact on social presence and opportunities for enquiry. These issues may be overcome in the blended scenario, where courses are likely to be longer in duration and structured opportunities for enquiry are possible. Kovanović et al. (2018) have recently proposed updates to the model. Thus, although survey instruments for the evaluation of Communities of Inquiry do exist, this research applies the theoretical framework post hoc, recognising the state of development of the framework and its survey instruments for this context and allowing consideration of an alternative ‘engagement’ perspective.
‘Africa: Sustainable development for all?’ (SX1519) is a blended course designed to provide an interdisciplinary opportunity for curriculum enhancement for first- and second-year on-campus undergraduates. The University’s MOOC with the same title was delivered on the FutureLearn platform and simultaneously available globally. The 6-week MOOC replaced lectures in the on-campus course and comprised recordings of video presentations by educators, case-study videos, text and reading, activities, discussions and weekly tests. Following an induction week, on-campus students completed the MOOC, engaged with supporting tutorials and additional resources (including films and music), then completed their course with a group project and presentation (Figure 1). Assessment included the graded MOOC weekly tests, an additional online assessment, group presentations and individual reflections. SX1519 employed the MOOC to support self-directed independent online learning within a global learning community and used face-to-face approaches to develop a local community engaged in collaborative enquiry and tutorial discussions. Although not explicitly designed to reflect the phases of inquiry-based learning suggested by Garrison, Anderson, and Archer (2001), SX1519 did allow a process which included elements of this, including problem-solving, knowledge exploration, synthesis and resolution.
Figure 1. Blending of the MOOC with on-campus course SX1519.
Quantitative and qualitative data were collected to explore learners’ experiences. The UKES survey was used to obtain data on engagement from learners on SX1519 as well as from a wider cohort of undergraduate learners. To support verification and to provide in-depth insights into learners’ experiences, data were also obtained from a small number of interviews with learners, from discussions with the course coordinator and from the FutureLearn platform.
UKES is one of a number of ‘core’ surveys deployed to undergraduates at the University of Aberdeen to evaluate satisfaction and engagement and to provide information on the student experience for enhancement purposes. To comply with an institutional survey policy designed to limit the number of surveys students are expected to complete, students generally complete only one core survey each year. In 2016, UKES was deployed with first- and third-year students between March and April.
The 2016 UKES questionnaire1 consisted of 49 largely Likert-style questions arranged in nine scales, free-text comments for each scale, and optional institutional and National Student Survey (NSS) questions. Institutional questions were not added, but NSS ‘Teaching on my course’ questions were included. Questions on assessment and feedback and on how respondents spend their time were not considered in this study. The survey was implemented before course feedback had been received, and thus it was considered inappropriate for respondents to comment on this issue. Time-related questions included topics such as time spent in paid employment, which are not relevant to the research here. Thus, UKES questions in the survey made available to SX1519 participants and considered in this article covered:
• critical thinking;
• learning with others;
• interacting with staff;
• reflecting and connecting;
• course challenge;
• engagement with research and inquiry;
• staff–student partnerships;
• skills development;
• overall satisfaction and the NSS ‘Teaching on my course’ questions.
SX1519 was timetabled to run alongside the open MOOC from January to March 2016. As SX1519 was largely populated by second-year students who would not receive the institutional UKES questionnaire, an independent iteration of the survey (identical apart from a modification of the introductory text to emphasise that responses should be restricted to experiences with SX1519) was opened specifically for them before the institution-wide survey was live or publicised. Confirmation was obtained from the HEA for use of UKES for the research, and the study was granted ethical approval. All surveys were delivered online via onlinesurveys.ac.uk.
Members of the SX1519 cohort who had completed UKES were invited to participate in focus groups scheduled after the course and the survey had closed. Recruiting students to focus groups was challenging, but a small number (n = 3) participated in individual and pair interviews to provide more detailed accounts of their experiences and interactions on the course. Platform metrics provided by FutureLearn were used to verify accounts and initial findings, for example, to check that students had contributed to discussions as described. The course coordinator also provided contextual information and course documentation to support interpretation and exploration of issues raised.
This article focuses primarily on engagement, and analysis was guided by the question ‘what were learners’ experiences of a blended course incorporating a MOOC?’ Other questions of interest included:
Forty-five of 88 students (51%) from SX1519 and 606 of 4058 students (15%) from the general first- and third-year student cohort responded to the UKES. Analysis indicated that five of the SX1519 students had also responded to the general undergraduate survey, and their responses to the general survey were consequently excluded from analysis.
Two-tailed Mann–Whitney U tests provided pairwise comparisons between the general undergraduate cohort and the MOOC course sample for each UKES question item. Where significant differences between the distributions were found, histograms of the distributions were plotted and the tests were re-calculated with directional alternative hypotheses (H1) to indicate which cohort was more engaged.
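As an illustration of this procedure, a minimal sketch in Python is given below; the item, the Likert responses and the 0.05 threshold are invented placeholders rather than the study’s data, and scipy is assumed as the statistical library.

```python
# Minimal sketch of the pairwise comparison procedure described above.
# The Likert responses are invented placeholders, not the study's data.
from scipy.stats import mannwhitneyu

# Hypothetical responses (1 = low engagement ... 4 = high) for one UKES item
ug_cohort = [2, 3, 2, 4, 3, 2, 3, 1, 2, 3]   # general undergraduate cohort
sx1519 = [3, 4, 4, 3, 4, 2, 4, 3, 4, 4]      # blended MOOC cohort

# Two-tailed test: do the two distributions differ?
u_stat, p_two_sided = mannwhitneyu(ug_cohort, sx1519, alternative='two-sided')
print(f"U = {u_stat:.1f}, two-sided p = {p_two_sided:.4f}")

if p_two_sided < 0.05:  # assumed significance threshold
    # Directional follow-up: is the SX1519 cohort shifted towards higher responses?
    _, p_greater = mannwhitneyu(sx1519, ug_cohort, alternative='greater')
    print(f"directional p (SX1519 > UG) = {p_greater:.4f}")
```

In practice, a histogram of each cohort’s responses would be inspected alongside the directional test, as described above, when judging which cohort was more engaged.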
Findings from quantitative analysis and the questions posed above informed focus group discussions. Following verbatim transcription of these discussions, independent thematic coding of the data (Braun and Clarke 2006) was undertaken by two researchers before themes and codes were agreed. Themes reflected common issues associated with engagement and learners’ experience. These were supplemented by vignettes produced to summarise individual experiences. UKES data are considered first below, then findings from the focus groups are reviewed to provide some verification and highlight key issues. Extracts and quotations from the qualitative data have been selected to illustrate learners’ collective and individual experiences.
In some dimensions covered by UKES, no significant differences were found between the two cohorts. These included overall satisfaction and all questions related to ‘interaction with staff’ and ‘teaching on my course’. Table 1 presents results of pairwise comparisons for all other questions on UKES. It highlights where significant differences were observed and which cohort was more engaged (column η1 / η2).
Table 1. Pairwise comparisons between cohorts for UKES items (Mann–Whitney U tests).
UKES item | More engaged cohort (η1/η2)* | Mann–Whitney U test p-value (η1 = η2 vs. η1 ≠ η2)
Critical thinking: During the current academic year, how much has your course emphasised the following activities? | |
• Applying facts, theories or methods (e.g. to practical problems or new situations) | UG | 0.0004
• Analysing ideas or theories in depth | UG | 0.0003
• Evaluating or judging a point of view, decision or information source | | 0.6986
• Forming a new understanding from various pieces of information | | 0.5811
Learning with others: During the current academic year, about how often have you done each of the following? | |
• Worked with other students on course projects or assignments | SX1519 | 0.0000
• Explained course material to one or more students | | 0.9996
• Asked another student to help you understand course material | | 0.5686
• Prepared for exams or assessments by discussing or working through course material with other students | SX1519 | 0.0731
Reflecting and connecting: During the current academic year, about how often have you done each of the following? | |
• Combined ideas from different modules when completing assignments | | 0.6882
• Connected your learning to real-world problems or issues | | 0.1830
• Examined the strengths and weaknesses of your own views on a topic or issue | | 0.1572
• Tried to better understand someone else’s views by imagining how an issue looks from his or her perspective | SX1519 | 0.0099
• Changed the way you thought about a concept or issue as a result of what you learned | | 0.6770
• Connected ideas from your course to your prior experience and knowledge | | 0.3797
Course challenge: During the current academic year, how much has your course emphasised taking responsibility for your own learning? | |
• During the current academic year, how much has your course challenged you to do your best work? | UG | 0.0140
Engagement with research and inquiry: During the current academic year, how much has your course emphasised the following activities? | |
• Learning about the methods of research and analysis in your subject | | 0.2088
• Learning about the outcomes of current research in your subject | SX1519 | 0.0869
• Formulating and exploring your own questions, problems or scenarios | | 0.3540
• Doing research (such as working on your own research project, or working on a research project with staff) | SX1519 | 0.0084
Staff–student partnerships: During the current academic year, how much have you been encouraged to do the following activities? | |
• Contributing to a joint community of staff and students | SX1519 | 0.0033
• Working with staff to make improvements to your course | | 0.2135
• Working with staff to evaluate teaching and assessment practices | | 0.5026
Skills development: How much has your overall student experience contributed to your knowledge, skills and personal development in the following areas? | |
• Writing clearly and effectively | UG | 0.0007
• Speaking clearly and effectively | | 0.9912
• Thinking critically and analytically | | 0.7504
• Analysing numerical and statistical information | | 0.1857
• Acquiring employability skills (e.g. skills to help you get a job such as CV writing or career planning) | | 0.2404
• Becoming an independent learner | | 0.9030
• Being innovative and creative | | 0.4160
• Working effectively with others | SX1519 | 0.0094
• Developing or clarifying personal values or ethics | | 0.1499
• Understanding people of other backgrounds (economic, racial/ethnic, political, religious, nationality, etc.) | SX1519 | 0.0201
• Exploring complex real-world problems | SX1519 | 0.0009
• Being an informed and active citizen | SX1519 | 0.0031
* UG – general undergraduate cohort; SX1519 – blended MOOC cohort. Where no cohort is shown against a p-value, no significant difference was found.
Figure 2 summarises the findings, with numbers in brackets indicating the number of items in each dimension for which significant differences were found.
Figure 2. Patterns of student engagement in the undergraduate cohort and SX1519 (blended MOOC course).
Whilst the inferential statistical testing indicates that there are differences, and which cohort is ‘more engaged’ for each Likert item, it is important to note that the tests do not indicate the magnitude of the differences. However, they do suggest that differences exist and are not due to sampling variability. Figure 2 shows that the general cohort appears to be more engaged in aspects related to critical thinking and challenge, whilst SX1519 respondents report more engagement in aspects of learning with others, reflecting, engagement with research and enquiry, and staff–student partnerships. There were differences in engagement with specific aspects of skills development, with SX1519 respondents reporting higher engagement on aspects related to working with others and understanding people, whilst general cohort respondents were more engaged with the development of writing skills.
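As an aside not drawn from the reported analysis, one common way to express the magnitude of a Mann–Whitney result is the rank-biserial correlation; the sketch below is illustrative only, with invented numbers.

```python
# Hypothetical sketch: rank-biserial correlation as an effect size for a
# Mann-Whitney U test. Not part of the reported analysis; values are illustrative.
def rank_biserial(u_stat: float, n1: int, n2: int) -> float:
    """Effect size in [-1, 1]; positive when the first group tends to score higher."""
    return 2 * u_stat / (n1 * n2) - 1

# e.g. an illustrative U statistic of 20 for two groups of 10 respondents
print(rank_biserial(u_stat=20.0, n1=10, n2=10))  # -> -0.6
```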
Following the statistical analysis, information was obtained from focus group discussions to enrich our understanding of differences and other key aspects of student engagement on SX1519. Whilst in-depth information from the participants was obtained during discussion, the difficulty of drawing general conclusions from these data is acknowledged. Although there were some areas of agreement between respondents, there were also many differences, which reflect the different personalities, learning preferences and motivation of the respondents. As a result, qualitative data are used here in an illustrative manner to highlight some of the key issues which emerged.
Focus group participants were asked how they got started with SX1519 and specifically with the MOOC. Two of the three focus group respondents had never taken a MOOC before SX1519. The importance of induction was acknowledged, and although none of the respondents had problems themselves, they all reported that other students had not been sure how to engage initially but picked it up quickly. The course coordinator also reported some initial confusion, for example, over the various software platforms being used for the course. Confusion over the introductory email also led to some students overlooking their invitation to join. Once engaged, and perhaps as a result of the integration of the MOOC into an assessed course, learners reported systematic and thorough approaches to working through the MOOC materials. The respondent who had previously undertaken other MOOCs talked about taking SX1519 ‘more clinically’ than a non-university student might have done. They appeared to engage as requested, for example, posting contributions to discussions when asked to do so. All of the focus group participants had posted messages in the MOOC and suggested that they would be more likely to contribute to a discussion in this form than in a face-to-face classroom. All respondents valued the flexibility of the approach and the variety of media available, together with the choice and control they had using the MOOC. All spoke about ‘active learning’, regarding learning in the MOOC as more active than sitting in a lecture.
As mentioned above, all focus group respondents reported posting comments in the MOOC. One respondent stated that he was not interested in making contributions, which suggests that his contributions may have been made in response to course requirements rather than voluntarily. However, all three respondents learnt from the discussions, albeit in different ways. One found text contributions to be useful alternative explanations of content, another valued interesting and authentic examples provided in comments, whilst the third valued feedback from real-world learners and engaged in some deeper perspective-changing conversations in the MOOC. These comments illustrate social learning taking place in a number of ways – through vicarious learning (e.g. reading and observing other contributions), direct interactions, knowledge exchange and conversation. Respondents also valued the opportunity to learn from others beyond their normal learning context:
Many people who take the course are people who actually live in Sub-Saharan Africa and they give their perspectives about the topics and their opinions. (Female student)
You’re talking about Africa and education and you write something and there is someone who is already in Africa or on the ground doing the work so they would feedback, so then you interact with them, you get to know them…I think…it is one of the very good features of the MOOC. (Male student 1)
You can learn quite a bit from other people’s comments. (Male student 2)
These statements and evidence of some involved interactions give weight to the quantitative data on understanding different perspectives and people of other backgrounds.
Making sense of the nature of other MOOC participants was difficult for the focus group respondents. One assumed that all other participants were students, whilst another commented that ‘there are many African [students] taking the course’. The third had the impression that some participants had professional interests in the topic, in their roles as charity, government or development workers. None of the respondents were able to find or recognise on-campus peers within the MOOC.
Findings from the quantitative analysis highlight particular aspects of engagement as significant for the MOOC cohort. Focus group participants corroborated some of these, for example, ‘contributing to a joint community’. Respondents reported:
You’d be more likely to comment in the MOOC (than in a classroom). (Male student 2)
In the classroom environment… you don’t interact…you are just like yourself basically … there’s more interaction with people elsewhere in the world in the MOOC. (Male student 1)
You feel less pressure, your classmates are not looking at you, if you say something silly you won’t be criticised, at least directly. (Male student 2)
Learners acknowledged other benefits of engagement in a MOOC, recognising that the blended learning approach on SX1519 provided opportunities to engage with a larger cohort of learners and be exposed to a wider range of perspectives than possible in a classroom context:
The connectivity that is part of the MOOC is something that is very, very interesting that you don’t find in a four walled [lecture] theatre. (Male student 1)
If you want to engage a lot of people from a large area then I couldn’t really think of any other ways to do it. (Male student 2)
Despite limitations due to small numbers of respondents, the focus group data provide insights into how learners engaged with the MOOC during their on-campus course and support deeper discussion of issues identified by the UKES data.
The majority of the UKES scales with significant differences which favoured the MOOC cohort as more engaged and involved can be conceptualised as including a ‘social’ dimension (see Table 1 and Figure 2). Where the differences favoured the general undergraduate experience, they can be considered to be the dimensions which focus on the rigour of the course – analysing, applying theories and being challenged. Students’ perceptions of the MOOC course and their engagement with it differed from those of the usual on-campus undergraduate experience, but there were areas where no differences between the cohorts were observed – in particular, there were no differences across the entire ‘interacting with staff’ scale.
SX1519 is a course which demands that students attempt to empathise with the experiences of others living different lives on a different continent, and it is consequently refreshing that students reported more engagement with ‘understanding people of other backgrounds’, ‘exploring complex real-world problems’ and ‘being an informed and active citizen’. However, for many of the dimensions of engagement in this study, there appears to be a consistent component of social learning which differs between the MOOC blend and the general on-campus experience, and which extends beyond specific curricular differences.
Focus group respondents reported being encouraged to contribute to a community and being more likely to contribute in the MOOC than in a face-to-face classroom. The behaviours exhibited, which included undertaking activities systematically and contributing to discussions when requested, differ from those reported in other studies. For example, students have been found to be more likely to interact in face-to-face programmes (Bruff et al. 2013; Wintrup, Wakefield, and Davis 2015). Bruff et al. (2013) reported that learners on a blended course including a MOOC undertook selective reading, mostly to find answers to questions, and contributed no posts. Caulfield, Collier, and Halawa (2013) reported limited participation in forums, and Milligan and Littlejohn (2014) found little exchange of ideas and experience in a MOOC on clinical trials. Sinclair and Kalvala (2016) suggest that online communities are claimed as a positive aspect of MOOCs, but that collaborative learning is under-utilised. Since our findings suggest active engagement, this appears to be an aspect worthy of further investigation. The course coordinator suggested that the willingness to contribute may have been influenced by tutors’ monitoring of MOOC participation, and that the completion of all steps on the course may have been driven by the need to complete the weekly tests successfully. However, there is also evidence that learners felt under less pressure, and therefore more confident about participating, online rather than face-to-face.
Platform pedagogy may also have an influence on interaction and social learning. Sharples (2013) highlights social learning as one of the three principles (along with storytelling and celebrating progress) underpinning course design on the FutureLearn platform. To help achieve this, commenting is visible and accessible on the platform, and this may, in part, explain the levels of contribution reported in this study. In SX1519, learners were required to comment regularly on questions set by educators, and tutors monitored their activity. In addition, face-to-face tutorials drew directly from the comments that Aberdeen students and the wider cohort were posting in the MOOC. For example, tutors used interesting comments during face-to-face discussions to help consolidate and advance learning.
Thus, in this case, course design, platform pedagogy and teaching presence (defined by Kovanović et al. (2018) as instructional activities before and during the course) may have influenced learners’ willingness and ability to contribute effectively and engage in social learning. Effective teaching presence has also supported the blending of the two contexts – global MOOC community and local on-campus cohort – to provide learners with support and opportunities for learning. Commenting on differences between MOOCs, Wintrup, Wakefield, and Davis (2015) proposed that ‘specific forms of learning are sensitive to MOOC pedagogy and curricula and that design and teaching approaches can elicit particular forms of engagement’ (p. 8). Work is underway to explore the influence of MOOC platform and course design on discussion characteristics (Chua et al. 2017) but more evidence is needed to allow firm conclusions to be drawn.
The opportunity to have questions answered by the global community engaged in a MOOC, sometimes by experienced professionals with real-world experience relevant to the course, may have helped SX1519 participants to have confidence in the MOOC discussions, although some participants appear not to have realised that experts were on hand online. The ability of MOOC learners to project a sense of their expertise and authority requires effective social presence, an important element of the Community of Inquiry framework. The inability of focus group respondents to identify peers in the MOOC, or to correctly identify the nature of other MOOC participants, suggests that their own social presence and their appreciation of that of others were insecure, and that there was, at least in the MOOC, a lack of group affectivity and cohesion, both important elements of social presence (Rourke et al. 1999). Further work might be needed to ensure that on-campus learners are confident with the platform tools which support the development of social presence and participants’ understanding of the identity of others undertaking a MOOC. The induction stage could also be used to help learners address any preconceptions or misunderstandings about the nature of other learners and to appreciate the expectations and benefits of interaction.
Respondents highlighted flexibility, control, active learning and the ability to interact with other learners beyond the physical classroom as positive features of the blended approach. Barriers to engagement appear to have been minimal, with initial engagement the only stage at which any problems were acknowledged. The importance of an induction which explains the pedagogic approach to the course and ensures that it is accessible is clear, and the course coordinator suggested that a more thorough hands-on demonstration of how to access the MOOC would have been helpful. Wintrup, Wakefield, and Davis (2015) also suggest that the initial steps in the MOOC are important for engagement: ‘this first step “inside the shop” needs to be easy and attractive …’ (p. 15). It is clear that a strong teaching presence is needed to set expectations, overcome initial barriers to engagement and support the development of social presence.
Kovanović et al. (2018) have proposed updates to the Community of Inquiry model for MOOC settings, suggesting a six-factor model with additional factors related to course organisation and design (teaching presence), group affectivity (social presence) and the resolution phase of inquiry-based learning (cognitive presence). Elements of these issues have emerged in the context under examination here. Course organisation and design clearly require careful consideration, and here the two communities, MOOC and on-campus, were thoughtfully integrated to provide a coherent learning experience with opportunities for elements of inquiry-based learning. Based on evidence presented here, the case could be made for stronger teaching presence, particularly at the start of a blended course, and additional support to help learners develop effective social presence, particularly in connection with group cohesion and affectivity.
Whilst UKES has previously been employed to explore engagement in MOOCs, its implementation in a blended context and comparison with on-campus learners is novel. Engagement surveys deployed in complex learning settings, such as blended learning contexts, may yield different patterns of responses, and there is evidence in this study of differences between the experiences of learners undertaking a blended course including a MOOC and those of a wider cohort of on-campus students. UKES was not specifically designed for such a context, and thus qualitative data from learners have been used to validate and supplement the quantitative data.
The majority of the UKES scales with significant differences which favoured the MOOC cohort as more engaged and involved can be conceptualised as including a ‘social’ dimension. Where the differences favoured the general undergraduate experience, they reflect dimensions which focus on the rigour of the course – critical thinking and challenge. No differences in engagement were identified in key areas including ‘interaction with staff’ and ‘teaching on the course’.
The design of the blended course, the MOOC platform pedagogy and the actions of tutors all contributed to learners’ engagement. Employing a MOOC in a blended context can provide opportunities to encourage and support social learning both on and off the MOOC platform. Induction to a blended pedagogy, including explanation of expectations and of technical and social issues, is identified as a factor which may impact on engagement. The findings also raise issues worthy of further exploration, for example, the impact of blended course design and teachers’ actions on engagement and participation in discussions, where findings are at odds with some other studies.
Exploration of findings through a Community of Inquiry lens reveals issues around social presence (e.g. inconsistencies in the perception of the identity of other learners) and highlights the key role of teaching presence (particularly, in course organisation and design). Thus, this study also supports some of the proposed updates to the Community of Inquiry framework suggested by Kovanović et al. (2018).
The authors would like to acknowledge the work of the large team behind the MOOC ‘Africa: Sustainable Development for All?’ which was led by Professor Hilary Homans. Thanks are also due to the students who participated in the UKES, and particularly those who contributed to the focus groups.
1https://www.heacademy.ac.uk/download/ukes-2016-questionnaire
Akyol, Z. & Garrison, D. R. (2011) ‘Understanding cognitive presence in an online and blended community of inquiry: assessing outcomes and processes for deep approaches to learning’, British Journal of Educational Technology, vol. 42, no. 4, pp. 233–250. https://doi.org/10.1111/j.1467-8535.2009.01029.x
Braun, V. & Clarke, V. (2006) ‘Using thematic analysis in psychology’, Qualitative Research in Psychology, vol. 3, no. 2, pp. 77–101. https://doi.org/10.1191/1478088706qp063oa
Bruff, D., et al., (2013) ‘Wrapping a MOOC: student perceptions of an experiment in blended learning’, Journal of Online Learning and Teaching, vol. 9, no. 2, pp. 187–199.
Buckley, A. (2014a) ‘How radical is student engagement’, Student Engagement and Experience Journal, vol. 3, no. 2. https://doi.org/10.7190/seej.v3i2.95
Buckley, A. (2014b) UK Engagement Survey 2014: The Second Pilot Year, The Higher Education Academy, York, Available at: https://www.heacademy.ac.uk/system/files/resources/ukes_report_2014_v2.pdf
Caulfield, M., Collier, A. & Halawa, S. (2013) Rethinking Online Community in MOOCs for Blended Learning (web log post), Available at: https://er.educause.edu/articles/2013/10/rethinking-online-community-in-moocs-used-for-blended-learning.
Chua, S. M., et al., (2017) ‘Discussion analytics: identifying conversations and social learners in FutureLearn MOOCs’, LAK-MOOCs-2017 MOOC Analytics: Live Dashboards, Post-Hoc Analytics and the Long-Term Effects, Workshop proceedings. Available at: http://ceur-ws.org/Vol-1967/FLMOOCS_Paper3.pdf
Cohen, A. & Holstein, S. (2018) ‘Analysing successful massive open online courses using the community of inquiry model as perceived by students’, Journal of Computer Assisted Learning, vol. 34, no. 5, pp. 544–556. https://doi.org/10.1111/jcal.12259.
Damm, C. V. A. (2016) ‘Applying a community of inquiry instrument to measure student engagement in large courses’, Current Issues in Emerging ELearning, vol. 3, no. 1, pp. 138–172. https://scholarworks.umb.edu/ciee/vol3/iss1/9
Garrison, D. R., Anderson, T. & Archer, W. (1999) ‘Critical inquiry in a text-based environment: computer conferencing in higher education’, The Internet and Higher Education, vol. 2, pp. 87–105. https://doi.org/10.1016/S1096-7516(00)00016-6
Garrison, D. R., Anderson, T. & Archer, W. (2001) ‘Critical thinking, cognitive presence and computer conferencing in distance education’, American Journal of Distance Education, vol. 15, pp. 7–23. https://doi.org/10.1080/08923640109527071
Gašević, D., et al., (2014) ‘Where is research on Massive Open Online Courses Headed? A data analysis of the MOOC research initiative’, International Review of Research in Online and Distance Learning, vol. 15, no. 5. https://doi.org/10.19173/irrodl.v15i5.1954
Ghadiri, K., et al., (2013) ‘The transformative potential of blended learning using MIT edX’s 6.002x online MOOC content combined with student team-based learning in class’, https://www.edx.org/sites/default/files/upload/ed-tech-paper.pdf
Howson, C. & Buckley, A. (2016) ‘Development of the UK engagement survey’, Assessment and Evaluation in Higher Education, vol. 42, no. 7, pp. 1132–1144. https://doi.org/10.1080/02602938.2016.1235134
Israel, M. J. (2015). ‘Effectiveness of integrating MOOCs in traditional classrooms for undergraduate students’, IRRODL, vol. 16, no. 5, pp. 102–118. https://doi.org/10.19173/irrodl.v16i5.2222
Jaffer, T., Govender, S. & Brown, C. (2017) ‘“The best part was the contact!” Understanding postgraduate students’ experiences of wrapped MOOCs’, Open Praxis, vol. 9, no. 2, pp. 207–221. https://doi.org/10.5944/openpraxis.9.2.565
Kahan, T., Soffer, T. & Nachmias, R. (2017) ‘Types of participant behavior in a Massive Open Online Course’, IRRODL, vol. 18, no. 6, pp. 1–18. https://doi.org/10.19173/irrodl.v18i6.3087
Kovanović, V., et al., (2018) ‘Exploring communities of inquiry in massive open online courses’, Computers and Education, vol. 119, pp. 44–58. https://doi.org/10.1016/j.compedu.2017.11.010
Macfarlane, B. & Tomlinson, M. (2017) ‘Critiques of student engagement’, Higher Education Policy, vol. 30, no. 1, pp. 5–21. https://doi.org/10.1057/s41307-016-0027-3
Milligan, C. & Littlejohn, A. (2014) ‘Supporting professional learning in a massive open online course’, IRRODL, vol. 15, no. 5, pp. 197–213. https://doi.org/10.19173/irrodl.v15i5.1855
Milligan, C., Littlejohn, A. & Margaryan, A. (2013) ‘Patterns of engagement in connectivist MOOCs’, Journal of Online Learning and Teaching, vol. 9, no. 2, pp. 149–159. http://jolt.merlot.org/vol9no2/milligan_0613.htm
Pascarella, E., Seifert, T. & Blaich, C. (2010) ‘How effective are the NSSE benchmarks in predicting important educational outcomes’, Change: The Magazine of Higher Learning, vol. 42, no. 1, pp. 16–22. https://doi.org/10.1080/00091380903449060
Rourke, L., et al., (1999) ‘Assessing social presence in asynchronous text-based computer conferencing’, The Journal of Distance Education, vol. 14, no. 3, pp. 50–71. https://www.learntechlib.org/p/92000/
Sharples, M. (2013) ‘Social learning and large scale online learning’, FutureLearn blog [online] Available at: http://about.futurelearn.com/blog/massive-scale-social-learning/
Sinclair, J. & Kalvala, S. (2016) ‘Student engagement in massive open online courses’, International Journal of Learning Technology (IJLT), vol. 11, no. 3, pp. 218–237. https://doi.org/10.1504/IJLT.2016.079035
Trowler, V. (2010) Student Engagement Literature Review, Higher Education Academy, York, UK, Available at: https://www.heacademy.ac.uk/system/files/studentengagementliteraturereview_1.pdf
Wicks, D. A., et al., (2015) ‘An investigation into the community of inquiry of blended classrooms by a Faculty Learning Community’, The Internet and Higher Education, vol. 25, pp. 53–62. https://doi.org/10.1016/j.iheduc.2014.12.001
Wintrup, J., Wakefield, K. & Davis, H. (2015) Engaged Learning in MOOCs: A Study Using the UK Engagement Survey, Higher Education Academy, York, UK [online], Available at: https://www.heacademy.ac.uk/sites/default/files/resources/engaged-learning-in-MOOCs.pdf
Yousef, A. M. F., et al., (2015) ‘A usability evaluation of a blended MOOC environment: an experimental case study’, IRRODL, vol. 16, no. 2, pp. 69–93. https://doi.org/10.19173/irrodl.v16i2.2032