ORIGINAL RESEARCH ARTICLE

The impact of audience response platform Mentimeter on the student and staff learning experience

Emma Mayhewa*, Madeleine Daviesb, Amanda Millmorec, Lindsey Thompsond and Alicia Pena Bizamae

aFaculty of Arts and Social Science, University of Surrey, Guildford, Surrey, UK; b School of Literature and Languages, University of Reading, Reading, Berkshire, UK; cSchool of Law, University of Reading, Reading, Berkshire, UK; dSchool of Biological Sciences, University of Reading, Reading, Berkshire, UK; eStudent Wellbeing Service, University of Reading, Reading, Berkshire, UK

Received: 20 January 2020; Revised: 3 July 2020; Accepted: 3 July 2020; Published: 30 October 2020.

Abstract

Research suggests that active and discussion-driven dialogic approaches to teaching are more effective than passive learning methods. One way to encourage more participatory learning is through the adoption of simple and freely available audience response systems which allow instant and inclusive staff–student dialogue during teaching sessions. Existing literature is largely limited to exploring the impact of basic approaches to audience participation, using handheld cards or simple ‘clickers’. Limited research exists on the impact and best use of a new generation of online audience response systems with significantly expanded functionality. This article explores the impact of one of the most agile platforms, Mentimeter. It outlines the impact on student satisfaction, enjoyment, voice and learning within small and large group settings across multiple disciplines, drawing on 204 student survey responses. It also explores staff experiences and reflections on the key practical and pedagogical thinking required to optimise the use of this platform in higher education. The research responds to a need within the sector to react to rapid advances in teaching and learning technology and to provide evidence of impact for lecturers looking to improve student learning environments whilst remaining cognisant of the underlying pedagogy supporting new practices.

Keywords: active learning; dialogic teaching

*Corresponding author. Email: e.mayhew@surrey.ac.uk

Research in Learning Technology 2020. © 2020 E. Mayhew et al. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.

Citation: Research in Learning Technology 2020, 28: 2397 - http://dx.doi.org/10.25304/rlt.v28.2397

Introduction

Educators are increasingly being challenged to introduce more interactive and engaging approaches to teaching. Students are right to expect this. An established body of research, across disciplinary areas, has found that the shift away from passive learning methods towards student-centred active learning leads to significant increases in satisfaction, engagement, learning (Knight and Wood 2005; Michael 2006) and attainment (Deslauriers et al. 2019). In particular, research has shown that introducing a more dialogic approach, drawing on the power of classroom talk as the basic foundation of teaching and learning, feeds into student cognitive development and higher attainment (Alexander 2017). Lecturers can use dialogue to understand students’ perspectives, explore emerging ideas and correct misunderstandings. Lecturers and students create a democratic learning community working in a reciprocal, supportive space. Participants can disagree, challenge, self-correct, develop problem-solving skills and learn more deeply than under passive approaches focused on listening and recall.

Whilst advancements in technology have fostered and encouraged dialogue in digital spaces between academics and students, this use of technology is often asynchronous, for example through a discussion board, forum or wiki within the student’s Virtual Learning Environment (VLE). These more static forms of dialogue have been shown to foster a sense of belonging to an online community (McDaniels, Pfund, and Barnicle 2016; Yee and Ean 2020), but there are limitations: the quality of discussion is often constrained and does not develop naturally (Gao, Zhang, and Franklin 2013). Their benefits in supporting learning are clear (Gao, Zhang, and Franklin 2013), but rather than focusing on asynchronous tools, we were keen to look at using technology to support this dialogue in a synchronous environment.

One way to encourage more active learning and, in particular, a dialogic approach, is through the adoption of audience response systems (ARS). Unlike VLE-based discussion boards, ARS can easily be used synchronously while teaching because they allow instructors to pose a range of questions live and directly to the student audience during lectures and seminars (Compton and Allen 2018). This opens up numerous possibilities for in-class, ongoing staff–student interaction. In the past, these systems typically involved the distribution and use of individual handheld ‘clickers’ which would register audience responses to simple yes/no or multiple-choice questions and instantly display overall results on a central screen.

Research focused on these basic ARSs shows that their use creates a more dynamic session enabling student-focused, discussion-driven pedagogy (Beatty 2004) and can lead to improvements in learning gain and deeper learning (Beekes 2006). It increases problem-solving skills (Hake 1998; Knight and Wood 2005), engagement (Heaslip, Donovan, and Cullen 2014), motivation, particularly within large-group lectures (Gauci et al. 2009), peer-to-peer interaction (Caldwell 2007; El-Rady 2006), enjoyment and attention (Elliot 2003). ARS can also increase inclusivity, particularly for students used to passive learning or for those who are reluctant to participate (Beekes 2006; Graham et al. 2007). Because the ideas and opinions of the whole cohort are visible (Little 2016), students can immediately see that their peers might have misunderstood or be confused just as they are (Knight and Wood 2005), which can enhance the sense of belonging to a learning community. Students can be exposed to immediate formative feedback (Caldwell 2007), which allows instructors to measure student understanding (Hung 2016) and adapt session content (Beatty 2004).

The edtech industry has now moved significantly beyond basic handheld keypads and clickers. In the last 10 years, lecturers have been able to access new-generation web-based ARSs or ‘live voting apps’ at no, or very low, subscription cost. These include multi-player quiz-based apps encouraging gamification within teaching sessions, such as Kahoot (Cameron and Bizo 2019), Quizizz and Socrative (Guarascio, Nemeck, and Zimmerman 2017).

This research focuses on Mentimeter, which has one of the broadest ranges of functions and is increasing in popularity within the sector. Lecturers create presentations using the Mentimeter site (www.mentimeter.com). Audience members visit www.menti.com in any web browser and use a unique pin code to access the presentation. The platform enables students to send responses as the lecturer shows each slide on a central screen. Students do not create their own accounts, universities do not buy hardware, and there are no devices to distribute and collect during classes. It is cloud-based, so there is no need to download software. Colleagues can combine static slides with a small number of activity slides requiring audience participation, or run the entire presentation as an interactive activity. Users need to determine appropriate questions and how they would like answers to be displayed. Lecturers control all timings: when the instructor moves to the next slide, all students’ devices immediately reflect this. Mentimeter shows the number of responders in real time in the corner of the screen, so lecturers know when to move on. The platform enables both qualitative and quantitative responses through a broad range of question types. For example, using their own devices, students can collectively create word clouds, rate statements according to scales (results move dynamically as each response is cast), ask questions anonymously or provide comments. Students can distribute 100 points against a range of options, vote in support of a specific answer, concept, school of thought or person, rate ideas across a 2×2 matrix, complete surveys or join a communal quiz to check knowledge. Mentimeter adopts a standard ‘freemium’ model allowing educators free use of a basic version, with an option to pay a small monthly fee for access to additional functionality, such as the import of PowerPoint presentations into Mentimeter and the export of data to Excel.
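To make the underlying mechanic concrete, the sketch below is a generic, hypothetical illustration of how any web-based ARS of this kind tallies live responses and updates a presenter screen. It is not Mentimeter’s actual implementation, which is not public, and all names and data in it are invented.

```python
from collections import Counter

# Hypothetical sketch of the core mechanic behind a web-based ARS:
# anonymous responses arrive one at a time and a running tally is
# re-rendered live on the presenter screen. Illustrative only.

class LiveQuestion:
    def __init__(self, prompt, options):
        self.prompt = prompt
        self.options = options
        self.tally = Counter()
        self.responders = 0  # the live responder count shown on screen

    def record(self, choice):
        """Register one anonymous response and update the running tally."""
        if choice in self.options:
            self.tally[choice] += 1
            self.responders += 1

    def render(self):
        """Return the percentage per option that the chart would redraw."""
        total = max(self.responders, 1)
        return {opt: round(100 * self.tally[opt] / total)
                for opt in self.options}

q = LiveQuestion("Which tort applies here?",
                 ["Negligence", "Nuisance", "Trespass"])
for vote in ["Negligence", "Negligence", "Trespass"]:
    q.record(vote)
print(q.responders, q.render())
# 3 {'Negligence': 67, 'Nuisance': 0, 'Trespass': 33}
```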

Very little research has been carried out to explore how Mentimeter impacts teaching and how its use can best be optimised. Skoyles and Bloxsidge’s (2017) use of Mentimeter to teach referencing skills to law students enhanced engagement, created a more inclusive experience and enabled formative assessment. A small study conducted by Davarzani (2013) found that Mentimeter increased student interest and encouraged involvement. A short study conducted by Puspa and Imamyartha (2019) suggests that Mentimeter improved the learning experience of English students in West Java. An unpublished report by Hill and Fielden (2017) found that students enjoyed posting anonymous questions and taking part in live quizzes, features that were especially important for less confident students who feared being wrong or looking ‘silly’. Similarly, Vallely and Gibson’s (2018) short review found that it enabled safe, non-judgemental dialogue and more tailored teaching. A short review by Little (2016) also highlights increased student engagement. Outside higher education, a Norfolk vicar recently found that using Mentimeter to ask church service attendees to rate hymns, ask questions during sermons and create word clouds of subjects that church-goers are praying for enabled old and young parishioners to speak to each other and encouraged shy audience members to engage (Bale 2018).

This research aims to contribute to the limited existing literature by focusing on the following research questions:

  1. How does the use of Mentimeter impact students’ teaching and learning experience across disciplinary areas and types of teaching sessions?
  2. How does Mentimeter impact staff experience and what key practical and pedagogical thinking is required to optimise the platform?

Method

Mentimeter was introduced to students attending the teaching sessions shown in Table 1:

Table 1. Teaching sessions trialling Mentimeter.
English Literature ‘Research and Criticism’ (Year 1; small group seminar; 13 students attending; 100% average participation): open-ended questions and multiple-choice question (MCQ) bar charts used in the first seminar meeting to test the group’s knowledge of unfamiliar terms in a way that did not humiliate or target.
English Literature ‘Critical Issues’ (Year 2; large lecture; 50 students attending; 40% average participation): word cloud, open-ended free-text questions and voting used to identify problems with theoretical positions, rank critical approaches to literature and ask questions.
Law ‘Tort’ (Year 1; large lecture; up to 310 students attending; 50%–82% average participation): word cloud and sliding scales used for students to rate their own views to enable the identification of issues in a question scenario; MCQ ‘dots’, pie charts and bar charts used to section 2-hour lectures into manageable parts by asking questions at intervals to encourage concentration.
Law ‘Family Law’ (Year 3; large lectures and small group seminars; 100–125 students in lectures, 15–20 in seminars; 50%–65% average participation in lectures, up to 100% in seminars): MCQs used to check knowledge and understanding and for students to practise applying their knowledge to factual problem scenarios.
Maths ‘Maths Foundation’ (Foundation year; large lecture; 100–160 students attending; 90% average participation): MCQ bar charts used for students to check their understanding of concepts covered in the lecture, midway through the lecture and at the end, with time provided for students to ask questions after each set of Mentimeter results was shown.
Maths ‘Applications of Physics for Medicine’ (Year 2; small group lecture; 30 students attending; 100% average participation): MCQ bar charts used to provide feedback to students on their understanding of the main learning objectives of the lecture and to ask questions during the session.
Life Tools, a voluntary life skills psycho-educational training programme (all years; small group lecture; 20–100 students attending; 80% average participation): MCQ bar charts, word clouds and sliding scales used for students to rate levels of productivity, stress management and concentration at the start and end of the session to indicate the session’s level of impact.

Note that, for the purpose of this article, ‘large group lectures’ refer to 100 or more students in a theatre-style layout. ‘Small group lectures’ refer to fewer than 100 students in a theatre-style layout. ‘Small group seminars’ refer to up to 30 students engaging in open discussion in a cabaret or boardroom-style layout.

Examples of the use of Mentimeter in these sessions are shown in Figure 1.

Fig 1
Figure 1. From top: Examples of open-ended question and ‘vote for winner’ formats in small group literature lectures, multiple choice questions (MCQ) donut chart in large law lecture, word cloud in small law seminar, MCQ ‘dots’ in large law lecture, MCQ bar charts in medical physics lecture, and sliding scales in life skills psycho-educational training.

To understand the impact of Mentimeter, an anonymous questionnaire was distributed to students who had experienced Mentimeter in at least one teaching session. It asked students to respond to a range of statements using Likert scales, multiple-choice and open-ended questions. Survey questions were based on key themes within the existing literature.

A hard copy of the survey was distributed to and completed by students at the end of teaching sessions. In larger classes, where hard copy distribution and collection would have been difficult in the time available, students were instead given a link to an identical anonymous survey created using the Online Surveys platform.

Voluntary, informed consent was secured from all students. All data remained anonymous and confidential throughout. This research was reviewed by a University Research Ethics Committee and full approval was granted.

Qualitative data were drawn together and thematically analysed using Braun and Clarke’s (2006) six-phase thematic analysis: familiarisation, coding, generating themes, reviewing themes, defining and naming themes, and writing up into a narrative. Open-ended responses were coded manually against a range of themes identified using both deductive and inductive approaches, the latter capturing additional areas not initially identified.
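As an illustration of how such coding can be made systematic, the minimal sketch below shows a first deductive pass over open-ended responses using a keyword codebook, with unmatched responses set aside for inductive review. The themes, keywords and responses are hypothetical examples, not the study’s actual coding frame.

```python
# Illustrative sketch only: deductive keyword coding of open-ended
# survey responses, with uncoded responses flagged for inductive review.
# Themes, keywords and responses are invented placeholders.

codebook = {
    "enjoyment": ["fun", "enjoy", "interesting"],
    "voice": ["anonymous", "confident", "shy", "voice"],
    "learning": ["understand", "remember", "knowledge", "learn"],
}

def code_response(text):
    """Return every theme whose keywords appear in the response."""
    lowered = text.lower()
    return [theme for theme, keywords in codebook.items()
            if any(word in lowered for word in keywords)]

responses = [
    "It was fun and I felt more confident answering anonymously.",
    "Helped me check my knowledge before the exam.",
    "The pin code took a while to type in.",  # no match: inductive review
]

uncoded = []
for r in responses:
    themes = code_response(r)
    if themes:
        print(themes, "<-", r)
    else:
        uncoded.append(r)  # reviewed manually to surface new themes
```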

A focus group was held to explore staff experiences of using the platform with a particular emphasis on practical and pedagogical thinking to optimise use. Five colleagues from the School of Law and the Department of English Literature were invited on the basis of their prior knowledge of Mentimeter, use of other interactive platforms (such as Kahoot) and interest in innovative teaching. Responses were audio-recorded, transcribed and analysed thematically.

Results

In total 204 students completed the survey: Foundation students (48), year 1 undergraduate students (89), year 2 undergraduate students (34), year 3/4 undergraduate students (30), and postgraduate students (3) across 10 different disciplinary areas of law, English literature and language, biological sciences, maths, philosophy, psychology, economics, languages, business and pharmacy. Of the total responders, approximately 61% had experienced the use of Mentimeter within a large group lecture, 23% had experienced Mentimeter within a small-group lecture, 4% within a small group seminar and 12% in multiple settings.

The impact of Mentimeter on the student experience

Student satisfaction

Students across all disciplinary areas expressed strong levels of satisfaction, as shown in Figure 2. There were no statistically significant differences between students in English, law, biology and other disciplines; however, foundation maths students reported significantly lower satisfaction (p < 0.001) when data were compared using both Mann–Whitney U tests and t-tests.
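For readers wishing to reproduce this kind of comparison, the sketch below shows one way to run both tests in Python, assuming satisfaction responses coded on a 1–5 Likert scale. The score arrays are invented placeholders, not the study’s data.

```python
# Minimal sketch of the reported two-test comparison on Likert data.
# The arrays below are hypothetical examples, not the study's data.
from scipy.stats import mannwhitneyu, ttest_ind

law_scores = [5, 4, 5, 4, 4, 5, 3, 5]              # placeholder responses
foundation_maths_scores = [3, 2, 3, 4, 2, 3, 3, 2]  # placeholder responses

# Mann-Whitney U makes no normality assumption, which suits ordinal
# Likert responses; Welch's t-test serves as a parametric cross-check.
u_stat, u_p = mannwhitneyu(law_scores, foundation_maths_scores,
                           alternative="two-sided")
t_stat, t_p = ttest_ind(law_scores, foundation_maths_scores,
                        equal_var=False)

print(f"Mann-Whitney U p = {u_p:.4f}, Welch t-test p = {t_p:.4f}")
```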

Fig 2
Figure 2. Student satisfaction with the use of Mentimeter in teaching sessions (percentages rounded to the nearest number).

Eight students (4% of responders) reported that they did not like Mentimeter (seven foundation maths students and one English literature student). One said: ‘Don’t feel confident revealing answer’; two others said it ‘wastes time’ and was ‘slow to set up’; one would rather do more ‘lecture questions’; another felt everyone just copies answers; and two said that they did not like using more technology. However, all eight went on to identify benefits; three said, for example, that Mentimeter made learning more enjoyable. None of the feedback from these students appeared to relate to their discipline or the type of teaching session they attended. In terms of satisfaction, 1% (three students) felt less satisfied. Two of these were from the above group who also said that they disliked Mentimeter. The other responder provided no further explanation.

In contrast, 191 students (96%) liked Mentimeter and 171 (82%) felt ‘more’ or ‘much more’ satisfied when Mentimeter was used in teaching sessions. In an additional question, 94% felt that Mentimeter should be used more. Comments include:

I literally love using it.

Mentimeter should be used by everyone!

Three key themes are evident within qualitative and quantitative data which start to explain such high levels of satisfaction: firstly, the role of Mentimeter in enhancing enjoyment; secondly, the role of Mentimeter in enhancing the student voice; and thirdly, the role of Mentimeter in improving student understanding, learning and retention.

Mentimeter increases student enjoyment. Of those who responded, 95% said that their learning experiences were more enjoyable and 62% said that their lectures or seminars felt ‘less formal and fun’:

[Mentimeter was] a way to relax and have fun.

Makes the lectures more fun and interesting as it’s not just someone talking at you.

I found it to be fun and energising – actually, interacting with lecturers rather than just sitting and listening makes it easier to pay attention.

Of the 169 responders who explained why they liked Mentimeter, nearly half (75 students) used the words ‘interactive’ and/or ‘engaging’ unprompted in free-text comments. This was particularly the case for law students, who were often attending back-to-back lectures. Students commented as follows:

[Mentimeter] keeps you engaged when you drift away.

It was a fun interactive way to discuss your opinions on a case and was a good break when you have continuous 6-hr lectures.

Enhancing attention was also reported across other disciplinary areas. When asked how Mentimeter impacts the levels of attention in teaching sessions compared to sessions that do not use Mentimeter, 74% of all responders said that they had experienced either higher or significantly higher levels, mirroring Elliot’s (2003) findings exploring the impact on attention of basic handheld response systems.

Mentimeter enhances the student voice. Students were similarly positive when asked about Mentimeter’s impact on the student voice. A key theme was that Mentimeter allows all students to engage and, because this engagement is easy and completely anonymous, students are less restricted by a lack of confidence or other constraints. In all, 72% said that Mentimeter helped them to feel more confident participating in seminars and lectures (28% said that Mentimeter had no impact). When asked to identify whether Mentimeter changed their learning experience, 56% chose to highlight that their lecture or seminar felt more inclusive for all types of learner. A total of 35% chose to highlight that they felt their voice was being heard. The emphasis on student voice-related responses is shown in Figure 3.

Fig 3
Figure 3. Student responses when asked whether Mentimeter had changed their lecture or seminar experience (Figures show the number of students who ticked each comment).

Comments relating to the student voice included the following:

Good way to engage…without fear of being wrong.

Sometimes interacting in class is a bit nerve wracking; this was a way of doing that without actually having to use my voice.

It provided a useful tool to include thoughts & ideas of students who would otherwise be hesitant to answer or contribute in lectures.

Allows you to interact in lectures without having to face the pressures of ‘speaking in front of everyone’. Also, the fact that it is anonymous is a very good feature as it means you can feel free to share your opinion without the fear of ever being ‘wrong’ or ridiculed.

It helps with people like me that struggle with anxiety and it is pretty fun to be honest.

Benefits extend beyond increased student–lecturer dialogue. Students also point to the use of Mentimeter in facilitating peer-to-peer interaction; in some sessions, leaders used voting results, for example, as a starting point for further student-led small group discussion. This may help to explain why 72% of responders felt that Mentimeter encouraged them to feel part of a learning community.

Increased voice can impact understanding; for example, within a small group literature lecture, Mentimeter was used to invite students to identify problems with complex theoretical positions and rank critical approaches. The lecturer felt that the follow-on seminar was much more sophisticated than in previous years and this was linked to the way in which students had been more involved in the broad-based introductory lecture.

Greater willingness to participate appears linked to anonymity, a feature that normal class discussions cannot deliver (Heaslip, Donovan, and Cullen 2014). In all, 76% of students said that they liked this feature because it encouraged them to participate. However, 26% said that it made no difference, although one made the point that ‘it may also benefit other members of the seminar/lecture as an individual may think of an answer that the rest have not’. As such, not only is Mentimeter one important way in which students can project their voices when they might otherwise have been silent but students also recognise the value in hearing the views of others, mirroring the findings of earlier research (Knight and Wood 2005; Little 2016). In addition, when asked to identify any ways in which Mentimeter had impacted their learning experience, 51% of students highlighted that they felt reassured by seeing how fellow students answered questions and what kind of questions they were asked because they then felt that they were not the only one thinking the same thing and that they were ‘not the only one struggling’.

Although anonymity might help to enable an inclusive environment, it is important to note that a small number expressed concerns about inappropriate or pointless comments. One responder said, ‘In an EU law lecture some students kept spamming nonsense…and I found that disturbing’. This requires the lecturer to develop strategies to avoid misuse such as avoiding free text question types and issuing regular reminders about professional behaviours.

More broadly, because Mentimeter enables the student voice to be heard so easily, some responders have reflected on how it starts to alter the dynamics between lecturer and student, consistent with ideas around dialogic teaching approaches: ‘It creates an atmosphere for interaction between the teachers and students and thus aids learning and encourages debate’. Part of this change in dynamics also comes from lecturers adopting a more agile approach to teaching, using the additional student responses to expand or facilitate further debate.

Mentimeter can help to improve student learning. Students were asked how they felt Mentimeter impacted the amount that they had learnt. A total of 68% said that Mentimeter either increased or significantly increased learning. Almost all other responders said that the level of learning was the same. Four key themes were identified: knowledge, application, flexibility and retention.

Students repeatedly commented that Mentimeter enables knowledge and understanding to be checked. For example, students in large maths lectures found that the way the lecturer used Mentimeter allowed them to ‘assess what we have covered in lectures’, ‘check knowledge’, and ‘confirm what you don’t remember and show you what to work on’. When it became clear in a law lecture that students had universally misunderstood a particular concept, the lecturer was able to adjust their lecture plan and go back to that idea and explain it again in more depth. In this way, Mentimeter ‘helps the lecturer understand the class’ and ‘to know what had been explained was understood’.

A strong and related theme in large law lectures was the use of Mentimeter to ask students to apply theoretical knowledge to real-world situations. This is a skill which students practise in small group teaching and on which they are examined, but it is much harder to facilitate in a traditional lecture setting without Mentimeter. The lecturer used fictional mini scenarios at intervals throughout the lecture, asking students to identify which torts had been committed in the examples given, in order to check that the law had been understood. Students found that they were able to ‘show we understand what we’re listening to’, appreciated the opportunity to ‘put the knowledge you are acquiring into practice’ and reported that it ‘allowed us to think critically rather than just absorb information’.

When asked to identify any benefits of Mentimeter, 31% of responders said that they felt learning was undertaken in partnership with the lecturer and 36% said that the lecture was more personal because it allowed the lecturer to be more flexible in what they taught next (see Figure 3). For example, during life skills psycho-educational training, students are asked to outline questions and concerns. The instructor then goes on to address these in the session, knowing that they are responding to a specific need without any student feeling put under pressure by having to ask a question. As in all similar examples of lecturer response to Mentimeter-delivered synchronous feedback, a significant level of lecturer agility is required to respond to the group’s learning needs. This issue is explored further below.

Other students, across disciplinary areas, commented that the use of Mentimeter improved content retention: ‘It can help the information to stay in our minds’ and ‘makes it easier to remember’. Although most students felt that their learning increased, it is difficult to draw any conclusions from actual performance data. Within the modules used in this study, causal variables can vary from year to year, including assessment type, load and timings. In addition, lecturers often teach as part of a module team, so the use of Mentimeter is not consistent from week to week, and assessment usually draws on learning across sessions. This research is only able to draw on students’ perceptions of their learning. Understanding impact on formative or summative attainment requires further research.

Direct causal impact on attainment might be difficult to establish, but there does appear to be an impact of usage on attendance. The University does not currently employ an automatic attendance monitoring system and paper-based attendance recording is not practical in large lectures. Instead, the survey explored student perceptions. Although 55% of responders said that the use of Mentimeter would not impact their decision to attend, a large minority (45%) said that they were either much more likely to attend or more likely to attend when Mentimeter is used. This is a significant finding, given the existing body of research which has found a direct and causal relationship between attendance and attainment, suggesting that it is attendance, more than other factors, which appears to be linked to higher grades (Arulampalam, Naylor, and Smith 2007; Crede, Roch, and Kieszczynka 2010).

A positive overall response to functionality and usability. Students did not report any significant challenges surrounding the use of Mentimeter in teaching sessions. The vast majority found the Mentimeter login page easily and, once they had entered the pin code, found the platform easy to use. Only a small percentage of responders (14%) reported being annoyed that they needed to have a device with them, or complained of issues surrounding lack of power or data (13%). The majority of students attend class with an internet-enabled device, such as a laptop, tablet or smartphone. One of the advantages of having the results appear live on the projector screen within the class, however, is that even students who are not actively participating can follow. Students who do not have a device with which to participate can discuss with the person next to them.

In addition, when asked to rate the five different question types which Mentimeter enables, there were no significant variations other than a slight reduction in the value attributed to the word cloud feature (see Figure 4). This could be a result of some past misuse of the anonymity feature, which has led to a small number of unhelpful comments being displayed.

Fig 4
Figure 4. Student responses to Mentimeter question formats.

Benefits and challenges surrounding the staff experience

Five key themes were identified following content analysis of the staff focus group discussion.

Like students, staff identified the potential of adopting a more agile approach to teaching and, where time allows, to session content. Lecturers surrender some control over the vocal ownership of the lecture, build in space to respond to issues raised by students, and must be prepared to change the focus of the class depending on student responses. This might mean reiteration and further explanation of a concept, argument or text, discussion of a new, unpredicted area, and acceptance that the learning domain is shared and collaborative. Students become involved in a two-way dialogue, rather than being positioned (and positioning themselves) as passive observers of teaching that is being ‘done’ to them. This has led to a greater sense of partnership for staff.

The second related theme in terms of optimising use surrounds class management. One participant commented, ‘A lot of the skill on this is how you respond to what comes on the screen’. This involves effectively managing the resulting online and offline discussion and remaining cognisant of, for example, learning goals, group dynamics and time limitations. Participants felt that it was important not to belittle incorrect answers and to encourage minority views, especially if the majority of participants are wrong. Another participant highlighted that noise levels increase when Mentimeter is used due to excitement. Lecturers need to manage the class to ‘bring them back down again’ and focus on the next element. For staff then, the use of Mentimeter does increase challenges surrounding time, content and class management.

The response may depend on the experience, pedagogic principles and temperament of the lecturer; some may not want to surrender control of the lecture and its content. The opportunity to engage in a learning dialogue with students in a session whose direction cannot be predicted will not, therefore, be to the taste of all lecturers, some of whom may prefer that only one voice is heard.

This touches on the third theme, also evidenced in student views, surrounding the ‘inclusive potential’ of Mentimeter, ‘giving a voice’ to students who are less likely to participate due to the influence of culture, gender, disability and other factors. One participant recalled a student with a speech impediment, for example, noting how Mentimeter enabled their full participation in discussion. Another said, ‘It effectively says your opinion matters’ to all students. The ability to enhance inclusiveness was seen as critical in terms of opening up and building discussion. In addition, Mentimeter provides students with the option to participate or not, as opposed to being asked by the tutor. This links to Ryan and Deci’s (2000) theory of self-determination, which found that having a choice increases intrinsic motivation. Seeing responses and how errors are addressed means that students can learn to view mistakes as learning opportunities. It can also encourage them to keep trying, promoting the development of a ‘growth mindset’ and boosting confidence in their capacity to learn (Dweck 2006).

The fourth theme surrounds timeliness. As students identified, Mentimeter creates a ‘real-time’ assessment of understanding: ‘It can give an indication as to whether the students have any clue about what is going on in the lecture’. This allows lecturers to act at a time when blind spots can be remedied. It also allows lecturers to be positively surprised. One lecturer said that she was impressed to see and hear the sophisticated responses from students who sometimes appeared not to be listening. This was a ‘revelation of what they’re really thinking’. More broadly, lecturers can judge the temperature of the room at the very beginning of a session by finding out ‘where the students are at’ and adjusting the starting point and pace of the session accordingly.

The fifth theme surrounds disciplinary variance. Initially, lecturers thought that Mentimeter would be less easy to embed in humanities-based disciplines than in sciences and social sciences because of the discursive basis of humanities subjects and their resistance to binary ‘answers’. Despite these misgivings, lecturers found that they could use Mentimeter discursively, and not just for questions requiring a ‘correct’ response or for testing technical vocabulary or historical knowledge. In the Philosophy department, for example, as a starting point for discussion, students were asked to situate themselves on a sliding scale according to where they sit in an argument.

In terms of the staff user experience, the focus group noted some limitations in the free version of the software, which restricts lecturers to two questions and five quizzes per presentation, as also highlighted in the existing literature (Compton and Allen 2018). There are also some restrictions on the number of characters available for each question. Two concerns were raised around time: firstly, Mentimeter might impact the lecturer’s ability to cover sufficient content, particularly because students need time to log on and then think about what to say in response; secondly, the software could impact staff time because of the need to compose effective questions in advance.

ARSs do not, in themselves, guarantee an enhanced learning experience without some practical pedagogic thinking, especially around question-setting. To gain maximum cognitive benefits, research suggests that questions should link to clear learning goals and encourage peer-to-peer interaction (Beatty 2004), link ideas or arguments together and apply them to new material (Brewer 2004), propose a number of plausible multiple-choice answers surrounding common misinterpretations (Crouch and Mazur 2001), be designed to create space for discussion of student responses, and encourage an involving and lively environment (Caldwell 2007). Designing optimum questions requires lecturers to set aside time to create pedagogically sound questions which encourage deeper learning. This demands increased preparation time and increased reflection on the teaching and learning function of the session itself. Although this is undoubtedly of benefit to the likely efficacy of the teaching session, and thus to student experience and student learning, increasing academic workloads across the higher education sector (Gregory and Lodge 2015) suggest that this additional pull on staff time should be factored into the decision to adopt Mentimeter on a regular basis in small or large group teaching.

Conclusion

Previous research has identified the positive impact of standard handheld cards and clickers on the student experience. This research mirrors previous findings, but whilst classic ARSs required additional equipment such as clickers or cards, which added to the logistical burden for the teacher, Mentimeter uses the technology already in front of students in the form of laptops, tablets or smartphones. Students can access the system quickly and easily without requiring a login. From a staff perspective, no additional technology is required other than internet access to the web page, although prior planning is recommended. Mentimeter offers a straightforward, accessible method of inviting audience responses with significantly increased breadth of functionality. Satisfaction across all disciplinary areas and within both small and large group teaching was high, and students felt that teaching sessions were more enjoyable. Mentimeter enables increased interaction without judgement and, in turn, enables all student voices to be heard within a more inclusive learning environment. Some responders specifically identified a shift away from passive teaching sessions, an increased emphasis on staff–student and peer-to-peer dialogue in line with dialogic teaching approaches, and a more responsive approach to session content. Students self-reported increased attention, improved attendance and greater learning. Staff also identified the benefits of adopting a more dynamic approach, fed by timely class feedback provided in an environment which encouraged greater inclusivity. These benefits can be augmented by careful reflection on the role of the lecturer in teaching environments, responding to and managing class interactions effectively, and setting time aside for practical and pedagogic thinking designed to optimise use. In this kind of environment, Mentimeter has clear potential to increase student satisfaction, engagement, voice and learning within higher education, as well as the potential to produce a more dynamic and stimulating teaching role for the lecturer.

References

Alexander, R. (2017) Towards Dialogic Teaching: Rethinking Classroom Talk, Dialogos, Thirsk.

Arulampalam, W., Naylor, R. & Smith, J. (2007) ‘Am I missing something? The effects of absence from class on student performance’, Warwick Economic Research Papers [online] Available at: https://warwick.ac.uk/fac/soc/economics/research/workingpapers/2008/twerp_820.pdf

Bale, D. (2018) ‘This Norfolk church is using a phone app to rate hymns’, North Norfolk News, 4 Dec. [online] Available at: https://www.northnorfolknews.co.uk/news/aylsham-church-trialling-use-of-a-live-voting-smartphone-app-1-5805737

Beatty, I. (2004) ‘Transforming student learning with classroom communication systems’, Educause Research Bulletin, 3 Feb., pp. 1–13 [online] Available at: http://www.educause.edu/ir/library/pdf/ERB0403.pdf

Beekes, W. (2006) ‘The “millionaire” method for encouraging participation’, Active Learning in Higher Education, vol. 7, no. 1, pp. 25–36. doi: 10.1177/1469787406061143

Braun, V. & Clarke, V. (2006) ‘Using thematic analysis in psychology’, Qualitative Research in Psychology, vol. 3, no. 2, pp. 77–101. doi: 10.1191/1478088706qp063oa

Brewer, C. (2004) ‘Near real-time assessment of student learning and understanding in biology courses’, BioScience, vol. 54, no. 11, pp. 1034–1039 [online] Available at: https://doi.org/10.1641/0006-3568(2004)054[1034:NRAOSL]2.0.CO;2

Caldwell, J. (2007) ‘Clickers in the large classroom: current research and best-practice tips’, Life Sciences Education, vol. 6, no. 1, pp. 9–20 [online] Available at: http://www.lifescied.org/cgi/reprint/6/1/9.pdf

Cameron, K. & Bizo, L. (2019) ‘Use of the game-based learning platform KAHOOT! to facilitate learner engagement in animal science students’, Research in Learning Technology, vol. 27 [online] Available at: https://doi.org/10.25304/rlt.v27.2225

Compton, M. & Allen, J. (2018) ‘Student response systems: a rationale for their use and a comparison of some cloud-based tools’, Compass: Journal of Teaching and Learning, vol. 11, no. 1 [online] Available at: https://pdfs.semanticscholar.org/5f77/f5e529996962df3c15bf6ff1e4d97f48fb88.pdf?_ga=2.69461712.1301248929.1592667859-668047203.1592667859

Crede, M., Roch, S. & Kieszczynka, U. (2010) ‘Class attendance in college: a meta-analytic review of the relationship of class attendance with grades and student characteristics’, Review of Educational Research, vol. 80, no. 2, pp. 272–295 [online] Available at: https://journals.sagepub.com/doi/full/10.3102/0034654310362998

Crouch, C. & Mazur, E. (2001) ‘Peer instruction: ten years of experience and results’, American Journal of Physics, vol. 69, no. 9, pp. 970–977 [online] Available at: http://web.mit.edu/jbelcher/www/TEALref/Crouch_Mazur.pdf

Davarzani, H. (2013) ‘Improving students’ interactions during lectures by using Mentimeter’, Supporting Learning through Digital Resources [online] Available at: https://journals.lub.lu.se/KG/article/view/8710

Deslauriers, L. et al. (2019) ‘Measuring actual learning versus feeling of learning in response to being actively engaged in the classroom’, Proceedings of the National Academy of Sciences of the United States of America, vol. 116, 19251–19257 [online] Available at: https://www.pnas.org/content/116/39/19251

Dweck, C. (2006) Mindset: The New Psychology of Success, Random House, New York, NY.

Elliot, C. (2003) ‘Using a personal response system in economics teaching’, International Review of Economics Education, vol. 1, no. 1, pp. 80–86 [online] Available at: http://www.economicsnetwork.ac.uk/iree/i1/elliott.htm

El-Rady, J. (2006) ‘To click or not to click: that’s the question’, Innovate: Journal of Online Education, vol. 2, no. 4 [online] Available at: https://nsuworks.nova.edu/cgi/viewcontent.cgi?referer=https://www.google.co.uk/&httpsredir=1&article=1139&context=innovate

Gao, F., Zhang, T. & Franklin, T. (2013) ‘Designing asynchronous online discussion environments: recent progress and possible future directions’, British Journal of Educational Technology, vol. 44, no. 3, pp. 469–483 [online] Available at: https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1467-8535.2012.01330.x

Gauci, S. et al. (2009) ‘Promoting student-centered active learning in lectures with a personal response system’, Advances in Physiology Education, vol. 33, no. 1, pp. 60–71 [online] Available at: https://pdfs.semanticscholar.org/b906/e5fc3ddc5e4a08a29a5237f02e7f31f6dcb2.pdf

Graham, C. et al. (2007) ‘Empowering or compelling reluctant participants using audience response systems’, Active Learning in Higher Education, vol. 8, no. 3, pp. 233–258. doi: 10.1177/1469787407081885

Gregory, S. & Lodge, J. (2015) ‘Academic workload: the silent barrier to the implementation of technology-enhanced learning strategies in higher education’, Distance Education, vol. 36, no. 2, pp. 210–230 [online] Available at: https://www.tandfonline.com/doi/abs/10.1080/01587919.2015.1055056

Guarascio, A., Nemeck, B. & Zimmerman, D. (2017) ‘Evaluation of students’ perceptions of the Socrative application verses a traditional student response system and its impact on classroom engagement’, Currents in Pharmacy Teaching and Learning, vol. 9, no. 5, pp. 808–812. doi: 10.1016/j.cptl.2017.05.011

Hake, R. (1998) ‘Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses’, American Journal of Physics, vol. 66, no. 1, pp. 64–74. doi: 10.1119/1.18809

Heaslip, G., Donovan, P. & Cullen, J.G. (2014) ‘Student response systems and learner engagement in large classes’, Active Learning in Higher Education, vol. 15, no. 1, pp. 11–24. doi: 10.1177/1469787413514648

Hill, D. & Fielden, K. (2017) ‘Using Mentimeter to promote student engagement and inclusion’, Pedagogy in Practice Seminar, 18 Dec., Fusehill Street, Carlisle, UK (unpublished report) [online] Available at: http://insight.cumbria.ac.uk/id/eprint/3473/

Hung, H. (2016) ‘Clickers in the flipped classroom: bring your own device (BYOD) to promote student learning’, Interactive Learning Environments, vol. 25, no. 8, pp. 983–995. doi: 10.1080/10494820.2016.1240090

Knight, J. & Wood, W. (2005) ‘Teaching more by lecturing less’, Cell Biology Education, vol. 4, no. 4, pp. 298–310 [online] Available at: http://doi.org/10.1187/05-06-0082

Little, C. (2016) ‘Technological review: Mentimeter smartphone student response systems’, Compass: Journal of Learning and Teaching, vol. 9, no. 13 [online] Available at: https://journals.gre.ac.uk/index.php/compass/article/view/328/pdf

McDaniels, M., Pfund, C. & Barnicle, K. (2016) ‘Creating dynamic learning communities in synchronous online courses: one approach from the Center for the Integration of Research, Teaching & Learning (CIRTL)’, Online Learning, vol. 20, no. 1, pp. 110–129 [online] Available at: https://files.eric.ed.gov/fulltext/EJ1096380.pdf

Mentimeter. ‘Interactive presentations, workshops and meetings’ [online] Available at: https://www.mentimeter.com/

Michael, J. (2006) ‘Where’s the evidence that active learning works?’, Advances in Physiology Education, vol. 30, no. 4, pp. 159–167 [online] Available at: https://journals.physiology.org/doi/pdf/10.1152/advan.00053.2006

Puspa, A. & Imamyartha, D. (2019) ‘Experiences of social science students through online application of Mentimeter in English milieu’, IOP Conference Series: Earth and Environmental Science, vol. 243, no. 1.

Ryan, R. & Deci, E. (2000) ‘Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being’, American Psychologist, vol. 55, no. 1, pp. 68–78.

Skoyles, A. & Bloxsidge, E. (2017) ‘Have you voted? Teaching OSCOLA with Mentimeter’, Legal Information Management, vol. 17, no. 1, pp. 232–238 [online] Available at: https://www.cambridge.org/core/journals/legal-information-management/article/have-you-voted-teaching-oscola-with-mentimeter/96552E8A42F8CB853BC2DD16A9759947/core-reader

Vallely, K. & Gibson, P. (2018) ‘Engaging students on their devices with Mentimeter’, Compass Journal of Learning and Teaching, vol. 11, no. 2 [online] Available at: https://journals.gre.ac.uk/index.php/compass/article/view/843/pdf

Yee, S.L.W. & Ean, C.L.C. (2020) ‘Malaysian private university students’ perception of online discussion forums: a qualitative enquiry’, Sains Humanika, vol. 12, no. 2 [online] Available at: https://sainshumanika.utm.my/index.php/sainshumanika/article/view/1610