Perceptions of the effects of clicker technology on student learning and engagement: a study of freshmen Chemistry students

Jenepher Lennox Terriona* and Victoria Acetib

aDepartment of Communication, University of Ottawa, Ottawa, Canada; bHealth Informatics Institute, Algoma University, Sault Ste. Marie, Canada

(Received 17 August 2010; final version received 02 September 2011; published: 15 March 2012)

Abstract

While technology – in the form of laptops and cellphones – may be the cause of much of the distraction in university and college classrooms, some technologies, including the personal or classroom response system (PRS/CRS), or clicker, also present pedagogical opportunities to enhance student engagement. The current study explored students’ reactions to clicker implementation in a large, introductory chemistry class. During the final class of the semester, 200 students responded to an attitudinal and informational survey using both Likert-type and non-Likert-type questions to evaluate their perceptions of the implementation of the clickers and their impact on student learning and engagement. The results demonstrated that, when implemented effectively, clickers contribute to greater student engagement and, ultimately, give professors an opportunity to enact best practices in higher education pedagogy. This study points to the importance of effective pedagogy in making clickers worthwhile.

Keywords: Clicker; technology; student engagement; higher education; learning; pedagogy

Citation: Research in Learning Technology 2012, 20: 16150 - 10.3402/rlt.v20i0.16150

RLT 2012. © 2012 J. Lennox Terrion and V. Aceti. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons "Attribution 3.0 Unported (CC BY 3.0)" license (http://creativecommons.org/licenses/by/3.0/) permitting use, reuse, distribution and transmission, and reproduction in any medium, provided the original work is properly cited.


*Corresponding author. Email: jlennoxt@uottawa.ca

 

Most of the students populating classrooms in today's universities and colleges were born in 1982 or later and are the first university cohort to be so constantly wired and connected to friends, to the media and to the Internet through cell phones, MP3 players and laptops. In today's large lecture halls, students have access to wireless internet and are, to a great extent, unmonitored in terms of what is on their screens. With so much opportunity to check their email, text their friends, visit Facebook or surf the web, it is not surprising, therefore, that many of these students are distracted during class (Rice and Bunz 2006). However, while technology may be the cause of much of the distraction in university and college classrooms, it also presents pedagogical opportunities to enhance the learning experience and, more specifically, student engagement.

As defined by Kuh (2003), student engagement refers to “the time and energy students devote to educationally sound activities inside and outside of the classroom” (p. 25). Several major reviews of the literature conclude that student engagement is among the better predictors of learning and personal development (Carini, Kuh, and Klein 2006; Umbach and Wawrzynski 2005), academic success (Appleton et al. 2006; Dunleavy and Milton 2008; Finn 1989; Fredericks, Blumenfeld, and Paris 2004; Marks 2000; Pirot and De Ketele 2000) and retention (Kuh 2003). Students become engaged when their academic experience is characterised by meaningful educational activities, including active learning, involvement in enriching educational experiences, seeking guidance from staff or working collaboratively with other students (Carini, Kuh, and Klein 2006). However, as Umbach and Wawrzynski (2005) suggest, little new knowledge has been generated about indicators of educational practice that predict student engagement. Furthermore, understanding how students perceive these educational practices lends important insight into what actually contributes to this outcome.

Most universities and colleges have placed great emphasis on student engagement in the past decade, in part thanks to the National Survey of Student Engagement (NSSE), an instrument used by more than 1300 colleges and universities in the U.S. and Canada since 2000. This instrument allows comparison of universities against each other as well as against a benchmark score. According to NSSE, the “results provide an estimate of how undergraduates spend their time and what they gain from attending college” and the instrument provides a measure of “empirically confirmed ‘good practices’ in undergraduate education” (http://nsse.iub.edu/html/about.cfm). In terms of identifying these good practices, Chickering and Gamson (1987) are often looked to because of their model of the principles of effective undergraduate education. As Umbach and Wawrzynski (2005) argue, most of these principles reflect pedagogical behaviours undertaken by instructors and thus this model focuses on the context created by faculty members on campus and the relationship of this learning context to student engagement. Chickering and Gamson suggest that good practice builds student engagement because it encourages contact between students and faculty, gives prompt feedback, develops reciprocity and cooperation among students, encourages active learning, emphasises time on task, communicates high expectations and respects diverse talents and ways of learning.

Student engagement is important to learning because unengaged students do not listen to, process or attend to what is being taught. While the influence of the instructor and his or her pedagogical choices and practices is central to the student experience (Lane and Shelton 2001), technology may offer a means to enhance student engagement. One technology that presents opportunities for student engagement is the personal or classroom response system (PRS/CRS), commonly referred to as the clicker. In classrooms equipped with a receiver and the appropriate software, students each have a handheld remote control, purchased along with their textbook for about $20 (CAD). Students “click” in their responses to multiple choice questions posed by the professor and projected on the screen from within a slide presentation. When all responses are received, the results are projected onto the screen for the entire class to see (either anonymously or with respondents identified). Individual student and collective class data can be saved for each session, allowing responses to be recorded, analysed and graphed.

As Bruff (2009) points out in his book on teaching with classroom response systems, the consensus among studies of student perceptions of clickers is that attention, attendance and interest (and even learning, depending on the pedagogical approach taken) can be enhanced by using this technology. Specifically, in their review of three decades of literature on clicker use, Judson and Sawada (2002) concluded that, while the impact of clickers on learning outcomes was disappointing, use of this technology did seem to have a positive impact on student engagement and attitudes. Caldwell (2007) and Fies and Marshall (2006), in their literature reviews of the impact of clickers in higher education settings, report that the technology has been found to have a generally positive impact on the classroom. Numerous studies have concluded that, when implemented using appropriate and effective teaching methods, clickers can foster student engagement and overall student success in the course in which they are used (Hoekstra 2008; Kaleta and Joosten 2007; Rice and Bunz 2006; Twetten et al. 2007). One study, conducted at the University of Wisconsin, involved 3500 students in 28 courses and 19 disciplines and found that 93.5% of faculty strongly agreed/agreed that students were more engaged and 72% strongly agreed/agreed that clickers benefited learning (Ellozy 2007). Similarly, Rice and Bunz (2006) found that clickers have led to “greater student engagement in the classroom, high satisfaction, and gains in student learning across the disciplines” (p. 3).

In regard to creating the conditions for engagement and addressing the principles for effective undergraduate education, clicker technology presents interesting potential. To explore this potential, the current study measured student perceptions of clicker use and the pedagogical choices of the instructor in a large, introductory chemistry class.

Research Questions:

RQ1: Do students perceive that they are more engaged with course material and in-class discussions as a result of clicker use?

RQ2: Do students believe that the clickers help them to more effectively learn the course material?

RQ3: Do students who perceive a positive impact on their learning as a result of the clickers also perceive that they are more engaged in their learning experience?

Methodology

In an effort to address student inattention and the distractions created by electronics in a large (200 students), first-year chemistry class, the instructor used eInstruction's Classroom Performance System (http://www.einstruction.com/) to build a number of clicker innovations into the lecture. The course was lecture based and used Microsoft PowerPoint slides with infusions of multimedia and online tools. Each lecture incorporated many different clicker activities, including exercises, quizzes and team-based activities. For example, the instructor used quizzes at the beginning of the lecture to assess understanding of readings and at the end of challenging sections in lectures to check understanding. As well, the professor used “peer instruction” (Mazur 1997) to help make lectures more interactive and to get students intellectually engaged by having them think critically about the material and discuss their ideas and insights with their neighbours. Specifically, after presenting a complex concept or process, the instructor would present a slide with a question, to which students would respond on their own using their clicker. The correct answer would not be displayed and, ideally, there would be a range of answers (demonstrating that the topic is challenging, that there is not universal understanding and, thus, that peer instruction is necessary). Then, students would discuss with those sitting next to them and answer again using their own clickers. At this point, the instructor would reveal the correct answer.

Participants

As the chemistry class chosen for the study was an introductory course, the majority of the students were in their first year. Specifically, 177 students were in their first year of university (freshman), while 16 were in their second year (sophomore), 4 in their third year (junior) and 3 in their fourth year or higher (senior). Participants’ ages ranged from 17 to 34 years, with a mode of 18 years and a mean of 18.6 years. In terms of gender, 67 (33.5%) of the students were female, and 131 (65.5%) were male, with 2 participants leaving the gender question blank. Questions of ethnicity, religion or socioeconomic status were not asked. Given that 88.5% of students were in the first semester of their first year of studies, it is likely that the vast majority of participants had had no prior experience with the clickers and thus the threat to validity of prior knowledge and experience, or history, was removed. At the time of the study, only six professors (out of about 700 in total) at the University of Ottawa were making use of the clicker technology, so this was a novel pedagogical approach for most students at this institution.

Procedures

After receiving ethical clearance to conduct this study, a graduate research assistant not associated with the course distributed a brief attitudinal and informational student survey to the 250 students registered in CHM1301 Principles of Chemistry (Fall 2009) during the last class of the semester. The professor was not involved in the distribution of the questionnaires and students were verbally informed that they were free to participate or not. The completed surveys were collected by the research assistant and placed in a sealed envelope. The survey used both Likert-type and non-Likert-type questions to evaluate the students’ perception of the clickers and their impact on student engagement. A total of 200 surveys (80%) were completed.

Measures

The survey consisted of an introductory section of three questions in which students indicated demographic information and an attitudinal section consisting of 15 items to which students responded using a five-point Likert-type scale (1 = strongly disagree; 2 = disagree; 3 = neutral; 4 = agree; and 5 = strongly agree). Questions dealt with the perceived effect of the clickers on a range of variables, including learning and understanding, perceptions related to the use of the clickers during class time and the perceived effect of the clickers on attention, engagement, participation and enjoyment. These were later grouped to create three distinct variables:

Implementation

Six items were grouped to assess technical aspects of clicker use in the classroom, such as the ease of operation of the clickers and the integration or fit of the technology within the course design. These questions centred on the students’ perspective on both ease of use and the effectiveness of the professor's integration of the technology into the course. Specifically, participants were asked to respond to the following: (1) the professor made good use of the clickers; (2) clickers were well integrated into the course; (3) the clickers were frustrating to use; (4) the clickers were easy to use; (5) the technology used in this course worked well; and (6) I would like to have the chance to use the clicker again in another class. Item 3 (related to frustration with the technology) was reverse-coded so that, as with the other items, higher scores reflected a more positive perception. All six items (α=0.754) were combined to measure students’ perceptions of ease of use and integration of clickers in the classroom (mean = 3.59, SD = 0.74).

Student engagement

Seven items were combined to measure student engagement (α=0.908). Using a five-point Likert-type scale, items focused on students’ perceptions of the impact of the clicker on various aspects of student engagement. These items included questions about the relationship between the clicker and increased interest in course material, participation during lecture and interaction with other students and with the professor: (1) clickers contributed significantly to my interest in the course material; (2) as a result of the clicker, I felt more involved in the lecture; (3) as a result of using clickers in this course, I interacted more with other students; (4) the use of clickers in this course was an appropriate way to achieve course objectives; (5) as a result of using the clicker, I felt more engaged and involved; (6) as a result of using the clicker, I felt more inclined to participate in class discussions; and (7) clickers enabled the professor to respond to concepts that I might not have understood. None of the seven items was recoded (mean = 3.12, SD = 0.85).

Effect on learning

Two items were used to measure whether students perceived that clickers contributed positively to or detracted from their learning during lecture (α=0.779). Using a five-point Likert-type scale, students were asked to agree or disagree with the following statements: “clickers contributed significantly to my learning” and “clickers did not contribute to my learning experience.” The negatively oriented item was recoded (mean = 3.05, SD = 1.04).
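The three composite measures above are constructed by reverse-coding any negatively worded item, checking internal consistency with Cronbach's alpha and then averaging the items into a single score per respondent. The authors report using SPSS; the following is a minimal sketch of the same procedure in Python with pandas (an assumption, for illustration only), using the implementation scale and hypothetical item names and responses.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a set of Likert items (one column per item)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical responses on the 1-5 Likert scale (one row per student).
impl = pd.DataFrame({
    "impl_1": [4, 5, 3, 4, 2], "impl_2": [4, 4, 3, 5, 2],
    "impl_3": [2, 1, 3, 2, 4],  # negatively worded ("frustrating to use")
    "impl_4": [5, 4, 3, 4, 2], "impl_5": [4, 5, 4, 4, 3],
    "impl_6": [3, 4, 3, 5, 2],
})

# Reverse-code the negatively worded item so that 5 -> 1, 4 -> 2, and so on.
impl["impl_3"] = 6 - impl["impl_3"]

alpha = cronbach_alpha(impl)
composite = impl.mean(axis=1)  # one composite score per respondent
print(f"alpha = {alpha:.3f}, mean = {composite.mean():.2f}, "
      f"SD = {composite.std(ddof=1):.2f}")
```

The same steps would be repeated for the engagement and learning scales; only the item columns change.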

Data analysis

The statistical software package SPSS was used to analyse the data. The aim of the study was to determine whether there was a statistically significant relationship between the use of the clickers and student engagement. Thus, the null hypothesis was that there is no significant relationship between the use of clickers and student engagement. Depending on the variable to be assessed, a t-test, ANOVA or correlation was conducted to test the null hypothesis.
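Because the exact SPSS procedures are not reported, the sketch below (in Python with SciPy, an assumption for illustration only) shows the two kinds of group comparisons described: an independent-samples t-test for a two-group demographic variable such as gender, and a one-way ANOVA for a multi-group variable such as year of study. The data and variable names are hypothetical; the correlation analysis is sketched after Table 1 in the Results.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical composite engagement scores (1-5 scale) and demographics.
engagement = rng.uniform(1, 5, size=200)
gender = rng.choice(["female", "male"], size=200)
year = rng.choice([1, 2, 3, 4], size=200, p=[0.885, 0.08, 0.02, 0.015])

# Independent-samples t-test: does engagement differ by gender?
t_stat, p_t = stats.ttest_ind(engagement[gender == "female"],
                              engagement[gender == "male"])

# One-way ANOVA: does engagement differ across years of study?
groups = [engagement[year == y] for y in np.unique(year)]
f_stat, p_f = stats.f_oneway(*groups)

print(f"t-test: t = {t_stat:.2f}, p = {p_t:.3f}")
print(f"ANOVA:  F = {f_stat:.2f}, p = {p_f:.3f}")
```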

Results

Demographics

The first analyses focused on the relationship between participant demographics and the three variables. No statistically significant differences were observed between a student's year in university, gender or age and their perceived ease of use and implementation of the clicker in the class. Further, no significant difference was found between these demographic variables and student engagement, or between them and impact on learning. These findings indicate that any significant relationships among the three variables (implementation, student engagement and impact on learning) cannot be attributed to demographic variance.

Findings in relation to the research questions

RQ1. Do students perceive that they are more engaged with course material and in-class discussions as a result of clicker use?

RQ1 addressed the relationship between student engagement and clicker use in the classroom. As shown in Table 1, the results revealed a strong positive correlation between student engagement and clicker implementation (r = 0.678, p < 0.01). This suggests that students who perceived the clickers to be well implemented in lectures also perceived themselves to be more engaged.


Table 1. Correlation results (Pearson r).

                      Implementation   Student engagement   Impact on learning
Implementation             1.00               0.678                0.577
Student engagement         0.678              1.00                 0.846
Impact on learning         0.577              0.846                1.00
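Table 1 is the matrix of pairwise Pearson correlations among the three composite scores. A minimal sketch of how such a matrix can be produced, assuming hypothetical scores in a pandas DataFrame (Python rather than the authors' SPSS):

```python
import pandas as pd

# Hypothetical composite scores, one row per respondent.
scores = pd.DataFrame({
    "implementation": [3.8, 3.2, 4.1, 2.9, 3.6],
    "engagement":     [3.5, 2.8, 4.0, 2.5, 3.3],
    "learning":       [3.4, 2.6, 4.2, 2.4, 3.1],
})

# Pairwise Pearson correlations, laid out as in Table 1.
# Significance (the p < 0.01 reported in the text) would be tested
# separately, e.g. with scipy.stats.pearsonr for each pair.
print(scores.corr(method="pearson").round(3))
```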

RQ2. Do students believe that the clickers help them to more effectively learn the course material?

The goal of RQ2 was to understand the effect of clicker use and implementation in the classroom on students’ perceptions of their own learning experience. As seen in Table 1, the correlation test on the variables implementation and impact on learning showed a moderately strong positive correlation (r = 0.577, p < 0.01). This suggests that students who perceived the clickers to be well integrated into the class lecture also believed that they had a positive effect on their ability to learn course material.

RQ3: Do students who perceive a positive impact on their learning as a result of the clickers perceive that they are more engaged in their learning experience?

Finally, to assess whether students’ perceptions of learning correlated with their reported engagement, correlation tests were performed on the variables student engagement and impact on learning. As seen in Table 1, the results showed a very strong positive correlation (r = 0.846, p < 0.01), indicating that, if students perceived their learning to be enhanced by the clickers, they also reported being more engaged in the learning experience.

Discussion

This study has shown that students reacted positively to the professor's integration of clickers into their chemistry class and perceived that this technology enhanced their engagement and learning in this particular classroom. While the study does not directly take into account the professor's personal teaching style, it does take into account the learning context created through the use of clicker technology. Further, this study assesses students’ perceptions of faculty behaviours and thus, as Umbach and Wawrzynski (2005) suggest, places the emphasis on pedagogical choices made by instructors. These findings are supported by numerous studies (e.g., Caldwell 2007; Crossgrove and Curran 2008; Cue 1998; Hoekstra 2008; Jackson and Trees 2003; Kaleta and Joosten 2007; Preszler et al. 2007; Rice and Bunz 2006; Trees and Jackson 2007; Twetten et al. 2007).

As discussed above, in elaborating the seven principles of effective undergraduate education, Chickering and Gamson (1987) argue that good practice builds student engagement because it encourages contact between students and faculty, gives prompt feedback, develops reciprocity and cooperation among students, encourages active learning, emphasises time on task, communicates high expectations and respects diverse talents and ways of learning. The research findings will be discussed below in the context of each of these principles.

In terms of increased contact between student and faculty, it could be suggested that the clicker enables a classroom of students – whether 50 or 500 – to communicate directly with the professor in a mediated format. In fact, clickers are designed specifically to be used in large classes, where it is difficult for students to get feedback, and clickers have been shown in numerous studies to be helpful in overcoming the challenges of student inattention and distraction in these large classes (Boyle and Nicol 2003). Professors can respond to students by providing another example or a more detailed explanation for a question that the clicker results have shown is unclear to some or most students. Our data showed that students perceived that the professor responded to concepts that were not understood when this lack of understanding was communicated by students through the clicker. Large lectures may be intimidating for first-year students who are not accustomed to classes of this size, and thus students who do not understand something may be reluctant to ask questions. Clickers afford students the opportunity to voice their misunderstandings anonymously and, as a result, the professor can respond to those students without singling them out when she sees incorrect responses on the screen.

In a similar vein, clickers enable greater and more immediate feedback from professors and, thus, allow students to assess their knowledge of major concepts, as well as their level of understanding in comparison to their peers (Kaleta and Joosten 2007). When faced with a potential lack of knowledge or understanding (as shown through incorrect answers), the professor can encourage students to visit the instructor during office hours or to follow up with the teaching assistant or tutor in order to receive more tailored support. In fact, our data showed that students reported being not only more inclined to participate during lectures when using clickers but also, as discussed above, that the professor was more likely to respond to a lack of conceptual understandings demonstrated by students. In other words, two-way communication seemed to be enhanced by the clickers in the sense that students showed their level of understanding more effectively and the professor responded and adapted his/her lecture to address concepts that students might not have understood.

In terms of increased reciprocity and cooperation among students, clickers provide the chance for students to work with each other when the clicker is used as it was in this class, to enable “peer instruction,” a technique developed by Eric Mazur (1997) to help make lectures more interactive and to get students intellectually engaged by having them think critically about the material and discuss their ideas and insights with their neighbours. As Chickering and Gamson (1987) argue, good learning is collaborative and social, not competitive and isolated. By requiring students to turn to their neighbours and discuss a question and possible answers before responding on their clickers, peer instruction fosters this goal. In our study, participants reported that the clickers allowed for increased interaction with other students during lecture and this, in a class of 200 students, is an important outcome where students could feel lost.

As for active learning and diverse learning styles, the tangible act of answering a question using a clicker increases engagement, especially in students who are kinesthetic – or hands-on – learners (Bruff 2009), because the clicker moves the student from being a passive observer to being an active contributor. Incorporating the needs of all learning styles increases student engagement, as each student is able to learn effectively through one of the delivery approaches – auditory, visual or kinesthetic – of the lecture. The data showed a strong positive correlation between a perception of higher engagement and a positive impact on learning. This indicates that, when students report being more actively involved in the lecture, they are more inclined to report more positive perceptions of their own learning.

The clicker also seems to enhance time on task, or the time and energy that students put into learning activities, by encouraging regular attendance. While some research has shown that students respond negatively to clickers when they are used only for attendance (Trees and Jackson 2007), many studies demonstrate the positive impact on attendance when clickers are used effectively. For example, Burnstein and Lederman (2001), in their study of clicker use in science classes, reported 80% to 90% attendance in classes where clickers were in use.

In terms of communicating high expectations, Kaleta and Joosten (2007) found in their study of clicker use that students reported their “need to come to class ready to participate and pay attention” (p. 9) when the lecture incorporated clickers. Likewise, as stated by Trees and Jackson (2007), the clicker “creates a learning environment with higher expectations for student preparation prior to class” (p. 25). As a result, students are more motivated to complete the required readings before class and, in turn, are more participatory in class discussions. Kaleta and Joosten agree, claiming that clickers create a more accountable student who is then able to engage with course material during lectures. Furthermore, given the anonymity afforded by the clicker, students likely experience less social anxiety about offering a wrong answer and so may be more likely to participate. While our data did not directly measure student preparation or completion of homework or required readings, they do show that clickers positively contributed to students’ interest in course material and to their overall learning experience.

One limitation of this study is that all data are based on students’ reported perceptions of their own engagement and learning. The authors were not involved in delivery of the course in any way and, thus, we do not have measures of attendance or final grades. We cannot, therefore, draw conclusions about the impact of clicker technology on these variables. Future research could track attendance, grades and withdrawal and attempt to correlate these with clicker use. Furthermore, triangulating the data through observations or qualitative interviews would enhance and deepen our understanding of the interaction between students and clicker technology and thus move beyond measuring reactions to the professor's integration of the technology in the course.

Conclusion

This study has shown that clickers have the potential to be an effective learning tool that can be used to enhance student engagement through the enactment of Chickering and Gamson's (1987) principles for good practices in higher education. As Draper, Cargill and Cutts (2002) conclude in their study of clicker implementation, it is possible that this technology allows students to play an active and responsible role in the classroom, and, in so doing, increases their motivation and engagement. It is important to note, however, that while clickers appear in this study to have a significant and positive impact on learning and engagement, it is through thoughtful and purposeful implementation that they are most likely to be effective.

As Sutherland, Robertson and John (2008) point out, professors may fear that, when they introduce a new technology into their classroom pedagogy, it will not work well or will use up class time while they struggle to get it working. Further, as a result of the potentially negative impact of technological difficulties on professor credibility and course evaluations, professors may be reluctant to integrate innovations such as clickers in the classroom. Thus, technical support must be available to quickly help professors during lectures, as well as to provide adequate training and troubleshooting prior to the start of the course. In addition to this support, educational materials must be created and made available to professors to assist in integrating clickers successfully.

Further, inconsequential use of the clickers in lecture could lead to negative student perceptions of the technology (Trees and Jackson 2007). For example, if clickers are used only for attendance purposes, students will feel that the clicker was a waste of money, may become more disengaged and may not be open to using clickers in the future when other professors employ the technology. Thus, student support of clickers is imperative, as the use of clickers creates a shift in the classroom environment in which more active participation is required from students (Trees and Jackson 2007). For freshman students, this change to a more active classroom may not even be noticed; however, in upper-year courses, where students are used to taking a more passive role in the lecture, lecturers who employ clickers may face more resistance. Furthermore, as Lane and Shelton (2001) point out, technology is no substitute for effective pedagogy and “the failure to place pedagogy prior to technology results in little or no net instructional gain” (p. 248). This contention is supported by Judson and Sawada (2002), who conclude in their extensive review of the literature on clicker use that it is the pedagogical practices of the instructor and not simply the incorporation of technology into the classroom that is essential to student learning. This means that instructors must carefully consider the learning objectives of their course as well as the pedagogical options they have, including lecture, small group discussion, in-class exercises and so on.

Many freshman college and university students are still teenagers, with the developmental limitations of this age group, and thus, as educators, we must develop strategies to engage students and enhance their retention in the critical first year of higher education. This research illustrates the perception of the clicker as a tool for student engagement not from a professor's perspective but from the students’ perspective. It demonstrates how clickers can be incorporated in the classroom to create a culture of learning through increased communication and understanding between students and professors. Few pedagogical techniques allow students the opportunity to honestly answer questions in class without fear of ridicule or embarrassment while also providing the professor with a clear picture of learning in his/her classroom.

Clickers, like all technologies in the classroom, present opportunities to continuously adapt to the realities of changing demographics in colleges and universities. When implemented effectively, they allow professors to connect with students in a way that complements the lecture format and thus may lead to greater student engagement. As clicker technology becomes more accessible and less expensive, for example by using a student's own mobile phone as the clicker device (see Tremblay 2010), its use may become more commonplace and, as a result, will require even more creative use of this learning tool. Indeed, students themselves may have many ideas about “apps” that can enhance their learning experience and keep implementations of the technology fresh and relevant. In this sense, pedagogical practice becomes a shared responsibility as professor and students participate in the joint venture of higher education.

References

Appleton, J. J. et al. (2006) ‘Measuring cognitive and psychological engagement: validation of the student engagement instrument’, Journal of School Psychology, vol. 44, pp. 427–445. [Crossref]

Boyle, J. T. & Nicol, D. J. (2003) ‘Using classroom communication systems to support interaction and discussion in large class settings’, Association for Learning Technology Journal, vol. 11, pp. 43–57. [Crossref]

Bruff, D. (2009). Teaching with Classroom Response Systems: Creating Active Learning Environments, Jossey-Bass, San Francisco, CA.

Burnstein, R. A. & Lederman, L. M. (2001) ‘Using a wireless keypad in lecture classes’, The Physics Teacher, vol. 39, pp. 8–11. [Crossref]

Caldwell, J. E. (2007) ‘Clickers in the large classroom: Current research and best-practice tips’, CBE Life Sciences Education, vol. 6, pp. 9–20. [Crossref]

Carini, R. M., Kuh, G. D. & Klein, S. P. (2006) ‘Student engagement and student learning: testing the linkages’, Research in Higher Education, vol. 47, pp. 1–32. [Crossref]

Chickering, A. W. & Gamson, Z. F. (1987) ‘Seven principles for good practice in undergraduate education’, American Association of Higher Education Bulletin, March, pp. 3–7.

Crossgrove, K. & Curran, K. L. (2008) ‘Using clickers in nonmajors- and majors-level biology courses: student opinion, learning, and long-term retention of course material’, Life Sciences Education, vol. 7, pp. 146–154. [Crossref]

Cue, N. (1998) ‘A universal learning tool for classrooms?’, Proceedings of the First Quality in Teaching and Learning Conference, Hong Kong SAR, China, December 10–12, 1998, [online] Available at: http://celt.ust.hk/ideas/prs/pdf/Nelsoncue.pdf, accessed 10 May 2010.

Draper, S. W., Cargill, J. & Cutts, Q. (2002) ‘Electronically enhanced classroom interaction’, Australian Journal of Educational Technology, vol. 18, pp. 13–23.

Dunleavy, J. & Milton, P. (2008) ‘Student engagement for effective teaching’, Education Canada, vol. 48, pp. 4–8.

Ellozy, A. (2007) ‘New initiative: “Clickers” in our classrooms’, New Chalk Talk, vol. 6, p. 8.

Fies, C. & Marshall, J. (2006) ‘Classroom response systems: a review of the literature’, Journal of Science Education and Technology, vol. 15, pp. 101–109. [Crossref]

Finn, J. D. (1989) ‘Withdrawing from school’, Review of Educational Research, vol. 59, no. 2, pp. 117–142.

Fredericks, J., Blumenfeld, P. & Paris, A. (2004) ‘School engagement: potential of the concept, state of the evidence’, Review of Educational Research, vol. 74, pp. 59–109. [Crossref]

Hoekstra, A. (2008) ‘Vibrant student voices: exploring effects of the use of clickers in large college courses’, Learning, Media and Technology, vol. 33, pp. 329–341. [Crossref]

Jackson, M. & Trees, A. (2003) Clicker Implementation and Assessment, Information and Technology Services and Faculty Teaching Excellence Program, University of Colorado, Boulder, CO, [online] Available at: http://comm.colorado.edu/mjackson/clickerreport.htm

Judson, E. & Sawada, D. (2002) ‘Learning from past and present: electronic response systems in college lecture halls’, Journal of Computers in Mathematics and Science Teaching, vol. 21, pp. 167–181.

Kaleta, R. & Joosten, T. (2007) ‘Student response systems: a University of Wisconsin system study of clickers’, EDUCAUSE Center for Applied Research: Research Bulletin, vol. 10, pp. 1–12.

Kuh, G. D. (2003) ‘What we're learning about student engagement from NSSE’, Change, vol. 35, pp. 24–32. [Crossref]

Lane, D. R. & Shelton, M. W. (2001) ‘The centrality of communication education in classroom computer-mediated communication: toward a practical and evaluative pedagogy’, Communication Education, vol. 50, pp. 241–255. [Crossref]

Marks, H. M. (2000) ‘Student engagement in instructional activity: patterns in the elementary, middle, and high school years’, American Educational Research Journal, vol. 37, pp. 153–84.

Mazur, E. (1997). Peer Instruction: A User's Manual, Prentice Hall, Upper Saddle River, NJ.

Pirot, L. & De Ketele, J.-M. (2000) ‘L'engagement académique de l’étudiant comme facteur de réussite à l'université: Étude exploratoire menée dans deux facultés contrastées’, Revue des Sciences de l'Éducation, vol. 26, no. 20, pp. 367–394.

Preszler, R. et al. (2007) ‘Assessment of the effects of student response systems on student learning and attitudes over a broad range of biology courses’, CBE Life Sciences Education, vol. 6, pp. 29–41. [Crossref]

Rice, R. & Bunz, U. (2006) ‘Evaluating a wireless course feedback system: the role of demographics, expertise, fluency, competency, and usage’, Studies in Media and Information Literacy Education, vol. 6, no. 3, p. 3.

Sutherland, R., Robertson, S. & John, P. (2008). Improving Classroom Learning With ICT, Routledge, New York, NY.

Trees, A. R. & Jackson, M. H. (2007) ‘The learning environment in clicker classrooms: student processes of learning and involvement in large university-level courses using student response systems’, Learning, Media and Technology, vol. 32, pp. 21–40. [Crossref]

Tremblay, E. (2010) ‘Educating the Mobile Generation – using personal cell phones as audience response systems in post-secondary science teaching’, Journal of Computers in Mathematics and Science Teaching, vol. 29, no. 2, pp. 217–227.

Twetten, J. et al. (2007) ‘Successful clicker standardization’, Educause Quarterly, vol. 4, pp. 63–67.

Umbach, P. D. & Wawrzynski, M. R. (2005) ‘Faculty do matter: the role of college faculty in student learning and engagement’, Research in Higher Education, vol. 46, pp. 153–184. [Crossref]