Selecting student-authored questions for summative assessments

  • Alice Huang, School of Life and Environmental Sciences, University of Sydney, Sydney, Australia
  • Dale Hancock, School of Life and Environmental Sciences, University of Sydney, Sydney, Australia
  • Matthew Clemson, School of Life and Environmental Sciences, University of Sydney, Sydney, Australia
  • Giselle Yeo, School of Life and Environmental Sciences, University of Sydney, Sydney, Australia
  • Dylan Harney, School of Life and Environmental Sciences, University of Sydney, Sydney, Australia
  • Paul Denny, School of Computer Science, University of Auckland, Auckland, New Zealand
  • Gareth Denyer, School of Life and Environmental Sciences, University of Sydney, Sydney, Australia
Keywords: multiple-choice questions, student-authored questions, question banks, examinations, item response theory, PeerWise

Abstract

Production of high-quality multiple-choice questions (MCQs) for both formative and summative assessments is a time-consuming task requiring great skill, creativity and insight. The transition to online examinations, with the concomitant exposure of previously tried-and-tested MCQs, exacerbates the challenges of question production and highlights the need for innovative solutions. Several groups have shown that it is practical to leverage the student cohort to produce a very large number of syllabus-aligned MCQs for study banks. Although student-generated questions are well suited to formative feedback and practice activities, they are generally not thought to be suitable for high-stakes assessments. In this study, we aimed to demonstrate that training can be provided to students in a scalable fashion to generate questions of similar quality to those produced by experts, and that suitable questions can be identified with minimal academic review and editing. Second-year biochemistry and molecular biology students were assigned a series of activities designed to coach them in the art of writing and critiquing MCQs. This training resulted in the production of over 1000 MCQs, which were then gauged for potential either by expert academic judgement or by a data-driven approach in which the questions were trialled objectively in a low-stakes test. Questions selected by either method were then deployed in a high-stakes in-semester assessment alongside questions from two academically authored sources: textbook-derived MCQs and past-paper questions. A total of 120 MCQs from these four sources were deployed in assessments attempted by over 600 students. Each question was subjected to rigorous performance analysis, including the calculation of standard metrics from classical test theory and more sophisticated item response theory (IRT) measures. The results showed that MCQs authored by students, and selected at low cost, performed as well as questions authored by academics, illustrating the potential of this strategy for the efficient creation of large numbers of high-quality MCQs for summative assessment.
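
To make the item-analysis step concrete, the sketch below computes the two classical test theory metrics most often reported for MCQs: the difficulty index (proportion of students answering correctly) and the point-biserial discrimination (correlation of an item with performance on the rest of the test). This is an illustrative sketch only; the function and data are hypothetical, not the authors' code. For the IRT side, a two-parameter logistic item model of the kind the paper draws on gives P(correct | ability θ) = 1 / (1 + exp(−a(θ − b))), with discrimination a and difficulty b.

```python
# Illustrative sketch (hypothetical, not from the paper): classical test
# theory item statistics from a binary response matrix, where rows are
# students, columns are MCQs, and 1 means a correct answer.
import numpy as np

def item_statistics(responses: np.ndarray):
    """Return per-item difficulty and point-biserial discrimination."""
    total = responses.sum(axis=1)        # each student's raw test score
    difficulty = responses.mean(axis=0)  # proportion correct per item

    n_items = responses.shape[1]
    discrimination = np.empty(n_items)
    for j in range(n_items):
        # Correlate item j with the total score *excluding* item j,
        # so an item is not rewarded for correlating with itself.
        rest_score = total - responses[:, j]
        discrimination[j] = np.corrcoef(responses[:, j], rest_score)[0, 1]
    return difficulty, discrimination

# Simulated cohort: 600 students answering 4 items (random placeholder data).
rng = np.random.default_rng(0)
data = (rng.random((600, 4)) > 0.4).astype(int)
p, r_pb = item_statistics(data)
print("difficulty:", p.round(2))         # ~0.6 per item for this simulation
print("discrimination:", r_pb.round(2))  # near 0, as the random items are independent
```

In practice, items with a difficulty index far outside roughly 0.3–0.9, or with low or negative point-biserial values, are commonly flagged for review before reuse in a summative paper.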

Published

2021-02-03

How to Cite

Huang, A., Hancock, D., Clemson, M., Yeo, G., Harney, D., Denny, P., & Denyer, G. (2021). Selecting student-authored questions for summative assessments. Research in Learning Technology, 29. https://doi.org/10.25304/rlt.v29.2517

Section

Original Research Articles