Negin Mirriahi*, Dennis Alonzo and Bob Fox
School of Education & Learning and Teaching Unit, University of New South Wales, Sydney, Australia
Abstract
The need for flexibility in learning and the affordances of technology provided the impetus for the rise of blended learning (BL) globally across higher education institutions. However, the adoption of BL practices continues at a slow pace due to academics’ low digital fluency, varied views and definitions of BL, and limited standards-based tools to guide academic practice. To address these issues, this paper introduces a BL framework, based on one definition and with criteria and standards of practice, to support the evaluation and advancement of BL in higher education. The framework is theoretically underpinned by the extant literature and supported by focus group discussions. The evidence supporting the criteria and standards is discussed, with suggestions for how they can be used to guide course design, academic practice, and professional development.
Keywords: digital literacy; criteria; standards; academic development; curriculum design
Citation: Research in Learning Technology 2015, 23: 28451 - http://dx.doi.org/10.3402/rlt.v23.28451
Responsible Editor: Carlo Perrotta, School of Education, University of Leeds, United Kingdom.
Copyright: © 2015 N. Mirriahi et al. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International License, allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.
Received: 5 May 2015; Accepted: 27 September 2015; Published: 26 October 2015
*Correspondence to: negin.mirriahi@unsw.edu.au
The demand for flexibility in learning and the affordances of technology provided the impetus for the rise of blended learning (BL) across the higher education sector. Since the early 1990s, its popularity has increased, and it has recently received more attention as education institutions attempt to offer more personalised learning experiences. BL can deliver personalised learning when designed with a strong focus on meeting the needs of individual students (Gaeta, Orciuoli, and Ritrovato 2009) and when provided with strong institutional support and policy to enable more effective learning and teaching (Hargreaves 2006).
The popularity of BL, particularly in higher education contexts, however, does not necessarily translate into advancement of academic practice due to three key challenges. First, digital fluency or academics’ confidence and skills in using online technologies remain low (Johnson et al. 2014) despite the availability and affordances of digital technologies. The low digital skills of academics compromise appropriate technology integration, limiting the facilitation of more effective student learning (Torrisi-Steele and Drew 2013). At the moment, the use of technology for instruction is mostly for management and administrative purposes rather than for facilitating learning (Palak and Walls 2009).
Second, there are various views and definitions of BL, which, according to Oliver and Trigwell (2005), is ‘ill-defined and inconsistently used’ (p. 24). Consequently, there is no uniform understanding of BL, and academic practice is often underpinned by individuals’ own interpretations of the term rather than a consistent approach across an institution (Hinrichsen and Coombs 2013). The inconsistencies revolve around the design, pedagogical approaches, proportion of online versus face-to-face time, purpose of blending, and the role of technology. For example, Garrison and Kanuka (2004) posit that the integration of differing modalities requires combining the most desirable aspects of face-to-face and online environments. Further, BL can address access, convenience, and cost-effectiveness, enabling students to save considerable time and resources otherwise spent commuting and institutions to reduce the cost of additional buildings and facilities (Bleed 2001). However, Procter (2003) critiques the view that BL addresses the challenge of distance, arguing that its design and delivery approach differs from that of fully distance learning. In particular, Procter (2003) emphasises that BL requires the ‘effective combination of different modes of delivery, models of teaching and styles of learning’ (p. 3). This is based on the assumption that the achievement of learning outcomes depends on the quality of learning and teaching experiences.
Third, the tools available to guide and evaluate BL course designs are limited (Smythe 2012). Though frameworks are available to design and evaluate BL practices from both learning and teaching and IT infrastructure perspectives, these frameworks are problematic either in their design or in their criteria and standards, or lack thereof. For example, some frameworks identify the criteria needed but take the form of a Likert scale with no description of standards. Smythe’s (2012) framework has five levels of performance and claims to be standards-based, but it lacks descriptions of the standards for each level. This is problematic as it leaves academics to make their own judgements about what is considered appropriate for each level. Oliver (2003) provides benchmarks with criteria and standards, but these are largely an adaptation of the principles of face-to-face teaching rather than criteria for effective BL practices. Parsell and Collaborators’ (2013) framework includes criteria, but they are generic, emphasising the elements of learning and teaching, with technology appearing as an additional component rather than interwoven with them. The use of explicit criteria and standards in BL will facilitate more effective learning and teaching activities, as the criteria can be used to benchmark academic practice (Reed 2014).
The three issues discussed above are critical for BL implementation: enhancing academics’ skills and confidence in using technologies, formulating a consistent definition to inform academics’ practice, and providing frameworks for objective evaluation of BL practice. We propose a standards-based BL framework built on one definition that reconciles the discrepancies discussed under the second issue above, informed by the literature and supported by qualitative data gathered from focus groups. The framework provides a consistent understanding of BL practice and engages academics in self-assessment of their own practice, helping them identify areas of expertise and areas requiring further development. The framework is introduced in the next section to provide a conceptual overview of the criteria and standards and how they have been informed. This proposed framework provides academics, researchers, and others interested in BL with an opportunity to trial and/or adapt it to their own contexts.
After investigating the various interpretations and purposes of BL in the literature, the following consolidating definition forms the basis of the proposed BL framework: a process of integrating the most appropriate learning and teaching strategies, technology and/or media to provide meaningful, flexible learning experiences to achieve learning outcomes (based on Dick, Carey, and Carey 2009; Holden and Westfall 2010). Institutions can adapt this definition to fulfil their own strategic BL directions and provide learning experiences as appropriate for their contexts. The framework, therefore, guided by this definition, addresses the three issues identified. The standards-based framework proposed in this paper can:
A combination of logical, rational and theoretical approaches (Brown 1983; Friedenberg 1995) was used in developing the framework. The criteria and standards were established primarily from the literature on BL frameworks and tools for BL evaluation, with support from focus groups at one higher education institution. The development process started with a literature search of online databases and search engines to identify existing BL frameworks and evaluation instruments used in higher education institutions. Two focus groups were then conducted, with a total of eight participants representing various disciplines and roles, to collect information about current BL practice, the challenges faced in implementation, and what participants require to advance BL in their disciplines. Focus group discussions were audio recorded and transcribed, and any identifiable information was removed prior to analysis. Table 1 below shows the distribution of the focus group participants. While the number of participants is limited, the discussions provide an initial understanding of whether participants’ views support or refute the findings in the literature that inform the proposed BL framework.
Role | Number
---- | ------
Senior Administrator (Associate Dean/Director) | 2
Lecturer | 2
Educational Developer/Instructional Designer | 3
The proposed BL framework contains criteria, which are indicators of academics’ ability to design and deliver a BL course, and standards that define the quality of practice. The use of criteria and standards has been shown to effectively support practice and skill development (Wolf and Stevens 2007). The criteria and standards guide academics in determining their current level of BL practice and the subsequent levels they should aspire towards to further improve their practice (Inbar-Lourie 2008). Griffin’s (2000) research revealed that criteria and standards can support various professionals in their skill development, and that a skills assessment instrument with criteria and standards can help identify professionals’ zone of proximal development (ZPD), which Vygotsky (1986) defines as the area of opportunity for skill development. Further, the results of a skills assessment with explicitly defined standards can inform needs-based professional development programs and resources. This is consistent with Bruner’s (1996) learning theory, in which the progression of learning should build upon the learner’s current level of ability.
The criteria and standards in the proposed framework are organised around the RASE (Resources, Activity, Support, Evaluation) learning design model, chosen because it supports the student-centred, technology-rich environments suited to BL (Churchill et al. 2013). Central to the RASE model is identifying Activities for students to work on to gain the multiple skills, literacies, knowledge and content competencies needed to meet the learning outcomes defined for a BL course. Resources offer students well-structured exercises and discipline content that enable them to work successfully through the Activities set. Support provides students with the technical, peer and tutor assistance needed to understand how best to work through the tasks set, and offers advice in areas students might find difficult. Evaluation (Assessment) offers a structured guide that helps students understand how well they are doing as they work their way through the course. In a blended environment, the Evaluation (Assessment) stage also enables teachers to monitor how well individual students and groups are progressing, acting as a traffic light signal that identifies students in need of further tutorial support (Churchill et al. 2013). In the following section, each criterion is discussed with evidence from the literature, supplemented with comments made by focus group participants where available.
The criteria related to resources focus on the availability of resources in general and of those related to formative assessment.
We use students a lot for developing our resources, they are constantly designing our resources, and most of our resources are designed by students themselves.
Both face-to-face and online activities should facilitate learning experiences that help students achieve the intended learning outcomes of the course.
… it’s really looking at the outcomes and how the technology can support students developing those outcomes, and there might be particular issues such as you might have students at a distance. That means you need technology to enable things. It might be that you need to be able to put students into the field, in such a virtual way, through simulations.
The thoughtful process of designing face-to-face and online activities will ensure that the activities in both mediums support one another rather than overlap. To achieve this, the two environments should be integrated, as focus group participant, P2FG1, emphasises, ‘how you integrate what you do with one with the other so that then the two are not separate things, you are not repeating, you are not replacing, but you are integrating’.
I think different students will approach it differently as well, if they’ve got the options. I mean, some students would still love coming to lecture, whereas some students shy away from coming to lectures but they listen to their [video or audio] recordings … so I think, as well as discipline specific context, there is individual student context.
… and this is part of digital literacy too, knowing that different tools work differently and have different affordances. So, it’s being able to take a step back and evaluate the new opportunities.
In addition, with the advancement of web-based technologies, a number of online tools, such as wikis, blogs, discussion forums, simulations, and synchronous webinars, are readily available through learning management systems or the Internet and can facilitate interactive and collaborative learning, enhancing student engagement (Chen, Lambert, and Guidry 2010). This informs the higher standards for this criterion (Standards B and C in Table 2).
These criteria focus on supporting students’ digital literacy skills, responding to queries, and providing on-going feedback.
… yes, some of them are, but this is an assumption … academics assume that students are so much more advanced in terms of digital literacy … but the research shows that they are literate in pockets, they are really good at using Facebook perhaps, or using their phones to send SMS messages but as far as using technologies for learning or for professional purposes maybe they’re not so literate.
Thus, strong support for students’ development of digital literacy is necessary to ensure their effective and meaningful engagement with BL (Garrison and Kanuka 2004). Kennedy et al. (2010) argue that students’ technological experiences and expectations need to be managed. Focus group participant P2FG1 further elaborates, ‘… what support is available for students … what training and development is there, these are the things that probably will make a better blended environment’. Further, apart from the use of technology, students need to be trained in their role as BL learners (Cheung and Hew 2011). Hence, as the higher standard for this criterion states (Standard C in Table 2), students’ technical capabilities should be identified so that appropriate training resources can be provided for using the online tools effectively.
These criteria focus on the design of assessment tasks, student access, and self and peer assessment.
… there are new opportunities for assessment online … looking at sort of deductive assessment for students, we can do that quite well in an online environment that we can’t do on paper.
Equally important is the use of technology to develop differentiated assessment and to give students the option of choosing how best to demonstrate their learning through a variety of assessment submission types, such as videos, posters, presentations, and essays. Students’ autonomy in choosing how they demonstrate their learning informs the higher standards of this criterion (Standards B and C in Table 2).
The academic digital literacy required in BL environments can be seen as a hierarchy of skills and attributes (Bennett 2014). Underpinned by this view, the standards established in the proposed framework comprise three pre-defined levels building upon one another (Standards A through C) and an additional fourth standard (Faculty-Determined Standard). This highest standard allows individual Faculties to account for the breadth and diversity of BL and online practices across an institution. As one of the focus group participants, P2FG1, explains:
… what works in one Faculty is not going work for another Faculty, the science and the practical subjects in Medicine, have face-to-face lab work, yes you can have simulations, and yes you can have all the iPad apps you’ve got, but the physical working with chemicals or whatever it is, that’s a whole different experience to something like Journalism where a lot of the work will now be in the online environment ….
The first level or standard requires academics to integrate technology to offer online teaching and learning activities; technology use is evident but remains largely academic-controlled. The second level or standard requires academics to use technology to deliver flexible learning opportunities that are constructive, authentic and collaborative, while the third level or standard moves towards a more enriched and higher degree of student collaboration. At this third level, technology also allows students greater participation in creating resources, enabling them to learn with technology rather than just from technology (Churchill et al. 2013). This is consistent with Darling-Hammond, Zielezinski, and Goldman’s (2014) view that ‘the curriculum and instructional plan should enable students to create content as well as to learn material’ (p. 15). Table 2 presents the complete proposed BL framework.
To realise its full potential in transforming academic practice, the following should be considered when using the BL framework:
There are several implications for research related to the proposed framework. First, the framework should be trialled with academic staff across different disciplines and at different stages of their BL practice to identify how they engage with it and whether they find it useful for gauging their current BL practice and identifying enhancements. While the authors intend to pilot the framework at their institution, they hope that by introducing it as a proposed conceptual framework largely underpinned by the literature, they provide an opportunity for other researchers and educational developers to trial the framework with their own staff or adapt it for their own contexts. Second, as the number of focus group participants was limited, future work should gather the views and perceptions of both academics and students in order to further refine the criteria and standards. The views of students, in particular, would help identify whether the standards associated with their engagement with online technologies would support their learning. Finally, future research could develop professional development resources associated with the framework’s standards and criteria in order to investigate how academics engage with them and how academic development units can best support the development of BL practice across their institutions.
The proposed BL framework, theoretically informed by the extant literature and supported by qualitative data collected from focus groups, addresses the three issues identified in the BL literature: (1) academics’ lack of digital fluency, (2) multiple definitions of BL, and (3) the lack of standards in existing BL frameworks. First, as a self-assessment instrument, the framework can help academic staff identify their current standard of BL practice and the changes required to progress to the higher standards. Consequently, this could lead academics to enhance their practice or seek professional development, both of which will develop their digital literacy skills, addressing the first issue. Second, a consolidated definition of BL based on the literature forms the basis of the framework and can be adapted by institutions to inform their own BL definitions and policies. Third, the proposed framework provides three levels of descriptive standards, with a fourth level for Faculties to define for themselves to address their own disciplinary and contextual needs.
Bath, D. & Bourke, J. (2010) Getting Started with Blended Learning, Griffith University, Australia, [online] Available at: https://www.griffith.edu.au/__data/assets/pdf_file/0004/267178/Getting_started_with_blended_learning_guide.pdf
Bennett, L. (2014) ‘Learning from the early adopters: developing the digital practitioner’, Research in Learning Technology, vol. 22, 21453, doi: http://dx.doi.org/10.3402/rlt.v22.21453
Bleed, R. (2001) ‘A hybrid campus for the new millennium’, Educause Review, Jan/Feb, pp. 17–24. Available at: http://er.educause.edu/~/media/files/article-downloads/erm0110.pdf
Brown, F. G. (1983) Principles of Educational and Psychological Testing, 3rd edn, Holt, Rinehart, and Winston, New York.
Brown, G., Smith, T. & Henderson, T. (2007) ‘Student perceptions of assessment efficacy in online and blended classes’, in Blended Learning: Research Perspectives, eds A. G. Picciano and C. Dziuban, Sloan Consortium, Needham, MA, pp. 145–160.
Bruner, J. (1996) The Culture of Education, Harvard University Press, Cambridge, MA.
Chen, P.-S. D., Lambert, A. D. & Guidry, K. R. (2010) ‘Engaging online learners: the impact of Web-based learning technology on college student engagement’, Computers and Education, vol. 54, no. 4, pp. 1222–1232.
Cheung, W. S. & Hew, K. F. (2011) ‘Design and evaluation of two blended learning approaches: lessons learned’, Australasian Journal of Educational Technology, vol. 27, no. 8, pp. 1319–1337.
Churchill, D., King, M., Webster, B. & Fox, B. (2013) ‘Integrating learning design, interactivity, and technology’, in Proceedings of the 30th ascilite Conference, Sydney, pp. 139–143. Available at: http://www.ascilite.org/conferences/sydney13/program/papers/Churchill.pdf
Darling-Hammond, L., Zielezinski, M. B. & Goldman, S. (2014) Using Technology to Support At-Risk Students’ Learning, Alliance for Excellent Education and Stanford Center for Opportunity Policy in Education, [online] Available at: https://edpolicy.stanford.edu/sites/default/files/scope-pub-using-technology-report.pdf
Dick, W., Carey, L. & Carey, J. O. (2009) The Systematic Design of Instruction. 7th edn, Pearson Merrill, Upper Saddle River, NJ.
Dochy, F., Segers, M. & Sluijsmans, D. (1999) ‘The use of self-, peer and co-assessment in higher education: a review’, Studies in Higher Education, vol. 24, no. 3, pp. 331–350.
Dziuban, C., et al., (2006) ‘Blended learning enters the mainstream’, in The Handbook of Blended Learning, eds C. J. Bonk & C. Graham, Wiley, San Francisco, CA, pp. 195–208.
Friedenberg, L. (1995) Psychological Testing: Design, Analysis and Use, Allyn and Bacon, Boston, MA.
Gaeta, M., Orciuoli, F. & Ritrovato, P. (2009) ‘Advanced ontology management system for personalised e-Learning’, Knowledge-Based Systems, vol. 22, no. 4, pp. 292–301.
Garrison, D. R. & Kanuka, H. (2004) ‘Blended learning: uncovering its transformative potential in higher education’, The Internet and Higher Education, vol. 7, no. 2, pp. 95–105.
Griffin, P. (2000) Competency Based Assessment of Higher Order Competencies, NSW State Conference of the Australian Council for Educational Administration, Mudgee, NSW.
Hargreaves, D. (2006) A New Shape for Schooling, Specialist Schools and Academies Trust, London.
Hattie, J. (2008) Visible Learning: A Synthesis of over 800 Meta-analyses Relating to Achievement, Routledge, Hoboken.
Hattie, J. & Timperley, H. (2007) ‘The power of feedback’, Review of Educational Research, vol. 77, no. 1, pp. 81–112.
Hinrichsen, J. & Coombs, A. (2013) ‘The five resources of critical digital literacy: a framework for curriculum integration’, Research in Learning Technology, vol. 21, 21334, doi: http://dx.doi.org/10.3402/rlt.v21.21334
Holden, J. & Westfall, P. (2010) An Instructional Media Selection Guide for Distance Learning: Implications for Blended Learning, 2nd edn, United States Distance Learning Association, [online] Available at: https://www.usdla.org/wp-content/uploads/2015/05/AIMSGDL_2nd_Ed_styled_010311.pdf
Inbar-Lourie, O. (2008) ‘Constructing a language assessment knowledge base: a focus on language assessment courses’, Language Testing, vol. 25, no. 3, pp. 385–402.
Johnson, L., et al., (2014) Horizon Report: 2014 Higher Education, New Media Consortium, Austin, TX.
Kennedy, G., et al., (2010) ‘Beyond natives and immigrants: exploring types of net generation students’, Journal of Computer Assisted Learning, vol. 26, no. 5, pp. 332–343.
Lateef, F. (2010) ‘Simulation-based learning: just like the real thing’, Journal of Emergencies, Trauma, and Shock, vol. 3, no. 4, pp. 348–352.
Li, L., Lu, X. & Steckelberg, A. (2010) ‘Assessor or assessee: how student learning improves by giving and receiving feedback’, British Journal of Educational Technology, vol. 41, no. 3, pp. 525–536.
Liu, N. F. & Carless, D. (2006) ‘Peer feedback: the learning element of peer assessment’, Teaching in Higher Education, vol. 11, no. 3, pp. 279–290.
Means, B., et al., (1993) Using Technology to Support Education Reform, U.S. Government Printing Office, Washington, DC.
Mohr, A. T., Holtbrügge, D. & Berg, N. (2011) ‘Learning style preferences and the perceived usefulness of e-learning’, Teaching in Higher Education, vol. 17, no. 3, pp. 309–322.
Mirriahi, N. & Alonzo, D. (2015) ‘Shedding light on students’ technology preferences: implications for academic development’, Journal of University Teaching & Learning Practice, vol. 12, no. 1, pp. 1–14. Available at: http://ro.uow.edu.au/jutlp/vol12/iss1/6/
Oliver, M. & Trigwell, K. (2005) ‘Can “blended learning” be redeemed?’, E-learning, vol. 2, no. 1, pp. 17–26.
Oliver, R. (2003) ‘Exploring benchmarks and standards for assuring quality online teaching and learning in higher education’, Proceedings of the 16th Open and Distance Learning Association of Australia Biennial Forum, Canberra, ACT, pp. 79–90.
Palak, D. & Walls, R. T. (2009) ‘Teachers’ beliefs and technology practices: a mixed-methods approach’, Journal of Research on Technology in Education, vol. 41, no. 4, pp. 417–441.
Parsell, M. & Collaborators. (2013) Standards Online Education Framework, [online] Available at: http://www.onlinestandards.net/standards/
Procter, C. (2003) ‘Blended learning in practice’, Proceedings of the Education in a Changing Environment Conference, [online] Available at: http://usir.salford.ac.uk/27428/2/BlendedLearningInPractice.pdf
Reed, P. (2014) ‘Staff experience and attitudes towards technology-enhanced learning initiatives in one faculty of health and life sciences’, Research in Learning Technology, vol. 22, 22770, doi: http://dx.doi.org/10.3402/rlt.v22.22770
Sadler, D. R. (1989) ‘Formative assessment and the design of instructional systems’, Instructional Science, vol. 18, no. 2, pp. 119–144.
Sadler, P. & Good, D. (2006) ‘The impact of self- and peer-grading on student learning’, Educational Assessment, vol. 11, no. 1, pp. 1–31.
Smythe, M. (2012) ‘Toward a framework for evaluating blended learning’, Future Challenges, Sustainable Futures, Proceedings Ascilite, Wellington, New Zealand, pp. 854–858.
Torrisi-Steele, G. & Drew, S. (2013) ‘The literature landscape of blended learning in higher education: the need for better understanding of academic blended practice’, International Journal for Academic Development, vol. 18, no. 4, pp. 371–383.
Velan, G. M., et al., (2002) ‘Web-based assessments in pathology with Questionmark Perception’, Pathology, vol. 34, no. 3, pp. 282–284.
Vygotsky, L. (1986) Thought and Language, MIT Press, Cambridge, MA.
Wagner, E. D. (2006) ‘On designing interaction experiences for the next generation of blended learning’, In The Handbook of Blended Learning, eds C. J. Bonk & C. Graham, Wiley, San Francisco, CA, pp. 41–55.
Wang, M. (2010) ‘Online collaboration and offline interaction between students using asynchronous tools in blended learning’. Australasian Journal of Educational Technology, vol. 26, no. 6, pp. 830–846.
Wingard, R. G. (2004) ‘Classroom teaching changes in web-based enhanced courses: a multi-institutional study’, EDUCAUSE Quarterly, vol. 27, no. 1, pp. 26–35.
Wolf, K. & Stevens, E. (2007) ‘The role of rubrics in advancing and assessing student learning’, The Journal of Effective Teaching, vol. 7, no. 1, pp. 3–14.
Yam, S. & Rossini, P. (2011) ‘Online learning and blended learning: which is more effective’, 17th Pacific Rim Real Estate Society Conference, Gold Coast, Australia, [online] Available at: http://www.prres.net/papers/YAM_Online_learning_and_blended_learning.pdf