Orchestration in learning technology research: evaluation of a conceptual framework

Luis P. Prietoa*, Yannis Dimitriadisb, Juan I. Asensio-Pérezb and Chee-Kit Looic

aCHILI Lab, EPFL, Lausanne, Switzerland; bGSIC-EMIC, Universidad de Valladolid, Valladolid, Spain; cNational Institute of Education, Nanyang Technological University, Singapore

Abstract

The term ‘orchestrating learning’ is being used increasingly often, referring to the coordination activities performed while applying learning technologies to authentic settings. However, there is little consensus about how this notion should be conceptualised, and what aspects it entails. In this paper, a conceptual framework for orchestration-related research is evaluated by an international panel of learning technology experts. The results of this evaluation show that the framework is complete and understandable, and it is particularly useful as an integrative list of aspects to consider when designing and evaluating learning technologies. To illustrate a way in which the framework can be used to help researchers structure their classroom innovation evaluations, an example is presented that follows the adoption of the framework by a group of researchers in Singapore. Finally, a new evolved version of the framework is presented, taking into account the evaluation feedback.

Keywords: orchestration; conceptual framework; analytical lens; expert panel

Citation: Research in Learning Technology 2015, 23: 28019 - http://dx.doi.org/10.3402/rlt.v23.28019

Responsible Editor: Meg O’Reilly, Southern Cross University, Australia.

Copyright: © 2015 L.P. Prieto et al. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International License, allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.

Received: 1 April 2015; Accepted: 30 August 2015; Published: 24 September 2015

*Correspondence to: Luis P. Prieto, CHILI Lab, EPFL, Lausanne, Switzerland. Email: luis.prieto@epfl.ch

Introduction

There is a growing concern in the educational technology research community about the lack of adoption of research-based results in everyday educational practice (Chan 2011). This concern, related to the complexities of applying novel technologies in authentic educational practice, has crystallised around the notion of ‘orchestration’ (Prieto et al. 2011). Dillenbourg, Järvelä, and Fischer (2009) defined orchestration as ‘the process of productively coordinating supportive interventions across multiple learning activities occurring at multiple social levels’. There is, however, an on-going debate about how orchestration should be investigated, as attested by several workshops, symposia and publications (e.g. Fischer et al. 2013; Nussbaum et al. 2011; Sutherland and Joubert 2009). While this variety of meanings is enriching in a multi-disciplinary research community, it can also be confusing for a newcomer to the field.

In this paper, we update and expand on this debate with the aim of helping learning technology researchers to address and communicate this ill-defined, multi-faceted notion. A conceptual framework for research in orchestration (‘5+3 Aspects’, Prieto et al. 2011) was evaluated, following a consensus-based approach, by an international panel of educational technology researchers. The purpose of this evaluation was not to achieve a ‘canonical’ definition of orchestration, but rather to expose the variety of perspectives in the field, organise them and identify potential areas of research on this topic. To complement this evaluation, we also present an illustrative example of how an external research team has applied and appropriated the framework for their own research on mobile classrooms. Finally, an evolved version of the conceptual framework is presented, synthesising the lessons learned and emergent themes identified through the evaluation and application of the framework in an authentic research setting.

The next section describes the notion of orchestration in recent learning technology research; later, the ‘5+3 Aspects’ framework is presented in detail. We then provide findings and supporting evidence from the evaluation of the conceptual framework by learning technology experts. Next, we present an example of framework usage in educational technology research, and synthesise a new definition and conceptual framework for orchestration research. We end the article with several implications and future directions for this work.

Orchestration in learning technology research: multi-faceted and ill-defined

‘Orchestration’, which the Merriam-Webster dictionary defines as ‘to arrange or combine so as to achieve a desired or maximum effect’ (Merriam-Webster Dictionary 2014), has been used as a metaphor for educational practice for many years. Teachers’ practice includes arranging the flow and combination of learning activities, so that students achieve a certain learning effect. However, the use of the term has recently become popular specifically in the area of educational technologies, as we can see from the number of publications on the subject (see Figure 1, from Google Scholar, August 2015). Indeed, Sutherland and Joubert (2009) mentioned it as one of the ‘Grand Challenges’ in technology-enhanced learning (TEL). More recently, a special issue gathered 11 articles by different researchers, providing a selection of perspectives on the issue of orchestration (Roschelle, Dimitriadis, and Hoppe 2013).

Figure 1.  Ratio of publications about ‘learning’ which also use ‘orchestration’.

The definition of ‘orchestrating (technology-enhanced) learning’, however, is not straightforward as it involves the coordination of a multiplicity of activities and social levels, contexts and media (Dillenbourg, Järvelä, and Fischer 2009), as well as learning processes (cognitive orchestration), technologies and the adaptation of activities to the classroom context (pedagogical orchestration) (Dillenbourg and Jermann 2007). Clearly, we lack a single, agreed definition of this phenomenon, and overlaps can be found between this emergent term and existing areas of educational and learning technology research, such as instructional design, classroom management or teacher facilitation. However, the new usage of the word ‘orchestration’ seems to have gathered together many of these previous research efforts, highlighting the difficulties of applying research-proposed technologies and innovations to authentic educational settings (Roschelle, Dimitriadis, and Hoppe 2013).

This broad diversity of perspectives on orchestration has also prompted efforts to review, gather and synthesise such overlapping (sometimes, opposing) views. From the teacher and teacher education perspective, Hämäläinen and Vähäsantanen (2011) provided a theoretical and pedagogical review of the orchestration of collaborative learning. Prieto et al. (2011), on the other hand, reviewed educational technology literature from the researcher perspective, clustering researcher notions of orchestration into eight different ‘aspects’, describing how orchestration takes place and the driving forces that shape such orchestration (see the next section for further details). Roschelle, Dimitriadis, and Hoppe (2013) synthesised 11 papers in a special issue on orchestration, highlighting a common theme of veering away from laboratory studies, and emphasising ‘attention to the challenges of classroom use of technology’ and ‘supporting teachers’ roles’. They also drew parallels with ‘design research for implementation’ (Penuel et al. 2011), concluding that ‘researchers from a range of TEL and CSCL [computer-supported collaborative learning] fields have much to gain by joining in the debate’.

To continue this debate, in the spirit of what Boyer (1990) called the ‘scholarship of integration’ (the value of research that does not propose entirely new knowledge, but rather summarises and integrates existing knowledge), we have taken one of these synthetic views on orchestration and submitted it for evaluation to a wide range of educational technology researchers, focusing mainly on internationally recognised experts interested in this topic. We chose the ‘5+3 Aspects’ framework (Prieto et al. 2011), as it was the only one available at the beginning of the study (Spring 2012) that was designed for use by researchers. The purpose of the evaluation was to assess the framework’s completeness and usefulness for researchers, and also to help us better understand aspects or perspectives missing from it, thereby achieving a more complete, organised view of orchestration in learning technology research.

The ‘5+3 Aspects’ conceptual framework

Prieto et al. (2011), noting that it was becoming increasingly difficult to achieve common ground about orchestration in the TEL research community, reviewed relevant literature mentioning, defining or highlighting orchestration in computer-supported learning. Their main goal was to achieve a simple yet encompassing view of what orchestration meant for learning technology researchers (rather than teachers or other stakeholders).

The authors clustered the literature about orchestration into eight aspects (see Figure 2). Five key aspects characterise orchestration, providing a descriptive view of orchestration in an educational setting:

  - Design/planning: the preparation of learning activities and resources prior to enactment.
  - Regulation/management: the (often real-time) coordination of activities, groups, time and resources during enactment.
  - Adaptation/flexibility/intervention: the modification of the original design in response to emergent events in the setting.
  - Awareness/assessment: keeping track of, and assessing, what happens in the learning scenario, in order to inform management and adaptation.
  - Roles of the teacher and other actors: who performs the different orchestration activities (e.g. teachers, students, technologies).

The other three aspects describe key factors that shape the way orchestration is done:

  - Pragmatism/practice: the contextual constraints of the setting (e.g. time, curriculum) with which orchestration has to comply.
  - Alignment/synergy: the coordination of the different resources and social levels so that they contribute to the intended learning effect.
  - Theories/models of learning: the theories and beliefs about learning that inform the way orchestration is performed.

Based on this framework, Prieto (2012) offered another definition of orchestration, as ‘the process by which teachers and other actors design, manage, adapt and assess learning activities, aligning the resources at their disposal to achieve the maximum learning effect, informed by theory while complying pragmatically with the contextual constraints of the setting’. There, it was also proposed that educational technology innovations should consider all eight aspects in order to avoid reporting gains in one aspect that could be outweighed by (otherwise unnoticed) losses in other aspects. An initial researcher instrument based on the framework was also developed: a reflective interview guide to be used when collecting data about how orchestration is performed in an authentic setting (see Prieto 2012, Appendix A).

Figure 2.  Graphical representation of the ‘5+3 Aspects’ conceptual framework.

Evaluation study

Context and methods

In order to explore the extent to which the ‘5+3 Aspects’ framework reflects the perceptions and opinions of learning technology researchers, and further delve into the notion of orchestration itself, we performed a consensus-based validation (see Moody 2005). We conducted the following two mixed-methods (Creswell et al. 2003) panel studies, involving up to 45 researchers (see Figure 3):

  1. The pilot panel study (RP1) involved 22 researchers (including Ph.D. students and postdoctoral researchers/professors) from four Spanish research labs. The purpose of the pilot panel study was to test and fine-tune the tasks and questionnaires to be used in the main expert panel (RP2, see below), and to triangulate the conclusions of the main expert panel with those of a set of researchers from a wider range of expertise and focus within learning technologies (not restricted to researchers interested in the topic of orchestration). This would give an initial indication of whether the conclusions from the orchestration expert evaluation aligned with those of the wider learning technologies community.
  2. The main panel study (RP2) was aimed at researchers who were internationally recognised experts in orchestration. The 23 researchers involved in the main panel study were recruited from workshops and symposia about orchestration (Fischer et al. 2013; Nussbaum et al. 2011), as well as technical reports and special issues on the topic (Dillenbourg et al. 2011; Roschelle, Dimitriadis, and Hoppe 2013).

Figure 3.  Conceptual structure, data sources (labels in brackets) and analysis structure of the evaluation.

Data were gathered through online questionnaires (due to the geographical distribution and limited availability of participants) including Likert-scale and open-ended questions, as well as analysis of documents provided by participants. The data analysis included quantitative (descriptive, to detect trends in the data) and qualitative analyses. The analysis was structured using an anticipatory data reduction process (Miles and Huberman 1994) around one main evaluative issue (Does ‘5+3 Aspects’ clarify the notion of orchestration and support orchestration-related research?, see Figure 3). This issue was illustrated through the exploration of three pre-defined topics: participants’ background and profile (Topic 1), to help us understand the perspective from which researcher opinions stemmed; assessment of the framework’s completeness (Topic 2); and findings about the framework’s usefulness for research purposes (Topic 3). We also defined a topic dedicated to emergent insights about orchestration, given our explicit aim to understand orchestration beyond the framework under evaluation (Topic 4). In the following subsection, we describe the main findings and supporting evidence for each of these four topics. Due to space restrictions, we have omitted the findings and evidence of the pilot study (RP1), as they largely confirmed those of the main panel study (see Figure 5). The interested reader can find further details and evidence about the pilot study in Chapter 3 of Prieto (2012).

The procedure of each researcher panel was as follows (see Figure 3): first, participants answered a profiling questionnaire about their background, experience and pre-existing notions on orchestration [RPx-Q1]; then, participants watched a short multimedia presentation about the framework (self-paced, estimated duration of 20–30 min, available online at www.goo.gl/10QFk); after that, researchers answered a questionnaire with open-ended and Likert-scale questions, assessing the usefulness and completeness of the framework [RPx-Q2]. The study also featured two additional, optional activities: an exercise using the reflective guide based on the ‘5+3 Aspects’ framework (Prieto 2012, Appendix A) to characterise the orchestration of an educational setting of the participants’ choice, after which participants submitted the resulting document for analysis [RPx-D]; and a final short questionnaire about the reflection guide and the new insights it sparked [RPx-Q3].

Results

After minor modifications to the procedure and questionnaires (based on the feedback from the pilot panel, RP1), we contacted 31 internationally recognised experts on the topic of orchestration, of whom 23 volunteered to undergo the procedure described above.

Regarding participant background and pre-existing notions (Topic 1), participants were highly experienced researchers (with an average of 17 years of research experience). The mix of backgrounds was multi-disciplinary, with a certain bias towards education (11 participants had an educational background, whereas only 6 had a technical one). Participants’ prior definitions of orchestration were very rich (see Figure 4), although the mix of concepts was for the most part similar to that of the ‘5+3 Aspects’ framework and the pilot study. Just to show two examples: ‘[…] I look at orchestration especially in terms of what can be done to support teachers in the complex process of creating the multifaceted conditions that are conducive to learning. Those conditions are of many types, including: social […], emotional […], temporal […], structure […]; cognitive […], material […]’, or ‘It is about both the learning activities […] we want students to perform (so, it includes […] scripting approaches, etc.) and the way we manage to do this in learning environments (so it involves management). It is not solely about classroom management, but it could also be virtual classroom management. It is not solely about the ad hoc activities on the spot, but also on the preparation […]’ [RP2-Q1].

Figure 4.  Word cloud visualisation from experts’ prior definitions of orchestration in RP2.

In these prior concepts of orchestration, there were also critical voices that considered the notion ‘fuzzy’ (‘everyone concerned with it seems to have a different understanding of what the term is about’ [RP2-Q1]), or doubted that it should be used in academia (‘I wonder if it’s a good idea for academics to deal with this. It seems more like a product-oriented question […]’ [RP2-Q1]). However, in general, orchestration was considered as an important, unsolved research problem: ‘[Orchestration is] a valuable concept that needs further research and clarification’, or ‘Orchestration “hides” a really complex lifecycle that […] we consider as particularly “flexible” with entwined phases’ [RP2-Q1].

When asked about the framework’s coherence and completeness (Topic 2), participants considered the framework fairly logical (avg=4.78, std=1 in a 1–6 Likert scale), although they sometimes suggested different re-organisations of the framework notions (e.g. ‘The 5+3 includes “what” and “how” aspects. Perhaps the Actors issue should refer to a “who” aspect?’, ‘[…] I also can see arguments for both separating and combining theory and alignment – they are both “background influences” on the way that a teacher creates a design, and so could be combined […] I also prefer the second diagram which collapsed management, adaptation and awareness together, as they seem more closely related to each other than, say, the design stage, which is quite different’ [RP2-Q2]). The framework was generally considered clear (avg=5.04, std=0.76 in a 1–6 Likert scale [RP2-Q2]), although some participants thought it did not distinguish orchestration from other overlapping concepts: ‘What’s missing is an attempt to differentiate it from lesson planning, scripting, authoring, formative assessment, learning design – all of which overlap with the definition’ [RP2-Q2]. The framework was deemed fairly comprehensive (avg=4.52, std=1.16 in a 1–6 Likert scale [RP2-Q2]) and, when asked about missing elements, there was little overlap among responses, with a few comments about the role of technology (e.g. ‘It dismisses computational aspects and opportunities […] The framework could be read as a general pedagogical framework’, or ‘I add the technology’ [RP2-Q2]), and also pedagogical issues (e.g. ‘Maybe […] the concepts of “reflection” or “beliefs” could be integrated somewhere’ [RP2-Q2]). Another notion that emerged in several responses about the framework’s completeness was that it ran the risk of being too comprehensive: ‘What I’m wondering now […] is what is not orchestration. In other words, almost any TEL research line might claim to be focused on one aspect of the framework or another’ [RP2-Q2]. Overall, participants characterised the framework as highly relevant to their field of research (avg=5.13, std=1.14 in a 1–6 Likert scale [RP2-Q2]).

Regarding the usefulness of the framework for orchestration-related research (Topic 3), participants’ responses about the likelihood of using the framework in their own research in the future varied considerably (avg=4.43, std=1.27 in a 1–6 Likert scale): twelve participants (52%) answered on the high end of the scale (5–6) and only one on the lower end (1–2) [RP2-Q2]. Among the framework’s main perceived affordances, experts most often mentioned its usefulness as an integrative view of this broad and fuzzy term (‘I think that the framework has integrative value and might well be used for a synthesis of past research’, or ‘The categorization helped me in getting a more clear view of this fuzzy field’ [RP2-Q2]), or as a list of issues to take into account for design or data collection (‘Yes, it may be good when developing new teaching/learning activities, in the sense that you can quickly check if you have given all the different aspects a thought’, or ‘it might be a nice framework for focusing/attuning data collection’ [RP2-Q2]). There were also participants who highlighted the potential pedagogical value of the framework, for example, for inexperienced researchers (‘[…] if I had to teach this stuff, I would present this framework to my students. It’s a nice and easy-to-understand way to introduce this perspective’ [RP2-Q2]). Among the shortcomings of the framework, participants mentioned the lack of usage examples (‘I first want to see a successful real world test that demonstrates the usefulness of this thing’ [RP2-Q2]) or the lack of clear boundaries for its notions (‘[I would not use it] as it stands – as it isn’t sufficiently precise to differentiate orchestration from related practices and theories’ [RP2-Q2]). Although the framework was presented as aimed at researchers, there were also a few comments mentioning its potential use by teachers: ‘I think I’d also separate out a teacher-oriented view (simplified) vs. an expert view (more complex, and more technical precision)’ [RP2-Q2].

From the responses of participants, we can also gather certain emergent insights about the notion of orchestration itself and its meaning for our research community (Topic 4). One is that orchestration has, at its heart, certain issues that are still hotly debated in the community, such as the role of the teacher and, in general, the role of the different actors involved in the learning scenario. The description of the framework often presented teachers as the main ‘orchestrators’. Certain participants, from different perspectives, expressed their opposition to this view:

The only thing I see is: there is an overemphasis on the teacher, thus it excludes informal learning – and other forms with social instruction.

Usually, it is not a teacher who develops a TEL scenario, but rather a programmer or a curriculum designer who does so. Only then, this scenario is taken up by a practitioner and adapted to her current needs.

My experience in scaling up TEL is that what teachers want is very nicely designed materials that are easy to use and work [with]. I find very few teachers want to and are good at designing, adapting, aligning. [RP2-Q2]

This suggests that the teacher need not always be the sole orchestrator; who takes on this role may depend on contextual factors such as the research focus or the local teacher culture.

Another emergent insight is that a few of the experts did not agree with the very idea of such a framework (e.g. ‘Adopting such a framework imposes structure that we will most likely want to avoid, given the nascent aspect of this domain. […] Multiplexed meanings and applications aren’t a bad thing. I get to talk about it as being “whatever I am treating it as” […] but it will help many in the field as an overview’ [RP2-Q2]), or the notion of orchestration as a whole: ‘Orchestration as a patch on badly designed […] tools isn’t going to work. Now let’s suppose we have very nicely designed, pretty simple, highly usable tools that teachers like. I am not sure why we need orchestration on top of that. So either it fixes something that is too badly broken to be fixed or it adds little value to something that is already working quite nicely […]’ [RP2-Q2].

Figure 5 summarises graphically the main conclusions of the two panel studies: both collectives of researchers (orchestration experts in RP2, and a wider sample of expertise and focus in RP1) had similar initial conceptualisations of orchestration. Both also rated the framework positively in terms of its completeness, and saw its value as a holistic overview or as a checklist for researchers in the field. Some of the international experts, however, were critical of the framework and of the notion of orchestration itself, and illustrated on-going debates related to orchestration, such as the role of the teacher and other actors in orchestration.

Figure 5.  Summary of findings from the framework’s evaluation.

The case of the mobile classroom: an example of the use of the framework

As noted by some of the panels’ participants, one element missing from the conceptual framework was a set of usage examples of the ‘5+3 Aspects’ framework in actual learning technology research. A number of such examples have been published since (Gutiérrez-Rojas, Crespo-García, and Delgado-Kloos 2012; Muñoz-Cristóbal et al. 2015; Prieto et al. 2014a, 2014c; Prieto, Dimitriadis, & Asensio-Pérez 2014b), mainly to analyse the benefits of a certain learning technology innovation, or to structure the evaluation of such innovations. Below, we briefly describe the first example of research using the framework by researchers unrelated to the original authors of the ‘5+3 Aspects’ framework, on the topic of mobile classrooms.

Looi and Toh (2013) describe a longitudinal research effort in a Singapore primary school, aimed at aiding teachers in orchestrating science learning activities using mobile technologies, including actions outside the classroom and parental participation in student-directed activities. The study followed an iterative design-based research process (Van den Akker et al. 2006), from which a conceptual framework emerged to enable flexible learning in mobile learning classrooms. This framework was composed of three main elements, organised temporally: learning design, lesson enactment and knowledge dissemination, each with multiple sub-elements, which included and restructured those encountered in the ‘5+3 Aspects’ framework (see Figure 6). Looi and Toh (2013) describe the different aspects, structuring evaluation evidence and lessons learned around this modified set of aspects.

Figure 6.  Modified orchestration conceptual framework, used by Looi and Toh (2013).

Looi and Toh (2013) were interviewed to elicit further reflections on their use of the framework. For this research group, orchestration is mainly about helping teachers manage classroom activities, within the multiple constraints of their schools. They initially used the framework as a post-hoc analytical tool to structure the findings of their iterative, design-based research process. In subsequent cycles of the research process in the school, over a period of five years, the framework served as a guide for researchers to provide support for the teachers who enacted the technology-enabled lessons in the classroom.

Although they see the ‘5+3 Aspects’ framework (in its original form) as mostly complete, their research follows a systemic approach to educational innovation, which led them to add a new aspect (institutional support) and to remove the theory aspect (considering it a ‘meta’ aspect). Furthermore, they narrowed the definition of the synergy aspect, transforming it into synergy between culture and action (i.e. the interrelationships between classroom culture and what the teacher would do to orchestrate the classroom). These adaptations to the framework were made to further clarify orchestration aspects (and overlaps between aspects). The resulting list of orchestration issues was ‘very useful as beacons to remind us what needs to be considered in the teachers’ orchestration’. Looi and Toh (2013) also highlighted that further examples were needed to make the framework truly useful, ‘adding guidelines for the operationalisation of its use, towards something more prescriptive’.

This example illustrates the usefulness of the framework as a checklist to assess the completeness of the support given to teachers by the system or intervention under study, and to help researchers understand the labour of teachers. Looi and Toh (2013) also considered the possibility of using this framework (in a simplified form) to support teachers directly (as opposed to the researcher focus described thus far).

Synthesis: towards a new conceptual framework

Our consensus-based evaluation of an existing conceptual framework for research on orchestration (‘5+3 Aspects’) indicates that the framework is considered complete, clear and useful as an integrative list of issues to consider and evaluate when performing learning technology interventions in authentic settings. However, participants in the international experts’ panel found limitations and missing elements, as well as a need to re-organise the framework (a finding further supported by the way the framework was appropriated by other research groups).

Using this feedback, we propose a more complete, reorganised conceptual framework, which separates more clearly the different categories of elements that play a role in orchestration: activities that orchestration entails, actors that perform these activities and background that shapes the way orchestration is performed (see Figure 7). These different planes can then be aligned with the intention of achieving the desired learning effect.

Figure 7.  Revised conceptual framework for orchestration in learning technology research.

Using the experts’ feedback and this evolved framework, we also provide another synthetic definition of orchestration in learning technology research: ‘the process of designing and managing in real-time (including awareness and adaptation mechanisms) the learning processes in an authentic computer-supported learning scenario’. The responsibilities in this process are shared among a number of actors, depending on the context (teachers, students, researchers or technologies), who aim to pragmatically align the context’s background elements (constraints, resources) towards a satisficing effect, shaped by their mental models, their theories and beliefs.

It is worth noting that both the original ‘5+3 Aspects’ framework and this revised version are presented mainly as descriptive of the orchestration phenomenon: they may serve to describe in a detailed manner how orchestration of learning is performed in a concrete context, or the kind of orchestration that a research effort tries to achieve. As shown by the usage by different research groups, these frameworks can also be used to structure the evaluation of an intervention aimed at better orchestration of learning.

However, as designers of learning technologies, we also need prescriptive guidelines to help us do our job. The conceptual frameworks presented in this paper can also provide the seed of ‘design principles’ (Kali, Levin-Peled, and Dori 2009), in the form ‘design for… (aspect)’. Therefore, learning technologies should be ‘designed for adaptation’ (making technology flexible to improvised lesson changes), ‘designed for awareness’ (making student progress more visible and easy to model), and even ‘designed for design’ (empowering teachers to design the technology-supported learning activities themselves). Another prescriptive outcome stemming from this kind of holistic framework is to ‘consider (or improve) as many orchestration aspects as possible’ (and to check that an intervention does not impact negatively on the aspects that the innovation does not enhance). This kind of prescriptive advice supports and expands other guidelines slowly coming to light as researchers investigate this phenomenon (see, e.g. Cuendet et al. 2013).

Aside from this tension between descriptive and prescriptive frameworks on orchestration, Roschelle, Dimitriadis, and Hoppe (2013) also note the unresolved debate about the role that computing technology should have in orchestration (also echoed in our evaluation): should we treat technology as a separate entity, or should we consider how it can be infused in the orchestration of the setting, enhancing certain aspects of it? Should we design easy-to-use, minimal orchestrable technologies or feature-rich technologies for orchestration of other technologies (see Tchounikine 2013)? Among these ongoing debates, we can also find another of our evaluation’s emergent insights: what is the right balance of roles between teachers and students (e.g. students may also play a part in the orchestration process through planned interactions), or between teachers and researchers/designers (e.g. when to design and how much improvisation to enable during enactment)? Maybe the answer is not unique and, like orchestration, it is shaped by the pragmatic restrictions of each context.

Conclusion

We have presented the notion of ‘orchestrating learning’ as an ill-defined research phenomenon dealing with the complexity of applying educational technology innovations in authentic settings. From the presentation of a holistic conceptual framework on orchestration research, its use by multiple independent research groups and its evaluation by international experts on the topic, a revised framework and definition of orchestration research have emerged.

Certainly, the evaluation and its outcomes are not without limitations: the participant researchers in our studies, while including most of the internationally recognised experts on the topic, are not necessarily representative of our research community, and the focus on such a new ‘buzzword’ might have excluded experts from overlapping sub-fields. The inclusion of a wider range of researchers in the pilot study, and the alignment of findings between the two studies, however, mitigate this weakness. The fact that teachers and other stakeholders were not included in the evaluation limits the claims about the value that such a framework may have for the wider educational community. Finally, the novelty of the conceptual framework makes it difficult to assess its long-term value, although the growing number of researchers using it is promising.

The results and limitations of this work also suggest several paths for future research. As hinted at by several panel participants, and as illustrated by the mobile classroom example above, the ‘5+3 Aspects’ framework (or a modified version of it) could be used with teachers, for example, in teacher professional development. In the previous section, we mentioned the need for further prescriptive guidelines and design principles to complement the descriptive power of these conceptual frameworks. Furthermore, the critique by some participants of the notion of orchestration as ‘fuzzy’ suggests that techniques and methods to measure orchestration and its aspects more concretely could also be invaluable to accumulate design knowledge and enable meaningful comparisons (see Prieto et al. 2015, for an initial step in this direction).

The ‘5+3 Aspects’ framework and its newest reincarnation follow the tradition of research studies in the educational technology field in which several elements of learning are integrated into a coherent framework (see, e.g. Goodyear and Retalis 2010, or Phillips, McNaught, and Kennedy 2011). These orchestration frameworks resonate with a design-based approach that seeks continuous improvement of orchestration processes: by researching and understanding each of its essential aspects, through iterative cycles of research and implementation, we can better theorise and provide support for the pre-design and enactment of classroom processes. We hope the same iterative approach will be applied to these frameworks and their derived research instruments in the future, not only by us, but also by the rest of the learning technology research community.

Acknowledgements

This research has been partially funded by the European Marie Curie IEF project MIOCTI (FP7-PEOPLE-2012-IEF project no. 327384), Spanish Research Project TIN2011-28308-C03-02 and the European Project METIS (531262-LLP-2012-ES-KA3-KA3MP). The authors thank all participants in the researcher panels for their time and their insightful responses.

References

Boyer, E. L. (1990) Scholarship Reconsidered: Priorities of the Professoriate, Princeton University Press, Princeton, NJ.

Chan, C. K. K. (2011) ‘Bridging research and practice: implementing and sustaining knowledge building in Hong Kong classrooms’, Journal of Computer-Supported Collaborative Learning, vol. 6, no. 2, pp. 147–186.

Creswell, J. W., et al., (2003) ‘Advanced mixed methods research designs’, in Handbook of Mixed Methods in Social and Behavioral Research, eds A. Tashakkori & C. Teddlie, Sage Publications, Thousand Oaks, CA, pp. 209–240.

Cuendet, S., et al., (2013) ‘Designing augmented reality for the classroom’, Computers & Education, vol. 68, pp. 557–569.

Dillenbourg, P., Järvelä, S. & Fischer, F. (2009) ‘The evolution of research in computer-supported collaborative learning: from design to orchestration’, in Technology-Enhanced Learning: Principles and Products, eds N. Balacheff et al., Springer, Dordrecht, pp. 3–19.

Dillenbourg, P. & Jermann, P. (2007) ‘Designing integrative scripts’, in Scripting Computer-Supported Collaborative Learning: Cognitive, Computational and Educational Perspectives, eds F. Fischer et al., Computer-supported collaborative learning series, Springer, New York, NY, pp. 275–301.

Dillenbourg, P., et al., (2011) ‘Trends in orchestration: second research and technology scouting report’, STELLAR Deliverable D1.5, Available at: http://www.stellarnet.eu/repository/deliverable_repository_list/

Fischer, F., et al., (2013) ‘Scripting and orchestration: recent theoretical advances’, Proceedings of the International Conference of Computer-Supported Collaborative Learning (CSCL2013), Madison, WI, pp. 564–571.

Goodyear, P. & Retalis, S. (2010) ‘Learning, technology and design’, in Technology-Enhanced Learning: Design Patterns and Pattern Languages (Vol. 2), eds P. Goodyear & S. Retalis, Sense Publishers, Rotterdam, pp. 1–28.

Gutiérrez-Rojas, I., Crespo-García, R. M. & Delgado-Kloos, C. (2012), ‘Enhancing orchestration of lab sessions by means of awareness mechanisms’, Proceedings of the 7th European Conference of Technology Enhanced Learning (EC-TEL 2012), Saarbrucken, Germany, pp. 113–125.

Hämäläinen, R. & Vähäsantanen, K. (2011) ‘Theoretical and pedagogical perspectives on orchestrating creativity and collaborative learning’, Educational Research Review, vol. 6, no. 3, pp. 169–184.

Kali, Y., Levin-Peled, R. & Dori, Y. J. (2009) ‘The role of design-principles in designing courses that promote collaborative learning in higher-education’, Computers in Human Behavior, vol. 25, no. 5, pp. 1067–1078.

Looi, C. K. & Toh, Y. (2013) ‘Orchestrating the flexible mobile learning classroom’, in Increasing Access through Mobile Learning, eds M. Ally & A. Tsinakos, Commonwealth of Learning, Vancouver, pp. 161–174.

Merriam-Webster Dictionary. (2014) Merriam-Webster Online, Available at: http://www.merriam-webster.com/

Miles, M. B. & Huberman, A. M. (1994) Qualitative Data Analysis: An Expanded Sourcebook, Sage Publications, Thousand Oaks, CA.

Moody, D. L. (2005) ‘Theoretical and practical issues in evaluating the quality of conceptual models: current state and future directions’, Data & Knowledge Engineering, vol. 55, pp. 243–276.

Muñoz-Cristóbal, J. A., et al., (2015) ‘Supporting teacher orchestration in ubiquitous learning environments: a study in primary education’, IEEE Transactions on Learning Technologies, vol. 8, no. 1, pp. 83–97.

Nussbaum, M., et al., (2011) ‘How to integrate CSCL in classroom life: orchestration’, Proceedings of the 9th International Conference on Computer-Supported Collaborative Learning (CSCL 2011), Hong Kong, China, p. 1199.

Penuel, W. R., et al., (2011) ‘Organizing research and development at the intersection of learning, implementation, and design’, Educational Researcher, vol. 40, no. 7, pp. 331–337.

Phillips, R. A., McNaught, C. & Kennedy, G. (2011) Evaluating e-Learning: Guiding Research and Practice, Routledge, London.

Prieto, L. P. (2012) Supporting Orchestration of Blended CSCL Scenarios in Distributed Learning Environments, Ph.D. Thesis, School of Telecommunications Engineering, University of Valladolid, Spain.

Prieto, L. P., et al., (2014a) ‘Supporting orchestration of CSCL scenarios in web-based Distributed Learning Environments’, Computers & Education, vol. 73, pp. 9–25.

Prieto, L. P., Dimitriadis, Y. & Asensio-Pérez, J. I. (2014b) ‘Orchestrating evaluation of complex educational technologies: a case study of a CSCL system’, Qualitative Research in Education, vol. 3, no. 2, pp. 175–205.

Prieto, L. P., et al., (2011) ‘Orchestrating technology enhanced learning: a literature review and a conceptual framework’, International Journal of Technology-Enhanced Learning, vol. 3, no. 6, pp. 583–598.

Prieto, L. P., et al., (2015) ‘The burden of facilitating collaboration: towards estimation of teacher orchestration load using eye-tracking measures’ in Proceedings of the 11th International Conference on Computer-Supported Collaborative Learning (CSCL 2015), Gothenburg, Sweden, pp. 212–219.

Prieto, L. P., et al., (2014c) ‘Review of augmented paper systems in education: an orchestration perspective’, Educational Technology & Society, vol. 17, no. 4, pp. 169–185.

Roschelle, J., Dimitriadis, Y. & Hoppe, U. (2013) ‘Classroom orchestration: synthesis’, Computers & Education, vol. 69, pp. 523–526.

Sutherland, R. & Joubert, M. (2009) ‘The STELLAR vision and strategy statement’, STELLAR Deliverable D1.1, Available at: http://www.stellarnet.eu/repository/deliverable_repository_list/

Tchounikine, P. (2013) ‘Clarifying design for orchestration: orchestration and orchestrable technology, scripting and conducting’, Computers & Education, vol. 69, pp. 500–503.

Van den Akker, J., et al., (2006) ‘Introducing educational design research’, in Educational Design Research, eds J. Van den Akker, J. Gravemeijer, S. McKenney & N. Nieveen, Routledge, London, pp. 3–7.