ORIGINAL RESEARCH ARTICLE

Open educational resources for research training: quality assurance through a collaborative evaluation

Victoria I. Marín (a)*, Martha Lucía Orellana (b) and Nancy Peré (c)

(a) Center for Open Education Research (COER), Faculty of Education and Social Sciences, Carl von Ossietzky Universität Oldenburg, Oldenburg, Germany;

(b) UNAB Creative, Universidad Autónoma de Bucaramanga (UNAB), Bucaramanga, Colombia;

(c) Academic Unit, Universidad de la República, Montevideo, Uruguay

(Received: 18 May 2019; Revised: 4 October 2019; Accepted: 21 October 2019; Published: 18 November 2019)

Abstract

Although open educational resources are considered to offer vast pedagogical opportunities for any educational context, only a few studies so far have examined their use or application in the field of research training, and even fewer have addressed their quality assurance for that context. As part of an inter-institutional project, this article reports the collaborative selection and evaluation of appropriate educational resources for research training. The mixed-methods approach includes an analysis of the needs of researchers in training through questionnaires and interviews. This analysis was the starting point for the collaborative evaluation of educational resources using agreed common criteria derived from the Learning Object Review Instrument (LORI). The article offers recommendations regarding the collaborative evaluation of educational resources and the use of LORI, as well as suggestions for creators of educational resources for research training to facilitate the quality assurance of their materials. A website is being developed to bring together the resources that met the quality criteria established in the collaborative evaluation.

Keywords: evaluation criteria; higher education; distance education

*Corresponding author. Email: victoria.marin@uni-oldenburg.de

Research in Learning Technology 2019. © 2019 V. I. Marín et al. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.

Citation: Research in Learning Technology 2019, 27: 2271 - http://dx.doi.org/10.25304/rlt.v27.2271

Introduction

Students who undertake research tasks, whether for a master's or a doctoral thesis, often find deficiencies in their training that make the research process more difficult (Wang and Li 2011). Research supervisors also report this situation. While master's and doctoral students receive the support of their research supervisors during research training, a balance between the supervisor's accompaniment and the development of autonomy as a researcher is crucial. In addition, common issues regarding the thesis arise among researchers during training. As a result, researchers explore the support possibilities available on the Internet. Examples of this type of support are online communities, such as #PhDChat on Twitter (Ford, Veletsianos, and Resta 2014), or specific communities such as CoVIF, a learning community for researchers in training in the field of educational technology (Moreno and Salinas 2011).

The use of Open Educational Resources (OER) as a possible source of content has not yet been exploited to address these shortcomings or the need to deepen knowledge of certain aspects of research training. The Open and mobile educational resources for educational researchers training project, carried out by several universities in Mexico, focused on the creation and use of an OER repository for mobile learning by educational researchers during training; it resulted in the production of 37 mobile OER and the development of a digital repository to host them (Ramírez and Burgos 2012). However, no study has focused on identifying existing OER and making them available to researchers during training in any field. Such a study would need to consider that OER are not always easy to locate or retrieve (Atenas, Havemann, and Priego 2014) and that the opportunities discussed around OER, especially those related to universal access to education, have not been realised (Caswell et al. 2008). A study locating OER also needs to consider the discussions around the quality of these resources and how to evaluate that quality (Camilleri, Ehlers, and Pawlowski 2014).

This research aims to select, collaboratively evaluate and make available to researchers during training digital educational resources, especially, but not exclusively, open ones. For this purpose, both the needs expressed by researchers during training and the existing criteria and instruments for the evaluation of educational resources are considered. This effort is part of an Ibero-American inter-institutional project (Investiga+) led by the Autonomous University of Bucaramanga (UNAB, Colombia), which is oriented towards the strengthening of postgraduate research (Orellana et al. 2016). Given the growing number of master's and doctoral students worldwide (Association of Universities and Colleges of Canada (AUCC) 2011; Ministerio de Educación Nacional (MEN) 2013; Snyder and Dillow 2013), Investiga+ seeks to contribute to supervisors' training and practice. In this article we focus on researchers in training, although research supervisors could also benefit from the results of this work.

Framework

Training to be a researcher

According to Wisker (2012), research as a form of learning that values the creation and questioning of knowledge has become a central component of curricula throughout the world; the number of postgraduate and undergraduate students has increased, as has the number of research supervisors who are expected to encourage, support and train students to develop competencies, values and practices essential for research work.

Taylor, Kiley, and Humphrey (2017) attribute the following responsibilities to effective supervisors in relation to the expected achievements of researchers in training: enabling the student to initiate and plan a research project, to acquire the research competencies necessary to carry out research, to have adequate access to resources, and to develop creative, critical and analytical skills. However, supervisors do not have to assume these responsibilities alone: they should provide guidance and academic support, while the institution is expected to facilitate the structure, policies, conditions and resources that enable supervision (Orellana 2016).

Remote supervision, conducted completely or partially through online communication, adds to the challenges of supervision. Despite these challenges, remote supervision also offers advantages and opportunities. According to Sussex (2008), the very fact of using different forms of communication and information exchange increases the quality of supervision by making it richer and more flexible than supervision without a combination of media. These advantages and opportunities can also be harnessed in face-to-face supervision supported by information and communication technology (ICT).

ICT also supports one of the main purposes of research training: achieving student autonomy as a researcher. According to Adham et al. (2018), the training of students as researchers is enhanced by their interactions within professional and social networks, self-study and self-reflection. In the same vein, Caplan and Graham (2008) suggest viewing the web not only as a tool but also as an aid in the creation of learning environments that promote active, student-centred learning and support the development of critical and high-level thinking skills.

Within the frame of the Investiga+ project, the current work promotes the creation of a website that links digital educational resources for research training, especially open resources. The website is intended to facilitate autonomous learning, communication and interaction in cross-disciplinary and multicultural environments, especially in online education, although the resources could also be used in face-to-face and blended learning contexts.

Digital educational resources: learning objects (LO) and OERs

According to Fernández-Pampillón, Dominguez, and de Armas (2013), digital educational content or resources are

digital resources used in the teaching-learning process of the courses taught by teachers or the collection of resources that a teacher or a student uses to pursue a course: a lesson plan, a calendar, the teaching guide, a proposal of activities, tutorials, ... [...] When the digital educational resource is created with the objective to be scalable, reusable, interoperable and accessible, it is considered to be a learning object. (Fernández-Pampillón, Dominguez, and de Armas 2013, p. 14)

Learning objects, as educational resources, are minimum units of information in multiple formats with an interactive nature, self-contained in the context of learning and oriented to a single objective through micro-learning units that include content, resources, activities and evaluation (Del Moral and Cernea 2005). They also present features of accessibility (identifiable through metadata), interoperability (technical compatibility of the resource among different platforms and devices), reusability in other contexts, adaptability to situations and needs, and durability in the face of technological changes (Del Moral and Cernea 2005). According to Camilleri, Ehlers and Pawlowski (2014, p. 7), LOs are ‘materials used to support learning (that) can be broken down into (or constructed from), a number of elements which can be combined differently and reused in various scenarios’.

When LOs, as minimal educational resources, are licensed under an open licence, they can be considered OER. The term OER was coined by UNESCO (2002) and defined as ‘teaching, learning, and research resources that reside in the public domain or have been released under an intellectual property license that allows their free use or re-purposing by others’ (Atkins, Brown, and Hammond 2007, p. 4). OER, as well as digital educational resources in general, may include complete courses, course materials, modules or units, syllabi, books, research articles, videos, evaluation tools, interactive materials, databases, software, applications and any other useful educational material (UNESCO and Commonwealth of Learning (COL) 2011). It is widely agreed that OER can offer great educational possibilities, especially for developing countries (Kawachi 2014), but their potential in distance education and in research training, through the reuse and sharing of digital educational resources, can also be considered (Ramírez and Burgos 2012).

OER are openly available, often accessible through educational or institutional repositories, but not always easy to find or retrieve (Atenas, Havemann, and Priego 2014). Likewise, just as the use of OER is open to anyone, so is the creation of new OER; the pedagogical, technical and content quality can therefore vary, and educators might have problems judging the quality and relevance of OER (Hylén 2006). Compared with commercially published resources, which go through traditional peer review, licensing and publication processes (Wiley 2013), doubts remain about OER quality, which can be managed through different (centralised/decentralised, open/closed) processes (Hylén 2006).

As a main part of our study, it is key to review what the literature proposes in relation to the quality of digital educational resources, primarily OER.

Ensuring the quality of digital educational resources

Quality is one of the most discussed topics in the field of digital educational resources (especially OER), and also one of the reasons suggested for why these resources are scarcely used (Camilleri, Ehlers, and Pawlowski 2014). The same authors also emphasise that the potential of OER to transform educational practice has not yet been exploited and that innovative ways of creating and evaluating OER are needed, as well as a growing empirical evidence base on OER effectiveness.

In order to assess and evaluate OER and other educational resources, Camilleri, Ehlers, and Pawlowski (2014) propose a social qualification approach in which the evaluators are the users and the results of the evaluation are made available to other users of the resources. According to McGill (2011), the quality of educational resources is usually determined by their accuracy, the reputation of their author or originating institution, their technical production standards, their accessibility and their suitability for a specific purpose.

On the other hand, there are different approaches to the concept of quality, which include generic perspectives of quality related to quality management or quality assurance without taking into account the context of use; specific quality perspectives for learning, education and training (e.g. standards and guidelines to ensure quality in the European higher education area or the UNIQUe criteria for excellence in technology-enhanced learning); and specific instruments for quality assessment (e.g. recommendation systems or peer reviews; Camilleri, Ehlers, and Pawlowski 2014).

These instruments, which include ratings, rubrics and frameworks (MERLOT, the Learning Object Evaluation Instrument and the Rubric to Evaluate Learner Generated Content, among others), facilitate the evaluation and selection of digital educational resources. Yuan and Recker (2015) and Zawacki-Richter and Mayrberger (2017) carried out detailed analyses of these instruments, including a comparison and evaluation of their elements.

Both studies point to the Learning Object Review Instrument (LORI) (Nesbit, Belfer, and Leacock 2007) as one of the most easily accessible evaluation instruments. LORI has also been revised several times, and there are examples of empirical results of its implementation (Yuan and Recker 2015; Zawacki-Richter and Mayrberger 2017). Based on the evidence from these studies, LORI appears to be one of the most valid instruments for evaluating LOs and also the most inclusive in terms of the quality criteria or dimensions described in Zawacki-Richter and Mayrberger (2017) (technical, pedagogical and intellectual property rights) and in the Spanish Standard UNE 71362 on quality criteria for digital educational materials (Fernández-Pampillón Cesteros et al. 2017) (pedagogical effectiveness, technological effectiveness and effectiveness regarding accessibility). LORI is designed for the individual or collaborative evaluation of multimedia LOs, although it can also be used generically for LOs in other formats.

LORI comprises nine items, which are rated when evaluating an educational resource and can be mapped to the quality criteria and dimensions mentioned above (Table 1).

 

Table 1. Quality items of the Learning Object Review Instrument (LORI) according to the quality criteria and dimensions.

Quality criteria (Zawacki-Richter and Mayrberger 2017) | Quality dimensions (Fernández-Pampillón Cesteros et al. 2017) | LORI items (Nesbit, Belfer, and Leacock 2007)
Technical (usability, accessibility and reusability) | Technological effectiveness | Presentation design; Reusability; Standards compliance
Technical (usability, accessibility and reusability) | Effectiveness regarding accessibility | Accessibility; Interaction usability
Pedagogical (content, learning design and evaluation) | Pedagogical effectiveness | Content quality; Learning goal alignment; Motivation; Feedback and adaptation
Intellectual property rights (license) | - | -

Each item is rated on a scale of 1 to 5 stars, where 1 is the lowest score and 5 the highest. LORI specifies examples of what assigning 1–5 stars to each item means.
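To make the structure of a LORI evaluation concrete, the following minimal Python sketch (illustrative only; it does not reproduce LOEP's actual data model, and the example ratings are hypothetical) represents the nine items and checks that an evaluator's ratings stay within the 1-5 star scale:

# Illustrative sketch of a LORI rating; item names follow
# Nesbit, Belfer, and Leacock (2007).
LORI_ITEMS = (
    "Content quality",
    "Learning goal alignment",
    "Feedback and adaptation",
    "Motivation",
    "Presentation design",
    "Interaction usability",
    "Accessibility",
    "Reusability",
    "Standards compliance",
)

def validate_rating(rating):
    """Check that a rating covers only known items, each with 1-5 stars."""
    for item, stars in rating.items():
        if item not in LORI_ITEMS:
            raise ValueError(f"Unknown LORI item: {item}")
        if not 1 <= stars <= 5:
            raise ValueError(f"{item}: rating must be between 1 and 5 stars")

# One evaluator's (hypothetical) partial rating of a resource:
validate_rating({"Content quality": 4, "Motivation": 3, "Accessibility": 5})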

In an effort to create an integrated platform for LO evaluation, LORI and other evaluation instruments (e.g. UNE 71362 or the Learning Object Evaluation Metric) have been included in the Learning Object Evaluation Platform (LOEP). This platform facilitates collaboration in the evaluation of educational resources and generates automatic scores derived from that joint evaluation (Gordillo, Barra, and Quemada 2015). LOEP is an important tool for the current study, since it provides an organised system for allocating educational resources for assessment and produces summary representations of their quality according to different instruments. For these reasons, LOEP and the integrated LORI were used in this study for the collaborative evaluation of educational resources, ensuring quality criteria both in the process and in its outcome.

Methodology

Research objectives

As a major purpose of the Investiga+ project, a range of digital educational resources (existing and newly created) is expected to be made available to researchers during training. These resources include content related to the design and development of a research project.

The objectives of this study contribute to the above-mentioned major purpose of the Investiga+ project and are specified as follows:

  1. To identify the relevant elements of structures and topics of digital educational resources for research training derived from the needs of researchers in training and the related literature.
  2. To collaboratively evaluate and select existing digital educational resources according to the established quality parameters suitable for research training.

Accordingly, the research questions are as follows:

  1. Which elements of structures and topics of educational resources are relevant for researchers during training?
  2. How appropriate, in terms of their quality, are the existing educational resources to be used in research training?

Phases of the study

The study was divided into consecutive phases (see Figure 1), starting with the literature review and data collection regarding elements of structures and topics of digital educational resources, carried out through questionnaires and interviews with researchers in training. After this initial identification of the needs of researchers in training regarding educational resources, a pilot phase to test the collaborative evaluation was completed using the LORI instrument. This allowed us to define elements of quality for an educational resource in the context of research training and, consequently, to generate consensus around the use of LORI for the collaborative evaluation that took place thereafter. These quality criteria are also the basis for defining an appropriate techno-pedagogical structure for the educational resources to be created in a later stage of the project (future work).

Figure 1. Phases of the study.

The study was approved by the university’s Institutional Committee of Ethics in Research (code 062).

Instruments

As part of a larger study, a mixed methodology was used in the initial analysis: online questionnaires were administered to research students in master’s and doctoral programmes at Ibero-American universities, and online interviews were conducted with research students who volunteered to be contacted through the online questionnaire. The questionnaire was validated by experts through the International Research Panel in Educational Technology (http://www.edutec.es/panel).

The questionnaire included many items, but only those related to the specific objectives of this study are considered here: one item on the preferred types of digital educational resources and one on the preferred topics for research training. In both items, the research students were given the opportunity to include supplementary options. In the interviews, the researchers during training delved into the characteristics of digital resources considered to support research and explained their reasons for choosing specific topics of digital resources for research training.

The next phase of the study included the use of LOEP by project researchers to evaluate an educational resource, followed by a discussion on the suitability of each item in the context of research training and of the types of LO found. Based on the agreements for the use of LORI, the project researchers proceeded to the collaborative, quantitative evaluation of educational resources. These resources had been previously selected from 34 educational and institutional repositories, catalogues and Massive Open Online Courses (MOOCs), mostly in Spanish, in line with the target group.

Results and discussion

As indicated above, several instruments were used in this study, producing sets of results in each of the study phases. In this section we present these results in two parts: (1) a summary of the analysis of the needs of researchers during training, and (2) the work carried out by project researchers on the revision of LORI and the evaluation of educational resources through LOEP.

Analysis of the needs of researchers in training

A total of 142 postgraduate students from 15 Ibero-American countries participated in the questionnaire aimed at discovering how to strengthen the master’s and doctoral programmes at Ibero-American universities. The majority (60.57%) of participants came from Colombian universities.

When researchers during training were asked about their preferences regarding the type of educational resources, they ranked them as follows: first, virtual courses hosted on a platform allowing student–content and student–student interaction; second, self-contained courses with self-evaluation options and activities, with the option of student–content interaction; and third, interactive resources in multimedia format.

The interviews were conducted with a small sample of research students (n = 6), who identified the following as important characteristics of digital resources for research training:

  1. Openness. Digital resources should be fully open, in digital format and accessible to the entire community.
  2. Up-to-date resources. Obsolete tools should be avoided.
  3. Availability on a single platform. It is important to be able to access the resources whenever wanted or needed, not only through specific courses.
  4. Quality material, with relevant content.
  5. A clear and broad structure, covering the important aspects of the content.
  6. Accessibility through different devices. Universal design should also be considered.
  7. Format of LOs and training capsules.
  8. Dynamic assistance and interaction support.

Once the project team had reviewed and discussed these results, the outcomes were refined taking into account the literature and the specificities of the target group, yielding a consolidated set of characteristics that digital resources selected (and eventually created) to support research training must encompass.

Regarding the topics of digital resources, the analysis of the questionnaire and interview responses led to seven topics aimed at researchers in training. Codes, needed for the subsequent labelling of resources, were assigned to them: (1) research methodology and structure of research projects (MET), (2) tools for information management, reference managers and citation styles (GEST), (3) ethical aspects (ETI), (4) scientific writing and the scientific publication process (ESC), (5) preparation for the research process (PREP), (6) examples of research practices in disciplinary fields (EJE) and (7) use of knowledge management tools (CON).

Collaborative evaluation of educational resources

As mentioned above, the researchers also identified quality elements for the evaluation of educational resources suitable for research training, which also provided clues for the techno-pedagogical structure to be proposed for the future creation of new resources.

With that purpose in mind, several project researchers made use of LORI to collaboratively evaluate an educational resource that had been previously selected from a list of repositories, catalogues and MOOCs.

This initial evaluation resulted in a series of agreements regarding the use of LORI in the next evaluations of educational resources, which would contribute to a homogeneous evaluation among the project researchers involved in this process.

Agreements for the use of LORI

In a pilot phase to test the evaluation of selected educational resources, the evaluated resource was a video from a MOOC on aspects of research methodology.

The result of the collaborative evaluation conducted by four project researchers is shown in Figure 2. Considering that each item could receive a score between 1 and 5, the evaluation was generally positive, with maximum scores for standards compliance, accessibility, interaction usability and presentation design (on the left side of Figure 2, from top to bottom). The content quality and reusability items (at the top and on the left side of the figure) also obtained high, although not maximum, scores (between 4 and 5). On the right side of the figure, from top to bottom, learning goal alignment, feedback and adaptation, and motivation obtained average scores. In addition to the ratings, each project researcher included notes in the open text field, which were discussed later in an online meeting.

Figure 2. Graph derived from the collaborative evaluation of educational resource using LORI in LOEP.

It was agreed that most items would be maintained. However, several considerations were established for the use of LORI in subsequent evaluations, partly due to the type of resource most frequently found (videos from MOOCs):

Regarding the item ‘Standards compliance’, it was difficult to identify whether standards had been followed in most of the selected resources, and this item was therefore not used in the evaluation. It was agreed that the item should be considered in the subsequent creation of new resources, where it would cover information on three aspects: (1) technical, with information about the requirements and technical characteristics of the content; (2) educational, with descriptions of the pedagogical and educational characteristics of the content; and (3) rights, with information about intellectual property rights and the terms and conditions of use of the content.

Results of the evaluation of educational resources

The search for educational resources in 34 repositories, MOOCs and catalogues, guided by the ratings discussed above, resulted in a final selection of one to five resources for each of the seven previously identified topics for research training, although in some exceptional cases this limit was exceeded.

Thirty-two educational resources (mostly, but not exclusively, videos) were labelled with a code and a number according to their topic and included in LOEP. Each resource was always evaluated by two project researchers who had not selected it, so that a double evaluation was ensured for every resource.
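As an illustration of this allocation constraint, the following Python sketch (our own illustration, not LOEP functionality; the evaluator names and example values are hypothetical) assigns two evaluators to each resource while avoiding the researcher who selected it:

# Hypothetical round-robin assignment ensuring a double evaluation with
# no self-review; assumes more than n_reviews + 1 available evaluators.
from itertools import cycle

def assign_evaluators(resources, selected_by, evaluators, n_reviews=2):
    """Return {resource: [evaluators]} with n_reviews evaluators per
    resource, none of whom selected that resource."""
    assignment = {}
    pool = cycle(evaluators)
    for resource in resources:
        chosen = []
        while len(chosen) < n_reviews:
            candidate = next(pool)
            if candidate != selected_by[resource] and candidate not in chosen:
                chosen.append(candidate)
        assignment[resource] = chosen
    return assignment

# Hypothetical resources (labelled by topic code), their selectors and a team:
resources = ["MET1", "GEST2", "ETI1"]
selected_by = {"MET1": "R1", "GEST2": "R2", "ETI1": "R1"}
print(assign_evaluators(resources, selected_by, ["R1", "R2", "R3", "R4"]))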

The evaluation team agreed on a cut-off score of 7.5 for the collaborative evaluation; this score was used to determine the inclusion of educational resources on the basis of their quality. LOEP calculates this arithmetic average of LORI automatically, following an equation in which the items are averaged across all the evaluations made and the result is then transformed into a score on a 0-10 scale (Gordillo 2017). This process resulted in the positive assessment of 17 educational resources.
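As a worked illustration of this calculation, the sketch below assumes a simple linear rescaling of the 1-5 star scale to 0-10 (the exact equation used by LOEP is documented in Gordillo (2017), so this is an assumption, not LOEP's implementation); under this assumption, a resource rated 4 stars on every item by every evaluator scores exactly the 7.5 cut-off:

# Assumed aggregation: average each item over all evaluations, average
# the item means, then rescale the 1-5 mean linearly to 0-10.
CUT_OFF = 7.5

def overall_score(evaluations):
    """Compute a 0-10 score from a list of {item: stars} evaluations."""
    items = evaluations[0].keys()
    item_means = [
        sum(ev[item] for ev in evaluations) / len(evaluations) for item in items
    ]
    mean_rating = sum(item_means) / len(item_means)  # in [1, 5]
    return (mean_rating - 1) / 4 * 10                # in [0, 10]

# Two evaluators rating the same resource (hypothetical values):
evals = [
    {"Content quality": 5, "Motivation": 4, "Reusability": 4},
    {"Content quality": 4, "Motivation": 4, "Reusability": 5},
]
score = overall_score(evals)
print(f"{score:.2f}", "included" if score >= CUT_OFF else "excluded")
# Prints: 8.33 included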

A summary of the LORI averages of the selected resources’ evaluations calculated by LOEP is presented in Table 2. Figures 3 and 4 show the evaluation of a resource with a high and a low score, respectively.

 

Table 2. Average LORI rating of the educational resources evaluated in LOEP according to each topic (automatically calculated by LOEP).

Educational resource according to the topic | Arithmetic average of LORI evaluation
MET1 | 8.66
MET2 | 8.61
MET3 = GEST1 | 8.19
MET4 | 5.28
MET5 | 4.03
GEST2 | 5
GEST3 | 7.92
GEST4 | 6.76
ESC1 | 9.17
ESC2 | 8.89
ESC3 | 8.61
ESC4 | 5.56
ESC5 | 9.31
ESC6 | 7.22
ESC7 | 7.64
ESC8 | 7.36
EJE1 | 4
PREP1 | 5
PREP2 | 8.61
PREP3 | 5.42
PREP4 | 7.5
PREP5 | 9.17
CON1 | 7.64
CON2 | 4.44
CON3 | 6.11
CON4 | 8.47
ETI1 | 8.33
ETI2 | 7.22
ETI3 | 6.94
ETI4 | 8.06
ETI5 | 8.33

Figure 3. Score graph of ESC1 (positive).

Figure 4. Score graph of MET5 (negative).

Conclusions and future work

The quality assurance process for the evaluation of digital educational resources presented in this study has resulted in the selection of good-quality materials for research training, which can eventually also serve as additional support for research supervisors. As Wiley (2013) suggested, and we would like to stress again: ‘Quality is not necessarily a function of copyright status [...]. Local experts must vet the quality of whatever resources they choose to adopt [...]’.

Furthermore, the current work shows the importance of shared criteria for the collaborative evaluation of educational resources, especially for international research groups working together remotely on a project.

We recognise that platforms such as LOEP greatly facilitate collaborative evaluation processes and minimise the effort of evaluation, and that LORI is an appropriate approach for determining which resources are of suitable pedagogical and technical quality. In addition, other features, such as support in choosing the relevant criteria for each type of resource and more guidance for evaluation, could be of great value: first, for education professionals deciding which resources to use in their courses and, eventually, for other professionals who may not work in education but need resources in their fields of knowledge for their own professional development. Sharing the results of an evaluation with a working group (reviewers and administrator) could also be an opportunity for open discussion within the team.

Our research underlines the importance of authors of educational resources labelling them explicitly as OER wherever appropriate and describing their metadata, especially the type of licence. It is also relevant that repositories and catalogues of educational resources take greater care in providing a clear resource classification and updated links. Some of these difficulties were also mentioned by Atenas, Havemann, and Priego (2014), and Atenas and Havemann (2013, 2014) therefore established quality assurance indicators for repositories, such as the inclusion of keywords in incorporated resources or the availability of user evaluation tools. The latter is a very useful feature, especially when that evaluation is made available to others.

The contribution of this study to the existing literature lies especially in the collaborative evaluation of educational resources for research training, drawing on multiple perspectives: the analysis of the needs of researchers during training, the joint analysis of a broadly recognised evaluation instrument (LORI) and the use of its items to ensure the quality of those resources. In addition, this work represents a contribution to the field of OER for the target group of researchers in training, especially in the Spanish-speaking world.

We recognise some limitations of this study in its different phases. Firstly, the samples in the analysis of needs, which were voluntary, came mostly from Colombia, which may mean that the identified needs in terms of topics and types of educational resources for research training cannot be generalised; a more detailed analysis of the reasons for this geographical bias may be necessary. Secondly, the list of repositories, catalogues and MOOCs used during resource selection was incomplete: it was not always possible to locate all of them, for example, because they were no longer available. Finally, the collaborative resource evaluation was constrained in time and personnel in relation to the number of resources to be evaluated and the number of researchers involved in evaluating each of them. Although a community of appropriate peer reviewers is desirable, solutions that combine this community-based model with other kinds of measures could make the (collaborative) evaluation of OERs more feasible and sustainable. As proposed by Orr, Rimini and Van Damme (2015) for the development of OERs, examples could include generating revenue from additional services related to the evaluation of OERs (revenue-based model), such as evaluator certification, further training as an OER evaluator or counselling services on the further use/reuse of those OERs, or involving philanthropic organisations to support the quality of OER through donations and funding for the evaluation of OERs (philanthropy-based model).

The next phases of this study include a second round of identification and evaluation of educational resources, creation of new resources related to topics where no OER exist, and the development and evaluation of an open website with high-quality resources evaluated and oriented towards researchers in training and research supervisors.

The website is currently under development and its evaluation will involve the actual use of the selected and created resources by researchers in training. This evaluation of the use of the educational resources on the website by the target group deserves particular attention, as presented by Canto, Guillermo, and Tejada (2012), since it would allow triangulating the evaluations of project researchers with those of actual users.

References

Adham, K. A., et al., (2018) ‘Learning to complete the PhD thesis’, Issues in Educational Research, vol. 28, no. 4, pp. 811–829, [online] Available at: http://www.iier.org.au/iier28/adham.pdf

Association of Universities and Colleges of Canada (AUCC) (2011) Trends in Higher Education: Volume 1 – Enrolment, Association of Universities and Colleges of Canada, Ottawa, ON.

Atenas, J. & Havemann, L. (2013) ‘Quality assurance in the open: an evaluation of OER repositories’, The International Journal for Innovation and Quality in Learning, vol. 2, pp. 22–34, [online] Available at: http://eprints.soas.ac.uk/17347/1/30-288-1-PB.pdf

Atenas, J. & Havemann, L. (2014) ‘Questions of quality in repositories of open educational resources: a literature review’, Research in Learning Technology, vol. 22, p. 20889. doi: 10.3402/rlt.v22.20889

Atenas, J., Havemann, L. & Priego, E. (2014) ‘Opening teaching landscapes: the importance of quality assurance in the delivery of open educational resources’, Open Praxis, vol. 6, no. 1, pp. 29–43. doi: 10.5944/openpraxis.6.1.81

Atkins, D. E., Brown, J. S. & Hammond, A. L. (2007) ‘A review of the open educational resources (OER) movement: achievements, challenges, and new opportunities’, Report to the William and Flora Hewlett Foundation, [online] Available at: http://www.hewlett.org/uploads/files/ReviewoftheOERMovement.pdf

Camilleri, A. F., Ehlers, U. D. & Pawlowski, J. (2014) State of the Art Review of Quality Issues Related to Open Educational Resources (OER) (No. JRC88304), Joint Research Centre (Seville site), [online] Available at: http://is.jrc.ec.europa.eu/pages/EAP/documents/201405JRC88304.pdf

Canto, P. J., Guillermo, M. C. & Tejada, M. A. (2012) ‘Recursos educativos abiertos para la formación de investigadores educativos: opinión de usuarios’, in Recursos educativos abiertos y móviles para la formación de investigadores: Investigaciones y experiencias prácticas, coord. M. S. Ramírez & J. V. Burgos, Lulú editorial digital, México, pp. 38–48.

Caplan, D. & Graham, R. (2008) ‘The development of online courses’, in The theory and practice of online learning, ed T. Anderson, Athabasca University Press, Edmonton, pp. 247–265.

Caswell, T., et al., (2008) ‘Open content and open educational resources: enabling universal education’, The International Review of Research in Open and Distributed Learning, vol. 9, no. 1, pp. 1–11, doi: 10.19173/irrodl.v9i1.469

Del Moral, M. E. & Cernea, D. A. (2005) ‘Diseñando Objetos de Aprendizaje como facilitadores de la construcción del conocimiento’, II Simposio Pluridisciplinar sobre Diseño, Evaluación y Descripción de Contenidos Educativos Reutilizables (SPDECE05), Barcelona, España.

Fernández-Pampillón, A. M., Dominguez, E. & de Armas, I. (2013) ‘Análisis de la evolución de los Repositorios Institucionales de material educativo digital de las universidades españolas’, RELATEC: Revista Latinoamericana de Tecnología Educativa, vol. 12, no. 2, pp. 11–25, [online] Available at: http://relatec.unex.es/article/view/1165

Fernández-Pampillón Cesteros, A. M., et al., (2017) ‘Herramienta de evaluación de la calidad de los Materiales Educativos Digitales: perfiles de aplicación del profesor y del alumno’, in UNE 71362 Calidad de los materiales educativos digitales. 35.240.90/Aplicaciones de las tecnologías de la información en educación, AENOR, Madrid, pp. 114–131, [online] Available at: https://eprints.ucm.es/45338/

Ford, K. C., Veletsianos, G. & Resta, P. (2014) ‘The structure and characteristics of #PhDChat, an emergent online social network’, Journal of Interactive Media in Education, vol. 2014, no. 1. doi: 10.5334/2014-08

Gordillo, A. (2017) Contribution to the Authoring, Distribution, Evaluation and Integration of Learning Objects, Dissertation E.T.S.I. Telecommunications, Universidad Politécnica de Madrid, [online] Available at: https://core.ac.uk/display/148690656?recSetID=

Gordillo, A., Barra, E. & Quemada, J. (2015) ‘A flexible open source web platform to facilitate learning object evaluation’, 2014 IEEE Frontiers in Education Conference (FIE) Proceedings, Madrid, Spain. doi: 10.1109/FIE.2014.7044498

Hylén, J. (2006) Open Educational Resources: Opportunities and Challenges, OECD, Paris, [online] Available at: http://www.oecd.org/education/ceri/37351085.pdf

Kawachi, P. (2014) The TIPS Framework Version-2.0: Quality Assurance Guidelines for Teachers as Creators of Open Educational Resources, CEMCA, New Delhi, [online] Available at: http://cemca.org.in/ckfinder/userfiles/files/TIPS Framework_Version 2_0_Low.pdf

McGill, L. (2011) ‘Quality considerations’, [online] Available at: https://www.jisc.ac.uk/full-guide/open-educational-resources

Ministerio de Educación Nacional (MEN) (2013) ‘Ministerio de Educación Nacional, Sistema Nacional de Información de Educación Superior SNIES. Resumen de indicadores de educación superior’, [online] Available at: http://www.mineducacion.gov.co/sistemasdeinformacion

Moreno, J. & Salinas, J. (2011) ‘Resultados del proceso de diseño, desarrollo e implementación de un prototipo de entorno virtual para una comunidad de Investigadores en Formación’, Congreso Internacional EDUTEC 2011, Pachuca, México, [online] Available at: https://www.academia.edu/3010238/Resultados_del_proceso_de_dise%C3%B1o_desarrollo_e_implementaci%C3%B3n_de_un_prototipo_de_entorno_virtual_para_una_comunidad_de_Investigadores_en_Formaci%C3%B3n

Nesbit, J., Belfer, K. & Leacock, T. (2007) ‘Learning Object Review Instrument (LORI)’, [online] Available at: http://transplantedgoose.net/gradstudies/educ892/LORI1.5.pdf

Orellana, M. L. (2014) Supervisión en línea de proyectos de investigación en maestrías y doctorados. Estrategia de apoyo para la interacción, diseño y desarrollo de la investigación, haciendo uso de mapas conceptuales. Tesis Doctoral. Universidad de las Islas Baleares, Departamento de Pedagogía Aplicada y Psicología de la Educación.

Orellana, M. L., et al., (2016) Programa para el Fortalecimiento de la Calidad de la Investigación en Posgrados. Tutoría de la investigación y formación de investigadores en posgrados: modelos, recursos y entorno virtual, [online] Available at: https://www.researchgate.net/publication/330726923_Programa_para_el_Fortalecimiento_de_la_Calidad_de_la_Investigacion_en_Posgrados_Tutoria_de_la_investigacion_y_formacion_de_investigadores_en_posgrados_modelos_recursos_y_entorno_virtual

Orr, D., Rimini, M. & Van Damme, D. (2015) Open Educational Resources: A Catalyst for Innovation, Educational Research and Innovation, OECD Publishing, Paris. doi: 10.1787/9789264247543-en

Ramírez, M. S. & Burgos, J. V. (coords) (2012) Recursos educativos abiertos y móviles para la formación de investigadores: Investigaciones y experiencias prácticas, Lulú editorial digital, Monterrey, México.

Snyder, T. & Dillow, S. (2013) Digest of Education Statistics 2012 (NCES 2014-015), National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education, Washington, DC.

Sussex, R. (2008) ‘Technological options in supervising remote research students’, Higher Education, vol. 55, no. 1, pp. 121–137. doi: 10.1007/s10734-006-9038-0

Taylor, S., Kiley, M. & Humphrey, R. (2017) A Handbook for Doctoral Supervisors, Routledge, London.

UNESCO (2002) ‘Forum on the impact of open courseware for higher education in developing countries’, [online] Available at: http://unesdoc.unesco.org/images/0012/001285/128515e.pdf

UNESCO & Commonwealth of Learning (COL) (2011) ‘Guidelines for open educational resources (OER) in higher education’, Paris, [online] Available at: http://unesdoc.unesco.org/images/0021/002136/213605E.pdf

Wang, T. & Li, L. Y. (2011) ‘“Tell me what to do” vs. “guide me through it”: Feedback experiences of international doctoral students’, Active Learning in Higher Education, vol. 12, no. 2, pp. 101–112. doi: 10.1177/1469787411402438

Wiley, D. A. (2013) ‘On quality and OER’, in Iterating toward Openness, [online] Available at: https://opencontent.org/blog/archives/2947

Wisker, G. (2012) The Good Supervisor: Supervising Postgraduate and Undergraduate Research for Doctoral Theses and Dissertations, Macmillan International Higher Education, Basingstoke, Hampshire, United Kingdom.

Yuan, M. & Recker, M. (2015) ‘Not all rubrics are equal: a review of rubrics for evaluating the quality of open educational resources’, International Review of Research in Open and Distance Learning, vol. 16, no. 5, pp. 16–38. doi: 10.19173/irrodl.v16i5.2389

Zawacki-Richter, O. & Mayrberger, K. (2017) ‘Qualität von OER: Internationale Bestandsaufnahme von Instrumenten zur Qualitätssicherung von Open Educational Resources (OER) – Schritte zu einem deutschen Modell am Beispiel der Hamburg Open Online University’, Synergie (Sonderband), vol. January, [online] Available at: https://www.synergie.uni-hamburg.de/media/sonderbaende/qualitaet-von-oer-2017.pdf