This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
While learning is commonly conceptualised as a social, collaborative process, in organisations online courses often provide limited opportunities for communication between people. How do students engage with content-based courses? How do they find answers to their questions? How do they achieve the learning outcomes? This paper aims to answer these questions by focusing on students’ experiences in an online content-based course delivered in a large Mexican organisation. Sales supervisors studied a content-based course on Performance Feedback as part of a Leadership Programme.
Learning is commonly conceptualised as a social, collaborative process, in which people communicate and actively build knowledge. Students who work and share ideas with others are generally more motivated and display better academic performance than passive students (Beaudoin).
Social interactions are valuable as a means to improve student engagement (Zepke and Leach).
Despite the acknowledged importance of social interactions for student engagement and learning, in organisations online courses often provide limited or no opportunities for communication between people (e.g., Padilla Rodriguez and Fernandez Cardenas).
This paper focuses on students’ experiences in an online content-based course delivered in a large Mexican organisation with a high geographical dispersion. Specifically, it addresses the following questions: How do students engage with content-based courses? How do they find answers to their questions? How do they achieve learning outcomes?
In this paper, content-based learning design refers to a way of organising a course that focuses on fostering learner–content interactions and includes no activities to enable communication between people. Moore identified learner–content interaction as one of three types of educational interaction, alongside interactions with teachers and with other learners.
Internet-enabled devices and tools make a variety of learner–content interactions possible. These include replaying sections of a podcast, searching for information, following links to glossary entries, answering multiple-choice questions and checking automatic feedback (Anderson).
Learner–content interactions can be designed to perform some of the functions traditionally carried out by teachers (Anderson).
While this type of interaction has advantages, focusing only on learner–content interactions excludes the potential benefits of other types of educational interactions (see the figure below).
Types of educational interactions. Diagram from Anderson.
Other issues may arise when no clear teaching presence is deployed on a course (Garrison, Cleveland-Innes, and Fung).
This study took place at a large Mexican organisation (more than 6000 employees) with 30 distribution centres and offices across the country. As part of a Leadership Programme delivered via the e-learning platform Moodle, sales supervisors had to study a content-based course on Performance Feedback. This course aimed to improve the communication competence of employees in charge of managing retailers. Participants had one week to finish the course, with a commitment of approximately five study hours.
The design of the course incorporated six non-assessed activities that fostered interactions with the content and required explicit, observable responses from the students; for example, providing an answer to a question instead of reflecting internally on a topic.
Online tools included in the content-based course.
| Tool | Purpose | Characteristics |
|---|---|---|
| Hyperlinks | To link key terms to glossary definitions. | Available in the reading materials |
| Personal wikis | To provide an individual space for students to write their reflections. | Only accessible to the owner of the wiki and the administrators of the course |
| Multiple-choice questions | To encourage students to practise and reflect on the course concepts. | Automated feedback provided for both correct and incorrect answers |
| Polls | To stimulate thinking on the topic and how it relates to others’ views. | Enabled students to see the general responses of the group |
| Podcasts | To make content more user-friendly through the use of the human voice (Nie et al.). | Brief (less than a minute) |
| Discussion forum | To offer a channel of general support. | Only built-in communication tool available |
| Final exam | To assess learning. | Included only closed questions, which were automatically graded |
Sales supervisors enrolled in the Leadership Programme participated in this study.
Four people dropped out of the course at different stages. Ten sales supervisors who did not participate in the course formed a control group.
Four main data sources were used, as shown in the table below.
Data sources.
| Source | To gain insight into | N |
|---|---|---|
| Diagnostic surveys | Previous knowledge on the course content. | 46 |
| Evaluation surveys | Perceptions on learner–content interactions and learning. | 40 |
| Think-aloud sessions | Strategies when engaging with the course content. | 8 |
| Activity logs (number of clicks) | Engagement with the course. | 47 |
| Exams | Student learning. | 43 |
Two online surveys were used in this study to obtain an insight into individual perceptions and tendencies within groups (Baruch and Holtom).
The think-aloud method consists of observing participants while they verbally articulate their behaviours, feelings and thoughts as they engage with an activity. Throughout this process, the researcher's input is minimal, generally limited to prompts to keep talking when participants fall quiet. Data are audio recorded for further analysis (Young).
The think-aloud method is recommended for the study of learner–content interactions (Anderson).
The Moodle log system provides useful information about participants’ online behaviours and activities within a course (Estrada et al.).
A final exam with multiple-choice, matching and true/false questions evaluated knowledge acquisition.
At the beginning of the course, all participants received information about the study and answered a diagnostic survey. The researchers then used the think-aloud method to observe a convenience sample of eight students – located in two different cities – as they engaged with a content-based learning design. Data were audio recorded and transcribed.
The think-aloud transcripts were coded and analysed using NVivo software. Themes for categorisation were based on students’ navigational decisions and potential evidence that learning was taking place.
At the end of the course, 43 students completed the exam and the evaluation survey. Measures of central tendency and percentages were obtained where applicable. Open questions were coded using emergent themes. Employees from the control group also sat the final exam.
Moodle log entries were checked and categorised as passive or active. Viewing a resource (e.g., a discussion forum, a wiki, a page with reading material, etc.) was considered passive. Views of the front (landing) page of the course were excluded. Active contributions included clicks that resulted in an observable response (e.g., editing a wiki, selecting a poll answer). Medians were obtained.
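The passive/active categorisation of log entries described above can be sketched in Python. The event names and the (student, action) log format used here are hypothetical stand-ins, since the actual Moodle log schema is not reproduced in this article; only the categorisation logic mirrors the procedure in the text.

```python
from statistics import median

# Hypothetical event names; real Moodle logs use their own event strings.
PASSIVE_ACTIONS = {"page_viewed", "forum_viewed", "wiki_viewed"}
ACTIVE_ACTIONS = {"wiki_edited", "poll_answered", "quiz_submitted"}
EXCLUDED = {"course_front_page_viewed"}  # front-page views were excluded

def categorise(logs):
    """Count [passive, active] clicks per student from (student, action) pairs."""
    counts = {}
    for student, action in logs:
        if action in EXCLUDED:
            continue
        pair = counts.setdefault(student, [0, 0])
        if action in PASSIVE_ACTIONS:
            pair[0] += 1
        elif action in ACTIVE_ACTIONS:
            pair[1] += 1
    return counts

def medians(counts):
    """Median passive and active click counts across students."""
    passive = [p for p, _ in counts.values()]
    active = [a for _, a in counts.values()]
    return median(passive), median(active)
```

A run over a toy log of two students would yield one passive/active pair per student, from which the group medians are taken, as in the study's analysis.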
Finally, the information from the different data sources and methods was compared and contrasted.
Procedure timeline.
Results were grouped according to the research questions: (1) how did students engage with a content-based course; (2) how did students find answers to their questions; and (3) how did students achieve learning outcomes.
The evaluation surveys indicate that participants generally followed the recommended structure and spent an average of four and a half hours on the course, out of the recommended five hours. Most students (35/40; 88%) reported being engaged or very engaged with the activities. Half claimed that there was nothing they could have done to further benefit from the course. Fifteen students said time had been an issue, but it was unclear whether they meant that senior management should give them more time to study, or that they should organise their time more efficiently.
Think-aloud data revealed that students used different strategies to make information more relevant or personalised. These included asking themselves questions, writing notes, relating the information to their own context and paraphrasing.
Some students read superficially, skimming through the text. However, activities seemed to encourage them to go back and spend time on deeper readings. One student explained it as follows: Lots of times, […] we read once and think, “I've read”, and we answer; and then we read again and think, “If I had read twice, I would have answered correctly”. You won't gain anything by going too fast. It's better to take the necessary time to read better.
During the think-aloud sessions, six of the eight participating students had questions that were not answered by the content of the course.
In the evaluation surveys, 29 out of 40 participants had no suggestions to improve the course, but three mentioned the importance of having embedded social interactions in the course. Students did not use the general discussion forum, which was available for questions and comments. Only six people viewed it during the duration of the course.
Although the course fostered no social interactions, the think-aloud method provided some evidence of potentially meaningful peer exchanges happening outside the virtual learning environment. During all of the sessions, either via phone calls or face-to-face interactions, work colleagues distracted students when they were navigating through the course. They interrupted to discuss job matters (e.g., retailers and sales), which were directly or indirectly related to the content of the course. Participants did not seem particularly bothered (or surprised) by these distractions, as colleagues also represented a source of support.
Students were asked whether they had used Moodle's private messaging system. Eighteen people answered. Fourteen had sent at least one private message to another participant. Ten had sent three or more messages.
The reading resources and activities were valuable for achieving learning outcomes. All survey respondents considered that the materials fostered their reflection on the course topics, and all but one (39/40) reported having learned ‘a lot’ or ‘very much’.
In the diagnostic survey, students’ average self-assessment of their own previous knowledge of the course topic was 7.6/10. This initial self-diagnosis is consistent with the control group's mean examination result (7.1/10). Students who completed the course performed better than the control group in the exam (9.5 versus 7.1).
The results of this study provide evidence of content-based learning designs as engaging, effective alternatives for online courses in corporate settings. Participants benefitted from self-pacing, that is, the flexibility to study whenever it suited them, without depending on others’ input to move forward. They engaged with their course following the structure, guidance and recommendations provided. Most students successfully completed the course and performed better than the control group.
Activities requiring explicit responses and automated feedback were useful as a means of ensuring comprehension and encouraging a return to earlier parts of the content when confusion arose. Learner–content interactions performed functions usually carried out by teachers (Anderson).
Some participants attempted to contextualise the materials, making them more relevant to themselves and their work. However, as in Cotton and Gresty's study, some students engaged with the content only superficially.
Students were resourceful when attempting to obtain extra help. They moved beyond what the course offered. They took notes they could refer back to, sought communication with peers via Moodle's private messages, or turned to work colleagues available face-to-face to discuss ideas. This finding is consistent with the notion that learner–content interactions are limited in comparison to the more meaningful learning experiences that exchanges between people may create (Anderson and Garrison).
The value of informal learning activities has been highlighted in the past (Ozolins, Hall, and Peterson).
The conclusions from this research can be mapped against three areas: student engagement, learner support and effectiveness of content-based learning designs.
Students engaged with a content-based online course offered by their organisation by following the guidance available and attempting to make the materials relevant to their own context. Structured learner–content interactions were designed into the course and provided standard opportunities for the acquisition of critical knowledge and skills. These processes did not depend on online facilitators or peers, and constituted a “safety net” for the achievement of the learning outcomes.
Students were resourceful in their search for support. If the materials did not provide answers to their questions, they looked for viable alternatives, such as reviewing their own notes and identifying colleagues to talk to, both online and face to face. These informal learning activities are valuable because of their potential impact on the achievement of learning outcomes and their application in the workplace.
Content-based courses can provide an engaging route to effective and efficient online learning in corporate settings: they help students achieve learning outcomes without the deployment of significant resources during delivery. However, excluding social interactions from online courses may result in course materials being the students’ only source of help. Some will find alternative ways of obtaining adequate support. Others might not. Broadening the range of support options available to students, that is, “humanising” support, may foster more meaningful, contextualised and rewarding learning experiences.
The findings of this study will inform learning design and delivery decisions, and improve future versions of the course at the participating organisation. The short duration of the course and the relatively small sample prevent these results from being generalisable to all populations and settings. However, the findings presented in this article may be valuable to educators and trainers in similar contexts.