ORIGINAL RESEARCH ARTICLE

Performance, behaviour and perceptions of an open educational resource-derived interactive educational resource by online and campus university students

Erin J. Ward and Brian L. Lindshield*

Department of Food, Nutrition, Dietetics and Health, Kansas State University, Manhattan, KS, USA

Received: 17 January 2020; Revised: 25 June 2020; Accepted: 26 June 2020; Published: 24 August 2020

Platforms are now becoming available to allow for incorporating interactive elements into open educational resources (OERs), but little has been published about their use and effectiveness. Students enrolled in online and on-campus sections of an intermediate human nutrition course at a public Midwestern University in the United States used an OER that was adapted to an online platform, where it included embedded videos and summative assessments (interactive educational resource). Data were collected from the learning management system, course performance, the resource platform and a survey. Student course grades were positively correlated with use of the interactive educational resource and the percentage of questions answered correctly. The overall survey response rate was 84/109 (77.1%). Student respondents reported higher use of the interactive educational resource and preferred it over a static PDF or hard copy. Students were most motivated to utilise the interactive educational resource by the opportunity to earn extra credit, followed by the desire to earn a good grade. Student respondents reported that they were satisfied with their experiences using the interactive educational resource and reported a high likelihood of recommending it to future students. While these findings are limited to one semester at one university, they support future research efforts into the efficacy of interactive educational resources and OER-enabled pedagogy.

Keywords: Human Nutrition; Open Educational Resource; Interactive Learning; Interactive Textbook; Formative Assessments; Low-Stake Summative Assessments; Interactive Educational Resource; Open Education; Open Pedagogy

To access the supplementary material, please visit the article landing page

*Corresponding author. Email: blindsh@k-state.edu

Research in Learning Technology 2020. © 2020 E.J. Ward and B.L. Lindshield. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.

Citation: Research in Learning Technology 2020, 28: 2386 - http://dx.doi.org/10.25304/rlt.v28.2386

Introduction

Affordability of post-secondary education continues to be a concern, as the total cost in 2019–2020 has doubled in inflation-adjusted US dollars compared with 30 years earlier (The College Board 2019). Books and supplies are estimated to cost students attending 4-year US institutions $1240, and many students without adequate savings or earnings cannot afford required books and supplies, which may be detrimental to their success (The College Board 2019). A 2018 survey of 1651 former and current US students found that:

Thirty percent of survey respondents said they had forgone a trip home to see family, 43 percent said they skipped meals, 31 percent registered for fewer classes, and 69 percent worked a job during the school year -- all to save money for books. (Inside Higher Ed 2019)

Alternatives to traditional textbooks can reduce the costs of books and supplies. Open educational resources (OERs):

are teaching, learning and research materials in any medium – digital or otherwise – that reside in the public domain or have been released under an open license that permits no-cost access, use, adaptation and redistribution by others with no or limited restrictions. (UNESCO 2019)

That is, OERs include built-in permission to retain, reuse, revise, remix and redistribute the material (Wiley 2018). A 2019 meta-analysis found a significant 29% decrease in the risk of college students withdrawing from open textbook courses compared to commercial textbook comparison courses, with equal learning outcomes between the courses (Clinton and Khan 2019). Among students in the University of Georgia system (21 822 students), OER course students’ final grade point average (GPA) was significantly higher, and DFW rates (students earning a D or F or those who withdrew, W) were lower, compared with non-OER courses. Further, the authors found that the OER course improvements in GPA and DFW rates were greater among Pell recipient, part-time and non-white students, who had lower rates of student success (Colvard, Watson, and Park 2018). Students have overwhelmingly reported that they like free, open-access course materials (Feldstein et al. 2012; Rockinson-Szapkiw et al. 2013; Lindshield and Adhikari 2013b; Delimont et al. 2016).

While open textbooks are an important step forward, there is increasingly a desire for interactivity in online educational resources. An OER has been used for teaching an intermediate-level human nutrition course at a Midwestern United States university since 2010 (Lindshield and Adhikari 2011, 2013b). Previous research has found that students were interested in and favourably rated this OER (Lindshield and Adhikari 2013b) and were supportive of a course fee to support use of it and other similar OER (Lindshield and Adhikari 2013a). In an effort to increase student engagement with course material, the OER, previously available to students in two electronic formats, Google Drive™ and PDF, was adapted to an interactive online platform. This next generation of the OER allowed students to interact with content through embedded videos (instead of external links) and embedded questions, as a form of formative and summative assessment, inserted throughout the text. Research from other science, technology, engineering and math (STEM) fields suggests that interactive educational resources containing questions and animations can improve student grades and learning experiences (Edgcomb et al. 2014; Liberatore 2017). Interactive educational resources, which can include audio clips, videos, animations and/or other interactive features, can increase student engagement, achievement and interest, and provide additional representations of information (Lim 2017; Mills 2016). University STEM students reported liking the interactive nature of the textbooks and the ability to receive immediate feedback (Liberatore 2017; O’Bannon, Skolits, and Lubke 2017). Furthermore, students had high access rates and activity completion rates with interactive textbook use (Edgcomb et al. 2015). Student engagement was positively associated with web-based learning technology, and students who made the most use of technologies achieved higher-order learning and gains in education, competence and development (Chen, Lambert, and Guidry 2010).

Research suggests that frequent formative assessment and repetition are advantageous to student learning (Bushway and Flower 2002; Liberatore 2017; Viegas, Alves, and Lima 2015). Low-stake assessments may encourage ‘assignment-driven’ students to engage with the course material more routinely (Holmes 2018). For students in online learning environments, opportunities to evaluate their performance can help them to self-regulate (Sharp and Sharp 2016). Questions throughout an interactive educational resource allow an instructor to track students’ progress, performance and engagement with the material, and better support student learning (Akbar 2016). There is limited research that has used the log data (i.e. when questions were answered, times that students were in a platform, etc.) available from interactive educational resources (Henrie, Halverson, and Graham 2015).

We hoped that the introduction of questions inserted throughout the interactive educational resource would offer students frequent opportunities to assess their understanding and gain immediate feedback, and would encourage them to engage with course materials more frequently. Our primary objectives were to understand how students perceived the interactive educational resource, to analyse their behaviours (from the platform data) and to determine how these behaviours correlated with course performance.

Methods

Educational resource and questions

The OER used for this study has been in use for nearly a decade (Lindshield and Adhikari 2011, 2013b). The sophomore/junior-level human nutrition course is offered three times per year online (Spring, Summer and Fall semesters) and on-campus during the Spring semester. OER content was reviewed, updated to fit newly created questions and uploaded to the Top Hat Textbook™ platform for classes beginning in January 2019. This online platform was accessible via a full website or a tablet and phone app. However, since the interactive version of the resource is not open, it is broadly described in this manuscript as an educational resource (OER + interactive educational resource) or an interactive educational resource (online platform version). Students had access to the interactive educational resource in addition to a static PDF OER, which could be used digitally or in printed form. Content in both versions was the same, with the exceptions that videos were embedded in the interactive platform (only available via external link in the PDF) and questions were only included in the interactive educational resource.

Several question types were utilised: the large majority were multiple choice or true/false, with a handful of matching, sorting or click-on-image questions. Questions were written to be quick to answer while emphasising key information that students should learn from each section/subsection. The educational resource is organised into 13 chapters that are divided into multiple sections and subsections. A total of 380 questions were distributed throughout the interactive educational resource, with at least one question in nearly every section and subsection. Students were offered one attempt to answer each question for credit through each chapter’s close date. Chapter close dates were set to encourage students to work through the material on a routine basis. The platform would indicate to the student whether they answered the question correctly, and each question had an explanation, accessible after the question was answered, to provide additional clarification.

In the interactive educational resource, the applicable chapters/sections were made available at the beginning of each exam period with defined close dates. Most were full chapters (Supplementary Table 1); the exceptions were ‘chapter 12’, which refers to 12.1–12.74, and ‘chapter 13’, which refers to 12.8–12.93 and chapter 13. All chapters (except chapter 1 in the campus section) were open for at least 1 week. On the morning of each due date, the instructor sent a reminder through the university’s learning management system (LMS) that the chapter questions were closing and announced the percentage of questions in that chapter that the campus versus online sections had answered correctly at that time. This information was also announced and shared during the campus class on the morning of the due dates. Extra credit was awarded based on the number of questions answered correctly for each chapter (point scale included in Supplementary Table 2). After each chapter closed, the content and questions remained available to students to answer, though no extra credit was awarded for doing so.

Table 1. Questions correct, final course grade and final letter grades for students grouped by question performance.
Question performance | n | Mean questions correct (%) | Mean course grade (%) | A | B | C | D | F
<50% | 6 | 37.0 | 59.1 | 0 | 0 | 2 (33%) | 2 (33%) | 2 (33%)
50%–59.9% | 6 | 54.8 | 57.2 | 0 | 0 | 2 (33%) | 1 (17%) | 3 (50%)
60%–69.9% | 18 | 65.9 | 71.3 | 0 | 4 (22%) | 7 (39%) | 5 (28%) | 2 (11%)
70%–79.9% | 26 | 76.0 | 76.5 | 4 (15%) | 6 (23%) | 9 (35%) | 7 (27%) | 0
80%–89.9% | 41 | 84.0 | 85.4 | 15 (37%) | 19 (46%) | 6 (15%) | 1 (2%) | 0
90%–100% | 12 | 92.9 | 92.5 | 10 (83%) | 1 (8%) | 1 (8%) | 0 | 0
Note: Final letter grade, n (%), for each question performance group, across rows.
Final letter grades scale: A = 89.5%–100%; B = 79.5%–89.4%; C = 69.5%–79.4%; D = 59.5%–69.4%; F < 59.5%.


Table 2. Educational resource frequency of use and preferences.
Use Campus (n = 51–50) Online (n = 33–32)
Frequency of total educational resource use
 Never 0 1 (3.0%)
 Less than once per month 0 0
 At least once per month 0 0
 At least once every 2 weeks 0 2 (6.1%)
 At least once per week 8 (15.7%) 8 (24.2%)
 Two–three times per week 19 (37.3%) 9 (27.3%)
 More than three times per week 24 (47.1%) 13 (39.4%)
Preferred educational resource format
 PDF 11 (21.7%) 4 (12.5%)
 Hard copy 2 (3.9%) 0
 Interactive platform 38 (74.5%) 28 (87.5%)
How often educational resource was accessed via each format
 PDF 16.2% ± 28.2% 14.2% ± 22.4%
 Hard copy 2.7% ± 12.7% 0
 Interactive platform 81.1% ± 30.2% 85.8% ± 22.4%
How often the interactive educational resource was accessed via different devices
 Desktop/Laptop (full site) 91.8% ± 17.9% 81.3% ± 33.6%
 iPad/tablet (app) 4.5% ± 16.1% 5.7% ± 18.9%
 iPhone/Android (app) 3.6% ± 8.5% 13.0% ± 29.8%
Note: Data are presented as mean ± SD values for continuous variables or n (%) for categorical variables.
One student responded ‘Never’ to use of educational resource.
One student only answered first two questions and did not complete the survey.

Survey

The survey was approved by the Kansas State University Institutional Review Board (IRB #9739). Students enrolled in campus (n = 61) and online (n = 48) sections during the final weeks of the Spring 2019 semester were invited to participate in the survey. Students who withdrew from the course (n = 1, campus; n = 1, online) or did not utilise the interactive educational resource (n = 1, online) were not included. Each student received a unique URL via email from the Kansas State University Qualtrics® survey platform to complete the confidential survey. Respondents who had not finished the survey received reminder emails from Qualtrics® twice during the 2-week open period. The course instructor reminded campus students in person to complete the survey and offered a small amount of time during one class period to complete it. Both campus and online sections were reminded to complete the survey by the instructor via announcements through the university’s LMS.

Consent information was displayed when all students opened the survey and survey questions were presented one at a time after consent. The survey included five demographic questions followed by a maximum of 25 questions related to student frequency of use, behaviours and perceptions/opinions. Branching logic was used to ensure that students only answered relevant questions based on previous responses (survey, with branching logic, included in Supplementary material).

Data analysis

Course performance and question progress

Only students who received an invitation to take part in the survey were included in the data analysis: campus (n = 61) and online (n = 48). Final course grades were obtained from the university’s LMS. To assess progress in completing the questions, the times at which three questions (first, middle and last) from each chapter were completed were downloaded from the platform, and the number of hours before the due date at which each was completed was calculated. For sections with an even number of questions, the first of the two middle questions was selected.
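The hours-before-due-date values follow directly from these timestamps. As a minimal illustrative sketch (not the authors’ actual workflow), assuming hypothetical file and column names for the platform export and the chapter close dates:

```python
import pandas as pd

# Hypothetical platform export: one row per student per selected question
# (first, middle or last of a chapter), with the answer timestamp.
answers = pd.read_csv("question_timestamps.csv", parse_dates=["answered_at"])
# Hypothetical table of chapter close (due) dates.
due_dates = pd.read_csv("chapter_due_dates.csv", parse_dates=["due_at"])

# Attach each chapter's close date and compute hours completed before the due date.
merged = answers.merge(due_dates, on="chapter")
merged["hours_before_due"] = (
    merged["due_at"] - merged["answered_at"]
).dt.total_seconds() / 3600

print(merged[["student_id", "chapter", "question_position", "hours_before_due"]].head())
```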

A heat map was used to visually represent question progress; one heat map with all students from both sections was prepared at http://www1.heatmapper.ca/ (University of Alberta, Edmonton, Alberta, Canada). Students were charted by final course grade, with the highest scoring student at the top and lowest at the bottom for each section.
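The published heat map was generated with the Heatmapper web tool; a rough local equivalent of the layout described above could look like the sketch below, which substitutes randomly generated placeholder values for the real student-by-question matrix:

```python
import numpy as np
import matplotlib.pyplot as plt

# Placeholder matrix: rows = students sorted by final course grade (highest first),
# columns = first/middle/last question for each of the 13 chapters (39 columns),
# values = hours each question was answered before its due date (NaN = unanswered).
rng = np.random.default_rng(0)
hours = rng.uniform(0, 200, size=(50, 39))
hours[rng.random(hours.shape) < 0.05] = np.nan  # a few unanswered questions

cmap = plt.get_cmap("viridis").copy()
cmap.set_bad("black")  # unanswered questions rendered as black cells, as in Figure 1

fig, ax = plt.subplots(figsize=(10, 8))
im = ax.imshow(hours, aspect="auto", cmap=cmap)
fig.colorbar(im, ax=ax, label="Hours answered before due date")
ax.set_xlabel("Selected questions (first/middle/last per chapter)")
ax.set_ylabel("Students (highest final grade at top)")
plt.tight_layout()
plt.show()
```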

To compare survey responses with observed question progress behaviour, completion of the middle question was analysed for each student. Students who did not complete at least 10/13 of the selected middle chapter questions were considered ‘non-completers’ (campus n = 2, online n = 4). Students who answered the middle question >72 h before the due date for at least 50% of the chapters were considered progressors, while students who answered the middle question <24 h before the due date for at least 50% of the chapters were considered procrastinators. All remaining students were considered part progressors/part procrastinators; they either answered the middle question 24–72 h before the due date at least 50% of the time or did not fall into any single category for at least 50% of the time.
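These grouping rules reduce to simple counting over the per-chapter timing values. The following is a minimal sketch of the classification, assuming one hours-before-due-date value (or None if unanswered) per chapter; the function and variable names are hypothetical:

```python
def classify_student(hours_before_due, n_chapters=13):
    """Classify question-answering behaviour from the hours at which the selected
    middle question of each chapter was answered before its due date.
    `hours_before_due` has one entry per chapter; None means not answered."""
    answered = [h for h in hours_before_due if h is not None]
    if len(answered) < 10:                      # did not complete at least 10/13 middle questions
        return "non-completer"
    early = sum(1 for h in answered if h > 72)  # answered >72 h before the due date
    late = sum(1 for h in answered if h < 24)   # answered <24 h before the due date
    half = 0.5 * n_chapters
    if early >= half:
        return "progressor"
    if late >= half:
        return "procrastinator"
    return "part progressor/part procrastinator"

# Example: 7 of 13 chapters answered more than 72 h early -> "progressor".
print(classify_student([100, 80, 75, 90, 120, 73, 85, 60, 40, 30, 20, 10, 5]))
```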

Survey data

Survey responses were calculated as count and percentage or mean and standard deviation (mean ± SD), where applicable. Likert scale questions (21–25, Supplementary material) were analysed for differences with the Mann–Whitney–Wilcoxon test in SAS Studio® software (significance at p < 0.05, Version 3.71, SAS Institute Inc., Cary, North Carolina).
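For readers without SAS, an equivalent comparison can be run in Python with SciPy; the sketch below uses made-up Likert responses purely to show the call, not the study’s data:

```python
from scipy.stats import mannwhitneyu

# Made-up 7-point Likert responses to one survey item from each section.
campus = [6, 5, 7, 6, 6, 5, 4, 7, 6, 5]
online = [5, 6, 6, 4, 7, 5, 6, 6, 5, 4]

# Two-sided Mann-Whitney-Wilcoxon test, mirroring the comparison run in SAS Studio.
stat, p_value = mannwhitneyu(campus, online, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.3f}")  # significance threshold used: p < 0.05
```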

Results

Course and question performance

Question performance

All students included in analyses chose to use the interactive educational resource. The majority of the questions were answered correctly by students in both campus and online sections (Supplementary Table 3). The campus section answered more questions correctly for every chapter compared to the online section.

Table 3. Behaviours related to educational resource and interactive platform use.
Behaviour Campus (n = 50) Online (n = 32)
Percentage of the educational resource read 82.4% ± 23.4% 83.3% ± 26.9%
Percentage of time in platform answering questions and reviewing answers (vs. reading, watching videos, etc.) 60.4% ± 24.8% 55.0% ± 24.9%
Students who used the educational resource during class/while watching class videos 50/50 (100%) 16/23 (69.6%)
Percentage of educational resource use during class/while watching class videos 39.2% ± 24.5% 33.4% ± 26.9%
Note: Data are presented as mean ± SD values for continuous variables or n (%) for categorical variables. 23 students in online section reported watching class videos.

Selected questions progress

The heat map illustrates that students who did better in the course answered questions earlier (Figure 1). Furthermore, students who did worse in the course tended to leave more questions unanswered.

Fig 1
Figure 1. Selected questions progress. Students for each section are organised by final course grade; highest at top and lowest at bottom. Completion of first, middle, and last questions in each chapter are displayed on the x-axis. Black spaces represent unanswered questions.

Course grades compared with question performance

Positive linear trends were observed in both sections comparing final course grades to the percentage of correctly answered questions (Figure 2). Students were grouped based on question performance in 10% increments: mean questions correct, mean course grades and final letter grades were determined for each group (Table 1). Mean course grades were positively correlated with question performance groups. No students in the top group (90%–100% of questions answered correctly) earned D or F final course grades, while no students in the bottom two groups (<50% and 50%–59.9%) earned A or B final course grades. Furthermore, no students who answered at least 70% of the questions correctly earned a failing grade of F in the course.

Fig 2
Figure 2. Campus (R2 = 0.46) and online (R2 = 0.49) final course grades versus questions correct. R2 is the coefficient of determination of each section’s trend line.
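For reference, a trend line and coefficient of determination like those in Figure 2 can be computed as in the sketch below, which uses hypothetical paired values rather than the study data:

```python
import numpy as np
from scipy.stats import linregress

# Hypothetical paired values: percentage of questions answered correctly and
# final course grade (%) for students in one section.
questions_correct = np.array([45, 55, 62, 70, 76, 81, 84, 88, 92, 95])
final_grade = np.array([58, 60, 69, 74, 77, 83, 86, 88, 91, 94])

fit = linregress(questions_correct, final_grade)
print(f"slope = {fit.slope:.2f}, intercept = {fit.intercept:.1f}")
print(f"R^2 = {fit.rvalue ** 2:.2f}")  # coefficient of determination of the trend line
```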

Survey data

Response rate

The overall response rate was 84/109 students (77.1%) across both sections. The campus section had a higher response rate, 51/61 students (83.6%), compared with the online section, 33/48 students (68.8%). One student in each section did not finish the survey; these students’ responses were not included in the reported results.

Demographics

Online section students who completed the survey tended to be older and to have at least one degree (Associate or Bachelor) or some graduate-level education (Supplementary Table 4). The majority of students (61%) reported a grade in the course similar (within a 5% difference) to their typical course grades (Supplementary Figure 1). More campus section students reported similar grades than online section students (71% vs. 45%). A higher proportion of students earning ‘A’ and ‘B’ final course grades completed the survey than students earning ‘C’, ‘D’ and ‘F’ grades (Supplementary Table 5).

Frequency and preferences

The majority of students, 100% in the campus section and 90.9% in the online section, reported using the educational resource at least once per week (Table 2). One student reported never using the educational resource and skipped to the final, open-ended survey question. Of the students who reported using the educational resource, the majority (74.5% campus and 87.5% online) reported that they preferred and used the interactive educational resource. The full platform site (accessed using a desktop or laptop computer) was most often used by students in both sections.

Behaviours

Trends in student question completion behaviour were observed during the semester. Some students appeared to progress routinely through the material, answering questions while learning or reading (progressor), while other students answered all or nearly all of each chapter’s questions on or near the due date (procrastinator). In addition, a third group of students displayed behaviour similar to both categories (part progressor/part procrastinator). A further category of students may be considered non-completers, who answered some, but not all, questions. We asked students to identify which of the first three categories they felt best described them. The majority of students in the campus section selected part progressor/part procrastinator, while very few selected that they were procrastinators (Figure 3). In the online section, students were more evenly distributed across all three categories. The majority of respondents in both sections reported that they felt that the platform and/or embedded questions helped them to be more proactive in learning the material (Supplementary Table 6).

Fig 3
Figure 3. Student interactive platform user style based on question performance and self-report. ‘Data’ columns are based on question performance data from the interactive platform, and ‘reported’ columns are based on survey respondents’ self-reported user styles. User styles were determined based on completion of the middle question for each chapter. Progressor: answered the middle question >72 h before the due date in at least 50% of the chapters. Procrastinator: answered the middle question <24 h before the due date in at least 50% of the chapters. Part/part (part progressor/part procrastinator): answered the middle question 24–72 h before the due date at least 50% of the time or did not fall into any single category for at least 50% of the chapters. Non-completer: did not complete the selected question for at least 4/13 chapters.

More than half of the campus section fell into the progressor category based on the question performance analysis (Figure 3). Campus students self-reported as part progressor/part procrastinator more than three times as often as the performance data indicated. Online section students self-reported a breakdown of user styles more similar to the performance data, although students self-reported more often as progressors and part progressor/part procrastinators than as procrastinators. Students in both sections reported reading >80% of the educational resource on average (Table 3). Of the time that students used the educational resource, students in both sections reported that over half of that time was spent answering questions and reviewing answers. When students’ final grades were plotted against their percentage of correct questions and organised by question answering behaviour, a clear negative distinction could be observed between non-completers and the other three categories of students (Supplementary Figure 2 and Supplementary Table 7).

Students were asked to select and rank all applicable statements which described when they completed questions in relation to attending class/watching class videos and reading the educational resource. ‘Following class after or while reading the educational resource’ was selected most often by campus students (88%), followed by ‘before class after/while reading’ (66%) and ‘during class’ (56%; Supplementary Table 8). ‘Before watching class videos after or while reading’ was most often selected by the online students (52%) followed by ‘after or while reading’ (‘did not watch class videos’, 45%; Supplementary Table 9). These options were equally selected and ranked first by online students with approximately two-thirds of online students only selecting one option.

Many campus students were observed using the educational resource during class. All campus section respondents, and only those online students who reported watching class videos, were asked about their use of the educational resource during class/while watching class videos. All 50 of the campus students and 16/23 online students reported that they used the educational resource during class time/while watching class videos (Table 3). Campus students reported that 39% of their educational resource use was during class, while online students reported slightly less, 33%, was while watching class videos. Most campus students reported that they used the educational resource during class for completing daily in-class assignments (66%), reading (64%) and answering questions (62%; Supplementary Table 10). Online students reported that they used the educational resource while watching class videos mostly for reading (56%) and note taking (44%; Supplementary Table 10).

Campus and online students reported different uses of the features available in the interactive educational resource. Campus students reported more use of the highlighting, commenting and figures, while the online section reported more use of the links to external articles and embedded videos (Supplementary Table 11). Both sections reported a high use of the questions for review purposes, after the due date for extra credit, 84% campus section and 87% online section (Supplementary Table 11).

Motivations

Students were asked to select and rank all applicable motivations for their use of the interactive educational resource at the beginning of the semester. Students whose motivations changed were additionally asked about their motivations at the end of the semester. Extra credit was selected most often by students at the beginning of the semester (94% campus section and 97% online section; Supplementary Table 12). Students in the campus course most often selected extra credit as their first motivator followed by the desire to earn a good grade as their second motivator, while online students selected the desire to earn extra credit and a good grade more equally as first motivators, with more variation in the second selected motivator. When ease of use was selected as a motivator, it was most often ranked last. Learning and use as a study tool were most often selected and ranked third by students in both sections. For the students who reported that their motivators changed throughout the semester, the changes in rankings were similar among students in both sections (Supplementary Table 13). However, some differences were noted in where the different motivators were ranked. Extra credit and ease of use were ranked lower, suggesting that students were more motivated by these two factors at the beginning of the semester than at the end, while helpfulness for learning the material (learning) showed the largest increase in ranking, suggesting that students were more motivated to use the interactive educational resource to support their learning at the end of the semester than at the beginning.

Opinions

Students in both sections reviewed explanations more often when the questions were answered incorrectly (6.1 vs. 9.3 and 6.9 vs. 9.5 on a 0-10-point scale for campus and online sections, respectively; Supplementary Table 14). Some students in both sections reported never reviewing the explanations when questions were answered correctly, while no student in either section responded that they never reviewed explanations when they got the questions incorrect. Online students were more likely than campus students to review question explanations in general. Students in both sections reported similarly on the helpfulness of questions and explanations, with the online section reporting slightly higher values than the campus section.

Students in both sections reported similar Likert scale values on topics of satisfaction, material understanding, enjoyment and frequency of use (results were not significantly different between sections; Supplementary Table 15). Students most often rated that they were somewhat to mostly satisfied with the interactive educational resource. Students most often reported somewhat agree to agree, indicating that the questions and explanations provided in the interactive educational resource improved understanding of the material, comfort with the topics and overall confidence in the course. Students similarly most often reported somewhat agree to agree that they enjoyed using the interactive educational resource. Students additionally reported that they felt their use of the interactive educational resource was slightly to moderately greater than it would have been if only a PDF of the educational resource had been available. On a 0–10-point scale, students in both sections reported a high likelihood of recommending the interactive educational resource to future students, 8.9 ± 1.8 campus and 8.8 ± 2.2 online (mean ± SD).

All students who completed the survey had the opportunity to answer a final, optional open-ended question with any comments. Approximately one-quarter of students chose to leave an answer, 22% in the campus section and 26% in the online section, and these comments were summarised (Supplementary Table 16). The majority of students in the campus section left positive sentiments about their experiences, while the online students were more often critical in their responses. Some of the positive comments included: ‘I thought [interactive educational resource] was helpful, I wish I would have known how helpful it could have been at the beginning’ and ‘I preferred to use the [interactive educational resource] version because I could actively learn and complete the questions as I read…’. Several students commented that they liked to use the questions as study material, and one student mentioned that they wished more instructors used similar resources. Critical comments included one from a student who did not prefer the educational resource, ‘I would have much preferred a traditional textbook’, and another from a student who commented on the quality of the content: ‘I honestly think the [educational resource] is a great tool that just needs to be cleaned up a bit and have more meaningful questions’.

Discussion

An opportunity to incorporate new technology with an existing OER was hypothesised to increase student engagement. We wanted to understand how the students perceived the interactive educational resource and to analyse their behaviours.

Performance

We observed a positive correlation between question performance (correctness) and course performance (final grade) – students who did better on the questions did better in the course. This trend may be due in part to the students simply being different kinds of students – those who are high-achievers were going to do well in the course regardless of extra activities provided, and those who were going to do poorly in the course were not going to put in the same effort as high-performers. A similar trend was observed among students in a STEM class in 2017 which used an interactive educational resource with embedded questions and animations: A and B students had noticeably higher participation and reading rates than C, D and F students (Liberatore 2017). These data may also be explained by the fact that students who spend more time with the material, such as by reviewing answers to make sure they are answering the questions in the interactive platform correctly, are going to do better in the course. Student engagement, which can be described as commitment or effortful involvement in learning, is positively associated with student motivation and academic achievement (Henrie et al. 2015).

Question performance

Procrastinators versus progressors versus non-completers

Use of a heat map offered an interesting perspective/visual of student question performance across the entire semester in one figure. The progressor category was predictive of student performance – students in this category were similar and tended to earn higher final course grades. There was more overlap comparing the part progressors/part procrastinators group with the procrastinators group. However, students in the part progressors/part procrastinators group received higher mean scores in both course and question performance compared with the procrastinators group. Similar to our findings, others have observed that procrastination is associated with poorer performance (Michinov et al. 2011). Procrastination may be due in part to lack of interest in assignments (Ackerman and Gross 2005) and internal student barriers such as poor time management skills (Kachgal, Hansen, and Nutter 2001).

The large variation among final grades for students categorised as procrastinators may be explained by different types of procrastinators. Active procrastinators prefer to work under pressure, while passive procrastinators do not act when needed and fail to complete tasks on time (Chu and Choi 2005). Active procrastinators are more similar to non-procrastinators in terms of their time management, self-efficacy and academic performance (Chu and Choi 2005).

Frequency and preferences

Nearly all students who used the educational resource reported using it at least once per week with the majority reporting use of more than three times per week, which is an increase compared to findings reported in 2013 (Lindshield and Adhikari 2013b). The current survey results are encouraging and similar to reported student use with other interactive textbooks (Edgcomb et al. 2015). Increased engagement with course materials is further supported by student responses. The majority of students reported that they felt the interactive educational resource and/or embedded questions helped them to be more proactive in learning the material and that they felt they used it more as a result compared to if only a PDF was available.

Student behaviours regarding educational resource access have also changed. In 2010, many students reported use of two electronic versions and hard copies of the OER, with no single format being preferred (Lindshield and Adhikari 2011). Other research from 2010 indicated that students preferred traditional textbooks over e-textbooks (Woody, Daniel, and Baker 2010). In 2011–2012, some students, mostly in online sections, reported printing a hard copy of the OER (Lindshield and Adhikari 2013b). In the present study, the majority of students reported that they preferred and primarily used the interactive educational resource; only one student reported use of a printed copy. While it has been demonstrated that reading from printed materials increases performance and efficiency (Clinton 2019), format may not ultimately impact student performance (Daniel and Woody 2013). However, students may get more easily distracted when offered electronic text options (compared to traditional textbooks), which may increase the time that students take to read through assigned material (Daniel and Woody 2013). For this particular course, preference for the interactive educational resource may have been largely influenced by the embedded questions, only available on the online platform. These findings may also be partly explained by changes in student preferences and increased use of online university LMSs.

Behaviours

The majority of campus students selected part progressor/part procrastinator, while very few selected that they were procrastinators. While the low reported number of procrastinators was not unexpected, we were surprised by how few students reported that they felt they were progressors based on our categorisation of students by question performance data. Students may not have had a similar understanding of our groupings, or campus students may be more critical of their performance. Differences may also be explained by our methodology; we only used one data point from each chapter to classify students. These results are limited to the students who completed the survey – we cannot predict how students who did not complete the survey would have self-identified.

We also did not provide students the opportunity to categorise themselves as ‘non-completers’. Only 6/109 students were classified as non-completers. Two students displayed procrastinator behaviour for the questions they did answer, while the other four students displayed some tendencies of procrastinators, running out of time, but also some behaviour of progressors. The chapters these students missed were more often at the end of an exam section, while the chapters they completed early (progressor behaviour) were the first chapters of each new exam section. It is possible that these students felt motivated after each exam to do better, thus taking a more proactive approach before falling back into their usual study habits.

One of the more interesting findings from the survey questions about when students answered questions was that online students reported equally high frequencies of primarily answering questions before watching class videos or of not watching class videos, while the campus section most often reported answering questions following class (although during class and before class were other popular top-ranked options). The high frequency of students in the online section reporting that they answered the questions before watching the class videos may be partially explained by a difference in the types of students, possibly related to age, in the online versus campus sections. Few campus students, as few as one-third, read assigned materials before class (Skinner and Howes 2013).

Similar to the findings from 2011 to 2012, students in the online section reported increased use of embedded videos and links to external articles compared with the campus section (Lindshield and Adhikari 2013b). Many of the embedded videos and articles are shared in class, so students who routinely attend class in the campus course may not perceive as much value in these features. The increase in the percentage of students in both sections who reported using videos between 2011–2012 and 2019 may be partially explained by the convenience of the embedded videos; students did not have to navigate away from the interactive educational resource to view videos (Lindshield and Adhikari 2013b). This same explanation cannot be used for the reported increased use of links to external articles by the online section. However, it is possible that students had better access to the Internet in 2019; students in 2011 more often reported reading the OER as a PDF or printed copy, whereas in 2019 the majority of students reported primarily using the interactive platform, meaning they were already connected to the Internet and had fewer barriers to accessing the external articles. While students reported reading a large percentage of the book, it is interesting that they also reported spending more than half their time working on the questions.

Motivations

Students reported that they were highly motivated by extra credit, particularly in the campus course and at the beginning of the semester. Others have found that students report intentions to complete extra credit, but few actually do (Myers and Hatchel 2019). In the extra credit point award structure for this course, students did not have to complete all required tasks in order to earn partial extra credit, which may account for the high level of participation observed. Students earning higher grades, in general, tended to answer more of the questions and to answer more of them correctly, in part simply due to completing more questions. These findings align with research suggesting that higher-performing students are more likely to complete extra credit activities (Harrison, Meister, and Lefevre 2011; Silva and Gross 2004). We believe that the extra credit points offered were appropriate and did not contribute to grade inflation, considering that they accounted for approximately 1% of all possible course points, similar to extra credit offered previously (Haber and Sarkar 2017; Silva and Gross 2004).

When responses were analysed for those students who reported that their motivations changed throughout the semester, helpfulness for learning the material (learning) received the biggest jump in ranking, while extra credit and ease of use dropped in rankings. These changes suggest that students felt that using the interactive educational resource helped them learn the material. Several students reported in response to the final open-ended question that they increased their use because they found the interactive educational resource helpful for their learning and studying.

Opinions

Students in 2011–2012 and 2019 rated similarly their level of satisfaction with the educational resource (on a 7-point Likert scale: 2011–2012: campus 5.7, online 5.9 vs. 2019: campus 5.6, online 5.5) (Lindshield and Adhikari 2013b). It is encouraging that students reported similar positive levels of satisfaction with the educational resource over the years.

When students answered a question incorrectly, the vast majority reported that they looked at the explanation every time. The goal of the explanations was to provide students with immediate feedback and additional clarification. Because students used the explanations, particularly when they incorrectly answered a question, it appears that most students used the questions to assess their understanding of the material (as a method of formative assessment). The immediate feedback also likely contributed positively to students’ experiences, providing them with the correct answer and information they could use immediately. This is consistent with previous findings that students strongly like and accept the use of technology in the classroom and in learning materials to provide instant feedback (Elmahdi, Al-Hattami, and Fawzi 2018; Liberatore 2017; Lim 2017; Viegas et al. 2015). We observed that many students reached out, either through email or the course discussion board, if the explanation did not provide enough clarity. Because the educational resource, including the embedded questions, is a living document, the quality of the text, questions and explanations continues to improve through students’ questions and interaction.

Limitations

The quality of these data is limited by the relatively small number of students (109 included in the course data and 84 survey respondents) and by the use of the interactive platform for only one semester. This research is representative of a single OER for one course at one institution in a platform that was not open; outcomes may be different in a different setting. These findings may also not be applicable to students in primary or secondary schools or in non-4-year university settings, such as technical or 2-year colleges.

Conclusions

Increased use of, and question correctness in, the interactive educational resource were positively associated with final grades earned in a human nutrition course at a Midwestern University. Students reported that they were satisfied with their experiences with the interactive educational resource, and they believed that they used it more and preferred it over a static PDF. While these findings are limited to one semester at one university in a platform that was not open, they support future research efforts into the efficacy of interactive educational resources and OER-enabled pedagogy. In support of this effort, the resource has been migrated to the LibreTexts platform to make it truly open (Lindshield 2020).

References

Ackerman, D. S. & Gross, B. L. (2005) ‘My instructor made me do it: task characteristics of procrastination’, Journal of Marketing Education, vol. 27, no. 1, pp. 5–13. doi: 10.1177/0273475304273842

Akbar, M. (2016) ‘Digital technology shaping teaching practices in higher education’, Frontiers in ICT, vol. 3, pp. 1–5. doi: 10.3389/fict.2016.00001

Bushway, S. D. & Flower, S. M. (2002) ‘Helping criminal justice students learn statistics: a quasi-experimental evaluation of learning assistance’, International Journal of Phytoremediation, vol. 21, no. 1, pp. 35–56. doi: 10.1080/10511250200085321

Chen, P. S. D., Lambert, A. D. & Guidry, K. R. (2010) ‘Engaging online learners: the impact of Web-based learning technology on college student engagement’, Computers and Education, vol. 54, no. 4, pp. 1222–1232. doi: 10.1016/j.compedu.2009.11.008

Chu, A. H. C. & Choi, J. N. (2005) ‘Rethinking procrastination: positive effects of “active” procrastination behavior on attitudes and performance’, Journal of Social Psychology, vol. 145, no. 3, pp. 245–264. doi: 10.3200/SOCP.145.3.245-264

Clinton, V. (2019) ‘Reading from paper compared to screens: a systematic review and meta-analysis’, Journal of Research in Reading, vol. 42, no. 2, pp. 288–325. doi: 10.1111/1467-9817.12269

Clinton, V. & Khan, S. (2019) ‘Efficacy of open textbook adoption on learning performance and course withdrawal rates: a meta-analysis’, AERA Open, vol. 5, no. 3, pp. 1–20. doi: 10.1177/2332858419872212

Clinton, V., Legerski, E. & Rhodes, B. (2019) ‘Comparing student learning from and perceptions of open and commercial textbook excerpts: a randomized experiment’, Frontiers in Education, vol. 4, pp. 1–12. doi: 10.3389/feduc.2019.00110

Colvard, N. B., Watson, C. E. & Park, H. (2018) ‘The impact of open educational resources on various student success metrics’, International Journal of Teaching and Learning in Higher Education, vol. 30, no. 2, pp. 262–276.

Daniel, D. B. & Woody, W. D. (2013) ‘E-textbooks at what cost? Performance and use of electronic v. print texts’, Computers and Education, vol. 62, pp. 18–23. doi: 10.1016/j.compedu.2012.10.016

Delimont, N., Turtle, E. C., Bennett, A., Adhikari, K. & Lindshield, B. L. (2016) ‘University students and faculty have positive perceptions of open/alternative resources and their utilization in a textbook replacement initiative’, Research in Learning Technology, vol. 24. doi: 10.3402/rlt.v24.29920

Edgcomb, A. et al. (2014) Student Performance Improvement using Interactive Textbooks: A Three-University Cross-Semester Analysis. Available at: http://static.cs.ucr.edu/store/techreports/UCR-CSE-2014-10030.pdf

Edgcomb, A. et al. (2015) Student Usage and Behavioral Patterns with Online Interactive Textbook Materials. Available at: http://alumni.cs.ucr.edu/~aedgcomb/papers/ICERI15_StudentUsageandBehavioralPatternswithOnlineInteractiveTextbookMaterials.pdf

Elmahdi, I., Al-Hattami, A. & Fawzi, H. (2018) ‘Using technology for formative assessment to improve students’ learning’, Turkish Online Journal of Educational Technology –TOJET, vol. 17, no. 2, pp. 182–188.

Feldstein, A. et al. (2012) ‘Open textbooks and increased student access and outcomes’, European Journal of Open, Distance and E-Learning, vol. 15, no. 2. Available at: https://eric.ed.gov/?id=EJ992490

Haber, J. & Sarkar, N. (2017) ‘Sensitivity analysis of extra credit assignments’, Universal Journal of Management, vol. 5, no. 6, pp. 291–300. doi: 10.13189/ujm.2017.050604

Harrison, M. A., Meister, D. G. & Lefevre, A. J. (2011) ‘Which students complete extra-credit work’, College Student Journal, vol. 45, no. 3, pp. 550–555.

Henrie, C. R., Halverson, L. R. & Graham, C. R. (2015) ‘Measuring student engagement in technology-mediated learning: a review’, Computers and Education, vol. 90, pp. 36–53. doi: 10.1016/j.compedu.2015.09.005

Holmes, N. (2018) ‘Engaging with assessment: increasing student engagement through continuous assessment’, Active Learning in Higher Education, vol. 19, no. 1, pp. 23–34. doi: 10.1177/1469787417723230

Inside Higher Ed. (2019) Textbook Trade-Offs. Available at: https://www.insidehighered.com/news/2018/07/26/students-sacrifice-meals-and-trips-home-pay-textbooks

Kachgal, M. M., Hansen, L. S. & Nutter, K. J. (2001) ‘Academic procrastination prevention/intervention: strategies and recommendations’, Journal of Developmental Education, vol. 25, no. 1, pp. 14–24.

Liberatore, M. W. (2017) ‘High textbook reading rates when using an interactive textbook for a material and energy balances course’, Chemical Engineering Education, vol. 51, no. 3, pp. 109–118.

Lim, W. N. (2017) ‘Improving student engagement in higher education through mobile-based interactive teaching model using socrative’, in IEEE Global Engineering Education Conference, EDUCON, pp. 404–412. doi: 10.1109/EDUCON.2017.7942879

Lindshield, B. L., & Adhikari, K. (2011). The Kansas State University Human Nutrition (HN 400) Flexbook. EDUCAUSE Review Online. Available at: https://er.educause.edu/articles/2011/12/the-kansas-state-university-human-nutrition-hn-400-flexbook

Lindshield, B. L., & Adhikari, K. (2013a) ‘Campus and online U.S. college students’ attitudes toward an open educational resource course fee: a pilot study’. International Journal of Higher Education, vol. 2, no. 4, pp. 42–51. doi: 10.5430/ijhe.v2n4p42

Lindshield, B. L., & Adhikari, K. (2013b). ‘Online and campus college students like using an open educational resource instead of a traditional textbook’. MERLOT Journal of Online Learning and Teaching, vol. 9, no. 1, pp. 26–38. Available at: http://jolt.merlot.org/vol9no1/lindshield_0313.htm

Lindshield, B. (2020) Human Nutrition (Lindshield). Available at: https://med.libretexts.org/Courses/Kansas_State_University/Book%3A_Human_Nutrition_(Lindshield)

Michinov, N. et al. (2011) ‘Procrastination, participation, and performance in online learning environments’, Computers and Education, vol. 56, no. 1, pp. 243–252. doi: 10.1016/j.compedu.2010.07.025

Mills, M. S. (2016) ‘A case for authoring multi-touch interactive open educational resources’, TechTrends, vol. 60, no. 5, pp. 456–464. doi: 10.1007/s11528-016-0097-5

Myers, C. A. & Hatchel, J. M. (2019) ‘Personality and cognitive factors related to completing extra credit assignments’, International Journal for the Scholarship of Teaching and Learning, vol. 13, no. 2, pp. 1–7.

O’Bannon, B. W., Skolits, G. J. & Lubke, J. K. (2017) ‘The influence of digital interactive textbook instruction on student learning preferences, outcomes, and motivation’, Journal of Research on Technology in Education, vol. 49, no. 3–4, pp. 103–116. doi: 10.1080/15391523.2017.1303798

Rockinson-Szapkiw, A. J. et al. (2013) ‘Electronic versus traditional print textbooks: a comparison study on the influence of university students’ learning’, Computers and Education, vol. 63, pp. 259–266. doi: 10.1016/j.compedu.2012.11.022

Sharp, L. A. & Sharp, J. H. (2016) Enhancing student success in online learning experiences through the use of self-regulation strategies [online]. Available at: https://www.researchgate.net/publication/304073173

Silva, F. J. & Gross, T. F. (2004) ‘The rich get richer: students’ discounting of hypothetical delayed rewards and real effortful extra credit’, Psychonomic Bulletin and Review, vol. 11, no. 6, pp. 1124–1128. doi: 10.3758/BF03196747

Skinner, D. & Howes, B. (2013) ‘The required textbook friend or foe? Dealing with the dilemma’, Journal of College Teaching & Learning (TLC), vol. 10, no. 2, pp. 133–142. doi: 10.19030/tlc.v10i2.7753

The College Board. (2019) Trends in College Pricing 2019. Available at: www.collegeboard.org

UNESCO. (2019) Recommendation on Open Educational Resources (OER) [online]. Available at: http://portal.unesco.org/en/ev.php-URL_ID=49556&URL_DO=DO_TOPIC&URL_SECTION=201.html

Viegas, C., Alves, G. & Lima, N. (2015) ‘Formative assessment diversity to foster students engagement’, in Proceedings of 2015 International Conference on Interactive Collaborative Learning, ICL 2015, pp. 929–935. doi: 10.1109/ICL.2015.7318152

Wiley, D. (2018) Defining the open in open content and open educational resources [online]. Available at: https://opencontent.org/definition/

Woody, W. D., Daniel, D. B. & Baker, C. A. (2010) ‘E-books or textbooks: students prefer textbooks’, Computers and Education, vol. 55, no. 3, pp. 945–948. doi: 10.1016/j.compedu.2010.04.005