Trajectories of engagement: a repeated cross-sectional investigation of student perceptions of an online learning environment

Stuart Palmer* and Dale Holt

Institute of Teaching and Learning, Deakin University, Geelong, VIC., Australia

(Received 4 November 2011; final version received 12 January 2012; Published 24 September 2012)

Abstract

Evaluations of online learning environments (OLEs) often present a snapshot of system use. It has been identified in the literature that extended evaluation is required to reveal statistically significant developments in the evolution of system use over time. The research presented here draws on student OLE evaluation surveys run over the period 2004–2011, comprising nearly 6800 responses that explore students’ perceptions of the importance of, and satisfaction with, elements of their OLE. Across the survey period, satisfaction ratings for all OLE elements rose significantly, suggesting a positive student engagement with the OLE over time. The corresponding ratings of importance of OLE elements generally also rose significantly, though a number of elements registered no significant difference in the first two years of the survey, suggesting that short-period surveys may struggle to reveal statistically significant trends. OLE element use appeared to be closely linked to perceived value. The OLE elements with the highest mean importance and satisfaction ratings related to student access of online learning resources. Other detailed results are also reported. We demonstrate a method for, and one large-scale case study of, quantifying and visualising the trajectories of engagement that students have had with an institutional OLE over time.

Keywords: online learning environment; learning management system; repeated cross-sectional evaluation; student survey

*Corresponding author. Email: spalm@deakin.edu.au

RLT 2012. © 2012 S. Palmer and D. Holt. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons “Attribution 3.0 Unported (CC BY 3.0)” license (http://creativecommons.org/licenses/by/3.0/) permitting use, reuse, distribution and transmission, and reproduction in any medium, provided the original work is properly cited.

Citation: Research in Learning Technology 2012, 20: 17143 - http://dx.doi.org/10.3402/rlt.v20i0.17143

Introduction

In Australia, Deakin University is a major provider of distance and online education. In addition, it teaches on-campus at four campuses located in three cities in the State of Victoria. Initially, in the late 1970s, Deakin University saw itself as a major distance education provider, with some degree of separation between the teaching methods and materials used for on-campus teaching and those used for off-campus teaching. The use of distance education methodologies and materials for both student cohorts gathered momentum in the early to mid-1990s under the strategic umbrella of flexible teaching and learning, and with a growing “technological imperative” (Holt and Thompson 1995) for the use of online systems for learning delivery and communication. In more recent times the university has implemented institution-wide online teaching and learning systems to provide opportunities to bring together all students in the one learning community. Iterating through a number of commercial learning management systems (LMSs), the university eventually settled on the WebCT LMS in 2003, branding it internally as Deakin Studies Online (DSO). The new LMS was trialled in 2003, and fully implemented in 2004. Concurrently, the university introduced policies requiring academic departments to migrate all online learning environment (OLE) activity to the centrally supported LMS – at that time the LMS officially became the institutional OLE. Since that time, the OLE has expanded through supplementing the LMS with a range of satellite technologies including synchronous communications, lecture recording and streaming, plagiarism detection, etc.; however, the LMS remains the core of the OLE. Another key initiative in the university's strategy to expand its online and distance education profile was to require that, from 2004, all its units of study have at least a basic online presence. Additionally, from 2004, all students enrolled in Deakin University undergraduate courses had to undertake at least one unit wholly online, with few exemptions given. In early 2006, WebCT was acquired by Blackboard Inc., leading to the phasing out of new development and support for the WebCT LMS. In 2010 Deakin University selected the Desire2Learn LMS as the replacement system for the WebCT/Blackboard LMS, and during 2011 commenced a phased cut-over to the new system, migrating content and users throughout the year so that, by the first teaching period in 2012, all online support for teaching and learning would be provided via the OLE based on the new LMS.

OLEs are perhaps currently the most widely used and most expensive educational technology tools (Lonn and Teasley 2009; Salinas 2008; West, Waddoups, and Graham 2007), and like many other learning technology trends before them, have been adopted by institutions almost automatically, uncritically (Reynolds, Treharne, and Tripp 2003) and without evaluation of their effectiveness (Mahdizadeh, Biemans, and Mulder 2008). Where OLE evaluation is documented, it often presents a point-in-time snapshot of the use of the system, but does not capture the development of system use over time (Lonn and Teasley 2009). Extended evaluation of OLE usage is required to reveal the detail in the evolution of system use (Browne, Jenkins, and Walker 2006; Conrad 2005), and there is a call in the literature for the application of extended and/or repeated evaluation to better understand the impact of OLEs and to optimise their use in online teaching and learning (Bates and Khasawneh 2007; Davis and Wong 2007; Drennan, Kennedy, and Pisarski 2005; Mikropoulos and Natsis 2011). Existing published extended evaluations of OLEs are often those compiled by industry bodies and comprised of data drawn from across the sector (Browne, Jenkins, and Walker 2006; Smith and Caruso 2010). Such evaluations are valuable for sector benchmarking comparisons, but potentially subsume the nuances of the specific characteristics and context of individual institutions.

Given the scope of Deakin University's commitment (in terms of central infrastructure, policy development and roll-out of online components to all taught units) to online education, it was considered essential to evaluate the effectiveness of this investment. In 2003, a pilot survey of staff and students using DSO was conducted to establish perceptions of importance and satisfaction with various elements of the OLE. Following the full mainstreaming of DSO in 2004, the survey instrument was revised, the survey process was expanded to include all Deakin University staff and students, and the survey was repeated in 2005. The survey was administered using a university online survey tool. These surveys produced a large pool of data that provided insights into the initial engagement with the institutional OLE by students and staff, and some aspects of the student survey results have been reported previously (Challis 2005; Palmer and Holt 2010).

This previous research covered only the first two years of the full-scale system roll-out. While evaluation of two years of operation can reveal interesting trends (Lonn and Teasley 2009), the observed differences may be only minor and/or statistically insignificant (Smith and Caruso 2010). A longer evaluation time frame may be required to reveal the deepest insights into the adoption and use of a new OLE (Davis and Wong 2007). Evaluation surveys that include consistent question items over a longer period of time permit more robust extended analysis of data trends (Browne, Jenkins, and Walker 2006; Smith and Caruso 2010). In 2011, a new DSO evaluation survey was developed and administered to all students and staff. Crucially, the set of question items relating to use and perception of the core elements of the OLE was largely common with the previous surveys run in 2004 and 2005. This most recent evaluation survey serves two important purposes. Firstly, as a measure of the current state-of-play with the current (but soon to be retired) LMS, it provides a benchmark against which the new LMS can be compared when it eventually comes into full-scale use. Secondly, it provides a bookend measure of the development of student and staff perceptions of the value of the existing OLE across the seven-year period between 2004 and 2011.

The research presented here focuses on the latter aspect – drawing on the large (nearly 6800 survey responses in total) and representative samples of student responses to the common DSO evaluation survey questions relating to perceptions of importance and satisfaction with core OLE elements in the years 2004, 2005 and 2011. This research responds to the call in the literature for additional extended investigation of OLE impact, and contributes to the literature by documenting a significant repeated cross-sectional institutional OLE evaluation – in terms of both respondent numbers and the time period covered. Statistically significant changes in student perceptions of the OLE over time are identified; the magnitude and direction of these changes are explored; and the nature of these changes is analysed. We provide a method for, and one large-scale case study of, quantifying and visualising the trajectories of engagement over time that students have had with an institutional OLE.

Methodology

Full details of the 2004 and 2005 DSO evaluation survey, its methodology, respondent samples and results have been presented previously (Palmer and Holt 2010). Broadly, all three instances of the DSO evaluation survey sought responses from students relating to their demographic background; their ratings of the importance of, and satisfaction with, a range of elements of the OLE; their overall satisfaction with DSO; and open-ended written comments.

The demographic information was used to test whether the sample respondent group was representative of the overall population of enrolled students. All three surveys contained 13 common OLE elements for which respondents were asked to indicate both their rating of importance and their level of satisfaction using ordinal response scales. Response scales of 1–7 were used in 2004 and 2005, and a response scale of 1–5 was used in 2011. The 13 common OLE elements were:

  1. Accessing unit guide and other unit information;
  2. Accessing unit lecture, tutorial or lab notes etc.;
  3. Interacting with unit learning resources;
  4. Using the unit calendar;
  5. Contacting teachers via internal unit messaging;
  6. Contacting students via internal unit messaging;
  7. Reading contributions to online discussions;
  8. Contributing to online discussions;
  9. Completing online quizzes/tests;
  10. Submitting assignments;
  11. Receiving feedback on assignments;
  12. Working collaboratively in a group; and
  13. Reviewing unit progress.

All three surveys also asked respondents to indicate, using a response scale of 1–5, their level of agreement with a range of overall satisfaction measures relating to DSO. The overall satisfaction item “the use of DSO enhanced my learning experience” was included in all three surveys. The open-ended written comments collected in all three instances of the survey are a large and rich source of qualitative data in their own right, but are not included here due to practical space limitations.

This study did not attempt to follow a cohort of specific respondents over time – for many students this would not be feasible. We sought to include students from all years of study, including those in the final year of their studies who would be able to respond in only one survey year. Even if initially surveyed in the first year of their studies, virtually no undergraduate students would still be enrolled in the same program of studies at the end of the period covered in this study. Instead, we sought a representative sample of the student population in each survey year. Here we compare the ratings of importance and satisfaction for each of the 13 OLE elements and identify any statistically significant differences across the three surveys. We also develop a method for visualising the trajectory of the importance-satisfaction data across the three surveys. Finally we compare the level of agreement with the common overall satisfaction item across the three surveys. Together these analyses provide insights into the development of student engagement with the OLE over an extended time period. As required by Deakin University human research ethics procedures, all of the surveys were anonymous and voluntary.

Findings and discussion

Response rate and demographic information

Table 1 presents a summary of the response rates obtained in the 2004 and 2005 DSO evaluation surveys. The demographic match between the sample and population for both years was generally very good across the dimensions of gender, mode of study, level of study, enrolled faculty and enrolled campus. The full comparison has been detailed elsewhere (Palmer and Holt 2010).


Table 1.  Response summary for 2004 and 2005 DSO evaluation surveys.
Year Enrolled population Respondent sample Response rate
2004 31641 2908 9.2%
2005 32354 2526 7.8%

In 2011, a new online system was used to administer the survey; it saved all responses progressively as they were entered, resulting in differential response rates for different sections of the survey. The 2011 effective response rate for those completing the entire survey was 5.8%, although higher response rates were obtained for some sections of the survey. A range of demographic information was available for the overall Deakin University student population, as well as being collected as part of the survey, including gender, enrolled faculty, enrolled campus and duration of current enrolment. This permitted a comparison between the respondent sample and the overall student population on these demographic dimensions, as presented in Table 2.


Table 2.  Response rate and demographic information for 2011 DSO evaluation survey.
Population Sample
Number of students 22760 1322
Gender
  Female 59.5% 67.8%
  Male 40.5% 32.2%
Faculty
  Arts and Education 30.4% 30.1%
  Business and Law 36.9% 28.6%
  Health 19.8% 22.9%
  Science and Technology 12.9% 16.9%
  Other - 1.5%
Campus
  Geelong – Waurn Ponds 12.1% 14.1%
  Geelong – Waterfront 5.8% 7.5%
  Melbourne – Burwood 50.3% 43.5%
  Warrnambool 3.2% 3.5%
  Off-campus 28.6% 31.4%
Mean enrolment duration 2.24 years 2.42 years

In 2011, the survey was administered in the first teaching period, and it was not possible to include newly commencing students in the population sample; hence, the size of the target population group was somewhat reduced compared to 2004 and 2005. However, we attempted to quantify the possible impact of this slightly different population group on student perceptions of online aspects of their study. Deakin University conducts an evaluation of teaching for every offering of the majority of its units of study. This evaluation asks students to indicate their level of agreement (on a scale of 1–5) with a range of statements, including item 9 – “The online teaching and resources in this unit enhanced my learning experience”. Commencing students are most likely to be found enrolled in first level units of study offered in the first teaching period of the year. Using data from the Deakin University student evaluation of teaching database for an entire annual period (mid-2009 to mid-2010), we compared the mean ratings for item 9 for first level units offered in the first teaching period to those from all other units of study. The means were identical to the fourth significant figure (first level = 3.7657; other = 3.7656) and, following confirmation of homogeneity of variance, an analysis of variance (ANOVA) test indicated no significant difference between the ratings from the two classes of units (F(1, 427) = 3×10−6; p > 0.998).
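
For readers wishing to reproduce this kind of check, a minimal Python sketch of the homogeneity-of-variance test and one-way ANOVA described above is given below; the per-unit mean ratings used here are simulated placeholders, not the actual evaluation data.

    # Minimal sketch of the homogeneity-of-variance check and one-way ANOVA
    # described above. The arrays below are illustrative placeholders for the
    # per-unit mean ratings of evaluation item 9.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    first_level_t1 = rng.normal(3.77, 0.4, 150)   # hypothetical first-level, trimester 1 units
    all_other_units = rng.normal(3.77, 0.4, 280)  # hypothetical remaining units

    # Levene's test: a large p-value supports the homogeneity-of-variance assumption
    lev_stat, lev_p = stats.levene(first_level_t1, all_other_units)

    # One-way ANOVA comparing the two groups of unit means
    f_stat, f_p = stats.f_oneway(first_level_t1, all_other_units)

    print(f"Levene p = {lev_p:.3f}; F = {f_stat:.3g}, p = {f_p:.3f}")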

Although the response rates obtained in all years were comparatively low, they were not unexpected for an online voluntary survey (Cook, Heath, and Thompson 2000). The generally good match between the sample and population demographic characteristics in all years, and the confirmation that commencing students do not hold significantly different views from other students about the value of online aspects of their study, suggest that we can have some confidence in drawing more general inferences from the respondent data.

Extended importance-satisfaction analysis

In all three years, the DSO evaluation survey asked respondents to rate the importance of, and their satisfaction with, a range of elements of the OLE at Deakin University. In 2004 and 2005, a rating scheme of 1 = low importance/satisfaction and 7 = high importance/satisfaction was used. In 2011 a rating scheme of 1–5 was employed. Research indicates that re-scaling of scale item data is possible (Dawes 2008), and in the following analysis we have re-scaled the 2004 and 2005 importance and satisfaction data to the range 1–5 (a sketch of this re-scaling is given after Table 3). For both importance and satisfaction, a “not applicable” option was also provided so that students not using a particular element could avoid having to provide a contrived rating. Table 3 provides a summary of the mean responses for the importance and satisfaction ratings from all three years, with the percentages of “not applicable” responses (which are the same for importance and satisfaction rating pairs) shown in parentheses.


Table 3.  Mean importance and satisfaction ratings for 2004, 2005 and 2011.
Mean rating (1–5) (n/a percentage shown in brackets)
OLE element (Importance and Satisfaction) 2004 2005 2011
1 Accessing unit guide and other unit information (Imp) 4.30 (1.9) 4.51 (0.9) 4.72 (0.2)
Accessing unit guide and other unit information (Sat) 3.42 3.71 4.13
2 Accessing unit lecture, tutorial or lab notes etc. (Imp) 4.60 (3.4) 4.65 (2.7) 4.81 (2.0)
Accessing unit lecture, tutorial or lab notes etc. (Sat) 3.31 3.58 3.90
3 Interacting with unit learning resources (Imp) 4.05 (7.7) 4.02 (7.3) 4.43 (1.1)
Interacting with unit learning resources (Sat) 3.12 3.34 3.73
4 Using the unit calendar (Imp) 2.35 (20.6) 2.20 (25.0) 2.91 (15.2)
Using the unit calendar (Sat) 2.65 2.81 3.18
5 Contacting teachers via internal unit messaging (Imp) 4.03 (7.4) 4.02 (6.5) 4.08 (6.1)
Contacting teachers via internal unit messaging (Sat) 2.99 3.31 3.59
6 Contacting students via internal unit messaging (Imp) 3.39 (11.7) 3.38 (10.1) 3.61 (9.5)
Contacting students via internal unit messaging (Sat) 2.94 3.28 3.56
7 Reading contributions to online discussions (Imp) 3.92 (9.0) 4.02 (5.2) 4.28 (1.9)
Reading contributions to online discussions (Sat) 3.30 3.61 3.82
8 Contributing to online discussions (Imp) 3.62 (10.3) 3.63 (7.5) 3.97 (2.0)
Contributing to online discussions (Sat) 3.10 3.44 3.75
9 Completing online quizzes/tests (Imp) 3.60 (32.1) 3.83 (26.1) 4.32 (13.3)
Completing online quizzes/tests (Sat) 2.93 3.34 3.76
10 Submitting assignments (Imp) 4.45 (21.3) 4.50 (21.8) 4.70 (4.9)
Submitting assignments (Sat) 2.95 3.27 3.70
11 Receiving feedback on assignments (Imp) 4.48 (17.7) 4.54 (18.7) 4.63 (7.0)
Receiving feedback on assignments (Sat) 2.53 2.76 3.29
12 Working collaboratively in a group (Imp) 3.41 (29.4) 3.34 (30.0) 3.76 (17.2)
Working collaboratively in a group (Sat) 2.67 2.86 3.09
13 Reviewing unit progress (Imp) 4.21 (15.1) 4.26 (14.3) 4.07 (9.8)
Reviewing unit progress (Sat) 2.70 2.98 3.24
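
As noted above, the 2004 and 2005 ratings (collected on a 1–7 scale) were re-scaled to the 1–5 range for Table 3. The exact transformation is not specified above; assuming a simple endpoint-preserving linear mapping, it can be sketched in Python as follows.

    # Assumed endpoint-preserving linear re-scaling of a 1-7 rating onto 1-5.
    # "Not applicable" responses are excluded before any averaging.
    def rescale_1_7_to_1_5(rating):
        """Map a rating on the 1-7 scale linearly onto the 1-5 scale."""
        return (rating - 1) * (5 - 1) / (7 - 1) + 1

    # The endpoints and midpoint are preserved: 1 -> 1, 4 -> 3, 7 -> 5.
    assert rescale_1_7_to_1_5(1) == 1.0
    assert rescale_1_7_to_1_5(4) == 3.0
    assert rescale_1_7_to_1_5(7) == 5.0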

A definitive indication of the significance of the differences between the mean ratings for an item over the three surveys is obtained from an ANOVA test using the item rating as the dependent variable and survey year as the grouping variable. A requirement for the standard ANOVA test is that the variance of the ratings be similar in all three survey years. Levene's test of homogeneity of variance failed for most survey items, and in these circumstances a robust ANOVA test using the Welch test statistic was performed instead. Significant differences in mean ratings between years were observed for most survey items. To establish which year pairs have significant differences in mean ratings, post-hoc pair-wise testing was performed. Where equal variance was assumed, Tukey's “honestly significant difference” post-hoc test was used; where equal variance was not assumed, Tamhane's T2 post-hoc test was used (a sketch of this testing sequence is given after Table 4). Based on a significance level of p < 0.01, Table 4 indicates the presence of significant differences in mean survey item ratings between survey year pairs – “No” indicates no significant difference; “↑” indicates a positive significant difference; and “↓” indicates a negative significant difference.


Table 4.  Significant differences in mean ratings for year pairs 2004, 2005 and 2011.
Significant difference?
OLE element (Importance and Satisfaction) 04–05 04–11 05–11
1 Accessing unit guide and other unit information (Imp) ↑ ↑ ↑
Accessing unit guide and other unit information (Sat) ↑ ↑ ↑
2 Accessing unit lecture, tutorial or lab notes etc. (Imp) No ↑ ↑
Accessing unit lecture, tutorial or lab notes etc. (Sat) ↑ ↑ ↑
3 Interacting with unit learning resources (Imp) No ↑ ↑
Interacting with unit learning resources (Sat) ↑ ↑ ↑
4 Using the unit calendar (Imp) ↓ ↑ ↑
Using the unit calendar (Sat) ↑ ↑ ↑
5 Contacting teachers via internal unit messaging (Imp) No No No
Contacting teachers via internal unit messaging (Sat) ↑ ↑ ↑
6 Contacting students via internal unit messaging (Imp) No ↑ ↑
Contacting students via internal unit messaging (Sat) ↑ ↑ ↑
7 Reading contributions to online discussions (Imp) ↑ ↑ ↑
Reading contributions to online discussions (Sat) ↑ ↑ ↑
8 Contributing to online discussions (Imp) No ↑ ↑
Contributing to online discussions (Sat) ↑ ↑ ↑
9 Completing online quizzes/tests (Imp) ↑ ↑ ↑
Completing online quizzes/tests (Sat) ↑ ↑ ↑
10 Submitting assignments (Imp) No ↑ ↑
Submitting assignments (Sat) ↑ ↑ ↑
11 Receiving feedback on assignments (Imp) No ↑ ↑
Receiving feedback on assignments (Sat) ↑ ↑ ↑
12 Working collaboratively in a group (Imp) No ↑ ↑
Working collaboratively in a group (Sat) ↑ ↑ ↑
13 Reviewing unit progress (Imp) No ↓ ↓
Reviewing unit progress (Sat) ↑ ↑ ↑
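
The testing sequence described before Table 4 can be sketched as follows. This is an illustrative Python approximation only: the Welch-adjusted ANOVA is implemented from its standard formula, and because Tamhane's T2 is not available in SciPy, the unequal-variance pairwise comparisons are approximated here with Welch t-tests and a Bonferroni correction rather than the exact post-hoc test reported above.

    # Illustrative sketch of the rating-comparison procedure: Levene's test for
    # homogeneity of variance, a standard or Welch-adjusted one-way ANOVA, and
    # post-hoc pairwise tests (Tukey's HSD, or Welch t-tests with a Bonferroni
    # correction as a stand-in for Tamhane's T2).
    from itertools import combinations
    import numpy as np
    from scipy import stats

    def welch_anova(*groups):
        """Welch-adjusted one-way ANOVA; returns (F, df1, df2, p)."""
        k = len(groups)
        n = np.array([len(g) for g in groups], dtype=float)
        m = np.array([np.mean(g) for g in groups])
        v = np.array([np.var(g, ddof=1) for g in groups])
        w = n / v
        grand = np.sum(w * m) / np.sum(w)
        num = np.sum(w * (m - grand) ** 2) / (k - 1)
        tmp = np.sum((1 - w / np.sum(w)) ** 2 / (n - 1))
        den = 1 + 2 * (k - 2) / (k ** 2 - 1) * tmp
        f = num / den
        df1, df2 = k - 1, (k ** 2 - 1) / (3 * tmp)
        return f, df1, df2, stats.f.sf(f, df1, df2)

    def compare_years(ratings_by_year, alpha=0.01):
        """ratings_by_year: dict mapping survey year -> 1-D array of item ratings."""
        years = sorted(ratings_by_year)
        samples = [np.asarray(ratings_by_year[y], dtype=float) for y in years]

        equal_var = stats.levene(*samples).pvalue > alpha
        omnibus_p = (stats.f_oneway(*samples).pvalue if equal_var
                     else welch_anova(*samples)[3])

        pairwise = {}
        pairs = list(combinations(range(len(years)), 2))
        if equal_var:
            tukey = stats.tukey_hsd(*samples)          # SciPy >= 1.8
            for i, j in pairs:
                pairwise[(years[i], years[j])] = tukey.pvalue[i, j]
        else:
            for i, j in pairs:                         # approximation to Tamhane's T2
                p = stats.ttest_ind(samples[i], samples[j], equal_var=False).pvalue
                pairwise[(years[i], years[j])] = min(1.0, p * len(pairs))
        return omnibus_p, pairwise

    # Illustrative use with simulated (not actual survey) ratings:
    rng = np.random.default_rng(1)
    demo = {2004: rng.normal(3.1, 1.1, 2000),
            2005: rng.normal(3.4, 1.0, 1800),
            2011: rng.normal(3.75, 0.9, 1300)}
    print(compare_years(demo))

The pairwise p-values are then read against the p < 0.01 threshold used in Table 4.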

Figures 1 and 2 present one way to visualise the extended importance and satisfaction ratings for the OLE elements over the three DSO evaluation surveys. For each survey year, the corresponding mean importance and satisfaction values are plotted as a point on a two-dimensional chart of importance (vertical axis) versus satisfaction (horizontal axis). The points for all three years for each OLE element are joined to form a two-segment vector chain that describes the trajectory of the mean ratings of importance and satisfaction for that OLE element over the period of the three surveys. The numbering of the OLE element trajectories in the charts below is the same as that used above. Note that compressed scales are used on both axes. For clarity, the OLE elements have been separated into two figures to avoid overlapping of trajectory lines. To permit absolute comparisons between figures, both figures have been plotted using the same axes ranges.

Fig 1
Figure 1.  Importance-satisfaction trajectories for 2004–2011: Part A.

Fig 2
Figure 2.  Importance-satisfaction trajectories for 2004–2011: Part B.

For example, consider the vector chain for element 8 – “Contributing to online discussions” shown in Figure 1. The 2004 importance and satisfaction mean rating pair is denoted by the diamond at the left end of the vector chain. In 2005, denoted by the circle at the mid-point of the chain, there was essentially no increase in the mean importance rating, but a significant increase in the mean satisfaction rating, hence a horizontal line joins the two points. In 2011, denoted by the arrowhead, there were significant increases in the mean ratings for both importance and satisfaction, hence an upward sloping diagonal line joins the two points.
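
For illustration, a minimal matplotlib sketch of how such a trajectory can be drawn is given below, using the element 8 mean rating pairs from Table 3. The marker conventions (diamond for 2004, circle for 2005, arrowhead for 2011) follow the description above, though the exact plotting details of Figures 1 and 2 are not specified here.

    # Minimal sketch of an importance-satisfaction trajectory in the style of
    # Figures 1 and 2: mean satisfaction on the horizontal axis, mean importance
    # on the vertical axis, with the three survey years joined as a vector chain.
    import matplotlib.pyplot as plt

    # (satisfaction, importance) mean rating pairs for element 8, from Table 3
    trajectory = {2004: (3.10, 3.62), 2005: (3.44, 3.63), 2011: (3.75, 3.97)}
    sat = [trajectory[y][0] for y in (2004, 2005, 2011)]
    imp = [trajectory[y][1] for y in (2004, 2005, 2011)]

    fig, ax = plt.subplots()
    ax.plot(sat[:2], imp[:2], "-", color="tab:blue")               # 2004 -> 2005 segment
    ax.plot(sat[0], imp[0], "D", color="tab:blue")                 # diamond: 2004
    ax.plot(sat[1], imp[1], "o", color="tab:blue")                 # circle: 2005
    ax.annotate("", xy=(sat[2], imp[2]), xytext=(sat[1], imp[1]),  # arrowhead: 2011
                arrowprops=dict(arrowstyle="->", color="tab:blue"))
    ax.annotate("8", (sat[2], imp[2]), textcoords="offset points", xytext=(6, 4))
    ax.set_xlabel("Mean satisfaction rating (1-5)")
    ax.set_ylabel("Mean importance rating (1-5)")
    plt.show()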

Tables 3 and 4, and Figures 1 and 2, reveal that the changes in mean ratings between many of the survey year pairs were significantly positive, as evidenced by the general trend of the trajectories in Figures 1 and 2 moving toward the top right-hand corner. In addition, the mean satisfaction ratings increased significantly between all survey year pairs, and the mean increase in satisfaction ratings across the whole period 2004–2011 (0.63) was more than twice that for importance ratings (0.30), suggesting that exposure, training, experience, general increases in student IT skill and confidence levels, and perhaps other factors have resulted in an increasingly positive engagement with the OLE by students over time. It is reasonable to assume that, as the institution has progressively embedded the OLE across campuses and programs of study, the perceived importance of the system might be viewed as higher, with increasing weight being given to use of the system by university management, teaching staff, etc. One striking difference from this general positive trend was that a single item returned no significant difference across any of the survey year pairs: the importance rating for the element “Contacting teachers via internal unit messaging”. This result may be due to the fact that the internal LMS messaging/email system was largely unused, as it duplicated the email systems provided by the university and other external providers, and did not integrate with any external email systems.

A second marked difference from the general positive trend was that only one item returned a significantly lower mean rating across the longer-duration survey year pairs (2004–2011 and 2005–2011): the importance rating for the element “Reviewing unit progress”. While this result suggests that there is an opportunity to better use the OLE as part of engaging students in self-managing their studies, it is noted that the absolute mean importance rating for this item remained relatively high. An important overall observation is that many of the mean importance ratings showed no significant increase over the single-year observation period of 2004–2005, but did show significant increases over the longer time frames of 2004–2011 and 2005–2011. This confirms observations elsewhere (Smith and Caruso 2010) and supports the proposition that extended evaluation of OLE usage is required to reveal the statistically significant details in the evolution of system use (Browne, Jenkins, and Walker 2006; Conrad 2005).

Considering the proportions of respondents in Table 3 choosing the “not applicable” rating for particular elements shows some distinctive decreases (and hence presumed increases in element usage) over the period under consideration. The use by students of online discussions (posting and reading contributions), online quizzes, and online assignment submission and return has apparently increased in both absolute and proportional terms. This observation is presumably linked to an increased incorporation of these OLE elements into assessable student learning activities by academic staff.

The element with the lowest 2011 mean importance and satisfaction ratings is “Using the unit calendar”. This result may be due to the fact that the internal LMS calendar system was largely unused, as it duplicated the function of, and did not integrate with, existing calendar systems used by students. The elements with the highest 2011 mean importance and satisfaction ratings are “Accessing unit guide and other unit information” and “Accessing unit lecture, tutorial or lab notes etc.”. These two elements could be considered “basic” OLE elements, and an institution should aspire to achieve satisfactory ratings from students for these. The electronic provision of learning materials may not necessarily be seen as a high value-adding educational LMS function, but the high importance and satisfaction ratings attributed to these elements by students suggest that they nevertheless value these OLE functions. The element with the highest 2011 mean importance rating and the lowest 2011 mean satisfaction rating is “Receiving feedback on assignments”. Given the critical importance of timely formative/progressive feedback for delivering information about progress and clarifying expected and actual performance, so as to encourage students to take a proactive role in their learning and to support their development as self-regulated learners (Nicol and MacFarlane-Dick 2006; Yorke 2003), this result should be of concern, and act as a flag for action that could have a positive impact on the contribution of the OLE to student learning.

Contribution to learning

In 2004 and 2005, the DSO evaluation survey asked students to indicate their level of agreement, on a scale of 1–5, with the statement “The use of DSO enhanced my learning experience”, and essentially the same question was included in the 2011 survey. The mean response ratings for all three years (with estimated 95% confidence intervals) are presented in Figure 3. Note that a compressed vertical scale is used.

Fig 3
Figure 3.  Mean agreement with ‘The use of DSO enhanced my learning experience’ 2004–2011.
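
The 95% confidence intervals shown in Figure 3 can be estimated from the raw agreement ratings with a standard mean-and-interval calculation; a brief sketch follows (the ratings array below is illustrative only).

    # Mean agreement rating with a 95% confidence interval, of the kind plotted
    # in Figure 3. The ratings array is an illustrative placeholder.
    import numpy as np
    from scipy import stats

    ratings = np.array([4, 5, 3, 4, 4, 5, 2, 4, 3, 5])  # 1-5 agreement ratings
    mean = ratings.mean()
    sem = stats.sem(ratings)                             # standard error of the mean
    low, high = stats.t.interval(0.95, len(ratings) - 1, loc=mean, scale=sem)
    print(f"mean = {mean:.2f}, 95% CI = ({low:.2f}, {high:.2f})")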

Acknowledging the slight differences between the respondent groups across the three surveys, and the slight difference in the wording of the question in 2011, there is a significant increase across the period of the three surveys in the perception of students that DSO contributes positively to their learning. This increase mirrors, and in fact exceeds on a percentage basis, the overall mean trend of increasing satisfaction ratings observed for the individual OLE elements.

General discussion

Recent data indicate that a significant proportion of LMS owners are considering changing their LMS platform (Instructional Technology Council 2011). This may be due to the age of the existing system, changes in the teaching and learning context of the institution or the emergence of new LMS vendors, including the option of open source systems (EDUCAUSE Learning Initiative 2011). The passing of time means that existing contracts with vendors may expire, and this, coupled with recently observed consolidation amongst system vendors and new entrants into the LMS marketplace, means that, as has happened at Deakin University, there will be some changeover of LMSs. The research presented here spans the entire production lifecycle of the retiring LMS at Deakin University and, in documenting student perceptions of the OLE at the end of a long period of system development and refinement, it also establishes a valuable baseline benchmark against which to compare the performance of the new LMS. To this end, it is planned to repeat the DSO evaluation survey for at least 2012 and 2013, as the use of the new LMS, as the underpinning core of the new DSO OLE, becomes mainstream.

We acknowledge that, during the period included in this study, the university LMS went through a series of upgrades that altered the operation of some of the functions included in the surveys. However, these changes were largely points of finesse rather than fundamental changes to the operation of the LMS functions, and the same core set of LMS functionality was available throughout the entire survey period. The achievement of a high level of overall satisfaction in 2011 raises the question of whether such a high level of reported student satisfaction can be sustained through the migration to the new LMS, or whether the inevitable change and consequent reorientation required of students will cause a step down in perceived contribution to learning and/or a re-tracing of the importance-satisfaction trajectories. It is noted that the perceptions of teaching staff may interact with those of students in determining the ultimate impact of an OLE on student learning outcomes (McGill and Klobas 2009). Although not addressed here, a complementary set of data has also been collected from teaching staff at Deakin University. Planned future research will investigate the interaction between student and staff perceptions of elements of the OLE.

Conclusions

The data from a large, repeated cross-sectional and quantitative survey of student perceptions of elements of an OLE were analysed to reveal the extended development of student engagement with the OLE. Drawing on nearly 6800 survey responses over the period 2004–2011, statistically significant changes in student perceptions of the OLE over time were identified; the magnitude and direction of these changes were explored; and the nature of these changes was analysed. Across the survey period, satisfaction ratings with all OLE elements rose significantly, suggesting a strong and positive student engagement with the OLE over time. The corresponding ratings of importance of OLE elements generally also rose significantly over the survey period, though a number of elements registered no significant difference in the first two years of the survey, providing support for the proposition that short-period repeated surveys may struggle to reveal statistically significant trends.

One counter result, registering no significant change in mean importance rating across any year pair, related to an OLE element with limited use – element use appears to be closely linked to perceived value. A second marked difference was a significant decline in the importance rating for the element “Reviewing unit progress”, suggesting an opportunity to better use the OLE as part of engaging students in self-managing their studies. Single-year “point” observations were also made at the end of the survey period in 2011. The OLE elements with the highest mean importance and satisfaction ratings relate to student access of online learning resources. The OLE element with the highest mean importance rating and the lowest mean satisfaction rating is “Receiving feedback on assignments” – here suggesting another opportunity to better use the OLE as part of engaging students in self-managing their studies.

We have also demonstrated a method for, and one large-scale case study of, quantifying and visualising the trajectories of engagement over time that students have had with an institutional OLE. The survey period documented here matches the production lifecycle of the current LMS at Deakin University, which is currently being progressively replaced by a new system. A key question going forward is whether the relatively high ratings of satisfaction observed at the end of the life of the current LMS are maintained across the transition to the new LMS, or whether the inevitable disruptions will lead to any retracing of the observed trajectories of engagement.

References

Bates, R. & Khasawneh, S. (2007) ‘Self-efficacy and college students’ perceptions and use of online learning systems’, Computers in Human Behavior, vol. 23, no. 1, pp. 175–191.

Browne, T., Jenkins, M. & Walker, R. (2006) ‘A longitudinal perspective regarding the use of VLEs by higher education institutions in the United Kingdom’, Interactive Learning Environments, vol. 14, no. 2, pp. 177–192.

Challis, D. (2005) ‘Eroding distinctiveness: blurring the boundaries between on- and off-campus students by the adoption of learning management systems’, in 17th Biennial Conference of the Open and Distance Learning Association of Australia, eds M. Tulloch, S. Relf & P. Uys, ODLAA, Adelaide, pp. 87–96.

Conrad, D. (2005) ‘Building and maintaining community in cohort-based online learning’, Journal of Distance Education, vol. 20, no. 1, pp. 1–20.

Cook, C., Heath, F. & Thompson, R. L. (2000) ‘A meta-analysis of response rates in web- or internet-based surveys’, Educational and Psychological Measurement, vol. 60, no. 6, pp. 821–836.

Davis, R. & Wong, D. (2007) ‘Conceptualizing and measuring the optimal experience of the eLearning environment’, Decision Sciences Journal of Innovative Education, vol. 5, no. 1, pp. 97–126.

Dawes, J. (2008) ‘Do data characteristics change according to the number of scale points used? An experiment using 5-point, 7-point and 10-point scales’, International Journal of Market Research, vol. 50, no. 1, pp. 61–77.

Drennan, J., Kennedy, J. & Pisarski, A. (2005) ‘Factors affecting student attitudes toward flexible online learning in management education’, The Journal of Educational Research, vol. 98, no. 6, pp. 331–338.

EDUCAUSE Learning Initiative. (2011) 7 Things You Should Know About LMS Evaluation, EDUCAUSE, Boulder, CO.

Holt, D. M. & Thompson, D. J. (1995) ‘Responding to the technological imperative: the experience of one open and distance education institution’, Distance Education: An International Journal, vol. 16, no. 1, pp. 43–64.

Instructional Technology Council. (2011) 2010 Distance Education Survey Results, Instructional Technology Council, Washington, DC.

Lonn, S. & Teasley, S. D. (2009) ‘Saving time or innovating practice: investigating perceptions and uses of Learning Management Systems’, Computers & Education, vol. 53, no. 3, pp. 686–694.

Mahdizadeh, H., Biemans, H. & Mulder, M. (2008) ‘Determining factors of the use of e-learning environments by university teachers’, Computers & Education, vol. 51, no. 1, pp. 142–154.

McGill, T. J. & Klobas, J. E. (2009) ‘A task-technology fit view of learning management system impact’, Computers & Education, vol. 52, no. 2, pp. 496–508.

Mikropoulos, T. A. & Natsis, A. (2011) ‘Educational virtual environments: a ten-year review of empirical research (1999–2009)’, Computers & Education, vol. 56, no. 3, pp. 769–780.

Nicol, D. J. & MacFarlane-Dick, D. (2006) ‘Formative assessment and self-regulated learning: a model and seven principles of good feedback practice’, Studies in Higher Education, vol. 31, no. 2, pp. 199–218.

Palmer, S. & Holt, D. (2010) ‘Students’ perceptions of the value of the elements of an online learning environment: looking back in moving forward’, Interactive Learning Environments, vol. 18, no. 2, pp. 135–151.

Reynolds, D., Treharne, D. & Tripp, H. (2003) ‘ICT – the hopes and the reality’, British Journal of Educational Technology, vol. 34, no. 2, pp. 151–167.

Salinas, M. F. (2008) ‘From Dewey to Gates: a model to integrate psychoeducational principles in the selection and use of instructional technology’, Computers & Education, vol. 50, no. 3, pp. 652–660.

Smith, S. D. & Caruso, J. B. (2010) The ECAR Study of Undergraduate Students and Information Technology, 2010 – ECAR Research Study 6, EDUCAUSE, Boulder, CO.

West, R., Waddoups, G. & Graham, C. (2007) ‘Understanding the experiences of instructors as they adopt a course management system’, Educational Technology Research and Development, vol. 55, no. 1, pp. 1–26.

Yorke, M. (2003) ‘Formative assessment in higher education: moves towards theory and the enhancement of pedagogic practice’, Higher Education, vol. 45, no. 4, pp. 477–501.