ORIGINAL RESEARCH ARTICLE

Shaping the future of learning using the student voice: we’re listening but are we hearing clearly?

Chris Meadows*, Kate Soper, Rod Cullen, Catherine Wasiuk, Colin McAllister-Gibson and Phil Danby

Learning Innovation, Manchester Metropolitan University, Manchester, England

Abstract

Student voice data is a key factor as Manchester Metropolitan University strives to continually improve its institutional technology enhanced learning (TEL) infrastructure. A bi-annual Institutional Student Survey enables students to communicate their experience of learning, teaching and assessment on programmes and on the specific units studied. Each cycle of the survey yields approximately 40–50,000 free text comments from students describing what they appreciate and what they would like to see improved. A detailed thematic analysis of this data has identified 18 themes, arranged into six categories, relating to the ‘Best’ aspects of courses, and 25 themes, arranged into seven categories, relating to aspects of courses considered to be ‘in need of improvement’. This student data was then used as a basis for semi-structured interviews with staff. Anecdotal evidence suggested that student and staff expectations around TEL and the virtual learning environment (VLE) differed, and on-going evaluation of this work has highlighted a disconnect: in significant instances, academic colleagues seemingly misinterpret the student voice analysis and consequently struggle to respond effectively. In response to the analysis, the learning technologist’s role has been to re-interpret the analysis and to redevelop TEL staff development and training activities. The changes implemented have focused on: contextualising resources in the VLE; making lectures more interactive; enriching the curriculum with audio–visual resources; and setting expectations around communications.

Keywords: student engagement; student voice; analytics; technology enhanced learning; student feedback

Citation: Research in Learning Technology 2016, 24: 30146 - http://dx.doi.org/10.3402/rlt.v24.30146

Copyright: © 2016 C. Meadows et al. Research in Learning Technology is the journal of the Association for Learning Technology (ALT), a UK-based professional and scholarly society and membership organisation. ALT is registered charity number 1063519. http://www.alt.ac.uk/. This is an Open Access article distributed under the terms of the Creative Commons Attribution 4.0 International License, allowing third parties to copy and redistribute the material in any medium or format and to remix, transform, and build upon the material for any purpose, even commercially, provided the original work is properly cited and states its license.

Received: 23 October 2015; Accepted: 20 September 2016; Published: 21 November 2016

*Correspondence to: c.meadows@mmu.ac.uk

Introduction and context

Student engagement data and feedback is now a key strategic driver for universities (see Alderman, Towers, and Bannah 2012). Student recruitment depends on prospective students having accurate information about potential university destinations to support their decision-making. Academics and university administrators need accurate data to help them plan, monitor and improve. To facilitate benchmarking and indicate market performance, universities require timely and accurate data. At a higher level, governments and sector agencies need this information to plan funding and policy development and to support accountability. Much of the data needed to satisfy these requirements centres on the performance indicators supposedly surrounding the concept of student engagement (see Harvey 2011; Reed and Whatmough 2015; Ritchie and Spencer 2002).

Conceptually, ‘student engagement’ is not a cohesive or universally interchangeable term. It has theoretical roots in Astin’s Student Involvement Theory (Astin 1999) and in Tinto’s Student Departure Theory (Tinto 1975). As demonstrated by Trowler’s (2010) comprehensive literature review, there are distinct differences in character on either side of the Atlantic. Student engagement is perhaps more comprehensively defined in North America and Australia as the result of specific engagement questionnaires. The National Survey of Student Engagement (NSSE) and the Australasian Survey of Student Engagement (AUSSE) are grounded in a refined definition of student engagement predicated on ‘a body of knowledge built up since the mid-1980s establishing correlation between student’s investment of time, effort and interest in a range of educationally-orientated activities, and favourable outcomes’ (Trowler and Trowler 2010, p. 8). Both the NSSE and AUSSE seek to ‘stimulate evidence-focused conversations about student’s engagement in university study’ (AUSSE 2009, p. vii).

The UK, in contrast, has lacked dedicated student engagement surveys, with the result that much of the student engagement data has been gleaned from other ‘traditions such as student feedback surveys (such as the National Student Survey, NSS), student representation and student approaches to learning’ (Trowler 2010, p. 3).

There is, however, a coalescing conceptualisation of student engagement in the United Kingdom. The Higher Education Funding Council for England (HEFCE) views student engagement as ‘the process whereby institutions and sector bodies make deliberate attempts to involve and empower students in the process of shaping the learning experience’ (HEFCE 2008, p. 8). Subsequently, the Higher Education Academy (HEA) began piloting a student engagement survey, the UK Engagement Survey (UKES), with nine institutions in 2013, rising to 32 institutions and 24,000 respondents the following year (Buckley 2014). This brings the UK picture more in line with the US and Australian surveys, as the UKES comprises 50 questions, 39 of which are derived from the NSSE. Consequently, future research into the state of student engagement is likely to be internationally comparable.

At the institutional level, Manchester Metropolitan University (MMU) has invested significantly in developing a technical infrastructure to support teaching, learning and assessment (Stubbs 2014a). This development has paved the way for MMU to use data to inform its activity. By developing core quality assurance mechanisms such as the Institutional Student Survey (ISS), the university has been able to capture student feedback as never before. Two large-scale institutional projects have worked in sequence to improve the student experience and to use student feedback to drive institutional change. The Enhancing the Quality of Assessment for Learning (EQAL) project (see Bird et al. 2015) and the TRAFFIC project (see TRAFFIC 2015) brought about a high degree of integration between various student data systems, facilitating the quick and easy interchange of data. Aligned with a series of web service APIs and a core VLE (Moodle), this has produced a technical infrastructure that has ‘wrapped the institution around the learner’ (Stubbs 2014b, p. 1).

These technical developments revolutionised the approach to technology enhanced learning (TEL) at MMU and laid the foundations for a Quality Enhancement and Assurance drive. This drive has been able to leverage the student voice through feedback and to utilise these insights to inform institutional strategy from a firm evidence base. These developments have been made possible by the introduction of the ISS. The survey operates during key windows (unit/programme end points) in the academic calendar. From 2008 to 2015 it ran bi-annually in December and March; since 2015 it has run tri-annually in December, March and June. The survey is presented to all students on taught undergraduate and postgraduate programmes, primarily as a series of quantitative ‘Likert’ scale questions (see Appendix 1), through which students are invited to comment quantitatively and qualitatively on their experiences of teaching, learning and assessment at the university.

The ISS is a critical student feedback mechanism within the continuous monitoring and improvement process. A key component of this process is a sophisticated digital dashboard presenting a range of quantitative and longitudinal performance metrics to unit and programme leaders so that they can institute change within programmes of study. Indeed, the process aims to:

… support the maintenance of standards, to assure the consistency of learning opportunities and to enhance the quality of the learning experience for students by continually reviewing provision, identifying areas for improvement and taking appropriate and timely actions.
http://www.mmu.ac.uk/academic/casqe/experience/monitoring-improvement.php

Quantitative data from the ISS is augmented with additional student engagement data, including VLE usage statistics, assessment submission data and attendance data where available. Together, these provide a quantitative view of student engagement.

A large volume of qualitative feedback about the student experience is also gained from the ISS through two free text questions, which ask students to respond to the following statements about the course as a whole and about individual units:

The best thing about my course/unit is;
In need of improvement on this course/unit is;

The December 2014 data set returned 47,800 free text comments across all levels and areas of study relating to the student experience of teaching, learning and assessment. Although there are obvious uses of this data for module and programme-level enhancement, there is also an opportunity to mine this large data set to investigate student voice and experience in relation to specific issues.

This paper draws together two related pieces of work. The first is a thematic analysis of free text comments relating to the students’ experience of TEL and the use of this analysis by a team of Technology Enhanced Learning Advisors (TELAs) to develop a targeted staff development and training programme in response. The second is an investigation of the interpretation of the TEL themes identified in the analysis by academic staff colleagues supported by the TELAs.

Consequently, this paper aims to:

  1. Present a thematic analysis of ISS free text comments relating to students’ experience of TEL and describe how the TELAs used this analysis to develop a targeted staff development and training programme
  2. Investigate how academic staff colleagues supported by the TELAs interpreted the TEL themes identified in the analysis

Methodology

The methodological approach was that of bricolage (see Kincheloe 2005; Lincoln 2001; Rogers 2012): a blend comprising characteristics of grounded theory (see Birks and Mills 2011; Charmaz 2006) and derivatives of phenomenology (see O’Leary 2004). It involved two strands – investigating the student voice and the staff voice. The research comprised several sequential stages mapping the narrative of the collection, analysis, interpretation and utilisation of the student voice data. This was done through a series of iterative phases:

  1. Phase 1 – data collection
  2. Phase 2 – student data analysis
  3. Phase 3 – student data visualisation/interpretation
  4. Phase 4 – staff data collection
  5. Phase 5 – staff results and interpretation

Primarily, the research was opportunistic, as data was collected from surveys already being conducted. This frames the theoretical approach as primarily grounded theory. However, as initial outputs from the thematic analysis informed and framed the approach to interviews conducted with staff, it could be argued that there are overtones of a phenomenological inquiry. Such complexity of methods in social research has been discussed by Lincoln (2001), McLaren (2001), Kincheloe (2001, 2005) and Rogers (2012), who have developed the concept of ‘bricolage’. Bricolage is a multimethod mode of research typically understood to ‘involve the process of employing these methodological strategies as they are needed in the unfolding context of the research situation’ (Kincheloe 2001, p. 324). This is not to be ‘wantonly eclectic with the critical tradition’ but rather to ‘make the point that any attempts to delineate critical theory as discrete schools of analysis will fail to capture the evolving hybridity endemic to contemporary critical analysis’ (Kincheloe, McLaren, and Steinberg 2011).

Phase 1 – data collection

As previously discussed, the data was derived from the ISS and comprised a filtered search of over 47,000 free text comments. This filtering was achieved by means of a keyword search using 24 keywords (Moodle; wifi; wireless; print; pc; pcs; mac; macs; software; drop in; mobile; twitter; phone; video; ipad; tablet; facebook; podcast; email; matlab; mymmu; app; mmutube; padlet; it zone). The keywords were selected during an initial familiarisation scan of a subset of 1000 comments drawn from the main data set, because they were considered to characterise comments pertaining to TEL-related issues. This extracted 2072 comments relating to the student experience of the institutional VLE and other aspects of TEL; 4.5% of the total comments received in the ISS pertained to TEL or aspects of technology related to teaching and learning. This provided a data set of 746 comments relating to good practice and 1326 relating to areas for development.
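As an illustration of this filtering step, the short sketch below shows how such a keyword search could be reproduced in Python with pandas. The CSV file name and the question/comment column names are hypothetical stand-ins for the actual ISS export; this is a minimal sketch rather than the tooling used in the study.

```python
import re

import pandas as pd

# The keywords listed in the text, used to characterise TEL-related comments
KEYWORDS = [
    "moodle", "wifi", "wireless", "print", "pc", "pcs", "mac", "macs",
    "software", "drop in", "mobile", "twitter", "phone", "video", "ipad",
    "tablet", "facebook", "podcast", "email", "matlab", "mymmu", "app",
    "mmutube", "padlet", "it zone",
]

# Match whole words/phrases only, so that e.g. 'pc' does not match inside 'topic'
TEL_PATTERN = re.compile(
    r"\b(" + "|".join(re.escape(keyword) for keyword in KEYWORDS) + r")\b",
    re.IGNORECASE,
)

def is_tel_related(comment: str) -> bool:
    """Return True if a free text comment mentions any TEL keyword."""
    return bool(TEL_PATTERN.search(str(comment)))

# Hypothetical export of the December 2014 ISS free text responses
comments = pd.read_csv("iss_comments_dec2014.csv")  # columns: question, comment

tel_comments = comments[comments["comment"].apply(is_tel_related)]
print(f"{len(tel_comments)} TEL-related comments out of {len(comments)} "
      f"({len(tel_comments) / len(comments):.1%})")
print(tel_comments["question"].value_counts())  # split between 'best' and 'improve' prompts
```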

Phase 2 – student data analysis

These data sets were interrogated using a ‘framework approach’ to thematic analysis (see Ritchie and Spencer 1994, 2002; Stubbs 1994a). This involved five interconnected stages spanning several of the phases of investigation.

  1. Familiarisation
  2. Identifying a thematic framework
  3. Indexing
  4. Charting
  5. Mapping and interpreting

Familiarisation

The familiarisation process began with two members of the team independently immersing themselves in the data and reading for items of interest. This phase involved ‘text segmentation’ (see Guest, Macqueen, and Namey 2012), an iterative process of defining text boundaries around identified features of interest, in which the analysts continually questioned where meaning begins, ends, intersects and overlaps in the text (see Guest, Macqueen, and Namey 2012). Whilst this phase is presented as the first sequentially, familiarisation permeates all subsequent stages, as the analysts continually question and refine the themes identified and their application. This allowed the researchers to compare bounded, segmented text fragments (Guest, Macqueen, and Namey 2012) in order to:

  1. Evaluate and document the overall utility of the data
  2. Aid the exploration of thematic elements in terms of similarity, dissimilarity and relationships

Thematic framework

The thematic framework was developed by the two independent analysts from the familiarisation phase comparing segmented text to tag emerging themes or codes. At the most basic level, this involved the application of what Guest, Macqueen, and Namey (2012) termed ‘key-word-in-context’ (KWIC): the identification of a word as the locus for the theme or concept being tagged. Saldaña defines a theme as ‘a phrase or sentence that identifies what a unit of data is about and/or what it means’ (Saldaña 2009, p. 139). Metaphorically, this is the placing of a pin in a map. The benefit of using KWIC in a large data set is that it allows rapid searching for these locus words. The analyst is also able to search quickly for synonyms, allowing rapid tagging and the development of a codebook for application. Embedding synonym searching in the method strengthens the process, expanding a literal search into a broader concept search. The process culminated in the production of the codebooks in Tables 1 and 2.
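To illustrate the KWIC tagging described above, the sketch below pulls a short context window around each matched locus word and tags it with a code. The synonym groups shown are illustrative examples rather than the study’s actual codebook, and the function is a minimal sketch.

```python
import re

# Illustrative synonym groups only: each code maps to locus words treated as equivalent
SYNONYMS = {
    "B8":  {"video", "youtube", "mmutube", "podcast"},   # audio/video resources
    "B1":  {"email", "announcement", "announcements"},   # effective online communications
    "I11": {"wifi", "wireless", "internet"},             # connectivity problems
}

def kwic(comment: str, window: int = 5):
    """Yield (code, keyword-in-context) pairs for every synonym match in a comment."""
    tokens = comment.split()
    normalised = [re.sub(r"\W+", "", token.lower()) for token in tokens]
    for code, locus_words in SYNONYMS.items():
        for i, token in enumerate(normalised):
            if token in locus_words:
                start, end = max(0, i - window), i + window + 1
                yield code, " ".join(tokens[start:end])

example = ("Best thing is the podcast of every lecture and the quick email "
           "replies, although the wifi in the library is terrible.")
for code, context in kwic(example):
    print(code, "->", context)
```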


Table 1.  Best aspects of programme/unit
Themes Code
Effective online communications (Prompt response to online communications e.g., email, Discussion postings, Regular/Helpful Moodle Announcements) B1
Use of alternatives to Moodle e.g., Facebook for communications B4
Use of discussions for communication between students e.g., those on placements B16
Consistent/clear instructions on projects and assessments available (timely) B2
Variety of assignment types, e.g., podcast B17
Prompt/high quality feedback B18
Moodle Use  
General comments that Moodle is a good thing B5
Induction in how Moodle will be used B6
Clear instruction on self-directed work and how to use general resources in Moodle provided B12
Moodle resources available in advance (of teaching)/kept up to date B3
Ability to print Consistent/high quality learning resources from Moodle B13
Well organised/Consistent/high quality learning resources on Moodle B11
Provision of Content overviews (e.g., providing shorter synthesis of book chapters, papers/lecture notes) B7
Provision/use of audio/video resources (in class, via Youtube, MMU tube and Moodle) B8
Provision of Lecture recordings (Audio/Video) B9
Access to Library resources B14
Use of MMU app for convenience B10
Interactive lectures incorporating classroom technologies e.g., text wall, iPads/Apps B15


Table 2.  Aspects of programme/unit in need of improvement
Themes Code
General issues with poor organisation of programmes/units I1
Poor communication by lecturers (inaccurate info/slow response) I2
Inconsistencies in information provided between tutors I3
Timetables (inaccurate, subject to change, complicated to understand, difficult to keep to) I4
Stop using Facebook and other social media for communications I23
Assessment deadline bunching I5
Assessment briefs lacking/inaccurate I6
Assessment criteria not available on Moodle/not soon enough I7
Moodle flakiness/downtime in relation to assessment submission I9
Moodle flakiness/downtime in General I8
Access to appropriate computers I10
Poor WIFI/internet issues I11
Printing problems/expense I12
Search facility / sort in date order function in Moodle would help I24
Better induction on how Moodle will be used / Clearer steer on how staff will use Moodle as students often unaware of need to keep checking in I13
No clear instruction on self-directed work/no indication of how to use general resources in Moodle I17
Materials and resources not in Moodle I14
Materials and resources not available in Moodle in advance I15
Materials and resources of poor quality/poorly organised in Moodle I16
PowerPoint slides alone in Moodle are not useful I19
Slides used in lectures – not the same as those in Moodle I18
Insufficient use of video in teaching/Moodle I20
Lectures should be recorded and made available afterwards I22
Inappropriate/Over use of Video I25
Lecture/Seminars should be more interactive and not just reading/repeating what is in Moodle I21

Indexing

A team of eight TELAs analysed the data using the codebooks derived from the framework-setting stage. During the indexing phase, several analysts revisited the familiarisation phase, clarifying and improving the coding; this iteration could be built more formally into the method. MacQueen et al. (2008) present an example codebook template containing a code label, a short descriptive mnemonic, which is then expanded by a short definition. The codebook utilised here in the indexing phase comprised a hybrid version blending the code label and the short definition. The scale of the thematic analysis here perhaps justifies this condensing of the codebook.
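As a simple illustration, the condensed codebook entries used at this stage can be represented as a small data structure pairing the code mnemonic with the blended label/short definition. The field names and the shape of the indexed-comment record below are our own illustrative choices; the entries themselves are taken from Table 1.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CodebookEntry:
    code: str        # short mnemonic label, e.g. 'B8'
    definition: str  # blended code label and short definition applied by analysts

CODEBOOK_BEST = [
    CodebookEntry("B8", "Provision/use of audio/video resources "
                        "(in class, via Youtube, MMU tube and Moodle)"),
    CodebookEntry("B9", "Provision of Lecture recordings (Audio/Video)"),
    # ... remaining entries from Tables 1 and 2
]

# Indexing then amounts to attaching one or more codes to each segmented comment
indexed_comment = {
    "comment": "Podcasts of the lectures let me listen again before the exam.",
    "codes": ["B8", "B9"],
}
print(indexed_comment)
```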

Phase 3 – student data visualisation/interpretation

Charting

The charting phase involved a higher level of abstraction and organisation. In searching for connective meanings and interpretation, the analysis introduced categorisation: structure was imposed on the codes or themes through the use of meta-themes. Grouping themes into linked categories involved the application of a higher level of meaning not directly visible in the data. The meta-themes were derived from the original codebooks and are shown in Table 3.


Table 3.  Meta-themes applied to the codebook
Meta-themes – best aspects of programme/unit: Communication/Collaboration; Assessment; Moodle Use; Moodle Organisation; Moodle Content; Classroom Technology
Meta-themes – aspects of programme/unit to improve: Communication/Collaboration; Assessment; Technology Issues; Moodle Use; Moodle Organisation; Moodle Content; Classroom Technology

This codification is demonstrated in adaptations of Tables 1 and 2 to produce Tables 4 and 5, respectively.


Table 4.  Meta-themes and themes – best aspects of programme/unit
Themes Code
Communication/Collaboration  
Effective online communications (Prompt response to online communications e.g., email, Discussion postings, Regular/Helpful Moodle Announcements) B1
Use of alternatives to Moodle e.g., Facebook for communications B4
Use of discussions for communication between students e.g., those on placements B16
Assessment  
Consistent/clear instructions on projects and assessments available (timely) B2
Variety of assignment types, e.g., podcast B17
Prompt/high quality feedback B18
Moodle Use  
General comments that Moodle is a good thing B5
Induction in how Moodle will be used B6
Clear instruction on self-directed work and how to use general resources in Moodle provided B12
Moodle Organisation  
Moodle resources available in advance (of teaching)/kept up to date B3
Ability to print Consistent/high quality learning resources from Moodle B13
Well organised/Consistent/high quality learning resources on Moodle B11
Moodle Content  
Provision of Content overviews (e.g., providing shorter synthesis of book chapters, papers/lecture notes) B7
Provision/use of audio/video resources (in class, via Youtube, MMU tube and Moodle) B8
Provision of Lecture recordings (Audio/Video) B9
Access to Library resources B14
Use of MMU app for convenience B10
Classroom technology  
Interactive lectures incorporating classroom technologies e.g. text wall, iPads/Apps B15


Table 5.  Meta-themes and themes – aspects to improve on the programme/unit
Themes Code
Communication/Collaboration  
General issues with poor organisation of programmes/units I1
Poor communication by lecturers (inaccurate info/slow response) I2
Inconsistencies in information provided between tutors I3
Timetables (inaccurate, subject to change, complicated to understand, difficult to keep to) I4
Stop using Facebook and other social media for communications I23
Assessment  
Assessment deadline bunching I5
Assessment briefs lacking/inaccurate I6
Assessment criteria not available on Moodle/not soon enough I7
Technology Issues  
Moodle flakiness/downtime in relation to assessment submission I9
Moodle flakiness/downtime in General I8
Access to appropriate computers I10
Poor WIFI/internet issues I11
Printing problems/expense I12
Search facility / sort in date order function in Moodle would help I24
Moodle use, organization and content  
Use  
Better induction on how Moodle will be used / Clearer steer on how staff will use Moodle as students often unaware of need to keep checking in I13
No clear instruction on self-directed work/no indication of how to use general resources in Moodle I17
Organisation  
Materials and resources not in Moodle I14
Materials and resources not available in Moodle in advance I15
Materials and resources of poor quality/poorly organised in Moodle I16
Content  
PowerPoint slides alone in Moodle are not useful I19
Slides used in lectures – not the same as those in Moodle I18
Insufficient use of video in teaching/Moodle I20
Lectures should be recorded and made available afterwards I22
Inappropriate/Over use of Video I25
In class  
Lecture/Seminars should be more interactive and not just reading/repeating what is in Moodle I21

The revised codes were charted to produce initial visualisations of the data at an institutional and faculty level. Initially, these were bar charts, as shown in Figures 1 and 2, in which the occurrence of the codes was plotted across the meta-themes and themes. Firstly, the ‘best’ elements of the programme/unit were plotted.

Figure 1.  Institutional overview of ‘best’ themes.

Figure 2.  Institutional overview of themes ‘to improve’.

Secondly the areas for ‘improvement’ were plotted as shown in Figure 2.
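A minimal sketch of this charting step is shown below, assuming the indexed comments have been exported with one row per applied code and hypothetical meta_theme and theme columns; matplotlib stands in here for the charting tools actually used.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical export of the indexed 'best' comments: one row per applied code
best = pd.read_csv("indexed_best_comments.csv")  # columns: meta_theme, theme, code

# Count how often each theme occurs, grouped under its meta-theme
counts = best.groupby(["meta_theme", "theme"]).size().sort_values()

counts.plot(kind="barh", figsize=(8, 10))
plt.xlabel("Number of comments")
plt.title("Institutional overview of 'best' themes")
plt.tight_layout()
plt.savefig("best_themes_overview.png")
```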

Interpretation of these initial visualisations indicated a degree of ‘mirroring’ of certain meta-themes and themes. For example, ‘Communication’ was reported as a ‘positive’ facet, with 127 reports of communication as a strength of a unit, yet it was also reported as a ‘negative’, with 155 instances where students indicated that a quicker response or clearer communications were required.

This ‘mirroring’ was not a contradictory report by the students but indicated an inconsistent experience across programmes. ‘Communication’ could be reported positively in one unit, which was then used as a benchmark against which a ‘poor’ experience in a different unit was reported. Students are therefore not commenting differently about the same unit but are highlighting that communications in some units/programmes are perceived as effective and in others are not. Students are positive about a fast response to questions and highly frustrated where this is not the case.

In investigating this concept of ‘mirroring’ further, subsequent visualisations were produced using Tableau. These included hotspot analyses drilling into the data at a departmental level to inform the faculty reports that were produced. ‘Mirrored’ charts were also produced to investigate the mirroring concept further.

Mapping and interpreting

The mapping of hotspots across themes and departments generated visualisations that gave a clear perception of where issues or strengths lie. They also clearly indicate departments that can be drawn upon to share good practice and those that perhaps need greater levels of support. In Figure 3, well-performing departments can be seen as strong vertical colour bands, indicating departments performing well across numerous facets that could be leveraged by faculties to lead good practice sharing. Strong horizontal banding shows strong performance in a particular theme across numerous areas, which is perhaps more pertinent for institutional-level attention.

Figure 3.  Departmental hotspot analysis of ‘best’ themes. Horizontal correlation demonstrates persistently positive reporting of a specific theme across multiple departments; vertical correlation demonstrates persistently positive reporting of multiple themes within a single department.

Similarly, the representation of areas for improvement indicates, through strong horizontal colour banding across themes, areas of practice that need development at an institutional level. These may indicate areas of focus for institutional concern. Strong vertical colour banding indicates particular departments that may need further support or interventions at a faculty level. This can be seen in Figure 4, where several themes show particularly high colour strength, indicating a high degree of student comment; examples are Communications and the use of PowerPoint slides in the VLE.

Figure 4.  Institutional hotspot analysis of areas for improvement. Horizontal correlation demonstrates persistently negative reporting of a specific theme across multiple departments; vertical correlation demonstrates persistently negative reporting of multiple themes within a single department.
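The hotspot views were built in Tableau; a comparable heatmap can be sketched with pandas and matplotlib as below. The department and theme column names, and the CSV export itself, are hypothetical.

```python
import pandas as pd
import matplotlib.pyplot as plt

improve = pd.read_csv("indexed_improve_comments.csv")  # columns: department, theme

# Cross-tabulate comment counts: themes as rows, departments as columns
hotspots = pd.crosstab(improve["theme"], improve["department"])

fig, ax = plt.subplots(figsize=(12, 8))
image = ax.imshow(hotspots.values, cmap="Reds", aspect="auto")
ax.set_xticks(range(len(hotspots.columns)))
ax.set_xticklabels(hotspots.columns, rotation=90)
ax.set_yticks(range(len(hotspots.index)))
ax.set_yticklabels(hotspots.index)
fig.colorbar(image, ax=ax, label="Number of comments")
fig.tight_layout()
fig.savefig("improvement_hotspots.png")

# Strong horizontal bands flag themes of institution-wide concern;
# strong vertical bands flag departments that may need targeted support.
```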


The researchers wished to avoid alienating cooperative academic colleagues, so the reports utilising the hotspot analysis were produced on a per-faculty basis, with anonymised results shown for the other faculties. This allowed an institutional view but prevented departments from comparing themselves with departments outside their own faculties. The hotspot analysis formed part of the reports sent to Faculty Executive Groups for action and comment, some results of which are discussed later.

Further mappings of the ‘mirroring’ effect were produced to investigate which themes were prevalent. The early visualisations displayed in Figures 1 and 2 were considered too visually busy to be easily interpreted. The categories were instead compared in an adapted area graph, with positive results displayed on top and negative results presented underneath. As seen in Table 3, the meta-themes across the ‘best’ and ‘to improve’ elements were not uniform, so outlier elements were removed from the analysis. Figure 5 shows the resulting comparison between the meta-themes that remained.

Figure 5.  Mirrored meta-themes.
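The mirrored comparison can be approximated as a diverging bar chart, with ‘best’ counts plotted upwards and ‘to improve’ counts downwards for each meta-theme present on both sides. The two per-meta-theme count files and their column names in the sketch below are hypothetical.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical per-meta-theme comment counts (columns: meta_theme, count)
best = pd.read_csv("best_meta_theme_counts.csv", index_col="meta_theme")["count"]
improve = pd.read_csv("improve_meta_theme_counts.csv", index_col="meta_theme")["count"]

# Keep only meta-themes that appear on both the 'best' and 'to improve' sides
shared = best.index.intersection(improve.index)

fig, ax = plt.subplots(figsize=(9, 5))
ax.bar(shared, best.loc[shared], color="tab:green", label="Best aspects")
ax.bar(shared, -improve.loc[shared], color="tab:red", label="In need of improvement")
ax.axhline(0, color="black", linewidth=0.8)
ax.set_ylabel("Number of comments ('to improve' shown as negative)")
ax.legend()
plt.xticks(rotation=20, ha="right")
fig.tight_layout()
fig.savefig("mirrored_meta_themes.png")
```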

The resulting visualisation clearly identified five significant areas upon which students commented in terms of their experience. The distinct yellow plot referred to ‘general positive comments about Moodle’, which the analysts discarded as it appeared to be a catch-all rather than a meaningful research proposition. The remaining four characteristics that received significant student comment focused on:

  1. Effective communications
  2. Moodle organisation and high quality materials
  3. Use of audio visual resources
  4. Desire for ‘interactive’ lectures.
  5. Moodle a ‘good’ thing (yellow area discounted as discussed above)

The subsequent phases sought to explore the staff perception in relation to the outcomes in order to attempt to determine if there was awareness of these issues amongst academic colleagues and to suggest possible interventions.

At this juncture, the role of the TELAs was perhaps seen as a lens through which the data could be interpreted on behalf of academic colleagues. The researchers felt that the expressions of the student voice centred on some key concepts regarding the student experience of technology.

For example, ‘Effective Communications’: there is a plethora of possibilities with regard to what this means on a practical level. Does it refer to a speedy response to email queries? Does it refer to meaningful rather than merely timely communication? Does it refer to effectively contextualising resources made available in the VLE through clear and efficient instructions? There are a variety of possibilities. As such, the researchers felt that there may be a role for them in positioning the student voice feedback. Where this interpretation should appear in the sequence of the research is still a moot point. In this study, it was positioned following the staff investigations, but in subsequent iterations of this research the action research cycle might position this interpretive phase between the student data analysis and the staff data collection. These results highlighted target areas for intervention. From a TEL perspective, interpretation of these results pointed towards clear intervention strategies.

Effective Communications was interpreted as a need to develop mutual expectations among staff and students around the clarity and immediacy of communication. It also pointed towards strategies for minimising the need for communication by providing greater context and instructional notes around the resources made available. Much of the perceived need for communication stemmed from assessment materials and deadlines, emphasising the need for academic colleagues to focus more attention on the provision of meaningful communication in relation to assessment.

In a similar vein, feedback pertaining to Moodle use emphasised the need for academic colleagues to provide greater contextualisation and instructional direction for resources in the VLE in order to increase their usage and utility.

Students cited further and more targeted use of audio–visual resources as a desirable outcome. Often the use of these resources was sporadic and seemingly unrelated to content either in the sessions or in the VLE, echoing the earlier concern around setting context and providing instructional guidance.

Yet these interpretations derived by the TELAs were often not expressed by academic colleagues. Indeed, it became apparent through anecdotal evidence that academic colleagues’ interpretations of these student results deviated dramatically from those of the TELAs. Why and how this was the case became the next area of focus. The outcomes of the student data analysis were used as foci for questions put to academic colleagues in investigating this disconnect.

Phase 4 – staff data collection

The data derived from the ‘student engagement’ analysis pointed to four areas for investigation with academic colleagues.

  1. Effective Communications
  2. Moodle organisation and high quality materials
  3. Use of audio visual resources
  4. Desire for ‘interactive’ lectures.

The approach taken involved semi-structured interviews around these themes. Analysts from the research team conducted these interviews across the various faculties. An open research call was placed to the academic body, and forty academic colleagues indicated a willingness to participate. Four per faculty was set as an initial sample. Due to timing and various logistical issues, 15 interviews were conducted and transcribed verbatim, constituting 13 hours of transcribed interviews. The interviews were semi-structured around four key areas derived from the student feedback. Five prompt questions were utilised in the interviews as foci for discussion. These were:

  1. What does the data indicate about learning technology and the student experience?
  2. What is the student expectation around communication and how do we achieve it?
  3. What constitutes a well organised/resourced Moodle area and how do we achieve it?
  4. What issues arise from either using or planning to use audio/visual resources?
  5. What is meant by interactive lectures? And what challenges do they present?

The original data was digitally recorded and transcribed verbatim. This was then analysed in relation to the meta-themes from the student study. One of the objectives was to determine how student perception deviated from staff perception in relation to these themes. Yet from the outset, it became abundantly clear that other issues pervaded the academic psyche and significant mismatches in perception existed around these four key areas.

Phase 5 – staff results and interpretation

The staff transcripts were analysed for themes relating to the earlier foci, and excerpts were highlighted to see whether the staff narrative reflected the student feedback analysed thus far.

Communications

Some examples as shown in Table 6 demonstrate the varying opinions around communications.


Table 6.  Staff and students’ views relating to communications
Student Quotes Staff Quotes
The support from the tutors. 90% of the tutors have said to me do not email me??? !! We pay for their advice and support. Consistency needs to be there but people are responding at different times and I’m guilty …. but I just think I’ll get that done … but then that creates an expectation on colleagues to do this as well.
Moodle space for communicating with the cohort is especially cherished as it keeps support and contact with the uni during placements. I think their expectation, again if we are talking about fee-paying students, is for a very quick accurate response, that can solve my problem, and off I go.
  … they expect 24/7 [communication]
  … when you go on to Moodle you can say, well I did, and you can see the announcements that I sent out and if you go back to Moodle you can see the information that was sent out that all the other students got, so it is good for covering yourself as well.

Comparing these insights underlines the incongruence between student expectation and staff perception. Staff appear to recognise the expectation for a ‘quick accurate response’ but note that responding at different times ‘creates an expectation on colleagues to do this as well’. In contrast, students indicate that some staff refuse to engage in email communication at all, while others seemingly engage in it as a useful way of ‘covering yourself’. These typologies of response expand the sense of disconnection, ostensibly strengthening the view expressed earlier that TELAs are perhaps critical filters in determining meaningful interpretations of the student data.

Audio–visual

The use of audio–visual resources demonstrated a disparity of views. Students emphasised positive uses that enhanced learning by allowing them to ‘gain a full understanding of the topic’, but also criticised poor practice, such as being shown ‘rather pointless videos’. In contrast, staff perceptions of audio–visual resources suggested a fear of being replaced (see Table 7). There was also a recognition that colleagues using these resources would create an expectation among students and potentially a mismatch; it was therefore felt that a programme approach should be taken, either all in or all out (see Table 7). Also prevalent were performance fears, whereby technical difficulty or unfamiliarity could change ‘the dynamic, you have to regain your dignity unfortunately’ (see Table 7).


Table 7.  Staff and student views of audio–visual resource use
Student Quotes Staff Quotes
The business lectures are too long (3 hours). Some of the lectures I don’t think have been very useful, e.g., a lot of Youtube videos. We could have been given the links and then we could have the choice to watch them in our spare time. The business plan should have been briefed a lot earlier instead of showing us rather pointless videos. … like when you go and see a play and in the first few minutes if it’s well acted then you buy into it. If for the first few minutes the lecturer is presenting this narrative, is stuck, if the student has to say ‘Sir you don’t do it that way, you only have to press CTRL +Enter’, it changes the dynamic, you have to regain your dignity unfortunately.
He is attentive to the needs of the students with podcasts that allow us to listen to our lecture again to gain a full understanding of the topic (which is necessary in some instances) as well as ensuring that there is an overall good teaching standard in all the tutorials. [re Podcasting] … such and such does it why don’t you do it? It has to be consistent…you have to have an agreement that all of you are going to do it or none of you are going to do it.
  … they won’t come to lectures anymore even though I know it doesn’t necessarily happen because at a private college I’ve done them and they do still come.

Interactive lectures

Performance issues permeated the staff psyche in relation to interactive lectures. Here staff perceived a sense of a ‘West End show’ and a difficulty for staff to ‘live up to that expectation’ (see Table 8).


Table 8.  Staff and student views of interactive lectures
Student Quotes Staff Quotes
I know this may be difficult within this subject but I would prefer more interaction within seminar/lectures. Despite loving this area I find it hard to concentrate when the lecturer is simply reading from the board. Makes lecture very hard to concentrate. I’d have liked to have been able to use, not mobile technology but other things for them to use in the class, but that’s the thing after they have found that bit of information, you don’t know that they are looking at, so that’s scary. My fear is impacting in them I think, I’ve lost them then I need to try and get them back in.
I feel that in the lecture we are able to engage more verbally in class through the texting app which helps create discussion about the topic rather than just reading from a board and feel this technique helps us understand each unit better. … it becomes ‘come on entertain me, it does need to entertain, but it’s not a West End show.
  What’s really difficult is when … you’ve got a course that’s technologically driven or uses technology as part and parcel of what they do … then it’s really easy for the students to expect that … What’s very difficult then for other members of staff is where that’s not their normal practice to live up to that expectation.

Further staff concerns centred around control, whereby enabling students to use technology led to a sense of ‘I’ve lost them then I need to try and get them back in’ (see Table 8). Conversely, students viewed this approach as enabling, with the ability to ‘engage more verbally in class through the texting app’ (see Table 8). The absence of interaction was seen as an impediment to learning, as students found it ‘hard to concentrate when the lecturer is simply reading’ (see Table 8).

Moodle use

The Moodle usage comments from staff appeared to emphasise the quantity of material and the ‘temptation to put loads on there’ (see Table 9). The predominant feeling from students was that material was ‘hard to understand when it’s out of context’ (see Table 9).


Table 9.  Staff and student views of Moodle use
Student Quotes Staff Quotes
I struggle to find things on Moodle menus all over the place secondary page? pedagogy page? subject page? and there is no decent search facility … or even a way to view resources in date order would help. … when you’re putting together your Moodle area, it makes sense to you, but not necessarily to them.
It would be good if the student and staff attitudes to Moodle could be brought into line. Staff use it while students don’t necessarily know they have to. I try and really use that [Moodle] student view, I’m still trying to move stuff off, because I’m looking at what it looks like to them. I want to make my life easier, so that they can find things.
Information that is uploaded to Moodle is sometimes not explained or even mentioned in class … although this is good extra reading it’s hard to understand when it’s out of context. … there is a temptation to put loads on there and almost provide lots, because there is that expectation that you only see them for 3 hours per week and that they can carry on their own private learning … But then, is that when they start to think that it’s unorganised and not relevant to the exam?

Summary

Extracting and thematically analysing student voice data in the form of free text comments from the Institutional Student Survey has proved to be a valuable tool in identifying key aspects of students’ engagement with and experience of TEL. Furthermore, the approach has enabled TELAs to develop targeted staff development and training provision in response to the key themes identified. Experience of delivering staff development and training, and exploration of academic staff interpretations of the key TEL themes, highlighted a sense of fear among academic staff around some of the technologies and the related approaches to learning and teaching.

As a consequence of this work, TELAs have made several changes to their practice and approach to staff development and training for academic colleagues. The training provision has been redesigned to create greater synergies with the institutional Learning, Teaching and Assessment Strategy (LTA Strategy). This redesign has resulted in an alignment of training sessions with the six principles expressed in the LTA Strategy (see MMU LTA 2015), the first of which were presented in the summer of 2015. These sessions embedded technology much more subtly into workshops, rather than technology being the headline element as had previously been the case. Initial anecdotal evidence suggests that this has had a positive impact in terms of attendance, attrition rates and feedback, though this requires further evaluation.

In trying to address the cohesion between staff interpretation and the student voice data, a VLE template has been developed as a starting point for building online unit areas. In this respect, perhaps the main outcome relates to the need to contextualise resources and provide instructional guidance on the use of the VLE as a teaching platform. The VLE template attempts to scaffold this for academic colleagues and has been trialled in three faculties across the institution. Further iterations of the thematic analysis should be able to determine whether these practical interventions have had a significant impact on the conundrum expressed in one of the staff comments: ‘when you’re putting together your Moodle area, it makes sense to you, but not necessarily to them’.

References

Alderman, L., Towers, S. & Bannah, S. (2012) ‘Student feedback systems in higher education: a focused literature review and environmental scan’, Quality in Higher Education, vol. 18, no. 3, pp. 261–280.

Astin, A. (1999) ‘Student involvement: a developmental theory for higher education’, Journal of College Student Development, vol. 40, no. 5, pp. 518–529.

AUSSE (2009) ‘Engaging for success: Australasian Student Engagement Report’, Australasian Survey of Student Engagement, by H. Coates, ACER, [online] Available at: http://research.acer.edu.au/cgi/viewcontent.cgi?article=1017&context=higher_education

Bird, P., Forsyth, R., Stubbs, M. & Whitton, N. (2015) ‘EQAL to the task: stakeholder responses to a university-wide transformation project’, Journal of Educational Innovation, Partnership and Change, vol. 1, no. 2. ISSN 2055-4990. Available at: https://journals.gre.ac.uk/index.php/studentchangeagents/article/view/177/244 [accessed 16 November 2016].

Birks, M. & Mills, J., (eds) (2011) ‘Essentials of grounded theory’, in Grounded Theory: A Practical Guide, Sage, London, pp. 11–26.

Buckley, A. (2014) UK Engagement Survey 2014 – The Second Pilot Year. Higher Education Academy, York, UK, [online] Available at: https://www.heacademy.ac.uk/system/files/resources/ukes_report_2014_v2.pdf

Charmaz, K. (2006) Constructing Grounded Theory: A Practical Guide through Qualitative Analysis, Sage, London.

Guest, G., Macqueen, K. M. & Namey, E. E. (2012) Applied Thematic Analysis, Sage, Thousand Oaks, CA.

Harvey, L. (2011) ‘The nexus of feedback and improvement’, in Student Feedback: The Cornerstone to an Effective Quality Assurance System in Higher Education, eds. C. S. Nair & P. Mertova, Chandos, Oxford, pp. 11–26.

HEFCE (2008) ‘Tender for a study into student engagement’, call for tenders edn, Higher Education Funding Council for England, Bristol, [online] Available at: http://webarchive.nationalarchives.gov.uk/20120118164921/http://www.hefce.ac.uk/pubs/hefce/2008/

Kincheloe, J. (2001) ‘Describing the bricolage: conceptualizing a new rigor in qualitative research’, Qualitative Inquiry, vol. 7, no. 6, pp. 679–692.

Kincheloe, J. (2005) ‘On to the next level: continuing the conceptualisation of the bricolage’, Qualitative Inquiry, vol. 11, no. 3, pp. 323–350.

Kincheloe, J. L., McLaren, P. & Steinberg, S. (2011) ‘Critical pedagogy and qualitative research: moving to the bricolage’, in The SAGE Handbook of Qualitative Research, 4th edn, eds. N. K. Denzin & Y. S. Lincoln, Sage, Thousand Oaks, CA, pp. 163–178.

Lincoln, Y. (2001) ‘An emerging new bricoleur: promises and possibilities – a reaction to Joe Kincheloe’s “Describing the bricolage”’, Qualitative Inquiry, vol. 7, no. 6, pp. 693–696.

MacQueen, K. M., et al., (2008) ‘Team-based codebook development: structure, process, and agreement’, in Handbook for Team Based Qualitative Research, eds. G. Guest & K. M. Macqueen, AltaMira, Lanham, MD, pp. 119–135.

McLaren, P. (2001) ‘Bricklayers and bricoleurs: a Marxist addendum’, Qualitative Inquiry, vol. 7, no. 6, pp. 700–705.

MMU LTA (2015) ‘The MMU Strategy for Learning, Teaching and Assessment’, [online] Available at: http://www.celt.mmu.ac.uk/ltastrategy/

O’Leary, Z. (2004) The Essential Guide to Doing Research, Sage, London.

Reed, P. & Whatmough, S. (2015) ‘Hygiene factors: using VLE minimum standards to avoid student dissatisfaction’, E-learning and Digital Media, vol. 12, no. 1, pp. 68–89.

Ritchie, J. & Spencer, L. (1994) ‘Qualitative data analysis for applied policy research’, in Analyzing Qualitative Data, eds. A. Bryman & R. G. Burgess, Routledge, London, pp. 173–194.

Ritchie, J. & Spencer, L. (2002) ‘Qualitative data analysis for applied policy research’, in The Qualitative Researcher’s Companion, eds. M. Huberman & M. B. Miles, Sage, London, pp. 305–331.

Rogers, M. (2012) ‘Contextualising theories and practices of bricolage research’, The Qualitative Report, vol. 17, no. 48, pp. 1–17.

Saldaña, J. (2009) The Coding Manual for Qualitative Researchers, Sage, Thousand Oaks, CA.

Stubbs, M. (2014a) ‘Keynote: enhancing the student experience at MMU: clearing a path and interpreting digital footprints’, European First Year Experience Network (EFYE) 2014 Conference 9–11 June, Nottingham University, Nottingham, [online] Available at: http://www.ntu.ac.uk/apps/events/9/home.aspx/event/151843/multimedia

Stubbs, M. (2014b) ‘Transforming the student experience: Manchester Metropolitan University’s EQAL project’, European University Information Systems Conference 2014, Umea University, Umea, pp. 1–6.

Tinto, V. (1975) ‘Dropout from higher education: a theoretical synthesis of recent research’, Review of Educational Research, vol. 45, pp. 89–125.

TRAFFIC (2015) ‘JISC Transforming assessment and feedback for institutional change’, [online] Available at: http://lrt.mmu.ac.uk/traffic/

Trowler, V. (2010) Student Engagement Literature Review, HEA, York.

Trowler, V. & Trowler, P. (2010) Student Engagement Evidence Summary, Higher Education Academy, York, [online] Available at: https://www.heacademy.ac.uk/studentengagement/Research_and_evidence_base_for_student_engagement

Appendix


Appendix 1. Institutional student survey questions.
Questions about the Course S. Disagree <-> S. Agree
Staff on my course are good at explaining things 1 2 3 4 5
Feedback on my work helped to clarify things I did not understand 1 2 3 4 5
I have received sufficient advice and support with my studies 1 2 3 4 5
The course is well organised and running smoothly 1 2 3 4 5
University resources are appropriate to my learning needs 1 2 3 4 5
The course has helped me to develop confidence and skills to succeed 1 2 3 4 5
Overall I am satisfied with the quality of my course 1 2 3 4 5
Best things about my course Free text
Things I would most like improved on my course Free text
Questions repeated for each Unit S. Disagree <-> S. Agree
This unit was taught well 1 2 3 4 5
Best things about this unit Free text
Things I would most like improved on this Unit Free text