Since 2013, Leeds Beckett University has carried out two studies, working with market researchers, into students’ feelings and perceptions of online courses and their learning context. This work has been conducted outside routine data collection for statistical reporting to regulatory agencies, as these exercises do not explore a student’s engagement or behaviour in a rich enough way to assist practitioners in the design of learning products, services and experiences.
The unstated philosophy of both studies has been to ground learning behaviour, and hence engagement, in the whole life of the individual student, taking place – in the case of the second study – over an extended time period. These whole-life studies have included research into the students’ emotional lives, as the role of emotions in learning is of interest not only to researchers but also to practitioners, who engage with students in a real-life context rather than an experimental one.
This paper describes these two studies, their findings and their value in developing and delivering online courses. The first study (2014) was entirely qualitative, covering a small sample over a narrow time window, but it provided rich, nuanced insights into learning context and motivation. The second study (2016) was a longitudinal study of a much larger sample of students, using a mix of qualitative research and quantitative data collection. Both studies help to contextualise the ‘online student’, whose presence and activities online are subject to institutional measurement, in the ‘whole person’ of the student.
Leeds Beckett University, with around 24,000 students, has been running distance-learning courses for almost 25 years. In 2013, the university set up a central Distance Learning Unit (DLU) to support the design and development of distance-learning courses, and to shape how they were promoted. Since its inception, the DLU has carried out two research studies into the university’s online students. In the 2014 study, the focus was an intensive review of the lives and learning of a small number of online learning students. The 2016 study, meanwhile, asked a much larger sample of students about their engagement, emotions and other experiential aspects of their course. It must be stressed that these studies were conducted as marketing research into Leeds Beckett’s online courses, and thus had a commercial purpose, feeding back into course development and marketing rather than serving as purely objective social-scientific research. (It was the university’s courses that were being examined, not ‘student engagement’ in the abstract.) The two studies were carried out on behalf of DLU by the university’s marketing department (2014) and a market research agency (2016).
As these were exercises in marketing research, there was an implicit understanding underpinning the studies of the online student as a customer, so that, for example, the term ‘net promoter score’ – a marketing measure – was used to sum up their feelings about a course. The surveys did not assert (or hypothesise) that the student–university relationship is essentially a customer–supplier relationship, as this is highly contestable, yet certain aspects of a student’s relationship with their institution have customer-like characteristics – for example, searching and deciding between alternative providers of future experiences and benefits, exercising due diligence in terms of provider offerings, and taking on a personalised financial obligation. Thus it seems appropriate to consider customer-related issues in discussions of student behaviour, and to situate student engagement as commencing with the buying decision, which is firstly a significant exercise of agency and secondly a symbolic statement of trust. The small financial incentives awarded to students to encourage their participation in the studies not only reimbursed them for their time but also symbolised their active participation in the process as decision makers, as being active subjects rather than just objects generating data. Moreover, we at the DLU felt that the presentation aesthetics of marketing research offer a richer and more suggestive picture than those of social-science presentations. The research conducted in both studies was governed by the ethical code of the Market Research Society (https://www.mrs.org.uk/pdf/mrs code of conduct 2014.pdf).
The purpose of both studies was to reach beyond the student as a learner and obtain a whole-life perspective on their experiences, including their emotional journey. There is an increasing interest in the role of emotion in learning, but that phrase, ‘emotion in learning’, can be problematic, as it locates a set of feelings within a bounded psychological space (learning) rather than within a fluid life narrative.
Leaving that aside, there are many perspectives on emotion. As Tyng et al. (2017, p. 2) assert, ‘Although emotion has long been studied, it bears no single definition’; it is instead an umbrella concept covering affective, cognitive, expressive and physiological components which may or may not cohere over time. Tyng et al. cite learning as being a function of one of the primary neural networks for all mammalian brains (the so-called SEEKING module).
Learning is both emotional and cognitive. As Niedenthal, Krauth-Gruber and Ric (2006, p. 230) put it, ‘Affective states also cause or are accompanied by changes in the way in which individuals process information per se.’ However, few studies have applied brain-mapping techniques to the semantic learning typical of education. It thus remains a commonplace that examinations and anxiety go hand in hand, although students are also observed to be anxious about aspects of their education other than tests. A focus on examination-related anxiety therefore ignores all the other stressors in the student experience, not least the introduction – in the UK, at least – of financial anxiety.
Evidence drawn from laboratory settings focuses on individuals, whereas social and educational settings are much richer in social cues. Parkinson (2011, p. 411) argues that it is necessary to move away from talking about emotions as ‘a response to private meaning, primarily susceptible to informational [italics added] influences from other people’, as opposed to everyday life, where ‘emotions are oriented to other people’s mutually responsive actions rather than pre-scripted behaviour sequences’.
If learning is seen as a social and cultural process, then it depends on mastery and internalisation of social interactions, and this is where teachers actively contribute in creating the emotional climate of learning. Williams, Childers and Kemp (2013, p. 209) show that positive emotions in a classroom environment can stimulate and enhance learning behaviours by augmenting the scope of individuals’ cognition, attention and action, and build psychological, social, intellectual and physical resources. They also conclude that ‘an educator’s attributes (e.g. display of enthusiasm, communication skills) can create a positive motivating environment for students’ (2013, p. 221). Meanwhile, Black and Allen (2018a) suggest that research has focused too much on anxiety and that, ‘despite the importance of a broad range of emotions in learning, many emotions have received little attention by educational psychologists. Especially lacking are studies of positive emotion, such as hope, gratitude or admiration’ (2018a, p. 45). However, Rowe and Fitness (2018) cite continued challenges in asking the right questions about emotion and learning, suggesting that – as reported by faculty and students – ‘negative emotions’ can both promote and inhibit learning, ‘given the complexity of interactions between variables such as task requirements, interpersonal relationships, achievement goals and cognitive resources’ (2018, p. 17).
There is perhaps no simple answer, for everybody, to the question of whether positive or negative emotions promote or inhibit learning, although they are known to be significant factors affecting the take-up of information. Nor does the same answer apply across demographic groups. Freerkien’s (2017) study of language students and the interaction between affective, motivational and cognitive factors concluded that, for older learners, motivation is more important, whereas for younger learners affective and contextual factors are more significant; the classroom is thus a dynamic system. Even social-cultural factors – such as how learning is evidenced, publicised and ‘performed’ – influence emotion; Huang’s (2011) meta-analysis suggested that ‘mastery’ goals elicited more positive emotions than ‘performance avoidance’ goals: aiming to master a skill is a more positive and effective motivator than striving to avoid looking incompetent.
The online space is not a classroom, however. Too easily, perhaps, do the designers of online spaces and virtual learning environments (VLEs) fall into a content-publishing mentality: the screen, with its promises of limitless scalability, is a distancing device as well as a space for interaction. Yang, Taylor and Cao (2016) attest that, whilst elearning and the classroom are different in many ways, some of the same principles apply to both, suggesting that it is ‘critical for online instructors and course designers to create a learning environment that is supportive and builds confidence [italics added]’, especially as seeking and obtaining help is critical in elearning (p. 13). Furthermore, Rodríguez-Ardura and Meseguer-Artola (2016) cite several studies showing that successful elearning environments can be designed to elicit subjective experiences of presence through which elearners ‘feel individually placed within a true, humanised, education environment’, in which they feel that they are taking part ‘in a true teaching–learning process, interact[ing] with their lecturers and peer students’ (p. 1008). (The use of the word ‘true’ in those two phrases denotes a value, a feeling of authenticity, not just a statement of fact.)
The 2014 study was carried out on behalf of Leeds Beckett University’s DLU by Sarah Finney and Habib Lodal of the university’s own marketing research department. The DLU’s objectives were originally:
Conversations between the DLU and the university’s market researchers enriched these objectives significantly so that they focused not just on the touch-points of a student’s formal engagement but also on contextual factors such as:
The study engaged with six students, with demographic characteristics illustrated in Table 1 below.
The study started with an hour-long telephone interview with participating students. The students were then asked to keep a study diary for two weeks, including a video component to record how they were feeling. This was followed at the end of the period by an additional interview. (This qualitative approach has precedents—for example, O’Shea, Stone and Delahunty (2015) describe a qualitative survey of interviews with online learners.)
The study was designed to position each student’s relationship with the university within the context of their other relationships. It contained a component covering the student’s pre-purchase behaviour, the justification for this being that every interaction they have with the university is a ‘moment of truth’ for the student’s engagement with the institution – that, by choosing between universities (or between university and a job), the student is choosing between alternative futures. The actual purchasing process, with the real risks of making a wrong choice, can be emotionally draining for students, with a high possibility of ‘post-purchase cognitive dissonance’.
The survey also covered studying as a material practice (i.e. not simply a cognitive exercise): we at the DLU were interested in finding out exactly where in the home or workplace, and in what conditions, the students studied, how they used their devices and how many devices they used. Technology, and its embedding in life routines, is not transparent and neutral; rather, it regulates and mediates the experience of learning. Gourlay (2015) suggests, in a similar vein, a reframing of ‘student engagement which recognises the socio-material and radically distributed nature of human and non-human agency in day-to-day student study practice’ (p. 403).
The students’ motives for studying centred on career progression, but this was self-selected, reflecting the vocational focus of the university’s distance-learning courses. All of the participants had already taken an undergraduate degree and were employed and/or had a family when applying for a distance-learning course. The ability to fit studying around their current lifestyle was the biggest factor in choosing distance learning, followed by the ability to attain a qualification and, after that, price competitiveness. As learners had to juggle studying with work and family, an engagement model that addresses only study encounters – ignoring the wider ecosystem of life and work – fails to account for the way in which students allocate time and effort in the context of daily decision-making: a decision to spend time studying is a decision not to do something else. Study is perhaps not an antecedent choice; learning is instead a set of contextually prioritised choices. The participants had many demands on their time and therefore wanted to study without compromising their other responsibilities and work commitments.
The concerns that distance learners had when applying for their course were primarily rooted in uncertainty: about the course structure and delivery; about the accuracy of their forecast weekly study commitment, in terms of hours; and about the assessment criteria and access to module information. In effect, these might constitute – in service marketing terms – the specification of what they were going to experience. Even at the point of purchase, the students were typically reflective enough to consider their own motivation to study in the context of already challenging work/life balances; critical to their concerns were interactions and relationships with their tutors – how often and how they would be able to communicate. Initially, they were less concerned about socialising and engaging with fellow classmates; for them, the tutor–student relationship, or imagined relationship, was critical.
As this was a qualitative survey, we wanted to focus on individual students, rather than accumulating generalities about the entire cohort. Whatever statistical regularities may be drawn from the data, each item is derived from a personally experienced history, from a group of very diverse individuals.
The university’s market researchers developed a number of infographics, as in Figure 1 below, to reinforce the focus on individual learners. This allowed us at least to imagine the learning process in the life of the student, who is recognised as an individual agent and decision maker.
The 2014 survey generated a number of reflections. Firstly, although it framed students as customers and ‘users’ of learning as a ‘service’, the deployment of marketing research reflected the reality of their agency and decision-making in contextualising their study behaviour. Secondly, whilst employing customer survey techniques, the research was humanistic and person-centred, embedding the learning experience within the context of the life of the student. We could therefore test what might work for each individual student. The notion of ‘student-centricity’ clearly requires a fixed student identity, but the convenience of a dominant student identity is realistic only if the student is already immersed in academic surroundings and prioritises study (O’Shea, Stone & Delahunty 2015). This cannot be expected of online students, and it is unfair to criticise distance learners for not prioritising a ‘student’ identity when they have competing identities. The student may be more or less engaged than the digital avatar constructed from management information would suggest. If, as Bliuc et al. (2011) suggest, there is a link between student identity and deep learning, the university’s task is to try to ensure that this learning remains attainable by students who, by virtue of their life paths, cannot prioritise a student identity whilst juggling several others.
It should perhaps not be surprising that students prioritised family first, work second and study third. The study habits of those with families as the dominant contextual factor were characterised by lack of structure owing to childcare and extra-curricular activities, while those whose main contextual influencer was work were able to be more structured in their study. The research showed that distance learners with families tended not to study much over the weekends, unless they did not get time to study during the week.
Online environments are supposed to be virtual, but they also comprise a material and spatial practice. All of the participants in the 2014 study stated that they studied at home, often in their living rooms with the TV on in the background. All of the participants used their laptops to study, usually placed on their lap as they sat on the couch. Some used two devices, a laptop and a tablet, at different times, using the tablet for reading journals and ebooks and/or to make short notes. None of the participants in this small survey suggested that they used a desktop computer. Some listened to course-related audio recordings over their tablet or phone while cooking. Moreover, as can be seen from Figure 2 below, the online course has to compete for space with other things.
This examination of material practice highlighted the importance of study logistics in reducing barriers to learners’ participation in their course. Despite the fact that each student’s course was precisely structured, students wanted all their learning materials to be available in advance. This is obviously very easy for purely online universities, but for Leeds Beckett, which sees online as one end of a spectrum engaged in by the same academics who teach classroom courses, this creates a challenge. Students also wanted some live online tutorials where they could interact with their tutors, and for tutors to be available at specific times – possibly a dedicated two-hour period each week – to answer queries or respond to pressing matters.
The research from the 2014 study confirmed the approach that the DLU and the university, which is after all primarily focused on face-to-face classroom delivery, adopted to the development and delivery of online courses. Whilst development of online courses is a co-creative activity carried out by the DLU and the university’s academic departments (or Schools), the delivery of these online courses, and hence the relationship with the student, is embedded in the university’s Schools and course teams rather than separately in the DLU. We also came to realise that supporting and helping students to maintain their motivation through their course could be assisted both through instructional design and through best practice in course delivery by tutors.
In comparison with the 2014 study, the 2016 study used a different marketing research methodology and a bigger sample. In the more recent study, the survey was carried out by dedicated marketing research agency Red Brick Research, who had the resources to scale up the study to incorporate a greater number of students. The later study did not follow on exactly from the 2014 survey, but it shared some similar themes, as shown in Figure 3 below. Its primary purpose was to get a deep and detailed understanding of the student-customer experience in order to capture the nuances of the distance-learning student’s journey off campus.
The survey and learners’ responses were compiled as a business report, largely narrative in nature, supported by data, but designed to assist decision-making.
Unlike the 2014 survey, the 2016 longitudinal survey used a variety of quantitative and qualitative profiles and was carried out over a 10-month period. An opening survey sent to 805 distance learners early in the academic year was completed by 134 respondents (16.6%). From this group, 65 distance learners were recruited to track key performance indicators (KPIs) on engagement, motivation, community, satisfaction and net promoter score (NPS) at regular times over the 10 months, which enabled the market researchers to build a sense of the student journey and to identify student profiles.
The researchers followed up the opening survey with 27 interviews and six focus groups. Video interviews were carried out with 25 participants, using an app that enabled their experiences and opinions to be brought to life. (This measure was included as it seemed a good way, given the richness of perspective offered in the 2014 survey, of ensuring the student voice was heard above the data.)
Finally, a closing survey, similar in structure to the opening survey, of 85 students was carried out so that engagement over time could be mapped.
The report’s summary section linked the quantitative and qualitative work in an integrated narrative and a set of practical recommendations and interventions. A more detailed, nuanced review fine-tuned some of these conclusions. The reported findings were that learner satisfaction is driven by the ‘academic experience’, which is defined as a combination of teaching and support and is valued over all other things, including a sense of community. In the initial survey, the experience of feeling part of a community was seen by participants as a bonus rather than a necessity in driving satisfaction, although during their course some students became frustrated if their peers did not engage and welcomed the opportunity to engage with their fellow students. The KPI trackers showed a broadly consistent score over the period, but finished higher at the end of the year than at the start.
The study also revealed that the amount of time learners spent studying varied significantly from week to week, both more and less than the recommended 10 hours, reinforcing the point that students will moderate or accentuate their engagement according to non-study concerns. Also, whilst overall student satisfaction was stable and strong, there was not always a consistent learning experience across modules, as tutors engaged in different ways and students valued consistency across modules. The recommended interventions were largely logistical: swifter feedback, ensuring consistency, clear expectations of support, and effective management of ‘hygiene’ factors that are extrinsic to the learning as such but useful or even necessary for it to function at all, such as technical support. These recommendations are broadly supportive of the DLU’s own proposals.
The study asked two questions about students’ reasons for studying:
These questions attempt to make a distinction between the ostensible rationale for studying and other, less formal reasons for investing such a considerable amount of time and money in studying.
As with the 2014 study, the 2016 survey revealed that learners’ ostensible motivation to undertake their course was career progression. Students were asked to write what they felt were their personal goals for studying, and what made them undertake the course they had chosen, and these revealed a richer variety of motivators: 5% wanted to ‘escape from the current situation’; 10% wanted ‘to improve the standard of living for myself and my family’; while other reasons included ‘learning new skills’ (59%), a ‘sense of challenge’ (57%), ‘intellectual stimulation’ (54%), ‘improve quality of life’ (16%), ‘gain new experiences’ (26%), ‘get my dream job’ (12%), ‘gain more confidence’ (28%), ‘do something worthwhile’ (11%) and ‘meet new people’ (4%). So, as well as an instrumental calculus of career development, there appears to be an emotional and experiential aspiration at play, which was revealed when students were asked to write about themselves. All this suggests that some students see career development as an opportunity to access better futures and emotional states, and that learning is seen as a vehicle for constructing future life scenarios.
The KPIs used in the 2016 study measured student satisfaction with various aspects of the teaching or academic experience on a Likert scale, ranging from ‘very dissatisfied’ to ‘very satisfied’. Fortunately, 82% were satisfied or very satisfied overall – which, as a matter of interest, is comparable with similar measures recorded in more formal data collection. Satisfaction was broken down into further sub-questions covering:
Support from academic staff was seen as being most important. The survey results indicated that students wanted a personal touch from academic tutors, as well as a single contact point to turn to in order to sort out problems quickly. The common feeling was that, as they were investing time and money in a high-stakes purchase while juggling many demands outside the study environment, purely self-service solutions were not welcomed. Learners also expected consistency between modules and to understand how and when feedback would be delivered – an area that can be a significant source of dissatisfaction.
Such desires have implications for how academic colleagues bring their own personality and creativity into their work, as departure from expectations can be a significant cause of concern. There was no significant correlation found, however, between a student’s satisfaction with their academic experience and the time they spent studying, although this factor did affect their emotional state.
The researchers reviewed student behaviour outside their course environment. The students participating in the study had to juggle multiple elements in their lives, which made it difficult for them to spend as much time as they wanted to on studying – indeed, over 30% found it difficult to set aside enough time to study. Consequently, wasting precious time through non-availability of materials, for example, should be avoided.
Outside the student encounter, the learners voiced several concerns. When asked what worried them most, 68% of participants indicated that their work/life balance was a cause of concern, while more than 62% were concerned with their level of academic success – a number that remained static throughout the course. Twenty-four per cent, meanwhile, were concerned with their emotional wellbeing, while 25% were concerned with the impact of study on their personal and professional relationships and, at the beginning of the 10-month period, 30% were worried about money (although this declined to 21% at the end). As noted in the Introduction, studies that focus solely on test anxiety as the dominant academic emotion fail to hear all this additional ‘noise’.
This element of the survey considered community. Fifty-five per cent of participants felt that they were part of a community, although definitions of ‘community’ varied from student to student and were driven by the course experience, sometimes taking the form of – for example – formal discussion groups or informal networks (e.g. Facebook or WhatsApp groups) set up outside academic oversight. Although being part of a community was not seen as a motivating factor for enrolling onto a course, distance learners who felt that they were part of a community were more likely to be satisfied with their course. The survey also suggested that student engagement with social media and online learning platforms was the best way to generate a sense of community. Both opening and closing surveys asked students to describe their emotional state, which changed through the process (Figure 4).
Older distance learners in the survey were also more likely to feel happy or excited about their course than younger learners, while students who spent more time studying also described more positive emotional states. Similarly, students who felt more strongly that they were part of a community expressed more emotional positivity, describing emotions such as ‘excited’ or ‘energised’. There was also a link between positive emotions and satisfaction: those students who were satisfied with their learning experience were more likely to use words such as ‘hopeful’ or ‘energised’ to describe how they currently felt about their studies, while those who disagreed that they felt part of a distance-learning community were more likely to use words such as ‘frustrated’. Consequently, although not seen as a priority by learners when compared with the academic experience, feeling part of a community does seem to have affected learners’ emotional states and levels of satisfaction, suggesting that a sense of community matters more than students’ overtly stated priorities indicate.
It should be noted, however, that the survey did not focus on particular study-related emotions, such as interest and boredom, but on feelings in general. The decision to direct the survey in this way reflected the multiple identities that students had to enact and their expectations. The rationale behind this decision was that, whilst it might be easy to compartmentalise learning emotions in the laboratory, it is perhaps much harder to do so in the lives of students themselves. Even so, it was interesting that very few respondents described their feelings in typical achievement-goal terms such as ‘mastery’ or ‘performance avoidance’.
There is a further link between emotion and net promoter scores (NPSs). The survey results revealed that students who spent more time studying were more likely to recommend the course to others, as were those who felt part of a community. Also, 72% of distance learners surveyed who spent 11 or more hours a week studying rated the experience as eight out of ten or higher, compared to 55% of learners who spent less than 11 hours each week studying giving a similar score. Meanwhile, 61% of students who felt part of a distance-learning community gave scores of nine out of ten, indicating that feeling part of a community also has an impact on the likelihood of a distance learner recommending the University. (Leeds Beckett’s overall NPS was +22, rising to +24 by the end of the study.)
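For readers unfamiliar with the metric, the net promoter score used here is the standard marketing measure: respondents rate, on a 0–10 scale, how likely they are to recommend the course; the score is the percentage of ‘promoters’ (9–10) minus the percentage of ‘detractors’ (0–6), with ‘passives’ (7–8) counted in the total but in neither group. A minimal sketch in Python (the function name and sample ratings are illustrative, not drawn from the study’s data):

```python
def net_promoter_score(ratings):
    """Standard NPS: percentage of promoters (ratings of 9-10)
    minus percentage of detractors (0-6); passives (7-8) dilute
    the score but belong to neither group."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return round(100 * (promoters - detractors) / len(ratings))

# Illustrative cohort of ten respondents (not the study's data):
sample = [10, 9, 9, 8, 8, 7, 7, 6, 5, 9]
print(net_promoter_score(sample))  # 4 promoters, 2 detractors -> +20
```

The score therefore ranges from −100 (all detractors) to +100 (all promoters), so Leeds Beckett’s figures of +22 and +24 sit in the positive band typically read as more promoters than detractors.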
In the 2016 study, motivation and engagement were conceptualised separately. During the KPI tracking exercise, students were asked to score their levels of motivation and engagement, which varied according to assessment deadline, as reflected in Figure 5.
The survey found no significant correlation between students’ reports of their motivation and their actual engagement; although their levels of engagement increased over their course, motivation levels stayed the same. Whilst the university might make interventions to increase engagement or reduce frustration, many students reported when interviewed that they regarded motivation as being something personal to them. This suggests that future academic and non-academic support structures might focus on enabling motivated students to maintain their engagement, such as offering tailored support at the right time and providing access to a mentor. This lack of correlation between motivation and engagement might also reflect the age profile of the students; as noted in the Introduction (Freerkien 2017), older students might be better able to separate motivation from other factors.
The KPI tracking scores indicated a generally consistent and positive experience over the survey period. Satisfaction and net promoter score remained at an average of 7.5 out of 10 (Figure 6a).
Finally, the data identified three student types based on their reported KPI scores. These are virtual profiles rather than real individuals, built from students’ self-reports in the aggregate data (Figure 6b).
So, behind the consistent KPIs, there are variations in data reflecting different student profiles and behaviours, leading to different potential service offerings.
Both studies engaged with students who studied only online. However, the experience of distance-learning provision will be used to tailor the classroom environment and evolve genuinely blended offers. It is thus possible to envisage a spectrum of encounters, with different degrees of onsite and offsite engagement, and different types of synchronous and asynchronous learning opportunities. These developments are facilitated by the university’s broader technological strategies – for example, lecture capture and issuing students with Office 365 accounts to facilitate collaboration. The pervasiveness of technology, from internet-enabled whiteboards to mobile devices used by students in class, means that ‘lectures’ are already technologically enabled and have been fundamentally, if not intentionally, changed by digitisation (Gourlay 2012): the lecturer’s words are now recorded and open to challenge by the digital media pervading the classroom. Research has shown that more digitisation is not necessarily a good thing if unsupported by sound pedagogy and an understanding of the many learning- and non-learning-related factors affecting student engagement with it; Burch, Burch and Womble (2017, p. 120) describe a course that made web use compulsory and received lower engagement scores than the same course when web use was not mandatory. There is thus a blurring of the largely artificial boundaries between digital and non-digital, or ‘distance’ and ‘classroom’.
Furthermore, there has been an erosion of the boundaries between market and non-market provision, influencing the university’s decision to use market research as a specific technique to investigate the student experience. The buying process and the financial commitment are not just antecedents; a student’s debt is a long-term companion to their course and beyond it. Considering student engagement as existing only ‘on the course’ omits consideration of the underlying life narrative and the existence of other identities. In England, the state still funds universities, but indirectly, via the mechanism of the student buying decision at the point of purchase. Whether this market-mimicry approach – with the unintended consequence that a degree has taken on the appearance of a Veblen good – is to be applauded or deplored, the student has in effect been given significant agency and commitment at the point of purchase. It is harder to exercise this agency once the buying decision has been made, but the 2014 and 2016 research showed (Figure 7) work/life balance issues and money concerns as factors in the forefront of the participating students’ minds. The focus on financial costs should therefore not obscure the significant (if less visible) opportunity costs should students make a wrong choice or a bad decision, which is hard to reverse. The customer relationship, in which the student purchases the benefit in advance, is a highly risky one for the student as a buyer, and the issues here are informational and ethical. Treating students as part-customers does not disempower or demean academics but states, in stark terms, the real-life risks students undertake in committing to study at this institution as opposed to another.
Of course, the opportunity costs of time and the risks of having to live with poor decisions existed before marketisation, yet students exercise significant agency at the commencement of their studies. An approach to student agency based on student expectations has been modelled by Dziuban et al. (2015) using the concept of the ‘psychological contract’, adapted from research into employment relations, and is valuable here as it reflects student expectations deriving from their role as contracting partner. Although they are not the university’s employees, students do work under direction; their time is, to a degree, controlled through fear of sanctions such as expulsion or extra work. However, gaps between expectations and delivery might be measured in KPIs of satisfaction. While expectation gaps may develop as a result of poor information, they may in fact reflect more fundamental discursive differences between student and non-student identities. Martin et al. (2014) observe that care workers studying for an online degree did not adopt the right ‘student identity’ and suggest that the work environment and ‘compliance’ culture were inhibiting factors. For example, they noted that the students voiced significant disquiet at being expected to complete a group summative assessment by peer review, on the grounds that it was not their job to assess each other, and that they did not see the point. This objection, and the students’ reported wider failure to engage, were attributed to the apparent ‘compliance culture’ of the workplace. It might have been useful, though, to consider how the underlying processes and values of academia, which construct students as students and regulate their behaviour, led the authors to locate the problem in the work environment and culture rather than in academic practice that may not have fully aligned with them.
Likewise, Jonasson (2012) identifies two fundamental discursive clashes – albeit in a more sophisticated way – in the vocational training of Danish chefs, who saw themselves as ‘trainee practitioner chefs’ rather than ‘students’. Clearly, understanding and critiquing these discursive conflicts might lead to better engagement with students who cannot prioritise their student identity over their work identity.
The construction of online learners as students who are judged on the performance of digital work conforming with regulatory expectations is also evident in some approaches to student engagement. Unlike self-reports of emotion, engagement and motivation, digital information is easy to capture – for example, the number of times a VLE is accessed – and there is a natural tendency to treat this as a proxy for engagement. The mass collection of data on student digital behaviours may be conducted with wholly beneficial ends in mind but, just as the psychological contract between student and university mimics employment relations, so the mass collection of data mimics the automatic collection of data by social media and digital giants, viewing students as data generators for a measurement system rather than learners with agency.
As Bocconi and Trentin (2014) suggest, mobile and network technology offer facilitation and ‘tracking of the learning and teaching process’ (p. 525), so that a learning path can be designed around any ‘activity that leaves digital traces that may be analysed asynchronously’. Similarly, Dixson (2015) defines engagement as students putting ‘energy, thought, effort and to some extent feelings, into their learning’ (p. 146) and identifies two types of digital behaviour, observation and application, which are then mapped to students’ self-reports of how engaged they felt, creating a proxy link between digital traces and student self-reports. However, this overlooks the possibility that students’ reports of engagement may be stimulated by the very request to report on it (Burch, Burch & Womble 2017, p. 120) or, worse, might be positively misleading if students do not understand the questions (Kahu 2013).
This has two implications. One, as suggested by Gourlay (2015), is that engagement measures that depend on digital traces ‘may serve to underscore restrictive, culturally specific, and normative notions of what constitutes acceptable student practice’ for the very simple reason that engagement is only legitimate if it is ‘communicative, recordable, public, observable and communal’, so that, by implication, ‘listening, thinking, reading and writing or private study are assumed to be markers of passivity and not indicative of engagement’ (p. 403). In other words, such activity is seen as a digital performance of participation, and failure to perform can lead to disapproval or social sanctioning. The student’s identity as a learner is thus being constructed by the needs of the measuring tool and the administrative apparatus that supports it. The student is in effect presented, modelled and controlled as a digital avatar, a creator of traces that conform to the regulatory and cultural regime of the institution. Underpinning all this is the regulation of student behaviour to produce indicators of engagement – cognitive, behavioural and emotional – that may be based on a flawed model assuming a dominant student identity, when research indicates that this is not realistic for online students. One can envisage a possible future in which students measure their own engagement on a sort of Fitbit equivalent – a self-policing and self-regulating tool embodying the dominant discourse as to what a student should be, so that learning is an external performance rather than an internal transformation for life. As Zepke suggests, ‘Performativity, the value of what can be produced, measured, recorded and reported, becomes a technology of control’ (2015, p. 702).
Finally, what appears to be absent in this construction of students as data generators is any appreciation of emotion and, perhaps, a humanistic perspective of learning as a personal, agentive or transformative experience for learners. A humanistic perspective assumes people are not reducible to components, have agency and intention, and seek and create meaning (Bugental 1964). There seems to be a discursive gulf between what is measured – data points – and a humanistic perception of the process of education in which learning might be considered a rite of passage, a chapter in a life story, or a process of personal transformation in which social and intellectual opportunity are somehow combined. As Bowers and Lemberger (2016) suggest, statistical regularities may not always provide an accurate guide to what to do in counselling practice with particular individuals, who may deviate in significant ways from the norm owing to life context and personality. They directly negotiate what is, perhaps, a clash of discourses, from the value of data to the personal and experiential.
For the individual student, learning is potentially transformative, forms part of a whole-life narrative and may even be experienced as a rite of passage. Not only do a number of studies suggest the importance of emotion as a whole on learning (Maguire et al. 2017; Oriol et al. 2016), but others delve deeper and show that the type of emotion experienced by learners is important: autonomous motivation generates better learning than controlled motivation, whereby feelings of pride and guilt drive the desire to meet internalised social expectations (Cai & Liem 2017). In this model, failure can have real consequences: a hyper-competitive environment can cause stress and mental illness (Posselt & Lipson 2016).
Finally, Ghori (2016) suggests that established models understate ‘the critical role that students can or cannot play in their own learning and satisfaction’ (p. 5) and suggests that, when students realise they are agentive and have a role to play, they are less dissatisfied (p. 231). The multidimensional models offered by both Kahu (2013) and Ghori (2016) offer a way forward in revealing the student behind the data.
The two studies into student behaviour commissioned by Leeds Beckett University’s Distance Learning Unit in 2014 and 2016 were developed largely so that we at the DLU could improve our offerings and services to students as individuals, recognising them as agents with other things to do, for whom a decision to study is one that must be made again and again, on a daily basis. The surveys were not designed with the current research debates in mind, but they do serve to illustrate them. Given criticism of marketisation, and of the expectation that students must enact performances of learning to satisfy the needs of data-recording systems, we propose a perspective that recognises their agency, individuality and roundedness, and respects their multiple identities, rather than insisting that online students enact a single student identity. We thus recognised the importance of relationships as much as technology, paying attention to the personal relationships that must exist behind the screen for online learning to be a shared experience, not just an ingestion of content. The ambivalent motivational role of ‘community’ – not a stated priority but a driver of satisfaction – suggests that this cannot be ignored, even if it is not strongly promoted. The development of profiles grounded in real students’ experiences, as opposed to generalities, recognises study as a material practice enacted in a daily set of choices between alternatives. It supports the need for attention to emotion and relationship-building in driving engagement and satisfaction, and removes the taint of pure instrumentality from student decision-making even in a marketised system.
Whilst the surveys were designed for marketing research, and are thus limited by their purpose, they have proved to be very revealing and have touched on many issues. The research did of course rely on verbal self-reports, and we do not know whether the description of the research project as ‘market research’ in our early communications with potential survey participants influenced the extent to which they responded ‘as a customer’ as well as ‘as a student’.
However, can this line of enquiry be developed further? It has already been suggested in this paper that more work can be done on researching negative and positive emotions and their impact on learning, so an extension of these studies could be a review of how ‘academic’ emotions relate to non-academic emotions. An approach based on life narratives could offer suggestive nuance about which questions about emotions to ask over time and in context. The evolution of student identities in online environments could also be explored: for instance, how online students integrate their student identities at work and in the home, and how they play back to themselves ideas of self-efficacy; whether digital performances of learning are correlated with deep learning; how tutor ‘presence’ can be developed, and whether artificial intelligence could substitute for it; the differing motivating factors, from the task itself to students’ imagined future states; how learning plays a role in life narratives and self-definition, and how it is remembered; and how social-cultural approaches to learning – in which learning is conceived as a social and cultural process – can be applied to online environments. Indeed, studies could be designed to explore ideas of extended cognition and the extended mind, inspired by online learners and their multiple tools and material practices, which could itself segue into broader posthumanist concerns in the humanities.
ERFH is employed as Director of Distance Learning by Leeds Beckett University.
Black, S and Allen, J. 2018a. Part 4: Academic Self-Concept and Emotions. The Reference Librarian, 59(1): 42–55. DOI: https://doi.org/10.1080/02763877.2017.1349022
Bliuc, A-M, Ellis, RA, Goodyear, P and Hendres, DM. 2011. The role of social identification as university student in learning: Relationships between students’ social identity, approaches to learning and academic achievement. Educational Psychology, 31(5): 559–574. DOI: https://doi.org/10.1080/01443410.2011.585948
Bocconi, S and Trentin, G. 2014. Modelling blended solutions for higher education: Teaching, learning and assessment in the network and mobile technology era. Educational Research and Evaluation, 20(7–8): 516–535. DOI: https://doi.org/10.1080/13803611.2014.996367
Bowers, H and Lemberger, ME. 2016. A person-centered humanistic approach to performing evidence-based school counselling research. Person-Centered & Experiential Psychotherapies, 15(1): 55–66. DOI: https://doi.org/10.1080/14779757.2016.1139502
Bugental, JFT. 1964. The Third Force in Humanistic Psychology. Journal of Humanistic Psychology, 4(1): 19–25. DOI: https://doi.org/10.1177/002216786400400102
Burch, GF, Burch, JJ and Womble, J. 2017. Student Engagement: An Empirical Analysis of the Effects of Implementing Mandatory Web-Based Learning Systems. Organization Management Journal, 14(2): 116–125. DOI: https://doi.org/10.1080/15416518.2017.1325349
Cai, EYL and Liem, GAD. 2017. ‘Why do I study and what do I want to achieve by studying?’ Understanding the reasons and the aims of student engagement. School Psychology International, 38(2): 131–148. DOI: https://doi.org/10.1177/0143034316686399
Dziuban, C, Moskal, P, Thompson, J, Kramer, L, DeCantis, G and Hermsdorfer, A. 2015. Student Satisfaction with Online Learning: Is it a Psychological Contract? Research Initiative for Teaching Effectiveness. University of Central Florida. DOI: https://doi.org/10.24059/olj.v19i2.496
Gourlay, L. 2012. Cyborg Ontologies and the Lecturer’s Voice: A Posthuman Reading of the ‘Face to Face’. Learning, Media and Technology, 37(2): 198–211. DOI: https://doi.org/10.1080/17439884.2012.671773
Gourlay, L. 2015. Student Engagement and the Tyranny of Participation. Teaching in Higher Education, 20(4): 402–411. DOI: https://doi.org/10.1080/13562517.2015.1020784
Huang, C. 2011. Achievement Goals and Achievement Emotions: A Meta-analysis. Educational Psychology Review, 23: 359–388. DOI: https://doi.org/10.1007/s10648-011-9155-x
Jonasson, C. 2012. Teachers and students’ divergent perceptions of student engagement: Recognition of school or workplace goals. British Journal of Sociology of Education, 33(5): 723–741. DOI: https://doi.org/10.1080/01425692.2012.674811
Kahu, ER. 2013. Framing student engagement in higher education. Studies in Higher Education, 38(5): 758–773. DOI: https://doi.org/10.1080/03075079.2011.598505
Maguire, R, Egan, A, Hyland, P and Maguire, P. 2017. Engaging students emotionally: The role of emotional intelligence in predicting cognitive and affective engagement in higher education. Higher Education Research & Development, 36(2): 343–357. DOI: https://doi.org/10.1080/07294360.2016.1185396
Martin, L, Spolander, G, Ali, I and Maas, B. 2014. The evolution of student identity: A case of caveat emptor. Journal of Further and Higher Education, 38(2): 200–210. DOI: https://doi.org/10.1080/0309877X.2012.722200
Oriol, X, Amutio, A, Mendoza, M, da Costa, S and Rafael, M. 2016. Emotional Creativity as Predictor of Intrinsic Motivation and Academic Engagement in University Students: The Mediating Role of Positive Emotions. Frontiers in Psychology, 7. DOI: https://doi.org/10.3389/fpsyg.2016.01243
O’Shea, S, Stone, C and Delahunty, J. 2015. ‘I “feel” like I am at university even though I am online.’ Exploring how students narrate their engagement with higher education institutions in an online learning environment. Distance Education, 36(1): 41–58. DOI: https://doi.org/10.1080/01587919.2015.1019970
Parkinson, B. 2011. How social is the social psychology of emotion? British Journal of Social Psychology, 50(3): 405–413. DOI: https://doi.org/10.1111/j.2044-8309.2011.02034.x
Posselt, JR and Lipson, SK. 2016. Competition, Anxiety and Depression in the College Classroom: Variations by Student Identity and Field of Study. Journal of College Student Development, 57(8): 973–989. DOI: https://doi.org/10.1353/csd.2016.0094
Rodríguez-Ardura, I and Meseguer-Artola, A. 2016. Presence in Personalised Elearning—the impact of cognitive and emotional factors and the moderating role of Gender. Behaviour and Information Technology, 35(11): 1008–1018. DOI: https://doi.org/10.1080/0144929X.2016.1212093
Rowe, A and Fitness, J. 2018. Understanding the Role of Negative Emotions in Adult Learning and Achievement: A Social Functional Perspective. Behavioral Sciences, 8(2): 27. DOI: https://doi.org/10.3390/bs8020027
Tyng, C, Amin, H, Saad, M and Malik, A. 2017. The Influences of Emotions on Learning and Memory. Frontiers in Psychology, 8. DOI: https://doi.org/10.3389/fpsyg.2017.01454
Williams, K, Childers, C and Kemp, E. 2013. Stimulating and Enhancing Student Learning Through Positive Emotions. Journal of Teaching in Travel and Tourism, 13(3): 209–227. DOI: https://doi.org/10.1080/15313220.2013.813320
Yang, Y, Taylor, J and Cao, L. 2016. The 3 × 2 Achievement Goal Model in Predicting Online Student Test Anxiety and Help-Seeking. International Journal of E-learning and Distance Education, 32(1): 1–15.
Zepke, N. 2015. Student engagement and neoliberalism: Mapping an elective affinity. International Journal of Lifelong Education, 34(6): 696–709. DOI: https://doi.org/10.1080/02601370.2015.1096312