Investigating the Usage and Perceptions of Third-Party Online Learning Support Services for Diverse Students

Authors: Mollie Dollinger, Sarah Cox, Rebecca Eaton, Jessica Vanderlelie and Sam Ridsdale (La Trobe University, AU)

Abstract

This article explores usage patterns and perceptions of online learning support among university students. As higher education expands to include increasingly diverse student cohorts, alternative online-supported learning services have gained attention as a mechanism to support student success. However, there is a paucity of research regarding student perceptions and usage patterns for online support and the impact of these services on students’ learning experiences. To address this gap, this study explored the usage patterns and perceived impact of an online support service among students enrolled in a large research-intensive university in Australia, using data collected through the third-party provider and a supplementary student survey from July 2018 to June 2019. Overall, 90.4% of students considered their interaction with the service to be positive, with 81% reporting that the service assisted their learning. While the service is not aimed at replacing face-to-face tutoring or academic skills support, this study suggests that online-supported learning services may provide an increasingly relevant and useful service to students and supplement face-to-face offerings.
How to Cite: Dollinger, M., Cox, S., Eaton, R., Vanderlelie, J. and Ridsdale, S., 2020. Investigating the Usage and Perceptions of Third-Party Online Learning Support Services for Diverse Students. Journal of Interactive Media in Education, 2020(1), p.14. DOI: http://doi.org/10.5334/jime.555
Submitted on 22 Nov 2019; Accepted on 03 Apr 2020; Published on 05 Jun 2020

Introduction

Between 2010 and 2017, the uncapping of university places under the ‘Demand Driven System’ gave rise to considerable change in the Australian higher education sector. The 2019 report of the Australian Government Productivity Commission provides insight into the growing diversity of student cohorts across the country, with increased participation from mature-age students and students from regional, remote, and low socio-economic backgrounds (Productivity Commission 2019). Studies report that as many as 51% of all students attending university are first-in-family (O’Shea 2016; Spiegler & Bednarek 2013), and students over the age of 25 accounted for 47% of all students in 2015 (Department of Education 2015). Universities Australia (2019), the leading body for higher education in Australia, further reports increases between 2015 and 2018 in students with a disability (106%), Indigenous students (89%), students from a low socio-economic status background (55%), and regional and remote students (48%), highlighting the growing diversity of the student cohort. Online student enrolments have also grown in Australia, with one in five domestic students now studying off-campus through online or mixed-mode learning (Norton & Cherastidtham 2018).

Such alterations in the demographic mix of student cohorts have coincided with changes in the way students choose to engage with their universities and in their support expectations. Students, whether classified as non-traditional or not, are seeking more flexible study options, access to increased online and after-hours support, and co-curricular opportunities that allow them to shape their learning, engagement and success at university (Stone et al. 2019; Tseng & Walsh 2016; Wanner & Palmer 2015). The provision of such nuanced and just-in-time support, in the study environment of the student’s choosing, is an important factor in supporting student retention and success (Stone & O’Shea 2019).

In an attempt to better understand the needs of their students, universities are reaching out to students to act as change agents, co-creators, or partners, with the aim of enhancing the student experience at both an institutional and an individual level (Dollinger, Lodge & Coates 2018; Fielding 2001; Gravett, Kinchin & Winstone 2019). By collaborating with students, universities have gained greater insight into the services and support mechanisms that students desire. One avenue to explore in this territory is how online platforms and/or study options can amplify flexibility for students and perhaps better align with their lifestyles and learning preferences.

There exist numerous examples in the literature of how higher education has begun to explore information and communication technologies (ICT) and technology-based learning support for students, including online video resources (Darling-Hammond, Zielezinski & Goldman 2014), utilisation of social media tools (e.g. Holley & Oliver 2010), and even three-dimensional virtual worlds where online students can interact with each other through personalised avatars (e.g. Warburton 2009). Significant research supports the hypothesis that online elements within course design and/or in supplementary study skills materials have a generally positive influence on student learning (e.g. Turula 2018). In practice, the use of online modalities supports the achievement of two compatible aims: supporting students’ flexible study options (e.g. studying at home or abroad) and pushing the boundaries of a traditional brick-and-mortar learning environment (Ellis & Goodyear 2013; Oliver & Herrington 2003).

As the sector has come to understand, stretching the boundaries of higher education in the digital age not only means reaching globally, but also the deepening of support domestically (Taylor & Newton 2013). Students who work part-time or who live far from campuses can benefit from the shift to online and digital resources that allows them to work flexibly (Alexander 2001; Laurillard 2005). Studies have shown that the majority of students are in favour of using learning technologies (e.g. mobile devices) to support their studies (e.g. Farley et al. 2015) and appreciate when online learning provides a clear structure for the curriculum and supports their self-regulated learning (Paechter & Maier 2010). Literature also suggests that student preferences for using learning technologies or seeking online-supported subjects relate not to their individual learning style but rather to other factors such as commitments outside university, technology competence, and travel difficulties (Zacharis 2011).

For non-traditional students who may have a limited capacity to physically attend campus (e.g. due to work or family commitments, living in a remote location, or a disability or mental health issue), face-to-face study skills provision may not be adequate (LaPadula 2003). For this reason, it is imperative that higher education institutions explore and evaluate suitable online student support services to further assist students, especially those identified as being ‘at risk’. The current study sought to investigate the impact of an online learning support platform on the student experience. The platform had two distinct functions: an online live-chat tutoring service, and an online writing submission service that allowed students to receive personalised feedback on a range of written submissions.

Contextually, this study takes place at a time of growing student diversity and an increased presence of third-party providers competing to offer teaching and study support. The growing marketisation of higher education and the adoption of organisational business strategies (e.g. Brown & Carasso 2013; Molesworth, Nixon & Scullion 2009), including logistical outsourcing, has led many universities to outsource services including travel, expenses, parking, facility maintenance, transportation, student housing, and even student support. Research on the use of third-party service providers notes benefits of transitioning to third parties, such as lower labour costs, flexibility, and improved delivery and service (Daugherty, Stank & Rogers 1996; Marasco 2008). However, while several key studies have examined this in the for-profit space, there is a lack of analysis of the benefits of third-party services in the higher education context. This study therefore also addresses a critical gap in higher education research, exploring how third-party online study providers can support student success.

Study Design and Methods

The service evaluated in this study was Studiosity, a third-party online support program established over a decade ago and utilised throughout Australia and internationally. The platform provides two distinct services: an online live chat function (offered six days a week) and a writing submission service offered 24/7. The online chat allows students to submit course-related questions and be matched with a suitable tutor, in approximately 5–30 minutes, when the platform is open. While the chat function has no videoconference facility, it includes a collaborative whiteboard that can be used for maths, as well as file-sharing options. The writing submission service accepts formal essays or reports as well as résumés, and provides feedback on grammar, spelling, structure, and readability within 24 hours. The platform aims to support the development of foundational study and research skills, with additional elementary academic guidance in areas such as English, Maths, Statistics, and Year 12 Science (e.g. introductory Biology and Chemistry).

This study aimed to evaluate the usage and impressions of impact of an online learning support service offered to students enrolled in a large research-intensive university in Australia. Full ethical approval was obtained prior to the commencement of the data collection (HEC18322). Research questions included:

RQ1 What were the demographics of students who elected to use the optional online support service?

RQ2 How did students choose to engage with the platform (i.e. what services and what behaviour did they exhibit)?

RQ3 What were students’ perceptions of the platform’s services?

To explore each of these research questions, we drew on two sets of data across two distinct time periods. One set of data was collected through the platform hosting the service (i.e. a website), which reported analytics on the specific services used by students, login time patterns, and the types of study skills support accessed (e.g. specific subjects/materials). This dataset spanned both Semester 2, 2018 and Semester 1, 2019, and included student participation by course level, student satisfaction (i.e. rating), and student usage by time of day. We will refer to this data as ‘Dataset 1’.

In Semester 2, 2018, the research team also drew upon a supplementary survey (administered through Qualtrics) sent to students who utilised the service. It should be noted that the survey was voluntary, and not all students who utilised the platform chose to participate (n = 48, 15% of the total user cohort). The survey instrument asked students to respond to a number of questions in various formats (e.g. Likert-type response scales, open-ended responses). Items assessed student perceptions of the impact of the service on their academic performance. For example, students were asked to indicate, on a 5-point Likert-type response scale (strongly disagree = 1 to strongly agree = 5), their level of agreement with statements such as, “I believe I will get a higher grade due to the feedback I received from the service” and “I am more confident in my ability to learn after using the service.” We will refer to this data as ‘Dataset 2’.
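For readers interested in how such Likert-type items translate into the agreement percentages reported in the Results section, the sketch below (Python with pandas; the data values and column names are hypothetical, not drawn from Dataset 2) illustrates the aggregation we assume.

```python
import pandas as pd

# Hypothetical stand-in for Dataset 2: one row per respondent, one column per
# survey item, responses coded 1 (strongly disagree) to 5 (strongly agree).
responses = pd.DataFrame({
    "higher_grade": [4, 5, 3, 4, 2, 5, 4],
    "more_confident": [5, 4, 4, 3, 4, 5, 4],
})

def agreement_rate(item: pd.Series) -> float:
    """Percentage of valid responses that agree (4) or strongly agree (5)."""
    valid = item.dropna()
    return (valid >= 4).mean() * 100

for col in responses.columns:
    print(f"{col}: {agreement_rate(responses[col]):.0f}% agree/strongly agree")
```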

Participant Sample

The datasets used in this study were collected from two different, yet overlapping, student cohorts. Data collected by the platform (Dataset 1, see Table 1) included all students who used the platform in Semester 2, 2018 or Semester 1, 2019. Dataset 2 (see Table 2) constituted the subset of students who used the platform in Semester 2, 2018 and also opted to participate in the voluntary supplementary survey. As can be seen in Tables 1 and 2, over and above the demographic information collected in Dataset 1 (gender, status and age) for all platform users, the supplementary survey obtained additional identifiers, such as campus location and typical final grade aim.

Table 1

Descriptive Data (Dataset 1).

2018 2019

% (n) % (n)

Gender
     Female 72% (225) 82% (521)
     Male 28% (89) 18% (116)
     Indeterminate/Intersex/Unspecified – <1% (1)
     Total 100% (314) 100% (638)
Student Status
     Domestic 88% (275) 86% (551)
     International 12% (39) 14% (87)
     Total 100% (314) 100% (638)
Age
     Mature Age (25+) 32% (99) 34% (216)
     Under 25 (<25) 68% (215) 66% (422)
     Total 100% (314) 100% (638)

Table 2

Descriptive Data (Dataset 2).

Total Sample

(N = 48)

Gender, % (n)
     Female 71.7% (33)
     Male 21.7% (10)
     Transgender 4.3% (2)
     I prefer not to say 2.2% (1)
     Total 100.0% (46)
Student Status, % (n)
     Domestic 85.1% (40)
     International 14.9% (7)
     Total 100.0% (47)
Location, % (n)
     Campus Location A (Metro) 39.1% (18)
     Campus Location B (Regional) 37.0% (17)
     Campus Location C (Regional) 15.2% (7)
     Campus Location D (Regional) 6.5% (3)
     Campus Location E (Regional) 2.2% (1)
     Total 100.0% (46)
Final Grade Aim, % (n)
     Above 80% 63.0% (29)
     Between 70–80% 19.6% (9)
     Between 60–70% 15.2% (7)
     I prefer not to say 2.2% (1)
     Total 100.0% (46)

Results

The first aim of this evaluation was to better understand the characteristics of the students using the platform (e.g. gender, domestic or international status, mature age status). Data drawn from the platform (Dataset 1) indicated that one third of the students who used the platform identified as mature age, defined here as 25 years or older (32% in 2018 and 34% in 2019), and that users were more likely to be female (72% in 2018 and 82% in 2019) (Table 1). For those students participating in the follow-up survey (Dataset 2), 61% indicated they lived in a regional area and 35% indicated that they were aiming to achieve a final grade at or below 79% (note: 80% or higher is an H1 or A equivalent) (Table 2). These findings suggest diversity in the cohort utilising the platform, and that the platform may be an important mechanism by which to support student equity at the university.

It is also interesting to compare participant descriptive data with that of the total student population, to see whether participation rates in the platform aligned with the overall proportions of students. For example, the cohort at the university where the study took place was 65% female, yet the proportion of Studiosity users who identified as female was 72% in 2018 and 82% in 2019. Similarly, while only 12–14% of participants indicated they were international students, international students make up 24% of the university’s total student population. However, as promotion of the platform was inconsistent (it was promoted only in specific subjects or disciplines), future research and analysis would need to explore whether gender or student status indicates a true preference for platform use.

Platform Utilisation

The platform was utilised by 314 individual students in 2018, with an increase in utilisation in 2019 (n = 638). However, the data in Table 3 represent totals measured by ‘interactions’ or usage, not individual participants. To illustrate, the 314 individuals who used the platform in 2018 generated 332 ‘Connect Live’ sessions and 830 writing submissions. The data below therefore break usage down by interactions, rather than by unique users.
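As a minimal illustration of the distinction between interaction counts and unique users, the sketch below (Python with pandas; the log rows and field names are fabricated for the example, not taken from the platform) shows how the two figures diverge.

```python
import pandas as pd

# Hypothetical usage log: one row per interaction, so a student who uses the
# platform several times contributes several rows but counts once as a user.
log = pd.DataFrame({
    "student_id": ["s01", "s01", "s02", "s03", "s03", "s03"],
    "service": ["writing", "connect_live", "writing",
                "writing", "writing", "connect_live"],
})

unique_users = log["student_id"].nunique()           # individuals (cf. 314 in 2018)
interactions_by_service = log["service"].value_counts()  # usage (cf. 830 + 332)

print(f"Unique users: {unique_users}")
print(interactions_by_service)
```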

Table 3

Writing Feedback Participation by Course Level.

2018 2019

% (n) % (n)

First Year Undergraduate 45% (375) 49% (610)
Second Year Undergraduate 41% (340) 28% (346)
Third Year Undergraduate 9% (71) 13% (158)
Honours Year <1% (2) 1% (11)
Pathway Program/Certificate/Diploma 3% (26) 1% (11)
Postgraduate 2% (16) 8% (102)
Higher Degree Research – <1% (5)
Total n = 830 n = 1243

The results of our study also showcased how the two services were utilised across course levels. Consistent with research on the student lifecycle (e.g. Larmar & Lodge 2014), we expected that student usage of the supplementary study support may vary with progression through a course. As can be seen from Table 3, requests for writing feedback were much higher among students in first and second year levels compared to those in third year, honours, postgraduate or higher degree research (HDR) course levels.

Interestingly, the patterns of engagement with the platform varied for the ‘Connect Live’ (i.e. online tutoring) service (Table 4). While first year undergraduates remained the dominant users, this service saw greater engagement from third year undergraduates than from second year undergraduates. It is unclear why this shift occurred, though it is possible that specific lecturers or subjects promoted the service to third year students, causing an increase in participation. Notably, while the research team had anticipated that students involved in the university’s pathway program (Tertiary Preparation Program) would be high users of both services, outcome data across both years indicated the opposite pattern of usage among this student cohort.

Table 4

‘Connect Live’ Participation by Course Level.

2018 2019

% (n) % (n)

First Year Undergraduate 65% (216) 68% (89)
Second Year Undergraduate 8% (25) 12% (16)
Third Year Undergraduate 24% (79) 14% (18)
Honours Year 1% (2) –
Pathway Program/Certificate/Diploma 3% (9) 3% (4)
Postgraduate <1% (1) 3% (4)
Higher Degree Research – –
Total n = 332 n = 131

The study also investigated which subject areas students utilised through the ‘Connect Live’ service. As shown in Table 5, the service offered a range of subject areas, some broad in nature (e.g. referencing, report writing, study coach) and others more specific (e.g. Macroeconomics, Statistics). The subject area most commonly utilised by students in 2018 was Maths, accounting for 47% of all ‘Connect Live’ sessions (n = 156). It is important to clarify that the service only offered bridging Maths, covering concepts such as basic algebra that may serve as a refresher for students undertaking more complex Maths at university. In 2019, however, engagement with Maths fell dramatically to only 13% (n = 7) of all ‘Connect Live’ sessions. A likely contributing factor to this reduced engagement was that a subject which heavily promoted the Maths ‘Connect Live’ sessions in 2018 was not offered in Semester 1, 2019. The same pattern occurred in Chemistry (30%, n = 99 in 2018 vs. 8%, n = 10 in 2019) and was again associated with differences in subject offerings across semesters. Conversely, the utilisation of essay writing support in ‘Connect Live’ rose sharply, from only 7% of all sessions in 2018 (n = 22) to 40% in 2019 (n = 52). These findings highlight the importance of taking a curriculum-based approach, promoting the service at the subject level to support usage.

Table 5

‘Connect Live’ Participation by Subject Area.

2018 2019

% (n) % (n)

Assignment Research 1% (4) 3% (4)
Biology [Bridging] 4% (12) 7% (9)
Business Studies [Bridging] – 3% (4)
Chemistry [Bridging] 30% (99) 8% (10)
English Skills and Concepts 2% (6) 4% (5)
Essay Writing 7% (22) 40% (52)
Library Skills <1% (1) –
Maths [Bridging] 47% (156) 13% (7)
Macroeconomics [First Year] – 1% (1)
Microeconomics [First Year] 1% (2) 1% (1)
Physics [Bridging] 1% (2) 2% (2)
Referencing 4% (13) 9% (12)
Report Writing 3% (9) 6% (8)
Statistics [First Year] 1% (4) 3% (4)
Study Coach 1% (2) 2% (2)
Total n = 332 n = 131

The evaluation of the platform further sought to explore use of the platform outside of standard business hours (9am–6pm). As the service was provided to students only as an addition to the traditional face-to-face academic skills support offered by the institution, it was important to understand the role this service played in delivering support at times when the university was unable to offer the standard service: after hours (6pm–9pm), late night (9pm–3am) and early morning (3am–9am).
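For readers reproducing this analysis, the sketch below (Python; the function and variable names are our own, not drawn from the platform) shows one way interaction timestamps could be assigned to the four time-of-day bands used in Tables 6 and 7, noting that the late night band wraps past midnight.

```python
from datetime import datetime

def time_band(ts: datetime) -> str:
    """Assign a timestamp to the time-of-day bands used in Tables 6 and 7.

    A minimal sketch under our reading of the bands; the 'late night' band
    (9pm-3am) spans midnight, so it is checked with an OR condition.
    """
    h = ts.hour
    if 9 <= h < 18:
        return "business hours (9am-6pm)"
    if 18 <= h < 21:
        return "after hours (6pm-9pm)"
    if h >= 21 or h < 3:
        return "late night (9pm-3am)"
    return "early morning (3am-9am)"

print(time_band(datetime(2018, 8, 14, 22, 30)))  # -> late night (9pm-3am)
print(time_band(datetime(2018, 8, 15, 10, 0)))   # -> business hours (9am-6pm)
```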

Submissions for writing feedback outside of standard business hours accounted for 43% of all interactions in 2018 and 44% in 2019 (Table 6). Utilisation outside of business hours was even higher for the ‘Connect Live’ interactions, accounting for 70% of all interactions in 2018 and 72% in 2019 (Table 7). In other words, students were accessing support at times when in-person support was unavailable. Linking back to the earlier discussion of the growing diversity within the student cohort, these results may indicate that the service provides a useful and relevant alternative for students who wish to access support outside of standard business hours, when they may have work and/or family responsibilities.

Table 6

Writing Feedback Interactions by Time of Day.

2018 2019

% (n) % (n)

Business hours (9am–6pm) 58% (478) 56% (702)
After hours (6pm–9pm) 23% (189) 18% (220)
Late night (9pm–3am) 15% (123) 22% (273)
Early morning (3am–9am) 5% (40) 4% (48)
Total n = 830 n = 1243

Table 7

‘Connect Live’ Interactions by Time of Day.

2018 2019

% (n) % (n)

Business hours (9am–6pm) 30% (98) 28% (37)
After hours (6pm–9pm) 33% (110) 31% (40)
Late night (9pm–3am) 37% (124) 41% (54)
Early morning (3am–9am) – –
Total n = 332 n = 131

This study also explored students’ perceptions of the services. It is important to note that not all students elected to respond to the satisfaction question at the end of their interaction; therefore, not all interactions received a student rating (see the tables below). Overall, students were satisfied with both the writing feedback (95% satisfied or extremely satisfied) and the ‘Connect Live’ services (78% satisfied or extremely satisfied), with greater satisfaction reported for the writing feedback service (Tables 8 and 9). While not all students expressed satisfaction with the services (5.1% somewhat or extremely dissatisfied), the results suggest that, overall, the platform had value for the majority of those who chose to engage with it.

Table 8

Writing Feedback Student Rating.

2018 2019

% (n) % (n)

Extremely satisfied 76% (230) 76% (460)
Somewhat satisfied 17% (52) 20% (121)
Neutral 3% (9) 2% (14)
Somewhat dissatisfied 2% (6) 1% (5)
Extremely dissatisfied 2% (4) 1% (7)
n = 301 (36% of total interactions) n = 607 (48% of total interactions)

Note: No response recorded for 529 interactions in 2018 and 636 interactions in 2019.

Table 9

‘Connect Live’ Student Rating.

2018 2019

% (n) % (n)

Extremely satisfied 59% (143) 60% (64)
Somewhat satisfied 19% (47) 18% (19)
Neutral 10% (23) 9% (10)
Somewhat dissatisfied 8% (20) 9% (9)
Extremely dissatisfied 4% (9) 4% (4)
n = 242 (72% of total interactions) n = 106 (80% of total interactions)

Note: No response recorded for 90 interactions in 2018 and 25 interactions in 2019.

As mentioned previously, the data collected by the platform (Dataset 1) were supplemented by a voluntary survey offered to students in Semester 2, 2018 (Dataset 2). This survey sought to explore student perceptions of the service’s impact beyond satisfaction (i.e. the student rating). These additional findings are presented and discussed below.

The results of the supplementary survey (Table 10) indicated that a large proportion of students (n = 27, 70%) agreed or strongly agreed that they would get a higher grade because of the feedback they received from the online tutoring service. The majority of students also reported being more confident in their ability to learn after using the service (n = 31, 81%) and finding the service easy to use (n = 31, 88%). Interestingly, 44% of students (n = 17) also indicated that the provision of the service may make them more likely to stay enrolled at the university. Additionally, over half of the students (n = 21, 58%) agreed or strongly agreed that the online learning support service was more helpful for them than other forms of study support, such as lecturers’ office hours (Table 10).

Table 10

Students’ Perceived Impact of Online Learning Support Service.

Question Strongly Disagree% (n) Disagree % (n) Neutral % (n) Agree % (n) Strongly Agree % (n)

Q1. I believe I will get a higher grade because of the feedback I received from the online learning support service. 5% (n = 2) 8% (n = 3) 18% (n = 7) 44% (n = 17) 26% (n = 10)
Q2. I am more confident in my ability to learn after using the online learning support service. 5% (n = 2) 5% (n = 2) 8% (n = 3) 47% (n = 18) 34% (n = 13)
Q3. I am more likely to stay enrolled at university because of the support I received from the online learning support service. 11% (n = 4) 11% (n = 4) 34% (n = 13) 26% (n = 10) 18% (n = 7)
Q4. The service helped me more than other types of study support, such as lecturers’ office hours. 6% (n = 2) 3% (n = 1) 33% (n = 12) 36% (n = 13) 22% (n = 8)
Q5. The service was easy to use. 5% (n = 2) 5% (n = 2) 32% (n = 12) 58% (n = 22)

The study also explored students’ perceptions of the service’s effect on specific academic skills. When asked to rate the impact of the online learning support service, the majority of students reported a positive impact on their overall grade on assessments (n = 30, 77%), writing skills (n = 28, 74%) and referencing skills (n = 23, 61%). Some students also reported that the online service had a positive impact on submitting assignments on time (n = 17, 45%) and on their study habits, such as the ability to organise their thoughts and make a work plan (n = 14, 38%) (see Figure 1).

Figure 1: Students’ Perceptions on Impact.

Through an additional evaluation mechanism embedded in the platform itself, students were also asked to rate their overall satisfaction with the service through binary agree/disagree questions. As can be seen in Table 11, the majority of students agreed that the platform provided the help they needed, was easy to use, and helped them feel more confident. Note that the total number of responses differs across questions, as students chose whether or not to respond.

Table 11

Student Satisfaction Responses.

Survey Questions Writing Feedback ‘Connect Live’

Agree Disagree Agree Disagree

I got the needed help n = 754 n = 3 n = 251 n = 68
It was easy to use n = 757 n = 14 n = 303 n = 14
I feel more confident n = 729 n = 36 n = 265 n = 49

Limitations

Our findings, while significant, are tempered by a few limitations. Firstly, and most critically, our supplementary survey had limited student responses (between 30 and 40, depending on the question), as it was voluntary and administered in a single semester. In the future, it may be helpful to promote the survey more widely and perhaps offer an incentive to increase completion. Our study was also situated at only one institution, even though the platform is used across many institutions in both Australia and the United Kingdom. Future evaluations could benefit from collaborating with university partners to share data and co-analyse the results, which could help establish whether different university contexts and/or student cohorts engage with the platform differently.

Discussion

The results of our study, at a large research-intensive university in Australia, highlight that online study support may be an important mechanism to support student learning and promote student success. Students who engaged with the platform were diverse, with one third being mature age (25 years or older) and a significant proportion studying at a regional campus location. The data further showed that a significant proportion of interactions for both the writing feedback and the ‘Connect Live’ functions occurred outside of standard business hours (9am–6pm). These results evidence that the provision of additional online study support may help students who cannot otherwise attend in-person support during university operating hours. Our findings provide early guidance for administrators and practitioners considering online-supported learning technologies, encouraging them to reflect on their own student population and determine which cohorts may be best served by directed interventions. If universities, for example, want to support non-traditional students (such as those who are mature age, part-time, or studying from distant locations), it is likely critical that 24/7 online support services also be made available.

While previous research has indicated students may prefer face-to-face learning support over online learning support (see Bishop & Verleger 2013; Otter et al. 2013; Tratnik, Urh & Jereb 2019), we found that a significant proportion of students found online learning support helpful. This finding was of particular interest given that the services were provided not by the university in question but by a third-party provider. Despite this, the majority of students were satisfied with the quality, indicating they were satisfied or extremely satisfied with both the ‘Connect Live’ (78%) and writing feedback (95%) services. In the supplementary survey (Dataset 2), students also agreed or strongly agreed that the support they received may help them receive a higher grade (69.2%), that they were more confident in their learning (81%), and that the support may increase their likelihood of staying at university (44%). While future research would benefit from comparing the value and demographic rates of participation for online and face-to-face study support, these results evidence the potential value, in terms of success and retention, of offering students online-supported help and feedback, particularly out of hours.

Finally, our study has highlighted the volatile nature of student engagement with supplementary services such as the one explored here. There was considerable variation in engagement with the platform across subject areas, course levels and topics of interest. This may be influenced by the willingness of academic staff to endorse the platform to their students: as the research team learned, there were certain subjects where lecturers strongly endorsed the online study support service and others where they did not. Future investigation is required to more rigorously ascertain the perceptions staff hold towards this type of platform and their motivations for endorsing it (or not) to students.

Conclusion

Our study aimed to investigate the impact of online study support services on students at an Australian university. This research is important in the context of supporting increasing diversity in the student cohort. Our results indicate that online study support services are useful to students and may form an important part of providing flexible study options for all students, regardless of campus location or study load.

While online study support services may not replace face-to-face support in the near future, online support may appeal to a certain subset of students, for example those studying in remote locations, or those who have employment or family commitments that make traveling to campus during business hours difficult. Future research should continue to identify and explore the ways in which students in remote locations and/or with family and work commitments could benefit from online learning support, and further examine the importance of not only service provision but also service impact in terms of supporting student confidence, success, engagement and persistence (e.g. Ligorio, Impedovo, & Arcidiacono 2017).

Competing Interests

The authors have no competing interests to declare.

References

  1. Alexander, S. 2001. E-learning developments and experiences. Education + Training, 43(4/5): 240–248. DOI: https://doi.org/10.1108/00400910110399247 

  2. Bishop, JL and Verleger, MA. 2013. The flipped classroom: A survey of the research. In: ASEE National Conference Proceedings, 23–26 June 2013, 30(9): 1–18. Atlanta, GA. 

  3. Brown, R and Carasso, H. 2013. Everything for sale? The marketisation of UK higher education. Abingdon: Routledge. DOI: https://doi.org/10.4324/9780203071168 

  4. Darling-Hammond, L, Zielezinski, MB and Goldman, S. 2014. Using technology to support at-risk students’ learning. Alliance for Excellent Education. 

  5. Daugherty, PJ, Stank, TP and Rogers, DS. 1996. Third-party logistics service providers: Purchasers’ perceptions. International Journal of Purchasing and Materials Management, 32(1): 23–29. DOI: https://doi.org/10.1111/j.1745-493X.1996.tb00222.x 

  6. Department of Education Australia. 2015. Higher Education Statistics 2015. Available at: https://www.education.gov.au/selected-higher-education-statistics-2015-student-data. 

  7. Dollinger, M, Lodge, J and Coates, H. 2018. Co-creation in higher education: towards a conceptual model. Journal of Marketing for Higher Education, 28(2): 210–231. DOI: https://doi.org/10.1080/08841241.2018.1466756 

  8. Ellis, R and Goodyear, P. 2013. Students’ experiences of e-learning in higher education: The ecology of sustainable innovation. New York: Routledge. ISBN: 0415989361. DOI: https://doi.org/10.4324/9780203872970 

  9. Farley, H, Murphy, A, Johnson, C, Carter, B, Lane, M, Midgley, W, Koronios, A, et al. 2015. How do students use their mobile devices to support learning? A case study from an Australian regional university. Journal of Interactive Media in Education, 2015(1). ISSN 1365-893X. DOI: https://doi.org/10.5334/jime.ar 

  10. Fielding, M. 2001. Students as radical agents of change. Journal of Educational Change, 2(2): 123–141. DOI: https://doi.org/10.1023/A:1017949213447 

  11. Gravett, K, Kinchin, IM and Winstone, NE. 2019. ‘More than customers’: Conceptions of students as partners held by students, staff, and institutional leaders. Studies in Higher Education, 1–14. DOI: https://doi.org/10.1080/03075079.2019.1623769 

  12. Holley, D and Oliver, M. 2010. Student engagement and blended learning: Portraits of risk. Computers & Education, 54(3): 693–700. DOI: https://doi.org/10.1016/j.compedu.2009.08.035 

  13. LaPadula, M. 2003. A comprehensive look at online student support services for distance learners. The American Journal of Distance Education, 17(2): 119–128. 

  14. Larmar, S and Lodge, JM. 2014. Making sense of how I learn: Metacognitive capital and the first year university student. Student Success, 5(1): 93. DOI: https://doi.org/10.5204/intjfyhe.v5i1.193 

  15. Laurillard, D. 2005. E-learning in higher education. In: Ashwin, P (ed.), Changing higher education, 87–100. London: Routledge. 

  16. Ligorio, MB, Impedovo, MA and Arcidiacono, F. 2017. Agency online: Trends in a university learning course. Technology, Pedagogy and Education, 26(5): 529–543. DOI: https://doi.org/10.1080/1475939X.2017.1350599 

  17. Marasco, A. 2008. Third-party logistics: A literature review. International Journal of Production Economics, 113(1): 127–147. DOI: https://doi.org/10.1016/j.ijpe.2007.05.017 

  18. Molesworth, M, Nixon, E and Scullion, R. 2009. Having, being and higher education: The marketisation of the university and the transformation of the student into consumer. Teaching in Higher Education, 14(3): 277–287. DOI: https://doi.org/10.1080/13562510902898841 

  19. Norton, A and Cherastidtham, I. 2018. Mapping Australian higher education. Grattan Institute, Melbourne, Australia. Available at https://grattan.edu.au/wp-content/uploads/2018/09/907-Mapping-Australian-higher-education-2018.pdf. 

  20. Oliver, R and Herrington, J. 2003. Exploring technology-mediated learning from a pedagogical perspective. Interactive Learning Environments, 11(2): 111–126. DOI: https://doi.org/10.1076/ilee.11.2.111.14136 

  21. O’Shea, S. 2016. Engaging families to engage students: Exploring how university outreach activities can forge productive partnerships with families to assist first-in-family students navigate their higher education journey. Australian Government, Department of Education and Training: Canberra. Available at http://www.firstinfamily.com.au/docs/OShea_S_NTF_Report_2017.pdf. 

  22. Otter, RR, Seipel, S, Graeff, T, Alexander, B, Boraiko, C, Gray, J, Sadler, K, et al. 2013. Comparing student and faculty perceptions of online and traditional courses. The Internet and Higher Education, 19: 27–35. DOI: https://doi.org/10.1016/j.iheduc.2013.08.001 

  23. Paechter, M and Maier, B. 2010. Online or face-to-face? Students’ experiences and preferences in e-learning. The Internet and Higher Education, 13(4): 292–297. DOI: https://doi.org/10.1016/j.iheduc.2010.09.004 

  24. Productivity Commission. 2019. The demand driven university system: A mixed report card. Commission Research Paper, Canberra. Available at https://www.pc.gov.au/research/completed/university-report-card. 

  25. Spiegler, T and Bednarek, A. 2013. First-generation students: What we ask, what we know and what it means: An international review of the state of research. International Studies in Sociology of Education, 23(4): 318–337. DOI: https://doi.org/10.1080/09620214.2013.815441 

  26. Stone, C, Freeman, E, Dyment, J, Muir, T and Milthorpe, N. 2019. Equal or equitable? The role of flexibility within online education. Australian and International Journal of Rural Education, 29(2): 26–40. 

  27. Stone, C and O’Shea, S. 2019. Older, online and first: Recommendations for retention and success. Australasian Journal of Educational Technology, 35(1). DOI: https://doi.org/10.14742/ajet.3913 

  28. Taylor, JA and Newton, D. 2013. Beyond blended learning: A case study of institutional change at an Australian regional university. The Internet and Higher Education, 18: 54–60. DOI: https://doi.org/10.1016/j.iheduc.2012.10.003 

  29. Tratnik, A, Urh, M and Jereb, E. 2019. Student satisfaction with an online and a face-to-face Business English course in a higher education context. Innovations in Education and Teaching International, 56(1): 36–45. DOI: https://doi.org/10.1080/14703297.2017.1374875 

  30. Tseng, H and Walsh, EJ, Jr. 2016. Blended vs. traditional course delivery: Comparing students’ motivation, learning outcomes, and preferences. Quarterly Review of Distance Education, 17(1). Available at https://www.researchgate.net/profile/Hung_Tseng2/publication/301204339_Blended_vs_Traditional_Course_Delivery_Comparing_Students’_Motivation_Learning_Outcomes_and_Preferences_Quarterly_Review_of_Distance_Education_171/links/57bdac2d08ae882481a51517.pdf. 

  31. Turula, A. 2018. The shallows and the depths. Cognitive and social presence in blended tutoring. Technology, Pedagogy and Education, 27(2): 233–250. DOI: https://doi.org/10.1080/1475939X.2017.1370388 

  32. Universities Australia. 2019. Data Snapshot: 2019. Available at https://www.universitiesaustralia.edu.au/wp-content/uploads/2019/06/Data-snapshot-2019-FINAL.pdf. 

  33. Wanner, T and Palmer, E. 2015. Personalising learning: Exploring student and teacher perceptions about flexible learning and assessment in a flipped university course. Computers & Education, 88: 354–369. DOI: https://doi.org/10.1016/j.compedu.2015.07.008 

  34. Warburton, S. 2009. Second Life in higher education: Assessing the potential for and the barriers to deploying virtual worlds in learning and teaching. British Journal of Educational Technology, 40(3): 414–426. DOI: https://doi.org/10.1111/j.1467-8535.2009.00952.x 

  35. Zacharis, NZ. 2011. The effect of learning style on preference for web-based courses and learning outcomes. British Journal of Educational Technology, 42(5): 790–800. DOI: https://doi.org/10.1111/j.1467-8535.2010.01104.x 
