




What Barriers do Students Perceive to Engagement with Automated Immediate Formative Feedback?

Author:

Stephen Foster

The Open University, GB

Abstract

A small preliminary PhD research project used OpenEssayist, a web-based automated writing evaluation (AWE) system designed to provide immediate formative feedback to students, to gain insights into how students use such systems. One of the themes which emerged from the data analysis was that most of the students on the module did not make use of OpenEssayist, which raised the question of why. Are there barriers to student use of immediate AWE feedback? The low uptake of OpenEssayist reflects the findings of Attali (2004), whose research on the Criterion AWE system found that 71% of students did not make use of the redrafting facilities of Criterion and were consequently excluded from his data. All thirty (n = 30) students on the module which was the subject of the preliminary research had the opportunity to use OpenEssayist, regardless of whether they participated in the research, yet only four chose to do so. Two students who did not use OpenEssayist were interviewed. The first did not use OpenEssayist for technical reasons; the second did not have enough time to learn about the software. It is not known why the other students did not make use of OpenEssayist. This short paper reports on the preliminary findings of non-use of AWE and outlines how that has led to a research question for a PhD project, which is a work-in-progress. The question is: Are there barriers to student use of immediate AWE feedback?

How to Cite: Foster, S., 2019. What Barriers do Students Perceive to Engagement with Automated Immediate Formative Feedback? Journal of Interactive Media in Education, 2019(1), p.15. DOI: http://doi.org/10.5334/jime.516
Published on 10 Sep 2019. Accepted on 13 Jun 2019. Submitted on 15 Feb 2019.

Introduction

Hattie and Timperley (2007) observed that feedback is one of the most powerful factors affecting learning, though its effectiveness varies depending on how the feedback is delivered. Helping students to learn by encouraging dialogue with feedback is an important part of a lecturer’s role, and feedback can play a key part in developing essay-writing skills and motivating students to achieve their academic goals (Irons, 2008). However, one of the challenges facing busy lecturers is how to provide timely feedback which facilitates student learning (Irons, 2008).

This article reports on work-in-progress in respect of one element of a PhD research project. The overall aim of the project is to build on previous automated writing evaluation (AWE) research and to address some of the criticisms identified in the literature about the usefulness of AWE by asking students about their perceptions of AWE feedback and how it can support them in their writing of essays and development of essay-writing skills. This paper will discuss one component of the project, the aim of which is to explore student perceptions of the barriers to engagement with AWE. The paper will highlight an observation regarding the use of AWE by students, then report the results of a small-scale preliminary project which gave students access to OpenEssayist, an AWE tool developed by The Open University and Oxford University. It will then discuss the results of the preliminary project and the further research which is planned.

Literature

Stevenson and Phakiti (2014) conducted a meta-analysis of the research into AWE feedback and the extent to which students perceive that AWE software can help them with essay writing and develop their essay-writing skills. Their review concluded that there was quite modest evidence to suggest that AWE feedback improved the quality of student essay writing and was consequently valued by students. They identified methodological concerns with some of the AWE research they reviewed. For instance, they cited Attali (2004), who omitted 71% (n = 23,567) of the 33,171 essays in his sample from subsequent analysis because the participants had not redrafted their essays. This is a large number of participants to exclude and, as Stevenson and Phakiti quite correctly pointed out, for over two-thirds of a cohort not to make use of the re-drafting opportunities offered by the AWE software raises a question regarding the extent to which AWE software is able to encourage students to re-draft their essays. Indeed, 48% of Attali’s students who did re-draft did so only once. This finding compares with that of Warschauer and Grimes (2008), who found that students might not re-draft an essay more than once. Schroeder, Grohe and Pogue (2008) also conducted research into AWE software, and 43% of their students who could have used the software did not. Whilst this is less than the 71% of Attali’s students who did not redraft their essays, it does demonstrate that Attali’s research was not unique in having students not make full use of AWE. Taken together, these observations raise the question of why students do not use, or do not make full use of, AWE tools when they are provided.

Court (2014) summarised the position regarding AWE research when she pointed out that more research is required on how students understand and use feedback. A useful starting point for exploring how students use AWE might be the students themselves (Price et al., 2010). Laurillard (2002) made the point that students must be able to understand feedback in order to make use of it. This is an important consideration, as MacLellan’s (2001) research found that 30% of students stated they did not understand the feedback they received. Might a lack of understanding of feedback received from AWE tools result in low student usage of AWE? Carless (2006) observed that students are often more interested in their assessment mark than in feedback, implying that students might not value feedback as much as lecturers believe they should. Gibbs and Simpson (2005) suggested that a greater understanding by students of what is expected of them in their assessments might help them to understand the feedback they receive. Do some students actively resist the use of AWE, perhaps preferring a return to more tutor-centric tuition?

Small Preliminary Research Project – Aim and Methods

The aim of the project was to investigate the extent to which the different types of formative feedback provided by an AWE tool helped students to draft their assignment essays. The AWE tool used for the project was OpenEssayist. Its textual and graphical feedback is intended to encourage students to reflect on the content and structure of their essays. This self-evaluative approach accords with what Beaumont, O’Doherty and Shannon (2011) saw as the fundamental aim of feedback. The approach to feedback taken by OpenEssayist makes it an appropriate vehicle for the investigation of automated formative feedback.

To facilitate the research, a link to OpenEssayist was provided within two documents which were posted on the module virtual learning environment (VLE) and sent to participants via email: A Brief Introduction to OpenEssayist and the OpenEssayist User Guide. The version of OpenEssayist used for the project required participants to have logon credentials which were different to their usual Open University logon credentials; these were also sent to participants by email.

The preliminary project research participants were drawn from a cohort of 30 (n = 30) Open University Master’s-level students, of whom 15 (n = 15) could be invited to participate. Of the six students subsequently interviewed, four (66%) had made use of OpenEssayist and two (33%) had not. Semi-structured interviews were used to gather data and provided an opportunity to probe responses by asking ‘why’ and ‘how’ questions. The interviews were digitally recorded and undertaken after the students had had the opportunity to use OpenEssayist for two module assignments. The interviews were transcribed then coded using NVivo. The data were subjected to Braun and Clarke’s (2006) six-phase thematic analysis process. Within the preliminary dissertation the six research participants were anonymised and referred to by the abbreviations RP1 to RP6.

Results – Interviews with Non-Users of AWE

One of the themes which emerged from a review of the data, after the report write-up, was that most of the students on the module did not make use of OpenEssayist. All 30 students on the module were invited to use OpenEssayist, regardless of whether they participated in the research, yet only four did so. Two of the students who had not used OpenEssayist, subsequently referred to as RP4 and RP5, agreed to be interviewed about their assignment writing. This provided an opportunity to ask them why they had not used OpenEssayist.

RP4’s primary reason for not using OpenEssayist was technical difficulty accessing the tool. They stated they had to go back to the original email to access the link, though they acknowledged that one way around this would have been to bookmark the link. Furthermore, once the link had been found, the logon page kept re-loading and would not allow RP4 to log on with their credentials. The logon problem was probably related to the beta version of OpenEssayist being used for the research. RP4 stated that the logon process was too complicated and that, had they not already agreed to take part in the research, they would have given up trying to access OpenEssayist earlier than they did. They said a tool such as OpenEssayist should have, “…a single click button from the front page of the module website…” and that the logon credentials should be the same as those used to log on to the institution’s VLE.

RP5’s primary reason for not using OpenEssayist was lack of time to use the software. They were busy with work and family commitments and were additionally studying a Master’s-level module. They said, “…I haven’t used it, because I didn’t leave myself time”.

Results – Use of AWE

Analysis of the interviews with the four students who did use AWE identified that OpenEssayist helped them with the drafting of their essays. For example, all four students who used OpenEssayist felt that, to varying degrees, it helped them to structure their essays. RP1, a less experienced student, commented that they felt OpenEssayist had provided a ‘scaffold’ which allowed them to write a better assignment answer. RP2, meanwhile, commented that they were not confident with structuring an essay and that OpenEssayist had assisted; for instance, its summarisation feature helped them to identify word repetition and re-structure their work. Following on from this, students commented that OpenEssayist gave them confidence in their writing. For example, RP2 said they were “not really confident about how to structure an essay, and that’s where this [OpenEssayist] has helped”. RP6 commented that OpenEssayist gave them confidence that they had covered the topic areas required to answer the assignment question.

One unexpected finding was that whilst use of OpenEssayist did not significantly change the way students planned and wrote their assignments, it did potentially make that process quicker. RP2 commented that they actually wrote fewer drafts of their essay “….because it’s [OpenEssayist] given me the feedback to be able to get straight to where I need to change, whereas before I didn’t have that so I just relied on other people reading it and thinking I needed to change so it drastically reduced the amount of drafts I did”. Indeed, three of the four students who used OpenEssayist felt that its summarisation of their assignment helped them to complete the assignment more quickly. For instance, both RP2 and RP6 commented that they always checked their assignments to ensure they had covered the points required and that OpenEssayist’s summarisation process made a check of the essay quicker. RP6 said, “to me it was really quite helpful, you can really check back with the assessment criteria, it takes a bit of the work, not going through the whole essay again, and basically highlighting for example, those you actually see what has become of the essay and have it side by side on the screen […]”.

Analysis and Discussion

The reasons that RP4 and RP5 did not use OpenEssayist were primarily context-specific. As OpenEssayist was not formally part of the module, it was not fully integrated into the module VLE. This meant the process of accessing it was more convoluted than it would otherwise have been, involving opening a ‘user’ document, finding the link within the document, clicking on the link and then locating the provided logon credentials, which were additional to the users’ normal University logon credentials. This situation arose because of the way the research was set up and could have been avoided had there been sufficient lead time to integrate OpenEssayist into the module prior to the start of the research. In previous research by Whitelock et al. (2014) this was done, and access problems were not encountered. However, the outcome does suggest that if AWE software is not readily accessible it might not be used.

Furthermore, Open University students frequently start their degree study later in life. This often means they have work and family commitments which limit the time they can devote to their studies. As a result, AWE tools such as OpenEssayist may be less likely to be used if students cannot find the time to learn how to use the software.

In addition to the two students interviewed who did not use OpenEssayist, a further 24 students (80% of the module) did not use it; in total, therefore, 26 of the 30 students (87%) who had access to OpenEssayist did not use it (a worked check of these figures is shown below). Other than the two students who were interviewed, it is not known why these students did not choose to make use of OpenEssayist. It is possible that not all students were aware the software was available to them for, as Sutton (2017) observed, students do not open every email they are sent.
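As a worked check of the uptake figures reported above (assuming, as stated earlier, a cohort of 30 students of whom four used OpenEssayist and two of the non-users were interviewed), the percentages can be reconstructed as follows:

\[
\frac{30 - 4}{30} = \frac{26}{30} \approx 87\%, \qquad \frac{26 - 2}{30} = \frac{24}{30} = 80\%
\]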

The low uptake of OpenEssayist reflects the findings of Attali (2004) regarding students not making full use of Criterion. In the preliminary project, there were two primary reasons why the participants interviewed did not use OpenEssayist:

  • Technical difficulties accessing the software.
  • Lack of time to learn about the software.

There may be other reasons why students did not use the AWE software, and it would be useful to identify what these reasons might be so that module staff can take them into account when planning the use of AWE within a study programme.

Conversely, module staff might wish to consider why some students do use AWE. The four participants who did use OpenEssayist felt it helped them with the drafting of their essays and gave them some confidence that their essays covered the criteria set for the assignments. RP1, a less experienced student, felt OpenEssayist had helped them write a better essay. It was interesting to note that some students thought OpenEssayist might make the overall writing process quicker by, for example, helping them review their writing more quickly. If quickening the process of assignment review is something which could be generally attributed to AWE, then students might wish to weigh the time saved reviewing an assignment against the time taken to become familiar with AWE software, which might help them overcome lack of time as a barrier to use.

Future Work

Prior to the study it was anticipated that student non-use of OpenEssayist would be quite high, though 87% was an unexpectedly high percentage. A review of the dissertation findings has resulted in the formulation of a research question regarding non-use of AWE by students, which will be included as a subsidiary research question within a future PhD study titled: How do students use an automated immediate formative feedback system to help them write assignments? That students find barriers to using AWE reflects the findings of some of the literature on AWE, such as Attali (2004). It would be useful for educators to identify what some of those barriers might be. For instance, Matsumura and Hann (2004) pointed out that, as computers have become more integrated into the educational environment, an increasing number of students exhibit anxiety when using them; this may affect students’ choice of feedback method and result in some opting not to use AWE software. Similarly, the results of the small preliminary study indicated that one student did not use AWE because of a lack of time to learn about the software. To help understand why there can be some resistance to the use of technology, Davis (1986) developed the technology acceptance model which, amongst other things, sought to identify users’ motivation for using alternative methods. As Davis, Bagozzi and Warshaw (1989) later identified, the non-use of computer systems by managers and professionals was not uncommon. Thus, might some time-pressed students choose not to learn how to use AWE software? Might some students, particularly more experienced ones, be confident in their academic writing and decide that AWE will not assist them?

The future PhD study will consider potential barriers to the use of AWE. It will draw participants from two Open University Level 3 modules. Students will be invited to participate in the research even if they have not used OpenEssayist to obtain feedback. Some of the research interview questions will focus on the non-use of AWE, with the aim of gathering data to establish what some of the barriers to engagement might be. Access to OpenEssayist for the PhD research will be via a web link in the ‘Resources’ section of the participating modules’ websites, which is intended to overcome some of the software access challenges faced by students during the preliminary research.

Conclusion

This article has outlined an additional finding from a small research project exploring the use of AWE by students. The finding indicates that some students do not make use of the opportunity to use AWE when it is provided to them. The article has highlighted that there is evidence from AWE literature that some students do not make full use of AWE and the finding of Attali (2004) has been specifically cited. The literature also highlights some additional reasons why students might not use AWE at all. For example, Matsumura and Hann (2004) highlighted that students may experience some anxiety when using computers and Davis (1986) recognised some user reluctance to accept technology.

The article has identified that an area of future research will be to find out what students perceive to be the barriers to their engagement with AWE. Data to help answer this question will be gathered from students on two Open University Level 3 modules.

Acknowledgements

The author would like to acknowledge the support of his supervisors, Professor Denise Whitelock, Dr Karen Kear and Dr Simon Cross.

Competing Interests

The author has no competing interests to declare.

References

  1. Attali, Y. 2004. Exploring the feedback and revision features of Criterion. Paper presented at the National Council on Measurement in Education annual meeting, 12–16 April 2004, San Diego, USA. 

  2. Beaumont, C, O’Doherty, M and Shannon, L. 2011. Reconceptualising assessment feedback: A key to improving student learning? Studies in Higher Education, 36(6): 671–687. DOI: https://doi.org/10.1080/03075071003731135 

  3. Braun, V and Clarke, V. 2006. Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2): 77–101. DOI: https://doi.org/10.1191/1478088706qp063oa 

  4. Carless, D. 2006. Differing perceptions in the feedback process. Studies in Higher Education, 31(2): 219–233. DOI: https://doi.org/10.1080/03075070600572132 

  5. Court, K. 2014. Tutor feedback on draft essays: Developing students’ academic writing and subject knowledge. Journal of Further and Higher Education, 38(3): 327–345. DOI: https://doi.org/10.1080/0309877X.2012.706806 

  6. Davis, F. 1986. A technology acceptance model for empirically testing new end-user information systems: Theory and results. Thesis (PhD). Sloan School of Management, Massachusetts Institute of Technology. 

  7. Davis, F, Bagozzi, R and Warshaw, P. 1989. User acceptance of computer technology: A comparison of two theoretical models. Management Science, 35(8): 982–1002. DOI: https://doi.org/10.1287/mnsc.35.8.982 

  8. Gibbs, G and Simpson, C. 2005. Conditions under which assessment supports students’ learning. Learning and Teaching in Higher Education, 1: 3–31. 

  9. Hattie, J and Timperley, H. 2007. The power of feedback. Review of Educational Research, 77(1): 81–112. DOI: https://doi.org/10.3102/003465430298487 

  10. Irons, A. 2008. Enhancing learning through formative assessment and feedback. Abingdon: Routledge. DOI: https://doi.org/10.4324/9780203934333 

  11. Laurillard, D. 2002. Rethinking university teaching: A conversational framework for effective use of learning technologies. 2nd ed. Abingdon, UK: Routledge Falmer. DOI: https://doi.org/10.4324/9780203304846 

  12. MacLellan, E. 2001. Assessment for learning: The differing perceptions of tutors and students. Assessment & Evaluation in Higher Education, 26(4): 307–318. DOI: https://doi.org/10.1080/02602930120063466 

  13. Matsumura, S and Hann, G. 2004. Computer anxiety and students’ preferred feedback methods in EFL writing. The Modern Language Journal, 88(3): 403–415. DOI: https://doi.org/10.1111/j.0026-7902.2004.00237.x 

  14. Price, M, Handley, K, Millar, J and O’Donovan, B. 2010. Feedback: all that effort, but what is the effect? Assessment & Evaluation in Higher Education, 35(3): 277–289. DOI: https://doi.org/10.1080/02602930903541007 

  15. Schroeder, J, Grohe, B and Pogue, R. 2008. The impact of Criterion writing evaluation technology on criminal justice student writing skills. Journal of Criminal Justice Education, 19(3): 432–445. DOI: https://doi.org/10.1080/10511250802476269 

  16. Stevenson, M and Phakiti, A. 2014. The effects of computer-generated feedback on the quality of writing. Assessing Writing, 19: 51–65. DOI: https://doi.org/10.1016/j.asw.2013.11.007 

  17. Sutton, H. 2017. Don’t get lost in the noise: Craft emails students will read. Enrollment Management Report, 21(9): 6–7. DOI: https://doi.org/10.1002/emt.30364 

  18. Warschauer, M and Grimes, D. 2008. Automated writing assessment in the classroom. Pedagogies: An International Journal, 3(1): 22–36. DOI: https://doi.org/10.1080/15544800701771580 

  19. Whitelock, D, Twiner, A, Richardson, JT, Field, D and Pulman, S. 2014. OpenEssayist: real-life testing of an automated feedback system for draft essay writing. #design4learning: from blended learning to learning analytics in HE. DOI: https://doi.org/10.1145/2723576.2723599 
