Introduction

Upholding academic integrity has always been a priority for academic institutions (Spaulding, 2009). However, verifying that the student registered on a particular programme is the same student sitting a test, and ensuring that they abide by the institution’s code for academic integrity, is often viewed as more challenging for online distance learning programmes than for their traditional campus-based counterparts (Lanier, 2006; Grijalva, 2006; Olt, 2002; Spaulding, 2009; Ullah et al., 2012). The introduction of certification for Massive Open Online Courses (Parr, 2013) has led to increased interest in student authentication in online distance learning contexts, as exemplified by Coursera’s Signature Track (Bartholet, 2013).

More recently there has been a trend towards the adoption of technologies that support remote invigilation (Bartholet, 2013; Eisenberg, 2013). “Remote web invigilation” typically refers to invigilation where a webcam is used to record examinees during a test; invigilators and/or others can subsequently review the recorded session at a convenient time. “Remote live invigilation” typically refers to invigilation where a webcam is used to enable human proctors to watch and monitor examinees remotely as they complete exams.

This paper summarises the joint work between the School of Computer Science and UH Online at the University of Hertfordshire (UH) in evaluating the use of remote live proctors for the purposes of improved student authentication and invigilation in online tests. The following were also motivating factors for both parties:

  1. The increased use of online tests on online distance learning programmes, the need to support students in adhering more closely to technical instructions (e.g. logging into the assessment system), and the provision of more detailed information to UH staff when problems occur.
  2. Unlike in previous years, specific online distance learning student groups are now more likely to meet the hardware and network requirements of remote live invigilation.
  3. The growth in the number of remote invigilation vendors has led to more choice in the types of remote invigilation available and more competitive pricing models.

The key features of the pilot study are introduced next.

The Pilot Study

The pilot study was organised into two phases as shown in Table 1.

Phase   Module                                 Participants’ country of residence
1       Level 6, BSc(Hons) Computer Science    Kenya, Saudi Arabia, Trinidad & Tobago, United Kingdom, Zambia
2       Level 4, BSc(Hons) Computer Science    Egypt, Kenya, Slovakia, United Kingdom

Table 1: Pilot Study Overview.

Phase 1 included two objective tests, one formative and one summative. The tests consisted of both multiple-choice and multiple-response questions. Both assessments were password protected, meaning that only the proctor from the service provider was able to launch them. In addition, the assessments were delivered via a secure browser window, meaning that no other application could be launched during the test.

Phase 2 consisted of one summative test. A summary of the tests included in the pilot can be found in Table 2.

Phase   Test   Number of participants   Type        Duration     No. of questions
1       1a     4                        Formative   30 minutes   20
1       1b     8                        Summative   30 minutes   20
2       2      9                        Summative   50 minutes   40

Table 2: Phases 1 and 2 detailed.

In this work, an external service provider was used for the remote live invigilation. For completeness, a brief summary of the authentication and invigilation process is presented below.

How the remote live invigilation worked

Several providers offer remote live invigilation, including ProctorU (2015a) and Software Secure (2015). A number of administrative steps precede the remote live invigilation. These include setting up the test with the service provider, which entails providing the necessary details of the examination, such as dates, times, access information and, crucially, the examination rules. In addition, the contact details of the person overseeing the exam (usually the module leader) are given so that proctors can call upon the institution for assistance if required.
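
Purely as an illustration of the kind of information supplied at this stage, the following sketch (in Python) shows how such a set-up request might be represented; the field names and example values are our own assumptions and do not reflect the service provider’s actual format:

    # Illustrative sketch only: the kind of details an institution might supply when
    # setting up a test. Field names and values are assumptions, not the provider's schema.
    exam_setup = {
        "module": "Level 4, BSc(Hons) Computer Science",
        "booking_window": {"opens": "2015-05-11", "closes": "2015-05-15"},  # hypothetical dates
        "duration_minutes": 50,
        "access": {"assessment_url": "https://assessment.example.org", "password_held_by": "proctor"},
        "examination_rules": ["no notes or books", "no headphones", "no other persons in the room"],  # hypothetical rules
        "institution_contact": {"role": "module leader", "name": "...", "phone": "...", "email": "..."},
    }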

Once a request has been accepted, students are invited to book a preferred time slot within the window specified by the module leader. The service is available 24 hours a day, 7 days a week, so exams can potentially take place across the complete range of time zones. After a booking has been made, students are sent a confirmation email and the module leader is also notified.

On the day of the exam, students are required to log in at the time of their chosen slot. An email reminder is sent 24 hours prior to the exam, together with full instructions on how to log in and how to request help if needed. If a student forgets to log in, the service provider will contact them by telephone.

Once logged in, students are prompted to download and run the software that connects their webcam and desktop to a live proctor. After a connection has been established, the following authentication and environment checks can begin:

  1. The first of these is the identity check, in which students are asked to present a government or academic photo identification card. The proctor checks that the photo and name on the card match the person who has scheduled the appointment and is present onscreen via webcam. Nothing is recorded or captured at this stage; instead, the check is simply logged as having been carried out.
  2. Next, the proctor takes a digital photo of the student and stores this on the service provider’s system for future reference. If there is already a photo on file, the proctor compares it to the person onscreen.
  3. US citizens are required to answer a series of multiple-choice challenge questions based on public records; a typical example would be selecting a previous postcode from a choice of four. Non-US citizens are typically required to present a second form of photo identification.
  4. Following authentication, students are asked to pan over their work area using their webcam and to hold a reflective surface up to the camera to ensure that no disallowed materials or persons are present.

If all the checks are completed to the satisfaction of the proctor, the examination rules are read out by the proctor and the student is invited to log into the assessment application. Depending on how the assessment is configured, the proctor may be required to enter a password to launch the assessment on behalf of the student.

Finally, once the assessment is running, the proctor monitors the student’s desktop, their environment (for example, sound) and their conduct. Any unusual behaviour is logged and responded to in accordance with the specifications provided by the module leader. Typically, a student is first warned so that they may rectify their behaviour; if the behaviour continues, the proctor can use their judgement to gather enough evidence (photos and screenshots) to report back to the institution.
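
For readers interested in how such a session might be captured in software, the following sketch (in Python) models the sequence of checks and the warn-then-escalate response described above. It is purely illustrative: the class names, check labels and escalation rule are our own assumptions, not any vendor’s implementation.

    # Illustrative sketch only: models the checks and warn-then-escalate response
    # described above. Names and structure are assumptions, not a vendor's API.
    from dataclasses import dataclass, field
    from typing import List

    REQUIRED_CHECKS = ["photo_id", "webcam_photo", "challenge_or_second_id", "room_scan"]

    @dataclass
    class ProctoredSession:
        student: str
        checks_logged: List[str] = field(default_factory=list)  # the checks are logged, not the ID itself
        evidence: List[str] = field(default_factory=list)       # e.g. screenshot file names
        warned: bool = False

        def log_check(self, check: str) -> None:
            if check in REQUIRED_CHECKS:
                self.checks_logged.append(check)

        def ready_to_start(self) -> bool:
            # The assessment is launched only once every required check has been logged.
            return set(REQUIRED_CHECKS) <= set(self.checks_logged)

        def report_behaviour(self, screenshot: str) -> str:
            # First incident: warn the student; repeated incidents: gather evidence
            # and report back to the institution.
            if not self.warned:
                self.warned = True
                return "warn student"
            self.evidence.append(screenshot)
            return "report to institution with evidence"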

It should be noted that, prior to the pilot taking place, some UK-based participants expressed specific concerns about data protection and privacy, for example:

  • being viewed by “a stranger” and sharing their living environment;
  • showing personal identification to “a stranger” and the possibility of identity theft;
  • giving “a stranger” remote access to a personal computer and the potential for data to be removed or malicious software to be installed;
  • the need for guarantees that the proctoring service would adhere to data protection and privacy laws.

Participation in the pilot study was optional. Participants used their University student ID card as the main means of test authentication. Furthermore, participants were provided with information about the service provider’s safety and security certifications, as well as its compliance with a number of safety and security frameworks (ProctorU, 2015b), before deciding whether or not to take part in the pilot.

Following the completion of their proctored assessment, participants were invited to share their experiences with the research team. Nine of the 21 participants replied, and the key findings from this evaluation are presented below.

Participants’ Attitude Towards Remote Live Invigilation

To gain a deeper understanding of their attitudes towards remote live invigilation, participants were invited to answer three open questions via email.

  1. Does having a live proctor make you feel more supported should something go wrong during a test (e.g. technical problem, sickness etc.)?
  2. You have taken a test with a live proctor. Did you feel that having a proctor hinders or enhances online assessment in any way?
  3. Do you think we should use remote live invigilation in other modules?

Table 3 summarises participants’ responses to Question 1: 7 out of 9 participants reported that the availability of a live proctor made them feel more supported, as exemplified by the following quote:

“Yes, it is preferable to have someone on hand in the event of a problem during the exam”.

Question 1: Does having a live proctor make you feel more supported should something go wrong during a test (e.g. technical problem, sickness etc.)?
Yes: 7    Neutral: 2    No: 0

Table 3: Feeling more supported during assessment process (N = 9).

As can be seen from Table 4, no participant reported that the remote live invigilation hindered their online assessment experience. Interestingly, one student said they had initial concerns about being watched but, in practice, this was not an issue as the invigilation process was largely unobtrusive:

“I don’t feel it enhanced my exam, but it did not hinder it either. The speed of the exam would not have allowed a student to search notes even if they wanted to. At the start I was concerned with the thought of having someone watching me, not because I could not cheat, but because I felt it may have added extra pressure. Once I had started I was glad the proctor went invisible, so to speak, and I was able to take the exam uninterrupted.”

Question 2: You have taken a test with a live proctor. Did you feel that having a proctor hinders or enhances online assessment in any way?
Enhances: 2    Neutral: 7    Hinders: 0

Table 4: Perceived impact of remote live invigilation on online assessment experience (N = 9).

Furthermore, some participants reported that the presence of a live proctor enhanced their assessment by allowing them to concentrate more and worry less, as captured by the following quote:

[Remote live invigilation] enhances online assessment because student need not worry about explaining any errors before, during and after the exam. A live proctor takes care of that and hence student’s level of concentration is higher as a result.

Table 5 shows that 8 out of 9 participants were supportive of extending the use of remote live invigilation to other modules. These participants listed factors such as increased credibility of the course as well as the feeling of being “looked after” and valued by the institution as the main reasons for their support.

Question 3: Do you think we should use remote live invigilation in other modules?
Yes: 8    Neutral: 0    No: 1

Table 5: Participants’ perceptions as to whether the use of remote live invigilation should be extended to other modules (N = 9).

The participant who did not support the use of remote live invigilation in other modules commented that the authentication process took too long, resulting in some assessment anxiety:

Usually during this period before an exam I take the time to relax and mentally prepare for the exam. In this case I felt having to go through this lengthy process, made me feel nervous and anxious.

Summary and Future Work

The work reported here is concerned with an initial evaluation of remote live invigilation in an online distance learning programme. As part of this work, participants took part in an online test in their home environment. Student authentication and invigilation were carried out by a remote live proctor via web conferencing and screen sharing technologies.

Despite the limitations of a small-scale study, the results reported here indicate that remote live invigilation presents a potential solution to the issues of student authentication and cheating in online examinations. Furthermore, in spite of some initial concerns about data protection and the impact that feeling “watched” might have on their online assessment experience, participants’ feedback on the use of remote live invigilation was positive overall, with some even suggesting that the presence of a proctor might reduce stress if things go wrong.

Often in Higher Education, high-stakes assessment takes the form of project work and coursework submitted electronically or in hard copy. Student authentication in these circumstances often relies on the experience of tutors, who certify that the work is that of the candidate. Any form of assessment is at risk of cheating by, for example, impersonation or collusion. It should be noted that the security of online examinations does not relate so much to keeping out intruders, as in a secure banking system, but rather to authenticating the candidate to ensure that the test is completed by the candidate and not by some other person on their behalf.

The challenges in authenticating candidates and ensuring that they adhere to assessment regulations in online assessment have been described by Rowe (2004) and Rogers (2006), amongst others. Whilst it may not be possible to ensure that an online examination system is totally secure, it is the view of the authors that remote live invigilation goes some way towards providing assurance that the person taking the exam is indeed the candidate and, to a greater extent, that they are working alone and unaided.

As part of our future work, we plan to carry out a further study involving a larger cohort in order to gain a deeper understanding of students’ attitudes towards remote live invigilation. As mentioned earlier, this was a small-scale pilot, and the participants who shared their experiences were volunteers and therefore subject to self-selection bias.

Another area for future work would be to investigate the design and use of challenge questions based on specific learning contexts (for example, the candidate’s contribution to a forum discussion) and information known only to the candidate to further discourage impersonation.
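
As a purely illustrative sketch of this idea (the function, data source and thread titles below are hypothetical and were not part of the pilot), such a challenge question could be built from the candidate’s own forum activity:

    # Illustrative sketch only: builds a multiple-choice challenge question from a
    # candidate's own forum activity. The data source and thread titles are hypothetical.
    import random

    def build_challenge(candidate_threads, all_threads, num_options=4):
        """Ask the candidate to identify a discussion thread they actually contributed to."""
        correct = random.choice(candidate_threads)
        distractors = random.sample(
            [t for t in all_threads if t not in candidate_threads], num_options - 1
        )
        options = distractors + [correct]
        random.shuffle(options)
        return {
            "question": "Which of these forum discussions did you contribute to?",
            "options": options,
            "answer": options.index(correct),
        }

    # Hypothetical usage:
    question = build_challenge(
        candidate_threads=["Week 3: recursion exercise"],
        all_threads=[
            "Week 3: recursion exercise",
            "Week 1: getting started",
            "Week 5: SQL joins",
            "Week 7: concurrency",
            "Exam logistics",
        ],
    )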

Finally, it is hoped that the use of remote live invigilation would allow for greater flexibility in the assessment formats used in online distance learning programmes. For example, it would make it feasible for online distance learning students to take part in timed practical programming tests under supervised conditions.

Competing Interests

The authors declare that they have no competing interests.