Online exam review

The following is a review of the first set of CIPFA online examinations held in June 2016.

Online examination project review

1. Project objective
To grow CIPFA student and member numbers sustainably by delivering exams globally in a fair and equitable way that allows faster results and better access to data and performance analysis.

2. Background and overview
In updating the Professional Qualification syllabuses, CIPFA took the opportunity to review its assessment strategy and introduce online examinations at the same time as the new syllabus is implemented.

The principal requirement was that professional examinations assess the breadth of the syllabuses through multiple choice questions and the depth of the knowledge and skills through written long form questions, including ones that require the completion of financial statements.

Students can now take examinations in a number of ways:
  • Fully online with remote invigilator
  • Fully online with a local invigilator
  • Using an offline version of the system with a local invigilator
  • On paper with a local invigilator (reserved for special projects and circumstances)

3. Benefits of new system
  • Faster identification of exam issues and therefore targeted user support
  • Ability to securely manage exams so that students can take exams outside allocated sessions (e.g. a student unable to take the test as allocated on a Monday is reassigned to another day) - something that was not possible previously
  • Faster receipt of scripts (and where MCQs were used, faster receipt of marks)
4. Innovation
The project involved a number of innovations in assessment, including remote invigilation (where exams are invigilated via a web-based system), access to a suite of practice tests, and online marking. A wide range of assessment tools allowed examiners to better assess students' knowledge and understanding of the subject.

5. Feedback from November 2015 pilot and resulting system improvements
Following the review of the first two exams held in November 2015, a number of improvements were agreed and implemented to improve the usability of the platform. Specific system improvements included:
  • Integration of exam software (Calibrand) and invigilation software (ProctorU)
  • Feedback report to show questions, student answers, model answers and marks/comments from marker (for tutor marked tests) for practice tests
  • Inclusion of PDF viewer (for formulae sheet, statistical tables and accounting ratios)
  • Independent scrolling for split screen questions containing scenarios, financial data and other additional information
  • Upgrade of answer editor, including: copy/paste, highlight text, etc.
  • Table manipulation (insert rows, columns, etc. with single click)
  • Table presentation and layouts
  • Systems requirements check button for internet speed, screen resolution, browser, OS, etc.
  • Automated browser compliance check (displays message if user opens software in unsupported browser).
On the process side, all email communications were redesigned to be more focused. We provided free technical tests for ProctorU invigilation, as well as generic systems tests, so that students could familiarise themselves with the systems and build confidence ahead of live exams. We have recorded and published three webinars for students and invigilators alike.

6. Feedback from students and employers prior to June 2016 sitting
On 23rd May, the CIPFA Student Network sent Rob Whiteman an official letter of concern regarding the forthcoming exams, summarising the situation as follows: "In summary, the recent changes seem to have been poorly planned, badly implemented, and carried out too quickly without sufficient stakeholder engagement and consultation."

A number of suggestions for immediate improvement were made, some of which were adopted, whereas others would have required more time than available to implement. The full letter is attached.

In addition, a number of employers and students contacted Rob and Giles Orr directly, outlining similar concerns. Additional changes to processes were made in the days leading up to the exams, including the reversion of some class exams to paper-based assessments and the rescheduling of individual examination times where required.

7. Issues during the exam period
A number of issues were experienced during the exams themselves, including time zone mismatches between systems, online invigilation failures and other technical issues, with a number of students having delayed or rescheduled tests.

The majority of these were resolved during the allocated exam period, but many students suffered additional stress, and a number of UK employers expressed dissatisfaction with the process.

8. June 2016 implementation summary
The project to move CIPFA professional and international qualifications to an online system culminated in 14 out of 16 papers in the June 2016 sittings being available electronically, compared to only 2 papers the previous December. In total, 1,415 electronic tests and 737 paper tests were booked (the latter largely driven by locations with unreliable power supplies - Bangladesh, Somalia and certain UN locations).

Overall, the actual e-assessment process worked better than expected, with 81% of all electronic assessments being completed as planned, 7% switching to paper-based tests and an overall no-show rate of 12% (i.e. students not taking a booked exam); this is in line with previous exam sittings.

9. Ongoing feedback
Feedback has been sought from students and employers who experienced particular difficulties in the lead up to and during exams. Additionally, a survey was sent to all participants shortly after exams were completed.

This has enabled us to triangulate and pinpoint the specific issues with the approach, processes and overall project execution.

10. Remote invigilation - exit survey
Students were naturally apprehensive about remote invigilation, as the majority had no direct experience of the process and were nervous of technology failure on the day.

Over 80% of students that sat their exams via remote invigilation were satisfied or very satisfied with the experience, an increase from 74% in the November 2015 pilot exams. Only 8% of students were dissatisfied or very dissatisfied, mainly due to connection issues caused by slow internet.

11. Student survey
A full survey is being undertaken to seek the views of all students on the June 2016 exam cycle, covering their experience and opinions of bookings and processes, communications, tuition, materials and the online exams. Early analysis (265 respondents) shows high levels of dissatisfaction with:
  • Quality of learning material content
  • Practice questions (number of attempts, number of questions, availability)
  • Timeliness of responses to queries
  • Unavailability of exam centres
70% of students said that they would want to continue to take the exams under the new approach.

12. Summary of issues identified during the exam diet

A full list of issues and proposed resolutions/actions is shown in the table below. The key issues are around:

  • Insufficient support to employers regarding hosting exams
  • Poor pre-exam communications
  • Suspension of practice tests during the exam period, which left some students without a practice environment for up to a week
  • Inconsistent invigilation support, resulting in variable practices that were all resolved but added to student stress
  • System glitches experienced by individual students, usually caused by an update to their local PC after their technical test and/or use of a different PC from the original
  • Random connection issues
We are collating all issues and investigating each and every one in detail to minimise future occurrence.

13. Results and marking
Results are expected to be better than in previous sittings, as a result of the hard work of students, tutors and the exam team. Students were given unprecedented access to practice tests using the online system (over 1,000 students took more than 6,000 practice tests online), which undoubtedly helped their preparation for the new format of online exams. Examiners found the new online marking system straightforward to use, and being able to read typed answers rather than attempting to decipher students' handwriting was a definite advantage.

14. Conclusion
The first full sitting of CIPFA's online exams took place with relative success in June 2016. There are, however, significant issues that need to be addressed, and opportunities to improve the overall student experience.



Issues and proposed resolutions/actions

Issue: Pre-exam communications outlining the new approach, system improvements and other changes were delivered in a haphazard fashion, relying almost entirely on a series of emails that required the recipient (student and/or employer) to keep track of all the changes.
Immediate solution: Completely redesign the communications strategy to be web based, pointing students, employers and invigilators to the website rather than relying on email alone. Email communications to be kept to a minimum, with clearer messages for practice tests and updates.
Longer-term solution: An online page to be created showing current and future developments and improvements, along with status updates.

Issue: Learning materials showed a demonstrable drop in quality, driven by syllabus and format changes and the timescales necessary to meet translation deadlines.
Immediate solution: All errors have been corrected for the July 2016 republication.
Longer-term solution: Improvements to learning materials will be included in the annual updates, ready for 2017.

Issue: Workbooks are not aligned to exam requirements (no practice questions that can be completed in class).
Immediate solution: The misalignment of practice tests and workbooks is most pronounced in the PSFR module. The practice tests will be amended for November 2016 exam preparation.

Issue: Pre-exam teaching by CETC and NTU did not align with the new exam processes, resulting in contradictory techniques being taught. Kaplan did adopt the new processes and used the system extensively in teaching sessions.
Immediate solution: Teaching organisations are revising their strategies to incorporate use of the online system.
Longer-term solution: Full roll-out from August 2016.

Issue: The new system required significant changes of approach by UK employers and students, and the impact of these changes was underestimated by the exams team. One option - hosting local exam centres at employer offices - required significant effort and support from the employers, incurring significant additional costs for little added value.
Immediate solution: Potentially offer a discount for exams taken at employer offices with a local invigilator.
Longer-term solution: CIPFA to investigate better support for employers with large groups of students sitting together, including:
  • Providing trained invigilators
  • Reinstating exam venues in the UK

Issue: Is the PSFR exam suitable for online delivery?
Immediate solution: The PSFR examiner has seen no evidence of issues being caused by online delivery. Two scripts showed students redoing pro formas manually.
Longer-term solution: A further review of the PSFR rubric will be undertaken in time for the 2017 exams.

Issue: The exam delivery system underwent significant revisions as issues were identified by users during practice and mock exams. These improvements were not clearly communicated to students, resulting in confusion over the use of the software.
Immediate solution: An online page to be created showing current and future developments and improvements, along with status updates.
Longer-term solution: System requirements to be made more obvious and easier to check upfront, e.g. using clear error messages when an unsupported browser is used.

Issue: Practice test results for MCQs only show the 'right' and 'selected' answers, rather than all possible answers, making it difficult for students to track where errors were made.
Immediate solution: The feedback report that students receive is scheduled to be enhanced.

Issue: Practice questions were removed from the system the day before the full exam sessions commenced. This was done for good systems reasons (bandwidth, server capacity and the potential to confuse students about which test they were actually meant to take), but it meant that some students were left without practice questions for up to a week before their own exam. Practice questions were issued to students requesting them, but only on an ad hoc basis.
Immediate solution: Practice tests will be available up until the day of the exam. Unlimited attempts at unmarked tests will be introduced, allowing students as many attempts as they wish. Time constraints will be removed from smaller practice tests, while timed conditions are retained for full exam practice (specimen and mock exams).

Issue: The downloadable version of the exam system was problematic and sensitive to local computer changes. Issues were identified with secure user systems (e.g. government PCs), with students advised to transfer to personal computers, adding to delays and stress levels.
Immediate solution: Further development of the offline browser is underway, with additional professional testing.
Longer-term solution: Explore 'bring your own device' at CIPFA centres, including offering suitable equipment as part of a training package. Make practice tests available as downloadable tests.

Issue: The Student Support team and other staff were available from early morning to early evening throughout the exam period. However, the multiple communication routes into the team meant that some calls were misdirected, and the direct line to the team was communicated via email rather than being made easily available on the website.
Immediate solution: A redesign of the website is being undertaken to provide clearer presentation and more detailed information.
Longer-term solution: The Student Support team will continue to provide extended and out-of-office-hours support during exams. These arrangements will be communicated to invigilators and students ahead of exams.

Issue: A triaging system was used to separate immediate exam issues from those relating to later exams. This resulted in a variable level of service, with students never certain when their query would be responded to.
Immediate solution: The triage system was necessitated by the volume of calls and emails. A clearer email query queuing system is being implemented.

Issue: The proctors used for remote invigilation were variable in quality, with some giving wrong information regarding exam requirements. While most of these issues were rectified, they delayed some students taking exams, adding to stress.
Immediate solution: The issues with ProctorU are being dealt with via contract management, including on-site visits to their HQ. Better communication, as well as technical testing, will be an important part of understanding the setup.
Longer-term solution: Alternative modes of exam delivery are being considered for candidates who are unable to use ProctorU.