Assuring Policy Administration on the EIS Insurance Platform

Assuring the replacement of an off-the-shelf PAS with a bespoke solution built on the EIS platform

The modernisation agenda

One of the UK’s leading general insurers embarked on a journey to gain competitive advantage by implementing a bespoke Policy Administration System (PAS) using EIS’s insurance platform.

For such a major and critical piece of work they needed partners: partners with the experience and strength to deal with complex, high-risk activities. They settled on EY as their implementation partner and SQC as their assurance partner.

The scale of the challenge

This was a major, multi-year, software development programme. It involved large teams drawn from the insurer’s in-house staff, from EIS, from EY and from SQC. SQC alone delivered nearly fifty person-years of complex and diverse assurance work in support of the programme.

The initial remit

The activities that SQC were engaged to perform for the programme were diverse. Initially, they were the core assurance activities, the ones that might be owned by a customer’s own QA or test function. These included:

  • Shaping the overall test strategy for the programme, including defining the boundaries between testing delivered within development functions, such as the EY team, and testing owned by the programme and delivered by SQC.
  • Scoping and planning the integration, end-to-end, non-functional and migration testing activities.
  • Programme test management, the activity that oversaw all testing for the programme including development function testing, SQC delivered testing, third-party testing and business acceptance testing.
  • Leadership, staffing and delivery of solution integration, end-to-end, non-functional and migration testing.
  • Defect administration, managing a vast number of diverse forms of defects in a multi-organisational environment.
  • Defect expediting, driving and coordinating efforts to fix key defects and remediate problem areas.
  • Facilitating business acceptance activities throughout the programme’s life. Helping business teams to engage early and ultimately assess and agree to accept the new solution.

The extras

As is often the case, capability and responsibility ‘gaps’ emerged throughout the programme: work that no one was ‘officially’ responsible for, or that represented an unforeseen challenge.

As is always the case, SQC stepped in to deal with these challenges. The many ‘gaps’ dealt with included:

  • Technical troubleshooting, owning problem diagnosis and management for obscure and complex defects. SQC’s technical depth and leadership aptitudes made a key contribution to resolving these issues.
  • Addressing the programme’s lack of leading progress indicators. Building on its earlier analysis, SQC established models that provided meaningful measures of build progress and realistic forecasts of build completion timelines.
  • Creating and operating a set of advance system mock-ups and simulators needed to eliminate impediments to development and testing. These were essential tools that were widely used.
  • Leading the transition from development into production use. SQC shaped the approach and led delivery of the technical and live proving of the production deployment: the final validation step that gave senior stakeholders the confidence to go live.
  • When migration testing required the use of production data, SQC navigated the complex maze of data privacy, compliance and information security concerns that this created, ultimately creating a watertight case and approach that allowed the use of this data.
  • Bringing to bear its experience of extremely complex, multi-system solutions, gained in the telecommunications domain, SQC shaped the intricate multi-disciplinary approach needed to move from development, through the initial production deployment, into a live service and maintenance model.

SQC was prepared for this and has always maintained strong and flexible technical capabilities and a capacity for delivery leadership, because in its experience there will always be gaps that need to be dealt with in order to ‘get the job done’.

Recognition from the Steering Committee

Challenge Conquerors Award – For the team that has overcome the toughest challenges and made it look easy.

“SQC’s Test team had a really big responsibility, being actively involved through the entire duration of the programme. Their role often required them to work under significant pressure, managing tight deadlines and high expectations. Despite these challenges, SQC consistently rose to the occasion, demonstrating resilience and dedication. Each day brought new obstacles, yet they continued to meet these challenges head-on, showcasing their unwavering commitment to the programme.”

Predicting the future

The programme had adopted work management techniques from the Agile playbook: Epics, Features, User Stories and Sprints. After a few Sprints it became clear to SQC that:

  • The rate at which key build outcomes were being achieved was falling well short of the rate needed to hit the aspirational timeline.
  • The ‘planning’ and forecasting techniques being used did not provide any meaningful forecasts of the future trajectory of the work.

Recognising these factors, SQC brought into play a proven tracking and forecasting approach, one they had used many times over many years and had always found to be a robust indicator of a project’s rate of progress towards build completion.

Leveraging the comprehensive analysis that had been done to support solution integration testing, this measure soon highlighted the overly optimistic nature of short-term forecasts and the gap between the programme’s aspirational dates and its actual trajectory. Once this was surfaced, the programme was able both to institute measures to improve the rate of progress and to take stock and reset expectations around a realistic timeline.
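The source does not describe the mechanics of SQC’s forecasting model, but a tracking approach of this general kind can be sketched as a simple velocity-based extrapolation. The sketch below is purely illustrative: the scope size, sprint figures and function name are invented, and a real model would likely weight recent sprints and account for scope change.

```python
import math
from statistics import mean

def forecast_completion(completed_per_sprint, total_scope):
    """Estimate sprints remaining from the average build velocity so far.

    completed_per_sprint: build items completed in each past sprint.
    total_scope: total number of build items in scope.
    Returns (items done, mean velocity, sprints still needed).
    """
    done = sum(completed_per_sprint)
    velocity = mean(completed_per_sprint)          # average items per sprint
    remaining = total_scope - done
    sprints_remaining = math.ceil(remaining / velocity)
    return done, velocity, sprints_remaining

# Hypothetical figures: 500 build items in scope, six sprints completed.
done, velocity, left = forecast_completion([30, 35, 28, 32, 31, 34], 500)
# An "aspirational" plan of, say, 6 more sprints would be visibly at odds
# with the extrapolated trajectory of roughly 10 more sprints.
```

A model like this makes the gap between aspiration and trajectory explicit: if the forecast sprint count exceeds the planned one, either the rate of progress must improve or the timeline must be reset.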

Ethos and approach

SQC brought to the programme the ethos that it was their job to protect the business, if necessary from itself and from the programme it had created. This was not without its issues: the programme management team sometimes oscillated between expecting to control testing and wanting to portray the test function as being responsible for quality.

With the SQC ethos came responsibility. As well as owning the structure and content of the assurance and testing activities SQC were responsible for their effectiveness. Whilst business stakeholders may have reviewed what was proposed, internally SQC was adamant that if something significant was missed in testing then that would be down to SQC, no excuses, no blame on the business for not telling SQC to test it.

This reflected SQC’s model that the assurance effort needs to be:

  • Independent, a peer to the delivery programme, not under its control.
  • Opinionated, have a view on what is the ‘right thing to do’ at all levels.
  • Knowledgeable, understands the business, the tech and the delivery.
  • Competent, can plan and execute what it says it will do without external supervision.
  • Autonomous, does not create a ‘burden’ on the wider programme organisation by needing domain and technical support other than in very niche areas.

Taking integration seriously

Complex, large scale and multi-system, the programme was destined to face integration challenges. This was clear to SQC and the strategy adopted reflected that.

SQC observes that most test organisations’ concept of ‘integration testing’ offers little more than early functional end-to-end testing. SQC’s approach is different: when SQC performs integration testing, the aims and approaches are specialised and clearly distinct from those of end-to-end testing, and the work is staffed and planned accordingly. Integration testing is technical and intrusive, involving extensive simulation, the assessment of anomalies and failures, and observation of the external and internal behaviour of the system.

Whilst end-to-end testing checks that the solution can do what the business needs, integration testing checks that it will do this under a wide range of challenging or adverse conditions. On the EIS programme, the integration testing found many obscure issues that would otherwise have slipped into production and caused operational incidents.
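Intrusive integration testing of this kind typically relies on simulators that can inject faults on demand, so testers can observe how the system behaves when a dependency misbehaves. A minimal sketch of the idea, assuming a hypothetical downstream service stub (the class, names and failure modes are invented for illustration, not taken from the programme):

```python
import random

class StubDownstreamService:
    """Hypothetical stand-in for a downstream system, able to inject faults."""

    def __init__(self, failure_rate=0.0, seed=None):
        self.failure_rate = failure_rate   # fraction of calls that fail
        self.calls = []                    # record of requests for later inspection
        self._rng = random.Random(seed)

    def handle(self, request):
        self.calls.append(request)
        if self._rng.random() < self.failure_rate:
            # Simulate an adverse condition such as a downstream timeout.
            raise TimeoutError(f"simulated fault handling {request!r}")
        return {"status": "ok", "request": request}

# Healthy stub: every call succeeds, and the call log supports assertions
# about what the system under test actually sent.
healthy = StubDownstreamService(failure_rate=0.0)
response = healthy.handle("quote-123")

# Faulty stub: every call raises, letting testers observe the system's
# error handling under adverse conditions.
faulty = StubDownstreamService(failure_rate=1.0)
try:
    faulty.handle("quote-456")
    outcome = "no fault"
except TimeoutError:
    outcome = "fault observed"
```

The design choice that matters here is that the stub both injects failures and records traffic: integration testing needs to confirm not just that the system survives a fault, but that it sent the right requests and reacted to the failure correctly.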