DataValidator

Prudential UK is one of the UK’s largest insurance companies, and with that scale comes a large volume of data to process and validate. Driven by a need to accelerate the working-day timetable and improve audit and transparency, Prudential approached Willis Towers Watson to discuss how DataValidator (our data transformation, validation and cleansing software) could provide the solution they required. This was explored first through a proof of concept, after which Prudential implemented DataValidator without support from Willis Towers Watson. This article describes that process, then outlines the benefits delivered by DataValidator and shares some first-hand user feedback.

Background

Prudential’s Data and Experience Analysis (DEA) team oversee the processing of policyholder data, taking it from its raw sources in the Actuarial Data Mart (ADM), through transformation, cleansing and validation, and delivering a set of model points to the modelling teams.

Solvency II requires transparent data processing from source to model, together with objective metrics to show that any data feeding into the actuarial model is complete, accurate and appropriate. This cannot be guaranteed unless the production process is visible and auditable – every stage of the process must be documented and reported. This was challenging to achieve given the number of processing steps and manual hand-offs within the existing process.

About DataValidator

Willis Towers Watson’s DataValidator is a flexible and user-friendly software solution that enables a company to validate, cleanse and transform data, efficiently preparing it for use in financial modelling and reporting processes. The software helps insurers to streamline the production of financial modelling and risk analytics, while maintaining strict governance controls to satisfy challenging regulatory requirements.

Key features of DataValidator:

  • Assess data quality: Users can pre-define custom data check rules, which DataValidator then applies to assess the accuracy and completeness of the data. It can also validate the reasonableness of movements in the data across time periods (a sketch of this style of checking follows this list).
  • Transform data: Allows a company to perform complex data transformations, making it possible to get the data into a format appropriate for other downstream systems and processes.
  • Audit readiness: Produces reports capturing the results of data quality checks and a full audit history of data cleansing actions and transformations, providing the audit trail required for regulatory compliance and best-practice governance.
  • Processing power: Supports batch processing, enabling multiple data extracts to be processed simultaneously at the click of a button, with live job monitoring through a clean interface.
  • Ease of use: Its clean and intuitive user interface helps users learn their way around the software quickly and use it efficiently. Its flexibility makes it a useful tool for everyday use, with benefits far beyond the regular reporting cycle.
  • Workflow integration: Can be used in business-wide workflows through interfacing with Unify – our systems integration and workflow automation platform.
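
To make the data quality checks concrete, below is a minimal sketch of rule-based checking in the spirit of the validation described above. It is written in Python with pandas purely for illustration (it does not show DataValidator’s own rule definitions), and the column names, rules and movement tolerance are hypothetical assumptions rather than actual checks.

  # Illustrative sketch only: rule-based data quality checks of the kind
  # DataValidator automates. Column names, rules and the movement
  # tolerance are hypothetical.
  import pandas as pd

  def check_completeness(df: pd.DataFrame) -> list[str]:
      """Flag records with missing key fields."""
      issues = []
      if df["policy_number"].isna().any():
          issues.append("FAIL: missing policy numbers")
      return issues

  def check_accuracy(df: pd.DataFrame) -> list[str]:
      """Flag values outside plausible ranges."""
      issues = []
      if (df["sum_assured"] <= 0).any():
          issues.append("FAIL: non-positive sum assured")
      return issues

  def check_movement(current: pd.DataFrame, prior: pd.DataFrame,
                     tolerance: float = 0.05) -> list[str]:
      """Flag implausible movements between reporting periods."""
      change = abs(len(current) - len(prior)) / len(prior)
      if change > tolerance:
          return [f"WARN: policy count moved {change:.1%} vs prior period"]
      return []

  current = pd.DataFrame({"policy_number": ["P1", "P2", None],
                          "sum_assured": [50_000, -100, 75_000]})
  prior = pd.DataFrame({"policy_number": ["P1", "P2"],
                        "sum_assured": [50_000, 60_000]})

  for issue in (check_completeness(current) + check_accuracy(current)
                + check_movement(current, prior)):
      print(issue)

In practice such rules would be defined once per product line and applied automatically to every extract, which is what makes the consistent reporting described above possible.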

Proof of concept – proving DataValidator was the right solution

After an initial demonstration of DataValidator and discussion on how it could fit into Prudential’s operating model, a decision was made to pursue a proof of concept. This was a focussed six-week exercise to replicate some of the existing data transformation and validation routines in DataValidator, proving that it would meet the requirements and demands of the new operating model.

Throughout this proof of concept, we provided training and on-site support, but the majority of the work was performed by Prudential’s DEA team.

As part of the proof of concept, DataValidator was compared against two alternative solutions:

  • An in-house expansion of the ADM to meet all the requirements.
  • A competitor solution.

Prudential were keen for the chosen solution to align with the skillset of the DEA team, and wanted to avoid a general data management solution that required a skilled IT technical team to implement and operate. A key drawback of the competitor solution was that its highly flexible and expressive functionality could lead to processes that were inconsistent and difficult to maintain. By contrast, the regimented nature of DataValidator aligned with the DEA team’s skillset and was felt to enforce consistency and standardisation, which would ease future development and maintenance.

DataValidator has been purpose-built for the insurance industry, with its functionality designed to meet demands and requirements of insurance processes and Solvency II requirements. The audit trail and governance provided by DataValidator were highlighted as being superior to the competitor solution and relevant to the needs of the industry.

Prudential felt that developing an in-house solution would carry a significant cost burden, both to build a robust and capable solution and then to continually develop it to keep pace with industry changes, whereas they could expect DataValidator to be enhanced periodically without significant cost.

The proof of concept was deemed a success, and Prudential purchased DataValidator.

Implementation of DataValidator for model point production and validation

The implementation was performed by Prudential’s DEA team, with us providing ad-hoc support. This is a clear demonstration of the user-friendly and accessible nature of the software – there is no reliance on external support to implement or use it.

The implementation was completed over six months, which included:

  • Re-coding the transformations from the legacy systems into DataValidator. In total, 100 DataValidator projects were created to transform the 32 raw data files (coming from 10 separate administration system sources) into the 10 model point files required by the modelling team.
  • Setting up a comprehensive range of data quality checks using DataValidator’s in-built validation functionality. Over 1000 data quality checks were set up to thoroughly validate the accuracy and completeness of the data and ensure the Solvency II requirements were satisfied.
  • Establishing data cleansing rules to automatically correct known issues in the data in a clean and auditable way (a sketch of this pattern follows this list).
  • Transferring current data checks into DataValidator.
  • Developing a robust and stable production environment for actuarial model points, with data tracked from source to completion.
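
As an indication of what a “clean and auditable” cleansing rule can look like, the sketch below applies a single correction and records every change in an audit table. It is Python with pandas for illustration only; the field names and the rule itself are hypothetical assumptions, not Prudential’s actual cleansing rules.

  # Illustrative sketch only: an auditable cleansing rule. The field
  # names and the correction rule are hypothetical.
  import pandas as pd

  def cleanse_with_audit(df: pd.DataFrame) -> tuple[pd.DataFrame, pd.DataFrame]:
      """Correct a known data issue and log every change made."""
      cleaned = df.copy()
      # Hypothetical known issue: a legacy system encodes a missing
      # premium frequency as 0; the agreed correction is monthly (12).
      mask = cleaned["premium_frequency"] == 0
      audit = pd.DataFrame({
          "policy_number": cleaned.loc[mask, "policy_number"],
          "field": "premium_frequency",
          "old_value": 0,
          "new_value": 12,
          "rule": "Default missing frequency to monthly",
      })
      cleaned.loc[mask, "premium_frequency"] = 12
      return cleaned, audit

  raw = pd.DataFrame({"policy_number": ["P1", "P2", "P3"],
                      "premium_frequency": [12, 0, 1]})
  cleaned, audit = cleanse_with_audit(raw)
  print(audit)  # the audit trail that supports sign-off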

Overall, the implementation was considered highly successful, with DataValidator used in live production from Q3 2016 to produce model-ready data for all but one of Prudential’s models.

Benefits achieved

Prudential highlighted the following key benefits of DataValidator:

  • Increased confidence in the data
    DataValidator performs a robust and comprehensive range of quality checks on the data. These are applied consistently and automatically across all product lines, with consistent audit reporting provided by DataValidator's automatic reporting functionality. These reports clearly detail any changes made through cleansing and highlight any failures or warnings for investigation.
  • Reduced/improved working-day timetable
    DataValidator delivered significant automation to model point production: data cleansing, transformation, validation and export are now all processed sequentially, with user intervention limited to specifying input parameters (see the sketch after this list). This replaced a set of manual processes that required user intervention at each stage.
  • Improvements in controls
    All transformations, data checks, cleansing and validations applied to the data from source to model are automatically captured by DataValidator in clear and auditable reports. This helps facilitate sign-off of the data and gives users confidence in the history and provenance of the data.
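
The sketch below illustrates the sequential, parameter-driven pattern described under the second benefit: cleansing, transformation, validation and export chained into a single run, with only the input parameters varying between runs. The step functions are trivial hypothetical stand-ins in Python, not DataValidator’s actual processing.

  # Illustrative sketch only: a sequential cleanse -> transform ->
  # validate -> export run with no manual hand-offs between stages.
  import pandas as pd

  def cleanse(df: pd.DataFrame) -> pd.DataFrame:
      # Stand-in cleansing step: drop records with no policy number.
      return df.dropna(subset=["policy_number"])

  def transform(df: pd.DataFrame) -> pd.DataFrame:
      # Stand-in transformation step: derive a model point field.
      return df.assign(annual_premium=df["premium"] * df["frequency"])

  def validate(df: pd.DataFrame) -> pd.DataFrame:
      # Stand-in validation step: fail loudly rather than export bad data.
      assert (df["annual_premium"] > 0).all(), "FAIL: non-positive premium"
      return df

  def produce_model_points(source: pd.DataFrame, valuation_date: str) -> str:
      """One parameter-driven run from raw data to exported model points."""
      out_file = f"model_points_{valuation_date}.csv"
      validate(transform(cleanse(source))).to_csv(out_file, index=False)
      return out_file

  raw = pd.DataFrame({"policy_number": ["P1", "P2"],
                      "premium": [100.0, 55.0],
                      "frequency": [12, 1]})
  print(produce_model_points(raw, "2016Q3"))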

Looking ahead

DataValidator is currently deployed as a policy data transformation, validation and cleansing tool – this is its primary function.

However, its flexibility and user-friendly nature mean it can be applied much more widely throughout the actuarial process. Prudential are currently exploring these further uses and expect to utilise DataValidator to:

  • Assist with asset data validation
  • Aggregate, process and validate model results ready for actuarial review
  • Assist with experience investigations and analysis.