Effective and efficient data transformation and validation is of the utmost importance in helping insurers meet deadlines and minimise errors, especially under the Solvency II framework. In this article we look at how our latest software offering, DataValidator, helped Guardian Financial Services improve and consolidate their existing data processes.
Guardian Financial Services employed a number of different solutions for the validation and transformation of policyholder data ready for use in their actuarial model. With Solvency II drawing ever closer, Guardian approached us for a software solution to consolidate this process and improve their working day timetable.
DataValidator provided such a solution and helped to streamline the policy data production process and significantly enhance the ability to validate the accuracy and completeness of data. This article describes the implementation of DataValidator for Guardian and the resulting benefits.
Background to the DataValidator implementation
Guardian has been providing life and pensions products in the UK for over 180 years. Guardian now focuses on closed book consolidation and continues to manage pensions, savings and protection policies for over 900,000 policyholders with approximately £18 billion of assets under management.
Because Guardian has amassed a number of closed books from different sources, the policyholder administration data varies considerably from book to book. This resulted in a large number of separate processes to transform and validate the policyholder data before it could be used in the actuarial model.
Operating and maintaining these processes was proving a burden on Guardian’s actuarial team, and with the tighter reporting timetables coming under Solvency II, Guardian wanted to ensure the team’s time was spent efficiently and productively, rather than just “turning the handle”.
In tandem with an internal project to consolidate the underlying administration systems, Guardian approached us to discuss the use of DataValidator to provide the desired future solution.
We performed a proof of concept project on a block of business to demonstrate the improvements which could be realised on migration to a purpose-built software solution. This proved a success and Guardian subsequently committed to a full implementation of DataValidator to replace the existing policy data production processes.
DataValidator is a flexible and user-friendly software solution that enables a company to validate, cleanse and transform data efficiently, ready for use in financial modelling and reporting processes. The software facilitates rapid implementation of validation, cleansing and transformation rules, and captures all relevant information to produce the full audit trails necessary for strict audit control and governance.
The key features of DataValidator are:
- Assess data quality: Users can pre-define custom data check rules, which are subsequently used to assess the accuracy and completeness of the data. DataValidator can also validate the reasonableness of movements in the data across time periods.
- Transform data: DataValidator allows the user to perform complex data transformations. This flexibility makes it possible to get the data in a format that is appropriate for other downstream systems and processes.
- Audit readiness: Produces reports to capture results of data quality checks and a full audit history of data cleansing actions and transformations. This provides the necessary audit trail required for regulatory compliance and best practice governance.
- Process workflow: The software supports batch processing, which enables the processing of multiple data extracts at the click of a button and live job-monitoring through a clean interface design (Figure 1). It also interfaces with our systems automation workflow tool, Unify.
- Ease of use: DataValidator’s clean, intuitive user interface and functionality helps users learn their way around the software quickly and use it efficiently. Its flexibility makes it a useful tool for everyday use, and it can provide benefits far beyond the regular reporting cycle.
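To make the transformation feature above concrete, the sketch below shows the general idea of mapping disparate admin-system extracts onto a single model-ready schema while logging every action for the audit trail. DataValidator is proprietary software, so all names here (`FIELD_MAPS`, `transform`, the field names) are hypothetical illustrations, not its actual API.

```python
# Illustrative sketch only: the mappings and function below are hypothetical,
# intended to show the shape of a transformation step that renames source
# fields to a common schema and records an audit entry for each action.

from datetime import date

# Hypothetical field mappings for two source administration systems.
FIELD_MAPS = {
    "system_a": {"POL_NO": "policy_id", "DOB": "birth_date", "SA": "sum_assured"},
    "system_b": {"PolicyRef": "policy_id", "DateOfBirth": "birth_date",
                 "SumAssured": "sum_assured"},
}

def transform(record, source, audit_log):
    """Map a raw record to the common schema, logging each rename or drop."""
    mapping = FIELD_MAPS[source]
    out = {}
    for raw_field, value in record.items():
        target = mapping.get(raw_field)
        if target is None:
            # Unmapped fields are dropped, but the drop is still audited.
            audit_log.append(f"{source}: dropped unmapped field {raw_field}")
            continue
        out[target] = value
        audit_log.append(f"{source}: {raw_field} -> {target}")
    return out

audit = []
raw = {"POL_NO": "A123", "DOB": date(1980, 5, 1), "SA": 50000, "LEGACY_FLAG": "Y"}
clean = transform(raw, "system_a", audit)
```

In a real implementation the audit log would feed the automated reports mentioned above, so every change between the raw extract and the model-ready file is traceable.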
Over a period of eight weeks, we implemented the transformation routines for each of Guardian’s 30 product lines in DataValidator, including a thorough review of the existing routines to ensure they were fit for purpose. A reconciliation was then performed between DataValidator and the existing system to explain any differences between the two. Differences arose either from the correction of errors identified during the review or from improvements made as part of the implementation work, and all were explained to within agreed tolerances.
A key aspect of the implementation was the development of a strong system of data validation checks, which thoroughly assesses the data quality and identifies any potential data issues. These checks included:
- Static checks, which are performed against a single data file. An example check would be to validate that gender is either Male or Female, or that age is between 18 and 120.
- Movement checks, which are performed relative to another data file, such as the one from last period’s reporting process. An example check would be to validate that the unit value has moved by no more than ±5% relative to last period; any failures are explicitly reported for investigation.
These data validation checks are designed to assist Guardian with meeting the Solvency II Data Quality requirements by comprehensively validating the accuracy and completeness of the policyholder data. The results of all the checks are clearly presented in automated reports produced by DataValidator (Figure 2).
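The two kinds of check described above can be sketched as simple rule functions. This is an illustration only, assuming hypothetical helper names and field names rather than DataValidator's actual rule engine: a static check inspects one record, while a movement check compares the current extract against the prior period's.

```python
# Illustrative sketch only: hypothetical check functions mirroring the
# static and movement checks described in the text.

def static_checks(record):
    """Return a list of failure messages for one record (empty = passed)."""
    failures = []
    if record.get("gender") not in ("Male", "Female"):
        failures.append("gender not Male/Female")
    if not 18 <= record.get("age", -1) <= 120:
        failures.append("age outside 18-120")
    return failures

def movement_check(current, previous, field="unit_value", tolerance=0.05):
    """Flag policies whose value moved by more than +/- tolerance since last period."""
    failures = []
    for policy_id, now in current.items():
        before = previous.get(policy_id)
        if before is None:
            failures.append(f"{policy_id}: new policy, no prior value")
            continue
        if abs(now[field] - before[field]) > tolerance * before[field]:
            failures.append(f"{policy_id}: {field} moved more than {tolerance:.0%}")
    return failures
```

In practice the failure lists would be aggregated into the automated reports, giving reviewers a single view of which policies need investigation each period.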
Finally, Guardian’s business was grouped within DataValidator to make the best use of its batch processing capability. This allowed Guardian to substantially reduce the number of separate processes required to produce the model-ready data and to cut the manual overhead in its production.
Project success and benefits delivered
As a result of the implementation, the following key benefits were delivered:
- A significant increase in Guardian’s ability to assess the accuracy and completeness of data, with over 2,500 data validation checks implemented, a 700% increase on the existing process.
- A 70% reduction in the number of separate processes required to produce model-ready data.
- An estimated 50% reduction in the user effort required to produce and validate the model-ready data, driven by processing efficiencies and the automated audit reports (Figure 3).
- An increase in the transparency of the process, easing future maintenance efforts and providing visibility to senior management.