5 Features of Data Governance Best Practice
22 September 2025
3.6 min read
Data governance is the data management discipline that focuses on the quality, security and availability of an organisation’s data[1]. It has attracted particular focus from insurers in recent years with the advent of SAM and IFRS 17 – both regimes that require high volumes of granular, reliable data to maintain compliance.
Below, we describe some characteristics and features of a rigorous data governance framework that can help insurers set and maintain best practice in this area:
Data Handling Protocol
Insurers should have a documented method for handling data, including how it is stored, named and altered. For example, raw policyholder data received in June 2025 may be renamed PHData202506. After the data is cleaned, a table called PHDataCleaned202506 may be created. Once that data is finalised and used to produce a final valuation result, it may be renamed FinalPHDataCleaned202506. The protocol should set out rules for how each distinctly named file may be treated and altered, so that the naming conventions signal to different teams and units the state of the data and how they should handle it. The protocol should also enable users to reproduce a past result if necessary. Ideally, the insurer’s IT system should allow valuation extracts to be converted to read-only once they are signed off.
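As an illustration, the sketch below shows how such a naming convention and the read-only step might be automated. The stage names and table prefixes follow the example above, but they are assumptions for illustration rather than a prescribed standard.

```python
# Minimal sketch of a naming-convention helper, assuming the YYYYMM period
# format and the stage prefixes used in the example above (illustrative only).
from pathlib import Path
import stat

STAGES = {
    "raw": "PHData{period}",
    "cleaned": "PHDataCleaned{period}",
    "final": "FinalPHDataCleaned{period}",
}

def table_name(stage: str, period: str) -> str:
    """Return the protocol-compliant table name for a stage and YYYYMM period."""
    return STAGES[stage].format(period=period)

def mark_read_only(path: str) -> None:
    """Strip write permissions once a valuation extract has been signed off."""
    p = Path(path)
    p.chmod(p.stat().st_mode & ~stat.S_IWUSR & ~stat.S_IWGRP & ~stat.S_IWOTH)

# Example: table_name("cleaned", "202506") returns "PHDataCleaned202506"
```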
Standardised Cleaned Data Format
A standard data format should be created for cleaned, model-ready valuation data. Ideally, all model data sources should look the same once cleaned and converted – even if the superset of standard fields is not used by every model. The idea is to handle data nuances as close to the source as possible, rather than at the modelling stage. This also allows standard data tools and scripts to be built that work across all data sources.
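To illustrate, here is a minimal sketch of what a standard cleaned-data format might look like. The field names and the pandas-based converter are assumptions for illustration, not a prescribed schema.

```python
import pandas as pd

# Hypothetical standard superset of fields and their intended dtypes.
STANDARD_FIELDS = {
    "policy_number": "string",
    "product_code": "string",
    "date_of_birth": "datetime64[ns]",
    "gender": "string",
    "sum_assured": "float64",
    "annual_premium": "float64",
    "valuation_date": "datetime64[ns]",
}

def to_standard_format(extract: pd.DataFrame, column_map: dict) -> pd.DataFrame:
    """Rename source-specific columns to the standard names and return the
    standard superset of columns; fields this source does not supply come
    back as missing values."""
    return extract.rename(columns=column_map).reindex(columns=list(STANDARD_FIELDS))

# Example: to_standard_format(raw_extract, {"POLNO": "policy_number", "SA": "sum_assured"})
```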
Error Report following Data Cleaning
A data cleaning method should exist that detects and flags errors and automatically generates an error report summary. Two types of errors generally exist; a minimal sketch of how both might be handled follows the list below:
- Non-fatal errors: These may be flagged and summarised, but do not result in the record being excluded. Instead, an assumption may be applied so that the record can still be included in a reasonable way. The error should still be raised with the admin system.
- Fatal errors: These refer to situations where no reasonable assumption can be applied; the record should be removed entirely, captured in a separate error file and then raised with the admin system.
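In the sketch below, the specific checks (a missing gender and a missing policy number), the default assumption applied and the column names are hypothetical examples, not prescribed rules.

```python
import pandas as pd

def clean_with_error_report(df: pd.DataFrame):
    """Return (cleaned data, fatal-error records, error report summary)."""
    df = df.copy()
    report = []

    # Non-fatal: missing gender - apply a default assumption and flag it,
    # but keep the record in the valuation data.
    missing_gender = df["gender"].isna()
    df.loc[missing_gender, "gender"] = "U"  # assumed "unknown" code
    report.append(("non-fatal", "missing gender, default applied", int(missing_gender.sum())))

    # Fatal: missing policy number - no reasonable assumption exists, so the
    # record is removed and captured in a separate error file.
    fatal = df["policy_number"].isna()
    errors = df[fatal]
    df = df[~fatal]
    report.append(("fatal", "missing policy number, record removed", int(fatal.sum())))

    summary = pd.DataFrame(report, columns=["severity", "issue", "count"])
    return df, errors, summary
```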
Audit Log following Data Cleaning
If possible, an audit log should be created automatically each time a cleaning process is run and each time a valuation table is altered in any way, with an entry for each change made. This helps immensely when investigating why results have changed between runs where no change was expected.
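As an illustration, the sketch below shows one way such an audit-log entry could be written. The log format and the fields captured are assumptions, not a prescribed design.

```python
import csv
import getpass
from datetime import datetime, timezone

AUDIT_LOG = "valuation_audit_log.csv"  # hypothetical log location

def log_change(table: str, action: str, detail: str) -> None:
    """Append one entry describing a change to a valuation table."""
    with open(AUDIT_LOG, "a", newline="") as f:
        csv.writer(f).writerow([
            datetime.now(timezone.utc).isoformat(),  # when the change was made
            getpass.getuser(),                       # who made it
            table,                                   # which table was touched
            action,                                  # e.g. "cleaning run", "manual edit"
            detail,                                  # what changed
        ])

# Example: log_change("PHDataCleaned202506", "cleaning run", "1 240 non-fatal errors flagged")
```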
Back-up Protocol for Older Valuation Tables
A back-up and storage protocol for older valuation tables should exist. The protocol should specify when tables are archived and where, as well as how many years of data are retained unarchived at any given time.
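To make this concrete, the sketch below shows how such a protocol might be parameterised. The retention periods, archive location and helper function are illustrative assumptions only, not recommendations.

```python
from datetime import date

# Illustrative parameters only.
BACKUP_PROTOCOL = {
    "archive_after_months": 24,   # tables older than this move to the archive
    "archive_location": r"\\backup-server\valuation_archive",  # hypothetical path
    "archive_frequency": "quarterly",
    "retain_archived_years": 10,  # total retention before deletion
}

def should_archive(table_period: str, as_at: date, protocol: dict = BACKUP_PROTOCOL) -> bool:
    """Return True if a table for the given YYYYMM period is due for archiving."""
    year, month = int(table_period[:4]), int(table_period[4:6])
    age_months = (as_at.year - year) * 12 + (as_at.month - month)
    return age_months > protocol["archive_after_months"]

# Example: should_archive("202206", date(2025, 9, 22)) returns True
```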
Insight Life Solutions’ actuaries have substantial experience in implementing and maintaining best-practice data governance frameworks. For an assessment of your current protocols and to discuss potential improvements, please contact me at garritn@insight.co.za.
[1] https://www.ibm.com/think/topics/data-governance