Maximising the quality and value of data, while mitigating risk.

Data accuracy is an important topic for any broker. Broker Insights has been working with our friends at compliance consultancy UKGI to pull together top tips for maximising the quality and value of data while mitigating the associated risks.

Both the FCA and the ICO apply a keen focus to data quality and to maintaining accurate and complete records.

According to Nikki Bennett, managing director at UKGI: “Both regulators will expect to see clear controls and measures in place within firms to ensure the accuracy of data when first collected and the management of data over time.

“Poor data quality can take multiple forms, from failing to complete all fields in client files to not checking and updating details in digital records at renewal. It is easy to see how a database can stray from being complete and accurate towards a lesser position.

“For many firms, ensuring data quality means periodic file reviews to check that each member of staff is on track with client interactions. While file checks are a cornerstone of governance practices, they are not a ‘catch all’ with regard to data quality. A staff member tasked with capturing all necessary information and transferring it into digital records may not see the ‘macro’ impact of ‘micro’ discrepancies, with the result that the firm’s database can easily stray towards a position of inaccuracy.”

One clear example of data quality being impacted in this way can be found in the listing of classes of business. The class ‘abattoir’ often appears far more frequently than it plausibly should in software house records. Why? Because it starts with ‘A’ and is among the first entries in a drop-down menu of business types.

As a result, a member of staff who does not appreciate the importance of data quality may list all cases under abattoir, aviation, or any other class starting with ‘A’, without appreciating the regulatory repercussions or the impact on data processing downstream.

Another example of poor data practice is the incomplete or inaccurate use of postcodes. Failing to input postcode details, or to check that the postcode supplied relates to the risk, significantly skews macro data-processing activity, as well as affecting the case itself.
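A simple format check at the point of data entry can catch many of these postcode errors before they reach the database. The sketch below is purely illustrative and is not part of any Broker Insights system: it uses a simplified pattern for UK postcodes, and a real workflow would also verify that the postcode matches the risk address, for example via an address-lookup service.

```python
import re

# Simplified UK postcode shape (outward code + inward code).
# This checks format only; it does not confirm the postcode exists
# or that it relates to the risk being insured.
UK_POSTCODE = re.compile(r"^[A-Z]{1,2}[0-9][A-Z0-9]?\s*[0-9][A-Z]{2}$",
                         re.IGNORECASE)

def postcode_is_plausible(postcode: str) -> bool:
    """Return True if the string looks like a UK postcode."""
    return bool(UK_POSTCODE.match(postcode.strip()))

print(postcode_is_plausible("DD1 4QB"))  # True
print(postcode_is_plausible("12345"))    # False
```

A check like this flags blank or malformed entries for review rather than letting them silently skew downstream analysis.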

Improving data quality   

Broker Insights provides a ‘Data Quality Report’ to each firm using the platform, supporting a firm’s principals in maintaining oversight of data quality across their entire team.

When a firm uploads data onto the platform, the Data Quality Report indicates the number of policies that have been successfully ingested, the number without a business description, and any that have been rejected. The firm can then review the applicable records and update fields where required, enriching management information, improving the quality of the firm’s data, and mitigating associated risks.
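The counting described above can be sketched in a few lines. This is an illustration of the general idea only, not the actual Broker Insights report: the field names (`policy_ref`, `business_description`) and the rejection rule (a missing policy reference) are assumptions for the example.

```python
def summarise_quality(policies):
    """Count successfully ingested records, those missing a business
    description, and those rejected outright (here: no policy reference)."""
    ingested, no_description, rejected = 0, 0, 0
    for policy in policies:
        if not policy.get("policy_ref"):
            rejected += 1        # cannot be ingested without a reference
            continue
        ingested += 1
        if not policy.get("business_description"):
            no_description += 1  # ingested, but needs enrichment
    return {"ingested": ingested,
            "no_description": no_description,
            "rejected": rejected}

sample = [
    {"policy_ref": "P001", "business_description": "Abattoir"},
    {"policy_ref": "P002", "business_description": ""},
    {"policy_ref": None},
]
print(summarise_quality(sample))
# {'ingested': 2, 'no_description': 1, 'rejected': 1}
```

Surfacing these three counts after each upload gives principals a quick signal of where staff need to review and complete records.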

Five top tips for driving value from data quality:   

  • Set out and reinforce staff responsibilities for data accuracy and data completeness
  • Provide ongoing training for staff on all systems across the firm’s workflow and revisit training as part of the T&C plan  
  • Monitor regulatory requirements for data accuracy and completeness
  • Regularly analyse the accuracy and consistency of data within the firm’s platforms to ensure that required standards are maintained. Spot checks and the Broker Insights Data Quality Report can support this activity
  • Review new or dormant data fields at client renewal and as part of your placement strategy reviews   

For further details of the regulatory requirements for data management, please contact the compliance consultancy team at UKGI.

Please contact Broker Insights to learn more about the Data Quality Report and how to maximise the value of your data.   
