
Solvency II – making demands on your data quality process

Solvency II is finally upon us and much has been said about how the pressures will ripple across insurers and their asset managers. But how can asset managers ensure that they are managing the data quality demands of Solvency II and meeting the high standards of their insurance clients?

For asset managers, this has presented well-documented challenges, requiring them to deliver a variety of often complex asset and reference data in a consistent format, and faster than they are used to. These challenges include collating data across multiple sources; sourcing new data such as classifications, credit ratings, benchmark curves and new instrument taxonomies such as CIC and NACE; and providing underlying performance data for asset-backed instruments, or for instruments held within funds of funds, to satisfy the fund look-through requirement of Solvency II.

In order to deliver the right depth of data at the right time, the underlying foundation of an asset manager’s data management process needs to be solid. For example, the fund look-through process of providing an expanded view of a fund and all of its underlying constituents may expose anomalies in the asset manager’s data, such as five holdings in the same stock, each with a different valuation performed at a different time.

Such anomalies in your data need to be identified, validated and cross-checked, and the data re-aggregated. Achieving this kind of control over your data requires processes and workflows which can identify the issues and enable the user to investigate, escalate where necessary, and resolve them with a consistent, reliable and systematic approach.
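
As a simple illustration, the sketch below (in Python, using made-up record fields and a placeholder identifier rather than any real data model) shows how such a look-through anomaly might be flagged: positions are grouped by instrument, and any instrument held with inconsistent valuations or valuation dates is reported as an exception.

```python
from collections import defaultdict

# Hypothetical look-through positions. In practice these records would come
# from the asset manager's data platform; field names and the placeholder
# identifier below are illustrative only.
positions = [
    {"fund": "Fund A", "instrument": "XX0000000001", "valuation": 101.2, "valued_at": "2016-03-31"},
    {"fund": "Fund A", "instrument": "XX0000000001", "valuation": 99.8,  "valued_at": "2016-03-30"},
    {"fund": "Fund B", "instrument": "XX0000000001", "valuation": 101.2, "valued_at": "2016-03-31"},
]

def find_valuation_anomalies(positions, tolerance=0.0):
    """Group holdings by instrument and flag any instrument whose valuations
    differ by more than the tolerance, or were struck on different dates."""
    by_instrument = defaultdict(list)
    for position in positions:
        by_instrument[position["instrument"]].append(position)

    anomalies = {}
    for instrument, holdings in by_instrument.items():
        values = [h["valuation"] for h in holdings]
        dates = {h["valued_at"] for h in holdings}
        if max(values) - min(values) > tolerance or len(dates) > 1:
            anomalies[instrument] = holdings
    return anomalies

for instrument, holdings in find_valuation_anomalies(positions).items():
    print(f"{instrument}: {len(holdings)} holdings with inconsistent valuations")
```

In practice a data quality platform would express a check like this as a configurable validation rule feeding a workflow, rather than hand-written code, but the underlying control is the same.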

When you are able to demonstrate this level of control over your data, your insurance clients can be confident that you are providing a thorough and reliable service, which helps them with their regulatory reporting obligations. Naturally, those asset managers that can successfully demonstrate control over their processes and are investing in servicing their clients will be favoured by the insurance industry.

This approach also provides a means to execute on broader data governance strategies, and delivers the strong foundation you need to respond more quickly and easily to future regulatory requirements.

What Does a Process-Led, Systematic Approach Entail?

So how can asset managers put in place the right longer-term processes to help themselves and their clients? The following data management process capabilities are key to meeting Solvency II and other regulatory requirements:

  • Data Visibility: You need visibility across all points of the data architecture – from raw data sources to data hubs, warehouses, consuming applications and more – without the need to physically move or transform that data. This can be achieved with advanced metadata modelling.
  • Workflow: Any effective governance process needs to incorporate a high level of process control and workflow, so that issues are owned, escalated and resolved according to the data governance procedures. Event-driven workflow (“if that happens, do this”) provides even greater control over your data, and workflows give you scalability for current and future data quality projects.
  • Data Quality: With visibility and workflow in place, setting up exception checking – checks for completeness and accuracy, and comparisons across all sources and endpoints of the asset and reference data – becomes a relatively easy step. Exceptions all feed into a logical workflow owned and controlled by the business users, ensuring you reach the right level of data quality (see our blog on the Seven Steps to Reaching Data Quality); a simple sketch of this pattern follows this list.
  • Audit Trail, Management Information and Business Intelligence: Tie these capabilities together with a full audit trail and the MI and BI capabilities to generate reports and analysis, and you will be able to demonstrate the rigour of your data management processes to both your insurance company clients and the regulators.

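To make these capabilities concrete, here is a minimal sketch, again in Python and using entirely hypothetical rule names, owners and field names, of the pattern described above: exception checks are applied to incoming records, failures are routed to a work queue owned by a business team, and every exception is written to an audit trail that can later evidence the rigour of the process. A data management platform would provide this as configurable rules and workflow rather than code.

```python
from datetime import datetime, timezone

# Hypothetical exception rules: each names a check, its owning team and a
# predicate that returns True when a record fails the check.
RULES = [
    {"name": "missing_credit_rating", "owner": "Reference Data Team",
     "failed": lambda rec: not rec.get("credit_rating")},
    {"name": "stale_valuation", "owner": "Valuations Team",
     "failed": lambda rec: rec.get("valued_at", "") < "2016-03-31"},
]

work_queues = {}   # exceptions grouped by the business team that owns them
audit_trail = []   # every exception and its routing is recorded for evidence

def run_checks(records):
    """Apply each rule to each record, route failures to the owning team's
    work queue, and write an audit entry for each exception raised."""
    for rec in records:
        for rule in RULES:
            if rule["failed"](rec):
                exception = {
                    "rule": rule["name"],
                    "record": rec["id"],
                    "raised_at": datetime.now(timezone.utc).isoformat(),
                }
                work_queues.setdefault(rule["owner"], []).append(exception)
                audit_trail.append({**exception, "routed_to": rule["owner"]})

# Illustrative records only; identifiers and values are placeholders.
records = [
    {"id": "SEC-001", "credit_rating": None, "valued_at": "2016-03-31"},
    {"id": "SEC-002", "credit_rating": "AA", "valued_at": "2016-02-29"},
]
run_checks(records)
print({owner: len(queue) for owner, queue in work_queues.items()})
```
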
Does your data quality management process shape up? Whatever data management platform you have in place, you need to make sure that you have the tools to address these key factors in order to better manage your ongoing client requirements around Solvency II and other regulations.


