
Don’t Forget the Internal Data Controls When You Outsource

Asset managers and other financial services companies outsource many business functions across the enterprise to service providers, which enables them to focus on their core competencies such as product development and investment performance. The data management component supporting those functions is often included in the outsourcing arrangement. But while a firm may choose to outsource the function, it cannot outsource the risk or liability of errors due to poor quality data – that responsibility remains firmly within the remit of the asset management firm itself.

Of course, poor quality data will negatively impact the firm’s ability to conduct its business, but it can also lead to repercussions from regulators. So how can asset management firms manage data quality across services that have been outsourced, and how can they demonstrate to their internal management, their clients and the regulator that they have robust data quality controls?

What do the regulations say?

Increasingly stringent regulations require asset managers that choose to outsource some of their operational functions to maintain internal control and the ability to monitor compliance with all of their obligations. It’s the old adage: you can outsource the function, but you can’t outsource the responsibility.

To take a couple of examples from regulations that many asset managers will be familiar with:

MiFID Article 13(5) – An investment firm shall ensure, when relying on a third party for the performance of operational functions which are critical for the provision of continuous and satisfactory service to clients and the performance of investment activities on a continuous and satisfactory basis, that it takes reasonable steps to avoid undue additional operational risk. Outsourcing of important operational functions may not be undertaken in such a way as to impair materially the quality of its internal control and the ability of the supervisor to monitor the firm’s compliance with all obligations.

Likewise, Solvency II makes it clear under Article 49 that outsourcing does not discharge a firm from any of its obligations under the directive, and that outsourced activities must not unduly increase operational risk.

Article 82 of the same directive states that the firm shall have internal processes and procedures in place to ensure the appropriateness, completeness and accuracy of the data used (in Solvency II’s case, the data used in the calculation of technical provisions).

Internal Reasons for Data Controls

But it’s not just for regulatory reasons that a firm needs to consider its data controls within the outsourcing model. There are many internal reasons too. Risk management in all its forms depends on the provision of accurate and complete data, and clients will take a dim view of errors caused by bad data if they affect their portfolios or find their way into client reports and similar communications.

The typical outsourcing model will ensure that the service provider is on the hook for its service levels around the performance of the function, and the provider will naturally pay attention to data insofar as it affects its ability to deliver that service.

But can the service provider apply the level of rigour, across all aspects of data quality, that the client needs in order to meet its legal, regulatory and other commitments? The provider will maintain a certain level of sign-off over the key data sets it needs to fulfil its function, but practical experience suggests that providers are not necessarily checking to the same level of data quality, or across the same breadth of data attributes, that the asset manager needs to.

To take a simple example: an incorrect sector classification on a particular security probably doesn’t stop the provider from processing the transaction, but left uncorrected it may affect asset allocation decisions, breach client mandates or skew all kinds of downstream processes such as performance, risk and client reporting.
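
To make that concrete, here is a minimal sketch of the kind of check an oversight process might run: compare the sector classification in the provider’s file against an internal or vendor reference and flag any mismatch before it reaches downstream processes. The field names, identifiers and reference data are hypothetical and do not describe any particular provider’s feed or Curium’s own product.

```python
# Minimal sketch: cross-check the provider's sector classification against a
# reference source and flag mismatches for review. Field names are hypothetical.

provider_positions = [
    {"security_id": "US0378331005", "sector": "Information Technology"},
    {"security_id": "GB0007980591", "sector": "Utilities"},  # misclassified
]

reference_sectors = {
    "US0378331005": "Information Technology",
    "GB0007980591": "Energy",
}

def sector_mismatches(positions, reference):
    """Return the positions whose sector disagrees with the reference source."""
    issues = []
    for pos in positions:
        expected = reference.get(pos["security_id"])
        if expected is not None and pos["sector"] != expected:
            issues.append({**pos, "expected_sector": expected})
    return issues

for issue in sector_mismatches(provider_positions, reference_sectors):
    print(f"{issue['security_id']}: provider says {issue['sector']!r}, "
          f"reference says {issue['expected_sector']!r}")
```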

Without a good oversight and data control process in place, the asset manager has no way to catch or manage data errors which, as noted above, could have a material impact on its transactions, decision making and reporting.

Three Key Elements

So what level of control do you need in place to avoid increasing operational risk and to satisfy the regulator? There are three key elements that form the backbone of a successful and maintainable oversight process.

1) Access to the data: Be able to gather the data received from the service provider regardless of the formats and technologies employed, and be able to easily manipulate and transform that data into a form that is accessible to your data control process. Fixed-format files and static reports are never enough.

2) Business ownership: Make sure that your own business operations teams have the ability and the tools to set up, modify and own the control and oversight process around the data sets – this is a business-as-usual (BAU) process and therefore must sit as close to the business functions as possible.

3) Workflow and audit: Ensure that the data control process incorporates satisfactory workflow over data issues (i.e. guaranteed ownership, prioritisation and resolution) together with an audit trail that can give any outside party (regulator, client, etc.) demonstrable evidence that the firm has oversight of its service providers. A sketch of what such a control might look like follows this list.
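
Pulling these elements together, the sketch below shows what a data exception with an owner, a priority, a status and an audit trail might look like. It is illustrative only: the rule name, the record structure and the resolution workflow are assumptions made for the example, not Curium’s data model.

```python
# Illustrative sketch of a data control with workflow and audit: each exception
# gets an owner, a priority and a status, and every state change is recorded so
# the firm can evidence its oversight to a regulator or client.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataException:
    rule: str                 # e.g. "sector_matches_reference"
    record_key: str           # the data item that failed the rule
    detail: str
    owner: str                # business team accountable for resolution
    priority: str = "high"
    status: str = "open"
    audit_log: list = field(default_factory=list)

    def log(self, action: str, user: str):
        # Append an audit entry for every action taken on the issue.
        self.audit_log.append({
            "when": datetime.now(timezone.utc).isoformat(),
            "who": user,
            "action": action,
        })

    def resolve(self, user: str, note: str):
        self.status = "resolved"
        self.log(f"resolved: {note}", user)

# Example: an exception raised by the sector check sketched earlier, owned by
# the operations team and resolved once the provider corrects its feed.
exc = DataException(
    rule="sector_matches_reference",
    record_key="GB0007980591",
    detail="provider sector 'Utilities' disagrees with reference 'Energy'",
    owner="Data Operations",
)
exc.log("raised by data control process", user="system")
exc.resolve(user="ops.analyst", note="provider corrected feed on next delivery")
print(exc.status, len(exc.audit_log), "audit entries")
```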

A good oversight process is not something that can be cobbled together. It needs thought, resources and a fair smattering of good technology to make it work. Outsourcing may be the right option for many firms, but it will need an investment in oversight and controls that is often overlooked.

Curium v7 introduces new ETL features that make the data-gathering component of your data control process even easier to implement, including a range of options for gathering data that is managed outside the firm’s data architecture.
