How can we enhance our data management practices and demonstrate to management and regulators that we are in control of our data sets?
Financial services firms are at the receiving end of a continuous onslaught of regulation and directives. Depending on a firm’s market activities and geographical spread, it may be building out its reporting and processes to support the core principles of, for example, BCBS 239, MiFID II, EMIR or Dodd-Frank. Already in place are a range of regulations aimed at improving transparency and at assessing a firm’s risk and exposure. For insurance companies, and for the asset managers and administrators that support them, the focus will be the Solvency II Directive. For many asset managers it could also be the AIFMD (Alternative Investment Fund Managers Directive).
Underpinning much of this regulation, and in particular some of the principles behind BCBS 239, is the drive to establish better governance over data, both from an IT infrastructure perspective and through improved monitoring and control. Much of the emphasis is on firms being able to demonstrate that they have solid operational processes and that business users have visibility over the processing, quality and use of their key data.
Recent surveys assessing market participants’ readiness to comply with emerging regulation suggest that many firms need to invest more in data governance, IT and internal control functions to be able to answer questions such as:
- Do we have a control framework across all key data sets?
- When issues are identified, is there a defined and visible process to investigate and resolve them?
- Is there clear ownership and accountability?
- Are there adequate controls throughout the lifecycle of data?
- How automated and far-reaching are the checks for identifying data errors?
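To make the last question concrete, the following minimal sketch shows the kind of automated, rule-based data quality check the text refers to. It is an illustration only: the record fields, thresholds and rules are hypothetical and are not drawn from CuriumEDM or CuriumDQM.

```python
# Illustrative only: generic rule-based data quality checks over a price file.
# Field names, rules and the 10% tolerance are hypothetical assumptions.

from dataclasses import dataclass


@dataclass
class PriceRecord:
    instrument_id: str
    price: float | None
    previous_price: float | None


def run_checks(records: list[PriceRecord], max_move_pct: float = 10.0) -> list[str]:
    """Return human-readable exceptions for downstream investigation and resolution."""
    exceptions = []
    for rec in records:
        # Completeness: every instrument must carry a price.
        if rec.price is None:
            exceptions.append(f"{rec.instrument_id}: missing price")
            continue
        if rec.previous_price is None:
            continue  # no prior value to compare against
        # Staleness: an unchanged price may indicate a feed that stopped updating.
        if rec.price == rec.previous_price:
            exceptions.append(f"{rec.instrument_id}: price unchanged since previous day")
        # Tolerance: large day-on-day moves are flagged for review, not silently accepted.
        elif abs(rec.price - rec.previous_price) / rec.previous_price * 100 > max_move_pct:
            exceptions.append(f"{rec.instrument_id}: move exceeds {max_move_pct}% tolerance")
    return exceptions


if __name__ == "__main__":
    sample = [
        PriceRecord("BOND-001", 101.25, 101.10),
        PriceRecord("EQTY-042", None, 54.30),     # missing price
        PriceRecord("EQTY-077", 54.30, 54.30),    # stale price
        PriceRecord("FUND-009", 120.00, 100.00),  # 20% day-on-day move
    ]
    for line in run_checks(sample):
        print(line)
```

Checks of this sort only add value when the exceptions they raise feed a visible workflow of ownership, investigation and sign-off, which is the point the remainder of this piece addresses.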
To comply with some of the core principles of current and upcoming regulation, firms need to establish control and visibility over their data architecture and data processes.
Increasingly, firms need forensic knowledge of their business at the underlying data level. Curium’s Enterprise Data Management solution, CuriumEDM, is designed to provide the level of control and visibility necessary to support these new data operating models. Whether gathering data into the architecture, validating data, or matching and mastering new data sets, CuriumEDM gives business-level users full visibility over the rules and logic employed. The data governance modules of CuriumEDM present that information in a clear and precise way that sets the standard for next-generation EDM platforms.
CuriumDQM is the data quality workflow tool that provides visible evidence of a working business process around the capture, investigation and resolution of data issues, a crucial element of a data platform that is fit for purpose in today’s regulatory environment.
Compliance with such regulation will increasingly be a matter of demonstrating to regulators that the firm has control of its key data sets. Curium provides the necessary control framework for data quality and underpins the business processes supporting regulatory compliance and data governance best practice.
Curium – Innovative data solutions for real business needs