7 Steps to Getting Data Quality Right – Part 2
In last month's blog we looked at the first four of seven steps to ensuring you get the quality of your financial data right (read more about it here). In this blog, we delve into steps five to seven.
Once you have visibility into the quality of your data across the core architecture and/or wider enterprise, are monitoring it on an ongoing basis, are using a sophisticated workflow to prioritise and solve your data quality issues, and have considered your functional requirements, the next essential step is to ensure that you have a full audit trail of all changes related to your data quality. This is the element that ties data quality processing to the overall data governance structure.
Step 5: Ensuring a full audit trail
A full audit trail enables you to see what actions were performed in relation to a particular data issue or data record, by whom, when, and in some instances, why. Part of the data quality management (DQM) process, also captured in the audit, may involve escalating issues within the data governance hierarchy and gaining approvals to perform actions such as downgrading or upgrading certain issues or, perhaps more importantly, initiating data corrections back to data repositories or line-of-business systems.
This level of transparency is important because it gives business management visibility of where any given data value came from and how it may have changed. It also enables you to satisfy queries from auditors and regulators. Indeed, a number of regulations now require access to audit trails to demonstrate compliance, including Solvency II, MiFID II, BCBS 239, EMIR, and Dodd-Frank.
Given this regulatory and business landscape, keeping a full audit trail is a key tenet of any good data governance programme and, as such, is now an essential capability.
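To make the who/what/when/why requirement concrete, here is a minimal sketch of an append-only audit log in Python. The record structure, field names, and example issue identifiers are illustrative assumptions, not a description of any particular product's schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass(frozen=True)
class AuditEntry:
    """One immutable record of an action taken on a data quality issue."""
    issue_id: str   # identifier of the data quality issue (hypothetical format)
    action: str     # e.g. "corrected", "approved", "downgraded"
    actor: str      # who performed the action
    reason: str     # why, where the governance process requires it
    timestamp: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


# Append-only log: entries are added, never modified or deleted,
# so the full history can be replayed for auditors and regulators.
audit_log: list[AuditEntry] = []


def record(issue_id: str, action: str, actor: str, reason: str = "") -> None:
    audit_log.append(AuditEntry(issue_id, action, actor, reason))


record("ISS-1042", "corrected", "j.smith",
       "Stale FX rate replaced with vendor close price")
record("ISS-1042", "approved", "governance.board",
       "Correction approved for write-back to the source system")
```

The key design point is immutability: because entries are frozen and only ever appended, the log can answer "who changed this value, when, and why" without any risk of the history itself having been edited.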
Step 6: How to make use of management information
There is a wealth of management information that can be generated through your data management activities. To make the most of it, you need the right tools, capable of gathering and making sense of the high volume of information generated. Ideally, you should be able to feed that data into a business intelligence tool that generates valuable analysis for use across the organisation.
Some organisations have started to publicise results of their data quality programmes across the firm, as the programme touches the functions and roles of so many people in the organisation. Doing so also encourages feedback and helps determine future targets and objectives for improvement. How long will it be before external parties want to see the same information on a regular basis?
Used correctly, management information can help you identify when to put pressure on data providers to fix issues or improve their quality. It can help you spot bottlenecks and inefficient processes and find ways to improve them. It can also help you demonstrate to management, regulators, and increasingly, prospective clients the controls you have over your data.
Leveraging management information should result in an overall and sustained improvement in data quality, but it can also deliver a level of confidence to the risk and trading functions that can result in better investment decisions and a more certain adherence to other investment and regulatory reporting criteria.
Step 7: Demonstrate your data quality on demand
Producing reports and regulatory output may be part of today's approach to monitoring data quality, but demonstrating that capability on demand may form part of the future operating model. Showing, on demand, how data quality is being maintained (i.e. "show me what happens when a particular data-related event takes place") seems to be the ultimate thrust of some future regulations.
Can you answer questions such as:
- Do we know, at any point in time, the current status of all issues that may impact our operational data quality?
- Can any particular system or individual access that status information on demand, both from an overall management perspective and for a particular quality issue?
If the answer is yes, then your data governance process around data quality is demonstrably working and effective at an operational level.
In summary, steps 1 to 7 represent the core elements of implementing a business-as-usual data quality process. Once in place and at the heart of business operations, they will enable the firm to deliver the level of data quality oversight required to support both current and future regulatory and data governance pressures.
The seven key steps to getting data quality right are:
- Get operational visibility of the quality of your data across the whole business
- Ensure you can monitor data quality as an automated BAU process
- Implement a sophisticated workflow that can help you to prioritise and solve your data quality issues
- Consider the functional requirements to enable the business users to work efficiently within the data quality process
- Ensure you have a full audit trail of all data changes related to your data quality
- Gather, and be able to analyse the management information generated by your data quality process
- Give visibility of the data quality process to anybody who needs it – up to date and on demand
A data governance strategy is one thing, but it means little if you don't have the tools to implement it and monitor its effect on the real data and processes within the business.
To find out more about how CuriumDQM is helping financial services firms take control of their data quality process, take a look at curiumdata.com.
Get in touch
If you would like further information or have any queries about Curium, please do get in touch. One of our team will get straight back to you.