7 Steps to Getting Data Quality Right – Part 1
Ensuring you have good data quality underpinning all of your investment processes is an obvious requirement for today’s data management professional. But it’s harder than you’d think to get the process of measuring and managing data quality right.
Senior-level data management professionals are all responding to different internal and external drivers. But no matter what the driver, the overarching need is to get data quality right to ensure that all the other processes – investment, risk management, regulatory compliance, etc. – work as they should.
But how can you make sure you’re measuring data quality right? How can you monitor it on an ongoing basis? Once you’ve found a way to monitor it, what can you do to improve it? And how can you get insight into your business by mining the valuable management information that comes from your data quality management activities?
There are seven steps you need to take to ensure you get data quality right. We’ll explore the first four steps here, and delve into the final three steps in the second of this two-part blog.
- Get visibility of the quality of your data across the whole business
- Ensure you can monitor data quality on an ongoing (and, in some areas, real-time) basis
- Don’t just set up exception checks; implement a sophisticated workflow that can help you to prioritise and solve your data quality issues efficiently
- Consider functional requirements, including a single entry point for data changes across the architecture, controls to ensure the validity of those changes, user permission management, suppression of the noise from unimportant exceptions, and more
- Ensure you have a full audit trail of all data and data changes related to your data quality
- Gather the wealth of management information generated by your data quality processes and, most importantly, put in place a process to analyse it
- Sit back and relax as management and the regulators drill into and query your data and processes (ok, maybe you can’t relax but you can feel confident that you will be able to demonstrate the quality of what you have put in place)
Step 1: Visibility into Data Quality Across the Enterprise
The ideal approach would be a single solution that provides visibility of data quality and processes across the business, regardless of the IT architecture, line-of-business systems and data management solutions you may already have in place. This solution would check not only that your core data is correct but also that data is consistent across the various endpoints of the architecture.
What exactly would this solution look like?
There are lots of ‘data quality’ tools out there, but when you drill into them, you might find a more accurate description would be ‘data profiling’ tools. They can assess the quality of a particular data set, tell you how much is complete or missing, and present good-looking graphs to illustrate the point. But this is merely a snapshot in time, and they give you no means of resolving the issues they find.
True visibility into data quality requires a solution that can monitor the entire business, end to end, on an ongoing basis. It should provide visibility both of the data quality issues that are present, including their current status, and of the data itself, so that issues can be assessed effectively and corrective action taken where necessary.
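To make the distinction concrete, here is a minimal sketch in Python of the gap between the two approaches: a profiling pass yields a point-in-time completeness figure, whereas genuine visibility also means comparing the same record across the endpoints that consume it and raising an issue, with a status, that someone can act on. The record structures, field names and status values are illustrative assumptions, not a description of any particular tool.

```python
from dataclasses import dataclass
from datetime import date

# A profiling tool stops here: a point-in-time completeness figure.
def completeness(records: list[dict], field_name: str) -> float:
    populated = sum(1 for r in records if r.get(field_name) is not None)
    return populated / len(records) if records else 0.0

# Ongoing visibility also means checking that the same record agrees
# across the endpoints that consume it, and raising a tracked issue
# (with a status) when it doesn't.
@dataclass
class DataQualityIssue:
    record_id: str
    description: str
    raised_on: date
    status: str = "open"  # e.g. open -> under_investigation -> resolved

def check_consistency(hub: dict[str, dict], endpoint: dict[str, dict],
                      field_name: str) -> list[DataQualityIssue]:
    issues = []
    for record_id, record in hub.items():
        downstream = endpoint.get(record_id, {})
        if record.get(field_name) != downstream.get(field_name):
            issues.append(DataQualityIssue(
                record_id=record_id,
                description=f"{field_name} differs between hub and endpoint",
                raised_on=date.today(),
            ))
    return issues
```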
Step 2: Ensure Data Quality Monitoring is a Regular Process, Not Just a Snapshot in Time
To be effective, a data quality solution needs to monitor for data quality issues on a regular and ongoing basis, and in some cases even in real time, across your enterprise: whether it is based on a central hub, a federated model or some combination of the two, and whatever data management platforms you have implemented.
A snapshot in time is of little value in improving data quality over the long term. Monitoring and resolving data quality issues should become a core part of the firm’s day-to-day processes, not a one-off exercise. Look for a solution that monitors quality continuously rather than delivering a one-time analysis.
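As an illustration of what ‘ongoing’ can mean in practice, the sketch below runs a registry of checks once per business date and carries unresolved issues forward, so they build a history rather than vanishing between snapshots. The scheduling itself would live in your orchestrator or scheduler of choice, and the check shown is a hypothetical placeholder.

```python
from datetime import date

# Hypothetical check registry: each check returns issue keys for a date.
CHECKS = []

def register_check(fn):
    CHECKS.append(fn)
    return fn

@register_check
def stale_price_check(business_date: date) -> list[str]:
    # Placeholder: in practice this would query your pricing store for
    # instruments whose latest price predates business_date.
    return []

def run_daily_monitoring(business_date: date,
                         open_issues: dict[str, date]) -> dict[str, date]:
    """Run every registered check for the given date and carry anything
    unresolved forward, so issues build a history instead of vanishing
    between one-off snapshots."""
    for check in CHECKS:
        for issue_key in check(business_date):
            open_issues.setdefault(issue_key, business_date)  # first seen
    return open_issues
```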
Step 3: Don’t Just Look for Exceptions, Manage Data Quality Issues with Sophisticated Workflow
Even this level of regular, ongoing monitoring is not enough to meet your data quality challenge. You also need to be able to push these issues into an effective workflow so that they can be investigated and resolved, and, where needed, so that you can adapt processes to make sure the same issues don’t happen time and again. This is more than a data profiling exercise: it requires a toolset that incorporates an appropriate level of workflow control to manage data quality issues effectively.
It’s this workflow for data quality issue management that the industry has found tricky to get to grips with. A basic workflow might set thresholds, with any data moving outside them triggering an exception. A more sophisticated workflow lets the business manage those exceptions efficiently: prioritising issues (for example, using supporting data to identify which issues could have the biggest business impact), routing them to the part of the business best equipped to deal with them, and adding supporting context to an issue, or adjusting its status, when it cannot be solved instantly.
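A rough sketch of the difference: a basic workflow stops at raising the exception, while a more sophisticated one enriches it with a priority and a route before anyone works the queue. The rule names, team names and the exposure-based priority rule here are purely illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class DQException:
    record_id: str
    rule: str
    observed: float
    threshold: float
    exposure: float       # supporting data, e.g. the position size affected
    status: str = "open"
    priority: str = ""
    assigned_team: str = ""

def triage(exc: DQException) -> DQException:
    """Prioritise by potential business impact and route to the team
    best placed to resolve it (rules and team names are illustrative)."""
    exc.priority = "high" if exc.exposure > 1_000_000 else "normal"
    exc.assigned_team = {
        "price_tolerance": "market-data-ops",
        "missing_rating": "reference-data-ops",
    }.get(exc.rule, "data-quality-ops")
    return exc

# A basic workflow stops at raising the exception; a sophisticated one
# enriches it before anyone sees the queue.
exc = triage(DQException("BOND123", "price_tolerance",
                         observed=12.4, threshold=5.0, exposure=2_500_000))
print(exc.priority, exc.assigned_team)  # high market-data-ops
```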
Step 4: Consider the Range of Functional Requirements
Other things to consider in a data quality solution: can you make changes and corrections to the data from a single point of entry, so that they filter across your whole architecture? Do you have clear controls, such as checking the validity of changes to data on entry and managing user permissions to change data, including ‘four eyes’ release or supervisor approval for such changes?
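The sketch below illustrates how such controls might fit together: changes flow through a single entry point, are validated on entry, and a ‘four eyes’ check ensures the approver is not the proposer. The field names and the validation logic are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class ChangeRequest:
    record_id: str
    field_name: str
    new_value: object
    proposed_by: str
    approved_by: str | None = None

def validate(change: ChangeRequest) -> bool:
    # Placeholder for validity-on-entry rules (types, ranges, formats).
    return change.new_value is not None

def approve(change: ChangeRequest, approver: str) -> ChangeRequest:
    """'Four eyes': the approver must differ from the proposer."""
    if approver == change.proposed_by:
        raise PermissionError("proposer cannot approve their own change")
    change.approved_by = approver
    return change

def apply_change(change: ChangeRequest, store: dict) -> None:
    """Single entry point: only validated, approved changes reach the
    store from which downstream endpoints are fed."""
    if change.approved_by is None or not validate(change):
        raise ValueError("change must be validated and approved first")
    store.setdefault(change.record_id, {})[change.field_name] = change.new_value

store: dict = {}
apply_change(approve(ChangeRequest("BOND123", "coupon", 4.25, "alice"), "bob"),
             store)
```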
You will also want the ability to suppress some of the exception ‘noise’, i.e. the option to ignore data breaches that don’t reflect a real business issue. Without a way to teach a data quality system what is and isn’t important, data management teams can quickly find themselves drowning in exceptions they don’t care about while missing the ones they do.
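One way to picture such suppression is as a list of acknowledged, documented rules matched against incoming exceptions before they reach anyone’s queue. The sketch below is a minimal illustration, with hypothetical rule and record names.

```python
from dataclasses import dataclass

@dataclass
class SuppressionRule:
    rule: str        # which check the suppression applies to
    record_id: str   # "*" matches any record
    reason: str      # why this breach isn't a real business issue

def is_suppressed(exc_rule: str, exc_record_id: str,
                  suppressions: list[SuppressionRule]) -> bool:
    return any(s.rule == exc_rule and s.record_id in ("*", exc_record_id)
               for s in suppressions)

# Example: an illiquid bond that always reprices sharply at month end is
# acknowledged once, with a documented reason, and suppressed thereafter.
suppressions = [SuppressionRule("price_tolerance", "BOND123",
                                "known month-end repricing")]
print(is_suppressed("price_tolerance", "BOND123", suppressions))  # True
```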
So you’re not just getting exceptions: you’re getting only the exceptions you care about, prioritised the right way, and you’re able to solve them within the same data quality tool. This approach ultimately gives your business a control environment and the confidence of knowing your data quality is under control.
More on steps 5-7 in our next blog.
To find out more about how CuriumDQM is helping financial services firms take control of their data quality process, take a look at curiumdata.com.