“We have 20% duplicates in our data source.” That is how the conversation began. It was not that no one cared about the level of duplicates; it was just that duplicate records did not get the business excited – they had many other priorities (and they were not building a single view of the customer).
The customer continued the thread by asking how to make data quality relevant to each functional leader reporting to the C-level executives. The starting point was the affirmation that the business really only cares about data quality when it impacts the processes it owns, e.g. the order, invoicing, shipping, credit, lead generation and compliance reporting processes. This means that data quality results need to be linked to the tangible goals of each business process owner to win them over as data advocates.
So can we deliver business impact metrics? Today we can measure data quality – using data profiling and data quality dimensions – and present the results as scorecards and trends. That is half the battle and is often enough to build a compelling business case. The next step is linking data quality to business impact and identifying the records that hurt the business most. Industrializing this linkage requires further processes of its own. If we can demonstrate trends showing increased business process efficiency linked to improved data quality, we will win over the business faster. At GDE, this is our goal: to measure the business value of enterprise data and govern the business impact of non-compliant data. We believe this is a great way to build sustainable data governance processes and align the business, data management and IT – we are using data excellence to deliver business excellence.
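As a rough illustration of what such a scorecard can start from (this is a sketch, not GDE's tooling, and the table and column names such as has_open_order are invented for the example), the snippet below scores two common data quality dimensions, uniqueness and completeness, and then counts the defective records that sit on an active order process – the kind of figure a process owner actually cares about.

```python
# Illustrative sketch: score two data quality dimensions on a hypothetical
# customer table and flag the defective records that touch a business
# process (here, open orders), so results can be reported per process owner.
import pandas as pd

customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4, 5],
    "email": ["a@x.com", "a@x.com", None, "d@x.com", "e@x.com"],
    "billing_address": ["1 Main St", "1 Main St", "9 Oak Ave", None, "3 Elm Rd"],
    "has_open_order": [True, True, False, True, False],
})

# Dimension 1: uniqueness -- share of records duplicating an earlier record.
duplicate_mask = customers.duplicated(subset=["email", "billing_address"], keep="first")
duplicate_rate = duplicate_mask.mean()

# Dimension 2: completeness -- share of mandatory fields that are populated.
mandatory = ["email", "billing_address"]
completeness = customers[mandatory].notna().values.mean()

# Business-impact view: defective records that sit on an active process.
defective = duplicate_mask | customers[mandatory].isna().any(axis=1)
critical = customers[defective & customers["has_open_order"]]

print(f"Duplicate rate: {duplicate_rate:.0%}")
print(f"Completeness:   {completeness:.0%}")
print(f"Records blocking the order process: {len(critical)}")
```

On this toy data the duplicate rate comes out at 20%, but the last number is the one that turns a data quality score into a business impact metric: it tells the order process owner how many of their open orders are sitting on defective customer records.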
It is an illusion to believe that any enterprise will survive the information age without a special focus on, and intelligent investment in, governing the business value and impact of its data.