Successful Enterprises Aren’t Built on Data, but on Trust of That Data

Let’s face it: if a business is going to tie its fortunes to data analytics, it had better make sure it’s the right data. Ultimately, a thriving enterprise is built not on data itself, but on trust in that data.

That’s why ensuring data quality is among the most important tasks an enterprise can undertake. In a recent report, Gartner analysts Saul Judah and Ted Friedman point out that the drive for high-quality data needs to become a top priority for all. “The data quality tools market is growing rapidly, and so is the pace of change around the people, processes, and technologies businesses utilize to maintain their information,” they state.

The cost of missing the mark with data quality is high. “Organizations estimate that they are losing an average of $8.8 million annually because of issues with data quality,” Judah and Friedman report.

Organizations have been wrestling with data-quality concerns for decades, and there is a wealth of great tools and platforms that handle everything from de-duplication to integration. However, never before has data quality been so critical to the life of the business itself. Again, success is built on trust, and that’s where close collaboration with the business is key.

Lisa Morgan of InformationWeek recently did a great job of outlining key steps for achieving data quality:

Acknowledge the importance of data quality: It sounds obvious, but as Morgan points out, “most business professionals tend not to think about data quality, even though the quality of the decisions they make depends on the quality of the data they’re using for analysis. Data quality isn’t only an IT problem — it’s a business problem.” This means keeping the channels of communication open and treating data quality as a business priority, not exclusively an IT project.

Don’t go overboard: Not every piece of information flowing through systems needs to be dead-on accurate at all times; each business application has its own accuracy requirements. In any case, making every byte of data entirely accurate would be a massive undertaking. Morgan quotes James O’Malley, senior VP of analytics at Porter Novelli: “People think data accuracy is 100% accurate. We’re dealing with tens of millions of pieces of content a month, so it’s hard to get to 100% accuracy.” Requirements vary, so apply the most rigorous quality standards where they count the most.
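
To make that concrete, here is a minimal sketch, in Python, of per-application accuracy thresholds. The dataset names and threshold values are illustrative assumptions, not figures from the article or from Porter Novelli.

    # Hypothetical per-application accuracy bars: billing must be near-perfect,
    # while high-volume content analytics can tolerate more noise.
    ACCURACY_THRESHOLDS = {
        "billing": 0.999,
        "crm_contacts": 0.98,
        "social_content": 0.90,  # tens of millions of items a month
    }

    def passes_quality_bar(dataset: str, valid_records: int, total_records: int) -> bool:
        """Judge a dataset against its own bar, not a universal 100% standard."""
        accuracy = valid_records / total_records
        return accuracy >= ACCURACY_THRESHOLDS[dataset]

    # 9.2M valid items out of 10M clears the social-content bar but not billing's.
    print(passes_quality_bar("social_content", 9_200_000, 10_000_000))  # True
    print(passes_quality_bar("billing", 9_200_000, 10_000_000))         # False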

Embrace the journey: Data inputs and requirements change from day to day, and thus, “errors can be introduced in the data collection process as the data ages, as it is cleansed and transformed, and while it’s being moved among disparate systems. In other words, even accurate data can become inaccurate over time,” Morgan observes. The key is to treat data quality as a continuous process and priority, not a once-and-done effort.
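
A minimal sketch of what “continuous” can mean in practice: the same validation rules are re-applied on every load, so a record that was accurate at capture is still caught when it drifts. The rule names and sample records below are hypothetical.

    from datetime import date

    # Hypothetical validation rules; a real pipeline would carry many more.
    RULES = {
        "email_present": lambda r: bool(r.get("email")),
        "not_stale": lambda r: (date.today() - r["last_verified"]).days < 365,
    }

    def audit(records):
        """Count rule failures; meant to run on every ingest, not once."""
        failures = {name: 0 for name in RULES}
        for record in records:
            for name, check in RULES.items():
                if not check(record):
                    failures[name] += 1
        return failures

    # The first record was accurate when captured but has aged past the
    # staleness rule; the second is fresh but incomplete.
    records = [
        {"email": "a@example.com", "last_verified": date(2020, 1, 15)},
        {"email": "", "last_verified": date.today()},
    ]
    print(audit(records))  # {'email_present': 1, 'not_stale': 1}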

Master your data: Morgan quotes Steve Jones, global VP of Capgemini’s big data practice, who makes the case for robust master data management. “If you haven’t got good master data (the whole concept of cross-reference), and there’s one person in the real world, and you have records for them in 20 systems and 30 records in each of those 20 systems, that’s your problem. Master data, metadata, and reference data management are the most important things when you look at data quality.”
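
Here is a minimal sketch of the cross-reference idea Jones describes: many source-system records resolving to a single master entry. The match rule (a normalized email) and the sample records are simplifying assumptions; production MDM tools use far richer matching and survivorship logic.

    def match_key(record: dict) -> str:
        """Naive match rule: use a normalized email as the cross-reference key."""
        return record["email"].strip().lower()

    def build_xref(records: list) -> dict:
        """Group every (system, record_id) pair under one master key."""
        xref = {}
        for r in records:
            xref.setdefault(match_key(r), []).append((r["system"], r["id"]))
        return xref

    # One real-world person, three records across two systems -> one master entry.
    records = [
        {"system": "crm", "id": "C-101", "email": "Pat@Example.com"},
        {"system": "billing", "id": "B-7", "email": "pat@example.com "},
        {"system": "crm", "id": "C-244", "email": "pat@example.com"},
    ]
    print(build_xref(records))
    # {'pat@example.com': [('crm', 'C-101'), ('billing', 'B-7'), ('crm', 'C-244')]}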

Further Reading:

Part 2: Business Leaders Still Have Trust Issues with Data