Queensland Police Service Case Study: Use Your Bad Data To Build A Compelling Data Quality Business Case
Building a data quality business case might seem difficult and complicated, but it doesn't have to be.
At InformaticaWorld, I had the pleasure of meeting Graeme Campbell, former manager of the client services group at Queensland Police Service (QPS) in Australia, who delivered a compelling presentation titled “Queensland Police Drive Out Crime with Informatica.” My key takeaway: build a simple, business-focused and results-oriented business case that inspires action.
One of the most critical first steps for financial services firms looking to implement multidomain master data management (MDM) is to quantify the cost savings they could achieve.
Unfortunately, a thorough analysis of potential ROI is also one of the least-followed steps, with a key culprit being the disconnect between business and IT.
This shortcoming is spotlighted in a new Informatica white paper, “Five Steps to Managing Reference Data More Effectively in Investment Banking,” which outlines key questions to ask when sizing up the cost implications of bad data and antiquated systems (a rough sizing sketch follows the list), such as:
- How long does it take to introduce a new security to trade?
- How many settlements need to be fixed manually?
- How many redundant data feeds does your firm have to manage?
- How accurate and complete are your end-of-day reports?
- Do you have the data you need to minimize risk and exposure?
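To make the sizing exercise concrete, here is a minimal back-of-envelope sketch in Python. Every figure in it (fix volumes, feed costs, onboarding delays) is an illustrative assumption, not data from the white paper or any Informatica benchmark; plug in your firm's own numbers.

```python
# Back-of-envelope estimate of the annual cost of bad reference data,
# mirroring three of the white paper's sizing questions.
# All input values below are hypothetical placeholders.

manual_settlement_fixes_per_month = 400   # assumed: settlements repaired by hand each month
cost_per_manual_fix = 50.0                # assumed: ops cost (USD) per manual repair

redundant_data_feeds = 6                  # assumed: duplicate vendor feeds under management
annual_cost_per_feed = 120_000.0          # assumed: license + maintenance per feed (USD/yr)

days_to_onboard_new_security = 5          # assumed: delay before a new security can trade
new_securities_per_year = 200             # assumed: onboarding volume
opportunity_cost_per_day = 1_000.0        # assumed: revenue deferred per day of delay (USD)

annual_fix_cost = manual_settlement_fixes_per_month * 12 * cost_per_manual_fix
annual_feed_cost = redundant_data_feeds * annual_cost_per_feed
annual_delay_cost = days_to_onboard_new_security * new_securities_per_year * opportunity_cost_per_day

total = annual_fix_cost + annual_feed_cost + annual_delay_cost
print(f"Manual settlement repairs:  ${annual_fix_cost:,.0f}/yr")
print(f"Redundant data feeds:       ${annual_feed_cost:,.0f}/yr")
print(f"Security onboarding delays: ${annual_delay_cost:,.0f}/yr")
print(f"Estimated annual cost of bad data: ${total:,.0f}")
```

With these placeholder inputs the script reports roughly $240,000 for manual repairs, $720,000 for redundant feeds, and $1,000,000 in onboarding delays, about $1.96 million a year; even rough answers to the questions above can anchor an ROI conversation between business and IT.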