Recently the US Senate passed legislation, the Improper Payments Elimination and Recovery Act, which would require US federal agencies to identify and reduce improper payments. Improper payments, defined as mistakes, waste, fraud, and abuse, cumulatively cost US taxpayers $110 billion per year, and that figure is growing. This is a staggering sum. Think about it: think of all of the things that we could fund, either through government programs or back into our own pockets. The financial incentive is there, but the challenge for the federal government is how to discover and prevent these improper payments. (more…)
I’ve been working with a client in the insurance industry that has grown through acquisition. Through that growth, they’ve acquired multiple claims management applications. The time has come to consolidate those systems into one standard claims processing application for the entire organization.
The business drivers for consolidating these systems into one are fairly obvious: standardize the processes, reduce software and hardware costs, reduce integration costs, and create consistency in reporting and analytics. The deployment of this new claims system became a board-level issue because of the dramatic positive impact it would have on the organization and its cost structure.
Intuitively, the IT and claims management team knew that the data they migrated over to the new system needed to be cleansed. What wasn’t obvious was why they should develop a data quality program around this migration project. With a little digging and analysis, we discovered many reasons to apply a data quality program to this effort. (more…)
Users of business systems do funny things. They get frustrated because they can’t easily get IT to change systems, add new fields, re-label existing fields, or provide the ability to add more information. So they take it upon themselves to add the information to the records they are working with. (more…)
Las Vegas, as we all know, is the city of conventions, fantastic shows, great restaurants, and of course casinos. But one thing struck me on this visit that I hadn’t really noticed before. Each of these aspects of Las Vegas is world class in and of itself. But when you really start to think about it, they are even more attractive as a complete set of activities from which the conventioneer can pick and choose to maximize the value of the trip. Me, I chose to spend the day at TDWI sessions, dine at Bouchon, catch the Beatles’ Love show, and twice placed $5 down on #9 at the roulette table. Each of these is world class by itself, but together they provided an integrated experience that was greater than the sum of the parts. (more…)
There is a new whitepaper from Evan Levy of Baseline Consulting available on our website, “Master Data Management – Building A Foundation For Success”. The theme of the paper is what components make up the foundation of any MDM project, whether a customer hub, a product hub, an operational hub, an analytical hub, one that is bought from a vendor or one that you build yourself. In all cases, there is a common infrastructure that, time and again, leading practitioners such as Mr. Levy find to be required. (more…)
Many organizations use the halfway point of their fiscal year to analyze the results of their efforts and decide whether their strategies are working or changes need to be made. Informatica just passed the halfway point of our fiscal year, and I had to determine how many Master Data Management (MDM) projects included Informatica products for data integration, data quality, and identity resolution.
While I can’t disclose how we did, I can tell you this: it’s difficult to say exactly which client purchases were for use in an MDM project and which were not. Of course, there are some easy ones to identify as MDM. The client purchased PowerCenter and Identity Resolution to build a customer registry. Or the client purchased PowerCenter and Informatica Data Quality to load high-quality data into an Oracle Universal Customer Hub. Clear cut: that client is using Informatica for an MDM project. (more…)
Gartner released their “Hype Cycle for Application Architecture, 2009” last week. If you are not familiar with this report, it’s a great read. It tries to put into perspective, from a productivity standpoint, where different technologies really stand compared with the “hype” surrounding them.
This year’s report again includes Master Data Management (MDM), and much to my surprise, Gartner put MDM at the “Peak of Inflated Expectations” — the very height of hype. Does that mean that MDM is not delivering on the promise? The short answer is that MDM does provide value. So what is Gartner getting at? (more…)