Tag Archives: data

Scoping Failure Analysis

In adapting the six-sigma technique of failure mode and effects analysis (FMEA) for data quality management, we hope to proactively identify the potential errors that lead to the most severe business impacts, and then strengthen the processes and applications to prevent those errors from being introduced in the first place. In my last post, though, I noted that this analysis starts with the errors and then works out the impacts. I think we should go the other way, starting from the impacts, so as to optimize the effort and reduce analysis time by focusing on the most important potential failures. (more…)
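As a side note, classic FMEA gives us a concrete way to express this reordering: each failure mode is rated for severity, occurrence, and detectability on 1-to-10 scales, and the three ratings multiply into a Risk Priority Number (RPN). A minimal sketch in Python, using entirely hypothetical failure modes and scores, filters on business-impact severity first and only then ranks what survives:

```python
# Impact-first FMEA prioritization sketch for data quality.
# All failure modes and scores below are hypothetical illustrations.

failure_modes = [
    # (failure mode, severity, occurrence, detectability), each 1-10
    ("duplicate customer record", 7, 8, 4),
    ("invalid shipping address",  9, 5, 3),
    ("stale credit rating",       8, 3, 6),
    ("misspelled product name",   3, 9, 2),
]

SEVERITY_THRESHOLD = 7  # analyze only errors with severe business impact

# Start from impact: keep only high-severity modes, then rank by RPN.
candidates = [fm for fm in failure_modes if fm[1] >= SEVERITY_THRESHOLD]
ranked = sorted(candidates, key=lambda fm: fm[1] * fm[2] * fm[3], reverse=True)

for name, sev, occ, det in ranked:
    print(f"{name}: RPN = {sev * occ * det}")
```

Reversing the order this way means low-impact-but-frequent errors never consume analysis time, which is exactly the optimization argued for above.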


Dear Santa…

Back in the good ol’ days, Santa Claus received letters and postcards from children all over the world.  When telephones and faxes became commonplace, they were also used to contact Santa.  In addition to those traditional methods, children today can reach Santa over the internet, sending their wish lists by email, Twitter, Facebook, and even LinkedIn. (more…)


The Myth of Unlimited Resources

Last time we looked at the failure mode and effects analysis technique from the six-sigma community and adjusted it slightly to be data-centric, so that it can be used to anticipate the different types of data errors that could occur and to adjust application design to prevent those errors in the first place. This approach is truly proactive: you consider the many different types of errors that could be introduced and then shore up the process in anticipation of their occurrence. (more…)
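To make the data-centric adaptation concrete, here is a minimal, hypothetical sketch of a per-element failure mode catalog: for each data element, enumerate the error types that could be introduced and record the preventive control designed into the application. The error types and controls below are illustrative assumptions, not a standard taxonomy:

```python
# Hypothetical catalog of data error types to consider per data element,
# used to drive preventive checks at the point of data entry.
from dataclasses import dataclass, field

ERROR_TYPES = [
    "missing value", "wrong format", "out-of-range value",
    "duplicate entry", "stale value", "inconsistent across systems",
]

@dataclass
class DataElementFMEA:
    element: str
    modes: dict = field(default_factory=dict)  # error type -> preventive control

    def anticipate(self, error_type: str, control: str) -> None:
        assert error_type in ERROR_TYPES, f"unknown error type: {error_type}"
        self.modes[error_type] = control

birth_date = DataElementFMEA("customer.birth_date")
birth_date.anticipate("wrong format", "enforce ISO 8601 date picker in UI")
birth_date.anticipate("out-of-range value", "reject dates in the future")
print(birth_date.modes)
```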


Counterparty Data and the Legal Entity Identifier (LEI) System

In the second of two videos, Peter Ku, Director of Financial Services Solutions Marketing at Informatica, talks about the latest trends in counterparty data and how the legal entity identifier (LEI) system will impact banks across the globe.

Specifically, he answers the following questions:

- What are the latest trends regarding counterparty information and how will the legal entity identifier system impact banks across the globe?

- How does Informatica help solve the challenges regarding counterparty information and help banks prepare for the new legal entity identifier system?


Also watch Peter’s first video (http://youtu.be/KvyDPzOTnUY) to learn about counterparty data and its challenges.
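For readers who want a feel for the identifier itself: the LEI format that was eventually standardized as ISO 17442 is a 20-character alphanumeric code whose last two characters are check digits computed with the ISO 7064 MOD 97-10 scheme, the same scheme IBANs use. A minimal validation sketch:

```python
# Minimal validator for a 20-character LEI (ISO 17442 format).
# Letters map to numbers (A=10 ... Z=35); the resulting digit string,
# read as one integer, must be congruent to 1 modulo 97
# (ISO 7064 MOD 97-10, the same check used for IBANs).

def is_valid_lei(lei: str) -> bool:
    lei = lei.strip().upper()
    if len(lei) != 20 or not lei.isalnum():
        return False
    digits = "".join(str(int(ch, 36)) for ch in lei)  # '5'->'5', 'K'->'20'
    return int(digits) % 97 == 1

print(is_valid_lei("5493001KJTIIGC8Y1R12"))  # True: satisfies the check
```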


Know Thy Customer

There has been much discussion, particularly in the UK, about banks separating their investment and retail arms. The thinking behind this is that investment banking is much riskier, so by drawing a clear line between the two, consumers will be better protected if another financial crisis should hit. (more…)


Dating With Data: Part 4 In Hadoop Series

eHarmony, an online dating service, uses Hadoop processing and the Hive data warehouse for the analytics that match singles based on each individual’s “29 Dimensions® of Compatibility”, per a June 2011 press release by eHarmony and one of its suppliers, SeaMicro. According to eHarmony, an average of 542 eHarmony members marry daily in the United States. (more…)
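eHarmony’s matching model is proprietary, so purely as an illustration of the kind of pairwise scoring a Hadoop/Hive pipeline might distribute across a large member base, here is a toy Python sketch; the dimensions, weights, and profiles are entirely made up:

```python
# Toy illustration of pairwise compatibility scoring, the kind of
# computation a Hadoop/Hive pipeline might run at scale. eHarmony's
# "29 Dimensions" model is proprietary; everything below is invented.
from itertools import combinations

DIMENSIONS = ["curiosity", "humor", "ambition"]        # stand-ins for 29
WEIGHTS    = {"curiosity": 0.5, "humor": 0.3, "ambition": 0.2}

profiles = {
    "alice": {"curiosity": 9, "humor": 7, "ambition": 5},
    "bob":   {"curiosity": 8, "humor": 6, "ambition": 9},
    "carol": {"curiosity": 2, "humor": 9, "ambition": 4},
}

def compatibility(a: dict, b: dict) -> float:
    # Closer scores on a dimension mean higher compatibility on it (0..1).
    return sum(WEIGHTS[d] * (1 - abs(a[d] - b[d]) / 10) for d in DIMENSIONS)

for x, y in combinations(profiles, 2):
    print(f"{x} + {y}: {compatibility(profiles[x], profiles[y]):.2f}")
```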


The Holy Grail Of Data Quality – Linking Data Quality To Business Impact

“We have 20% duplicates in our data source.” This is how the conversation began. It was not that no one cared about the level of duplicates; it was just that the topic of duplicate records did not get the business excited. They had many other priorities (and they were not building a single view of the customer).

The customer continued the discussion thread on how to make data quality relevant to each functional leader reporting to C-level executives. The starting point was the affirmation that the business really only cares about data quality when it impacts the processes it owns, e.g., the order, invoice, shipping, credit, lead generation, and compliance reporting processes. This means that data quality results need to be linked to the tangible goals of each business process owner to win them over as data advocates. (more…)
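As a minimal sketch of what that linkage might look like, with entirely invented volumes and costs, the same 20% duplicate rate can be restated as a monthly dollar figure for each process owner:

```python
# Hypothetical sketch: translate a raw duplicate rate into per-process
# business impact so each process owner sees data quality in their own
# terms. All volumes and costs are invented for illustration.

duplicate_rate = 0.20  # "20% duplicates in our data source"

processes = {
    # process -> (records touched per month, est. cost per duplicate hit)
    "order":    (50_000, 4.00),   # rework, mis-shipments
    "invoice":  (30_000, 6.50),   # double billing, credit notes
    "shipping": (20_000, 8.00),   # returned parcels
}

for name, (volume, cost_per_dup) in processes.items():
    monthly_cost = duplicate_rate * volume * cost_per_dup
    print(f"{name} process: est. ${monthly_cost:,.0f}/month from duplicates")
```

A figure like “the invoice process loses an estimated $39,000 a month to duplicates” gets a very different reception than “we have 20% duplicates.”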


Informatica And EMC – So What’s The Big Idea?

Last month, Informatica and EMC announced a strategic partnership at EMC’s annual user conference in Boston.  This is a significant new relationship for both companies, which in itself is interesting.  You would have thought that the company responsible for storing more data than just about anybody in the world and the company responsible for moving more data than anybody in the world would have come together many years ago.  So why now?  What’s different?

Virtualization changes everything.  Customers have moved beyond virtualizing their infrastructure and operating systems and are now trying to apply the same principles to their data.  Whether we’re moving the data to the processing or the processing to the data, it’s clear that where data physically lives has become increasingly irrelevant.  Customers want data as a service, and they don’t want to be hung up on the artificial boundaries created by applications, databases, schemas, or physical devices. (more…)


Dear CEO: Information As A Differentiator

I enjoyed reading Jill Dyché’s recent blog about a BI team’s letter to the CEO of one of her pharmaceutical clients. According to Jill, it “outlined how much money the company could save by pushing out accurate physician spend figures; made the case for integrating R&D data; and outlined the strategic initiatives that would be BI-enabled. It was also specific about new resource needs, technology upgrade costs, and why they were part of a larger vision for an information-driven enterprise.” Leaning on that letter, her client led a meeting with the CEO, the CIO, and the VP of Sales and Marketing, and succeeded in getting a renewed commitment from the executive team, including a 30% budget increase.

The notion that information is truly the differentiator is permeating the executive ranks. Of course your applications and infrastructure must run smoothly with optimized processes, but many organizations are already at parity there. (more…)


The Importance of Data for State Spend Regulation

With so many state-level governments adopting legislation limiting or mandating disclosure of payments to physicians, spend compliance is now top of mind for many pharmaceutical companies. Keeping pace with the varied and often conflicting requirements across states has always been difficult. And as momentum for wide-ranging healthcare reform increases, talk among stakeholders in Washington and around the industry has heated up around creating standards for complying with and enforcing these types of requirements.

Predictably, the idea of national standards for physician spend regulation has both supporters and detractors. Supporters point to the effort required to stay abreast of ever-changing and varied regulation, and the IT cost to support reporting requirements. Detractors decry national regulation as a political power grab rather than a tool for effective social policy, and complain that the general public does not understand the cost of general research and development or the limited profit windows posed by short patent life.

These arguments came to a head last month as Massachusetts introduced the broadest physician-spend law yet. What’s new and different about the Commonwealth’s law? For starters, it’s the first legislation in the U.S. applying to medical device manufacturers as well as pharmaceutical companies. Secondly, the law seems to have teeth that have been missing in the laws of other states, providing for a $5,000 per-incident fine. Previously, the largest public settlement on record was only $10,000 total.

Given the degree of uncertainty about the future of physician-spend legislation, the only certain course of action is to build a reliable, integrated source of physician data that can easily cross-reference to various AP, expense reporting, ERP, and CTMS systems. While reporting requirements will continue to morph, putting a reliable data foundation in place will allow you to respond rapidly to these changes as they occur. There are certain ground rules to follow, however. A strong data foundation must plug equally well into your BI environment, where the bulk of regulatory reporting will likely occur, as it does into the operational systems you use to alert personnel to spend limits.
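To illustrate the cross-reference idea, here is a minimal sketch assuming hypothetical source records, the physician’s NPI as the join key, and an invented $1,000 disclosure threshold; the point is simply that once every system’s local identifier links to one master identity, spend can be aggregated and checked per physician:

```python
# Minimal sketch of a physician cross-reference: link records from AP,
# ERP, and CTMS systems to one master identity so spend can be summed
# per physician. Records, NPI join key, and threshold are hypothetical.
from collections import defaultdict

source_records = [
    {"system": "AP",   "local_id": "V-1009", "npi": "1234567893", "spend": 1200.00},
    {"system": "ERP",  "local_id": "88321",  "npi": "1234567893", "spend": 450.00},
    {"system": "CTMS", "local_id": "INV-77", "npi": "1234567893", "spend": 300.00},
    {"system": "AP",   "local_id": "V-2200", "npi": "1982736450", "spend": 4800.00},
]

# Master cross-reference: one golden identity per NPI, with per-system links.
xref = defaultdict(list)
spend_by_physician = defaultdict(float)
for rec in source_records:
    xref[rec["npi"]].append((rec["system"], rec["local_id"]))
    spend_by_physician[rec["npi"]] += rec["spend"]

for npi, total in spend_by_physician.items():
    flag = "  <-- exceeds illustrative $1,000 threshold" if total > 1000 else ""
    print(f"NPI {npi}: ${total:,.2f} across {xref[npi]}{flag}")
```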

How do you go about creating such a strong data foundation? It might not be quite as difficult as it sounds. Read on to discover how one pharmaceutical company is using master data management to get a handle on its physician spend management.
