Tag Archives: data
Coincidentally, my company is involved with a number of different customers who are reviewing the quality criteria associated with addresses. Each scenario has a different motivation for assessing address data quality. One use case focuses on administrative management, ensuring that activities that must happen at a particular location are tied to an accurate and valid address. A different use case considers one aspect of regulatory compliance regarding protection of private information (since mail delivered to the wrong address is a potential exposure of the private information contained within the envelope). Another compliance use case looks at timely delivery of hard-copy notifications as part of a legal process, which requires the correct address. (more…)
In adapting the six-sigma technique of failure mode and effects analysis for data quality management, we hope to proactively identify the potential errors that lead to the most severe business impacts and then strengthen the processes and applications to prevent those errors from being introduced in the first place. In my last post, though, I noted that the approach to this analysis starts with the errors and then works out the impacts. I think we should go the other way, so as to optimize the effort and reduce analysis time by focusing on the most important potential failures. (more…)
Back in the good ol’ days, Santa Claus received letters and post cards from children all over the world. When telephones and faxes became commonplace, they too were used to contact Santa. In addition to those traditional methods, children today can also use the internet (email, Twitter, Facebook and even LinkedIn) to send Santa their wish lists. (more…)
Last time we looked at the failure mode and effects analysis technique from the six-sigma community and slightly adjusted it to be data-centric, so that it can be used to anticipate the different types of data errors that could occur and to adjust application design to prevent those errors in the first place. This approach is truly proactive: you consider the many different types of errors that could be introduced and then shore up the process in anticipation of their occurrence. (more…)
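As an illustration of the data-centric FMEA idea described above, failure modes are commonly ranked by a risk priority number (RPN = severity × occurrence × detection, each scored 1–10), so that the errors with the worst likely business impact get attention first. The failure modes and scores below are hypothetical examples, not taken from any actual assessment:

```python
# Minimal sketch of a data-centric FMEA worksheet.
# Failure modes and their 1-10 scores are hypothetical examples.

def risk_priority(severity, occurrence, detection):
    """RPN = severity x occurrence x detection (each scored 1-10)."""
    return severity * occurrence * detection

failure_modes = [
    # (description, severity, occurrence, detection)
    ("Missing postal code on customer address", 7, 6, 3),
    ("Transposed digits in account number",     9, 2, 8),
    ("Duplicate customer record created",       5, 7, 4),
]

# Rank the potential data errors by RPN so remediation effort
# focuses on the highest-risk failure modes first.
ranked = sorted(
    ((desc, risk_priority(s, o, d)) for desc, s, o, d in failure_modes),
    key=lambda pair: pair[1],
    reverse=True,
)

for desc, rpn in ranked:
    print(f"{rpn:4d}  {desc}")
```

Working from the impact side first, as suggested, would mean fixing the severity scores from the business processes affected and only then enumerating the error types that could produce them.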
In the second of two videos, Peter Ku, Director of Financial Services Solutions Marketing, Informatica, talks about the latest trends regarding counterparty data and how the legal entity identifier (LEI) system will impact banks across the globe.
Specifically, he answers the following questions:
- What are the latest trends regarding counterparty information and how will the legal entity identifier system impact banks across the globe?
- How does Informatica help solve the challenges regarding counterparty information and help banks prepare for the new legal entity identifier system?
Also watch Peter’s first video (http://youtu.be/KvyDPzOTnUY) to learn about counterparty data and its challenges.
There has been much discussion, particularly in the UK, about banks separating their investment and retail arms. The thinking behind this is that investment banking is much riskier, so by drawing a clear line between the two, consumers will be better protected if another financial crisis should hit. (more…)
eHarmony, an online dating service, uses Hadoop processing and the Hive data warehouse for analytics to match singles based on each individual’s “29 Dimensions® of Compatibility”, per a June 2011 press release by eHarmony and one of its suppliers, SeaMicro. According to eHarmony, an average of 542 eHarmony members marry daily in the United States. (more…)
“We have 20% duplicates in our data source.” This is how the conversation began. It was not that no one cared about the level of duplicates; it’s just that the topic of duplicate records did not get the business excited. They had many other priorities (and they were not building a single view of the customer).
The customer continued the discussion thread on how to make data quality relevant to each functional leader reporting to C-level executives. The starting point was the affirmation that the business really only cares about data quality when it impacts the processes they own, e.g. the order, invoicing, shipping, credit, lead generation, and compliance reporting processes. This means that data quality results need to be linked to the tangible goals of each business process owner in order to win them over as data advocates. (more…)
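A duplicate rate like the 20% figure quoted above has to be measured somehow. One illustrative sketch, using a deliberately crude normalized match key (the records, names, and matching rule here are hypothetical; real matching would use much fuzzier rules):

```python
# Hypothetical sketch: estimating a duplicate rate by grouping records
# on a normalized match key (name + postal code). This only illustrates
# the measurement, not production-grade matching.
from collections import Counter

records = [
    {"name": "Acme Corp",   "postal": "02110"},
    {"name": "ACME Corp.",  "postal": "02110"},  # duplicate of the first
    {"name": "Globex",      "postal": "60601"},
    {"name": "Initech",     "postal": "78701"},
    {"name": "Initech Inc", "postal": "78701"},  # missed by this simple key
]

def match_key(rec):
    """Normalize name and postal code into a crude match key."""
    name = rec["name"].lower().rstrip(".").replace(",", "").strip()
    return (name, rec["postal"])

counts = Counter(match_key(r) for r in records)
duplicates = sum(n - 1 for n in counts.values() if n > 1)
duplicate_rate = duplicates / len(records)
print(f"duplicate rate: {duplicate_rate:.0%}")
```

The point of the post stands either way: the number only becomes persuasive once it is translated into the impact on a process a business owner cares about (mis-shipped orders, duplicate invoices, inflated lead counts).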
Last month, Informatica and EMC announced a strategic partnership at EMC’s annual user conference in Boston. This is a significant new relationship for both companies, which in itself is interesting. You would have thought that the company responsible for storing more data than just about anybody in the world and the company responsible for moving more data than anybody in the world would have come together many years ago. So why now? What’s different?
Virtualization changes everything. Customers have moved beyond virtualizing their infrastructure and their operating systems and are now trying to apply the same principles to their data. Whether we’re moving the data to the processing, or the processing to the data, it’s clear where data physically lives has become increasingly irrelevant. Customers want data as a service and they don’t want to be hung up on the artificial boundaries created by applications, databases, schemas, or physical devices. (more…)
I enjoyed reading Jill Dyché’s recent blog on a BI team’s letter to the CEO of one of her pharmaceutical clients. According to Jill, it “outlined how much money the company could save by pushing out accurate physician spend figures; made the case for integrating R&D data; and outlined the strategic initiatives that would be BI-enabled. It was also specific about new resource needs, technology upgrade costs, and why they were part of a larger vision for an information-driven enterprise.” Leaning on the letter, her client led a meeting with the CEO, the CIO, and the VP of Sales and Marketing, and succeeded in getting a renewed commitment from the executive team, including a 30% budget increase.
The notion that information truly is the differentiator is permeating the executive ranks. Of course your applications and infrastructure must run smoothly with optimized processes, but many organizations are already at parity there. (more…)