Category Archives: Scorecarding
Ever wondered if an initiative is worth the effort? Ever wondered how to quantify its worth? These are loaded questions, as you may suspect, but I wanted to ask them nevertheless, because my team of Global Industry Consultants works with clients around the world to do just that (a practice known as Business Value Assessment, or BVA) for solutions anchored around Informatica’s products.
As these solutions typically involve multiple core business processes stretching across multiple departments, and a legion of technology components such as ETL, metadata management, business glossaries, BPM, data virtualization, and legacy ERP, CRM, and billing systems, the complexity can initially seem daunting. Opening this can of worms may end in measurement fatigue (I think I just discovered a new medical malaise). (more…)
Following up on my previous post reflecting on 2011, it’s now time to take a look at the year ahead and consider which key trends will likely impact the world of data quality as we know it. As I mentioned in that post, we saw continued interest in data quality across all industries, and I expect that trend to pick up even more steam in 2012. Here are three areas in particular that I foresee rising to the surface: (more…)
I recently had the opportunity to meet with the board of directors of a large distribution company here in the U.S. On the table for discussion were data quality and data governance, and how a focus on both could help the organization gain competitive advantage in the market. While I was happy to see that this company had tied data quality and data governance to its corporate objectives, that’s not what caught my attention. What impressed me most was how the data quality and data governance champion had effectively helped the rest of the board see that there WAS a direct link, and that with careful focus they could drive better business outcomes than they could without any focus on data at all. As it turns out, the champion’s path to success was to articulate the link between trusted data, governed effectively, and the company’s ability to excel financially, manage costs, limit its risk exposure, and maintain the trust of its customers. (more…)
Gartner recently released their 2011 Magic Quadrant for Data Quality Tools and I’m happy to announce that Informatica is positioned in the Leaders’ quadrant. We believe our position is a testament to the fact that customers like Station Casinos and U.S. Xpress continue to turn to Informatica to solve their most critical data quality challenges.
The publication of the Magic Quadrant is often a great opportunity to reflect on the state of the data quality market. It should come as no surprise that data quality as a business imperative isn’t going away any time soon. We continue to see customers looking for help and expertise in solving a wide range of data quality problems, largely associated with data governance initiatives, master data management (MDM), business intelligence and application modernization. And the association of data quality with these areas is only getting stronger. (more…)
One of the key themes of the Informatica 9.1 release is Authoritative and Trustworthy data. To set the stage, consider the imperatives that organizations are pursuing, such as becoming more customer-centric to drive top-line revenue, optimizing just-in-time procurement to drive costs out of the business, or complying with the new Dodd-Frank regulations. Not only do all of these imperatives require organizations to deliver business value faster and faster, but they span the organization across lines of business and geographies. In particular, they require reliable global business processes and analytics to succeed, and that means they need trusted data. For example, Procure-to-Pay processes and decisions rely on product, vendor, and financial data, and that data has to be trusted: it is essential to have consistent, correct, and complete vendor price and performance data in order to determine preferred vendors and negotiate better contracts. (more…)
In the U.S., and perhaps around the world, the events of a couple of weeks ago are truly memorable. As is often the case when significant political shifts occur, discussion of public opinion toward those in charge rises to the surface, and the past couple of weeks are no exception. In scanning the various media outlets, I came across an interesting read on the impact of such events on presidential public opinion. Not surprisingly, history shows that immediately after a significant positive event, public opinion ratings for the “Commander in Chief” surge, due in large part to the general public offering a reward, a pat on the back for a job well done. People, for the most part, lose sight of whatever kept them from giving a glowing report card in the first place, change their tune, and come to believe that perhaps they were in fact wrong. Over time, however, this point of view doesn’t last. As history shows us, a short while after the bump in favorable public opinion, the polls slowly drop back to the levels they were at before the significant event occurred. Essentially, a one-off win isn’t enough to sustain a long-term improvement in public opinion, so more must be done.
We launched a coast-to-coast Customer Data Forum road show, with stops in Atlanta and Washington, D.C., that attracted business and IT professionals interested in using master data management (MDM) to attract and retain customers.
On the business side, our guests included analysts, sales operations personnel, and business liaisons to IT, while the IT side was represented by enterprise and data architects, IT directors, and business intelligence and data warehousing professionals. In Washington, about half the audience came from the public sector and government agencies. (more…)
The last set of blog entries looked at the value of maintaining high-quality data to support customer retention activities and the processes tied to life cycle events. To pull my thoughts together: we have looked at the business expectations for retention, the business processes that center on life cycle events, and the impacts of data issues. The next step is to consider the underlying data requirements for keeping that information usable and useful. I believe we can roll this into two main sets of requirements:
- Managing high-quality master data associated with the customer, key aspects of the customer’s life, and the life cycle of that customer’s purchased products and services; and
- Overseeing the observance of expectations associated with the following dimensions of data quality: completeness, accuracy, currency, and timeliness (a minimal sketch of how such checks might be scored follows this list).
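To make those dimensions concrete, here is a minimal sketch of how per-record dimension scores might be computed. It is written in Python purely for illustration; the record layout, field names, and thresholds are all hypothetical, and this is not how Informatica’s products implement these checks:

```python
from datetime import date, datetime, timezone

# Hypothetical customer master record; field names are illustrative only.
record = {
    "customer_id": "C-1001",
    "email": "pat@example.com",
    "postal_code": "02139",
    "birth_date": date(1980, 6, 15),
    "last_verified": datetime(2011, 9, 1, tzinfo=timezone.utc),
}

REQUIRED_FIELDS = ("customer_id", "email", "postal_code", "birth_date")
MAX_STALENESS_DAYS = 180  # illustrative currency threshold

def completeness(rec):
    """Completeness: share of required fields that are populated."""
    filled = sum(1 for f in REQUIRED_FIELDS if rec.get(f) not in (None, ""))
    return filled / len(REQUIRED_FIELDS)

def accuracy(rec):
    """Accuracy (proxied here by a simple validity rule): does the
    postal code look like a five-digit U.S. ZIP code?"""
    pc = rec.get("postal_code") or ""
    return 1.0 if pc.isdigit() and len(pc) == 5 else 0.0

def currency(rec, now):
    """Currency: has the record been verified recently enough to be
    trusted for life cycle triggers?"""
    age_days = (now - rec["last_verified"]).days
    return 1.0 if age_days <= MAX_STALENESS_DAYS else 0.0

# Timeliness (was the data delivered when the process needed it?) is
# usually measured against delivery SLAs rather than per record, so it
# is omitted from this per-record sketch.
as_of = datetime(2011, 11, 1, tzinfo=timezone.utc)
scores = {
    "completeness": completeness(record),
    "accuracy": accuracy(record),
    "currency": currency(record, as_of),
}
print(scores)  # {'completeness': 1.0, 'accuracy': 1.0, 'currency': 1.0}
```

Scores like these, aggregated across records and tracked over time, are what a data quality scorecard reports, and a drop in any one dimension is an early warning that life cycle triggers may begin firing on bad data.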
By preventing flaws in master customer and master product data, the appropriate life cycle triggers will fire at the right time, leading to new business opportunities and extended customer lifetimes. For more on this topic, read my newest white paper, entitled Increasing Confidence and Satisfaction Through Improved Data Quality.