Category Archives: Scorecarding
When I talk to customers about dealing with poor data quality, I consistently hear something like, “We know we have data quality problems, but we can’t get the business to take ownership and do something about it.” I think this is taking the easy way out. Throwing your hands up in the air doesn’t make change happen – it only prolongs the pain. If you want to effect a positive change in data quality and are looking for ways to engage the business, then you should join Barbara Latulippe, Director of Enterprise Information Management for EMC, and Kristen Kokie, VP IT Enterprise Strategic Services for Informatica, for our webinar on Thursday, October 24th, to hear how they have dealt with data quality in their combined 40+ years in IT.
Now, understandably, tackling data quality problems is no small undertaking, and it isn’t easy. In many instances, the reason organizations choose to do nothing about data quality is that bad data has been present for so long that manual workarounds have become ingrained in the business processes that consume the data. In these cases, changing the way people do things becomes the largest obstacle to dealing with the root cause of the issues. But that is also where you will find the costs associated with bad data: lost productivity, ineffective decision making, missed opportunities, etc.
As discussed in this previous webinar (link to the replay at the bottom of the page), successfully dealing with poor data quality takes initiative, and it takes communication. IT departments are the engineers of the business: they are the ones who understand processes and workflows; they are the ones who build the integration paths between applications and systems. Even if they don’t own the data, they do end up owning the data-driven business processes that consume it. As such, IT is uniquely positioned to provide customized suggestions based on insight from many previous interactions with the data.
Bring facts to the table when talking to the business. As the people who interact with data directly every day, IT is in a position to measure and monitor data quality and to identify key data quality metrics; data quality scorecards and dashboards can shine a light on bad data and relate it directly to the business via the downstream workflows and business processes it feeds. Armed with hard facts about the impact on specific business processes, a business user has an easier time putting a dollar value on that bad data. Here are some helpful resources where you can start to build your case for improved data quality. With these tools and insights, IT can start to effect change.
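To make that concrete, here is a minimal sketch of the kind of metric a data quality scorecard might track. The record set, field names, validity pattern, and metric choices are all hypothetical illustrations, not a description of any particular product.

```python
# Minimal sketch of scorecard-style data quality metrics.
# Records, field names, and the email pattern are hypothetical.
import re

records = [
    {"vendor_id": "V001", "email": "ap@acme.com", "unit_price": 12.50},
    {"vendor_id": "V002", "email": None,          "unit_price": 9.99},
    {"vendor_id": "V003", "email": "bad-address", "unit_price": None},
]

def completeness(rows, field):
    """Share of rows where the field is populated."""
    return sum(1 for r in rows if r[field] is not None) / len(rows)

def validity(rows, field, pattern):
    """Share of populated values matching an expected format."""
    populated = [r[field] for r in rows if r[field] is not None]
    if not populated:
        return 0.0
    return sum(1 for v in populated if re.fullmatch(pattern, v)) / len(populated)

scorecard = {
    "email completeness": completeness(records, "email"),
    "email validity": validity(records, "email", r"[^@\s]+@[^@\s]+\.[^@\s]+"),
    "unit_price completeness": completeness(records, "unit_price"),
}
for metric, score in scorecard.items():
    print(f"{metric}: {score:.0%}")
```

Even a handful of metrics like these, trended over time and tied to the processes that consume each field, gives the business something concrete to react to.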
Data is becoming the lifeblood of organizations, and IT organizations have a huge opportunity to get closer to the business by really knowing the data of the business. While data quality invariably involves technological intervention, it is even more a process and change-management issue, and one that ends up being critical to success. The easier it is to tie bad data to specific business processes, the more constructive the conversation with the business can be.
Ever wondered if an initiative is worth the effort? Ever wondered how to quantify its worth? This is a loaded question, as you may suspect, but I wanted to ask it nevertheless, because my team of Global Industry Consultants works with clients around the world to do just that (a Business Value Assessment, or BVA) for solutions anchored around Informatica’s products.
As these solutions typically involve multiple core business processes stretching over multiple departments and leveraging a legion of technology components like ETL, metadata management, business glossary, BPM, data virtualization, and legacy ERP, CRM, and billing systems, it initially sounds like a daunting level of complexity. Opening this can of worms may end in measurement fatigue (I think I just discovered a new medical malaise). (more…)
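At its simplest, the arithmetic behind a value assessment is just estimated annual benefit versus annual cost. The sketch below shows one way that back-of-the-envelope math might look; every figure and cost category here is a hypothetical illustration agreed with the business, not a BVA methodology.

```python
# Back-of-the-envelope sketch of quantifying an initiative's worth.
# All figures and categories are hypothetical illustrations.

# Annual cost drivers attributed to bad data, estimated with the business.
cost_drivers = {
    "rework hours":       4_000 * 45.0,       # hours/yr * loaded hourly rate
    "duplicate shipments":  250 * 120.0,      # incidents/yr * cost per incident
    "missed cross-sell": 1_500_000 * 0.02,    # pipeline * conversion uplift
}

annual_benefit = sum(cost_drivers.values())
annual_cost = 150_000  # licenses, infrastructure, staffing (hypothetical)

roi = (annual_benefit - annual_cost) / annual_cost
payback_months = 12 * annual_cost / annual_benefit

print(f"Annual benefit: ${annual_benefit:,.0f}")
print(f"ROI: {roi:.0%}, payback: {payback_months:.1f} months")
```

Keeping the model to a handful of well-understood cost drivers, rather than measuring everything, is one practical way to avoid the measurement fatigue described above.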
Following up on my previous post on 2011 reflections, it’s now time to take a look at the year ahead and consider what key trends will likely impact the world of data quality as we know it. As I mentioned in that post, we saw continued interest in data quality across all industries, and I expect that trend to pick up even more steam in 2012. Here are three areas in particular that I foresee rising to the surface: (more…)
I recently had the opportunity to meet with the board of directors of a large distribution company here in the U.S. On the table for discussion were data quality and data governance, and how a focus on both could help the organization gain a competitive advantage in the market. While I was happy to see that this company had tied data quality and data governance to its corporate objectives, that’s not what caught my attention. Instead, what impressed me most was how the data quality and data governance champion had effectively helped the rest of the board see that there WAS a direct link, and that with careful focus they could drive better business outcomes than they could without any focus on data at all. As it turns out, the path to success for the champion was articulating the link between trusted data — governed effectively — and the company’s ability to excel financially, manage costs, limit its risk exposure, and maintain trust with its customers. (more…)
Gartner recently released their 2011 Magic Quadrant for Data Quality Tools and I’m happy to announce that Informatica is positioned in the Leaders’ quadrant. We believe our position is a testament to the fact that customers like Station Casinos and U.S. Xpress continue to turn to Informatica to solve their most critical data quality challenges.
The publication of the Magic Quadrant is often a great opportunity to reflect on the state of the data quality market. It should come as no surprise that data quality as a business imperative isn’t going away any time soon. We continue to see customers looking for help and expertise in solving a wide range of data quality problems, largely associated with data governance initiatives, master data management (MDM), business intelligence, and application modernization. And the association of data quality with these areas is only getting stronger. (more…)
One of the key themes of the Informatica 9.1 release is Authoritative and Trustworthy data. To set the stage, consider the imperatives that organizations are pursuing, such as becoming more customer-centric to drive top-line revenue, optimizing just-in-time procurement to drive costs out of the business, or complying with new Dodd-Frank regulations. Not only do all of these imperatives require organizations to deliver business value faster and faster, but they span the organization across Lines of Business and geographies. In particular, they require reliable global business processes and analytics to succeed, and that means they need trusted data. For example, Procure-to-Pay processes and decisions rely on product, vendor, and financial data, and that data has to be trusted: it’s essential to have consistent, correct, and complete vendor price and performance data in order to determine preferred vendors and negotiate better contracts. (more…)
In the U.S., and perhaps around the world, the events of a couple of weeks ago are truly memorable. As is often the case when significant politically oriented shifts occur, discussion of public opinion toward those in charge rises to the surface. The past couple of weeks, of course, are no exception. In scanning the various media outlets, I came across an interesting read regarding the impact of such events on presidential public opinion. Not surprisingly, history shows that immediately after a significant positive event, public opinion ratings for the “Commander in Chief” surge. This is due in large part to the general public essentially offering a reward, a pat on the back for a job well done. People, for the most part, lose sight of whatever kept them from giving a glowing report card in the first place, change their tune, and come to believe that perhaps they were wrong all along. Over time, however, this point of view doesn’t last. Again, as history shows us, a short time after the bump in favorable public opinion, the polls slowly drop back to the levels they were at before the significant event occurred. Essentially, a one-off win isn’t enough to sustain a long-term improvement in public opinion, so more must be done.
We launched a coast-to-coast Customer Data Forum road show with visits to Atlanta and Washington, D.C., that attracted business and IT professionals interested in using master data management (MDM) to attract and retain customers.
From the business side, our guests consisted of analysts, sales operations personnel, and business liaisons to IT, while the IT side was represented by enterprise and data architects, IT directors, and business intelligence and data warehousing professionals. In Washington, about half the audience was from the public sector and government agencies. (more…)