Tag Archives: Business Case
In my recent white paper, “Holistic Data Governance: A Framework for Competitive Advantage”, I aspirationally state that data governance should be managed as a self-sustaining business function, no different from Finance. With this in mind, last year I chased down Earl Fry, Informatica’s Chief Financial Officer, and asked him how his team helps our company prioritize investments and resources. Earl suggested I speak with the head of our enterprise risk management group … and I left inspired! I was shown a portfolio management-style approach to prioritizing risk management investment. It used an easy-to-understand, executive-friendly “heat map” dashboard that aggregates and summarizes the multiple dimensions used to model risk. I asked myself: if an extremely mature and universally relevant business function like Finance manages its business this way, can’t the emerging discipline of data governance learn from it? Here’s what I’ve developed… (more…)
In this video, Rob Karel, vice president of product strategy, Informatica, outlines the Informatica Data Governance Framework, highlighting the 10 facets that organizations need to focus on for an effective data governance initiative:
- Vision and Business Case to deliver business value
- Tools and Architecture to support the architectural scope of data governance
- Policies that make up the data governance function (security, archiving, etc.)
- Measurement: gauging both the level of influence of a data governance initiative and its effectiveness (business value and ROI metrics, such as increasing revenue, improving operational efficiency, reducing risk, reducing cost, or improving customer satisfaction)
- Change Management: incentives for the workforce, partners, and customers to get better-quality data in, and potential repercussions if data quality is poor
- Organizational Alignment: how the organization will work together across silos
- Dependent Processes: identifying data lifecycles (capturing, reporting, purchasing, and updating data in your environment), all processes consuming the data, and the processes to store and manage the data
- Program Management: effective program management skills to build out communication strategy, measurement strategy and a focal point to escalate issues to senior management when necessary
- Defined Processes that make up the data governance function (discovery, definition, application, and measuring and monitoring).
For more information from Rob Karel on the Informatica Data Governance Framework, visit his Perspectives blogs.
So goes the line in the 1999 Oliver Stone film, Any Given Sunday. In the film, Al Pacino plays Tony D’Amato, a “been there, done that” football coach who, faced with a new set of challenges, has to re-evaluate the tried-and-true assumptions he had built up over his career. In an attempt to rally his troops, D’Amato delivers a wonderful stump speech challenging them to look for ways to move the ball forward, treating every inch of the field as something sacred and encouraging them to think differently about how to do so.
Ever wondered if an initiative is worth the effort? Ever wondered how to quantify its worth? This is a loaded question, as you may suspect, but I wanted to ask it nevertheless, as my team of Global Industry Consultants works with clients around the world to do just that (a.k.a. a Business Value Assessment, or BVA) for solutions anchored around Informatica’s products.
As these solutions typically involve multiple core business processes stretching over multiple departments and leveraging a legion of technology components like ETL, metadata management, business glossary, BPM, data virtualization, legacy ERP, CRM, and billing systems, they initially present a daunting level of complexity. Opening this can of worms may end in measurement fatigue (I think I just discovered a new medical malaise). (more…)
Finally, you need to create a business case and present the findings of the data quality checkup. There are two levels of presentation that typically take place after the data quality assessment. The first is a technical presentation to IT giving all the details of the completeness, conformity, consistency, accuracy, duplication, and integrity characteristics of the data. IT needs to understand the types of issues in order to figure out what needs to be repaired and have an idea of what can be fixed and what it might cost.
The more important presentation covers the impact these issues are having on the business. Does the lack of accuracy in the data affect the accuracy of business decisions? How does the completeness of the data affect insurance ratings, loan applications, or well drilling decisions? Are your customers committing a crime? (more…)
Of my recent series of papers on the value of data quality improvement, the first focused on the economic or financial aspects of data quality improvement. One of the goals of the paper was to show that if you iteratively drill down along the different economic value dimensions to look at the uses of information that contribute to organizational success, you can establish a link between data failures and business or operational process success. For example, when looking at cost reduction as the high-level value dimension, we see that when attempting to reduce the spend on particular products through better negotiations with vendors, duplicate product entries in the supplier catalog reduced the ability to accurately review the cost of each item, as well as of classes of items. This inconsistency impacted the ability to achieve the cost reductions.
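The supplier-catalog example above can be made concrete. In this sketch (hypothetical spend lines and catalog IDs, invented for illustration), the same product booked under two catalog IDs fragments the spend total, understating the true volume a negotiator could leverage:

```python
from collections import defaultdict

# Hypothetical supplier spend lines; "WIDGET-01" and "WDGT-1" are the same
# product entered twice in the catalog under different IDs.
SPEND = [
    ("WIDGET-01", 40_000),
    ("WDGT-1",    35_000),
    ("GADGET-02", 50_000),
]

# Canonical-ID mapping produced by a matching/dedup step (assumed resolved).
CANONICAL = {"WIDGET-01": "WIDGET-01", "WDGT-1": "WIDGET-01",
             "GADGET-02": "GADGET-02"}

def spend_by_product(lines, id_map=None):
    """Total spend per product, optionally collapsing duplicate IDs."""
    totals = defaultdict(int)
    for product_id, amount in lines:
        key = id_map[product_id] if id_map else product_id
        totals[key] += amount
    return dict(totals)

raw = spend_by_product(SPEND)               # duplicates fragment the total
clean = spend_by_product(SPEND, CANONICAL)  # true spend per product

print("raw view:  ", raw)    # widget spend looks like two small line items
print("clean view:", clean)  # widget spend is actually 75k combined
```

In the raw view the widget looks like two modest vendors' worth of spend; only the deduplicated view reveals the combined volume that supports a better negotiating position.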
We launched a coast-to-coast Customer Data Forum road show with visits to Atlanta and Washington, D.C., that attracted business and IT professionals interested in using master data management (MDM) to attract and retain customers.
From the business side, our guests consisted of analysts, sales operations personnel, and business liaisons to IT, while the IT side was represented by enterprise and data architects, IT directors, and business intelligence and data warehousing professionals. In Washington, about half the audience was from the public sector and government agencies. (more…)
Building a business case for data quality is a waste of time. Nobody really cares. Improving data quality for quality’s sake is a waste of money. That sounds funny coming from a data quality specialist, someone who has spent the last decade preaching data profiling and data quality. But the fact is, people on the business side do not care about data quality. What they care about is the impact poor data quality has on their line of business.
When you look at how the business measures itself (after you get past revenue and profit), the talk is about key performance indicators (KPIs). What are some of the KPIs for a call center? You will hear about goals of reducing talk time. The business wants to lower costs. You will hear about goals of decreasing hold times. The business wants to improve the customer experience. (more…)
Last week I wrote about a telecommunications company (Avaya) that achieved a 2,000% ROI through its Data Quality COE. It achieved this payback by focusing on the income side of the business; that is, driving revenue growth and reducing operational costs by improving the quality of integrated customer and business data.
But is it possible to develop a business case for an investment that doesn’t improve profitability? The short answer is yes. The long answer (including a comprehensive business-case methodology) is explained in the ICC workshop at the Global Integration Summit, but here is a real-world example. (more…)