Category Archives: Data Governance
Data warehouses tend to grow very quickly because they integrate data from multiple sources and maintain years of historical data for analytics. A number of our customers have data warehouses in the hundreds of terabytes to petabytes range. Managing such a large amount of data becomes a challenge. How do you curb runaway costs in such an environment? Completing maintenance tasks within the prescribed window and ensuring acceptable performance are also big challenges.
We have provided best practices to archive aged data from data warehouses. Archiving data will keep the production data size at almost a constant level, reducing infrastructure and maintenance costs, while keeping performance up. At the same time, you can still access the archived data directly if you really need to from any reporting tool. Yet many are loath to move data out of their production system. This year, at Informatica World, we’re going to discuss another method of managing data growth without moving data out of the production data warehouse. I’m not going to tell you what this new method is, yet. You’ll have to come and learn more about it at my breakout session at Informatica World: What’s New from Informatica to Improve Data Warehouse Performance and Lower Costs.
I look forward to seeing all of you at Aria, Las Vegas next month. Also, I am especially excited to see our ILM customers at our second Product Advisory Council again this year.
Many software vendors, analysts and journalists overuse the term “Data Governance” in today’s complex business and IT environments. However, it has become one of the primary goals and drivers for data-related IT projects, while at the same time being one of the most difficult to define, measure and quantify. What real meaning can we give to the concept of Data Governance? What are its importance, impact and meaning for the enterprise?
To try to restore some meaning and context to Data Governance, let’s go back to the semantics through an analogy that everyone can understand, and that is insightful down to the smallest detail.
Welcome to Data Land… If data are its citizens, the governance of such a country would aim at ensuring that these data co-exist peacefully, stay healthy, enrich themselves, do not live on top of each other, do not destroy each other in case of conflict, and, most importantly, work together every year to improve the GDP of Data Land. This means creating value through the use and action of everyone. Of course, bioethics laws would prevent the cloning or duplication of its inhabitants… Data governance would then define itself as a framework that intends to ensure the efficient management of the data in the enterprise. Putting data under governance prevents its chaotic generation and use.
In Data Land, governance implies:
A territory to govern
The scope of influence of governance must be clearly defined, the border of its country clearly delimited.
What type of data are we talking about? The question of the perimeter is not a trivial one, and its impact on the projects and tools to be implemented is significant. Master data, so critical that it commands a particular investment for its management, forms a first consistent set, and its governance leads to MDM projects.
What about transaction data or social interaction data? More and more popular, and full of intelligence for the enterprise, they do not fit into the usual referential bucket, but they can benefit from data quality initiatives, with their own specific concerns (volume and volatility, for instance). Also, Data Land is not free from globalization. Though it is important to establish borders for security reasons, “common market” initiatives with neighboring countries (partners, data vendors, data pools) are increasing and aim at surpassing the scope of the traditional enterprise, in favor of the “exterprise”.
Any Data Land (for instance, the master data one) must have a leader, a sponsor who conveys a vision and ensures the alignment of all the members of his government who, as in real life, may be tempted to handle the data they govern in an autonomous or selfish way. This executive sponsorship is an important success factor for projects related to data governance. Its absence, the source of the familiar governmental gridlock, leads to systematic failure. The governor is often an executive (CIO, COO, CEO), powerful and respected enough to impose a choice in the event of deadlock.
A government and its supervisory body:
Every country needs a team that defines the details of the strategy and the laws to put in place to ensure it functions correctly. Data Land requires nothing less. The organizational changes and the setup of dedicated enterprise-wide teams are among the most visible outcomes of data governance projects. The Data Governance Council is tasked with defining the rules governing the data, i.e. the law. The Data Stewards ensure compliance with the law and take corrective action where it is not followed. For the initiative to succeed, and just like national governments, they should theoretically be independent of particular interests and business lobbies. They do, however, need to have an intimate knowledge of the data and its use in the enterprise processes. This is why they often come from the “civil society”, meaning they were members of the business teams before, with a mission of surpassing their previous assignment for the greater good.
Laws and institutional processes
The first objective of the abovementioned government is to establish the governance scheme, the set of rules that govern best practices for creating, using, modifying and removing data. These laws are of multiple types: the ones that establish property titles (data owners), easement rights (data consumers) and security rules (data custodians), and the ones that define the boundaries and restrictions or, more positively, the data standards. These rules will be enforced, and the data controlled, by the data stewards. As in civil society, efficient management of the data involves orderly empowerment of the actors (prevention) as well as systematic control (repression). The enforcement of the law and its corrective aspect may be supported by processes orchestrating multiple users, according to the scheme defined by the Governance Council.
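The “laws” described above can be pictured as a simple registry of data assets, each with an owner, custodians, consumers and standards. The sketch below is purely illustrative: every name, role and field in it is a hypothetical example, not a real product API or a prescribed model.

```python
# A minimal sketch of Data Land's "laws" as a rule registry.
# All names and roles here are illustrative assumptions, not a real API.

from dataclasses import dataclass, field

@dataclass
class DataAsset:
    name: str
    owner: str                                       # property title: accountable owner
    custodians: list = field(default_factory=list)   # enforce security rules
    consumers: list = field(default_factory=list)    # easement rights: who may read
    standards: dict = field(default_factory=dict)    # data standards per attribute

def can_read(asset: DataAsset, user: str) -> bool:
    """Consumers and the owner may read; everyone else is denied by default."""
    return user == asset.owner or user in asset.consumers

customer_master = DataAsset(
    name="customer_master",
    owner="head_of_sales",
    custodians=["dba_team"],
    consumers=["marketing", "finance"],
    standards={"country_code": "ISO 3166-1 alpha-2"},
)

print(can_read(customer_master, "marketing"))    # True: declared consumer
print(can_read(customer_master, "unknown_app"))  # False: no easement right
```

Stewards would then audit violations of these declared rights, rather than policing access ad hoc.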
So what about IT tools? They are the infrastructure of Data Land: vehicles, road signs, and, even if it is less fun, speed cameras. They are there to facilitate the application of the governance scheme, to give tools to the government, to enforce order and respect for the law. They can, in some circumstances, help with the definition of the scheme itself. Data governance is an initiative taken by the enterprise for the enterprise, independently of any IT solution, which will have to adapt (if sufficiently flexible).
As with any country-based government, data governance has an ambition to manage the enterprise data landscape with perfect efficiency.
Ambitious? Surely.
Critical? Definitely.
Let’s then ensure that the way to this ideal will deliver value by itself. This is what the relevance of IT tools should be judged against.
Special thanks to David Jordan for translating the original article from French to English.
Following up on the discussion I started on GovernYourData.com (thanks to all who provided great feedback), here’s my full proposal on this topic:
We all know about the “Garbage In/Garbage Out” reality that data quality and data governance practitioners have been fighting against for decades. If you don’t trust data when it’s initially captured, how can you trust it when it’s time to consume or analyze it? But I’m also looking at the tougher problem of data degradation. The data comes into your environment just fine, but any number of actions, events – or inactions – turns that “good” data “bad”.
So far I’ve been able to hypothesize eight root causes of data degradation. I’d really love your feedback on both the validity and completeness of these categories. I’ve used similar examples across a number of these to simplify. (more…)
Last week, we hosted a webinar Realizing the Potential of Your Data with Ochsner Health System. Jonathan Stevenson, Director of Analytics, joined me for a dialogue on what they’ve learned in their early steps toward becoming an Accountable Care Organization.
We had an interactive audience asking questions. A few of those questions, with their answers, are included below: (more…)
In my previous blog I explored the importance of a firm understanding of commercial packaged applications on data quality success. In this final post, I will examine the benefits of having operational experience as a key enabler of effective data quality delivery. (more…)
Since I joined Informatica over a year ago, I’ve received a daily stream of unsolicited emails from vendors selling “marketable user email/contact list databases” for myriad software and hardware technologies, ranging from enterprise apps and business intelligence to cloud computing, networking and infrastructure. You get the idea – and I’m sure many of you experience a similar phenomenon on a daily basis.
The catalyst for writing a post about this came when I considered the relevance, transparency and quality requirements that data governance leaders strive for – and how these vendors seem to dismiss all of the above. (more…)
Integration technologies have been around for 20 years (as long as Informatica has been in business) and have proliferated in corporate IT. We are now at an inflection point in the business needs and maturity of integration best practices which we can call Next Generation Data Integration (DI). If we’re going to talk about the next generation, then first we need to put a stake in the ground to describe the current, or prior generation. Furthermore, for it to be a “generational” change, it needs to be a significant step-function improvement in how the work is done and in the business value generated by data assets. Or as Jim Collins said in Built to Last: Successful Habits of Visionary Companies, we need a Big Hairy Audacious Goal. (more…)
In my recent white paper, “Holistic Data Governance: A Framework for Competitive Advantage”, I aspirationally state that data governance should be managed as a self-sustaining business function, no different than Finance. With this in mind, last year I chased down Earl Fry, Informatica’s Chief Financial Officer, and asked him how his team helps our company prioritize investments and resources. Earl suggested I speak with the head of our enterprise risk management group … and I left inspired! I was shown a portfolio-management-style approach to prioritizing risk management investment. It used an easy-to-understand, business-executive-friendly visualization: a “heat map” dashboard that aggregates and summarizes the multiple dimensions we use to model risk. I asked myself: if an extremely mature and universally relevant business function like Finance manages its business this way, can’t the emerging discipline of data governance learn from it? Here’s what I’ve developed… (more…)
The need to be customer-centric in financial services is more important than ever as banks and insurance companies look for ways to reduce churn; those in the industry know that loyal customers spend more on higher-margin products and are likely to refer additional customers. Bankers and insurers who understand this, and get it right, are in a better position to maintain profitable, lasting customer loyalty and reap significant financial rewards. Current market conditions remain challenging and will be difficult to overcome without the right information management architecture to help companies become truly customer-centric. Here’s why:
- Customer satisfaction with retail banks has decreased for four consecutive years, with particularly low scores in customer service. In an industry survey, thirty-seven percent of customers who switched their primary banking relationship cited poor customer service as the main reason.
- The commoditization of traditional banking and insurance products has rapidly increased client attrition and decreased acquisition rates. Industry reports estimate that banks lose customers at an average rate of 12.5% per year, while average acquisition rates are 13.5%, making acquisition nearly a zero-sum game. Further, the cost of acquiring a new customer is estimated at five times the cost of retaining an existing one.
- Switching is easier than ever before. Customer churn is at an all-time high in most European countries. According to an industry survey, 42 percent of German banking customers had been with their main bank for less than a year. With customer acquisition costs running between €200 and €400, bankers and insurers need to keep their clients at least 5 to 7 years simply to break even.
- Mergers and acquisitions further increase the complexity and risk of maintaining customer relationships. According to a recent study, 17 percent of respondents who had gone through a merger or acquisition had switched at least one of their accounts to another institution after their bank was acquired, while an additional 31 percent said they were at least somewhat likely to switch over the next year.
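The break-even arithmetic in the list above can be sketched as a quick calculation. The annual margin per customer used here is an illustrative assumption, not a figure from the surveys cited; only the €200–€400 acquisition cost range comes from the text.

```python
# Break-even on customer acquisition: a hypothetical sketch.
# The EUR 55 annual margin is an illustrative assumption.

def breakeven_years(acquisition_cost: float, annual_margin: float) -> float:
    """Years of retention needed before a customer covers their acquisition cost."""
    return acquisition_cost / annual_margin

# With acquisition costs of EUR 200-400 and an assumed EUR 55 annual margin,
# break-even lands roughly in the multi-year range cited above.
low = breakeven_years(200, 55)   # ~3.6 years
high = breakeven_years(400, 55)  # ~7.3 years
print(f"Break-even: {low:.1f} to {high:.1f} years")
```

The point of the sketch is simply that with margins of this order, a client lost before year five never pays back what it cost to win them.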
Financial services professionals have long recognized the need to manage customer relationships vs. account relationships by shifting away from a product-centric culture toward a customer-centric model to maintain client loyalty and grow their bottom lines organically. Here are some reasons why:
- A 5% increase in customer retention can increase profitability by 35% in banking, 50% in brokerage, and 125% in the consumer credit card market.
- Banks can add more than $1 million to the profitability of their commercial banking business line simply by extending 16 large corporate relationships by one year, or by saving two such clients from defecting. In the insurance sector, a one percent increase in customer retention results in $1 million in revenue.
- The average company has between a 60% and 70% probability of success selling more services to a current customer, a 20% to 40% probability of selling to a former customer, and a 5% to 20% probability of making a sale to a prospect.
- Up to 66% of current users of financial institutions’ social media sites use them to receive information about financial services, 32% use them to retrieve information about offers or promotions, and 30% to conduct customer-service-related activities.
So what does it take to become more Customer-centric?
Companies with successful customer-centric business models share similar cultures: placing the customer first, people who are willing to go the extra mile, business processes designed with the customer’s needs in mind, product and marketing strategies designed to meet customers’ needs, and technology solutions that help access and deliver trusted, timely, and comprehensive information and intelligence across the business. These technologies include data integration, data quality, and master data management.
Why is data integration important? Customer centricity begins with the ability to access and integrate your data regardless of format, source system, structure, volume, or latency, from any location including the cloud and social media sites. The data the business needs originates from many different systems across and outside the organization, including new Software-as-a-Service solutions and cloud-based technologies. Traditional hand-coded methods, one-off tools, and open source data integration tools cannot scale and perform well enough to access, manage, and deliver the right data to the systems and applications on the front line. At the same time, we live in the Big Data era: increasing transaction volumes and the adoption of new channels, including mobile devices and social media, together generate petabytes of data. Supporting a capable and sustainable customer-centric business model requires technology that can handle this complexity and scale with the business, while reducing costs and improving productivity.
Data quality issues must be dealt with proactively and managed by both business and technology stakeholders. Though technology itself cannot prevent all data quality errors from happening, it is a critical part of your customer information management process, ensuring that any issues that exist are identified and dealt with expeditiously. Specifically, you need a data quality solution that can detect data quality errors in any source, allow business users to define data quality rules, support seamless consumption and execution of those rules by developers, provide dashboards and reports for business stakeholders, and monitor quality on an ongoing basis to deal with time- and business-sensitive exceptions. Data quality management can only scale and deliver value if an organization believes in and manages data as an asset. It also helps to have a data governance framework consisting of processes, policies, standards, and people from business and IT working together.
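The idea of business-defined rules executed against any source can be illustrated with a minimal sketch. The rules and field names below are hypothetical examples, not the API of any specific data quality product.

```python
# A minimal sketch of rule-based data quality checking.
# Rules and field names are hypothetical, not a specific product's API.

import re

# Business users define declarative rules; each maps a field to a predicate.
RULES = {
    "email": lambda v: bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", v or "")),
    "postcode": lambda v: bool(v and v.strip()),           # completeness check
    "age": lambda v: isinstance(v, int) and 0 < v < 120,   # validity range
}

def check_record(record: dict) -> list:
    """Return the list of rule violations for one customer record."""
    return [f for f, rule in RULES.items() if not rule(record.get(f))]

record = {"email": "jane.doe@example.com", "postcode": "", "age": 34}
print(check_record(record))  # ['postcode']
```

In a full solution the same rules would feed the dashboards and ongoing monitoring mentioned above, so stewards see exception counts per rule over time rather than one record at a time.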
Lastly, growing your business, improving wallet share, retaining profitable relationships, and lowering the cost of managing customer relationships require a single, trusted, holistic, and authoritative source of customer information. Customer information has historically been managed in applications across traditional business silos that lacked any common processes to reconcile duplicate and conflicting information across business systems. Master Data Management solutions are purpose-built to break down those traditional application and business silos and deliver a single view of the truth for all systems to benefit from. Master Data Management allows banks and insurance companies to access customer data, identify unique customer entities, relate accounts to each customer, and extend that relationship view across other customers and employees, from relationship bankers and financial advisors to agents and brokers.
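The matching step at the heart of identifying unique customer entities can be sketched with fuzzy string comparison from the standard library. This is a toy illustration, not how any particular MDM product matches records, and the 0.85 similarity threshold is an arbitrary assumption.

```python
# A minimal sketch of the matching step in master data management:
# flagging records from different silos that likely refer to one customer.
# The 0.85 similarity threshold is an illustrative assumption.

from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Case-insensitive similarity ratio between two names (0.0 to 1.0)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def find_matches(records: list, threshold: float = 0.85) -> list:
    """Pair up records whose names look like the same entity."""
    matches = []
    for i, r1 in enumerate(records):
        for r2 in records[i + 1:]:
            if similarity(r1["name"], r2["name"]) >= threshold:
                matches.append((r1["id"], r2["id"]))
    return matches

silo_records = [
    {"id": "crm-001", "name": "Jonathan Q. Smith"},
    {"id": "bank-17", "name": "Jonathan Q Smith"},
    {"id": "ins-42", "name": "Maria Lopez"},
]
print(find_matches(silo_records))  # [('crm-001', 'bank-17')]
```

Production matching would weigh many attributes (address, date of birth, tax ID) and route uncertain pairs to a steward, but the principle is the same: resolve silo records to one golden entity before relating accounts to it.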
Attracting and retaining customers is a continuous journey for the financial industry, but the need is greater than ever before. The foundation for successful customer centricity is technology that helps you access and deliver trusted, timely, consistent, and comprehensive customer information and insight across all channels, avoid the mistakes of the past, stay ahead of your competition, and maximize value for your shareholders.
Sources:
- 2010 UK Retail Banking Satisfaction Study, J.D. Power and Associates, October 2010
- “Customer Winback”
- Mortgage Servicing News