Category Archives: Metadata
Do you remember NASA’s $125 million mistake in 1999? The Mars Climate Orbiter was lost as the result of a failed information transfer: one engineering team used metric units while another used imperial units.
I remember because I could relate. After moving to the U.S. from Canada for graduate school, I had to communicate my height in feet and inches instead of meters and centimeters and give directions in miles instead of kilometers.
On a trip to Vancouver, Canada, Andrew Donaher reminded me about NASA’s costly mistake and how it could have been avoided with a business-friendly data governance program. Following much positive feedback from our last blog, I invited Andy to discuss data governance. You may recall that Andy is the Director of Information Management Strategy at Groundswell Group, a Western Canadian consulting firm that specializes in information management services.
Q. According to www.governyourdata.com, data governance is not about the data. It’s about the business processes, decisions, and stakeholder interactions that you want to enable. What’s your take on the value of data governance?
A. The goal of data governance should be to give people confidence in the data they use to make decisions or take actions. They benefit by not wasting time and energy vetting data or creating new processes. That is a huge value to the organization, both in terms of risk mitigation and opportunity. At the highest level, data governance is critical to establishing trust and confidence in data.
Q. Explain how IT leaders could approach data governance the wrong way.
A. Typically, data governance is approached from a restrictive, security-focused, policing perspective. I have found it much more productive to approach it from an enabling, conversational, guiding perspective. The benefit of the rules, policies, and procedures associated with governance is that people do not have to reinvent the wheel every time. All those things are set up so people can leverage them to deliver value faster.
Think back to when you were learning to ride a bike. Hopefully your parent didn’t stand at a distance barking instructions on what to do and what not to do. He or she started by holding the back of your bike so you felt stable and supported, providing you with guidance on how to do it, words of encouragement about what you’re doing well, and constructive advice on what you could be doing better. Then something would click and you’d get it. When you looked back with a smile on your face, feeling proud of yourself, you’d see your parent was no longer holding your bike. He or she was a few steps behind you smiling back while you rode your bike all by yourself!
Remember that feeling of confidence and elation? That is a form of governance too. It isn’t about shutting things down, it is about enabling and supporting. To do this properly you need to listen and understand what the goals are and what is important. I encourage IT leaders to work closely with line of business leaders to ensure trust and confidence in the data. Everyone should know how to get the proper data they need to help the organization move forward.
Q. Can you share some examples of data governance rules, policies and procedures that are more policing than enabling?
A. An example is when “Hold” or “No” are the default responses to every access request. Typically, every database request submitted sits in a queue until an administrator reviews the access request and contacts the person with a series of questions that often add little value. Sometimes the request is granted, or it is escalated for further investigation. While there is absolutely a level of security and policing that needs to occur on sensitive information, security and governance can unnecessarily become synonymous.
A potential policy alternative is to first classify data structures by sensitivity and then codify the access policies. For example, imagine someone requests read-only access to a generally available schema in the enterprise data warehouse. This person has a particular job title and works in a particular department, and another person with the same role already has similar access. The process still requires an “approver” to manually review and approve the request. In this instance, you could set up the access request for automatic approval (see the sketch below). The risk has been mitigated through the applied rules, so you have the necessary governance, but you’ve enabled the business to move faster. That’s a win for everyone involved.
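To make this concrete, here is a minimal sketch of what codifying such an access policy might look like. Everything here is hypothetical: the request model, schema names, and sensitivity classes are illustrative, not an actual product API.

```python
# Hypothetical sketch of codified access-approval rules.
from dataclasses import dataclass

SENSITIVITY = {"edw.general": "low", "edw.finance": "high"}  # schema -> sensitivity class

@dataclass
class AccessRequest:
    requester_role: str
    department: str
    schema: str
    mode: str  # "read" or "write"

def peer_has_access(req: AccessRequest) -> bool:
    # Stub: in practice, query the entitlement store for someone with the
    # same role and department who already holds this grant.
    return True

def route(req: AccessRequest) -> str:
    """Auto-approve low-risk requests; escalate everything else."""
    if (req.mode == "read"
            and SENSITIVITY.get(req.schema) == "low"
            and peer_has_access(req)):
        return "auto-approved"
    return "manual-review"

print(route(AccessRequest("analyst", "finance", "edw.general", "read")))  # auto-approved
print(route(AccessRequest("analyst", "finance", "edw.finance", "read")))  # manual-review
```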
Q. Can you give some concrete advice about how to kick off a successful data governance initiative using an enabling approach?
A. I have two recommendations:
- Recruit Business Partners: Make certain you have some highly respected, experienced and motivated business partners to participate in the kick-off.
- Quantify the Value: As a group, quantify the value of risk mitigation and opportunity cost. For example:
- To quantify the risk, measure the dollar value of a wrong metric going to the investor community, the impact on market value, and the percentage chance of it happening. Or quantify the cost of the executive team making a wrong decision based on incorrect information.
- To quantify the opportunity, calculate the value of speed to market: getting a product to customers faster than a competitor. You should be able to find examples of how much it cost your organization when you launched a product before a competitor and when you launched after one. You can leverage that in your calculation to ensure everyone knows exactly how important enablement is (a worked sketch follows this list).
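As a rough illustration of the math, here is a minimal sketch using placeholder figures; none of these numbers are benchmarks, and your own inputs will come from finance and the business:

```python
# Hypothetical expected-loss and opportunity calculation; all figures are placeholders.
impact_of_wrong_metric = 5_000_000   # e.g., market-value hit from a bad investor number
probability_of_error   = 0.02        # estimated chance of it happening in a given year

expected_annual_loss = impact_of_wrong_metric * probability_of_error

early_launch_margin = 3_000_000      # revenue when launching before a competitor
late_launch_margin  = 1_000_000      # revenue when launching after a competitor
opportunity_value   = early_launch_margin - late_launch_margin

print(f"Expected annual loss from a wrong metric: ${expected_annual_loss:,.0f}")
print(f"Value of faster enablement per launch:    ${opportunity_value:,.0f}")
```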
When you work collaboratively, business and IT will be on the same page. Business leaders should understand the pressures the IT group is under to protect corporate data. The IT team should understand the pressure business leaders are under to get answers to questions quickly to cut costs and find opportunities for growth in revenue and profits.
Q. Any tips on how to enable data governance processes with technology?
A. You may want to consider these two valuable elements to make data governance and analysis even easier:
- Metadata Manager provides a frame of reference, or context, to give data meaning. It enables IT staff to manage technical metadata and to perform an impact analysis of a proposed change before it is implemented. Root cause analysis, in turn, enables business partners to drill into a term in a report to understand the source of the data and how it was moved and transformed before it reached the report (a simple lineage sketch follows this list).
- Business Glossary maintains a standard set of business definitions, accountability for its terms and an audit trail for compliance. It enables business partners and IT to collaboratively manage business metadata. To use a healthcare example, does “Claim Paid Date” mean the date it was approved, the check was cut or the check cleared? Turn to Business Glossary to find out.
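For the lineage sketch promised above: impact analysis can be thought of as a walk over a directed lineage graph of source-to-target mappings. This is a minimal illustration of the idea, with hypothetical table and report names, not a depiction of any product's internals:

```python
# Hypothetical lineage stored as directed edges (source -> target).
from collections import defaultdict, deque

edges = [
    ("crm.customers", "staging.customers"),
    ("staging.customers", "edw.dim_customer"),
    ("edw.dim_customer", "report.revenue_by_customer"),
]

downstream = defaultdict(list)
for src, dst in edges:
    downstream[src].append(dst)

def impact(node: str) -> set:
    """Everything downstream of `node` -- what a proposed change could break."""
    seen, queue = set(), deque([node])
    while queue:
        for nxt in downstream[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(impact("crm.customers"))
# Root cause analysis is the same walk over reversed edges, from report back to source.
```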
Q. Can you rescue a data governance initiative that was built based on a policing approach?
A. Absolutely. It takes effort and thought, but it can be done. The key is realizing the opportunity cost of having people create their own business rules and metrics. While there is a cost to the wasted labor, the greatest cost is lost opportunities. If people are spending time recreating rules and reconciling numbers, they won’t have time to focus on the game-changing insight you get from predictive analytics or optimization, which is where the real competitive advantage lies.
As I was scanning the BBC app on my iPhone a few weeks ago, I noticed an article on how game companies are sharing files for distributed development. It described how EA was overcoming the development challenges of the multiplayer shooter “Battlefield 4”. Not only were they handling the code itself but also very large graphics and sound files, as the complete game file was larger than 50GB.
Rather than transferring or emailing the whole file (impossible) or chunks of it (too expensive, too time-consuming), they were using Panzura’s cloud storage controller to store the “master file” in the cloud and handle code and content deltas (less than 5% of the total file). This is very similar to what an MDM environment does in the B2B space when it checks for duplicates and syncs only “approved” attribute-level net changes into the MDM hub and back to the source systems.
This is as much a file transfer challenge around compression as it is a logical challenge: detecting and automating updates when appropriate and flagging them for review when not. The similarities to an MDM system are striking. Just as two or more CRM, billing, or asset management systems handle their somewhat similar, yet still different, individual “master” files of an asset, a customer, an account, or a product, the game development operation syncs its copies across development and QA locations based on whether it is a Sony PlayStation or Microsoft Xbox “view” of the same game (a small sketch of attribute-level net-change detection follows).
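Here is a minimal sketch of what attribute-level net-change detection looks like in this spirit; the record shapes and the set of “approved” attributes are hypothetical, not any vendor's hub logic:

```python
# Hypothetical attribute-level net-change detection between a master record
# and an incoming source record.
def net_changes(master: dict, incoming: dict, approved: set) -> dict:
    """Return only the approved attributes whose values actually changed."""
    return {
        attr: value
        for attr, value in incoming.items()
        if attr in approved and master.get(attr) != value
    }

master   = {"name": "Jane Doe", "phone": "555-0100", "tier": "gold"}
incoming = {"name": "Jane Doe", "phone": "555-0199", "tier": "platinum"}

approved = {"name", "phone"}  # only these sync automatically
delta = net_changes(master, incoming, approved)
flagged = {a: v for a, v in incoming.items()
           if a not in approved and master.get(a) != v}

print("sync automatically:", delta)    # {'phone': '555-0199'}
print("flag for review:", flagged)     # {'tier': 'platinum'}
```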
In the event a cloud storage provider goes belly-up (as happened in the BBC article), there is obviously, as there is in MDM, the possibility of a cloud-onsite hybrid.
Now my creative juices got flowing: Informatica should be using its experience in ETL and SOA to apply the MDM hub to use cases where structured master data needs to sync chunks of large files relevant to a particular transaction, say a whole life insurance application, a mutual fund annual proxy statement, or a car manual. Rather than mailing these massive booklets every quarter or year, the files should be developed and distributed based on preference attributes linked to a customer account and location, and assembled on the fly for the particular object in question.
Surely the risk disclaimers or the steps to change a spark plug are 80% similar across instruments or vehicles, so why reprint them or duplicate them electronically?
Spinning this further, what happens if developers need to understand gamers’ behavior in terms of the hacks they applied and their attempts to get to the next save point? This increasingly becomes a Big Data paradigm, especially where the broadband signal runs through the Xbox and constant switches occur between TV, web browsing, a voice call, VoD, on-demand games, and Xbox games. My head is starting to smoke already. Would a switch from the game to a local TV station or HBO indicate that the gamer was getting a bit tired or bored at a certain stage in the game? What happens if the Kinect detects they actually walked away? So much data, so little time.
These are my two cents, as I am pretty sure Informatica will not get into the gaming business any time soon. And I was so hoping I could expense this Christmas’s Xbox for customer demo purposes (LOL).
Are you aware of any unconventional uses of master data, maybe in combination with knowledge or content management systems? I would love to hear some ideas.
Why Now is the Time for an Investment in Data Management
All application managers have been asked, at some point or another, why the data in their application is wrong. But it could be worse: consider if the question was never asked and bad data caused an error in a crucial business process or transaction. The damage can be significant, and it happens every day.
Let’s suppose you find bad data in an enterprise application. This raises a number of very difficult questions (a simple reconciliation sketch follows this list):
- Provenance. Where did this data come from? Is the data from the right source?
- Transformation. Was the data transformed correctly as it was moved from source to target?
- Operational. Was there an operational error along the way that caused a critical process to run only partially or not at all?
- Change Management. Did somebody make a change to the data integration / data management system that looked like a logical solution to their problem, but that caused your application to receive bad data?
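As one way to start answering the transformation question, here is a minimal sketch of a source-to-target reconciliation check. The row data, key structure, and tolerance are hypothetical; real checks would run against the actual systems:

```python
# Hypothetical reconciliation of row counts and an amount checksum between
# a source system and a target application.
source_rows = [("C001", 100.00), ("C002", 250.50), ("C003", 75.25)]
target_rows = [("C001", 100.00), ("C002", 250.50)]

def reconcile(source, target):
    """Compare row counts and a simple amount checksum between systems."""
    issues = []
    if len(source) != len(target):
        issues.append(f"row count mismatch: {len(source)} source vs {len(target)} target")
    src_sum = sum(amount for _, amount in source)
    tgt_sum = sum(amount for _, amount in target)
    if abs(src_sum - tgt_sum) > 0.005:
        issues.append(f"amount checksum mismatch: {src_sum:.2f} vs {tgt_sum:.2f}")
    return issues or ["reconciled"]

for issue in reconcile(source_rows, target_rows):
    print(issue)
```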
Good data is the crude oil (we’re all going to hear that analogy a lot more!) that business processes run on. If you have bad (or dirty) oil, you are going to have problems with the process.
Why do application managers care? After all this is data integration, not application management. The answer is pretty straightforward:
- So you don’t get questions like the one above, which suck up your staff’s time (one customer reported 15 hours per analyst per month).
- So bad data does not lead to bad transactions and bad decisions.
- So that bad or inconsistent data does not damage the confidence of the users of your application, causing workarounds and lack of adoption.
So, what should be done to fix this? It is time to start thinking about data integration and data quality management as a single system rather than a somewhat random collection of expensive one-off projects. The result will be lower costs, higher productivity, and greater user confidence in your enterprise applications.
For more on this and related topics, visit our Potential at Work site for Application Leaders.
I’m glad you enjoyed my last letter explaining what data is and how people in my industry make a living managing it. After that letter, you confidently answered all data-related questions your knitting-circle friends could throw at you. But then Edward Snowden, former NSA contractor and world-renowned whistle-blower, came on the scene. Suddenly mainstream news anchors are talking about metadata.
I got your panicked voicemail and, as promised, I’m going to try to clarify what metadata is and how it relates to data.
When executing application modernization or application rationalization, your focus is on supporting the business strategy by implementing systems that run critical business processes. And that is exactly where the focus should be.
The problem comes when there is a lack of focus on delivering trustworthy data for those business systems and processes. If you are consolidating enterprise applications or upgrading to new ones, the data needs to be migrated from System A to System B. This is virtually never a simple “cut & paste.” In fact, data migration projects can be fairly risky. Bloor Research found in its latest study that 38% of these projects fail. Even worse, the Harvard Business Review reports that 17% of enterprise application projects go over budget by 200% and over schedule by 70%. There are many examples of this. The State of California terminated its contract with its ERP vendor after spending $254 million. The U.S. Marine Corps has spent $1.1 billion on another ERP system, 10 times its original estimated cost.
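To see why migration is never a simple cut & paste, consider even a single row: types change, codes need mapping, and dates need reformatting. Here is a minimal, hypothetical sketch; the field names, code mappings, and formats are invented for illustration, not an actual System A / System B schema:

```python
# Hypothetical transform of one legacy customer row into the target system's shape.
from datetime import datetime

STATUS_MAP = {"A": "ACTIVE", "I": "INACTIVE", "P": "PENDING"}  # legacy -> new codes

def migrate_row(legacy: dict) -> dict:
    """Transform one legacy row, failing loudly on unmapped codes."""
    status = STATUS_MAP.get(legacy["cust_status"])
    if status is None:
        raise ValueError(f"unmapped status code: {legacy['cust_status']!r}")
    return {
        "customer_id": int(legacy["cust_no"]),  # zero-padded string -> integer key
        "status": status,                       # code-table mapping
        "created_at": datetime.strptime(legacy["open_dt"], "%m/%d/%Y").date().isoformat(),
    }

print(migrate_row({"cust_no": "00042", "cust_status": "A", "open_dt": "03/15/2009"}))
# {'customer_id': 42, 'status': 'ACTIVE', 'created_at': '2009-03-15'}
```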
So, how do you deliver the business value that your users demand? Here are four best practices to help you deliver applications on time and on budget that meet users’ needs for timely, authoritative data.
- Have an internal competence in data migration. This is a best practice identified by Bloor Research in their study on data migration. You can’t simply turn this project over to a third party. Only your staff truly knows your internal data. Another thing to consider is that your staff will also have to operate the new applications after go-live (see #4).
- Have a separate data migration team and budget. Bloor Research also recommends a separate budget and team. This is to ensure that there is a strong focus on data migration and quality and that it doesn’t become just a project detail in the larger application installation. Bloor found a very high likelihood of project failure if there is not a separate budget.
- Make sure that your business users are deeply involved. The Bloor survey found that by far the #1 success factor identified by their respondents was “Business engagement.” Unless the business side is deeply involved in requirements definition and providing business context there is a significant risk of misunderstandings that will result in a system that does not meet the needs of its users.
- Consider the new system go-live as the beginning, not the end. We have seen many organizations that view data migration as a “one-and-done” project where everybody packs up and goes home at the end. An enterprise application is a living, breathing system that needs continuing care and feeding. Once the application goes live, you will need to provide services such as ongoing data quality management, synchronization with other operational systems, and synchronization with a master data management hub if you have one.
For more information:
Live Webinar with Philip Howard of Bloor Research. May 20, 2013. Successful Application Go-Lives: Best Practices for Application Data Migration
Application Data Migration Presentations at Informatica World 2013. We will have a Hands-On Lab and data migration presentations from Accenture and National Oilwell Varco.
Bloor White Paper: Best Practices for Data Migration
Informatica Solution Site: Data Migration Solution
Lean manufacturing, as defined by Wikipedia, is “a production practice that considers the expenditure of resources for any goal other than the creation of value for the end customer to be wasteful…Essentially, lean is centered on preserving value with less work” and is a management philosophy derived mostly from the Toyota Production System (TPS). I’ve been having discussions with Jim Harris from OCDQ Blog, Reuben Vandeventer, Director of Data Governance for CNO Financial, and others about best practices for data quality management and the applicability of lean management practices to data warehousing. Click here to hear Jim, Reuben, and me discuss three critical areas of data quality to focus on when building data warehouses that people actually use and trust.
Big data and related technologies such as Hadoop present significant opportunities and challenges to businesses. Nearly everybody in IT reports that they are actively evaluating big data technologies. And, just as you would expect, they are in a variety of stages of implementation. So, who has time to think about data governance when dealing with a massive change like this?
First, you have to get your hands around the new technology, right? Actually, this is exactly the right time to think about data governance for big data: before the wild, untamed data from outside the company starts getting mixed with your potentially more trustworthy, tamed, internal data.
Does your organization have a structured repository of metadata that can help a data center operator (whether on-site or offshore) quickly troubleshoot a production incident related to a data integration job at 2:00 a.m.? Or any time of day, for that matter? This is just one use of metadata. A new Metadata Management whitepaper has just been published that describes the wide range of metadata types, their uses, and the business value derived from them.
A number of customers have asked me recently about the benefits of using a business glossary product over a spreadsheet or SharePoint. The discussion is worth sharing.
If you have a smaller company and all you need is a list of standard business terms to provide a common business vocabulary across the company, a spreadsheet or SharePoint can work… up to a point. The problem is that once your organization reaches a certain size, you are going to have trouble scaling the management of the business terms, making them available across a larger organization, and fostering collaboration based on the agreed-upon business terms.
Why is it that point-to-point integration has such a bad reputation and negative connotation? We’ve all seen the infamous hairball (or spaghetti) diagram of information exchanges between systems.
But point-to-point is not the problem; variation is. When different teams develop point interfaces without standards or reusable components and without shared governance and controls, the result is a tangled hairball every time.