Tag Archives: business glossary
In the old days, the data quality process was a one-way street: IT gets access to the raw data and builds lineage and mappings for it, then passes the results to a business analyst for analysis. Often the analyst finds the data she received is not up to the required standard, so she ends up spending a great deal of time checking and correcting errors in the data before she can work any analytic magic with it.
The downside of this process? Poor operational efficiency, delayed time-to-results, and unhappy employees (analysts in particular), just to name a few. In the end, the business is still left with lousy data that provides little value to the company.
As an analyst in my previous career, I spent days, weeks, sometimes months cleaning and standardizing the data I gathered before I could create reports and build charts to explain what the data meant, which, by the way, was the fun part of my gig. Those late-night crunches left long memories. Cleaning data by hand was probably the least enjoyable task I had as an analyst.
But I didn’t know better then, not until I came to Informatica. I learned that there are software tools that can automate the data quality process and significantly reduce the amount of manual work. Most importantly, with the help of those tools you get to play with clean and relevant data, and you can feel confident relying on that data to make important business decisions. But the best part of my learning was the concept of the role-based data quality process identified by Informatica and implemented in Data Quality 9.6, the latest release of its Data Quality product family.
In a nutshell, the role-based process works like this:
- First, the analyst and IT examine the raw data together to understand what’s in it. They spend time identifying the relationships in the data, discovering the data domains, and so on, to make sure the condition of the raw data meets their requirements.
- After this first pass, the analyst comes up with a set of business rules based on the objectives she wants to achieve. These are the rules that will help identify trends and patterns in the data so proper business decisions can be made.
- To apply those rules in the actual workflow, the analyst needs help from her peers in IT, who implement the rules in software. Once IT completes the implementation, the data quality process can run automatically.
- However, no data is perfect and no set of rules can capture every scenario. When an exception occurs, the analyst needs to be alerted so she can decide what to do with the anomaly.
- Finally, the data quality process can’t run in the dark. The analyst needs to be able to monitor and measure the effectiveness of the data quality rules, both proactively and after the fact for compliance purposes.
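The workflow above — analyst-defined business rules, automated execution by IT, and an exception queue for anomalies — can be sketched in a few lines. This is an illustrative sketch only; the rule names and sample records are made-up assumptions, not Informatica Data Quality APIs.

```python
# Illustrative rule-based data quality check with exception routing.
# Rules and records are hypothetical examples, not Informatica APIs.

def rule_valid_email(record):
    """Business rule: email must contain '@' and a domain with a dot."""
    email = record.get("email", "")
    return "@" in email and "." in email.split("@")[-1]

def rule_positive_amount(record):
    """Business rule: order amount must be positive."""
    return record.get("amount", 0) > 0

RULES = [rule_valid_email, rule_positive_amount]

def run_quality_checks(records):
    """Apply every rule; route failures to an exception queue for the analyst."""
    clean, exceptions = [], []
    for record in records:
        failed = [rule.__name__ for rule in RULES if not rule(record)]
        if failed:
            exceptions.append({"record": record, "failed_rules": failed})
        else:
            clean.append(record)
    return clean, exceptions

records = [
    {"email": "ana@example.com", "amount": 120.0},
    {"email": "not-an-email", "amount": 75.0},
    {"email": "bo@example.com", "amount": -5.0},
]
clean, exceptions = run_quality_checks(records)
print(len(clean), len(exceptions))  # 1 clean record, 2 exceptions
```

The exception list is what the analyst reviews; counting failures per rule over time is also the basis for the monitoring step.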
Picture perfect, but rather complicated to implement, right? The good news is that Informatica can now deliver many of the capabilities described above to organizations facing data quality challenges. The latest release, Data Quality 9.6, is built around this role-based concept and offers many new capabilities for business users and analysts so they can easily collaborate with their IT peers to implement a holistic data quality process in their organization.
For the first time, role-play (no pun intended) makes data quality sexy and fun. With Informatica Data Quality 9.6, turning your raw data into clean and trusted assets is no longer a resource-intensive and draining process. I invite you to join us for a webinar on June 10, where we will give you an in-depth demonstration of the new capabilities in Informatica Data Quality 9.6, built to enable holistic data stewardship for your company.
Do you remember NASA’s $125 million mistake in 1999? The Mars orbiter was lost as the result of a failed information transfer in which one engineering team used metric units while another used imperial units.
I remember because I could relate. After moving to the U.S. from Canada for graduate school, I had to communicate my height in feet and inches instead of meters and centimeters and give directions in miles instead of kilometers.
On a trip to Vancouver, Canada, Andrew Donaher reminded me about NASA’s costly mistake and how it could have been avoided with a business-friendly data governance program. Following much positive feedback from our last blog, I invited Andy to discuss data governance. You may recall that Andy is the Director of Information Management Strategy at Groundswell Group, a Western Canadian consulting firm that specializes in information management services.
Q. According to www.governyourdata.com, data governance is not about the data. It’s about the business processes, decisions, and stakeholder interactions that you want to enable. What’s your take on the value of data governance?
A: The goal of data governance should be to give people confidence in the data they use to make decisions or take actions. They benefit by not wasting time and energy vetting data or creating new processes. That is a huge value to the organization both in terms of risk mitigation and opportunity. At the absolute highest level, data governance is critical to establish trust and confidence in data.
Q. Explain how IT leaders could approach data governance the wrong way.
A. Typically data governance is approached from a restrictive, security-focused and policing perspective. I have found it much more productive to approach it from an enablement, conversational and guiding perspective. The benefit and value of the rules, policies and procedures associated with governance are that people do not have to re-invent the wheel every time. All those things are set up so people can leverage them to provide value faster.
Think back to when you were learning to ride a bike. Hopefully your parent didn’t stand at a distance barking instructions on what to do and what not to do. He or she started by holding the back of your bike so you felt stable and supported, providing you with guidance on how to do it, words of encouragement about what you’re doing well, and constructive advice on what you could be doing better. Then something would click and you’d get it. When you looked back with a smile on your face, feeling proud of yourself, you’d see your parent was no longer holding your bike. He or she was a few steps behind you smiling back while you rode your bike all by yourself!
Remember that feeling of confidence and elation? That is a form of governance too. It isn’t about shutting things down, it is about enabling and supporting. To do this properly you need to listen and understand what the goals are and what is important. I encourage IT leaders to work closely with line of business leaders to ensure trust and confidence in the data. Everyone should know how to get the proper data they need to help the organization move forward.
Q. Can you share some examples of data governance rules, policies and procedures that are more policing than enabling?
A. An example is when “Hold” or “No” are the default responses to every access request. Typically every database request submitted sits in a queue until an administrator reviews the access request and contacts the person with a series of questions that typically add little value. Sometimes the request is granted or it’s escalated for further investigation. While there is absolutely a level of security and policing that needs to occur on sensitive information, sometimes security and governance can unnecessarily become synonymous.
A potential policy alternative is first distinguishing sensitive from non-sensitive data structures and then codifying access policies. For example, imagine someone requests read-only access to a generally available schema in the enterprise data warehouse. This person has a particular job title and works in a particular department. Another person with the same role has similar access. The process requires an “approver” to manually review and approve the request. In this instance, you could set up the access request for automatic approval. The risk will have been mitigated through the applied rules, so you have the necessary governance, but you’ve enabled the business to move faster. That’s a win for everyone involved.
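A codified policy like the one Andy describes amounts to a simple decision function. The sketch below is a hypothetical illustration under assumed field names (schema, mode, role, department); real access-governance products encode this differently.

```python
# Hypothetical codified access policy: auto-approve read-only requests to
# non-sensitive schemas when a peer in the same role already has access.

SENSITIVE_SCHEMAS = {"payroll", "phi_claims"}  # always require human review

def decide_access(request, peers_with_access):
    """Return 'auto-approve' or 'manual-review' for a database access request."""
    if request["schema"] in SENSITIVE_SCHEMAS:
        return "manual-review"          # sensitive data always gets a human
    if request["mode"] != "read-only":
        return "manual-review"          # write access needs review
    # Auto-approve if someone in the same role and department already has access.
    if any(p["role"] == request["role"] and p["dept"] == request["dept"]
           for p in peers_with_access):
        return "auto-approve"
    return "manual-review"

req = {"schema": "edw_general", "mode": "read-only",
       "role": "analyst", "dept": "finance"}
peers = [{"role": "analyst", "dept": "finance"}]
print(decide_access(req, peers))  # auto-approve
```

The point is not the code itself but that the rules are explicit: the governance is still there, but routine requests no longer wait in a queue.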
Q. Can you give some concrete advice about how to kick off a successful data governance initiative using an enabling approach?
A. I have two recommendations:
- Recruit Business Partners: Make certain you have some highly respected, experienced and motivated business partners to participate in the kick-off.
- Quantify the Value: As a group, quantify the value of risk mitigation and opportunity cost. For example:
- To quantify the risk, measure the dollar value of a wrong metric reaching the investor community, the impact on market value, and the percentage chance of it happening. Or quantify the cost of the executive team making a wrong decision based on incorrect information.
- To quantify the opportunity, calculate the value of speed-to-market, getting a product to customers quicker than a competitor. You should be able to find examples of how much it cost your organization when you launched a product before a competitor and when you launched a product after a competitor. You can leverage that in your calculation to ensure everyone knows exactly how important enablement is.
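The two calculations above boil down to straightforward expected-value arithmetic. The sketch below shows the shape of that math; every dollar figure and probability is a made-up assumption for illustration.

```python
# Expected-value arithmetic for the risk and opportunity calculations.
# All dollar figures and probabilities below are illustrative assumptions.

def risk_exposure(impact_dollars, probability):
    """Expected cost of a wrong metric reaching investors."""
    return impact_dollars * probability

def opportunity_value(revenue_if_first, revenue_if_second):
    """Value of beating a competitor to market versus trailing them."""
    return revenue_if_first - revenue_if_second

# e.g. a $50M market-value hit with a 2% chance of occurring
print(risk_exposure(50_000_000, 0.02))           # 1000000.0 expected cost
# e.g. launching first earned $12M; launching second would have earned $7M
print(opportunity_value(12_000_000, 7_000_000))  # 5000000 advantage
```

Putting even rough numbers like these in front of the kick-off group makes the case for enablement concrete rather than abstract.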
When you work collaboratively, business and IT will be on the same page. Business leaders should understand the pressures the IT group is under to protect corporate data. The IT team should understand the pressure business leaders are under to get answers to questions quickly to cut costs and find opportunities for growth in revenue and profits.
Q. Any tips on how to enable data governance processes with technology?
A. You may want to consider these two valuable elements to make data governance and analysis even easier:
- Metadata Manager provides a frame of reference, or the context, to give data meaning. It enables IT staff to manage technical metadata and perform an impact analysis of a proposed change before it is implemented. Meanwhile, root cause analysis enables business partners to dig into a term in a report to understand the source of the data and how it was moved and transformed before it landed in the report.
- Business Glossary maintains a standard set of business definitions, accountability for its terms and an audit trail for compliance. It enables business partners and IT to collaboratively manage business metadata. To use a healthcare example, does “Claim Paid Date” mean the date it was approved, the check was cut or the check cleared? Turn to Business Glossary to find out.
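Conceptually, a glossary entry is a term plus an authoritative definition, an accountable steward, and an audit trail. The record below is a minimal hypothetical sketch of that shape, not the Informatica Business Glossary data model.

```python
# Minimal sketch of a shared business glossary entry. Field names, dates,
# and the steward address are hypothetical, not Informatica's data model.

glossary = {
    "Claim Paid Date": {
        # The single agreed-upon meaning, resolving the ambiguity above.
        "definition": "The date the payment check cleared the bank.",
        "steward": "claims-data-steward@example.com",  # accountable owner
        "audit_trail": [  # (date, change, author) for compliance
            ("2014-01-15", "term created", "jdoe"),
            ("2014-03-02", "definition clarified", "asmith"),
        ],
    }
}

term = glossary["Claim Paid Date"]
print(term["definition"])
```

The value is that anyone building a report against “Claim Paid Date” looks it up here instead of guessing, and auditors can see who changed the definition and when.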
Q. Can you rescue a data governance initiative that was built based on a policing approach?
A. Absolutely. It takes effort and thought, but it can be done. The key is realizing the opportunity cost of having people create their own business rules and metrics. While there is a cost to the wasted labor, the greatest cost is lost opportunities. If people are spending time trying to recreate rules and reconcile numbers, they won’t have time to focus on the game-changing insight you get from predictive analytics or optimization, which is where the real competitive advantage lies.
A number of customers have asked me recently about the benefits of using a business glossary product over using a spreadsheet or Sharepoint. The discussion is worth sharing.
If you have a smaller company and all you need is a list of standard business terms to provide a common business vocabulary across the company, a spreadsheet or SharePoint can work… up to a point. The problem is that once your organization reaches a certain size, you are going to have trouble scaling the management of the business terms, making them available across a larger organization, and fostering collaboration based on the agreed-upon business terms.
It’s important to note that I didn’t title this post “Implementing a Data Governance Architecture”. Data governance is not a technology space, tool, or architecture. As our data governance framework illustrates, tools and architecture represent but one of many facets needed to support an enterprise data governance competency. But once you’ve defined your vision and business case with a clear approach for managing the people, process and policy facets, technology can play a significant role in determining the ultimate success or failure of your data governance efforts. Complex and poorly integrated current-state architectures present a significant obstacle to applying common standards for the delivery of trusted and secure data across the enterprise. Data architects play a pivotal role in enabling data governance by designing and evangelizing the data management reference architecture to support data quality and privacy requirements. In addition, these architects must recommend enabling technologies to support data governance and stewardship workflows that aid the core processes of discovery, definition, application, and measurement and monitoring. (Stay tuned – I’ll be sharing a lot more about these core data governance processes in a future post discussing the “Defined Processes” facet of our framework.) Whatever you do, don’t fall into the all-too-common IT trap of selecting the tools before the goals, strategy and processes of data governance are in place. If you skip these steps and just try to build it, they (‘the business’) most assuredly will NOT come.
Personal opinions on the health care mandate aside, I can’t help but be amused by the liberties taken by both major political parties on the definition of a “tax.” When Chief Justice Roberts gave the majority opinion that the individual health insurance mandate was constitutional under Congress’ power to tax, the political spin doctors went into overdrive. Everyone on both sides is simultaneously agreeing it is and is not a tax in order to promote their agendas – and has managed to confuse the heck out of the American public in the process. (This ABC News story prompted me to write about this.)
I bring this up here because this national debate on the constitutionality of “Obamacare” and the definition of what constitutes a tax is no different from many of the politically-charged debates occurring within your organizations with passions running equally high and confusion reigning supreme.
It shouldn’t have been a complete surprise then, but I was still amazed by the huge increase in interest in metadata at this year’s Informatica World in Las Vegas. It all started with one of our pharmaceutical customers winning an Informatica Innovation Award in the opening session. The next morning, a session that they hosted (which was great) was standing room only, and we actually had to turn some attendees away. We were sorry to do that, but it was a strong signal of the growing interest and use of metadata.
Last week, a long-time Informatica customer told me about how he is managing change in his data integration environment on a truly massive scale. The story he told reminded me of someone doing a heart transplant on themselves while running a marathon… blindfolded.
- First, he is delivering several innovative new applications. One improves customer service. Another proactively monitors operational performance and suggests where the business needs to invest more to improve its capacity and service levels.
- Next, he is rolling out new systems that will collect customer sentiment from social media sources (Big Data) and integrate it with ongoing campaigns and company planning.
Consider this situation: Would you try to ride a bicycle blindfolded? You could probably pump the pedals and steer without trouble, but you would lack the visual feedback confirming that your changes in direction and velocity are keeping you on your intended course and out of harm’s way.
This question undoubtedly sounds crazy, yet people make changes to their data integration environments every day without the tools in place to visualize the environment or to tell them the impact of proposed changes.
There are good tools available today to help with this problem.