Guest Post by Norman Steele, CDMP, CBIP, Enterprise Metadata Management at Fannie Mae
Of all the IT disciplines I’ve worked in over the past 30 years, nothing seems harder to stand up than a metadata program that delivers business value day to day. When you turn on a light in a dirty closet, not too many folks want to dive in and clean it up; they would rather shut the door and deal with it later, or not at all. Much the same thing happens when a metadata team comes along and tells you that you can find out where your data is: you import your business glossaries and application data models into the repository and attempt to connect them, only to find that inconsistent practices across the organization make those connections hard to draw.
Glossaries and data models were not used this way in the past, and connecting them exposes numerous inconsistencies:
- Terms defined differently in different parts of the organization
- Valid values defined inconsistently
- Multiple naming conventions for the same object
- Data types that vary across similar objects
- Use of logical names with syntactic structure instead of business terms
- Inability to connect precise business terms to the high-level concepts found in enterprise data models
- Business terms that do not align with the data model
- A logical side of the data model that does not align with the physical side
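Once glossaries and models sit in one repository, inconsistencies like these can be surfaced programmatically. The sketch below is purely illustrative, in plain Python with invented data and no specific repository or tool API; it flags two of the issues above: the same term defined differently by different business units, and data types drifting across columns that represent the same object.

```python
# Hypothetical glossary: two business units define the same term differently.
glossary = {
    ("Finance", "Customer ID"): "Unique identifier assigned at account opening",
    ("Marketing", "Customer ID"): "Email address used to contact the customer",
}

# Hypothetical physical model: table.column mapped to its data type.
data_model = {
    "ACCT.CUST_ID": "INTEGER",
    "CAMPAIGN.CUSTOMER_ID": "VARCHAR(320)",
}

def find_conflicting_definitions(glossary):
    """Group glossary entries by term; flag terms with more than one definition."""
    by_term = {}
    for (unit, term), definition in glossary.items():
        by_term.setdefault(term, set()).add(definition)
    return {term: defs for term, defs in by_term.items() if len(defs) > 1}

def find_type_drift(data_model):
    """Flag columns that appear to name the same object but differ in data type."""
    normalized = {}
    for column, dtype in data_model.items():
        # Crude, illustrative name normalization (real tools use richer matching).
        key = column.split(".")[-1].replace("CUST_", "CUSTOMER_")
        normalized.setdefault(key, set()).add(dtype)
    return {name: types for name, types in normalized.items() if len(types) > 1}

print(find_conflicting_definitions(glossary))
print(find_type_drift(data_model))
```

In practice the matching logic is far messier than a string replace, which is exactly why the cleanup work described below takes sustained effort rather than a one-time import.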
How do we change this? For a metadata program to become self-sustaining, where the business asks to get its metadata into the repository and asks for guidance on how to improve its quality, the metadata team needs collaboration with several groups:
- Governance’s ability to define standards and monitor metrics for adherence is invaluable. Working together, the factors behind each of the issues described earlier can be communicated more effectively through governance standards and best practices, with metrics collected to measure progress.
- Business units must see the benefit of the cleanup work and provide the resources, not only to learn to look at their processes a bit differently, but to make the investment in improving metadata quality.
- Management support is crucial, especially during the initial phases, when it takes time to configure a tool to work with design-time artifacts the way the organization sees them, not just the way the tool works out of the box.
- Every company implements its development process in slightly different ways, so a tool used to manage your metadata must be configured, and enhancements made to fill the gaps, which can only happen when you have a good working relationship with the tool vendor.
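The adherence metrics mentioned above can start very simply. As a hedged illustration (invented names and data, not any particular governance tool), one common starting metric is coverage: what fraction of physical columns have been linked to a governed business term.

```python
# Illustrative coverage metric: share of physical columns linked to a business term.
# All column and term names are made up for the example.

physical_columns = [
    "ACCT.CUST_ID",
    "ACCT.OPEN_DT",
    "CAMPAIGN.CUSTOMER_ID",
    "CAMPAIGN.CHANNEL_CD",
]

# Links recorded by data stewards as cleanup proceeds.
term_links = {
    "ACCT.CUST_ID": "Customer ID",
    "CAMPAIGN.CUSTOMER_ID": "Customer ID",
}

def term_coverage(columns, links):
    """Percentage of columns connected to a governed business term."""
    linked = sum(1 for c in columns if c in links)
    return 100.0 * linked / len(columns)

print(f"Business-term coverage: {term_coverage(physical_columns, term_links):.1f}%")
```

Tracking a number like this over time gives governance and business units a shared, visible measure of whether the cleanup investment is paying off.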
Only when the collective efforts of these groups are brought to bear on the quality of the metadata and the capabilities of the metadata tool can the organization start to see the benefits of connecting them, something the evangelists and visionaries tell us lies ahead.
- For more on some of these lessons learned, attend the Fannie Mae Breakout Session at Informatica World 2014.
- To learn more about the conference and keynotes, click here.
- To register for Informatica World, click here.
Views expressed are those of the author and do not necessarily represent those of Fannie Mae.