In my last post, Master Data Modelers, I alluded to a fundamental issue with the way some organizations drive their master data management project plan, one that influences the modeling approach they take. It centers on what should be a very simple question: is MDM about data consolidation or data sharing?
When MDM is perceived to be about consolidation, it suggests that the important aspect of the program is to collect data and merge it together into a single “conformed” master representation. But each of the data sources is probably going to have its own idiosyncrasies, attribution, and representations. Satisfying the need for consolidation means that the target model has to be robust enough to capture all the data, but that can also lead to the same issues that triggered this series in the first place: inconsistencies across master reference data sets, struggles with merging data sets that have different relational and data element structures, and, most importantly, confused semantics in the resulting master data set.
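As a hypothetical illustration of that last point (the system names, records, and field meanings below are invented for the sketch), consider two source systems that both carry a field called "status" but mean entirely different things by it. A naive consolidation that simply dumps both into one conformed record silently loses one meaning:

```python
# Two invented source records describing the same customer.
# Each uses "status", but the semantics differ per system.
crm_record = {"id": "C-100", "name": "Acme Corp", "status": "active"}            # status = sales relationship
billing_record = {"id": "C-100", "name": "ACME Corporation", "status": "overdue"}  # status = payment state

def naive_consolidate(*records):
    """Dump every source into one 'conformed' master record;
    later sources silently overwrite earlier ones."""
    master = {}
    for rec in records:
        master.update(rec)
    return master

master = naive_consolidate(crm_record, billing_record)
# The master now says status == "overdue"; the CRM meaning of
# "status" has vanished, and nothing in the record flags the conflict.
```

The point of the sketch is not the merge mechanics but the semantic collision: without agreement on what each attribute means across sources, the consolidated record looks clean while quietly misrepresenting at least one contributor.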
When MDM is understood to be about sharing, the focus is on the business application consumers of the unified view of the shared master data, and the processes involve working with the business users to understand how the shared master data assets can improve process performance.
My opinion is that if the perception is that MDM=consolidation, the developers focus on the master data model as the target for consolidation. All the energy goes into dumping the data into the repository and engineering the data integration accordingly. These are the guys who are surprised when the master data does not conform to any end-user expectations, and they eventually hit the wall.
On the other hand, if your focus is business-process-centric, then master data consolidation is just a means to an end. In that scenario, the master model is essentially the conduit through which the collaborative community accesses the master data. These teams often consider data governance and data quality to be prerequisites to the program and end up with a more streamlined, faster implementation. Next post: considerations for master data modeling. For more on this topic, join me on a March 20 TDWI Webinar, “Is Your Approach to Modeling MDM Fixed or Flexible?”