Tag Archives: BI
On www.ebizq.net a few days ago, the question was posed “Why is a Single Version of the Truth Still Difficult to Achieve With BI?” I provided a quick response there but felt that it’s a topic worthy of expanded discussion.
As I wrote in my comment, a single version of the truth by definition means that there's a single representation of critical master data such as customers, products, assets, and more that's unique, complete, and consistent, and becomes the most reliable and authoritative information for the entire enterprise.
Business Intelligence as a technology or market is specialized to report on existing data from multiple systems without prejudice. It may do some aggregation or rollup for dimensional analysis, but it is not designed or equipped to create a single version of the truth. Inconsistency in customer or product dimensions from siloed applications can make any BI analytics running on data warehouses or operational data unreliable. It should come as no surprise to regular readers of this blog, and to those familiar with MDM, that MDM is needed to recognize (disparate data), resolve (it into a single version of the truth), and relate (it to derive meaning). In fact, many MDM projects have been kicked off with the initial goal of making BI reporting more accurately reflect the true nature of the business.
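To make the recognize/resolve idea concrete, here is a minimal sketch in Python. It assumes toy customer records and a naive match rule (normalized name plus email); real MDM platforms use probabilistic matching and far richer survivorship rules, so treat this as an illustration of the pattern, not a product's actual behavior.

```python
from collections import defaultdict

def normalize(record):
    """Recognize: reduce each source record to a comparable match key."""
    return (record["name"].strip().lower(), record["email"].strip().lower())

def build_golden_records(records):
    """Resolve: collapse records sharing a match key into one golden record."""
    groups = defaultdict(list)
    for rec in records:
        groups[normalize(rec)].append(rec)
    golden = []
    for recs in groups.values():
        # Survivorship (simplified): keep the most recently updated record,
        # and relate it back to every contributing source system.
        merged = max(recs, key=lambda r: r["updated"])
        merged = dict(merged, sources=[r["system"] for r in recs])
        golden.append(merged)
    return golden

# Two silos describe the same customer slightly differently.
crm = {"system": "CRM", "name": "Acme Corp ", "email": "INFO@ACME.COM",
       "phone": "555-0100", "updated": 2}
erp = {"system": "ERP", "name": "acme corp", "email": "info@acme.com",
       "phone": "555-0199", "updated": 1}

print(build_golden_records([crm, erp]))  # one golden record from two silos
```

The two source records would feed a BI customer dimension as two different customers; after resolution there is a single record, which is exactly the "single version of the truth" BI alone cannot produce.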
Brian Gentile, CEO of Jaspersoft, made another interesting comment: “Even within classical BI systems, more than one version of the truth can persist.” He reasons that OLAP systems requiring OLAP cubes exacerbate the problem by holding yet more copies of the data. With an MDM system this can be avoided as a matter of process by always using the latest dimensions fed from the MDM Hub. In effect, the OLAP system or data warehouse should be treated as a downstream system that is kept in sync by the integration capabilities of the MDM platform.
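A minimal sketch of that downstream sync, assuming illustrative row shapes (`master_id`, `name`, `segment`) and a simple Type 1 overwrite of dimension attributes; real feeds would handle slowly changing dimensions, deletes, and surrogate keys:

```python
def refresh_dimension(dim_rows, hub_rows):
    """Refresh a warehouse dimension from the MDM hub's golden records.

    The hub is the sole source of truth: existing dimension rows are
    overwritten with the hub's values (Type 1), and hub entities not yet
    in the dimension are appended.
    """
    hub_by_id = {r["master_id"]: r for r in hub_rows}
    refreshed = []
    for row in dim_rows:
        golden = hub_by_id.get(row["master_id"])
        if golden:
            row = {**row, "name": golden["name"], "segment": golden["segment"]}
        refreshed.append(row)
    known = {r["master_id"] for r in dim_rows}
    refreshed += [r for r in hub_rows if r["master_id"] not in known]
    return refreshed

dim = [{"master_id": "M1", "name": "Acme Inc", "segment": "SMB"}]
hub = [{"master_id": "M1", "name": "Acme Corp", "segment": "Enterprise"},
       {"master_id": "M2", "name": "Beta LLC", "segment": "SMB"}]
print(refresh_dimension(dim, hub))
```

Because every cube and mart is rebuilt from the same hub feed, the extra copies Gentile warns about all carry the same dimension values rather than drifting apart.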
Regardless, I think the question posed was an excellent one because it reminds us that many companies are still just starting to learn about MDM. Those of us who have been living and breathing MDM for the last five years or more take it for granted that MDM should be a precursor to many downstream benefits, such as more accurate BI. It goes to show that plenty of continued education and support is needed from all of us to help organizations learn about and realize the benefits of MDM. Hence this blog.
The first thing I would like to do is dispel a myth that many people believe: that being information-enabled, or competing on data, means analytics or BI. This is only partially true.
Analytics is one of the methods an organization uses to compete on information. For example, with analytics you can analyze buying behavior and leverage the information to better promote products. To truly be information-enabled, an organization must control the information across operational (transaction) systems as well as analytic solutions.
In the world of analytics, most organizations invest a significant amount of time and effort cleansing data from operational systems before it moves into a data warehouse, enabling higher-quality analytics and reporting. However, the cleansing effort is rarely reflected back into the source/operational systems. This plays into the unwritten rule of IT that bad data doubles at the rate of good data.
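The gap described above can be sketched as a "closed loop" load: corrections made on the way into the warehouse are also queued back to the operational source, instead of living only downstream. The function and field names here are illustrative, and the cleanse rule is deliberately trivial:

```python
def cleanse(record):
    """Standardize a record on its way into the warehouse."""
    fixed = dict(record)
    # Toy rule: normalize country spellings to ISO-style codes.
    fixed["country"] = {"U.S.": "US", "USA": "US"}.get(
        record["country"], record["country"])
    return fixed

def load_with_writeback(source_records):
    """Load cleansed rows AND capture corrections for the source system."""
    warehouse, writeback_queue = [], []
    for rec in source_records:
        fixed = cleanse(rec)
        warehouse.append(fixed)
        if fixed != rec:
            # Without this queue, the source keeps producing the same
            # bad data and the cleansing cost is paid on every load.
            writeback_queue.append({"id": rec["id"],
                                    "country": fixed["country"]})
    return warehouse, writeback_queue

src = [{"id": 1, "country": "USA"}, {"id": 2, "country": "US"}]
print(load_with_writeback(src))
```

The one-way version most organizations run today simply discards `writeback_queue`, which is why the same errors must be cleansed again on every refresh.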
Over the last four years we have seen Customer Data Integration (CDI) evolve into Master Data Management (MDM). We have heard many debates about MDM architectural styles, features and functions, operational vs. analytical, and the ever-present need for an MDM platform to expose web services so that applications can be developed to manage and retrieve the valuable master data contained within the MDM system. In fact, one vendor used to tout the number of services shipped with its Hub (hundreds) as an indicator of why it was the best MDM platform.
Fast forward to the present day. Many MDM systems are now up and running, processing master data sourced from internal CRM, ERP, and legacy applications to create a Reliable Trusted Golden Record (RTGR), but they have not yet “mastered” the ability to let business users directly create and consume master data from the MDM system. As a result, business users continue to create incomplete, inconsistent, and duplicate data within their business applications (CRM, ERP, etc.), and those applications are unable to enforce data quality at the point of creation. Data stewards end up retroactively resolving those problems downstream. This quickly creates a bottleneck, because the sheer volume of errors means data stewards cannot fix the thousands of bad records fast enough. Adding features to data stewardship consoles and improving back-end automatic processing helps somewhat, but many companies are now realizing that a better solution may be to take a page from Business Intelligence (BI) in the ’90s and give business users more direct exposure to, and management of, master data.
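Enforcing quality at the point of creation, rather than in a stewardship queue, amounts to a check the business application runs against the hub before saving. A minimal sketch, with illustrative required fields and a naive duplicate rule (matching email); a real hub would use its full matching engine:

```python
def check_before_create(new_record, existing_golden_records):
    """Reject incomplete records and flag probable duplicates up front.

    Returns a list of problems; an empty list means the create may proceed.
    """
    errors = []
    # Completeness: required fields must be present and non-blank.
    for field in ("name", "email"):
        if not new_record.get(field, "").strip():
            errors.append(f"missing required field: {field}")
    # Duplicate check: compare against the hub's golden records.
    key = new_record.get("email", "").strip().lower()
    if key and any(r["email"].lower() == key for r in existing_golden_records):
        errors.append("probable duplicate of an existing golden record")
    return errors

golden = [{"name": "Acme", "email": "info@acme.com"}]
print(check_before_create({"name": "", "email": "info@acme.com"}, golden))
print(check_before_create({"name": "New Co", "email": "hi@newco.com"}, golden))
```

Every record rejected here is one fewer item in the steward's downstream backlog, which is the whole point of moving the check upstream.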
To do this, IT must decide how to put a business-user-facing user interface (UI) on top of the MDM Hub, all the while knowing that other types of master data will likely be added to the Hub as business requirements change, and that popular UI technologies and standards continue to evolve and mature. Many IT organizations are not equipped for custom application development. Even if they were, there is the small problem of finding enough additional bandwidth to learn how to incorporate those hundreds of services into major ground-up development of new applications. Finally, the maintenance and change costs would be high.
What is needed is for MDM platforms to move to the next level. The architecture of an MDM platform should be integrated, model/metadata-driven, and flexible enough to adopt the latest technologies in UI, workflow, and more, along with tight security. All of this should be achievable through configuration rather than coding.
MDM for the business masses, it’s time.
For anyone in the integration business, the notion that data silos are bad is deeply ingrained in the psyche. It’s plain common sense that having multiple copies of data in different places makes it much harder to run your business in a consistent, coherent manner. Yet we keep committing the same sins over and over again, often with the very technologies that promise to solve the data fragmentation problem: SOA, data warehousing, and MDM, to name a few.
First we moved off mainframes to distributed systems. Of course, no one would doubt that the benefits in terms of access to key business data and application functionality more than outweighed the costs of silo proliferation. At least in the client/server era, the number of silos was still somewhat manageable.
But then the internet came along, and everyone rushed to get the latest internet/web application up and running while, at best, paying lip service to a cohesive enterprise architecture. As we later learned, this led to a huge proliferation of new systems and data silos at most companies.
Okay, so the band Coldplay will never release a song by that title (and I probably wouldn’t want to hear it if they did). But it would be timely, because despite certain rumors to the contrary, data warehousing is thriving.
We weren’t supposed to need data warehousing in an era of SOA/data services, data federation, and other new-fangled technologies. Data warehousing was old-fashioned, tired, and a bit boring. But the need for data warehousing solutions just continues to grow: companies aren’t getting less data, and their environments aren’t getting simpler. The discipline of integrating data from multiple systems and conforming it to a common structure so that it can be analyzed and used for business intelligence and reporting is still invaluable. This is not to say that the new technologies don’t play a role: they can greatly enhance data warehousing by providing more real-time data and new ways of delivering data where it’s needed.