The State of Enterprise Data is a Business Problem
When I talk to business leaders, including CFOs, they tell me data is an issue. In particular, they worry about the integrity of the data coming from each and every source. Because of this, many CFOs still manage their data manually. This goes a long way toward explaining why the full productivity impact of digital transformation has not yet been felt.
The problem starts with the fact that IT has been developed in piecemeal fashion. As a result, “the applications work fine. Together, they hinder companies’ efforts to coordinate customer, supplier, and employee processes—they do not form a foundation for execution. And the company’s data, one of its most important assets, is patchy, error prone, and not up to date.” (Enterprise Architecture as Strategy, Jeanne Ross, page 7)
As I see it, there are two causes for this integration hairball. The first is the current state of enterprise architecture; the second is how poorly everything is integrated together. An executive at one investment banking company cited by Jeanne Ross says that “80 percent of his company’s programming code is dedicated to linking disparate systems, as opposed to creating new capabilities” (Enterprise Architecture as Strategy, Jeanne Ross, page 7). An enterprise architect I interviewed stated, “The profession is just maturing. The enemy of enterprise architecture is complexity. Things are not getting simpler. Strong architecture and integration as disciplines are in fact just getting started. It is the complexity of the systems and the need to rationalize it.”
Clearly, we need to fix the enterprise architecture. A key element of this often involves maturing and standardizing what we already have. The starting point is often creating a traceability model that runs from strategies to business services to processes to business capabilities. From my enterprise architecture discussions, gathering the information for this model tends to be a painful process. Once a model is developed, we should look at what to keep, change, or eliminate in the service portfolio. Key to success is removing the data management risk you face when modernizing, rationalizing, or consolidating applications.
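To make the traceability model concrete, here is a minimal sketch in Python. All names (`BusinessService`, the strategy and service labels) are hypothetical illustrations, not a real EA tool's schema; the point is simply that once strategies are linked to the services that support them, gaps in the portfolio fall out mechanically.

```python
from dataclasses import dataclass, field

# Hypothetical structure for illustration; a real model would come from EA tooling.
@dataclass
class BusinessService:
    name: str
    processes: list = field(default_factory=list)      # processes the service supports
    capabilities: list = field(default_factory=list)   # capabilities it realizes

# Trace each strategy down to the business services that support it.
traceability = {
    "Improve customer retention": [
        BusinessService("Customer 360", ["Onboarding"], ["Customer Insight"]),
    ],
    "Reduce operating cost": [],  # nothing traced here: a gap worth investigating
}

def untraced_strategies(model):
    """Strategies with no supporting business service are portfolio gaps."""
    return [strategy for strategy, services in model.items() if not services]

print(untraced_strategies(traceability))  # -> ['Reduce operating cost']
```

The same walk works in reverse: a service that no strategy traces to is a candidate for the "change or eliminate" bucket in the portfolio review.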
If you are taking this step, there is one more step you should consider: fixing the application interconnection issue. The volume of interconnect code that has to be maintained is often orders of magnitude greater than that of the applications themselves. If only the interconnect could happen once per application. Then agility could be restored: instead of managing tens to hundreds of interconnections per application, you could manage just one. Everything becomes simpler with a hub approach versus point-to-point interconnect. Here, data sources are decoupled from destinations, enabling an application to publish once yet support one-to-many consuming applications. This can mean dramatically fewer integrations. At the same time, it means data can be validated once. Consuming applications no longer need to build out additional data validation processes; the data is already certified to be clean, complete, enriched, and accurate. As big data sources mature, they can play in this same hub ecosystem.
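The arithmetic behind "dramatically fewer integrations," plus the publish-once/validate-once idea, can be sketched in a few lines of Python. The `DataHub` class and its method names are assumptions for illustration, not any vendor's API: with point-to-point, every pair of applications is a potential link, so connections grow roughly with the square of the application count; with a hub, each application connects exactly once, and validation runs once at the hub before fan-out.

```python
def point_to_point_links(n_apps):
    # Worst case: every application talks directly to every other application.
    return n_apps * (n_apps - 1) // 2

def hub_links(n_apps):
    # Each application connects once, to the hub.
    return n_apps

class DataHub:
    """Minimal illustrative hub: validate once, then fan out to all consumers."""
    def __init__(self, validate):
        self.validate = validate   # validation happens once, at the hub
        self.subscribers = {}      # topic -> list of consumer callbacks

    def subscribe(self, topic, consumer):
        self.subscribers.setdefault(topic, []).append(consumer)

    def publish(self, topic, record):
        if not self.validate(record):          # reject bad data before fan-out
            raise ValueError("record failed validation")
        for consumer in self.subscribers.get(topic, []):
            consumer(record)                   # one publish, many consumers

# With 20 applications: 190 point-to-point links versus 20 hub connections.
print(point_to_point_links(20), hub_links(20))
```

Because the consumers only ever see records that passed the hub's validation, none of them needs to re-implement that check, which is the "validate once" benefit in code form.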
So there you have it: you do not need to stay with the hairball of interconnection. You can discover what you have. You can then consolidate and retire what is at its end of life. And finally, you can fix the interconnect that remains so each application connects only once. With these steps, the business wins through improved agility as well as reduced costs.
Solution Page: Application Consolidation and Migration
Solution Page: Informatica Data Hub
Webinar: Best practices in hub architecture