Tag Archives: EAI
With the ready availability of data integration technology, it’s amazing to me that the use of manual coding for data integration flows is even a consideration. However, based upon this article in SearchDataManagement, the concept is still out there.
Of course the gist of the article is that hand coding is no longer considered the most productive way to go, which is correct. However, just the fact that this is still an issue and a consideration for anyone moving to data integration solutions perplexes me. Perhaps it’s the new generation of architects and data management professionals who need a quick lesson on the pitfalls of doing data integration by hand. (more…)
Those who understand data integration and the supporting technology are in high demand. Why? The need to create data synergy within enterprises, among both traditional and cloud computing-based data and applications, is at an inflection point. This is due to a more clearly understood business benefit around the value of data integration.
The ability to have information arrive on time and when needed has been a fundamental need of IT since I wrote the EAI book over a decade ago. However, in the last few years, systems have become more complex, including the data that exists within them. In response, data integration grew more complex, the technology more sophisticated, and demand for data integration talent rose accordingly. (more…)
I was happy to do an architect-to-architect Webinar with David Lyle, which was more of an interactive conversation than a Webinar. The focus was on the ability to provide integration using data virtualization, but the message was perhaps more profound than that.
The core issue many enterprises face is that information is an asset they largely cannot access. The data is locked up within years and years of ill-planned databases and applications, where the core data, such as customer and sales information, is scattered throughout the enterprise. Most staffers and executives consider this to be “just the way it is.” (more…)
For years, organizations have been relying on strategies such as enterprise application integration to streamline and automate business processes across disparate silos and systems. However, there’s a gaping hole in the capabilities EAI – and its successor, the enterprise service bus – can deliver. Current middleware strategies fall short in addressing data integration and data quality issues – and this is costing organizations. (more…)
On 1 October 2009, I participated in a webinar, “The Right Way to Do Data Integration for Applications,” hosted by David S. Linthicum, a recognized expert in SOA, Cloud computing and Enterprise Application Integration. It was a very well attended event that generated a lot of interest, judging by the large number and quality of questions submitted by attendees. I recommend that you listen to the replay and download the associated white paper he wrote on that subject.
David covered some of the limitations he has encountered over the years with the way Enterprise Application Integration (EAI) and Enterprise Service Bus (ESB) technologies deal with the integration of data when deployed in a SOA initiative. (more…)
The recent blog post from Forrester Research analysts Clay Richardson and Rob Karel posed an excellent question around synergies between Business Process Management (BPM) and Master Data Management (MDM). Siperian customers have been at the forefront of driving these synergistic requirements that have made the Siperian MDM Hub uniquely suited for bringing together BPM tools such as Lombardi with MDM to enable data governance.
Together with the excellent points made by Clay and Rob, one of the trends we have seen is that most of the integration of MDM and BPM has been focused on the inbound side – the creation of master data. On the outbound side – synchronizing master data (from the MDM hub) with downstream systems – BPM has been far less leveraged.
Inbound: While an MDM hub automates the merging of a large volume of duplicates, exceptions need to be handled by the data steward in "collaboration" with the data owner/business user. Additionally, business users are starting to interact with Hub data directly as a “system of entry,” through interfaces like Siperian’s Business Data Director. Both of these inbound scenarios require a “chain of approval” and potentially sophisticated rules for process flow. A BPM tool such as Lombardi can be integrated with Siperian to manage these flows, leveraging the Hub’s ability to store states (transitional states of records prior to being finally committed to the hub). See a previous blog post, The Art of MDM Workflow, for more information.
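The inbound “chain of approval” over transitional record states can be sketched as a minimal state machine. This is purely illustrative – the state names and transition table below are hypothetical, not part of the Siperian or Lombardi APIs:

```python
from enum import Enum

class RecordState(Enum):
    """Transitional states a record might pass through before being
    committed to the hub (names are illustrative)."""
    PENDING = "pending"                  # merge exception awaiting steward review
    STEWARD_APPROVED = "steward_approved"
    OWNER_APPROVED = "owner_approved"    # data owner / business user sign-off
    COMMITTED = "committed"
    REJECTED = "rejected"

# Allowed transitions encode the "chain of approval" described above.
TRANSITIONS = {
    RecordState.PENDING: {RecordState.STEWARD_APPROVED, RecordState.REJECTED},
    RecordState.STEWARD_APPROVED: {RecordState.OWNER_APPROVED, RecordState.REJECTED},
    RecordState.OWNER_APPROVED: {RecordState.COMMITTED},
}

def advance(current: RecordState, target: RecordState) -> RecordState:
    """Move a record to the next state, enforcing the approval chain."""
    if target not in TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition {current.value} -> {target.value}")
    return target
```

A BPM tool contributes the routing, escalation and human task management around these transitions; the hub contributes the persistence of the uncommitted states.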
Outbound: Downstream systems that accept data from the MDM Hub can automatically receive updates through data synchronization. Once master data is created or updated in the MDM Hub, it can place records onto message queues for EAI-style distribution to systems such as CRM/ERP, allowing those applications to have up-to-date, accurate master data. Alternatively, periodic batch exports and updates using ETL can accomplish the same thing, albeit not in real time. Typically, since there is no human interaction or complex process flow, BPM does not enter into the equation for such processing.
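The outbound publish step amounts to serializing a committed master record and placing it on a queue for downstream consumers. A minimal sketch, using Python's in-memory `queue.Queue` as a stand-in for a real message broker (the record fields are made up for illustration):

```python
import json
import queue

# Stand-in for a real message broker; keeps the sketch self-contained.
outbound_queue: "queue.Queue[str]" = queue.Queue()

def publish_master_record(record: dict) -> None:
    """Serialize an updated master record and enqueue it for
    downstream CRM/ERP consumers."""
    outbound_queue.put(json.dumps(record))

# Once the hub commits an update, it publishes; a downstream system consumes.
publish_master_record({"customer_id": "C-1001", "name": "Acme Corp"})
msg = outbound_queue.get()
```

No human interaction is modeled here, which is exactly why BPM rarely enters the picture on the outbound side.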
In summary, so far we have seen great use cases for BPM on the inbound side, and far fewer on the outbound side. What have your experiences been, and do you have situations where you have applied BPM on the outbound?
I am back after a somewhat self-imposed hiatus during which I have been doing some soul-searching, or rather talking to a number of practitioners, experts, thought-leaders and analysts in the integration space. My singular quest was to uncover some real-world myths about SOA.
I spoke to a variety of integration experts – enterprise and application architects, application developers, data architects and data integration developers. During these interesting conversations, we discussed real-world SOA…or let me qualify that term further as real-world “service-orientation.”
Of course we discussed paradigms such as “loose-coupling,” “modularity,” “services,” etc., but more importantly in many cases, we spoke at length about how they were falling short of realizing the promised benefits of SOA. On probing each usage scenario further, I chanced upon a couple of interesting myths about SOA, which I would like to share with you. (more…)
In one of my earlier posts I discussed the need for a sophisticated data services-driven technology serving as the foundation for SOA and BPM.
At this year’s Data Management Association (DAMA) International Symposium, Michael is quoted as saying:

“Data and processes are intertwined. It will fundamentally change the way organizations think about your roles, and your roles are going to need to evolve.”

“In this world there’s a very loosely coupled user interface from the assembled services that in turn share access to data. SOA exposes data issues to more people, places and processes, and what I tell companies is that without a focus on information management and meta data management they’re going to fail.”
In speaking to numerous customers, prospects and technologists, I have gathered that without accurate, consistent and timely information, SOA and BPM deployments will face serious information-centric hurdles, affecting the cost-effectiveness and success of the project. As we move toward more agile architectures, I believe we need to grow typical process-centric approaches to include information centricity as well.
As Michael states:
“Where we are going is beyond the first generation of BPM and SOA [that is process-centric],” he said, “to the next generation of SOA that is information-centric.”
Observe that the key word here is “information-centric.” Reading such statements from Michael and many others validates the strategy I have been defining: building out an effective IT infrastructure that benefits from the flexibility of a services- and process-driven approach in the data integration layer. Simply wrapping data access with a web service does not qualify as a sophisticated data service, and hence stringing together such simple services with a BPM tool does not guarantee agility either.
As discussed in Services to Orient your Enterprise Data Layer, Joe McKendrick is of the opinion that neither SOA nor enterprise-application integration alone can effectively handle the enterprise data layer. However, data services delivered within an SOA framework can create a data-abstraction layer to address the complexities seen across enterprise data environments.
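The difference between a naive wrapper and a data-abstraction layer is that the latter consolidates and cleanses across sources before serving a record. A minimal sketch, with two hypothetical source systems standing in for real enterprise stores:

```python
# Hypothetical per-source views of the same customer (illustrative data).
CRM = {"C-1": {"name": "acme corp", "email": None}}
ERP = {"C-1": {"name": "Acme Corp", "email": "ap@acme.example"}}

def get_customer(customer_id: str) -> dict:
    """Data-abstraction layer: merge per-source views, keeping the first
    non-empty value per field, then apply simple standardization."""
    merged: dict = {}
    for source in (CRM, ERP):
        for field, value in source.get(customer_id, {}).items():
            if value and not merged.get(field):
                merged[field] = value
    merged["name"] = merged.get("name", "").title()  # naive name cleanup
    return merged
```

A service wrapped directly around one of these stores would expose its gaps and inconsistencies unchanged; the abstraction layer is what makes the service worth composing in SOA or BPM.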
I have always said that without serving up good-quality, consistent and timely information as a data service built on a sophisticated data integration platform, SOA and BPM deployments will not be able to deliver on their promise of agility.
What are your experiences? What kind of information-centric issues have you run into in your service-oriented deployments? Is inaccurate, stale and inconsistent information passing through your IT infrastructure holding you back?