Is the constant drumbeat to process more data in less time keeping you up at night? Are your business users beating you up to deliver their data faster? Monthly loads turn into daily, which turn into hourly, which turn into… I don’t know… and you don’t know where to turn? Cheer up, you are not alone, and you’ve come to the right place.
Having worked with hundreds of companies over the years on data integration, I’ve found that one of the most common challenges they face is “hitting the wall” on batch processing. That is, traditional single-server batch processing is no longer fulfilling the needs of the organization. It’s not always about moving more data in less time; it can also be driven by the need to deliver the same (or new) data with less latency.
Business imperatives, like those focused on improving customer service and operational efficiency, can drive your organization toward agile BI or operational DI. IT-focused goals such as reducing costs or modernizing platforms may be asking you to do more with less… and the drum keeps beating…
Globalization and other business needs are moving organizations toward 24x7 operations. Growing data volumes mean more data must be processed within a small (and shrinking) batch window. The end goal of data integration is to deliver the right information at the right time to the right people or applications. This is a challenging goal because that information generally originates in independently managed systems that were independently developed with incompatible data models.
In a series of upcoming blog posts, I will draw upon my experience and delve into best practices and different techniques: optimizing transformation performance, scaling with grid computing, leveraging data virtualization for agility, integrating data in real time, practicing lean integration, and looking at what “big data” brings to the data integration party.