Real Time Data Integration Best Practices – Fast, Not Furious

Fast and Furious (circa 1945)

As an integration specialist or developer, you’ve probably noticed the changes in your project requirements lately. Accuracy requirements? Yes, still there. Data quality requirements? Of course. These haven’t changed, although tolerances are getting tighter and more complex with Hadoop sources entering the fray. But now, creeping up the requirements checklists: getting data delivered when it counts. And these days, that’s faster than ever before. Faster data translates to faster decisions and the ability to respond to important business situations. And that’s really what it boils down to – delivering integrated data after the fact is simply no longer acceptable to your project sponsors.

So what do you do about it? How can you rethink what you do? Often it starts at the architectural level. Applications created in 2012 using traditional architecture models will be an IT-constraining legacy by 2016. So, again, what do you do?

In the immortal words of Steve Jobs – Think different. Here are some ideas on how to make your integration project Fast, Not Furious.

  • Determine what “real time” means to your business. Maybe an overnight batch application is all you really need for daily sales performance management by business unit. But maybe you need down-to-the-hour inventory management, by-the-minute or by-the-second order-confirmation operational metrics, straight-through processing for your call center, or analytical data integration for quick decisions.
  • Deploy infrastructure that can support data delivery at different latencies. The leading business applications of 2016 are being designed today using Nexus-enabled application architecture principles. Look at the Lambda architecture, which suggests designing modern integration projects around three layers: a batch layer, a speed layer, and a serving layer.
  • Weigh the critical design considerations: complexity of sources and targets; data formats; level of quality; availability of source/target interfaces; volume, velocity, and variety of data; delivery requirements; loose coupling and reusability; performance relative to availability.
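The Lambda split mentioned above can be sketched in a few lines of Python. This is a minimal illustration, not a product implementation: the event fields (`unit`, `amount`) and layer functions are hypothetical names chosen for the example.

```python
from collections import Counter, defaultdict

# Hypothetical order events; field names are illustrative only.
master_dataset = [
    {"unit": "East", "amount": 100},
    {"unit": "West", "amount": 250},
]

def batch_layer(events):
    """Recompute a complete view from the immutable master dataset (high latency)."""
    view = Counter()
    for e in events:
        view[e["unit"]] += e["amount"]
    return view

# Incremental view maintained by the speed layer between batch runs.
realtime_view = defaultdict(int)

def speed_layer(event):
    """Absorb events that arrived after the last batch run, with low latency."""
    realtime_view[event["unit"]] += event["amount"]

def serving_layer(unit, batch_view):
    """Answer a query by merging the batch view with the real-time delta."""
    return batch_view[unit] + realtime_view[unit]

batch_view = batch_layer(master_dataset)      # e.g. a nightly batch run
speed_layer({"unit": "East", "amount": 40})   # an event arriving after that run
print(serving_layer("East", batch_view))      # 140: batch total plus real-time delta
```

The point of the pattern is that neither layer has to do the other’s job: the batch layer stays simple and restartable, while the speed layer only ever holds the small delta since the last batch run.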

This is just a starter set of steps to consider when migrating to real time. In summary, getting fast means augmenting traditional integration solutions with real-time data integration to:

  • Reduce decision latency.
  • Improve decision quality and company responsiveness.
  • Increase visibility and enable proactive analytics, alerts and notifications to customers and executives in order to meet competitive and regulatory demands.

Here’s another thought. One way to quickly make your project “fast” is to utilize tools you may have licensed but haven’t taken full advantage of. Yet. Change data capture (CDC) works well for this and is an add-on to the PowerCenter tool-set you already have. With it, you can poll the source at intervals of minutes if needed and reflect the current status of any data that has changed recently. You can do this with current tools and current staff, which is a big advantage.
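The polling idea behind timestamp-based CDC can be sketched as follows. This is not PowerCenter’s API – it is a generic illustration using SQLite as a stand-in source, with an assumed `orders` table and `updated_at` column serving as the change watermark.

```python
import sqlite3

def poll_changes(conn, last_seen):
    """Fetch rows changed since the last watermark (timestamp-based CDC)."""
    rows = conn.execute(
        "SELECT id, status, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_seen,),
    ).fetchall()
    new_watermark = rows[-1][2] if rows else last_seen
    return rows, new_watermark

# Demo: an in-memory table stands in for the operational source system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, status TEXT, updated_at REAL)")
conn.execute("INSERT INTO orders VALUES (1, 'shipped', 100.0)")

watermark = 0.0
changes, watermark = poll_changes(conn, watermark)   # first poll: sees row 1

conn.execute("INSERT INTO orders VALUES (2, 'placed', 101.0)")
changes, watermark = poll_changes(conn, watermark)   # second poll: only the new row
```

Each poll moves the watermark forward, so downstream targets only ever process the delta – the same principle a CDC add-on applies, just at whatever interval your latency requirement dictates.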

But regardless, if there’s one thing to remember about real time these days, it’s this:

Make sure for your next project you think about getting Fast before your sponsors get Furious.