New TDWI Research Delves into Data Integration and the Power of a Platform

TDWI Research
Download TDWI’s essential guide to data integration modernization
A new report from TDWI Research is essential reading for anybody responsible for the future of their company’s data management strategy or architecture. In “Modernizing Data Integration to Accommodate New Big Data and New Business Requirements,” TDWI Director for Data Management Philip Russom details a seven-item checklist for modernizing data integration that will help organizations wrestle with the challenge of doing more with data, faster, to support new business initiatives.

One thing that jumped out at me as I read it is that if you’re responsible for data management, change will be constant, and you have to plan for it. Organizations today—yours and your competitors’—are looking to compete on analytics. That will require not just data, but clean, reliable, secure data that can fuel innovation and decision-making. You’ll need to support new business use cases, while dealing with the increasing volume and complexity of data and assessing and adopting new analytics technology.

Calling it a “checklist” may sound slight, but Russom gives each item real depth and discusses the big trends that will shape your future data management requirements, including:

• Multi-latency data delivery
• Fast data prep
• Enabling self-service data access
• Leveraging new platform types
• Adding right-time functions
• Incorporating multi-structured and non-traditional data

The final issue Russom considers is the value of modernizing your data integration tool portfolio into an integrated platform of multiple data management tools. That, I think, is a very important move to make.

The value of an integrated platform

The challenge with data is to meet the demands of the business, and a hodge-podge of overlapping tools won’t make it easy. Your data infrastructure has to deliver data to serve all business initiatives while you also manage change, work to speed data delivery times, and enable more business self-service of data. The only way to do that is with a standard data management platform that provides:

• An end-to-end data management capability
• Code re-use
• Data set discovery and re-use
• Automation and intelligence
• Abstraction that shields developers and users from changes in the underlying data and analytics technology

Your existing DW/BI environment is not likely to go away any time soon. It serves a very useful purpose today: delivering high-quality data and insights based on well-managed, structured data. Yet we’re also going to see a greater need for new capabilities enabled by new technologies. The change will not be sudden, but it is inevitable, and it will be cumulative.

This means you’re looking for a data management platform that will serve your immediate requirements and also allow you to evolve with future, still-unknown needs. It will have to let you both integrate your existing tools and adopt new technology with minimal disruption. Basically, every business has different needs, and you want a single, unified platform that’s able to move at the pace your organization is comfortable with, and that your business strategy requires.

Emerging strategies and best practices

Finding a platform that provides that level of flexibility, lets you transfer skill sets from one project to another, and supports both business-critical decision making and rapid analytics innovation is no easy task. Surveys such as the one TDWI includes in this report show that the trend toward integrated data management platforms is strong and growing, and Philip Russom’s checklist is the sort of thoughtful, insightful document that will help organizations better understand the challenge and see the road to success.

Of course, a checklist just gets you started, and I’m hoping we’ll all be reading further insight from Philip soon. The question I’d pose to him would be around the challenge of turning insights into action. Specifically, I’d like to know what he’d recommend for operationalizing useful insights found on the “innovation” side of the analytics environment. I think organizations will benefit greatly as best practices around that “handoff” crystallize.

But don’t let me get ahead of myself—for now, download the TDWI Research paper, “Modernizing Data Integration to Accommodate New Big Data and New Business Requirements,” to see how to continue your own data evolution.

Comments

  • Philip Russom

    Roger, thanks for your kind words about my TDWI Checklist Report on Modernizing Data Integration. Your blog summarizes the report really well. Like you, I strongly recommend that readers download the report, so they can get all of its tips.

    Allow me a moment to respond to the question you posed for me: “What about turning insights into action by operationalizing analytic innovations?” That’s a great question, because it touches on several aspects of modern data integration.

    Much of the business value of new big data comes through analytics —
    Businesses are demanding more value from data today, in general, plus they increasingly depend on multiple forms of analytics to operate, compete, and plan. To achieve those goals, they need to modernize their data integration (DI) infrastructure and solutions to capture new big data types and repurpose them for multiple use cases in OLAP, advanced analytics, reporting, and operations.

    Modern data management must (among other things) integrate data in a way that fosters discovery —
    For decades, we’ve known that you should never allow new data from a new source into a data warehouse or similar target without careful exploration and profiling to determine the data’s technical needs and business value. The same is even more true today, as users struggle to understand the increasing diversity of data types, sources, and potential value. In response, I’ve seen many user organizations extend and modernize their DI infrastructure by deploying new data hubs, data vaults, and data lakes (on both relational databases and Hadoop) to provide modern platforms for landing and staging data (both old and new). These DI-driven data platforms also enable a wide range of users – from highly technical data scientists to mildly technical business analysts – to explore and profile data, so they can understand the state and value of new data, then go on to discover new facts and insights about the business and its customers, products, partners, and other important entities.
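
    To make that profiling step concrete, here is a minimal sketch in Python with pandas (not from the Checklist Report; the file path and column handling are hypothetical) of the kind of quick pass an analyst might run over newly landed data:

        # Quick profile of a newly landed data set (illustrative sketch only;
        # the staging file and its columns are hypothetical).
        import pandas as pd

        df = pd.read_csv("staging/new_source_extract.csv")  # hypothetical landed file

        profile = pd.DataFrame({
            "dtype": df.dtypes.astype(str),                  # inferred type per column
            "non_null": df.notna().sum(),                    # populated values per column
            "null_pct": (df.isna().mean() * 100).round(1),   # share of missing values
            "distinct": df.nunique(),                        # cardinality per column
        })
        print(profile.sort_values("null_pct", ascending=False))

    A simple summary like this helps determine which columns carry business value and which need cleansing before the data moves beyond the landing or staging area.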

    Users don’t operationalize analytic insights as rigorously as they should —
    I just described the road to insight, where analytics leads to business value from new big data. The problem is that the road is too often a dead end. For example, data scientists will discover the cause of a new form of customer churn, share that information with marketers, then wrap up and move on to the next project. Before moving on, the analyst should also report the findings to the data warehouse team, so it can operationalize the insight by creating metrics and time series that business people can track through reports and management dashboards. After all, forms of customer churn and other significant business events tend to recur, and should be tracked over time. Luckily, this problem is easily fixed by updating business processes and technical best practices around analytics.
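
    As a concrete illustration of that handoff, here is a small sketch (the extract and column names are hypothetical, not tied to any specific product) of how a churn finding might be turned into a recurring metric the DW/BI team can publish:

        # Turn a churn insight into a monthly metric a dashboard can track.
        # Illustrative sketch only; the warehouse extract and columns are hypothetical.
        import pandas as pd

        # One row per customer per month, with a 0/1 "churned" flag.
        snap = pd.read_csv("warehouse/customer_month_snapshot.csv",
                           parse_dates=["snapshot_month"])

        churn_rate = (
            snap.groupby(snap["snapshot_month"].dt.to_period("M"))["churned"]
                .mean()                        # share of customers flagged as churned
                .rename("monthly_churn_rate")
        )
        print(churn_rate)  # a time series reports and dashboards can refresh each month

    Once a metric like this is defined and refreshed on a schedule, the insight stops being a one-off finding and becomes something the business can track over time.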

    But enough about what I’m seeing. How about you readers? What are you experiencing when it comes to modernizing DI and operationalizing analytics?