Making the Leap to Predictive Analytics
I recently got back from a great Informatica World 2016 (our biggest ever). One of my favorite sessions there was a presentation by a leading North American company on how they made the jump from BI/reporting to predictive analytics by first focusing on data governance.
Too often, data management is overlooked in the rush to show the results of analytics investments, but it will really pay to take a step back and think about the new data requirements and how you are going to meet them.
Data considerations for predictive analytics
This organization, like many, had a fairly mature data warehouse/business intelligence (DW/BI) environment that they had been successfully running for years. And, like many DW/BI environments, the DW was built to track changes to relatively static data that was carefully structured. It ran on very exact and well-managed data that fit into a carefully designed data model that had been relatively stable over the years.
Moving to predictive analytics, with its greater focus on data mining and statistical analysis, brought a number of new and challenging data requirements that they had not faced before. First, predictive analytics produces better results with more data, drawn from a wider variety of data sources. Second, predictive analytics is not always about trends; it can also be about finding significant data outliers, events and conditions that deviate from the norm.
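The session didn't go into how outliers are detected, but as an illustrative sketch, a robust and common approach is the modified z-score, which flags points that sit far from the median relative to the median absolute deviation (MAD). The function name, threshold, and sample data below are hypothetical:

```python
import statistics

def find_outliers(values, threshold=3.5):
    """Flag points whose modified z-score (median/MAD based) exceeds threshold.

    The median/MAD pair is used instead of mean/stdev because a single
    extreme value would otherwise inflate the stdev and mask itself.
    """
    median = statistics.median(values)
    mad = statistics.median(abs(v - median) for v in values)
    if mad == 0:  # all values (nearly) identical: nothing stands out
        return []
    return [v for v in values if 0.6745 * abs(v - median) / mad > threshold]

# Hypothetical daily order counts with one anomalous spike.
daily_orders = [102, 98, 105, 101, 99, 103, 97, 100, 480]
print(find_outliers(daily_orders))  # → [480]
```

Notice that a naive mean/standard-deviation cutoff would miss the spike here, because the spike itself drags both the mean and the standard deviation upward; that is one reason robust statistics matter when outliers are exactly what you are hunting for.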
A basis for success
Before starting on their predictive analytics initiative, this organization first stepped back and designed a data governance program into the initiative. Experience has shown that it is a lot easier to design in a data governance program up front than to try to retrofit it later, after the initiative is well down the road.
Because the data for this initiative was coming from a wide variety of sources, both internal and external to the organization, they established data stewards, subject matter experts who manage the data policies, meaning, and quality for their respective areas.
This level of data governance may not be necessary for some predictive analytics initiatives, depending on how business-critical the results will be. In this case, the results of their analytics had a very direct impact on their operations and therefore the quality of the customer experience they delivered. Since their #1 goal was to improve the customer experience, this qualifies as business-critical and merits the level of governance they applied.
After they had established a data council, put data stewards in place, established metrics, and run the process for a while, they found that they were delivering data the organization could use with confidence for significant operational decisions. The business ran more efficiently, saving money, while at the same time improving the experience for customers.
Their advice for predictive analytics for operational effectiveness:
- Design data governance in up front.
- Define the data governance strategy, people, and processes first; then select the technology that meets the identified requirements.
- Identify which data is important to manage (don’t “boil the ocean”) and establish KPIs to monitor the effectiveness of the effort on your key business data.
- Let the process run for a while to get the data to a stable and trustworthy state.
- Test regularly with your new predictive analytics to ensure that you are delivering the results you expect.
This is great advice for anybody who is planning to make the leap to predictive analytics.
For more information, read our ebook, “Laying the Foundations for Next-Generation Analytics.”