An interesting story surfaced recently. Indiana University researchers found that a pair of predictive modeling techniques can make significantly better decisions about patients’ treatments than doctors acting alone. Indeed, they claim a better than 50 percent reduction in costs and more than 40 percent better patient outcomes. (See a story by Derrick Harris over at GigaOM for additional analysis, and I will also cover this subject in greater detail in a forthcoming column in TDWI’s “BI This Week.”) The use case for big data and predictive analytical models is compelling. The researchers leveraged clinical and demographic data on more than 6,700 patients with clinical depression diagnoses. Within that population, about 65 to 70 percent had co-occurring chronic physical disorders, including diabetes and hypertension.
Leveraging Markov decision processes, they built a model that predicts the probabilities of future events based on the events that immediately preceded them. Moreover, they leveraged dynamic decision networks, which can consider the specific features of those events to determine probabilities. In other words, the model looks at the current attributes of a patient, and then uses huge amounts of data to provide the likely diagnosis and the best treatment to drive the best possible outcome.
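To make the idea concrete, here is a minimal sketch of a Markov decision process for treatment selection. The states, treatments, transition probabilities, costs, and rewards are all hypothetical, illustrative numbers, not the researchers’ actual model; the point is only to show how value iteration picks the treatment with the best expected long-run outcome given the probabilities of what happens next.

```python
# Hypothetical MDP for treatment selection. All states, actions,
# probabilities, costs, and rewards are illustrative assumptions.

states = ["depressed", "improving", "remission"]   # patient condition
actions = ["medication", "therapy"]                # treatment options

# P[state][action] -> probability distribution over next states
P = {
    "depressed": {
        "medication": {"depressed": 0.5, "improving": 0.4, "remission": 0.1},
        "therapy":    {"depressed": 0.6, "improving": 0.3, "remission": 0.1},
    },
    "improving": {
        "medication": {"depressed": 0.2, "improving": 0.5, "remission": 0.3},
        "therapy":    {"depressed": 0.1, "improving": 0.5, "remission": 0.4},
    },
    "remission": {
        "medication": {"depressed": 0.1, "improving": 0.1, "remission": 0.8},
        "therapy":    {"depressed": 0.1, "improving": 0.2, "remission": 0.7},
    },
}

cost = {"medication": 120.0, "therapy": 80.0}      # treatment cost per step
reward = {"depressed": -100.0, "improving": 0.0, "remission": 150.0}

def value_iteration(gamma=0.9, iters=200):
    """Compute each state's expected long-run value and the policy
    (best treatment per state) that achieves it."""
    V = {s: 0.0 for s in states}
    for _ in range(iters):
        # Bellman update: value of the best action from each state
        V = {
            s: max(
                -cost[a] + sum(p * (reward[s2] + gamma * V[s2])
                               for s2, p in P[s][a].items())
                for a in actions
            )
            for s in states
        }
    # Greedy policy with respect to the converged values
    policy = {
        s: max(actions, key=lambda a: -cost[a] + sum(
            p * (reward[s2] + gamma * V[s2]) for s2, p in P[s][a].items()))
        for s in states
    }
    return V, policy
```

A dynamic decision network extends this idea by conditioning the transition probabilities on specific patient features (age, comorbidities, history) rather than a single coarse state label.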
The use of core data points along with well-designed analytical models led to a cost reduction from $497 to $189 per unit (a 58.5 percent reduction, as reported). Patient outcomes also improved by about 35 percent.
I’m not sure this surprises many in the world of big data analytics, or those who have leveraged predictive modeling approaches and tools. I spent a lot of time just out of college creating such models for business-oriented use cases, such as predicting the growth of a product line from correlated external data points, such as economic indicators.
The problem in the past was that the right data in the right amounts typically did not factor into these models. Thus, the results were limited by the available data.
What’s critical to the use of predictive modeling is the ability to access massive amounts of data and consider that data within these models. This requires a sound data integration strategy, and the right set of technologies to access the relevant operational and big data systems, aggregating the correct and current data the models require.
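The aggregation step can be sketched simply. The snippet below is a hypothetical illustration of the kind of integration involved: merging clinical and demographic records from two source systems into one feature table keyed by patient ID, ready for a model to consume. The record layouts and field names are assumptions for illustration only.

```python
# Hypothetical records from two source systems; field names are illustrative.
clinical = [
    {"patient_id": 1, "diagnosis": "depression", "comorbidity": "diabetes"},
    {"patient_id": 2, "diagnosis": "depression", "comorbidity": None},
]
demographic = [
    {"patient_id": 1, "age": 54, "region": "midwest"},
    {"patient_id": 2, "age": 37, "region": "south"},
]

def integrate(clinical, demographic):
    """Merge the two sources on patient_id, keeping only patients
    present in both systems (an inner join)."""
    demo_by_id = {r["patient_id"]: r for r in demographic}
    merged = []
    for rec in clinical:
        demo = demo_by_id.get(rec["patient_id"])
        if demo is not None:
            merged.append({**rec, **demo})  # one combined feature record
    return merged
```

In practice this join happens across operational databases, warehouses, and big data systems via data integration tooling, but the principle is the same: current, correct data from multiple systems assembled into the shape the model expects.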
This is not just nice-to-have technology. Predictive analytics and the tools that support building these models, along with the strategic use of data integration technology, change the game.
Beyond the healthcare use case presented above, which means better treatment at lower cost, businesses will be able to apply the same value to core business processes. That means optimizing the business through better use of data, enabling better decisions, and even automating those decisions.
Thus, we have the killer application for data integration technology.