Illusion Series – Episode II: The Return of Business as Usual

So as I sit here contemplating life, I realize that change is not only hard but damn hard. Not so much for me (of course ;-)) but for everybody else who does not run into and advise a new company every other week, encountering a wide variety of personalities and business models, aka canonized modes of repeatable processes generating economic value.


Where are you along the curve?

If one subscribes to the Gartner Hype Cycle, find the technology of choice along this squiggly line. One would think that the further to the left you are, especially before hitting the last few meters onto the Plateau of Productivity, the more the economic justification of said technology would be a nice-to-have. When any non-deployment-oriented technology (aka not cloud computing) came on the scene, for example CRM or big data analytics, it was so cool that buyers were simply enamored with its capabilities, not its value. When the inflated expectations finally caught up (or will catch up, in the case of big data) and forced the technology into the trough, its economic raison d'être got questioned and required justification.

Many of the technologies on the Hype Cycle are solely or heavily business process management oriented. In the olden days, anything from ERP to SCM, CMS and CRM and, ultimately, BPM relied to a large degree or completely on capturing, harmonizing and optimizing workflow. And there was a good reason for this. Productivity gains were driven heavily by eliminating human interaction and automating system process steps. This meant marginal productivity outpaced average productivity. However, as we have squeezed the last drops of human and machine sweat out of the process towel, these two indicators have traded positions.
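To make that "traded positions" point concrete, here is a minimal sketch with hypothetical numbers: early automation effort removes the most waste, so each new unit of effort beats the running average; once the easy wins are gone, the latest unit falls below it.

```python
# Hypothetical numbers for illustration only: cumulative output after each
# unit of process-optimization effort. Early automation removes the most waste.
output = [0, 10, 25, 45, 62, 70, 76, 80]

rows = []
for effort in range(1, len(output)):
    marginal = output[effort] - output[effort - 1]  # gain from the latest effort unit
    average = output[effort] / effort               # output per effort unit so far
    rows.append((effort, marginal, average))
    print(f"effort {effort}: marginal={marginal:2d}, average={average:5.1f}")

# Early on, marginal > average (each new effort unit lifts the average);
# later, marginal < average (diminishing returns drag the average down).
```

The crossover point, where the latest effort unit stops beating the running average, is exactly where squeezing the process towel further stops paying for itself.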


Mind the gap - average vs marginal productivity

Looking at the average Hype Cycle over the last few years, very few of the technologies on it are truly data-centric, such as data management, analytics and social media platforms. Granted, their true value would be in automating actions from any insight, but that could fairly easily be plugged into any existing service bus or workflow backbone.

So why is it, then, that a workflow 2.0 (or 3.0) initiative like app-to-book in insurance often gives little to no consideration to where the data feeding these new (and old) processes is coming from? Why are major process re-engineering projects outpacing advanced data management projects in terms of frequency, size and executive attention?

How are you going to get to your process ROI if your data ROI is not being considered? In one insurer's financial justification, I actually heard the chief architect say, "we assume the data is good and available." I would consider this thinking a major risk.

Moreover, why do only fairly mature technologies, approaching or in the trough, require justification? In my humble opinion, if any technology requires a business case or a deep investigation into supported use cases in a particular sector or organization, it is a brand-new one.

From a software vendor and integration provider POV, this is the quickest way to win new business and justify required resources. Once a technology has become fairly mainstream or been integrated into other consuming technologies, such as data quality into data warehousing or big data analytics, or location awareness into mobile payment and service life cycle management, the financial justification should already be well established. I would expect that most organizations can rely on benchmark reports rather than commission expensive individualized investigations.

After all, by then self-sufficient automation and decreasing marginal returns are driven by the fact that increasingly more source applications have been upgraded or replaced by newer, better-maintained ones. It also assumes that competitors have already invested, effectively diminishing the ROI for a company only now starting to investigate a technology.

So why is it, then, that the ROI of improved enterprise data management is still something someone has to prove? Why is it still an afterthought in many companies or geographies? Why is such a justification still required at all in industries where a large number of leading firms have invested already? If you are that late to the game, advanced data management has become table stakes and should be on par with any other technology set in terms of priority. I would rather not see another corporate IT pick list including BPM, ESB, EDW, Analytics, Data Quality and Cloud CRM, only to have Data Quality fall off the list right from the get-go.

Is going back to business as usual the death of data? Have Uber, Facebook, Google and Amazon not taught us anything about how to prioritize good data? B2B can definitely learn something from B2C.

What can B2C learn from B2B?

Previous episode:

Episode I