Reflections on Informatica World
This post is by Neil Raden. He is an author, consultant, industry analyst and founder of Hired Brains Research, based in Santa Fe, NM. He has a passion for analytics based on decades of experience, and strives to express it through his work in writing, speaking and advising clients.
Martin Baron, the Executive Editor of the Washington Post, giving the Oweida Lecture in Journalism Ethics recently at Penn State University, mused, “I confess, for a long while, I was wondering what to talk about. You have to wonder whether every possible angle has been thoroughly covered already.”
I know exactly how he feels. I spent four days last week at Informatica World in San Francisco, an event covered by dozens of analysts, journalists, consultants and Informatica clients. Surely every announcement, every revelation and every opinion about them has already been dissected, analyzed and reported. Rather than repeat what has doubtless been covered, all I have to offer is how I see it through my own lens.
Informatica was an important part of the success of my consulting company from 1995 until about 2005, when I shut down the implementation work and concentrated on industry analysis and advisory services instead. There were other data integration tools available, but we chose Informatica for most of our clients for one feature the others lacked: metadata. Moving and curating data without writing code in COBOL or FORTRAN or APL or Easytrieve was a huge step forward and made data warehouse success more attainable, but metadata made the process shareable, repeatable and easy to document.
Fast forward to 2017, and there is metadata right smack in the middle of everything Informatica is doing, stitching together all of the components (previously silos) of Data Integration, Big Data Management, Cloud Data Management, Data Quality, Data Security, Master Data Management and Data Governance. Governance was an area where, until recently, Informatica was a little weak, but the acquisition and integration of Diaku and its Axon product plugged that gap. Metadata gets a new name, Enterprise Information Catalog, certainly a more descriptive one. But one wonders: how is all this metadata created and maintained? Enter (with great fanfare) CLAIRE, their metadata-driven artificial intelligence technology. You can read more about CLAIRE here.
We didn’t get much detail on how CLAIRE (with AI in the middle, cute) implements AI technology beyond references to ML (machine learning), a term that is all-too-quickly becoming generic, but it stands to reason that delivering on the promise of an active enterprise data catalog requires more than a group of data stewards manually tagging data into a metadata catalog. Informatica is not alone in this; others such as Tamr, Alation and Trifacta (and there are more, including the stack vendors) have pitched the same concept, but they are all point solutions, unlike Informatica, which now offers a (conformed) suite of tools and applications.
In my own experience with analytics, dating back decades, one thing has always been clear: the problem is data. It always was, and it still is. I don’t know how well all this will perform; I hope it does, and I have no reason to believe it won’t. Informatica claims to have 7,000+ customers, but I suspect many of them will not move beyond their existing data integration and/or MDM solutions right away, for reasons of cost or skill. Still, that 7,000+ includes many of the elite organizations in the world, and they will certainly give this a shake.
There are a lot of brilliant people at Informatica, and they are staking out the high ground. I have reason to believe, beyond my long association with the company, that this bold transformation will pay dividends to Informatica and its customers.