Tag Archives: SOA
Today, agility and timely visibility are critical to the business. No wonder CIO.com states that business intelligence (BI) will be the top technology priority for CIOs in 2012. However, is your data architecture agile enough to handle these exacting demands?
In his blog Top 10 Business Intelligence Predictions For 2012, Boris Evelson of Forrester Research, Inc., states that traditional BI approaches often fall short for the following two reasons (among many others):
- BI hasn’t fully empowered information workers, who still largely depend on IT
- BI platforms, tools and applications aren’t agile enough (more…)
If you haven’t already, I think you should read The Forrester Wave™: Data Virtualization, Q1 2012. For two reasons – one, to truly understand the space, and two, to understand the critical capabilities required of a solution that solves real data integration problems.
At the very outset, let’s clearly define Data Virtualization. Simply put, Data Virtualization is foundational to Data Integration. It enables fast and direct access to the critical data and reports that the business needs and trusts. It is not to be confused with simple, traditional Data Federation. Instead, think of it as a superset which must complement existing data architectures to support BI agility, MDM and SOA. (more…)
Late last year at a company all-hands meeting, the CEO of a large consumer electronics company had a serious mandate. He needed big results and fast.
“Competition is really heating up and customer churn is on the rise. I need visibility on-demand – a common view of all enterprise data – or else we cannot continue to grow.” IT teams across the enterprise have been working furiously to create a common view of CUSTOMER, PRODUCT, SALES, and INVENTORY, but the results have been incomplete, inaccurate and too slow. This is no easy task. (more…)
So, where have I been since my last blog? Well, I have been working on our new Architect to Architect webinar series on data virtualization, which is very exciting for me as I get to rub shoulders (virtually speaking) with hundreds of industry architects.
The interactive nature of and record attendance at these webinars have made one thing very clear – data virtualization is indeed top of mind. In my last blog we discussed the concept and how data virtualization differs from – and is a superset of – traditional data federation, especially as it overcomes many limitations of the latter. Wayne Eckerson did a great job of tracking the evolution of data federation in a recent webinar and blog. (more…)
We have all heard of data federation and of late we have also been hearing how simple, traditional data federation often gets passed off as data virtualization. Let’s get back to basics and take a hard look at what the real need is.
Data federation is not a new concept. When it first arrived on the scene many years ago, technologists got excited as it offered a way to quickly access numerous disparate data sources without physically moving data. Years passed and the term kept appearing in research paper after research paper – but what did not happen was the anticipated widespread adoption. TDWI’s Wayne Eckerson does a great job of tracking the evolution of data federation in his recent webinar and blog. Simple, traditional data federation does one thing and only one thing well – it creates a virtual view across heterogeneous data sources, delivering data in real-time, typically to reporting tools and composite applications. In its very simplicity lay its downfall.
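To make the idea concrete, here is a minimal sketch of what “a virtual view across heterogeneous data sources” means in practice. It joins a relational table with a flat file at query time, without copying either into a warehouse. All names here (the `customers` table, the orders CSV, the `federated_customer_view` function) are hypothetical, invented purely for illustration – no specific vendor’s product works exactly this way.

```python
import csv
import io
import sqlite3

# Source 1: a relational database (in-memory SQLite stands in for a warehouse).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
db.executemany("INSERT INTO customers VALUES (?, ?)", [(1, "Acme"), (2, "Globex")])

# Source 2: a flat file (a CSV export of orders from a second system).
orders_csv = io.StringIO("customer_id,amount\n1,250\n2,99\n1,75\n")

def federated_customer_view():
    """Build a combined view at query time.

    No data is persisted into a central store; each source is read on
    demand and joined in memory, which is the essence of federation.
    """
    customers = {row[0]: row[1] for row in db.execute("SELECT id, name FROM customers")}
    totals = {}
    for rec in csv.DictReader(orders_csv):
        cid = int(rec["customer_id"])
        totals[cid] = totals.get(cid, 0) + int(rec["amount"])
    return [{"name": customers[cid], "total": amt} for cid, amt in sorted(totals.items())]

view = federated_customer_view()
print(view)  # [{'name': 'Acme', 'total': 325}, {'name': 'Globex', 'total': 99}]
```

The simplicity is visible here too: the view is rebuilt on every call, there is no caching, no data quality handling, and no optimization across sources – exactly the limitations that kept simple federation from broader adoption.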
The “Business” Needs Critical Data “Now” – We Need The Next Generation Data Federation Technology “Yesterday!”
There is a lot of talk about using data federation, Enterprise Information Integration (EII) or data virtualization to deliver new data to the business, on-demand. However, do existing approaches cut it?
I have been following the data integration space for many years now, and like many of you, I have wondered about the viability of data federation as a data integration approach. Not because it does not hold promise – it does – it has many advantages as a fast, flexible and low cost approach to integrate multiple and diverse data sources in real-time, without the need for physical data movement.
However, according to the numerous architects that I have had the pleasure of meeting on the Informatica 9 World Tour, simple or traditional data federation has not been able to live up to its immense promise. And why is that, I asked? The reasons were many…