Today, agility and timely visibility are critical to the business. No wonder CIO.com states that business intelligence (BI) will be the top technology priority for CIOs in 2012. However, is your data architecture agile enough to handle these exacting demands?
In his blog Top 10 Business Intelligence Predictions For 2012, Boris Evelson of Forrester Research, Inc., states that traditional BI approaches often fall short for the following two reasons (among many others):
- BI hasn’t fully empowered information workers, who still largely depend on IT
- BI platforms, tools and applications aren’t agile enough
If you haven’t already, I think you should read The Forrester Wave™: Data Virtualization, Q1 2012, for two reasons: one, to truly understand the space, and two, to understand the critical capabilities a solution needs in order to solve real data integration problems.
At the very outset, let’s clearly define Data Virtualization. Simply put, Data Virtualization is foundational to Data Integration. It enables fast and direct access to the critical data and reports that the business needs and trusts. It is not to be confused with simple, traditional Data Federation. Instead, think of it as a superset, one that must complement existing data architectures to support BI agility, MDM and SOA.
Late last year, at a company all-hands meeting, the CEO of a large consumer electronics company issued a serious mandate: he needed big results, and fast.
“Competition is really heating up and customer churn is on the rise. I need visibility on-demand – a common view of all enterprise data – or else we cannot continue to grow.” IT teams across the enterprise have been working furiously to create a common view of CUSTOMER, PRODUCT, SALES, and INVENTORY, but the results have been incomplete, inaccurate and too slow. This is no easy task.
So, where have I been since my last blog? Well, I have been working on our new Architect to Architect webinar series on data virtualization, which is very exciting for me as I get to rub shoulders (virtually speaking) with hundreds of industry architects.
The interactive nature and record attendance at these webinars have made one thing very clear – data virtualization is indeed top of mind. In my last blog we discussed the concept and how data virtualization differs from, and is in fact a superset of, traditional data federation, especially as it overcomes many limitations of the latter. Wayne Eckerson did a great job of tracking the evolution of data federation in a recent webinar and blog.
We have all heard of data federation, and of late we have also been hearing how simple, traditional data federation often gets passed off as data virtualization. Let’s get back to basics and take a hard look at what the real need is.
Data federation is not a new concept. When it first arrived on the scene many years ago, technologists got excited because it offered a way to quickly access numerous disparate data sources without physically moving data. Years passed and the term kept appearing in research paper after research paper – but the anticipated widespread adoption never happened. TDWI’s Wayne Eckerson does a great job of tracking the evolution of data federation in his recent webinar and blog. Simple, traditional data federation does one thing, and only one thing, well – it creates a virtual view across heterogeneous data sources, delivering data in real time, typically to reporting tools and composite applications. In its very simplicity lies its downfall.
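To make that “one thing” concrete, here is a minimal sketch of what simple federation amounts to: a single virtual view composed over two heterogeneous sources at query time, with no staging and no physical data movement. The sources, tables and column names below are hypothetical, purely for illustration.

```python
# A minimal sketch of simple data federation: one virtual view over two
# heterogeneous sources (an in-memory SQL database and a CSV file),
# composed at query time with no physical data movement or staging.
# All table and column names here are hypothetical.
import csv
import io
import sqlite3

# Source 1: a relational system holding CUSTOMER records.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE customer (id INTEGER, name TEXT)")
db.executemany("INSERT INTO customer VALUES (?, ?)",
               [(1, "Acme Corp"), (2, "Globex")])

# Source 2: a flat file exported from another system, holding SALES.
sales_csv = io.StringIO("customer_id,amount\n1,250.00\n1,75.50\n2,300.00\n")

def federated_customer_sales():
    """Virtual view: join CUSTOMER (SQL) with SALES (CSV) on the fly."""
    sales_by_customer = {}
    for row in csv.DictReader(sales_csv):
        cid = int(row["customer_id"])
        sales_by_customer[cid] = sales_by_customer.get(cid, 0.0) + float(row["amount"])
    for cid, name in db.execute("SELECT id, name FROM customer"):
        yield {"customer": name, "total_sales": sales_by_customer.get(cid, 0.0)}

for record in federated_customer_sales():
    print(record)  # e.g. {'customer': 'Acme Corp', 'total_sales': 325.5}
```

Notice what the sketch does not do: no cleansing, no standardization, no reconciliation. Whatever the sources contain is exactly what the consumer gets – which is precisely the simplicity, and the downfall, described above.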
The “Business” Needs Critical Data “Now” – We Need The Next Generation Data Federation Technology “Yesterday!”
There is a lot of talk about using data federation, Enterprise Information Integration (EII) or data virtualization to deliver new data to the business, on-demand. However, do existing approaches cut it?
I have been following the data integration space for many years now, and like many of you, I have wondered about the viability of data federation as a data integration approach – not because it does not hold promise. It does: it offers a fast, flexible and low-cost way to integrate multiple, diverse data sources in real time, without the need for physical data movement.
However, according to the numerous architects I have had the pleasure of meeting on the Informatica 9 World Tour, simple or traditional data federation has not been able to live up to its immense promise. And why is that, I asked? The reasons were many…
Because there is a real need! Period! Even after investing heavily in agile architecture approaches such as SOA, IT organizations are finding it extremely challenging to solve complex data integration issues.
If you are involved in defining or re-defining a data architecture to enable composite applications and portals to effectively leverage data in an SOA, here is where the discussions are happening.
If you are looking to enhance your existing data architecture to ensure that business intelligence reports can quickly leverage data that is not in your data warehouse, you will find your answers here.
Calling All Architects – Get Engaged in THE Most Important Discussion on Data Services and Data Integration
Do you believe that a solid, well-thought-through data architecture for efficiently accessing, integrating and processing data is foundational to maximizing business value? Do you want to hear from your peers in the industry about how they are solving data integration challenges such as speeding up the delivery of data, improving data quality and reducing infrastructure complexity?
If yes, would you like to join the discussion on a data architecture that can help composite applications and portals efficiently leverage timely, trustworthy and relevant data in an SOA? Or on an architecture that can complement your existing data architecture to ensure that business intelligence reports can quickly leverage data that is not in the data warehouse?
Businesses have seen great success in using virtualization to gain greater efficiencies from their hardware and network resources. Now, the concept of virtualization has been extended to the data layer.
The bottom line is providing a logical abstraction of all underlying data, so that it appears as a single data source to consuming applications.
However, given that your data is distributed, heterogeneous, and often error-ridden, it’s not enough to simply federate it and pass this off as data virtualization. The data you deliver to your end users must be data they can trust; traditional data federation approaches, however, seem to ignore this fact. They simply propagate inconsistent and inaccurate data, quickly. So where is the gap?
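Here is a minimal sketch of that gap, under illustrative assumptions: plain federation passes source records through as-is, while a virtualization-style view applies cleansing rules inline before delivery. The records, rules and field names are hypothetical.

```python
# Plain federation vs. a virtual view with inline data quality.
# Two hypothetical source records describe the same customer inconsistently.
raw_records = [
    {"customer": " acme corp ", "country": "USA", "revenue": "250.00"},
    {"customer": "ACME CORP",   "country": "U.S.", "revenue": None},
]

COUNTRY_STANDARD = {"USA": "US", "U.S.": "US"}  # a standardization rule

def federate(records):
    """Simple federation: deliver whatever the sources contain, quickly."""
    yield from records

def virtualize(records):
    """Virtual view with inline data quality: standardize, default, dedupe."""
    seen = set()
    for r in records:
        clean = {
            "customer": r["customer"].strip().title(),
            "country": COUNTRY_STANDARD.get(r["country"], r["country"]),
            "revenue": float(r["revenue"] or 0.0),
        }
        key = (clean["customer"], clean["country"])
        if key not in seen:  # both raw rows above collapse to one trusted row
            seen.add(key)
            yield clean

print(list(federate(raw_records)))    # inconsistent duplicates, propagated fast
print(list(virtualize(raw_records)))  # one trusted record: Acme Corp / US
```

The point of the sketch is the contrast: both functions are “real-time,” but only one delivers data the business can actually trust.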
A painting typically starts with broad brush strokes, after which the artist painstakingly fills in each detail until the masterpiece finally reveals itself. In my last post, Revive Enterprise Architecture With Transformational SOA Data Integration, I introduced you to the broad brush strokes of SOA-based Data Services, the daunting data-centric integration problems it can solve, and a high-level summary of its transformational capabilities. As promised, in this short series of posts we will take a look at each of these transformational capabilities in detail. So, let’s start with the most logical and fundamental capability – Multimodal Data Provisioning Services.
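Before the detail arrives, here is one plausible reading of that capability, as a minimal sketch: a single logical view, defined once, delivered through more than one access mode. The view, field names and adapters below are hypothetical; actual data services platforms provide such delivery modes (JDBC/ODBC, SOAP/REST, batch) as built-in facilities.

```python
# One logical view, provisioned in multiple modes (hypothetical example).
import json

def customer_view():
    """The logical view: defined once, independent of delivery mode."""
    return [{"id": 1, "name": "Acme Corp"}, {"id": 2, "name": "Globex"}]

def as_sql_rows():
    """Mode 1: tuples, as a JDBC/ODBC-style consumer (e.g. a BI tool) expects."""
    return [(r["id"], r["name"]) for r in customer_view()]

def as_web_service():
    """Mode 2: a JSON payload, as a composite application or portal expects."""
    return json.dumps({"customers": customer_view()})

print(as_sql_rows())     # [(1, 'Acme Corp'), (2, 'Globex')]
print(as_web_service())  # {"customers": [{"id": 1, ...}, ...]}
```

The design point is that the view logic lives in exactly one place; the modes are thin adapters, so every consumer sees the same trusted data.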