Tag Archives: Data Virtualization
According to Doug Henschen, Executive Editor at InformationWeek, “Despite the weak economy and zero growth in many IT salary categories, business intelligence (BI), analytics, information-integration and data warehousing professionals are seeing a slow-but-steady rise in income.”
If you don’t understand application semantics (simply put, the meaning of data), then you have no hope of creating the proper data integration solution. I’ve been stating this since the 1990s, and it has proven correct over and over again.
Just to be clear: You must understand the data to define the proper integration flows and transformation scenarios, and to provide service-oriented frameworks, that is, levels of abstraction, to your data integration domain. This applies both to the movement of data from source to target systems and to the abstraction of the data using data virtualization approaches and technology, such as the technology offered by the host of this blog.
Data integration has always been a core technology in the world of healthcare. The latest trend is to leverage data integration technology as an emerging key strategic advantage within healthcare systems. The objective is to improve care through the use of existing and new information, and thus provide the ability to be more proactive, saving money and saving lives.
Data integration is one of the most important concepts that enterprises should be dealing with in 2013. Data integration provides the ability to extract information from source systems, and move that information to target systems that need to act upon it in some way. As the number of systems has increased and those systems have become more complex, the need to dive deeper into data integration becomes more apparent and urgent.
Data integration allows us to approach the real-time enterprise, where all processes and systems have the ability to see into all other processes and systems, and react to optimize the business in (near) real-time. This concept has been mulled over for years, but has yet to become a reality for most enterprises.
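The extract, move, and act flow described above can be sketched in a few lines of Python. This is a minimal illustration only; the source records, field names, and transformation are purely hypothetical, not any particular product's API:

```python
# Minimal source-to-target data integration sketch (illustrative only).
# The stores and field names below are hypothetical.

def extract(source_rows):
    """Pull records out of a source system (here, just a list of dicts)."""
    return list(source_rows)

def transform(rows):
    """Normalize field names and types so the target can act on them."""
    return [
        {"customer_id": int(r["id"]), "name": r["name"].strip().title()}
        for r in rows
    ]

def load(rows, target):
    """Move the transformed records into the target system."""
    target.extend(rows)

source = [{"id": "101", "name": "  jane doe "}]
target = []
load(transform(extract(source)), target)
print(target)  # [{'customer_id': 101, 'name': 'Jane Doe'}]
```

Real integration servers add scheduling, error handling, and change capture around this same basic pattern; approaching real-time is largely a matter of how often, and how incrementally, this flow runs.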
In terms of data integration, the notion of data virtualization lets us think about collections of data or services as abstract entities. Thus the abstractions can be represented in a form that is most useful to the integration server or the data integration architect. It’s this notion of abstraction that provides for the grouping of related pieces of information. These groups are independent of their physical location and structure, and allow us to define and understand what meaningful operations can be performed on the data or services.
We leverage data virtualization for a few core reasons:
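As an illustration of the abstraction idea, here is a minimal Python sketch of a virtual entity that groups related fields from two hypothetical physical stores and resolves them in place at query time. The store layouts and names are invented for the example:

```python
# Sketch of data virtualization: a virtual "customer" entity grouping
# related fields from two hypothetical physical stores, queried in place.

crm_store = {1: {"name": "Acme Corp"}}        # physical source A
billing_store = {1: {"balance": 2500.0}}      # physical source B

class VirtualCustomer:
    """Abstract entity; consumers never see the physical layout."""
    def __init__(self, customer_id):
        self.customer_id = customer_id

    def as_record(self):
        # Resolve the abstraction against the physical sources at query time;
        # no data is copied or moved ahead of time.
        return {
            "customer_id": self.customer_id,
            "name": crm_store[self.customer_id]["name"],
            "balance": billing_store[self.customer_id]["balance"],
        }

print(VirtualCustomer(1).as_record())
# {'customer_id': 1, 'name': 'Acme Corp', 'balance': 2500.0}
```

The grouping is independent of where and how the fields are physically stored, which is exactly what lets the virtual schema define the meaningful operations on the data.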
The ability to create abstract schemas that are mapped to back-end physical databases provides a huge advantage for those enterprises looking to get their data under control. However, given the power of data virtualization, there are a few things that those in charge of data integration should know. Here are a few quick tips.
Tip 1: Start with a new schema that is decoupled from the data sources.
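Tip 1 can be sketched as follows, with a hypothetical abstract schema and per-source field mappings. Consumers work only against the abstract schema, so a source can change its field names without touching anything downstream:

```python
# Sketch of Tip 1: define the abstract schema first, then map each
# physical source into it. The schema and mappings are hypothetical.

ABSTRACT_SCHEMA = ("customer_id", "full_name", "region")

# Per-source field mappings; the abstract schema never references
# a physical source directly.
MAPPINGS = {
    "legacy_crm": {"customer_id": "CUST_NO", "full_name": "CNAME", "region": "REG"},
    "web_app":    {"customer_id": "id", "full_name": "name", "region": "locale"},
}

def to_abstract(source_name, row):
    """Project a physical row onto the decoupled abstract schema."""
    m = MAPPINGS[source_name]
    return {field: row[m[field]] for field in ABSTRACT_SCHEMA}

print(to_abstract("legacy_crm", {"CUST_NO": 7, "CNAME": "Ada", "REG": "EU"}))
# {'customer_id': 7, 'full_name': 'Ada', 'region': 'EU'}
```

Starting from the abstract schema, rather than from any one source's layout, is what keeps the back-end databases swappable later on.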
Those moving to Big Data, and that is a lot of enterprises right now, should also consider the need for data integration to support their new data platform. In many cases, the use of proper data integration procedures and technology is an afterthought. However, with a bit of planning and the right data integration technology, the transition to Big Data can be a smooth and productive one. Here are a few things to consider:
Data quality becomes even more important. Big Data systems, whether in the cloud or the data center, manage massive amounts of data, both structured and unstructured. Thus, the ability to manage data quality becomes more of a priority.
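One way to act on this is a simple validation gate in front of the load. The rules and field names below are hypothetical; real pipelines would add many more checks:

```python
# Sketch of a data quality gate in front of a Big Data load.
# The validation rules and field names are hypothetical.

def validate(record):
    """Return a list of quality problems for one record (empty = clean)."""
    problems = []
    if not record.get("id"):
        problems.append("missing id")
    if "@" not in record.get("email", ""):
        problems.append("invalid email")
    return problems

records = [
    {"id": "a1", "email": "user@example.com"},
    {"id": "",   "email": "not-an-email"},
]

clean = [r for r in records if not validate(r)]
rejected = [r for r in records if validate(r)]
print(len(clean), len(rejected))  # 1 1
```

Catching bad records before they land in the Big Data platform is far cheaper than cleansing them after they have been replicated across a cluster.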
I just came back from MicroStrategy World. There were many conversations about social, mobile, cloud and big data. There was strong interest in cloud, clear adoption of mobile, and some big data adoption. eHarmony had a great presentation about how they handle big data with Informatica, and how they’re starting to use Informatica HParser, running on Hadoop, to process JSON.
But that wasn’t the number one conversation. The one topic that everyone was interested in – and I talked to nearly 100 customers and partners over four days – was creating new reports faster, or Agile BI.
Today, agility and timely visibility are critical to the business. No wonder CIO.com states that business intelligence (BI) will be the top technology priority for CIOs in 2012. However, is your data architecture agile enough to handle these exacting demands?
In his blog Top 10 Business Intelligence Predictions For 2012, Boris Evelson of Forrester Research, Inc., states that traditional BI approaches often fall short for the two following reasons (among many others):
- BI hasn’t fully empowered information workers, who still largely depend on IT
- BI platforms, tools and applications aren’t agile enough