Tag Archives: Data Virtualization
Speed is the top challenge facing IT today, and it’s reaching crisis proportions at many organizations. Specifically, IT needs to deliver business value at the speed that the business requires.
The challenge does not end there: this has to be accomplished without compromising cost or quality. Many people have argued that you only get two out of three on the Speed/Cost/Quality triangle, but I believe that achieving all three is the central challenge facing Enterprise Architects today. Many people I talk to are looking at agile technologies, and in particular Agile Data Integration.
There have been a lot of articles written about the challenges, but it’s not all doom and gloom. Here is something you can do right now to dramatically increase the speed of your project delivery while improving cost and quality at the same time: take a fresh look at your Agile Data Integration environment, and specifically at Data Virtualization. Data Virtualization offers the opportunity to simplify and speed up the data part of enterprise projects, and this is the place where more and more projects are spending 40% or more of their time. For more information and an industry perspective, you can download the latest Forrester Wave report for Data Virtualization, Q1 2015.
Here is a quick example of how you can use Data Virtualization technology for rapid prototyping to speed up business value delivery:
- Use data virtualization technology to present a common view of your data to your business-IT project teams.
- IT and business can collaborate in real time to access and manage data from a wide variety of very large data sources, eliminating the long, slow cycles of passing specifications back and forth between business and IT.
- Your teams can discover, profile, and manage data using a single virtual interface that hides the complexity of the underlying data.
- By working with a virtualization layer, you are assured that your teams are using the right data: data that can be verified by linking it to a Business Glossary with clear terms, definitions, owners, and business context, reducing the chance of misunderstandings and errors.
- Leading offerings in this space include data quality and data masking tools in the interface, ensuring that you improve data quality in the process.
- Data virtualization means your teams can deliver in days rather than months, and faster delivery means lower cost.
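To make the first point concrete, here is a purely illustrative miniature of what “one common view over multiple sources” means: two differently-shaped sources presented through a single virtual schema, so consumers never touch the underlying layouts. All source names and fields below are hypothetical, not any vendor’s API.

```python
# Source A: a CRM-style record layout (hypothetical)
crm_rows = [
    {"cust_id": 1, "cust_nm": "Acme Corp", "region_cd": "EMEA"},
]

# Source B: a billing system with its own layout (hypothetical)
billing_rows = [
    {"customer": 2, "name": "Globex", "territory": "AMER"},
]

def virtual_customers():
    """Present both sources through one common virtual schema."""
    for r in crm_rows:
        yield {"id": r["cust_id"], "name": r["cust_nm"], "region": r["region_cd"]}
    for r in billing_rows:
        yield {"id": r["customer"], "name": r["name"], "region": r["territory"]}

# Business and IT both query the virtual view, never the sources directly.
print(sorted(row["name"] for row in virtual_customers()))
# → ['Acme Corp', 'Globex']
```

The point of the sketch is that adding or swapping a source only changes the mapping inside the view; every consumer of `virtual_customers` is unaffected.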
There has been a lot of interest in agile development, especially as it relates to data projects. Data Virtualization is a key tool to accelerate your team in this direction.
Informatica has a leading position in the Forrester report, not only for the productivity of its Agile Data Integration environment but also for its integration with the rest of the Informatica platform. From an architect’s point of view, it is critical to start standardizing on an enterprise data management platform. Continuing data and data-tool fragmentation will only slow down future project delivery. The best way to deal with the growing complexity of both data and tools is to drive standardization within your organization.
According to Doug Henschen, Executive Editor at InformationWeek, “Despite the weak economy and zero growth in many IT salary categories, business intelligence (BI), analytics, information-integration and data warehousing professionals are seeing a slow-but-steady rise in income.” (more…)
If you don’t understand application semantics (simply put, the meaning of data), then you have no hope of creating the proper data integration solution. I’ve been stating this fact since the 1990s, and it has proven correct over and over again.
Just to be clear: you must understand the data to define the proper integration flows and transformation scenarios, and to provide service-oriented frameworks, meaning levels of abstraction, to your data integration domain. This applies both to the movement of data from source to target systems and to the abstraction of data using data virtualization approaches and technology, such as the technology offered by the host of this blog. (more…)
Data integration has always been a core technology in the world of healthcare. The latest trend is to leverage data integration technology as an emerging key strategic advantage within healthcare systems. The objective is to improve care through the use of existing and new information, and thus provide the ability to be more proactive, saving money and saving lives. (more…)
Data integration is one of the most important concepts that enterprises should be dealing with in 2013. Data integration provides the ability to extract information from source systems and move it to target systems that need to act upon the information in some way. As the number of systems has increased and they have become more complex, the need to dive deeper into data integration becomes more apparent and urgent.
Data integration allows us to approach the real-time enterprise, where all processes and systems have the ability to see into all other processes and systems, and react to optimize the business in (near) real-time. This concept has been mulled over for years, but has yet to become a reality for most enterprises. (more…)
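The extract-and-move idea above can be reduced to a toy sketch (hypothetical tables and fields, not a real integration tool): pull records from a source system, transform them, and deliver them to a target system that reacts to the new information.

```python
# Hypothetical source system: raw order records as strings
source_orders = [
    {"order_id": "A-1", "amount_usd": "19.99", "status": "SHIPPED"},
    {"order_id": "A-2", "amount_usd": "5.00",  "status": "PENDING"},
]

# Hypothetical target system: a revenue store that acts on new records
target_revenue = []

def integrate(rows):
    """Extract, filter, transform, and load into the target."""
    for row in rows:
        if row["status"] == "SHIPPED":               # filter: only recognized revenue
            target_revenue.append({
                "order": row["order_id"],
                "amount": float(row["amount_usd"]),  # transform: string -> number
            })

integrate(source_orders)
print(target_revenue)  # → [{'order': 'A-1', 'amount': 19.99}]
```

In the (near) real-time enterprise described above, the same flow would be triggered continuously as source records arrive, rather than in periodic batches.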
In terms of data integration, the notion of data virtualization lets us think about collections of data or services as abstract entities. Thus the abstractions can be represented in a form that is most useful to the integration server or the data integration architect. It’s this notion of abstraction that provides for the grouping of related pieces of information. These groups are independent of their physical location and structure, and allow us to define and understand what meaningful operations can be performed on the data or services.
We leverage data virtualization for a few core reasons: (more…)
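The abstraction described above can be sketched in a few lines: a catalog maps a logical group name to physical locations the consumer never sees, and the group itself declares which operations are meaningful on it. All names and locations are illustrative assumptions, not a real product’s catalog format.

```python
# Logical groups are independent of physical location and structure.
catalog = {
    "customer_data": {
        "members": ["oracle://crm/customers", "s3://archive/customers.csv"],
        "operations": {"read", "profile"},   # meaningful operations on this group
    },
    "audit_log": {
        "members": ["hdfs://logs/audit"],
        "operations": {"read"},              # e.g. append-only, no profiling allowed
    },
}

def can(group, op):
    """Check whether an operation is meaningful for a logical group."""
    return op in catalog[group]["operations"]

print(can("customer_data", "profile"))  # → True
print(can("audit_log", "profile"))      # → False
```

Because consumers address `customer_data` rather than the Oracle table or the S3 file, the physical members can move or change structure without breaking anything downstream.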
The ability to create abstract schemas that are mapped to back-end physical databases provides a huge advantage for those enterprises looking to get their data under control. However, given the power of data virtualization, there are a few things that those in charge of data integration should know. Here are a few quick tips.
Tip 1: Start with a new schema that is decoupled from the data sources. (more…)
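Tip 1 can be illustrated with a small sketch (hypothetical names throughout): define the virtual schema first, in business terms, and bind it to physical sources through a separate mapping. Swapping a source then only touches the mapping, never the schema or its consumers.

```python
# 1. The decoupled virtual schema, defined in business terms.
virtual_schema = {"customer": ["id", "name", "region"]}

# 2. A separate mapping from virtual fields to physical columns.
#    Replacing legacy_db later means editing only this dictionary.
mapping = {
    "customer": {
        "source": "legacy_db.cust_master",
        "fields": {"id": "CUST_NO", "name": "CUST_NM", "region": "RGN_CD"},
    }
}

def translate_query(entity, fields):
    """Rewrite a query against the virtual schema into physical names."""
    m = mapping[entity]
    cols = ", ".join(m["fields"][f] for f in fields)
    return f"SELECT {cols} FROM {m['source']}"

print(translate_query("customer", ["id", "name"]))
# → SELECT CUST_NO, CUST_NM FROM legacy_db.cust_master
```

Real data virtualization servers do far more (joins across sources, caching, security), but the decoupling shown here is the heart of the tip: consumers bind to the virtual entity, not the physical table.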
Those moving to Big Data, and that is a lot of enterprises right now, should also consider the need for data integration to support their new data platform. In many cases, the use of proper data integration procedures and technology is an afterthought. However, with a bit of planning and the right data integration technology, the transition to Big Data can be a smooth and productive one. Here are a few things to consider:
Data quality becomes even more important. Big Data systems, whether they run in the cloud or in the data center, manage massive amounts of data, both structured and unstructured. Thus, the ability to manage data quality becomes more of a priority. (more…)
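One common pattern for managing quality at this scale can be sketched as follows (the rules and field names are illustrative assumptions): validate each record during integration, and route failures to a quarantine area for review instead of letting them pollute the Big Data platform.

```python
quarantine = []  # failed records, held for review instead of loading

# Illustrative validation rules; real deployments would manage these centrally.
rules = {
    "email": lambda v: isinstance(v, str) and "@" in v,
    "age":   lambda v: isinstance(v, int) and 0 <= v <= 130,
}

def check_quality(record):
    """Return True if the record passes all rules; quarantine it otherwise."""
    failures = [f for f, ok in rules.items() if f in record and not ok(record[f])]
    if failures:
        quarantine.append({"record": record, "failed": failures})
        return False
    return True

good = check_quality({"email": "a@example.com", "age": 41})
bad  = check_quality({"email": "not-an-email", "age": 200})
print(good, bad, len(quarantine))  # → True False 1
```

Applied per record in the integration flow, this keeps quality enforcement in one place rather than scattered across every downstream consumer.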
I just came back from MicroStrategy World. There were many conversations about social, mobile, cloud and big data. There was strong interest in cloud, clear adoption of mobile, and some big data adoption. eHarmony had a great presentation about how they handle big data with Informatica, and how they’re starting to use Hadoop with Informatica HParser running on Hadoop for processing JSON.
But that wasn’t the number one conversation. The one topic that everyone was interested in – and I talked to nearly 100 customers and partners over four days – was creating new reports faster, or Agile BI. (more…)