Tag Archives: Data Integration Platform
The article cites some research from Ovum, which predicts that many enterprises will begin moving toward data integration, driven largely by the rise of cloud computing and big data. However, enterprises need to invest both in modernizing their existing data management infrastructure and in data integration technology. “All of these new investments will push the middleware software market up 9 percent to a $16.3 billion industry, Information Management reports.” This projection is for 2015.
I suspect that’s a bit conservative. In my travels, I see much more interest in data integration strategies, approaches, and technology as cloud computing continues to grow and enterprises better understand the strategic use of data. So, I would put the growth at 15 percent for 2015.
There are many factors driving this growth, beyond mere interest in cloud computing and big data.
The first consideration is that data is more strategic than initially understood. While businesses have always considered data a huge asset, it has not been until the last few years that businesses have seen the true value of understanding what’s going on inside and outside of their business.
Manufacturing companies want to see the current state of production, as well as production history. Management can now use that data to predict trends to address, such as future issues around employee productivity, or even a piece of equipment that is likely to fail and the impact of that failure on revenue. Healthcare companies are learning how to better monitor patient health, such as spotting likely health problems before they are diagnosed, or leveraging large data sets to understand when patterns emerge around health issues, such as areas of the country that are more prone to asthma based upon air quality.
Second, there is the need to deal with compliance issues. The new health care regulations, and even the new regulations around managing a publicly traded company, impose a great many data management requirements, including data integration.
As these laws emerge and are altered over time, the reporting requirements become ever more complex and far-reaching. Those who want to avoid fines, or even stock drops around mistakes, are paying close attention to this area.
Finally, there is an expectation from customers and employees that you will have a good handle on your data. Ten years ago, you could tell a customer on the phone that you needed to check different systems to answer their question. Those days are over. Today’s customers and employees want immediate access to the data they need, and there is no good excuse for not being able to produce that data. If you can’t, your competition will.
The interest in data integration will experience solid growth in 2015, around cloud and big data, for sure. However, other factors will drive this growth, and enterprises will finally understand that data integration is core to an IT strategy, and should never be an afterthought.
My first job out of college was to figure out how to get devices that monitored and controlled an advanced cooling and heating system to communicate with a centralized, automated control center. We ended up building custom PCs for the application, running a version of Unix (DOS would not cut it). Mounted in industrial cases, the PCs communicated with the temperature and humidity sensors, and turned fans and dampers on and off.
At the end of the day, this was a data integration problem, not an engineering problem, that we were attempting to solve. The devices had to talk to the PCs, and the PCs had to talk to a centralized system (a mainframe) that was able to receive the data, as well as use that data to determine what actions to take. For instance, the system could determine that 78 degrees was too warm for a clean room, open a damper and turn on a fan to reduce the temperature, and then turn them off when the temperature returned to normal.
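The control rule described above can be sketched as a simple threshold check. This is an illustrative reconstruction, not the original code: the 78-degree limit comes from the example, while the function and its interface are hypothetical.

```python
# Illustrative sketch of the clean-room control rule described above:
# above a threshold, open the damper and turn on the fan; once the
# temperature returns to normal, turn both off again.

MAX_TEMP_F = 78  # the clean room's upper limit, per the example above

def decide_cooling(current_temp_f, cooling_active):
    """Return the next cooling state given a sensor reading.

    True  -> damper open, fan on
    False -> damper closed, fan off
    """
    if current_temp_f > MAX_TEMP_F:
        return True            # too warm: start cooling
    if cooling_active:
        return False           # back at or below the limit: stop cooling
    return False               # already idle: stay idle

if __name__ == "__main__":
    print(decide_cooling(80, False))  # True: start cooling
    print(decide_cooling(75, True))   # False: temperature is back to normal
```

In a real deployment the interesting work is not this rule but the integration around it: moving readings from the sensors to the system that evaluates them, which is exactly the point of the story.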
Back in the day, we had to create and deploy custom drivers and software. These days, most devices have well-defined interfaces, or APIs, that developers and data integration tools can access to gather information from that device. We also have high-performing networks. Much like any source or target system, these devices produce data that is typically bound to a structure, and that data can be consumed and restructured to meet the needs of the target system.
For instance, data coming off a smart thermostat in your home may be in the following structure:
Device (char 10)
Date (char 8)
Temp (num 3)
You’re able to access this device using an API (typically a REST-based web service), which returns a single chunk of data bound to that structure.
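For illustration, here is a hypothetical raw record in the fixed-width layout above (Device: char 10, Date: char 8, Temp: num 3), parsed with a small Python sketch. The field slicing follows the declared widths; the sample values are invented.

```python
# Hypothetical fixed-width record matching the layout above:
#   Device: char 10, Date: char 8, Temp: num 3
raw = "THERMO0001" "20150115" "072"

def parse_record(record):
    """Slice a fixed-width device record into its named fields."""
    return {
        "device": record[0:10].strip(),   # char 10
        "date":   record[10:18],          # char 8 (YYYYMMDD assumed)
        "temp":   record[18:21],          # num 3, still a string here
    }

print(parse_record(raw))
# {'device': 'THERMO0001', 'date': '20150115', 'temp': '072'}
```

Note that everything is still character data at this point; turning it into native types is the job of the next step.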
Then you can transform the structure into something that’s native to the target system that receives this data, as well as translate the data itself (e.g., converting the data from characters to numbers). This is where data integration technology makes money for you, given its ability to deal with the complexity of translating and transforming the information that comes off the device, so it can be placed in a system or data store that’s able to monitor, analyze, and react to this data.
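The translate-and-transform step might look like the sketch below, using Python's standard library. The target field names and types are hypothetical; a real data integration tool would do this mapping declaratively rather than in hand-written code.

```python
# Sketch of the "translate and transform" step described above: the
# character fields from the device record are converted into native
# types for a hypothetical target system.
from datetime import datetime

def transform(parsed):
    """Convert string fields from the device record into target-native types."""
    return {
        "device_id":    parsed["device"],
        "reading_date": datetime.strptime(parsed["date"], "%Y%m%d").date(),
        "temp_f":       int(parsed["temp"]),   # e.g. "072" -> 72
    }

record = {"device": "THERMO0001", "date": "20150115", "temp": "072"}
print(transform(record))
```

The point is not the few lines of conversion logic but that this mapping must be maintained for every device type and every target schema, which is the complexity an integration platform absorbs for you.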
This is really what the IoT is all about: the ability to have devices spin out data that is leveraged to make better use of those devices. The possibilities are endless as to what can be done with that data, and how we can better manage these devices. Data integration is key. Trust me, it’s much easier to integrate with devices these days than it was back in the day.
Thank you for reading about Data Integration with Devices! Editor’s note: For more information on data integration, consider downloading “Data Integration for Dummies.”
On November 10, Informatica made history with the launch of Informatica 9. In my mind, being a SOA enthusiast, another equally significant event transpired – the birth of SOA-based Data Services – transformational SOA data integration that can revive your enterprise architecture.
So, what exactly are SOA-based Data Services and why am I so excited?
Being information driven is as much an organizational commitment as it is a technology commitment. The following outlines the major components required to be information driven.
Data Governance (DG) – DG is the overarching program for a data driven company. In my last blog, I defined DG as the practice of managing data as a corporate asset across the enterprise. It involves the processes, policies, standards, organization, and technologies required to manage and ensure the availability, accessibility, quality, consistency, auditability, and security of data in a company or institution.
Who should own DG? In most things we do, we look for the single point of accountability. In this instance, I recommend a collective structure of senior business managers who are accountable for the data subjects that drive your business. Additionally, I suggest a senior member of IT who can drive change across IT systems.
This week, we announced the acquisition of AddressDoctor, the market leader in global address validation with coverage for over 200 countries and territories. This is another example of how we are continually working to deliver the most advanced data quality products to our customers. AddressDoctor provides an address validation engine which is already fully integrated into Informatica Data Quality. This acquisition is simply another step towards market leadership for Informatica in the enterprise data quality market. Rob Karel from Forrester referred to our vision of pervasive data quality – supporting all roles, all applications, all data domains and all stages of the data integration lifecycle – in his blog.
Let’s focus on “all data domains” and how the AddressDoctor acquisition supports this key criterion for successful enterprise data quality.
As I discussed before, it’s not enough to walk through a functional checklist for a data integration platform. It’s important to make sure that it works in the right way. In my last posting, I discussed the concept of a “unified” platform and its implications for the user experience.
The second key aspect of how a data integration platform works is its openness—how much it is designed to work with the broader IT environment. Data integration, by definition, touches a large portion of the IT environment, which could mean thousands of different applications and data sources in large organizations. Moreover, it’s not just the systems inside the firewall you need to be concerned with.
In most cases, it’s important to also support integration with the systems of B2B partners such as customers, suppliers, and distributors, as well as any SaaS providers. And the platform has to support any technology standards, such as those for operating systems or databases, that have been instituted. Frankly, there’s not much use in a data integration platform that isn’t designed to work with as broad a range of applications and systems as possible.
If you say the words “data integration“, different people may think of different things. Some think of ETL (extract-transform-load) tools. Others think of enterprise application integration (EAI) technologies or message brokers. But in most cases, regardless of which tool leaps to mind, people think about how data is integrated inside of an organization, or enterprise data integration.
In other words, how data is shared between different applications and systems inside the firewall. But this is just one aspect or realm within the broader data integration discipline, albeit an important one and generally the one most people start with.
Technology vendors like to talk about platforms, because platforms imply a broader footprint both in terms of functional capabilities and in terms of implementation usage. Platforms also sound more “strategic,” even if the practical implications are vague. But the term “platform” can also be simple marketing hype. How do you know when a software “platform” is really a platform? More specifically, do data integration platforms exist now?