Tag Archives: Data Integration Platform
My first job out of college was to figure out how to get devices that monitored and controlled an advanced cooling and heating system to communicate with a centralized and automated control center. We ended up building custom PCs for the application, running a version of Unix (DOS would not cut it). Mounted in industrial cases, the PCs communicated with the temperature and humidity sensors, and turned fans and dampers on and off.
At the end of the day, this was a data integration problem, not an engineering problem, that we were attempting to solve. The devices had to talk to the PCs, and the PCs had to talk to a centralized system (a mainframe) that was able to receive the data, as well as use that data to determine what actions to take. For instance, the ability to determine that 78 degrees was too warm for a clean room, so a damper had to be opened and a fan turned on to reduce the temperature, and then turned off when the temperature returned to normal.
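That control rule can be sketched in a few lines. This is a minimal illustration, not the original system: the 78-degree limit comes from the example above, but the 72-degree "normal" set point and the function interface are invented for the sketch.

```python
# Hypothetical control rule: cooling kicks in at the 78-degree limit
# and shuts off once the temperature returns to an assumed set point.
TOO_WARM_F = 78  # too warm for the clean room (from the example above)
NORMAL_F = 72    # assumed "back to normal" set point (invented here)

def control_step(temp_f: float, cooling_on: bool) -> bool:
    """Return whether the fan/damper should be on after this reading."""
    if temp_f >= TOO_WARM_F:
        return True        # too warm: open the damper, turn on the fan
    if temp_f <= NORMAL_F:
        return False       # back to normal: turn everything off
    return cooling_on      # in between: keep the current state

# Cooling turns on at 78, stays on through 75, and stops at 72.
state = False
for reading in (76, 78, 75, 72):
    state = control_step(reading, state)
```

The in-between band keeps the fan from rapidly cycling on and off as the temperature hovers near the limit.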
Back in the day, we had to create and deploy custom drivers and software. These days, most devices have well-defined interfaces, or APIs, that developers and data integration tools can access to gather information from that device. We also have high-performing networks. Much like any source or target system, these devices produce data which is typically bound to a structure, and that data can be consumed and restructured to meet the needs of the target system.
For instance, data coming off a smart thermostat in your home may be in the following structure:
Device (char 10)
Date (char 8)
Temp (num 3)
You’re able to access this device using an API (typically a REST-based Web Service), which returns a single chunk of data that is bound to the structure.
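Given the fixed-width structure above (Device char 10, Date char 8, Temp num 3), a returned chunk can be sliced apart by position. The device ID, date, and temperature values below are hypothetical, invented purely for illustration:

```python
# Hypothetical raw chunk returned by the thermostat's API, laid out
# per the structure above: Device (char 10), Date (char 8), Temp (num 3).
raw = "THERM-0001" + "20240115" + "078"

# Slice the fixed-width record into its three fields.
device = raw[0:10]   # char 10
date   = raw[10:18]  # char 8
temp   = raw[18:21]  # num 3 (still characters at this point)

print(device, date, temp)  # THERM-0001 20240115 078
```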
Then you can transform the structure into something that’s native to the target system that receives this data, as well as translate the data (e.g., converting the data from characters to numbers). This is where data integration technology makes money for you, given its ability to deal with the complexity of translating and transforming the information that comes off the device, so it can be placed in a system or data store that’s able to monitor, analyze, and react to this data.
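A minimal sketch of that transform-and-translate step, with assumptions flagged: the target field names are invented, the date is assumed to be YYYYMMDD, and the sample record is hypothetical.

```python
from datetime import datetime

def transform(raw: str) -> dict:
    """Translate a fixed-width thermostat record (Device char 10,
    Date char 8, Temp num 3) into a structure assumed to be native
    to the target system; the field names here are illustrative."""
    return {
        "device_id": raw[0:10].strip(),
        # Assumes the 8-char date is YYYYMMDD; normalize to ISO format.
        "reading_date": datetime.strptime(raw[10:18], "%Y%m%d").date().isoformat(),
        # Translate the temperature from characters to a number.
        "temp_f": int(raw[18:21]),
    }

record = transform("THERM-0001" + "20240115" + "078")
print(record["temp_f"])  # 78
```

The target system then receives typed, restructured data it can act on, rather than a raw character string.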
This is really what the IoT is all about: the ability to have devices spin out data that is leveraged to make better use of those devices. The possibilities are endless as to what can be done with that data, and how we can better manage these devices. Data integration is key. Trust me, it’s much easier to integrate with devices these days than it was back in the day.
Thank you for reading about Data Integration with Devices! Editor’s note: For more information on Data Integration, consider downloading “Data Integration for Dummies”
On November 10, Informatica made history with the launch of Informatica 9. In my mind, being a SOA enthusiast, another equally significant event transpired – the birth of SOA-based Data Services – transformational SOA data integration that can revive your enterprise architecture.
So, what exactly are SOA-based Data Services and why am I so excited?
Being information driven is as much an organizational commitment as it is a technology commitment. The following outlines the major components required to be information driven.
Data Governance (DG) – DG is the overarching program for a data driven company. In my last blog, I defined DG as the practice of managing data as a corporate asset across the enterprise. It involves the processes, policies, standards, organization, and technologies required to manage and ensure the availability, accessibility, quality, consistency, auditability, and security of data in a company or institution.
Who should own DG? In most things we do, we look for the single point of accountability. In this instance, I recommend a collective structure of senior business managers who are accountable for the data subjects that drive your business. Additionally, I suggest a senior member of IT who can drive change across IT systems. (more…)
This week, we announced the acquisition of AddressDoctor, the market leader in global address validation with coverage for over 200 countries and territories. This is another example of how we are continually working to deliver the most advanced data quality products to our customers. AddressDoctor provides an address validation engine which is already fully integrated into Informatica Data Quality. This acquisition is simply another step towards market leadership for Informatica in the enterprise data quality market. Rob Karel from Forrester referred in his blog to our vision of pervasive data quality – supporting all roles, all applications, all data domains, and all stages of the data integration lifecycle.
Let’s focus on “all data domains” and how the AddressDoctor acquisition supports this key criterion for successful enterprise data quality. (more…)
As I discussed before, it’s not enough to walk through a functional checklist for a data integration platform. It’s important to make sure that it works in the right way. In my last posting, I discussed the concept of a “unified” platform and its implications for the user experience.
The second key aspect of how a data integration platform works is its openness—how much it is designed to work with the broader IT environment. Data integration, by definition, touches a large portion of the IT environment, which could mean thousands of different applications and data sources in large organizations. Moreover, it’s not just the systems inside the firewall you need to be concerned with.
In most cases, it’s important to also support integration with the systems of B2B partners such as customers, suppliers, distributors, etc., as well as any SaaS partners. And the platform has to support any technology standard such as those for operating systems or databases which have been instituted. Frankly, there’s not much use in a data integration platform that isn’t designed to work with as broad a range of applications and systems as possible. (more…)
If you say the words “data integration“, different people may think of different things. Some think of ETL (extract-transform-load) tools. Others think of enterprise application integration (EAI) technologies or message brokers. But in most cases, regardless of which tool leaps to mind, people think about how data is integrated inside of an organization, or enterprise data integration.
In other words, how data is shared between different applications and systems inside the firewall. But this is just one aspect or realm within the broader data integration discipline, albeit an important one and generally the one most people start with.
Technology vendors like to talk about platforms, because platforms imply a broader footprint both in terms of functional capabilities and in terms of implementation usage. Platforms also sound more “strategic,” even if the practical implications are vague. But the term “platform” can also be simple marketing hype. How do you know when a software “platform” is really a platform? More specifically, do data integration platforms exist now?