Ivan Chong

The Usefulness of Things

As a Tesla owner, I recently had the experience of calling Tesla service after a yellow warning message appeared on the center console of my car: “Check tire pressure system. Call Tesla Service.” While still on the freeway, I voice-dialed Tesla with my iPhone and was in touch with a service representative within minutes.

Me: A yellow warning message just appeared on my dash and also the center console.

Tesla rep: Yes, I see – is it the tire pressure warning?

Me: Yes – do I need to pull into a gas station?  I haven’t had to visit a gas station since I purchased the car.

Tesla rep:  Well, I also see that you are traveling on a freeway that has some steep elevation – it’s possible the higher altitude is affecting your car’s tires temporarily until the pressure equalizes.  Let me check your tire pressure monitoring sensor in a half hour.  If the sensor still detects a problem, I will call you and give further instructions.

As it turned out, the warning message disappeared after ten minutes and everything was fine for the rest of the trip. However, the episode served as a reminder that the world will be much different with the advent of the Internet of Things. Just as humans connected with mobile phones become more productive, machines and devices connected to the network become more useful. In this case, a connected automobile allowed the service rep to remotely access vehicle data, read the tire pressure sensor along with the vehicle's location and elevation, and suggest a course of action. This example is fairly basic compared to the opportunities afforded by networked devices/machines.
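
To make the idea concrete, here is a minimal sketch of the kind of triage logic such a service might run. The thresholds, field names, and settling period are illustrative assumptions, not Tesla's actual implementation.

```python
# A minimal sketch (not Tesla's actual logic) of how a telemetry service
# might suppress transient tire-pressure alerts during elevation changes.
from dataclasses import dataclass

@dataclass
class Reading:
    tire_pressure_psi: float   # reported by the TPMS sensor
    elevation_m: float         # from the vehicle's GPS

LOW_PRESSURE_PSI = 36.0        # hypothetical alert threshold
ELEVATION_DELTA_M = 500.0      # a climb large enough to distort readings

def triage(current: Reading, trip_start: Reading) -> str:
    """Decide whether a low-pressure warning needs immediate action."""
    if current.tire_pressure_psi >= LOW_PRESSURE_PSI:
        return "ok"
    # A steep climb since the trip began suggests the reading may
    # equalize on its own; re-check before dispatching service.
    if current.elevation_m - trip_start.elevation_m > ELEVATION_DELTA_M:
        return "recheck in 30 minutes"
    return "advise driver to add air"

print(triage(Reading(35.0, 2100.0), Reading(41.5, 300.0)))  # recheck in 30 minutes
```

The point is simply that pairing a raw sensor alert with location context turns a warning light into an informed recommendation.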

In addition to remote servicing, there are several other use case categories that offer great potential, including:

  • Preventative Maintenance – monitor usage data and increase the overall uptime for machines/devices while decreasing the cost of upkeep. e.g., Tesla runs remote diagnostics on vehicles and can identify potential problems before they occur.
  • Realtime Product Enhancements – analyze product usage data and deliver improvements quickly in response. e.g., Tesla delivers software updates that improve the usability of the vehicle based on analysis of owner usage.
  • Higher Efficiency in Business Operations – analyze consolidated enterprise transaction data with machine data to identify opportunities to achieve greater operational efficiency. e.g., Tesla deployed waves of new fast charging stations (known as superchargers) based upon analyzing the travel patterns of its vehicle owners.
  • Differentiated Product/Service Offerings – deliver a new class of applications that operate on correlated data across a broad spectrum of sources (HINT for Tesla: a trip planning application that estimates energy consumption and recommends charging stops would be really cool…)

In each case, machine data is integrated with other data (traditional enterprise data, vehicle owner registration data, etc.) to create business value. Just as important as the connectivity of the devices and machines is the ability to integrate the data. Several Informatica customers have begun investing in M2M (aka Internet of Things) infrastructure, and Informatica technology has been critical to their efforts. US Xpress utilizes mobile sensors on its vast fleet of trucks, and Informatica delivers the ability to consolidate, cleanse and integrate the data they collect.
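
As a rough illustration of what "consolidate, cleanse and integrate" means in practice, here is a small sketch that standardizes raw telemetry and enriches it with registration data. All field names and values are hypothetical, not US Xpress's or Informatica's actual schema.

```python
# Illustrative only: joining raw machine telemetry with enterprise
# registration data after a simple cleansing step.
telemetry = [
    {"vin": " 5YJ3E1EA7KF000001 ", "odometer_km": "48210", "fault": "TPMS_LOW"},
    {"vin": "5YJ3E1EA7KF000002", "odometer_km": "n/a", "fault": None},
]
registrations = {
    "5YJ3E1EA7KF000001": {"owner": "A. Driver", "region": "CA"},
    "5YJ3E1EA7KF000002": {"owner": "B. Hauler", "region": "TX"},
}

def cleanse(record: dict) -> dict:
    """Standardize identifiers and coerce numeric fields."""
    record["vin"] = record["vin"].strip().upper()
    try:
        record["odometer_km"] = int(record["odometer_km"])
    except (TypeError, ValueError):
        record["odometer_km"] = None   # flag bad readings instead of guessing
    return record

# Integrate: enrich each cleansed telemetry record with owner data,
# so a fault can be routed to the right customer and region.
for rec in map(cleanse, telemetry):
    enriched = {**rec, **registrations.get(rec["vin"], {})}
    if enriched.get("fault"):
        print(f"dispatch to {enriched['region']}: {enriched}")
```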

My recent episode with Tesla service was a simple, yet eye-opening experience. With more and more machines and devices becoming wirelessly connected, and with the ability to integrate the tremendous volumes of data they generate, this example is only a small hint of more interesting things to come.

The Importance Of AddressDoctor To Informatica Customers

This week, we announced the acquisition of AddressDoctor, the market leader in global address validation with coverage for over 200 countries and territories. This is another example of how we are continually working to deliver the most advanced data quality products to our customers. AddressDoctor provides an address validation engine which is already fully integrated into Informatica Data Quality. This acquisition is simply another step towards market leadership for Informatica in the enterprise data quality market. In his blog, Rob Karel from Forrester referred to our vision of pervasive data quality: supporting all roles, all applications, all data domains and all stages of the data integration lifecycle.
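
For readers unfamiliar with address validation, the sketch below shows the concept in miniature: standardize the input, check it against postal reference data, and report a verification status. The interface and toy reference table are hypothetical, not AddressDoctor's actual API.

```python
# Conceptual sketch of what an address-validation step does; a real
# engine validates against postal reference data for 200+ countries.
REFERENCE = {  # toy stand-in for a postal reference database
    ("DE", "80331"): "München",
    ("US", "94063"): "Redwood City",
}

def validate(street: str, postal_code: str, city: str, country: str) -> dict:
    """Standardize an address and check it against reference data."""
    canonical_city = REFERENCE.get((country.upper(), postal_code))
    return {
        "street": street.strip().title(),
        "postal_code": postal_code,
        "city": canonical_city or city,
        "country": country.upper(),
        "status": "corrected" if canonical_city and canonical_city != city
                  else "verified" if canonical_city else "unverifiable",
    }

print(validate("2100 seaport blvd", "94063", "Redwod City", "us"))
# The misspelled city is corrected to "Redwood City", status "corrected".
```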

Let’s focus on “all data domains” and how the AddressDoctor acquisition supports this key criterion for successful enterprise data quality.

Bad Data Impacting the Economy

Alan Greenspan stated last week that poor data quality is part of the reason for today’s financial crisis. As many businesses have already learned, databases are only as accurate as the information fed into them.

Olympic Competition and the Global Economy

A few years back, Thomas Friedman wrote a best-seller, The World is Flat. He spoke about how the world is becoming increasingly interconnected, specifically in business. Have you recently wondered just how quickly this notion of globalization is spreading? If you’ve been watching the 2008 Summer Olympics, you may come to the conclusion that globalization is occurring at a rate even faster than Friedman had predicted.

Communicating the Value of Data Quality

Many of our customers express frustration that even though it is quite obvious how their business suffers from poor data quality, they find it difficult to convince their associates to invest in initiatives that correct the problems.

Earlier this year, we participated in Rob Karel’s Forrester research that addresses this issue. The resulting research paper is titled “A Truism for Trusted Data: Think Big, Start Small” and it’s getting a lot of interest. Recently, there was a nice writeup in Intelligent Enterprise where they interviewed Rob and also made mention of the Data Quality ROI calculator that we’ve developed by working alongside our customers.

The article states:

While Forrester is often suspicious of vendor-supplied calculators, the research firm lists Informatica as an example of a vendor that has taken an approach that matches Forrester’s bottom-up strategy. The Informatica Data Quality ROI Calculator enables customers “to capture and visualize the benefits of a data quality investment before that investment is made,” Forrester said.
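
To give a feel for the bottom-up approach, here is a toy calculation of the sort such a calculator might perform. Every figure is a made-up assumption that a team would replace with its own measurements; this is not Informatica's actual calculator.

```python
# A toy bottom-up data quality ROI calculation - illustrative only.
records_per_year = 1_000_000
error_rate = 0.05               # share of records with a quality defect
cost_per_error = 12.0           # e.g., returned mail, rework, lost sales
expected_error_reduction = 0.6  # fraction of defects the initiative fixes
investment = 250_000.0          # software, services, staff time

annual_benefit = (records_per_year * error_rate
                  * cost_per_error * expected_error_reduction)
roi = (annual_benefit - investment) / investment

print(f"annual benefit: ${annual_benefit:,.0f}, first-year ROI: {roi:.0%}")
# annual benefit: $360,000, first-year ROI: 44%
```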

The report is available from Forrester’s web site. It contains some nice examples of how customers have built a business case for justifying investment in Data Quality. If you would like to share some previous successes in building a Data Quality ROI, feel free to post a comment!

Growing in a Tough Economy

At Informatica, we take great delight in the success of our customers. Just today, in spite of the economic downturn, KPN announced that its Q2 revenues increased 22% and raised its outlook for the remainder of the year. To what do they attribute their results? Solid execution and successful integration of acquired companies. Watch the KPN CFO give his take on CNBC.

At our recent customer conference, KPN was the overall winner of the annual Informatica Innovation Awards. They made strategic investments in Informatica products to achieve their business objectives.

“Informatica products and services have been critical to driving shareholder value through improved customer service. To realize our Strategic Innovation goals, we built our Customer Data Cleansing platform with Informatica; it gives us real-time access and cleansing of our customer data. Our nomination for an Innovation Award signifies an industry acknowledgement of the competitive advantage this project has brought to KPN.”
- Jan Muchez, CIO, KPN

At Informatica, we focus a great deal on how our customers link key business imperatives to the application of data integration and data quality technology. What an excellent example from KPN – congratulations to them on a great quarter!

National Security vs. Privacy Rights – the Role for Technology

I ran across an interesting article concerning the US initiative to broker data exchange with various EU nations. The intent is to gain greater access to information that would help in the global war on terror.

European governments are entering into these agreements much more readily than they were four, five years ago, because concerns about terrorism are no longer confined to one side of the Atlantic.

The article then highlights the concerns over violation of personal privacy rights and the potential for abuse.

The agreement, which was described by two European officials, also allows for the transmission of “personal data revealing racial or ethnic origin, political opinion or religious or other beliefs, trade union membership or information concerning health and sexual life” in cases where they are “particularly relevant to the purposes of this agreement.” It defines personal data as “any information relating to an identified or identifiable natural person.”

The technology challenge can often be so consuming that we devote scarce attention to the ethical issues involved. Data integration and identity resolution technology are continually advancing. By factoring ethical and moral considerations into the development of the technology, we should be able to support both objectives. Privacy and security do not necessarily have to be requirements that trade off against each other. In terms of identity resolution, the technology easily supports masking of personal attributes: match results can be delivered independent of the conditions which trigger the match, and personal data used for matching can be stored in a transient manner and safeguarded against open access. I’m sure we can debate the efficacy of the technology towards these objectives, but at the very least, we should include technology in the debate.
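
As one concrete illustration of masking, the sketch below matches on salted one-way hashes of personal attributes, so the matching service never sees the raw values. It handles only exact matches after normalization; real identity resolution adds fuzzy techniques, but the masking principle is the same. This is an illustrative sketch, not any product's API.

```python
# Sketch of privacy-preserving matching: each party masks its own
# records, and the matcher sees only opaque tokens, never raw names.
import hashlib
import hmac

SECRET_SALT = b"rotate-me-regularly"  # held by the data owner, not the matcher

def mask(value: str) -> str:
    """One-way keyed hash of a normalized personal attribute."""
    normalized = value.strip().lower()
    return hmac.new(SECRET_SALT, normalized.encode(), hashlib.sha256).hexdigest()

watchlist_token = mask("Jane Q. Public")
candidate_token = mask("  jane q. public ")

print(watchlist_token == candidate_token)  # True: a match, no names revealed
```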

Data Governance at MIT

I just gave a presentation at MIT’s Information Quality conference, hosted at the Sloan School of Management. Data Governance largely deals with softer topics like people, organizational strategies, and processes, not necessarily technology. The irony was not lost on anyone that a presentation given at MIT stressed that technology alone would not solve a company’s data quality problems.

It was a real privilege and honor for me to return as a lecturer to some of the same classrooms I attended as a student. MIT’s Sloan School is right next to the Media Lab, where I did undergraduate research some twenty years ago. The most profound takeaway from my time as an engineering student was the notion that technology alone could not solve hard problems. Back in 1986, we were experimenting with sending images and video over the network, and the professors were always stressing that social and organizational considerations factored heavily into technology adoption. This may sound obvious to grizzled IT veterans, but to the wide-eyed geeks studying at MIT, it came as quite a revelation. Certainly, this is the underlying driver behind Data Governance: it is the necessary framework that lets the enterprise leverage and apply data quality, data integration, and metadata management technology.

The presentation covered several case studies involving successful customer deployments of enterprise-wide data governance programs. Many attendees commented that they found it necessary to score initial wins on tactical projects in order to gain credibility and navigate the political issues behind an enterprise deployment. There was certainly some vigorous discussion and debate on this topic.

What experience have you had with implementing a data governance program?  Just like these MIT students, feel free to share your opinions with us.

Microsoft Buys Zoomix

It's been rumored for a while, but now it is official – Microsoft has announced an agreement to buy a data quality startup company, Zoomix, for the purpose of enhancing SQL Server.

Microsoft plans to add Zoomix's technology to future releases of its SQL Server database, the company said through its public relations firm. Zoomix said its development team will join the SQL Server team at Microsoft's research and development center in Israel.

While this is not a large transaction for Microsoft, the move does underscore the importance of Data Quality. However, it raises an interesting question: who should you trust to deliver data quality? The people who brought you Vista? The folks who sold you SAP? At first glance, it seems quite convenient to be able to deal with data quality issues in conjunction with specific source systems. However, many IT experts would claim this approach is merely a stop-gap measure. Data must be managed apart from its host systems. Data Quality rules start to truly add value to the business when they span MS SQL Server, SAP, Oracle, and beyond. It's still a topic of debate, but the discussion has moved beyond the question of "is data quality software useful?" to "where is the most useful place to deliver data quality software?"
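
The sketch below illustrates the point: a single quality rule defined once and applied uniformly to records regardless of which system they came from. The source names and records are illustrative, and connection details are omitted.

```python
# Sketch of the "manage quality apart from host systems" idea: one rule
# definition applied uniformly to records from several sources.
import re

def valid_email(record: dict) -> bool:
    """A single reusable rule, independent of where the record lives."""
    return bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[a-zA-Z]{2,}",
                             record.get("email", "")))

sources = {
    "sql_server_crm": [{"email": "a@example.com"}, {"email": "bad-address"}],
    "sap_orders":     [{"email": "b@example.org"}],
    "oracle_billing": [{"email": ""}],
}

for name, rows in sources.items():
    failures = sum(not valid_email(r) for r in rows)
    print(f"{name}: {failures} of {len(rows)} records fail the email rule")
```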

Feel free to post your opinions!

Can Data Quality Solve World Hunger?

If you ever find yourself discussing the benefits of data quality for your business and one of your associates asks rhetorically, "Yes, but can it solve world hunger?" you now have an answer for them.

The Food and Agriculture Organization of the United Nations records the level of completeness for data collection from each member nation. On their website, their stated mission is to work towards "a world without hunger." A key element in their fight against hunger is the FAOSTAT database, and a key means of maintaining the efficacy of that data is their data quality dashboard.

For organizations working with the FAO, it's important that the data be accurate – otherwise perishable goods may be wasted, shipped to locations whose populations are not suffering from malnutrition. This example highlights something I've seen very often in the context of enterprise data quality initiatives. Many prospective customers come to us and ask, "How do we get started, given the complexities of coordinating across multiple organizations inside our company?" Within the Informatica customer base, there are many examples of successful initiatives that started with Data Quality metrics and dashboards. The metrics offer a great way for organizations to maintain a dialog on how to prioritize their investment in data quality.
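
For the curious, here is a minimal sketch of the kind of completeness metric such a dashboard might track, in the spirit of the FAO example. The countries and figures are invented for illustration.

```python
# Sketch of a completeness metric for a data quality dashboard:
# what fraction of expected fields did each reporter actually submit?
submissions = {
    "Country A": {"wheat_tonnes": 1200, "rice_tonnes": 800,  "maize_tonnes": None},
    "Country B": {"wheat_tonnes": None, "rice_tonnes": None, "maize_tonnes": 300},
}

def completeness(record: dict) -> float:
    """Fraction of expected fields actually reported."""
    return sum(v is not None for v in record.values()) / len(record)

for country, record in submissions.items():
    print(f"{country}: {completeness(record):.0%} complete")
# Country A: 67% complete
# Country B: 33% complete
```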

Already, I've received email comments on my posting. "Can Data Quality allow us to live longer? Facilitate the exploration of outer space?" Great questions… stayed tuned for future postings!
