Tag Archives: Monitoring

Social Media Monitoring with CEP, pt. 2: Context As Important As Sentiment

When I last wrote about social media monitoring, I made a case for using a technology like Complex Event Processing (“CEP”) to detect rapidly growing and geospatially-oriented social media mentions that can provide early warning detection for the public good (Social Media Monitoring for Early Warning of Public Safety Issues, Oct. 27, 2011).

A recent article by Chris Matyszczyk of CNET highlights the often conflicting and confusing nature of monitoring social media.  A 26-year-old British citizen, Leigh Van Bryan, gearing up for a holiday of partying in Los Angeles, California (USA), tweeted in British slang his intention to have a good time:  “Free this week, for quick gossip/prep before I go and destroy America.” Since I’m not too far removed from the culture of youth, I did take this to mean partying, cutting loose, having a good time (and other not-so-current definitions). (more…)
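As a rough illustration of the earlier post's idea, here is a minimal Python sketch of the kind of windowed, geo-bucketed spike detection a CEP engine performs over a social media stream. The event fields, thresholds, and grid size are all hypothetical assumptions, not any particular CEP product's API:

```python
from collections import defaultdict, deque

# Hypothetical sketch of CEP-style spike detection: count geotagged
# mentions per grid cell in a sliding time window and flag cells whose
# rate jumps well above their recent baseline.
SPIKE_FACTOR = 5.0    # alert when current rate > 5x the baseline rate
MIN_COUNT = 20        # ignore low-volume noise

class SpikeDetector:
    def __init__(self):
        self.history = defaultdict(deque)   # cell -> counts of past windows
        self.current = defaultdict(int)     # cell -> count in current window

    def cell(self, lat, lon):
        # Bucket coordinates into roughly 0.1-degree grid cells.
        return (round(lat, 1), round(lon, 1))

    def on_mention(self, lat, lon):
        self.current[self.cell(lat, lon)] += 1

    def roll_window(self):
        """Close the current time window; return cells that spiked."""
        alerts = []
        for cell, count in self.current.items():
            past = self.history[cell]
            baseline = (sum(past) / len(past)) if past else 0.0
            if count >= MIN_COUNT and count > SPIKE_FACTOR * max(baseline, 1.0):
                alerts.append((cell, count, baseline))
            past.append(count)
            if len(past) > 12:              # keep ~1 hour of 5-minute windows
                past.popleft()
        self.current.clear()
        return alerts
```

A real CEP deployment would express the same logic declaratively (a windowed aggregation with a threshold predicate) rather than in hand-rolled Python, but the moving parts are the same: a sliding window, a geospatial key, and a baseline comparison.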

Posted in Complex Event Processing, Customer Acquisition & Retention, Customer Services, Customers, Data Integration Platform, Master Data Management, Real-Time

Social Media Monitoring for Early Warning of Public Safety Issues

A friend posted a humorous photo on Facebook the other day of a sign that read “In Case of Fire… Exit building before Tweeting about it.”  Hopefully it’s a made-up sign just to get a laugh, but the truth is actually closer than you think.  The intersection of the pervasiveness of mobile devices and the ubiquitous nature of social media is creating a tremendous amount of relevant data about the physical world that simply wasn’t available before – what is happening, where it’s happening, to whom, by whom in some cases, and when.  Not usually early adopters, public safety officials are starting to take notice.

Here’s a real-world story: (more…)

Posted in Uncategorized

Ultra Messaging Plug-In Now Available for OpenNMS Network Management Solution

OpenNMS, a provider of open source Network Management solutions for the enterprise, has just released an Ultra Messaging plug-in.

This plug-in allows OpenNMS users to manage their network resources more efficiently by enabling them to monitor Ultra Messaging network traffic. Please note that the Ultra Messaging SNMP agent is also required.

To download the OpenNMS plug-in, or read more about it, please click here.  To begin a trial of the Ultra Messaging SNMP agent, please contact your 29West-Informatica Sales representative, or send an email to info@29west.com.

Posted in Financial Services, Operational Efficiency, Ultra Messaging

“May The Force Be With You” – How To Align And Link Business Objectives To Data Quality Initiatives

“The first step in fixing a problem is to measure the size of the problem.” Agreed! Easy. The next step is harder – how to sustain high-quality data and generate ongoing business value.

Measuring data quality became possible with the advent of profiling and scorecarding features as standard functionality within enterprise data quality products. By using a combination of data quality rules, reference data and technology, you can create a report which lists the quality and percentage of records with completeness, conformity, consistency, duplicate and accuracy issues. This process is called a data quality audit. The ongoing process of scorecarding is called monitoring.
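To make the audit concrete, here is a minimal Python sketch of that process: applying data quality rules and reference data to a record set and reporting the percentage of records failing each dimension. The records, rules, and field names are invented for illustration, not taken from any particular data quality product:

```python
import re

# Hypothetical sample records for a data quality audit.
records = [
    {"id": 1, "email": "ann@example.com", "country": "US"},
    {"id": 2, "email": "",                "country": "US"},
    {"id": 3, "email": "bob@example",     "country": "USA"},
    {"id": 1, "email": "ann@example.com", "country": "US"},  # duplicate id
]

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # conformity rule
VALID_COUNTRIES = {"US", "GB", "DE"}                  # reference data

def audit(rows):
    """Return the percentage of records failing each quality dimension."""
    n = len(rows)
    incomplete = sum(1 for r in rows if not r["email"])
    nonconforming = sum(1 for r in rows
                        if r["email"] and not EMAIL_RE.match(r["email"]))
    invalid = sum(1 for r in rows if r["country"] not in VALID_COUNTRIES)
    seen, dupes = set(), 0
    for r in rows:
        dupes += r["id"] in seen
        seen.add(r["id"])
    return {k: round(100.0 * v / n, 1) for k, v in
            [("incomplete", incomplete), ("nonconforming", nonconforming),
             ("invalid_reference", invalid), ("duplicate", dupes)]}
```

Running the audit once produces the one-time report; scheduling it and charting the percentages over time is what turns the audit into monitoring.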

The next question is – what do we do now? We have the DQ metrics. How do we get the business and IT motivated to sustain high quality data? (more…)

Posted in Business Impact / Benefits, Data Quality, Profiling, Scorecarding

ITRS and 29West Announce Global Partnership, Geneos Interface

26 April 2010, London: ITRS Group plc, the leading provider of predictive, real-time systems and application management products to the world’s financial community, is pleased to announce a new global partnership with 29West. Recently acquired by Informatica (NASDAQ: INFA), 29West is the market leader in high-performance, ultra low-latency messaging solutions.

The two companies will develop an interface between the ITRS Geneos framework and 29West’s Ultra Messaging system. ITRS Geneos provides a powerful set of analytical, consolidation and correlation functions across all managed metrics, while 29West’s Ultra Messaging allows trading and market data applications to communicate with each other with ultra-low latency.

Misha Kipnis, ITRS CTO, said: “We are delighted to be working closely with 29West. The partnership was driven by demand from 29West clients for a comprehensive, professional management and monitoring system. The new interface will greatly increase efficiency and allow 29West clients to manage not only their Ultra Messaging middleware infrastructure, but also, through ITRS Geneos, their trading applications, FIX engines, market data platforms and IT infrastructures.”

Read more here…

Posted in Ultra Messaging

Better management through measuring data quality

I recently asked a customer of ours why they invested so much in monitoring and publishing key performance indicators for their data quality. “Believe it or not, the biggest reason we measure data quality is not to correct bad data” came the reply. “The reason we monitor data quality is to detect problems with our business processes.”

Indeed, as I mentioned in my last blog post, business users look to investments in people and processes, in addition to technology, to address poor data quality. For example, if a bank branch manager received a report showing that customer data originating from his branch office had a much higher incidence of duplicate entries and was putting the entire bank at risk of massive regulatory fines, he is not going to throw technology at the problem. His response might be mandatory training for tellers or better hiring practices to screen for adequate computer skills.

Experts in quality control methodology refer to this as addressing “root cause.” Common starting points of measurement involve completeness, accuracy, consistency, conformity, duplication, and integrity. Eventually, as the business culture matures its data quality practices, timeliness and data lineage (origination) are used to evaluate quality of data. Of course, software technology that automates the process of parsing, standardizing, matching and consolidating data is of immense value and is an absolute requirement in any data integration project. However, the issue of data quality goes beyond these IT projects. Ongoing measurement and monitoring of data quality provides value directly to the business because it helps them to better manage their people and processes.
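The branch-manager example above hinges on attributing the quality metric to its point of origin. A minimal Python sketch, with invented branch and customer identifiers, of computing a duplicate rate per branch so the metric points at a process rather than just at bad rows:

```python
from collections import Counter

# Hypothetical entries: (originating branch, customer identifier).
entries = [
    ("branch_A", "cust_1"), ("branch_A", "cust_1"), ("branch_A", "cust_2"),
    ("branch_B", "cust_3"), ("branch_B", "cust_4"),
]

def duplicate_rate_by_branch(rows):
    """Fraction of each branch's entries that duplicate an earlier one."""
    totals, dupes = Counter(), Counter()
    seen = set()
    for branch, cust in rows:
        totals[branch] += 1
        if (branch, cust) in seen:
            dupes[branch] += 1
        seen.add((branch, cust))
    return {b: dupes[b] / totals[b] for b in totals}
```

A per-branch rate that is persistently higher than its peers is exactly the kind of signal that suggests a root cause in people or process at that branch, not in the software.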

Posted in Data Integration, Data Quality

Is there life on Mars?

This week NASA announced that it may have discovered evidence of water flowing on the surface of Mars in the recent past. This raises the possibility of life having existed on Mars in the past, and perhaps even existing to the present day.

A lot of talent, time and money has gone into addressing the fundamental question – is there life on Mars? In addition to the two rovers on the surface, there are currently three spacecraft in operation around Mars: Mars Odyssey, Mars Reconnaissance Orbiter and Mars Express – sadly, the Mars Global Surveyor, which is responsible for this week’s exciting news, has probably suffered a severe failure and is in effect lost. Each spacecraft has been sent out to gather some basic statistics about the Red Planet, such as how the surface changes over time, the percentage of carbon dioxide at the poles, or how the temperature varies throughout the atmosphere. All of this in the hope that we move closer to a definitive answer to that fundamental question. But forget the answer: are we asking the right questions?


Posted in Data Integration, Data Quality

Information Quality & Management Transformation

I recently received an email from one of my early clients. After having worked in four different companies in four different industries, she came to a sad conclusion, writing:

“The thing that they all have in common is a desire to cut corners and deal with quality later. It takes a lot of energy to be the information quality cheerleader, and I find it discouraging and overwhelming at times. Keep writing your articles and books to encourage all the people like me who are dealing with these issues every day.” P. G.

What P. G. has experienced is, unfortunately, the norm, not the exception. There are two critical elements in this experience.


Posted in Data Integration, Data Quality

IQ in Internet and e-Business Information

“In e-Business, the Information IS the Business”
Having just completed writing a chapter on “IQ in the Internet and e-Business Environments” in my forthcoming book, Information Quality Applied: Best Practices for Improving Business Information, Processes and Systems (John Wiley & Sons), I wanted to share a few excerpts from this chapter. This is one of ten chapters focused on applying sound quality principles to the unique quality issues in various information value “circles,” such as “Prospect to Satisfied Customer” and the “Order to Cash” supply chain.
There are three categories of information in the Internet environment to which quality principles must be applied:
* Web-Based Documents and Web Content
* Data “Shared” by Internal Processes and Internet Processes
* Information Collected or Created in e-Commerce and e-Business value chains, including third party business partners
The major problem with IQ on the Internet is that business is conducted in “cyberspace” with no person “minding the store” or monitoring the e-Business transactions.
Here I will address some problems and improvements in the first category.

Posted in Data Integration, Data Quality

IQ Lessons Learned: Consumer Reports Recalls Faulty Car Seat Study

IQ in the News

Most people have probably heard about the highly reputable Consumer Reports’ recall of its flawed testing of infant car seat safety. The report, issued January 5, 2007, found that many car seats failed the high-speed side-impact test it conducted (the government requires passing frontal crash tests at 30 mph; Consumer Reports tested frontal crashes at 35 mph and side-impact crashes at 38 mph, or so they thought). The findings seemed to indicate a high degree of failure, with nine seats failing some or all of the crash tests and only two doing well in all tests.

However, the government found a problem with the way the testing was conducted. Instead of a 38 mph side crash, the test simulated a side-impact crash of over 70 mph, with results wildly inconsistent with what 38 mph tests would have produced. Consumer Reports recalled the entire report on January 18.
IQ Lessons Learned From the Consumer Reports Recall:

Negative impact on consumers and their confidence in the organization:

  • The impacts of the faulty testing were dramatic and swift. The Executive Director of the Washington State Safety Restraint Coalition exclaimed that “Consumer Reports screwed up…. They really upset people and created enormous confusion.”
  • When designing tests, as you will with IQ assessments, you must ensure you design the tests properly. Validity and accuracy are two distinctly different measurements. You can test validity by defining the business rules, valid values or ranges the data must conform to, and conducting these tests electronically with IQ assessment software or your own validity routines.

    But to measure accuracy, you must confirm that the data values correctly correspond to the characteristics of the real-world object or event the data represents. To perform this test, you must compare the data with the characteristic of the real-world object itself.

    In the case of car seats, Consumer Reports believes, rightly I think, that crash tests should be conducted at higher speeds, more representative of actual accident experience.

  • When you make a mistake, own up to it and apologize for it. Then do everything you can to ameliorate the error and its impact.

    Consumer Reports retracted the report as soon as they determined the serious problem with the study.

    Jim Guest, President of Consumer Reports, wrote, “A message to our readers” on the Consumer Reports home page, with important messages to his customers, “I took action when we discovered a mistake in our side-impact crash tests.” “We strive to be accurate and fair, and I regret this error. I want to make sure that our actions are as thorough and transparent as possible so that we preserve your trust as we continue to test, inform, and protect consumers.”

  • When you have IQ problems but must have accurate and complete data, you must pay the price of the process failure and the costs of “information scrap and rework.” Consumer Reports is retesting all of the infant car seats to provide the comparable data.
  • “Reputation” of an information provider is not a guarantee of the quality of information provided. Even the best make mistakes.

    One must error-proof one’s processes based on the root cause of failure. A better measure is the reliability of the processes to provide consistent, quality information, based on the kinds of error-proofing provided and the consistency of the process results.

  • When you have a significant IQ problem, you must analyze the root cause(s) and improve the process to prevent the root cause(s) from causing failure again.

    Consumer Reports will be conducting extensive analysis as to what went wrong in these tests to assure they will not recur. This is the same approach when we find critical IQ problems. We must conduct root cause analysis, find the root causes and improve and verify the efficacy of the improvements to prevent defect recurrence.
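The validity-versus-accuracy distinction in the lessons above can be sketched in a few lines of Python. The field, range, and tolerance are hypothetical illustrations, not rules from the actual study:

```python
# Hypothetical sketch: validity vs. accuracy for a crash-test speed field.
SPEED_RANGE = (0, 60)   # valid mph range defined by a business rule

def is_valid(speed_mph):
    # Validity: the value conforms to the defined rule.
    # This is checkable entirely in software, against the rule alone.
    lo, hi = SPEED_RANGE
    return lo <= speed_mph <= hi

def is_accurate(recorded_mph, observed_mph, tolerance=1.0):
    # Accuracy: the value matches an independent measurement of the
    # real-world event itself, which no rule check can substitute for.
    return abs(recorded_mph - observed_mph) <= tolerance

# The recalled study's flaw in miniature: a recorded 38 mph is perfectly
# valid, but inaccurate if the rig actually ran the test at over 70 mph.
```

No validity routine would have caught the car seat error, because 38 mph conformed to every rule; only comparison against the real-world test conditions revealed the problem.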

What do you think?

Posted in Data Integration, Data Quality