Tag Archives: Complex Event Processing
Several years ago, I got to participate in one of the first neural net conferences. At the time, I thought it was amazing just to be there. There were chip and software vendors galore. Many even claimed to be the next Intel or the next Microsoft. Years later I joined a complex event processing vendor. Again, I felt the same sense of excitement. In both cases, the market participants moved from large horizontal market plays to smaller and more vertical solutions.
Now to be clear, it is not my goal today to pop anyone’s big data balloon. But as I have gotten more excited about big data, I have also developed an eerie sense of déjà vu. The fact is, the more I dig into big data and hear customers’ stories about what they are trying to do with it, the more concerned I become about the similarities between big data, neural nets, and complex event processing.
Clearly, big data does offer some interesting new features. And big data takes advantage of other market trends, including virtualization and cloud. By doing so, big data achieves far greater scalability than traditional business intelligence processing and storage. At the same time, big data offers the potential for lowering cost, but I should take a moment to stress the word potential. The reason is that while a myriad of processing approaches have been developed, no standard has yet emerged. Early adopters complain about the difficulty of hiring big data MapReduce programmers. And just like neural nets, the programming that needs to be done is turning out to be application specific.
With this said, it should be clear that big data does offer the potential to test datasets and to discover new and sometimes unexpected data relationships. This is a real positive. However, like its predecessors, this work is application specific, and the data being related varies widely in quality and detail. This means the best that big data can do as a technology movement is discover potential data relationships. Once this is done, meaning can only be created by establishing detailed data relationships and dealing with the varying quality of the data sets within the big data cluster.
Big Data will become small for management analysis
This means that big data must become small in order to really solve customer problems. Judith Hurwitz puts it this way: “big data analysis is really about small data. Small data, therefore, is the product of big data analysis. Big data will become small so that it is easier to comprehend”. What is “more necessary than ever is the capability to analyze the right data in a timely enough fashion to make decisions and take actions”. Judith says that in the end what is needed is quality data that is consistent, accurate, reliable, complete, timely, reasonable, and valid. The critical point is that whether you use MapReduce processing or traditional BI means, you shouldn’t throw out your data integration and quality tools. As big data becomes smaller, they will in fact become increasingly important.
So how does Judith see big data evolving? Judith sees big data propelling a lot of new small data. She believes that “getting the right perspective on data quality can be very challenging in the world of big data. With a majority of big data sources, you need to assume that you are working with data that is not clean”. Judith says that we need to accept the fact that a lot of noise will exist in data, and that “it is by searching and pattern matching that you will be able to find some sparks of truth in the midst of some very dirty data”. Judith suggests, therefore, a two-phase approach: 1) look for patterns in big data without concern for data quality; and 2) after you locate your patterns, apply the same data quality standards that have been applied to traditional data sources.
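Judith’s two-phase approach can be sketched in a few lines of code: discover patterns across raw, uncleaned records first, then apply traditional quality rules only to the records behind the patterns that matter. This is a minimal illustration, not anyone’s actual pipeline; all of the records, field names, and rules below are invented.

```python
from collections import Counter

# Hypothetical raw "big data" records -- noisy and of uneven quality.
raw_events = [
    {"product": "widget", "region": "west", "amount": 19.99},
    {"product": "widget", "region": "west", "amount": None},  # missing value
    {"product": "gadget", "region": "", "amount": 5.00},      # missing region
    {"product": "widget", "region": "west", "amount": 24.50},
    {"product": "gadget", "region": "east", "amount": -1.0},  # implausible
]

# Phase 1: pattern discovery -- count co-occurrences with NO quality filtering.
patterns = Counter((e["product"], e["region"]) for e in raw_events)
candidates = [pair for pair, n in patterns.items() if n >= 2]

# Phase 2: apply traditional data quality rules, but only to the records
# behind the patterns we located in phase 1.
def is_clean(e):
    return bool(e["region"]) and e["amount"] is not None and e["amount"] > 0

clean = [e for e in raw_events
         if (e["product"], e["region"]) in candidates and is_clean(e)]
```

Note how the dirty records still contribute to pattern discovery in phase 1; they are only excluded once we narrow down to the small, decision-ready data set.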
For this reason, I believe that history will, to a degree, repeat itself. Clearly, the big data emperor does have his clothes on, but big data will become smaller and more vertical. Big data will become about relationship discovery, and small data will become about quality analysis of data sources. In sum, small data analysis is focused and provides the data for business decision making, while big data analysis is broad and is about discovering what data potentially relates to what. I know this is a bit different from the hype, but it is realistic and makes sense. Remember, in the end, you will still need what business intelligence has refined.
Unlike some of my friends, I truly enjoyed history in high school and college. I particularly appreciated biographies of favorite historical figures because they painted a human face on the past and gave it meaning and color. I also vowed at that time to navigate my life and future by the principle attributed to Harvard professor Jorge Agustín Nicolás Ruiz de Santayana y Borrás (George Santayana): “Those who cannot remember the past are condemned to repeat it.”
So that’s a little ditty regarding my history regarding history.
Fast-forwarding now to the present, in which I have carved out my career in technology, and in particular enterprise software, I’m afforded a great platform where I talk to lots of IT and business leaders. When I do, I usually ask them, “How are you implementing advanced projects that help the business become more agile or effective or opportunistically proactive?” They usually answer something along the lines of “this is the age and renaissance of data science and analytics” and then end up talking exclusively about their meat-and-potatoes business intelligence software projects and how 300 reports now run their business.
Then, when I probe deeper into their answers, I am once again reminded of THE history quote and think to myself that there’s an amusing irony at play here. When I think about the Business Intelligence systems of today, most are designed to “remember” and report on the historical past through large data warehouses of a gazillion transactions, along with basic but numerous shipping and billing histories and maybe assorted support records.
But when it comes right down to it, business intelligence “history” is still just that. Nothing is really learned and applied right when and where it counts – when it would have made all the difference had the company been able to react in time.
So, in essence, by using standalone BI systems as they are designed today, companies are indeed condemned to repeat what they have already learned because they are too late – so the same mistakes will be repeated again and again.
This means the challenge for BI is to reduce latency, measure the pertinent data, sensors, and events, and get scalable – extremely scalable – and flexible enough to handle the volume and variety of the forthcoming data onslaught.
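To make the latency point concrete, here is a minimal, hypothetical sketch of event-at-a-time processing: a sliding-window counter that raises an alert the moment a threshold is crossed, rather than surfacing the spike in next month’s report. The class name, window size, and threshold are invented for illustration.

```python
from collections import deque

class SlidingWindowAlert:
    """React as events arrive, instead of rediscovering the spike
    later in a batch report."""

    def __init__(self, window_secs=60, threshold=100):
        self.window_secs = window_secs
        self.threshold = threshold
        self.events = deque()  # timestamps of recent events

    def on_event(self, ts):
        """Record one event; return True if the window just exceeded
        the threshold, i.e. it is time to act."""
        self.events.append(ts)
        # Evict events that have fallen out of the time window.
        while self.events and ts - self.events[0] > self.window_secs:
            self.events.popleft()
        return len(self.events) > self.threshold
```

For example, with a 10-second window and a threshold of 3 events, the fourth event inside the window triggers an alert immediately, and the alert clears once the burst ages out of the window.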
There’s a part 2 to this story, so keep an eye out for my next blog post, History Repeats Itself (Part 2).
In this video, Richard Cramer, chief healthcare strategist, Informatica, talks about the opportunities for healthcare organizations to move data forward. He touches on relationship analytics, master data management (MDM), data quality and Complex Event Processing (CEP). He specifically answers the following questions:
- What are some of the major opportunities for healthcare organizations to move data forward?
- What technology is most poised to deliver benefits to healthcare organizations today?
- How can Informatica help healthcare organizations in their quest to deliver proactive medicine?
One person’s “real-time” is another person’s “fast enough”. (Or is it vice versa?) I’ve been in the Complex Event Processing (CEP) space for almost five years, and nothing gets this industry more spun up than heated discussions about “feeds and speeds” - the fastest products, the lowest latencies, the greatest event volumes, the most events per second and so on. (more…)
When something is really good, you’ve got to keep using it. Last year, the award-winning Comedy Central show South Park featured a character called “Captain Hindsight.” I won’t describe it; just watch it and then please return to finish reading this blog. Here’s a link to a site with the perfect clip.
You may not know that Complex Event Processing already plays a role in the Super Bowl, specifically in aiding law enforcement to ensure situational awareness for security measures. In 2003, Michael Lewis penned the best seller “Moneyball: The Art of Winning an Unfair Game,” which showed the dramatic role information and new analytic approaches can play in fielding a winning baseball team at a (relative) bargain price. It’s only natural for similar thinking to occur in other sports. Here are five places that CEP can play in the Super Bowl (not all of them serious).
1) Super Bowl Security – Some agencies already leverage CEP technology to gain situational awareness about activities happening within close geographic proximity of the big game. Everything from traffic conditions, to shipping logistics and hazardous material routes, to air traffic and regional crime patterns is used to gain situational threat awareness. (more…)
After much cajoling, I convinced the CEO of North Pole Global to give some insight into how his organization uses CEP to optimize operations and maximize productivity. For years, he’s been unwilling to say a word in fear of giving away a big competitive advantage to his competitors like Hanukkah Harry Inc. and Tooth Fairy Limited. This year, because I’ve been extra good, the big guy has allowed me to ask a few questions, so here goes.
Me: Has anyone actually asked for the Snuggie for Dogs?
Santa: Strangely enough, my wife has requested one but we don’t have a dog.
Me: OK, enough with the warm ups. You are a huge user of CEP. Can you tell me how you were introduced to it?
Santa: My CIO, Shorty McTechie, brought it to my attention, and to be honest I sat through the first five minutes daydreaming about snickerdoodles. I had to interrupt him and tell him to get to the point. He said not to think about it as some complex technology, but that CEP stood for Complete Elf Productivity. That got my attention for sure. I love the little guys, but sometimes they are stuck in their old ways. I wanted new ideas, new processes, and most of all adaptability.
Me: What do you mean by adaptability? (more…)
I can’t believe it. I had a bad experience on a United flight and I’m going to say something nice about them. They did right by me because they were proactive. Have you ever heard an airline even say “sorry” for a bad experience?
It never happened to me until the other day. Flying back from DC to San Francisco, my armrest audio controls didn’t work, so I couldn’t listen to and enjoy the inflight entertainment – the movie Eat Pray Love and shows like House. I was frustrated, and I got even more irate when I thought about how hard it might be to lodge a complaint online with the airline. (more…)