Tag Archives: Complex Event Processing

Analytics Stories: A Case Study from Quintiles

As I have shared in other posts in this series, businesses are using analytics to improve their internal and external facing business processes and to strengthen their “right to win” in the markets in which they operate. For pharmaceutical businesses, strengthening the right to win begins and ends with the drug product development lifecycle. I remember, for example, talking several years ago to the CFO of a major pharmaceutical company and having him tell me that his most important financial metrics had to do with reducing the time to market for a new drug and maximizing the period of patent protection. Clearly, the faster a pharmaceutical company gets a product to market, the faster it can begin earning a return on its investment.

Fragmented data challenged analytical efforts

At Quintiles, what the business needed was a system with the ability to optimize the design, execution, quality, and management of clinical trials. Management’s goal was to dramatically shorten the time to complete each trial, including quickly identifying when a trial should be terminated. At the same time, management wanted to comply continuously with regulatory scrutiny from the Food and Drug Administration and to proactively monitor and manage notable trial events.

The problem was that Quintiles’ data was fragmented across multiple systems, and this delayed the ability to make business decisions. Like many organizations, Quintiles had data locked in multiple incompatible legacy systems, which meant extensive manual data manipulation before the data could become useful. These incompatible legacy systems also impeded data integration and normalization, and prohibited a holistic view across all sources. Making matters worse, management felt that it lacked the ability to take corrective actions in a timely manner.

Infosario launched to address Quintiles’ analytical challenges

To address these challenges, Quintiles leadership launched the Infosario Clinical Data Management Platform to power its pharmaceutical product development process. Infosario breaks down the silos of information that had limited combining the massive quantities of scientific and operational data collected during clinical development with tens of millions of real-world patient records and population data. This step empowered researchers and drug developers to unlock a holistic view of data, improving decision-making and ultimately increasing the probability of success at every step in a product’s lifecycle. Quintiles Chief Information Officer Richard Thomas says, “The drug development process is predicated upon the availability of high quality data with which to collaborate and make informed decisions during the evolution of a product or treatment.”

What Quintiles has succeeded in doing with Infosario is integrating the data and processes associated with a drug’s lifecycle. This includes creating a data engine to collect, clean, and prepare data for analysis. The data is then combined with clinical research data and information from other sources to feed a set of predictive analytics, all aimed at improving business outcomes.

The Infosario solution consists of several core elements

At its core, Infosario provides the data integration and data quality capabilities for extracting and organizing clinical and operational data. The approach combines and harmonizes data from multiple heterogeneous sources into what is called the Infosario Data Factory repository, with the aim of accelerating reporting. Infosario leverages data federation/virtualization technologies to acquire information from disparate sources in a timely manner without affecting the underlying foundational enterprise data warehouse. It also implements rule-based, real-time intelligent monitoring and alerting so the business can tweak and enhance business processes as needed. A “monitoring and alerting layer” sits on top of the data, with the facility to rapidly deliver intelligent alerts to the appropriate stakeholders regarding trial-related issues and milestone events. Here are some more specifics on the components of the Infosario solution:

• Data Mastering provides the capability to link multiple domains of data. This enables enterprise information assets to be actively managed, with an integrated view of hierarchies and relationships.

• Data Management provides the high-performance, scalable data integration needed to support enterprise data warehouses and critical operational data stores.

• Data Services provides the ability to combine data from multiple heterogeneous data sources into a single virtualized view. This allows Infosario to utilize data services to accelerate delivery of needed information.

• Complex Event Processing manages the critical task of monitoring enterprise data quality events and delivering alerts to key stakeholders so they can take necessary action (a minimal sketch of such a rule follows this list).
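
To make the monitoring-and-alerting layer concrete, here is a minimal, hypothetical sketch of the kind of rule such a layer might evaluate. The event fields, window, threshold, and notify() hook are all invented for illustration; this is a sketch of the pattern, not Infosario’s actual API.

```python
# Hypothetical CEP-style rule: watch a stream of clinical data quality events
# and alert a stakeholder when missing-value errors for one trial site exceed
# a threshold within a sliding one-hour window. All names are illustrative.
from collections import deque, defaultdict

WINDOW_SECONDS = 3600   # one-hour sliding window (assumed)
MAX_ERRORS = 25         # alert threshold per site per window (assumed)

site_errors = defaultdict(deque)

def notify(site, count):
    # Stand-in for whatever alert channel a real deployment would use.
    print(f"ALERT: {count} data quality errors at site {site} in the past hour")

def on_quality_event(event):
    """Process one event: a dict with 'site', 'kind', and 'ts' (epoch seconds)."""
    if event["kind"] != "missing_value":
        return
    errors = site_errors[event["site"]]
    errors.append(event["ts"])
    # Evict timestamps that have aged out of the sliding window.
    while errors and event["ts"] - errors[0] > WINDOW_SECONDS:
        errors.popleft()
    if len(errors) > MAX_ERRORS:
        notify(event["site"], len(errors))
```

A production engine would express the same rule declaratively and scale it across many streams, but the append-evict-check cycle is the essential shape of intelligent alerting.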

Parting Thoughts

According to Richard Thomas, “The drug development process rests on high quality data being used to make informed decisions during the evolution of a product or treatment. Quintiles’ Infosario clinical data management platform provides researchers and drug developers with the knowledge needed to improve decision-making and ultimately increase the probability of success at every step in a product’s lifecycle.” This enables enhanced data accuracy, timeliness, and completeness. On the business side, it has enabled Quintiles to establish industry-leading information and insight, which in turn has enabled faster, more informed decisions and action based on those insights. Importantly, this has led to a faster time to market and a lengthening of the period of patent protection.

Related links

Related Blogs

Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform

Author Twitter: @MylesSuer

Is Big Data Destined To Become Small And Vertical?

Several years ago, I got to participate in one of the first neural net conferences. At the time, I thought it was amazing just to be there. There were chip and software vendors galore. Many even claimed to be the next Intel or the next Microsoft. Years later I joined a complex event processing vendor. Again, I felt the same sense of excitement. In both cases, the market participants moved from large horizontal market plays to smaller and more vertical solutions.

A sense of deja vu

Now, to be clear, it is not my goal today to pop anyone’s big data balloon. But as I have gotten more excited about big data, I have also gotten an increasingly eerie sense of deja vu. The fact is, the more I dig into big data and hear customers’ stories about what they are trying to do with it, the more concern I have about the similarities between big data and neural nets and complex event processing.

Big Data offers new features

Clearly, big data does offer some interesting new features. And big data does take advantage of other market trends, including virtualization and cloud. By doing so, big data achieves orders of magnitude more scalability than traditional business intelligence processing and storage. At the same time, big data offers the potential for lowering cost, but I should take a moment to stress the word potential. The reason is that while a myriad of processing approaches have been developed, no standard has yet emerged, and early adopters complain about the difficulty of hiring big data MapReduce programmers. And just like neural nets, the programming that needs to be done is turning out to be application specific. A minimal sketch below illustrates why.
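
To see what “application specific” means in practice, here is a framework-free sketch of the MapReduce pattern in plain Python. The record schema and the per-site count are invented for illustration; a real Hadoop job would distribute the two phases across a cluster, but both functions still encode domain logic that cannot simply be reused for the next application.

```python
# A toy MapReduce: count (hypothetical) clinical-trial events per site.
from collections import defaultdict

def map_phase(records):
    # Emit (key, value) pairs; the choice of key is pure application logic.
    for record in records:
        yield record["site"], 1

def reduce_phase(pairs):
    # Group by key and aggregate; the aggregation is equally app specific.
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

if __name__ == "__main__":
    sample = [{"site": "Boston"}, {"site": "Boston"}, {"site": "Austin"}]
    print(reduce_phase(map_phase(sample)))  # {'Boston': 2, 'Austin': 1}
```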

With this said, it should be clear that big data does offer the potential to test datasets and to discover new and sometimes unexpected data relationships. This is a real positive. However, like its predecessors, this work is application specific, and the data being related is truly of differing quality and detail. This means that the best big data can do as a technology movement is discover potential data relationships. Once this is done, meaning can only be created by establishing detailed data relationships and dealing with the varying quality of the datasets within the big data cluster.

Big Data will become small for management analysis

This means that big data must become small in order to really solve customer problems. Judith Hurwitz puts it this way: “Big data analysis is really about small data. Small data, therefore, is the product of big data analysis. Big data will become small so that it is easier to comprehend.” What is “more necessary than ever is the capability to analyze the right data in a timely enough fashion to make decisions and take actions.” Judith says that in the end what is needed is quality data that is consistent, accurate, reliable, complete, timely, reasonable, and valid. The critical point is that whether you use MapReduce processing or traditional BI means, you shouldn’t throw out your data integration and quality tools. As big data becomes smaller, these will in reality become increasingly important.

So how does Judith see big data evolving? Judith sees big data propelling a lot of new small data. Judith believes that “getting the right perspective on data quality can be very challenging in the world of big data. With a majority of big data sources, you need to assume that you are working with data that is not clean.” Judith says that we need to accept the fact that a lot of noise will exist in data, and that “it is by searching and pattern matching that you will be able to find some sparks of truth in the midst of some very dirty data.” Judith suggests, therefore, a two-phase approach: 1) look for patterns in big data without concern for data quality; and 2) after you locate your patterns, apply the same data quality standards that have been applied to traditional data sources. The sketch below illustrates the idea.
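
Here is a minimal sketch of that two-phase approach, with made-up records and rules: phase 1 pattern-matches over raw, noisy data without cleansing it; phase 2 applies a traditional quality standard (here, a simple completeness check) only to the records the pattern surfaced.

```python
# Phase 1: find patterns in dirty data. Phase 2: apply quality rules to hits.
import re

# Raw, noisy records; no cleansing yet (hypothetical data).
raw_records = [
    {"id": 1, "text": "ERROR pump failure at plant 7", "ts": "2013-02-01"},
    {"id": 2, "text": "all systems nominal", "ts": None},  # dirty: missing timestamp
    {"id": 3, "text": "error PUMP failure plant 7", "ts": "2013-02-01"},
]

# Phase 1: cheap pattern matching across everything, quality ignored.
pattern = re.compile(r"pump failure", re.IGNORECASE)
candidates = [r for r in raw_records if pattern.search(r["text"])]

# Phase 2: conventional quality standards, applied only to the candidates.
def passes_quality(record):
    return record["ts"] is not None  # e.g. a completeness check

clean_hits = [r for r in candidates if passes_quality(r)]
print([r["id"] for r in clean_hits])  # [1, 3]
```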

History will repeat itself

For this reason, I believe that history will, to a degree, repeat itself. Clearly, the big data emperor does have his clothes on, but big data will become smaller and more vertical. Big data will become about relationship discovery, and small data will become about quality analysis of data sources. In sum, small data analysis is focused and provides the data for business decision making, while big data analysis is broad and is about discovering what data potentially relates to what data. I know this is a bit different from the hype, but it is realistic and makes sense. Remember, in the end, you will still need what business intelligence has refined.

History Repeats Itself Through Business Intelligence (Part 1)

Unlike some of my friends, I truly enjoyed history as a subject in high school and college. I particularly appreciated biographies of favorite historical figures because they painted a human face on the past and gave it meaning and color. I also vowed at that time to navigate my life and future under the principle attributed to Harvard professor Jorge Agustín Nicolás Ruiz de Santayana y Borrás (George Santayana): “Those who cannot remember the past are condemned to repeat it.”

So that’s a little ditty about my history with history.

Fast-forwarding now to the present, in which I have carved out my career in technology, and in particular enterprise software, I’m afforded a great platform where I talk to lots of IT and business leaders. When I do, I usually ask them, “How are you implementing advanced projects that help the business become more agile, effective, or opportunistically proactive?” They usually answer something along the lines of “this is the age and renaissance of data science and analytics” and then end up talking exclusively about their meat-and-potatoes business intelligence software projects and how 300 reports now run their business.

Then, when I probe and hear their answers in more depth, I am once again reminded of THE history quote and think to myself that there’s an amusing irony at play here. When I think about the business intelligence systems of today, most are designed to “remember” and report on the historical past through large data warehouses holding a gazillion transactions, along with basic but numerous shipping and billing histories and maybe assorted support records.

But when it comes right down to it, business intelligence “history” is still just that. Nothing is really learned and applied right when and where it counts, when it would have made all the difference had the company been able to react in time.

So, in essence, by using standalone BI systems as they are designed today, companies are indeed condemned to repeat the past because the lessons arrive too late, so the same mistakes will be repeated again and again.

This means the challenge for BI is to reduce latency, measure the pertinent data, sensors, and events, and get scalable: extremely scalable, and flexible enough to handle the volume and variety of the forthcoming data onslaught.

There’s a part 2 to this story, so keep an eye out for my next blog post, History Repeats Itself (Part 2).

Opportunities for Healthcare Organizations to Move Data Forward

In this video, Richard Cramer, chief healthcare strategist at Informatica, talks about the opportunities for healthcare organizations to move data forward. He touches on relationship analytics, master data management (MDM), data quality, and Complex Event Processing (CEP). He specifically answers the following questions:

– What are some of the major opportunities for healthcare organizations to move data forward?
– What technology is most poised to deliver benefits to healthcare organizations today?
– How can Informatica help healthcare organizations in their quest to deliver proactive medicine?

Social Media Monitoring with CEP, pt. 2: Context As Important As Sentiment

When I last wrote about social media monitoring, I made a case for using a technology like Complex Event Processing (“CEP”) to detect rapidly growing and geospatially-oriented social media mentions that can provide early warning detection for the public good (Social Media Monitoring for Early Warning of Public Safety Issues, Oct. 27, 2011).
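
For a concrete feel of that detection pattern, here is a minimal sliding-window sketch. The window size, threshold, and event shape are assumptions for illustration, not how a production CEP engine is configured.

```python
# Count geotagged mentions per region over a sliding window; a spike is the
# early-warning signal. Window, threshold, and inputs are assumed values.
from collections import deque, defaultdict

WINDOW_SECONDS = 300    # 5-minute sliding window (assumed)
SPIKE_THRESHOLD = 50    # mentions per window that trigger an alert (assumed)

mention_windows = defaultdict(deque)  # region -> timestamps of recent mentions

def on_mention(region, timestamp):
    """Process one geotagged mention; return True if the region is spiking."""
    window = mention_windows[region]
    window.append(timestamp)
    # Drop mentions that have aged out of the window.
    while window and timestamp - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) >= SPIKE_THRESHOLD
```

Each incoming mention feeds on_mention(); a True return is the cue to notify whoever is watching that region.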

A recent article by Chris Matyszczyk of CNET highlights the often conflicting and confusing nature of monitoring social media. A 26-year-old British citizen, Leigh Van Bryan, gearing up for a holiday of partying in Los Angeles, California (USA), tweeted in British slang his intention to have a good time: “Free this week, for quick gossip/prep before I go and destroy America.” Since I’m not too far removed from the culture of youth, I did take this to mean partying, cutting loose, having a good time (and other not-so-current definitions). (more…)

The Reality of Real-Time: When Fast Enough Can Make You Real Money

One person’s “real-time” is another person’s “fast enough”. (Or is it vice versa?) I’ve been in the Complex Event Processing (CEP) space for almost five years, and nothing gets this industry more spun up than heated discussions about “feeds and speeds” – the fastest products, the lowest latencies, the greatest event volumes, the most events per second and so on. (more…)

Social Media Monitoring for Early Warning of Public Safety Issues

A friend posted a humorous photo on Facebook the other day of a sign that read “In Case of Fire… Exit building before Tweeting about it.” Hopefully it’s a made-up sign just to get a laugh, but the truth is actually closer than you think. The intersection of the pervasiveness of mobile devices and the ubiquitous nature of social media is creating a tremendous amount of relevant data about the physical world not previously seen: what is happening, where it’s happening, to whom, by whom in some cases, and when. Not usually early adopters, public safety officials are starting to take notice.

Here’s a real-world story: (more…)

Informatica 9.1 Can Help You Be Captain Proactive

When something is really good, you’ve got to keep using it, and last year the award-winning Comedy Central show South Park featured a character called “Captain Hindsight.” I won’t describe it; just watch it, and please return to finish reading this blog. Here’s a link to a site with the perfect clip.

As part of Informatica’s 9.1 launch, we are introducing a new offering called “Proactive PowerCenter Monitoring.” This will actually be an option for PowerCenter. (more…)

Super Bowl MVP Of The Future

You may not know that Complex Event Processing already plays a role in the Super Bowl, specifically in aiding law enforcement to ensure situational awareness for security measures. In 2003, Michael Lewis penned the best seller “Moneyball: The Art of Winning an Unfair Game,” which showed the dramatic role information and new analytic approaches can play in fielding a winning baseball team at a (relative) bargain price. It’s only natural for similar thinking to occur in other sports. Here are five places CEP can play in the Super Bowl (not all serious).

1) Super Bowl Security – Some agencies already leverage CEP technology to gain situational awareness about activities happening within close geographic proximity of the big game. Everything from traffic conditions, to shipping logistics and hazardous material routes, to air traffic and regional crime patterns is used to gain situational threat awareness. (more…)

How Santa Uses CEP – Elf Productivity, Real-Time Naughty And Nice Rating

After much cajoling, I convinced the CEO of North Pole Global to give some insight into how his organization uses CEP to optimize operations and maximize productivity. For years, he’s been unwilling to say a word for fear of giving away a big competitive advantage to competitors like Hanukkah Harry Inc. and Tooth Fairy Limited. This year, because I’ve been extra good, the big guy has allowed me to ask a few questions, so here goes.

Me:  Has anyone actually asked for the Snuggie for Dogs?

Santa: Strangely enough, my wife has requested one but we don’t have a dog.

Me: OK, enough with the warm-ups. You are a huge user of CEP. Can you tell me how you were introduced to it?

Santa: My CIO, Shorty McTechie, brought it to my attention, and to be honest I sat through the first five minutes daydreaming about snickerdoodles. I had to interrupt him and tell him to get to the point. He said not to think of it as some complex technology but that CEP stood for Complete Elf Productivity. That got my attention for sure. I love the little guys, but sometimes they are stuck in their old ways. I wanted new ideas, new processes, and most of all adaptability.

Me: What do you mean by adaptability? (more…)
