Unlike some of my friends, I truly enjoyed History as a subject in high school and college. I particularly appreciated biographies of favorite historical figures because they put a human face on the past and gave it meaning and color. I also vowed at that time to navigate my life and future by the principle attributed to the Harvard professor Jorge Agustín Nicolás Ruiz de Santayana y Borrás (George Santayana): “Those who cannot remember the past are condemned to repeat it.”
So that’s a little ditty about my history with history.
Fast-forwarding to the present, where I have carved out my career in technology, and in particular enterprise software, I’m afforded a great platform to talk with lots of IT and business leaders. When I do, I usually ask them, “How are you implementing advanced projects that help the business become more agile, effective, or opportunistically proactive?” They usually answer something along the lines of “this is the age and renaissance of data science and analytics,” and then end up talking exclusively about their meat-and-potatoes business intelligence software projects and how 300 reports now run their business.
Then, when I probe and hear their answers in more depth, I am once again reminded of THE history quote, and I think to myself that there’s an amusing irony at play here. Most of the Business Intelligence systems of today are designed to “remember” and report on the historical past through large data warehouses holding a gazillion transactions, along with basic but numerous shipping and billing histories and maybe assorted support records.
But when it comes right down to it, business intelligence “history” is still just that: history. Nothing is really learned and applied when and where it counts, at the moment when reacting in time would have made all the difference.
So, in essence, by using standalone BI systems as they are designed today, companies are indeed condemned to repeat what they have already learned, because the insight arrives too late and the same mistakes get repeated again and again.
This means the challenge for BI is to reduce latency, measure the pertinent data, sensors, and events, and become scalable and flexible enough, extremely so, to handle the volume and variety of the forthcoming data onslaught.
There’s a part 2 to this story, so keep an eye out for my next blog post, History Repeats Itself (Part 2).
Everyone knows that Informatica is the Data Integration company that helps organizations connect their disparate software into a cohesive, synchronized enterprise information system. The value to the business is enormous and well documented in the form of use cases, ROI studies, and industry-leading loyalty and renewal rates.
Event Processing, on the other hand, is a technology that has been around for only a few years and has yet to reach Main Street in Systems City, IT. But if you look at how event processing is being used, it’s amazing that more people haven’t heard about it. The idea at its core (pun intended) is very simple: monitor your data and events, those things that happen on a daily, hourly, even minute-by-minute basis; look for important patterns that are positive or negative indicators; and set up your systems to automatically take action when those patterns appear, such as notifying a sales rep when a pattern indicates a customer is ready to buy, or halting a transaction when your company is about to be defrauded.
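To make that core idea concrete, here is a minimal sketch of an event-processing loop in Python. Everything here is hypothetical for illustration (the event types, the window size, the "three product views means ready to buy" rule); it is not Informatica's API or any real product's logic.

```python
from collections import deque

def detect_buying_signal(events, window_size=3):
    """Watch a stream of (customer_id, event_type) pairs and flag a
    customer when their last `window_size` events are all product views,
    a toy stand-in for a 'ready to buy' pattern."""
    recent = {}   # customer_id -> deque of recent event types
    alerts = []
    for customer_id, event_type in events:
        history = recent.setdefault(customer_id, deque(maxlen=window_size))
        history.append(event_type)
        # Pattern match: three consecutive product views -> take action
        if len(history) == window_size and all(e == "product_view" for e in history):
            alerts.append(customer_id)   # e.g. notify a sales rep
            history.clear()
    return alerts

stream = [
    ("c1", "product_view"), ("c2", "login"),
    ("c1", "product_view"), ("c1", "product_view"),  # c1 completes the pattern
    ("c2", "product_view"),
]
print(detect_buying_signal(stream))  # ['c1']
```

The point of the sketch is the shape, not the rule: the action fires as the events arrive, rather than in a report run after the fact.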
Since this is an Informatica blog, you probably have a decent set of “muscles” in place already, so why, you ask, would you need 6-pack abs? Because 6-pack abs are a good indication of a strong core musculature and are the basis of a stable, highly athletic body. The parallel holds for companies: in today’s competitive business environment, you need strength, stability, and agility to compete. And since IT systems increasingly ARE the business, if your company isn’t performing as strong, lean, and mean as possible, you can be sure your competitors will be looking to implement every advantage they can.
You may also be wondering why you would need something like Event Processing when you already have good Business Intelligence systems in place. The reality is that it’s not easy to monitor and measure useful but sometimes hidden data, event, sensor, and social media sources, or to discern which patterns have meaning and which turn out to be false positives. But the real difference is that BI usually reports to you after the fact, when the value of acting on the situation has diminished significantly.
So while muscles are important for standing up and running, and good-quality, strong muscles are necessary for heavy lifting, it’s those 6-pack abs on top of it all that make you the lean, mean fighting machine able to identify significant threats and opportunities in your data, and, in essence, to better compete and win.
Customers don’t always like change, and a new product launch introduces a variety of changes, so it’s important to showcase the value of those changes for customers while launching a product. One key ingredient that can fuel a successful product launch is leveraging the rich, varied, multi-sourced information that is more readily available to us than ever before. Industry experts call it Big Data, and today it can pull gold out of this information gold mine and positively impact a product launch. What follows are 3 secrets of how Product Marketers can tap the power of Big Data for a successful product launch.
Secret #1: Use Big Data to optimize content strategy and targeted messaging
The main challenge is not just to create a great product but also to communicate its clear, compelling value to your customers. You need to speak the language that resonates with the needs and preferences of your customers. Through social media platforms and weblogs, lots of information is available highlighting the views and preferences of buyers. Big Data brings all these data points together from various sources and unlocks them to provide customer intelligence. Product Marketers can leverage this intelligence to create customer segmentation and targeted messaging.
Secret #2: Use Big Data to identify influential customers and incent them to influence others
A Forrester Research study indicates that today your most valuable customer may be one who buys little but influences 100 others to buy via blogs, tweets, Facebook, and online product reviews. Using MDM with Big Data, businesses can create a 360-degree customer profile by integrating transaction, social interaction, and weblog data, which helps identify influential customers. Companies can engage these influential customers early by initiating a soft launch or beta test of their product.
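As a toy illustration of that 360-degree view, the sketch below merges per-customer transaction totals with social reach into one profile and ranks customers by an invented influence score. The field names and the scoring formula are assumptions made up for this example, not an MDM product's actual model.

```python
def build_360_profiles(transactions, social):
    """Merge (customer_id, amount) purchases with (customer_id,
    followers, mentions) social data into unified profiles, then rank
    by a made-up reach-weighted influence score."""
    profiles = {}
    def blank():
        return {"spend": 0.0, "followers": 0, "mentions": 0}
    for cust_id, amount in transactions:
        profiles.setdefault(cust_id, blank())["spend"] += amount
    for cust_id, followers, mentions in social:
        p = profiles.setdefault(cust_id, blank())
        p["followers"] = followers
        p["mentions"] += mentions
    # A customer who buys little but reaches many can outrank a big spender.
    def influence(p):
        return p["followers"] * (1 + p["mentions"])
    return sorted(profiles.items(), key=lambda kv: influence(kv[1]), reverse=True)

transactions = [("alice", 20.0), ("bob", 900.0)]
social = [("alice", 5000, 12), ("bob", 40, 1)]
ranked = build_360_profiles(transactions, social)
print(ranked[0][0])  # alice: low spend, high influence
```

Note how the ranking surfaces alice, the Forrester-style influencer who buys little, ahead of bob, the big spender, which is exactly the insight a transaction-only view would miss.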
Secret #3: Use Big Data to guide ongoing product improvement
Big Data is also a useful tool for monitoring ongoing product performance and keeping customers engaged post-launch. Insights into how customers are using the product and what they enjoy most can open the door to improvements in future launches, resulting in happier, more loyal customers.
Zynga, creator of the popular Facebook game FarmVille, collects terabytes of data a day and analyzes it to improve game features and customer service. As a WSJ article noted, after the Version 1 launch of the game the company analyzed customer behavior and found that customers were interacting with the animals much more than the designers expected. So in the second release, the game designers expanded the offerings with more focus on animals, keeping customers more engaged.
Big Data is proving to be a game changer for product managers and marketers who want to engage deeply with their customers and launch products with a memorable, valued customer experience.
The phrase ‘Data Tsunami’ has been used by numerous authors in the last few months, and it’s difficult to find a more suitable analogy: what’s approaching is of such an increased order of magnitude that the IT industry’s current expectations for data growth will be swamped in the next few years.
However impressive a spectacle a tsunami is, it still wreaks havoc on those who are unprepared or who believe they can tread water and simply float to the surface once the trouble has passed.
I have been developing two ideas for customer data management: entities vs. roles and differentiation. In the last post I suggested that customer is not a data type, but rather a role that can be played by some core entity in some context, with some set of characteristics assigned to that role within that context.
In a number of recent tutorials and training sessions, I have incorporated a little joke into some of the material to help motivate understanding. It isn’t really *that much* of a joke, but here it is:
Q: What is the most dangerous question to ask data professionals?