Category Archives: Data Quality

Are You Ready to Compete on Customer Experience?

 

This blog post initially appeared on CMSwire.com and is reblogged here with their consent.


Friends of mine were remodeling their master bath. After searching for a claw foot tub in stores and online, they found the perfect one that fit their space. It was only available for purchase on the retailer's e-commerce site, so they bought it online.

When it arrived, the tub was too big. The dimensions online were incorrect. They went to return it to the closest store, but were told they couldn’t — because it was purchased online, they had to ship it back.

The retailer didn't have a total customer relationship view or a single view of product information or inventory across channels and touch points. This left the customer representative working with a system that was a silo of limited information. She didn't have access to a rich customer profile. She didn't know that my friend Joe and his wife had spent almost $10,000 with the brand in the last year. She couldn't see the products they had bought online and in stores. Without this information, she couldn't deliver a great customer experience.

It was a terrible customer experience. My friends share it with everyone who asks about their remodel. They name the retailer when they tell the story. And, they don’t shop there anymore. This terrible customer experience is negatively impacting the retailer’s revenue and brand reputation.

Bad customer experiences happen a lot. Companies in the US lose an estimated $83 billion each year due to defections and abandoned purchases as a direct result of a poor experience, according to a Datamonitor/Ovum report.

Customer Experience is the New Marketing

Gartner believes that by 2016, companies will compete primarily on the customer experiences they deliver. So who should own customer experience?

Twenty-five percent of CMOs say that their CEOs expect them to lead customer experience. What’s their definition of customer experience? “The practice of centralizing customer data in an effort to provide customers with the best possible interactions with every part of the company, from marketing to sales and even finance.”

Mercedes Benz USA President and CEO, Steve Cannon said, “Customer experience is the new marketing.”

The Gap Between Customer Expectations + Your Ability to Deliver

My previous post, 3 Barriers to Delivering Omnichannel Experiences, explained how omnichannel is all about seeing your business through the eyes of your customer. Customers don't think in terms of channels and touch points; they just expect a seamless, integrated and consistent customer experience. It's one brand to the customer. But there's a gap between customer expectations and what most businesses can deliver today.

Most companies who sell through multiple channels operate in silos. They are channel-centric rather than customer-centric. This business model doesn’t empower employees to deliver seamless, integrated and consistent customer experiences across channels and touch points. Different leaders manage each channel and are held accountable to their own P&L. In most cases, there’s no incentive for leaders to collaborate.

Old Navy's CMO, Ivan Wicksteed, got it right when he said:

“Seventy percent of searches for Old Navy are on a mobile device. Consumers look at the product online and often want to touch it in the store. The end goal is not to get them to buy in the store. The end goal is to get them to buy.”

The end goal is what incentives should be based on.

Executives at most organizations I’ve spoken with admit they are at the very beginning stages of their journey to becoming omnichannel retailers. They recognize that empowering employees with a total customer relationship view and a single view of product information and inventory across channels are critical success factors.

Becoming an omnichannel business is not an easy transition. It forces executives to rethink their definition of customer-centricity and whether their business model supports it. “Now that we need to deliver seamless, integrated and consistent customer experiences across channels and touch points, we realized we’re not as customer-centric as we thought we were,” admitted an SVP of marketing at a financial services company.

You Have to Transform Your Business

“We’re going through a transformation to empower our employees to deliver great customer experiences at every stage of the customer journey,” said Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts. “Our competitive differentiation comes from knowing our customers better than our competitors. We manage our customer data like a strategic asset so we can use that information to serve customers better and build loyalty for our brand.”

Hyatt uses data integration, data quality and master data management (MDM) technology to connect the numerous applications that contain fragmented customer data including sales, marketing, e-commerce, customer service and finance. It brings the core customer profiles together into a single, trusted location, where they are continually managed. Now its customer profiles are clean, de-duplicated, enriched and validated. Members of a household as well as the connections between corporate hierarchies are now visible. Business and analytics applications are fueled with this clean, consistent and connected information so customer-facing teams can do their jobs more effectively.
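The consolidation described here boils down to scoring candidate record pairs and merging the ones that refer to the same guest. As a rough illustration only (the field names, sample records, scoring weights and threshold below are assumptions, not Hyatt's actual matching rules), a minimal sketch might look like this:

```python
from difflib import SequenceMatcher

# Hypothetical guest records pulled from different source systems (fields are illustrative).
records = [
    {"id": 1, "name": "Chris Brogan",       "email": "cbrogan@example.com",  "zip": "60613"},
    {"id": 2, "name": "Christopher Brogan", "email": "cbrogan@example.com",  "zip": "60613"},
    {"id": 3, "name": "C. Brogan",          "email": "chris.b@example.org",  "zip": "60047"},
]

def similarity(a, b):
    """Crude match score: an exact email match wins, otherwise fuzzy name plus zip comparison."""
    if a["email"].lower() == b["email"].lower():
        return 1.0
    name_score = SequenceMatcher(None, a["name"].lower(), b["name"].lower()).ratio()
    zip_score = 1.0 if a["zip"] == b["zip"] else 0.0
    return 0.7 * name_score + 0.3 * zip_score

MATCH_THRESHOLD = 0.85  # assumed cut-off; real MDM match rules are far richer

# Pairwise comparison is fine for a sketch; production matching uses blocking/indexing to scale.
for i in range(len(records)):
    for j in range(i + 1, len(records)):
        score = similarity(records[i], records[j])
        if score >= MATCH_THRESHOLD:
            print(f"Candidate merge: {records[i]['id']} <-> {records[j]['id']} (score {score:.2f})")
```

Survivorship (which attribute values win in the merged "golden record") would follow the same pattern, applied per field rather than per record.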

When he first joined Hyatt, Brogan did a search for his name in the central customer database and found 13 different versions of himself. This included the single Chris Brogan who lived across the street from Wrigley Field with his buddies in his 20s and the Chris Brogan who lives in the suburbs with his wife and two children. “I can guarantee those two guys want something very different from a hotel stay,” he joked. Those guest profiles have now been successfully consolidated.

According to Brogan,

“Successful marketing, sales and customer experience initiatives need to be built on a solid customer data foundation. It’s much harder to execute effectively and continually improve if your customer data is a mess.”

Improving How You Manage, Use and Analyze Data is More Important Than Ever


Some companies lack a single view of product information across channels and touch points. About 60 percent of retail managers believe that shoppers are better connected to product information than in-store associates. That’s a problem. The same challenges exist for product information as customer information. How many different systems contain valuable product information?

Harrods overcame this challenge. The retailer has a strategic initiative to transform from a single iconic store to an omnichannel business. In the past, Harrods’ merchants managed information for about 500,000 products for the store point of sale system and a few catalogs. Now they are using product information management technology (PIM) to effectively manage and merchandise 1.7 million products in the store and online.

Because they are managing product information centrally, they can fuel the ERP system and e-commerce platform with full, searchable multimedia product information. Harrods has also reduced the time it takes to introduce new products and generate revenue from them. In less than one hour, buyers complete the process from sourcing to market readiness.

It Ends with Satisfied Customers

By 2016, you will need to be ready to compete primarily on the customer experiences you deliver across channels and touch points. This means really knowing who your customers are so you can serve them better. Many businesses will transform from a channel-centric business model to a truly customer-centric business model. They will no longer tolerate messy data. They will recognize the importance of arming marketing, sales, e-commerce and customer service teams with the clean, consistent and connected customer, product and inventory information they need to deliver seamless, integrated and consistent experiences across touch points. And all of us will be more satisfied customers.

Posted in 5 Sales Plays, CMO, Data Governance, Data Integration, Data Quality, Master Data Management, PIM

Major Oil Company Uses Analytics to Gain Business Advantage

According to Michelle Fox of CNBC and Stephen Schork, the oil industry is in "dire straits". U.S. crude posted its ninth straight weekly loss this week, landing under $50 a barrel. The news is bad enough that it is now expected to lead to major job losses. The Dallas Federal Reserve anticipates that Texas could lose about 125,000 jobs by the end of June. Patrick Jankowski, an economist and vice president of research at the Greater Houston Partnership, expects exploration budgets will be cut 30-35 percent, which will result in approximately 9,000 fewer wells being drilled. The problem is that "if oil prices keep falling, at some point it's not profitable to pull it out of the ground" ("When, and where, oil is too cheap to be profitable", CNBC, John W. Schoen).

This means that a portion of the world's oil supply will become unprofitable to produce. According to Wood Mackenzie, "once the oil price reaches these levels, producers have a sometimes complex decision to continue producing, losing money on every barrel produced, or to halt production, which will reduce supply". The question is: are these the only answers?

Major Oil Company Uses Analytics to Gain Business Advantage

A major oil company that we are working with has determined that data is a success enabler for their business. They are demonstrating what we at Informatica like to call a "data ready business"—a business that is ready for any change in market conditions. This company is using next generation analytics to ensure their business's survival and to make sure they do not become what Jim Cramer likes to call a "marginal producer". This company has told us that their success is based upon being able to extract oil more efficiently than its competitors.

Historically data analysis was pretty simple

Traditionally, oil producers would get oil by drilling a new hole in the ground, and in six months they would start getting the oil flowing commercially and be in business. This meant it would typically take six months or longer before they could get any meaningful results, including data that could be used to make broader production decisions.

Drilling from data

Today, oil is also produced from shale using fracking techniques. This process can take only 30-60 days before oil producers start seeing results. It is based not just on innovation in the refining of oil, but also on innovation in the refining of data from which operational business decisions can be made. The benefits of this approach include the following:

Improved fracking process efficiency

Fracking is a very technical process. Producers can have two wells on the same field that are performing at very different levels of efficiency. To address this issue, the oil company that we have been discussing throughout this piece is using real-time data to optimize its oil extraction across an entire oil field or region. Insights derived from this data allow them to compare wells in the same region for efficiency or productivity and even switch off certain wells if the oil price drops below profitability thresholds. This ability is especially important as the price of oil continues to drop. At $70/barrel, many operators go into the red, while more efficient, data-driven operators can remain profitable at $40/barrel. So efficiency is critical across a system of wells.
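The switch-off logic described above is, at its core, a per-well margin check against the current oil price. The lift costs and production figures below are invented for illustration; only the general idea (compare each well's breakeven to the market price across a system of wells) comes from the paragraph above:

```python
# Hypothetical per-well economics; lift costs and volumes are illustrative, not the operator's data.
wells = [
    {"well": "A-101", "lift_cost_per_bbl": 38.0, "daily_bbl": 450},
    {"well": "A-102", "lift_cost_per_bbl": 67.0, "daily_bbl": 120},
    {"well": "B-201", "lift_cost_per_bbl": 44.0, "daily_bbl": 300},
]

oil_price = 50.0  # USD per barrel, e.g. fed in from a market data stream

for w in wells:
    daily_margin = (oil_price - w["lift_cost_per_bbl"]) * w["daily_bbl"]
    action = "keep producing" if daily_margin > 0 else "candidate to shut in"
    print(f"{w['well']}: margin ${daily_margin:,.0f}/day -> {action}")
```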

Using data to decide where to build wells in the first place

When constructing a fracking or oil sands well, you need more information on trends and formulas to extract oil from the ground. On a site with 100+ wells, for example, each one is slightly different because of water tables, ground structure, and the details of the geography. You need the right data, the right formula, and the right method to extract the oil at the best price without impacting the environment at the same time.

The right technology delivers the needed business advantage

Of course, technology has never been simple to implement. The company we are discussing has 1.2 petabytes of data to process, and this volume is only increasing. They are running fiber optic cables down into wells to gather data in real time. As a result, they are receiving vast amounts of real-time data but cannot store and analyze that volume efficiently in conventional systems. Meanwhile, the time to aggregate and run reports can miss the window of opportunity while increasing cost. Making matters worse, this company had many different varieties of data. It also turns out that quite a bit of the useful information in their data sets was in the comments section of their source application. So traditional data warehousing would not help them extract the information they really need. They decided to move to new technology: Hadoop. But even seemingly simple problems, like getting access to data, were an issue within Hadoop. If you didn't know the right data analyst, you might not get the data you needed in a timely fashion. Compounding things, a lack of Hadoop skills in Oklahoma proved to be a real problem.
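The post doesn't say how the operator actually parses those free-text comment fields, but a common first step on a Hadoop cluster is to extract structured values from the text with Spark. The file path, column names and regex below are assumptions used purely for illustration:

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, regexp_extract

spark = SparkSession.builder.appName("well_comment_extraction").getOrCreate()

# Hypothetical extract of the source application's records landed on HDFS.
readings = spark.read.option("header", True).csv("hdfs:///data/wells/readings.csv")

# Pull a numeric pressure reading out of free-text comments such as
# "choke adjusted; tubing pressure 2150 psi; minor sand return".
parsed = readings.withColumn(
    "tubing_pressure_psi",
    regexp_extract(col("comments"), r"(\d{3,5})\s*psi", 1).cast("double"),
)

parsed.select("well_id", "timestamp", "tubing_pressure_psi").show(10, truncate=False)
```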

The right technology delivers the right capability

The company had been using a traditional data warehousing environment for years, but they needed help dealing with their Hadoop environment. This meant dealing with the volume, variety and quality of their source well data. They needed a safe, efficient way to integrate all types of data on Hadoop at any scale without having to learn the internals of Hadoop. Early adopters of Hadoop and other big data technologies have had no choice but to hand-code using Java or scripting languages such as Pig or Hive. Hiring and retaining big data experts proved time consuming and costly, because data scientists and analysts can spend only 20 percent of their time on data analysis and the rest on the tedious mechanics of data integration such as accessing, parsing, and managing data. Fortunately for this oil producer, it didn't have to be this way. They avoided the specialized coding otherwise required to scale performance on distributed computing platforms like Hadoop. Additionally, they were able to "Map Once, Deploy Anywhere," knowing that even as technologies change they can run data integration jobs without having to rebuild data processing flows.

Final remarks

It seems clear that we live in an era where data is at the center of just about every business. Data-ready enterprises are able to adapt and win regardless of changing market conditions. These businesses invested in building their enterprise analytics capability before market conditions changed. In this case, such oil producers will be able to produce oil at lower costs than others within their industry. Analytics provides three benefits to oil producers:

  • Better margins and lower costs from operations
  • Lower risk of environmental impact
  • Less time to build a successful well

In essence, those that build analytics as a core enterprise capability will continue to have a right to win within a dynamic oil pricing environment.

Related links

Related Blogs

Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform
Author Twitter: @MylesSuer

Posted in Big Data, CIO, Data Quality

Getting Personal with Data as a Service (DaaS)


Last week, I spent three days at Retail’s Big Show hosted by the National Retail Federation (NRF) in New York City. This annual event gives retailers the opportunity to network with their colleagues from all over the world. In addition, they get the chance to interact with technology vendors that can help them improve their business.

From marketing automation to analytics software, there were countless technology offerings showcasing how to best assist the modern marketer in making every customer interaction personal. Throughout the week, I had numerous conversations with retail professionals about the importance of personalization in marketing and what it means to their organization’s future plans.

At the heart of their plans was the need to understand the data that they have today, and how to verify the data that they will inevitably acquire in the future. If it’s accurate, if it’s reliable, if it’s complete – customer data can fuel your ability to engage and interact.

The data driven marketer derives insight and ultimately provides a personalized experience by leveraging this valuable data for each customer.

And why is this important?

Well, according to McMurrayTMG, 78% of buyers believe that organizations providing a personalized experience are interested in building good relationships. But it all starts with accurate data.

Knowing who your customers are, how you can contact them, and what they are interested in are essential in order to engage with your customers. With the abundance of data available today, you have to figure that if you aren’t ensuring that your customer interactions are personalized, then your competitors are gaining ground. Every interaction, every correspondence counts towards a positive perception as well as increased sales and customer satisfaction.

By fueling your interactions with Data as a Service (DaaS) for accurate customer data, you will ensure that your customers have a personalized experience with your brand and ultimately accelerate your business.

Posted in CMO, Customer Acquisition & Retention, Customers, DaaS, Data Quality, Retail, Total Customer Relationship

Does Your Sales Team Have What They Need to Succeed in 2015?

Like me, you probably just returned from an inspiring Sales Kick Off 2015 event. You’ve invested in talented people. You’ve trained them with the skills and knowledge they need to identify, qualify, validate, negotiate and close deals. You’ve invested in world-class applications, like Salesforce Sales Cloud, to empower your sales team to sell more effectively. But does your sales team have what they need to succeed in 2015?

Gartner predicts that as early as next year, companies will compete primarily on the customer experiences they deliver. So, every customer interaction counts. Knowing your customers is key to delivering great sales experiences.

If you're not fueling Salesforce Sales Cloud with clean, consistent and connected customer information, your sales team may be at a disadvantage.

But, inaccurate, inconsistent and disconnected customer information may be holding your sales team back from delivering great sales experiences. If you’re not fueling Salesforce Sales Cloud (or another Sales Force Automation (SFA) application) with clean, consistent and connected customer information, your sales team may be at a disadvantage against the competition.

To successfully compete and deliver great sales experiences more efficiently, your sales team needs a complete picture of their customers. They don’t want to pull information from multiple applications and then reconcile it in spreadsheets. They want direct access to the Total Customer Relationship across channels, touch points and products within their Salesforce Sales Cloud.

Watch this short video comparing a day-in-the-life of two sales reps competing for the same business. One has access to the Total Customer Relationship in Salesforce Sales Cloud, the other does not. Watch now: Salesforce.com with Clean, Consistent and Connected Customer Information

Is your sales team spending time creating spreadsheets by pulling together customer information from multiple applications and then reconciling it to understand the Total Customer Relationship across channels, touch points and products? If so, how much is it costing your business? Or is your sales team engaging with customers without understanding the Total Customer Relationship? How much is that costing your business?

Many innovative sales leaders are gaining a competitive edge by better leveraging their customer data to empower their sales teams to deliver great sales experiences. They are fueling business and analytical applications, like Salesforce Sales Cloud, with clean, consistent and connected customer information.  They are arming their sales teams with direct access to richer customer profiles, which includes the Total Customer Relationship across channels, touch points and products.

What measurable results have these sales leaders achieved? Merrill Lynch boosted sales productivity by 15%, resulting in $50M in annual impact. A $60B manufacturing company improved cross-sell and up-sell success by 5%. Logitech increased sales across channels: online, in their retail partners' stores and through distribution partners.

This year, I believe more sales leaders will focus on leveraging their customer information for competitive advantage. This will help them shift from sales automation to sales optimization. What do you think?

Posted in 5 Sales Plays, Business Impact / Benefits, Business/IT Collaboration, CIO, Cloud, Cloud Computing, Cloud Data Integration, Cloud Data Management, Customer Acquisition & Retention, Data Integration, Data Quality, Enterprise Data Management, Intelligent Data Platform, Master Data Management, Operational Efficiency, SaaS, Total Customer Relationship

Keep the Ring, Get Me an iPad: Emotional Vs Rational Marketing

Rational Marketing


The holidays that just passed weren't the only thing to celebrate, according to historical trends. As we moved from December into 2015, how many of you saw a lot more engagement announcements on Facebook, or even got engaged yourself?

December is the most popular month to get engaged (according to wedding website TheKnot.com), so it's likely many of us are gearing up for the typical spring/summer calendar full of weekend weddings.

While December is not known as a big month for weddings, it is a big time for jewelers, including the months leading up to it. Diamonds, gold, and other fine jewelry become very popular purchases at this time.

Fine jewelry is an emotional buying decision, which you can see from the jewelry store commercials that evoke sentiment for our loved ones.

But that emotional pull to purchase diamonds, gold and precious stones could be changing significantly.

Diamond sales are down this year – but what is up? Technology-related gifts, including smart phones, tablets, and other functional devices. To understand why, all you have to do is think about the ages of people getting engaged: 18-34 year-olds.

People in that age range who are getting engaged right now just aren’t drawn in by the emotional purchase of fine jewelry anymore, if they ever were. They value technology purchases.

But it’s not just function over form. The emotional motivation behind a purchase (whether technology or fine jewelry or any high-dollar item) is always there.

“Status for this generation isn’t about money — it’s about attention,” said psychology professor Kit Yarrow in a recent Pacific Standard magazine article. Therefore, a smart phone is considered a better gift (and better use for the money) than fine jewelry, since it allows you to share your life and stay connected much more than a gold and diamond ring can do.

As the article notes, using technology to create “an everlasting Facebook album from that scuba diving trip in Bali says so much more than one lone photo of a pave diamond necklace.”

WHAT FUELS YOUR BUSINESS DECISIONS?

The average decision process for a consumer making a purchase is estimated at 80% emotional and 20% rational, according to an annual customer loyalty report from Brand Keys.

It’s interesting to think that the car in your garage, or the shoes on your feet, could have ultimately been something you felt you wanted (80%), and then justified the need for later (20%). Brands, especially in the luxury market, depend on this ratio.

This realization brings us to your business planning as we begin 2015. What guides your business decisions as a data-fueled marketer: emotions, or rationale?

How do brands make decisions about how to operate, what customers to market to, where to locate stores, what marketing campaigns to do, and many more strategic plans? It needs to be much more in the “rational” category – but how do you do that as a data-fueled marketer?

As consumers, we are emotional creatures without even realizing it. That can be a habit we bring to other things in our lives as well, including decisions at work.

Since your customers still have emotional reasons for making a purchase or using a service, the only thing that should be emotional is your messaging to your customers; not your planning. Creating customer profiles and making decisions from them should never be solely a ‘gut feeling’ or only based on your professional instincts.

At the same time, we all know that in our work, over time we develop good instincts about what we do. We learn to trust our sense of what will work or won’t work in the market, or in the supply chain, or within product development – whatever it is you do. You can never ignore that, because no one can completely predict the future with total accuracy. You have to trust your experience and knowledge to lead you.

Turn the 80/20 ratio on its head, and instead focus 20% on emotional thinking and 80% on rational thinking. Make your brand’s business decisions and planning based on good data.

Who are your customers? Where do they live? What do they do and what are their preferences? Basing the answers to these questions only on what has worked in the past, or what you think your customers should want, will only lead to bad business decisions.

The first step, however, is to know that your customer data is valid and complete. Gartner estimates that 40% of failed business initiatives are due to bad data. Validate, correct, and enrich your customer data before you use it. Then as a truly data-fueled marketer, you can use the 20/80 ratio properly and steer your brand to a great 2015.
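Validation can start very simply, even before any enrichment service is involved. The following stdlib-only sketch shows the kind of basic completeness and format checks implied above; the patterns and field names are illustrative and nowhere near production-grade contact verification:

```python
import re

EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.-]+$")

def validate_contact(record):
    """Flag obviously bad or missing contact fields so they can be corrected or enriched."""
    issues = []
    if not EMAIL_RE.match(record.get("email", "")):
        issues.append("invalid or missing email")
    digits = re.sub(r"\D", "", record.get("phone", ""))
    if len(digits) < 10:
        issues.append("invalid or missing phone")
    if not record.get("postal_code"):
        issues.append("missing postal code")
    return issues

customer = {"name": "Pat Doe", "email": "pat.doe@example", "phone": "555-01", "postal_code": ""}
print(validate_contact(customer))
# ['invalid or missing email', 'invalid or missing phone', 'missing postal code']
```

Records that fail checks like these are the ones worth routing to correction and enrichment before they ever reach a campaign list.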

For more about data quality best practices, check out this white paper written for marketers that goes beyond the basics.

Posted in Customers, Data Quality, Data Services

Analytics Stories: A Pharmaceutical Case Study

As I have shared in other posts in this series, businesses are using analytics to improve their internal and external facing business processes and to strengthen their "right to win" within the markets in which they operate. For pharmaceutical businesses, strengthening the right to win begins and ends with the drug product development lifecycle. I remember, for example, talking several years ago to the CFO of a major pharmaceutical company and having him tell me that the most important financial metrics for him had to do with reducing the time to market for a new drug and maximizing the period of patent protection. Clearly, the faster a pharmaceutical company gets a product to market, the faster it can begin earning a return on its investment.

Fragmented data challenged analytical efforts

At Quintiles, what the business needed was a system with the ability to optimize the design, execution, quality, and management of clinical trials. Management's goal was to dramatically shorten the time to complete each trial, including quickly identifying when a trial should be terminated. At the same time, management wanted to continuously comply with regulatory scrutiny from the Food and Drug Administration and use it to proactively monitor and manage notable trial events.

The problem was that Quintiles' data was fragmented across multiple systems, and this delayed the ability to make business decisions. Like many organizations, Quintiles had data located in multiple incompatible legacy systems. This meant extensive manual data manipulation before the data could become useful. As well, incompatible legacy systems impeded data integration and normalization, and prohibited a holistic view across all sources. Making matters worse, management felt that it lacked the ability to take corrective actions in a timely manner.

Infosario launched to address Quintiles' analytical challenges

To address these challenges, Quintiles leadership launched the Infosario Clinical Data Management Platform to power its pharmaceutical product development process. Infosario breaks down the silos of information that have limited combining massive quantities of scientific and operational data collected during clinical development with tens of millions of real-world patient records and population data. This step empowered researchers and drug developers to unlock a holistic view of data, improving decision-making and ultimately increasing the probability of success at every step in a product's lifecycle. Quintiles Chief Information Officer Richard Thomas says, "The drug development process is predicated upon the availability of high quality data with which to collaborate and make informed decisions during the evolution of a product or treatment".

What Quintiles has succeeded in doing with Infosario is the integration of data and processes associated with a drug’s lifecycle. This includes creating a data engine to collect, clean, and prepare data for analysis. The data is then combined with clinical research data and information from other sources to provide a set of predictive analytics. This of course is aimed at impacting business outcomes.

The Infosario solution consists of several core elements

At its core, Infosario provides the data integration and data quality capabilities for extracting and organizing clinical and operational data. The approach combines and harmonizes data from multiple heterogeneous sources into what is called the Infosario Data Factory repository. The end goal is to accelerate reporting. Infosario leverages data federation/virtualization technologies to acquire information from disparate sources in a timely manner without affecting the underlying foundational enterprise data warehouse. As well, it implements rule-based, real-time intelligent monitoring and alerting to enable the business to tweak and enhance business processes as needed. A "monitoring and alerting layer" sits on top of the data, with the facility to rapidly provide intelligent alerts to appropriate stakeholders regarding trial-related issues and milestone events. Here are some more specifics on the components of the Infosario solution:

• Data Mastering provides the capability to link multi-domains of data. This enables enterprise information assets to be actively managed, with an integrated view of the hierarchies and relationships.

• Data Management provides the high performance, scalable data integration needed to support enterprise data warehouses and critical operational data stores.

• Data Services provides the ability to combine data from multiple heterogeneous data sources into a single virtualized view. This allows Infosario to utilize data services to accelerate delivery of needed information.

• Complex Event Processing manages the critical task of monitoring enterprise data quality events and delivering alerts to key stakeholders to take necessary action.
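The monitoring and alerting layer described above amounts to rules evaluated against incoming events, with notifications routed to the right stakeholders. The rules, event fields and notification stub below are assumptions meant only to show the shape of such a layer, not Infosario's internals:

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Rule:
    name: str
    predicate: Callable[[dict], bool]   # returns True when the event should raise an alert
    recipients: List[str]

# Illustrative rules; real trial-monitoring rules come from clinical and data-quality teams.
rules = [
    Rule("missing_visit_date",
         lambda e: e.get("type") == "patient_visit" and not e.get("visit_date"),
         ["data.manager@example.com"]),
    Rule("serious_adverse_event",
         lambda e: e.get("type") == "adverse_event" and e.get("severity") == "serious",
         ["safety.officer@example.com", "trial.lead@example.com"]),
]

def notify(recipients, message):
    # Stand-in for an email or paging integration.
    print(f"ALERT to {recipients}: {message}")

def process_event(event):
    for rule in rules:
        if rule.predicate(event):
            notify(rule.recipients, f"{rule.name} triggered for site {event.get('site_id')}")

process_event({"type": "adverse_event", "severity": "serious", "site_id": "ES-014"})
```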

Parting Thoughts

According to Richard Thomas, "the drug development process rests on the high quality data being used to make informed decisions during the evolution of a product or treatment. Quintiles' Infosario clinical data management platform gives researchers and drug developers the knowledge needed to improve decision-making and ultimately increase the probability of success at every step in a product's lifecycle." It enables enhanced data accuracy, timeliness, and completeness. On the business side, it has enabled Quintiles to establish industry-leading information and insight. This in turn has enabled faster, more informed decisions and the ability to take action based on insights. Importantly, this has led to a faster time to market and a lengthening of the period of patent protection.

Related links

Related Blogs

Analytics Stories: A Banking Case Study
Analytics Stories: A Financial Services Case Study
Analytics Stories: A Healthcare Case Study
Who Owns Enterprise Analytics and Data?
Competing on Analytics: A Follow Up to Thomas H. Davenport’s Post in HBR
Thomas Davenport Book “Competing On Analytics”

Solution Brief: The Intelligent Data Platform

Author Twitter: @MylesSuer

Posted in CIO, Data Governance, Data Quality

Data as a Service will ensure that 2015 will be known as “The Year of the Customer!”

Data as a Service

Customers are at the center

Not so long ago, customers were simply faceless names and transactions understood through disjointed sales data and potentially inaccurate contact information.

Over the past few years, we’ve seen companies across industries make remarkable business transformations to become customer-centric organizations. These companies understand that customers are no longer loyal to brands or products alone. Instead, they’re loyal to companies who provide the optimal, most personalized customer experiences.

By understanding more about their customers, their interests, and their interaction preferences, organizations can ultimately encourage increased sales and usage of their products and services.

As we begin 2015 and predict what the next trends will be, I believe that this year will finally be the year that customer centricity becomes the norm – and effective management of data will play the most critical role to date in getting companies to reach their customer centricity goals.

But it won’t necessarily happen overnight. So how should companies get started with this effort?

“A requirement behind customer centricity is the ability to understand customers at a fairly granular level and to be able to identify the customers or the segments of customers who are valuable from the ones who aren’t,” writes Peter Fader (Co-Director of the Wharton Customer Analytics Initiative at the University of Pennsylvania). “If you can’t sort out your customers — if you can’t look at them and know who is good and who is bad — then you can’t be customer centric. That’s step one.”

More and more companies are working through strategies for what Peter Fader describes as step one. They understand their data, and explore ways to utilize this information to gain valuable insights. For example, consider the advancements that Citrix achieved (read more in this case study). By better understanding their customer data, they saw a 20% improvement in lead conversion.

The organizations that have a better understanding of their customers are leading the way by utilizing technology to ensure data accuracy. If their contact data (address, email, and phone) is correct, then they can effectively reach that customer without fail. If their contact data is poor, connecting with customers becomes impossible and can ultimately impact their ability to compete.

Companies like BCBG understand this and are utilizing data quality services to reach up to 15% more customers (read more in this case study).

As companies continue to understand their customer data, they’ll look to fill in the gaps. Sometimes, these gaps are obvious. If a customer’s contact profile has a hole in it – for example a missing phone number – it becomes clear that the hole must be filled.

Utilizing Data as a Service enrichment and validation capabilities, organizations have the opportunity to clean up missing data without wasting a high value customer interaction to ask for their phone number. Instead, they can spend their time selling to this customer.

In addition to filling the contact profile gaps, Data as a Service subscription data is also a great way to expand the view of the customer and learn more about them. Companies can enrich their customer profiles with demographic information or industry data to round out their customer profiles, further supporting their customer-centricity goals.
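In practice, "filling the gaps" means merging validated attributes from an enrichment source into the existing profile without overwriting what the customer already provided. The `lookup_enrichment` function below is a hypothetical stand-in for a DaaS call, not a real API:

```python
def lookup_enrichment(email):
    """Hypothetical stand-in for a Data-as-a-Service lookup keyed on a known attribute."""
    demo_store = {
        "jane.doe@example.com": {"phone": "+1 415 555 0100", "industry": "Retail", "company_size": "500-1000"},
    }
    return demo_store.get(email, {})

def fill_gaps(profile):
    """Only fill attributes that are missing; never overwrite data the customer gave us."""
    enrichment = lookup_enrichment(profile.get("email", ""))
    for field, value in enrichment.items():
        if not profile.get(field):
            profile[field] = value
    return profile

customer = {"name": "Jane Doe", "email": "jane.doe@example.com", "phone": ""}
print(fill_gaps(customer))
```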

In 2015, we will see companies utilizing their customer data to form a deeper connection and ultimately increase sales. The habit of "speaking at" customers will give way to true engagement. If customers are the lifeblood of an organization, then, in 2015, we'll see more and more companies leveraging Data as a Service to increase customer loyalty — and ultimately fuel business growth.

 

Posted in Customers, Data Quality

8 Information Quality Predictions for 2015 And Beyond

Information Quality Predictions


Andy Hayler of Information Difference wrote in October last year that it’s been 10 years since the master data management (MDM) industry emerged. Andy sees MDM technology maturing and project success rates rising. He concluded that MDM has moved past its infancy and has a promising future as it is approaching its teenage years.

The last few months have allowed me to see MDM, data quality and data governance from a completely different perspective. I sat with other leaders here at Informatica, analysts who focus on information quality and spent time talking to our partners who work closely with customers on data management initiatives. As we collectively attempted to peer into the crystal ball and forecast what will be hot – and what will not – in this year and beyond for MDM and data quality, here are few top predictions that stood out.

1. MDM will become a single platform for all master entities
“The classical notion of boundaries that existed where we would say, this is MDM versus this is not MDM is going to get blurred,” says Dennis Moore – SVP, Information Quality Solutions (IQS), Informatica. “Today, we master a fairly small number of attributes in MDM. Rather than only mastering core attributes, we need to master business level entities, like customer, product, location, assets, things, etc., and combine all relevant attributes into a single platform which can be used to develop new “data fueled” applications. This platform will allow mastering of data, aggregate data from other sources, and also syndicate that data out into other systems.”

Traditionally MDM was an invisible hub that was connected to all the spokes. Instead, Dennis says – “MDM will become more visible and will act as an application development platform.”

2. PIM is becoming a more integrated environment that covers all information about products and related data in a single place
More and more customers want to have single interface which will allow them to manage all product information. Along with managing a product’s length, width, height, color, cost etc., they probably want to see data about the history, credit rating, previous quality rating, sustainability scorecard, returns, credits and so on. Dennis says – “All the product information in one place helps make better decisions with embedded analytics, giving answers to questions such as:

  • What were my sales last week?
  • Which promotions are performing well and poorly?
  • Which suppliers are not delivering on their SLAs?
  • Which stores aren’t selling according to plan?
  • How are the products performing in specific markets?”

Essentially, PIM will become a sovereign supplier of product data that goes in your catalog and ecommerce system that will be used by merchandisers, buyers, and product and category managers. It will become the buyer’s guide and a desktop for the person whose job is to figure out how to effectively promote products to meet sales targets.

3. MDM will become an integral part of big data analytics projects
“Big data analytics suffers from the same challenges as traditional data warehouses – bad data quality produces sub-optimal intelligence. MDM has traditionally enabled better analysis and reporting with high quality master data. Big data analytics will also immensely benefit from MDM’s most trustworthy information.” – Said Ravi Shankar – VP of Product Marketing, MDM, Informatica

Naveen Sharma, who heads the Enterprise Data Management practice at Cognizant, reemphasized what I heard from Dennis. He says – "With big data and information quality coming together, some of the boundaries between a pure MDM system and a pure analytical system will start to soften". Naveen explains – "MDM is now seen as an integral part of big data analytics projects and it's a huge change from a couple of years ago. Two of the large retailers we work with are going down the path of trying to bring not only the customer dimension but the associated transactional data to derive meaning into an extended MDM platform. I see this trend continuing in 2015 and beyond with other verticals as well."

4. Business requirements are leading to the creation of solutions
There are several business problems being solved by MDM, such as improving supplier spend management and collaboration with better supplier data. Supply chain, sourcing and procurement teams gain significant cost savings and a boost in productivity by mastering supplier, raw materials and product information and fueling their business and analytical applications with that clean, consistent and connected information. Jakki Geiger, Senior Director of IQS Solutions Marketing at Informatica says, “Business users want more than just the underlying infrastructure to manage business-critical data about suppliers, raw materials, and products. They want to access this information directly through a business-friendly user interface. They want a business process-driven workflow to manage the full supplier lifecycle, including: supplier registration, qualification, verification, onboarding and off-boarding. Instead of IT building these business-user focused solutions on top of an MDM foundation, vendors are starting to build ready-to-use MDM solutions like the Total Supplier Relationship solution.” Read more about Valspar’s raw materials spend management use case.

5. Increased adoption of matching and linking capabilities on Hadoop 
"Many of our customers have significantly increased the amount of data they want to master," says Dennis Moore. The days when tens of millions of master records were a lot are long gone; having hundreds of millions of master records and billions of source records is becoming almost common. An increasing number of master data sources – internal and external to the organization – are contributing significantly to the rise in data volumes. To accommodate these increasing volumes, Dennis predicts that large enterprises will look at running complex matching and linking capabilities on Hadoop – a cost-effective and flexible way to analyze large amounts of data.

6. Master insight management is going to be next big step
“MDM will evolve into master insight management as organizations try to relate trusted data they created in MDM with transactional and social interaction data,” said Rob Karel – VP of Product Strategy and Product Marketing, IQS, Informatica. “The innovations in machine and deep learning techniques will help organizations such as healthcare prescribe next best treatment based on history of patients, retailers suggest best offers based on customer interest and behavior, public sector companies will see big steps in social services, etc.”

Rob sees MDM at the heart of this innovation bringing together relevant information about multiple master entities and acting as a core system for insight management innovations.

7. MDM and Data Governance
Aaron Zornes – Chief research officer at the MDM Institute predicts that in 2014-15, vendor MDM solutions will move from “passive-aggressive” mode to “proactive” data governance mode. Data governance for MDM will move beyond simple stewardship to convergence of task management, workflow, policy management and enforcement according to Aaron.

8. The market will solidify for cloud based MDM adoption
Aaron says – "Cloud-innate services for DQ and DG will be more prevalent; however, enterprise MDM will remain on premise with increasing integration to cloud applications in 2015."

Naveen sees a lot of synergy around cloud-based MDM offerings and says – "The market is solidifying for MDM on cloud but the flood gates are yet to open". Naveen does not see any reason why the MDM market will not go to the cloud and gives the example of CRM, which was at a similar junction before Salesforce came into play. Naveen sees a similar shift for MDM and says – "The fears companies have about their data security in the cloud are eventually going to fade. If you look closely at any of the recent breaches, these all involved hacks into company networks and not into cloud provider networks. The fact that cloud service providers spend more dollars on data security than any one company can spend on their on-premise security layer will be a major factor affecting the transition". Naveen sees that big players in MDM will include cloud offerings as part of their toolkit in coming years.

Ravi also predicts an increase in cloud adoption for MDM as concerns about placing master data in the cloud diminish, given the security provided by cloud vendors.

So, what do you predict? I would love to hear your opinions and comments.

~Prash
@MDMGeek
www.mdmgeek.com

Posted in Big Data, Cloud, Data Governance, Data Quality, Enterprise Data Management, Master Data Management, Product Information Management

The 3 Little Architects and the Big Bad Mr. Wolf – A Data Parody for today’s Financial Industry

The 3 Little Architects and the Big Bad Wolf

The 3 Little Architects

Once upon a time, there were 3 Information Architects working in the financial services industry, each with different firms and backgrounds but all responsible for recommending the right technology solutions to help their firms comply with industry regulations including ongoing bank stress testing across the globe.  Since 2008, bank regulators have been focused on measuring systemic risk and requiring banks to provide transparency into how risk is measured and reported to support their capital adequacy needs.

The first architect grew through the ranks starting as a Database Administrator, a black belt in SQL and COBOL programming. Hand coding was their DNA for many years and was thought of as the best approach, given how customized their business and systems were compared to other organizations. As such, Architect #1 and their team went down the path of building their data management capabilities through custom hand-coded scripts and manual data extractions and transformations, dealing with data quality issues through the business organizations after the data was delivered. Though their approach and decisions delivered on their short-term needs, the firm realized the overhead required to make changes and respond to new requests driven by new industry regulations and changing market conditions.

The second architect is a "gadget guy" at heart who grew up using off-the-shelf tools rather than hand coding for managing data. He and his team decided not to hand code their data management processes, instead adopting and building their solution from best-of-breed tools, some of which were open source, others from existing solutions the company had from previous projects for data integration, data quality, and metadata management. Though their tools helped automate much of the "heavy lifting", he and his IT team were still responsible for integrating these point solutions to work together, which required ongoing support and change management.

The last architect is as technically competent as his peers; however, he understood the value of building something once to use across the business. His approach was a little different from the first two. Understanding the risks and costs of hand coding or using one-off tools to do the work, he decided to adopt an integrated platform designed to handle the complexities, sources, and volumes of data required by the business. The platform also incorporated shared metadata, reusable data transformation rules and mappings, a single source of required master and reference data, and agile development capabilities to reduce the cost of implementation and ongoing change management. Though this approach was more expensive to implement, the long-term cost and performance benefits made the decision a "no brainer".

Lurking in the woods is Mr. Wolf. Mr. Wolf is not your typical antagonist; he is a regulatory auditor whose responsibility is to ensure these banks can explain how the risk they report to the regulatory authorities is calculated. His job isn't to shut these banks down, but to make sure the financial industry is able to measure risk across the enterprise, explain how risk is measured, and ensure these firms are adequately capitalized as mandated by new and existing industry regulations.

Mr. Wolf visits the first bank for an annual stress test audit. Looking at the results of their stress test, he asks the compliance teams to explain how their data was produced, transformed, and calculated to support the risk measurements they reported as part of the audit. Unfortunately, due to the first architect's recommendation of hand coding their data management processes, IT failed to provide explanations and documentation of what they did; the developers that created their systems were no longer with the firm. As a result, the bank failed miserably, resulting in stiff penalties and higher audit costs.

Architect #2's bank was next. Having heard in the news of what happened to their peer, the architect and IT teams were confident that they were in good shape to pass their stress test audit. After digging into the risk reports, Mr. Wolf questioned the validity of the data used to calculate Value at Risk (VaR). Unfortunately, the tools that were adopted were never designed nor guaranteed by the vendors to work with each other, resulting in invalid data mapping and data quality rules and gaps within their technical metadata documentation. As a result, bank #2 also failed their audit and found themselves with a ton of one-off tools that helped automate their data management processes but lacked the integration and sharing of rules and metadata needed to satisfy the regulator's demand for risk transparency.

Finally, Mr. Wolf investigated Architect #3's firm. Having seen the results of the first two banks, Mr. Wolf was leery of their ability to pass their stress test audit. Similar demands were presented by Mr. Wolf; however, this time Bank #3 provided detailed and comprehensive metadata documentation of their risk data measurements, descriptions of the data used in each report, a comprehensive report of each data quality rule used to cleanse their data, and detailed information on each counterparty and legal entity used to calculate VaR. Unable to find gaps in their audit, Mr. Wolf, expecting to "blow" the house down, delivered a passing grade for Bank #3 and their management team due to the right investments they made to support their enterprise risk data management needs.

The moral of this story, similar to the familiar one involving the three little pigs, is about the importance of having a solid foundation to weather market and regulatory storms, or the violent bellow of a big bad wolf. That foundation includes the required data integration, data quality, master data management, and metadata management capabilities, but also supports collaboration and visibility into how data is produced, used, and performing across the business. Ensuring current and future compliance in today's financial services industry requires firms to have a solid data management platform, one that is intelligent, comprehensive, and allows Information Architects to help mitigate the risks and costs of hand coding or using point tools to get by only in the short term.

Are you prepared to meet Mr. Wolf?

Posted in Architects, Banking & Capital Markets, Data Governance, Data Integration, Data Integration Platform, Data Quality, Enterprise Data Management

Imagine A New Sheriff In Town

As we renew or reinvent ourselves for 2015, I wanted to share a case of “imagine if” with you and combine it with the narrative of an American frontier town out West, trying to find a new Sheriff – a Wyatt Earp.  In this case the town is a legacy European communications firm and Wyatt and his brothers are the new managers – the change agents.


Is your new management posse driving change?

Here is a positive word upfront. This operator has had some success in rolling out broadband internet and IPTV products to residential and business clients to replace its dwindling copper install base. But they are behind the curve on the wireless penetration side due to a number of smaller, agile MVNOs and two other multi-national operators with a high density of brick-and-mortar stores, excellent brand recognition and support infrastructure. Having more than a handful of brands certainly did not make this any easier for our CSP. To make matters even more challenging, price pressure is increasingly squeezing all operators in this market. The ones able to offset the high-cost Capex for spectrum acquisitions and upgrades with lower-cost Opex for running the network and maximizing subscriber profitability will set themselves up for success (see one of my earlier posts on the same phenomenon in banking).

Not only did they run every single brand on a separate CRM and billing application (including all the various operational and analytical packages), they also ran nearly every customer-facing service (CFS) within a brand the same dysfunctional way. In the end, they had over 60 CRM applications and the same number of billing applications across all copper, fiber, IPTV, SIM-only, mobile residential and business brands. Granted, this may be a quite excessive example; nevertheless, it is relevant for many other legacy operators.

As a consequence, their projections indicate they incur over €600,000 annually in maintaining duplicate customer records (ignoring duplicate base product/offer records for now) due to excessive hardware, software and IT operations.  Moreover, they have to stomach about the same amount for ongoing data quality efforts in IT and the business areas across their broadband and multi-play service segments.

Here are some more of the improvements they projected:

  • €18.3 million in call center productivity improvement
  • €790,000 improvement in profit due to reduced churn
  • €2.3 million reduction in customer acquisition cost
  • And if you include the fixing of duplicate and conflicting product information, add another €7.3 million in profit via billing error and discount reduction (which is in line with our findings from a prior telco engagement)

Despite major business areas not having contributed to the investigation, and with the improvements often being on the conservative side, they projected a 14:1 ratio between overall benefit amount and total project cost.
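For a rough sense of scale, the bulleted figures above can be summed and the projected 14:1 ratio used to back out the implied project cost. This assumes those four items make up the bulk of the benefit case, which the post itself notes is a conservative view:

```python
# Projected improvements quoted above, in EUR millions.
projected_benefits = {
    "call center productivity": 18.3,
    "profit from reduced churn": 0.79,
    "lower customer acquisition cost": 2.3,
    "billing error / discount reduction": 7.3,
}

total_benefit = sum(projected_benefits.values())   # roughly 28.7
implied_project_cost = total_benefit / 14          # back out the cost from the 14:1 ratio

print(f"Listed benefits: EUR {total_benefit:.1f}M")
print(f"Implied total project cost at 14:1: ~EUR {implied_project_cost:.1f}M")
```

The avoided annual spend on duplicate-record maintenance and ongoing data quality work (roughly another €1.2 million per year) would come on top of this.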

Coming back to the "imagine if" aspect now, one would ask how this behemoth of an organization can be fixed. Well, it will take years, but without new management (in this case new managers busting through the door), this organization runs the risk of becoming the next Rocky Mountain mining ghost town.

Busting into the cafeteria with new ideas & looking good while doing it?


The good news is that this operator is seeing some management changes now. The new folks have a clear understanding that business-as-usual won't do going forward and that centralization of customer insight (which includes some data elements) has its distinct advantages. They will tackle new customer analytics, order management, operational data integration (network) and next-best-action use cases incrementally. They know they are in the data business, not just the communication business. They realize they have to show a rapid succession of quick wins rather than make the organization wait a year or more for first results. As a result, they have fairly humble initial requirements to get going.

You can equate this to the new Sheriff not going after the whole organization of the three, corrupt cattle barons, but just the foreman of one of them for starters.  With little cost involved, the Sheriff acquires some first-hand knowledge plus he sends a message, which will likely persuade others to be more cooperative going forward.

What do you think? Is new management the only way to implement drastic changes around customer experience, profitability or at least understanding?

Posted in Big Data, Business Impact / Benefits, CIO, CMO, Customer Acquisition & Retention, Customer Services, Customers, Data Governance, Data Integration, Data Quality, Enterprise Data Management, Governance, Risk and Compliance, Master Data Management, Operational Efficiency, Product Information Management, Telecommunications, Vertical