Category Archives: Customer Acquisition & Retention

Don’t Rely on CRM as Your Single Source of Trusted Customer Data

Step 1: Determine if you have a customer data problem

A statement I often hear from marketing and sales leaders unfamiliar with the concept of mastering customer data is, “My CRM application is our single source of trusted customer data.” They use CRM to onboard new customers, collecting addresses, phone numbers and email addresses. They append a DUNS number. So it’s no surprise they may expect they can master their customer data in CRM. (To learn more about the basics of managing trusted customer data, read this: How much does bad data cost your business?)

It may seem logical to expect your CRM investment to be your customer master – especially since so many CRM vendors promise a “360 degree view of your customer.” But you should only consider your CRM system as the source of truth for trusted customer data if:


·  You have only a single instance of Salesforce.com, Siebel CRM, or other CRM

·  You have only one sales organization (vs. distributed across regions and LOBs)

·  Your CRM manages all customer-focused processes and interactions (marketing, service, support, order management, self-service, etc.)

·  The master customer data in your CRM is clean, complete, fresh, and free of duplicates


Unfortunately, most mid-to-large companies cannot claim such simple operations. For most large enterprises, CRM never delivered on that promise of a trusted 360-degree customer view. That’s what prompted Gartner analysts Bill O’Kane and Kimberly Collins to write the February 2014 report, MDM is Critical to CRM Optimization.

“The reality is that the vast majority of the Fortune 2000 companies we talk to are complex,” says Christopher Dwight, who leads a team of master data management (MDM) and product information management (PIM) sales specialists for Informatica. Christopher and team spend each day working with retailers, distributors and CPG companies to help them get more value from their customer, product and supplier data. “Business-critical customer data doesn’t live in one place. There’s no clear and simple source. Functional organizations, processes, and systems landscapes are much more complicated. Typically they have multiple selling organizations across business units or regions.”

As an example, listed below are typical functional organizations and the common customer-master-data-dependent applications they rely upon to support the lead-to-cash process within a typical enterprise:

·  Marketing: marketing automation, campaign management and customer analytics systems.
·  Ecommerce: e-commerce storefront and commerce applications.
·  Sales: sales force automation and quote management systems.
·  Fulfillment: ERP, shipping and logistics systems.
·  Finance: order management and billing systems.
·  Customer Service: CRM, IVR and case management systems.

The fragmentation of critical customer data across multiple organizations and applications is further exacerbated by the explosive adoption of Cloud applications such as Salesforce.com and Marketo. Merger and acquisition (M&A) activity is common among many larger organizations where additional legacy customer applications must be onboarded and reconciled. Suddenly your customer data challenge grows exponentially.  

Step 2: Measure how customer data fragmentation impacts your business

Ask yourself: if your customer data is inaccurate, inconsistent and disconnected, can you:

Customer data is fragmented across multiple applications used by business units, product lines, functions and regions.


·  See the full picture of a customer’s relationship with the business across business units, product lines, channels and regions?  

·  Better understand and segment customers for personalized offers, improving lead conversion rates and boosting cross-sell and up-sell success?

·  Deliver an exceptional, differentiated customer experience?

·  Leverage rich sources of third-party data as well as big data (social, mobile, sensors, etc.) to enrich customer insights?

“One company I recently spoke with was having a hard time creating a single consolidated invoice for each customer that included all the services purchased across business units,” says Dwight. “When they investigated, they were shocked to find that 80% of their consolidated invoices contained errors! The root cause was inaccurate, inconsistent and disconnected customer data. This was a serious business problem costing the company a lot of money.”

Let’s do a quick test right now. Are any of these companies your customers: GE, Coke, Exxon, AT&T or HP? Do you know the legal company names for any of these organizations? Most people don’t. I’m willing to bet your CRM application contains at least a handful of variations of these company names, such as Coke, Coca-Cola and The Coca-Cola Company. Chances are there are dozens of variations across the numerous applications where business-critical customer data lives, and those customer profiles are tied to transactions. That makes them hard to clean up: you can’t simply merge duplicate records, because you need to preserve the transaction history and audit history.
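To make the matching problem concrete, here is a minimal sketch of the kind of fuzzy matching an MDM tool applies to flag name variants as candidate duplicates. The suffix list, sample records and the 0.8 threshold are illustrative assumptions, not a production match rule:

```python
# Illustrative sketch of fuzzy-matching company name variants like the
# Coca-Cola examples above. Suffix list, records and threshold are assumptions.
from difflib import SequenceMatcher

LEGAL_SUFFIXES = {"the", "co", "co.", "inc", "inc.", "corp", "corp.", "company"}

def normalize(name: str) -> str:
    """Lowercase, split hyphens, drop commas and common legal suffixes."""
    tokens = name.lower().replace("-", " ").replace(",", "").split()
    return " ".join(t for t in tokens if t not in LEGAL_SUFFIXES)

def similarity(a: str, b: str) -> float:
    """Similarity of two names after normalization, in [0, 1]."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

records = ["Coca-Cola", "The Coca Cola Company", "Coca Cola Co.", "AT&T"]
for i in range(len(records)):
    for j in range(i + 1, len(records)):
        score = similarity(records[i], records[j])
        if score > 0.8:  # flag likely duplicates for steward review
            print(f"candidate duplicate: {records[i]!r} ~ {records[j]!r} ({score:.2f})")
```

In practice this is exactly why matching is a stewardship process: the flagged pairs feed a review queue rather than an automatic merge.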

The same holds true for B2C customers. In fact, I’m a nightmare for a large marketing organization. I get multiple offers and statements addressed to different versions of my name: Jakki Geiger, Jacqueline Geiger, Jackie Geiger and J. Geiger. But my personal favorite is when I get an offer from a company I do business with addressed to “Resident”. Why don’t they know I live here? They certainly know where to find me when they bill me!

Step 3: Transform how you view, manage and share customer data

Why do so many businesses that try to master customer data in CRM fail? Let’s be frank. CRM systems such as Salesforce.com and Siebel CRM were purpose-built to support a specific set of business processes, and for the most part they do a great job. But they were never built to master customer data for the business beyond the scope of their own processes.

But perhaps you disagree with everything discussed so far. Or you’re a risk-taker and want to take on the challenge of bringing all master customer data that exists across the business into your CRM app. Be warned, you’ll likely encounter four major problems:

1) Your master customer data in each system has a different data model with different standards and requirements for capture and maintenance. Good luck reconciling them!

2) To be successful, your customer data must be clean and consistent across all your systems, which is rarely the case.

3) Even if you use DUNS numbers, some systems use the global DUNS number; others use a regional DUNS number. Some manage customer data at the legal entity level, others at the site level. How do you connect those?

4) If there are duplicate customer profiles in CRM tied to transactions, you can’t just merge the profiles because you need to maintain the transactional integrity and audit history. In this case, you’re dead on arrival.
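Problem 4 is worth a closer look. A common MDM pattern is to leave source records (and the transactions tied to them) untouched, and instead maintain a golden record plus a cross-reference mapping every source ID to a master ID. A minimal sketch, with hypothetical record IDs and a deliberately naive "first non-empty value wins" survivorship rule:

```python
# Illustrative sketch: instead of physically merging duplicate profiles,
# keep source records (and their transaction links) untouched and maintain
# a cross-reference to a mastered "golden" profile. IDs are hypothetical.
source_records = [
    {"system": "CRM", "id": "C-101", "name": "Coca Cola Co.", "phone": None},
    {"system": "ERP", "id": "E-884", "name": "The Coca-Cola Company",
     "phone": "+1-404-555-0100"},
]

# Survivorship: build one golden record, taking the first non-empty value
# per attribute (real MDM rules weigh source trust, recency, completeness).
golden = {"master_id": "M-1", "name": None, "phone": None}
for rec in source_records:
    for field in ("name", "phone"):
        if golden[field] is None and rec[field]:
            golden[field] = rec[field]

# Cross-reference: transactions keep pointing at C-101 / E-884; downstream
# systems use the master ID to assemble the consolidated view.
xref = {(r["system"], r["id"]): golden["master_id"] for r in source_records}

print(golden)
print(xref)
```

Because the merge is logical rather than physical, transaction and audit history in the CRM and ERP systems stay intact while reporting rolls up under the master ID.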

There is a better way! Customer-centric, data-driven companies recognize these obstacles and they don’t rely on CRM as the single source of trusted customer data. Instead, they are transforming how they view, manage and share master customer data across the critical applications their businesses rely upon. They embrace master data management (MDM) best practices and technologies to reconcile, merge, share and govern business-critical customer data. 

More and more B2B and B2C companies are investing in MDM capabilities to manage customer households and multiple views of customer account hierarchies (e.g. a legal view can be shared with finance, a sales territory view can be shared with sales, or an industry view can be shared with a business unit).

 

Gartner Report, MDM is Critical to CRM Optimization, Bill O'Kane & Kimberly Collins, February 7 2014.


In that report, Gartner analysts Bill O’Kane and Kimberly Collins warn: “Through 2017, CRM leaders who avoid MDM will derive erroneous results that annoy customers, resulting in a 25% reduction in potential revenue gains.”

Are you ready to reassess your assumptions about mastering customer data in CRM?

Get the Gartner report now: MDM is Critical to CRM Optimization.

Posted in CMO, Customer Acquisition & Retention, Customers, Data Governance, Master Data Management, Mergers and Acquisitions

Health Plans, Create Competitive Differentiation with Risk Adjustment

Exploring Risk Adjustment as a Source of Competitive Differentiation

Risk adjustment is a hot topic in healthcare. Today, I interviewed my colleague, Noreen Hurley, to learn more. Noreen, tell us about your experience with risk adjustment.

Before I joined Informatica, I worked for a health plan in Boston. I managed several programs, including the CMS Five Star Quality Rating System and Risk Adjustment Redesign. We recognized the need for a robust diagnostic profile of our members in support of risk adjustment. However, because the information resides in multiple sources, gathering and connecting the data presented many challenges. I see the opportunity for health plans to transform risk adjustment.

As risk adjustment becomes an integral component of healthcare, I encourage health plans and ACOs to create a core competency around the development of diagnostic profiles. This profile is the source of reimbursement for an individual and the basis for clinical care management. Augmented with social and demographic data, the profile can create a roadmap for successfully engaging each member.

Why is risk adjustment important?

Risk adjustment is increasingly entrenched in the healthcare ecosystem. Originating in Medicare Advantage, it is now applicable to other areas. Risk adjustment is mission-critical to protect financial viability and identify a clinical baseline for members.

What are a few examples of the increasing importance of risk adjustment?

1) Centers for Medicare & Medicaid Services (CMS) continues to increase its focus on risk adjustment. CMS is evaluating the value provided to the Federal government and beneficiaries. It has questioned the efficacy of home assessments and challenged health plans to provide a value statement beyond the harvesting of diagnosis codes that results solely in revenue enhancement. Illustrating additional value has been a challenge; integrating data across the health plan will help address it and derive value.

2) Marketplace members will also require risk adjustment calculations. After the first three years, the three “R’s” will dwindle down to one “R”: when Reinsurance and Risk Corridors end, we will be left with Risk Adjustment. To succeed with this new population, health plans need a clear strategy to obtain, analyze and process data. CMS processing delays make risk adjustment even more difficult, so a health plan’s ability to manage this information will be critical to success.

3) Dual Eligibles, Medicaid members and ACOs also rely on risk adjustment for profitability and improved quality.

With an enhanced diagnostic profile — one that is accurate, complete and shared — I believe it is possible to enhance care, deliver appropriate reimbursements and provide coordinated care.

How can payers better enable risk adjustment?

  • Facilitate timely analysis of accurate data from a variety of sources, in any format.
  • Integrate and reconcile data from initial receipt through adjudication and submission.
  • Deliver clean and normalized data to business users.
  • Provide an aggregated view of master data about members, providers and the relationships between them to reveal insights and enable a differentiated level of service.
  • Apply natural language processing to capture insights otherwise trapped in text-based notes.
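As a toy illustration of that last point, even a simple pattern match can surface ICD-10-style diagnosis codes buried in free-text notes. Real clinical NLP must handle negation, context and medical terminologies; this sketch, with an invented note, only shows the basic idea:

```python
# Toy sketch: surfacing ICD-10-style diagnosis codes trapped in a free-text
# clinical note. The note and regex are illustrative only.
import re

# ICD-10 shape: a letter (U excluded), two digits, optional dotted extension.
ICD10_PATTERN = re.compile(r"\b[A-TV-Z][0-9]{2}(?:\.[0-9A-Z]{1,4})?\b")

note = ("Member seen for follow-up. Dx E11.9 (type 2 diabetes) confirmed; "
        "also monitoring I10 hypertension.")

codes = ICD10_PATTERN.findall(note)
print(codes)  # ['E11.9', 'I10']
```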

With clean, safe and connected data,  health plans can profile members and identify undocumented diagnoses. With this data, health plans will also be able to create reports identifying providers who would benefit from additional training and support (about coding accuracy and completeness).

What will clean, safe and connected data allow?

  • Allow risk adjustment to become a core competency and source of differentiation.  Revenue impacts are expanding to lines of business representing larger and increasingly complex populations.
  • Educate, motivate and engage providers with accurate reporting.  Obtaining and acting on diagnostic data is best done when the member/patient is meeting with the caregiver.  Clear and trusted feedback to physicians will contribute to a strong partnership.
  • Improve patient care, reduce medical cost, increase quality ratings and engage members.
Posted in B2B, B2B Data Exchange, Business Impact / Benefits, Business/IT Collaboration, CIO, Customer Acquisition & Retention, Data Governance, Data Integration, Enterprise Data Management, Healthcare, Master Data Management, Operational Efficiency

Business Beware! Corporate IT Is “Fixing” YOUR Data

It is troublesome to me to repeatedly get into conversations with IT managers who want to fix data “for the sake of fixing it”. While this is presumably increasingly rare, my department’s role means we probably see it more often than the average software vendor employee does. Given that, please excuse the inflammatory title of this post.

Nevertheless, once the deal is done, we encounter fewer of these instances, yet still enough, as the average implementation consultant or developer cares about this aspect even less. A few months ago, a petrochemical firm’s G&G IT team lead told me that he does not believe data quality improvements can or should be measured. He also said, “If we need another application, we buy it. End of story.” Good for software vendors, I thought, but in most organizations $1M here or there does not lie around leisurely, and decision makers want to see the – dare I say it – ROI.

This is not what a business - IT relationship should feel like


However, IT and business leaders should take note that misalignment, due to a lack or disregard of communication, is a critical risk factor. If the business does not get what it needs and wants, and that differs from what corporate IT is envisioning and working on (which is what I am talking about here), any IT investment becomes a risky proposition.

Let me illustrate this with 4 recent examples I ran into:

1. Potential for flawed prioritization

A retail customer’s IT department apparently knew that fixing and enriching a customer loyalty record across the enterprise is a good and financially rewarding idea. They only wanted to understand what the less risky functional implementation choices were. They indicated that if they wanted to learn the factual financial impact of “fixing” certain records or attributes, they would just look into their enterprise data warehouse. This is where the logic falls apart, as the warehouse would be just as unreliable as the “compromised” applications (POS, marketing, ERP) feeding it.

Even if they massaged the data before it hit the next EDW load, there is nothing real-time about this: all the OLTP systems keep running processes on incorrect (no bidirectional linkage) and stale (since the last load) data.

I would question whether the business is truly aligned with what IT is continuously correcting. After all, IT may go for the “easy or obvious” fixes via a weekly or monthly recurring data scrub exercise without truly knowing which fix delivers the “biggest bang for the buck”, or what other affected business use cases exist that they may not even be aware of yet. Imagine the productivity impact of all the round-tripping and reporting delay this creates. This example also reminds me of a telco client I encountered during my tenure at another tech firm, which fed its customer master from its EDW and only just found out that this pattern is doomed to fail due to data staleness and performance.

2. Fix IT issues and business benefits will trickle down

Client number two is a large North American construction company. An architect built a business case for fixing a variety of data buckets in the organization (CRM, Brand Management, Partner Onboarding, Mobility Services, Quotation & Requisitions, BI & EPM).

Grand vision documents existed and were linked to the case, stating how the data would get better (like a sick patient), but there was no mention of hard facts about how each of the use cases would deliver on this. After I gave him some serious counseling on what to look out for and how to flesh it out: radio silence. Someone got scared of the math, I guess.

3. Now that we bought it, where do we start

The third culprit was a large petrochemical firm, which apparently sat on some excess funds and thought (rightfully so) it was a good idea to fix their well attributes. More power to them.  However, the IT team is now in a dreadful position having to justify to their boss and ultimately the E&P division head why they prioritized this effort so highly and spent the money.  Well, they had their heart in the right place but are a tad late.   Still, I consider this better late than never.

4. A senior moment

The last example comes from a South American communications provider. They seemingly did everything right, given the results they have achieved to date. This goes to show that misalignment of IT and business does not necessarily wreak havoc – at least initially.

However, they are now in phase 3 of their rollout, and reality has caught up with them. A senior moment or lapse in judgment, maybe? Whatever it was, once they fixed their CRM, network and billing application data, they had to start talking to the business and financial analysts as complaints and questions started to trickle in. Once again, better late than never.

So what is the takeaway from these stories? Why wait until phase 3? Why be forced to cram in some justification after the purchase? Pick whichever approach works best for you to fix this age-old issue. But please heed Sohaib’s words of wisdom, recently broadcast on CNN Money: “IT is a mature sector post bubble… now it needs to deliver the goods.” And here is an action item for you: check out the new way for business users to prepare their own data (30 minutes into the video!). Agreed?

Posted in Business Impact / Benefits, Business/IT Collaboration, CIO, Customer Acquisition & Retention, Customer Services, Data Aggregation, Data Governance, Data Integration, Data Quality, Data Warehousing, Enterprise Data Management, Master Data Management

MDM Day Advice: Connect MDM to a Tangible Business Outcome or You Will Fail

“Start your master data management (MDM) journey knowing how it will deliver a tangible business outcome. Will it help your business generate revenue or cut costs? Focus on the business value you plan to deliver with MDM and revisit it often,” advises Michael Delgado, Information  Management Director at Citrix during his presentation at MDM Day, the InformaticaWorld 2014 pre-conference program. MDM Day focused on driving value from business-critical information and attracted 500 people.

A record 500 people attended MDM Day in Las Vegas


In Ravi Shankar’s recent MDM Day preview blog, Part 2: All MDM, All Day at Pre-Conference Day at InformaticaWorld, he highlights the amazing line up of master data management (MDM) and product information management (PIM) customers speakers, Informatica experts as well as our talented partner sponsors.

Here are my MDM Day fun facts and key takeaways:

  • Did you know that every 2 seconds an aircraft with GE engine technology is taking off somewhere in the world?

    Ginny Walker, Chief Enterprise Architect at GE Aviation


    GE Aviation’s Chief Enterprise Architect, Ginny Walker, presented “Operationalizing Critical Business Processes: GE Aviation’s MDM Story.” GE Aviation is a $22 billion company and a leading provider of jet engines, systems and services.  Ginny shared the company’s multi-year journey to improve installed-base asset data management. She explained how the combination of data, analytics, and connectivity results in productivity improvements such as reducing up to 2% of the annual fuel bill and reducing delays. The keys to GE Aviation’s analytical MDM success were: 1) tying MDM to business metrics, 2) starting with a narrow scope, and 3) data stewards. Ginny believes that MDM is an enabler for the Industrial Internet and Big Data because it empowers companies to get insights from multiple sources of data.

  •  Did you know that EMC has made a $17 billion investment in acquisitions and is integrating more than 70 technology companies?

    Barbara Latulippe, Senior Director, Enterprise Information Management at EMC

    EMC’s Barbara Latulippe, aka “The Data Diva,” is the Senior Director of Enterprise Information Management (EIM). EMC is a $21.7 billion company that has grown through acquisition and has 60,000 employees worldwide. In her presentation, “Formula for Success: EMC MDM Best Practices,” Barbara warns that if you don’t have a data governance program in place, you’re going to have a hard time getting an MDM initiative off the ground. She stressed the importance of building a data governance council and involving the business as early as possible to agree on key definitions such as “customer.” Barbara and her team focused on the financial impact of higher quality data to build a business case for operational MDM. She asked her business counterparts, “Imagine if you could onboard a customer in 3 minutes instead of 15 minutes?”

  • Did you know that Citrix is enabling the mobile workforce by uniting apps, data and services on any device over any network and cloud?


    Michael Delgado, Information Management Director at Citrix

    Citrix’s Information Management Director, Michael Delgado, presented “Citrix MDM Case Study: From Partner 360 to Customer 360.” Citrix is a $2.9 billion Cloud software company that embarked on a multi-domain MDM and data governance journey for channel partner, hierarchy and customer data. Because 90% of the company’s product bookings are fulfilled by channel partners, Citrix started their MDM journey to better understand their total channel partner relationship, to make it easier to do business with Citrix and boost revenue. Once they were successful with partner data, they turned to customer data. They wanted to boost customer experience by understanding the total customer relationship across product lines and regions. Armed with this information, Citrix employees can engage customers in one product renewal process for all products. MDM also helps Citrix’s sales team with white space analysis to identify opportunities to sell more user licenses in existing customer accounts.

  •  Did you know Quintiles helped develop or commercialize all of the top 5 best-selling drugs on the market?


    John Poonnen, Director Infosario Data Factory at Quintiles

    Quintiles’ Director of the Infosario Data Factory, John Poonnen, presented “Using Multi-domain MDM to Gain Information Insights: How Quintiles Efficiently Manages Complex Clinical Trials.” Quintiles is the world’s largest provider of biopharmaceutical development and commercial outsourcing services, with more than 27,000 employees. John explained how the company leverages a tailored, multi-domain MDM platform to gain a holistic view of business-critical entities such as investigators, research facilities, clinical studies, study sites and subjects to cut costs, improve quality and productivity, and meet regulatory and patient needs. “Although information needs to flow throughout the process, it tends to get stuck in different silos and must be manually manipulated to get meaningful insights,” said John. He believes master data is foundational: combining it with other data, capabilities and expertise makes it transformational.

While I couldn’t attend the PIM customer presentations below, I heard they were excellent. I look forward to watching the videos:

  • Crestline/ Geiger: Dale Denham, CIO presented, “How Product Information in eCommerce improved Geiger’s Ability to Promote and Sell Promotional Products.”
  • Murdoch’s Ranch and Home Supply: Director of Marketing, Kitch Walker presented, “Driving Omnichannel Customer Engagement – PIM Best Practices.”

I also had the opportunity to speak with some of our knowledgeable and experienced MDM Day partner sponsors. Go to Twitter and search for #MDM and #DataQuality to see their advice on what it takes to successfully kick off and implement an MDM program.

There are more thought-provoking MDM and PIM customer presentations taking place this week at InformaticaWorld 2014. To join or follow the conversation, use #INFA14 #MDM or #INFA14 #PIM.

Posted in Business Impact / Benefits, Business/IT Collaboration, CIO, CMO, Customer Acquisition & Retention, Customers, Data Governance, Data Integration, Data Quality, Enterprise Data Management, Informatica World 2014, Master Data Management, Partners, PIM, Product Information Management

Retail Interview: From Product Information to Product Performance

Five questions to Arkady Kleyner, Executive VP & Co-Founder of Intricity LLC, on how retailers can manage the transition from product information to product performance.

Arkady Kleyner

Arkady, you recently came back from the National Retail Federation conference.  What are some of the issues that retailers are struggling with these days?

Arkady Kleyner: There are some interesting trends happening right now in retail. Amazon’s presence is creating a lot of disruption, which is pushing traditional retailers to modernize their customer experience strategies. For example, most Brick and Mortar retailers have a web presence, but they’re realizing that web presence can’t just be a second arm to their business. To succeed, they need to integrate their web presence with their stores in a very intimate way. To make that happen, they really have to peel back the onion down to the fundamentals of how product data is shared and managed.

In the good old days, Brick and Mortar retailers could live with a somewhat disconnected product catalog, because they were always ultimately picking from physical goods. However, in an integrated Web and Brick & Mortar environment, retailers must be far more accurate in their product catalogs. The customer’s entire product selection process may happen online, with the product then picked up at the store. So you can see why retailers need to be far more disciplined with their product data. This is really where a Product Information Management tool is critical: with so many SKUs to manage, retailers need a process that makes sense end to end for onboarding a product and communicating it to the customer. And that is the foundation of building an integrated customer experience.

In times of the digital customer, who is always online and connected, we announced “commerce relevancy” as the next era of omnichannel: tailoring sales and marketing better to customers. What information are you seeing as important when creating a better customer shopping experience?

Arkady Kleyner: This is another paradigm in the integrated customer experience that retailers are trying to get their heads around. To appreciate how involved this is, just consider what a company like Amazon is doing. They have millions of customers, millions of products and thousands of partners. It’s literally a many-to-many-to-many relationship. And this is why Amazon is eating everybody alive. They know what products their customers like, they know how to reach those customers with those products, and they make it easy to buy when you do. This isn’t something Amazon created overnight, but the requirements are no different for the rest of retail: they need to ramp up the same type of capacity and reach. For example, if I sell jewelry, I may be selling it on my own company store, but I may also have 5 other partnering sites including Amazon. Additionally, I may be using a dozen different advertising methods to drive demand. Now multiply that by the number of jewelry products I sell and you have a massive hairball of complexity. This is what we mean when we say that retailers need to be far more disciplined with their product data. Having a Product Information Management process that spans the onboarding of products all the way through to the digital communication of those products is critical to a retailer staying relevant.

In which businesses do you see the need for more efficient product catalog management and channel convergence?

Arkady Kleyner: There is a huge opportunity out there for the existing Brick & Mortar retailers that embrace an integrated customer experience.  Amazon is not the de facto winner.  We see a future where the store near you actually IS the online store.  But to make that happen, Brick and Mortar retailers need to take a serious step back and treat their product data with the same reverence as they treat the product itself.  This means a well-managed process for onboarding, de-duping, and categorizing their product catalog, because all the customer marketing efforts are ultimately an extension of that catalog.

Which performance indicators are important? How can retailers profit from it?

Arkady Kleyner: There are two layers of performance indicators that are important. The first is Operational Intelligence: the intelligence that determines which product should be shown to whom, based on customer profiling of purchase history. The second is Strategic Intelligence: the kind that helps you make overarching decisions on things like:

  • Maximizing product margin by analyzing shipping and warehousing options
  • Understanding product performance by demographics and regions
  • Providing flash reports for sales and marketing
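As a sketch of the first strategic indicator, per-unit margin can be compared across fulfillment options once shipping and warehousing costs are integrated. All figures below are invented for illustration:

```python
# Hypothetical sketch: per-unit margin as a strategic indicator, comparing
# fulfillment options once shipping and warehousing costs are integrated.
def unit_margin(price: float, cogs: float, shipping: float, warehousing: float) -> float:
    """Net margin per unit after landed fulfillment costs."""
    return price - (cogs + shipping + warehousing)

options = {
    "regional warehouse": unit_margin(price=49.99, cogs=22.00, shipping=4.50, warehousing=1.20),
    "central warehouse": unit_margin(price=49.99, cogs=22.00, shipping=7.80, warehousing=0.60),
}

best = max(options, key=options.get)
for name, margin in options.items():
    print(f"{name}: ${margin:.2f} per unit")
print("best option:", best)
```

The point of the indicator is the comparison itself: the same product can carry a noticeably different margin depending on which fulfillment path it takes.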

Which tools are needed to streamline product introduction but also achieve sales numbers?

Arkady Kleyner: Informatica is one of the few vendors that cares about data the same way retailers care about their products. So if you’re a retailer and you want to treat your product data with the same reverence as your physical products, you should consider leveraging Informatica as a partner. Their platform for managing product data is designed to encapsulate the entire process of onboarding, de-duping, categorizing and syndicating product data. Additionally, Informatica PIM provides a platform for managing all the digital media assets, so Marketing teams are able to focus on strategy rather than tactics. We have also worked with Informatica’s data integration products to bring performance data from Point of Sale systems for both strategic and tactical uses. On the tactical side, we have used this to integrate inventories between Web and Brick & Mortar so customers can have an integrated experience. On the strategic side, we have integrated Warehouse Management Systems with labor cost tracking systems to provide a 360-degree view of product costing, including shipping and storage, to drive higher per-unit margins.

You can hear more from Arkady in our webinar, “The Streamlined SKU: Using Analytics for Quick Product Introductions,” on Tuesday, March 4, 2014.

Posted in Business Impact / Benefits, Customer Acquisition & Retention, PiM, Product Information Management, Real-Time, Retail, Uncategorized | Leave a comment

Death of the Data Scientist: Silver Screen Fiction?

Maybe the word “death” is a bit strong, so let’s say “demise” instead.  Recently I read an article in the Harvard Business Review about how Big Data and Data Scientists will rule the world of the 21st century corporation and how they have to operate for maximum value.  The thing I found rather disturbing was that it takes a PhD – probably a few of them – in a variety of math areas to give executives the insight they need to make better decisions, ranging from what product to develop next to whom to sell it and where.

Who will walk the next long walk…. (source: Wikipedia)

Don’t get me wrong – this is mixed news for any enterprise software firm helping businesses locate, acquire, contextually link, understand and distribute high-quality data.  The existence of such a high-value role validates product development, but it also limits adoption.  It is also great news that data has finally gathered the attention it deserves.  But I am starting to ask myself why it always takes individuals with a “one-in-a-million” skill set to add value.  What happened to the democratization of software?  Why is the design starting point for enterprise software not always similar to B2C applications, like an iPhone app, i.e. simpler is better?  Why is it always such a gradual “Cold War” evolution instead of a near-instant French Revolution?

Why do development environments for Big Data not accommodate limited or existing skills but always accommodate the most complex scenarios?  Well, the answer could be that the first customers will be very large, very complex organizations with super complex problems, which they were unable to solve so far.  If analytical apps have become a self-service proposition for business users, data integration should be as well.  So why does access to a lot of fast-moving and diverse data require scarce Pig or Cassandra developers to get the data into an analyzable shape and a PhD to query and interpret patterns?

I realize new technologies start with a foundation and, as they spread, supply will attempt to catch up to create an equilibrium.  However, this is about a problem which has existed for decades in many industries, such as the oil & gas, telecommunications, public and retail sectors. Whenever I talk to architects and business leaders in these industries, they chuckle at “Big Data” and tell me “yes, we got that – and by the way, we have been dealing with this reality for a long time”.  By now I would have expected that the skill (cost) side of turning data into meaningful insight would have been driven down more significantly.

Informatica has made a tremendous push in this regard with its “Map Once, Deploy Anywhere” paradigm.  I cannot wait to see what’s next – and I just saw something recently that got me very excited.  Why, you ask? Because at some point I would like to have at least a business super-user pummel terabytes of transaction and interaction data into an environment (Hadoop cluster, in-memory DB…) and massage it so that his or her self-created dashboard gets them where they need to go.  This should include concepts like: “where is the data I need for this insight?”, “what is missing and how do I get to that piece in the best way?”, “how do I want it to look to share it?” All that should be required is working knowledge of Excel and PowerPoint to get your hands on advanced Big Data analytics.  Don’t you think?  Do you believe that this role will disappear as quickly as it has surfaced?

Posted in Big Data, Business Impact / Benefits, CIO, Customer Acquisition & Retention, Customer Services, Data Aggregation, Data Integration, Data Integration Platform, Data Quality, Data Warehousing, Enterprise Data Management, Financial Services, Healthcare, Life Sciences, Manufacturing, Master Data Management, Operational Efficiency, Profiling, Scorecarding, Telecommunications, Transportation, Uncategorized, Utilities & Energy, Vertical | Tagged , , , , | 1 Comment

Hospitality Execs: Invest in Great Customer Information to Support A Customer-Obsessed Culture

I love exploring new places. I’ve had exceptional experiences at the W in Hong Kong, El Dorado Royale in the Riviera Maya and Ventana Inn in Big Sur. I belong to almost every loyalty program under the sun, but not all hospitality companies are capitalizing on the potential of my customer information. Imagine if employees had access to it so they could personalize their interactions with me and send me marketing offers that appeal to my interests.

Do I have high expectations? Yes. But so do many travelers. This puts pressure on marketing and sales executives who want to compete to win. According to Deloitte’s report, “Hospitality 2015: Game changers or spectators?,” hospitality companies need to adapt to meet consumers’ increasing expectations to know their preferences and tastes and to customize packages that suit individual needs.

Jeff Klagenberg helps companies use data as a strategic asset and get the most value out of it.

In this interview, Jeff Klagenberg, senior principal at Myers-Holum, explains how one of the largest, most customer-focused companies in the hospitality industry is investing in better customer, product, and asset information. Why? To personalize customer interactions, bundle appealing promotion packages and personalize marketing offers across channels.

Q: What are the company’s goals?
A: The executive team at one of the world’s leading providers of family travel and leisure experiences is focused on achieving excellence in quality and guest services. They generate revenues from the sales of room nights at hotels, food and beverages, merchandise, admissions and vacation club properties. The executive team believes their future success depends on stronger execution based on better measurement and a better understanding of customers.

Q: What role does customer, product and asset information play in achieving these goals?
A: Without the highest quality business-critical data, how can employees continually improve customer interactions? How can they bundle appealing promotional packages or personalize marketing offers? How can they accurately measure the impact of sales and marketing efforts? The team recognized the powerful role of high quality information in their pursuit of excellence.

Q: What are they doing to improve the quality of this business-critical information?
A: To get the most value out of their data and deliver the highest quality information to business and analytical applications, they knew they needed to invest in an integrated information management infrastructure to support their data governance process. Now they use the Informatica Total Customer Relationship Solution, which combines data integration, data quality, and master data management (MDM). It pulls together fragmented customer information, product information, and asset information scattered across hundreds of applications in their global operations into one central, trusted location where it can be managed and shared with analytical and operational applications on an ongoing basis.
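To give a feel for what “pulling fragmented customer information into one central, trusted location” involves, here is a deliberately simplified Python sketch of a survivorship rule: each field of the golden record is filled from the highest-priority source that has a value. The field names and rule are illustrative only; real MDM match-and-merge logic is far richer:

```python
def consolidate(records, source_priority):
    """Build one 'golden record' per customer (keyed on lowercased email),
    filling each field from the highest-priority source that has a value.
    Field names and the survivorship rule are illustrative only."""
    golden = {}
    ranked = sorted(records, key=lambda r: source_priority.index(r["source"]))
    for record in ranked:
        entry = golden.setdefault(record["email"].lower(), {})
        for field, value in record.items():
            if field != "source" and value and field not in entry:
                entry[field] = value
    return golden
```

For example, a CRM record could supply the name and phone number while an ERP record fills in fields the CRM left blank, yielding one consolidated view per customer.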

Many marketers overlook the importance of using high quality customer information in their investments in personalization.

Q: How will this impact marketing and sales?
A: With clean, consistent and connected customer information, product information, and asset information in the company’s applications, they are optimizing marketing, sales and customer service processes. They get limitless insights into who their customers are and their valuable relationships, including households, corporate hierarchies and influencer networks. They see which products and services customers have purchased in the past, their preferences and tastes. High quality information enables the marketing and sales team to personalize customer interactions across touch points, bundle appealing promotional packages, and personalize marketing offers across channels. They have a better understanding of which marketing, advertising and promotional programs work and which don’t.

Q: What role did the marketing and sales leaders play in this initiative?
A: The marketing leaders and sales leaders played a key role in getting this initiative off the ground. With an integrated information management infrastructure in place, they’ll benefit from better integration between business-critical master data about customers, products and assets and transaction data.

Q. How will this help them gain customer insights from “Big Data”?
A. We helped the business leaders understand that getting customer insights from “Big Data” such as weblogs, call logs, social and mobile data requires a strong backbone of integrated business-critical data. By investing in a data-centric approach, they future-proofed their business. They are ready to incorporate any type of data they will want to analyze, such as interaction data. A key realization was that there is no such thing as “Small Data.” The future is about getting every bit of understanding out of every data source.

Q: What advice do you have for hospitality industry executives?
A: Ask yourself, “Which of our strategic initiatives can be achieved with inaccurate, inconsistent and disconnected information?” Most executives know that the business-critical data in their applications, used by employees across the globe, is not the highest quality. But they are shocked to learn how much this is costing the company. My advice is talk to IT about the current state of your customer, product and asset information. Find out if it is holding you back from achieving your strategic initiatives.

Also, many business executives are excited about the prospect of analyzing “Big Data” to gain revenue-generating insights about customers. But the business-critical data about customers, products and assets is often in terrible shape. To use an analogy: look at a wheat field and imagine the bread it will yield. But don’t forget: if you don’t separate the grain from the chaff, you’ll be disappointed with the outcome. If you are working on a Big Data initiative, don’t forget to invest in the integrated information management infrastructure required to give you the clean, consistent and connected information you need to achieve great things.

Posted in Customer Acquisition & Retention, Customers, Data Integration, Data Quality, Enterprise Data Management, Master Data Management | Tagged , , , , , , , , , , , , , , , , , , , | Leave a comment

Citrix Boosts Lead Conversion Rates by 20% with Better Customer Information

“If you don’t like change, you’re going to like irrelevancy a lot less.” I saw this powerful Ralph Waldo Emerson quotation in an MDM Summit presentation by Dagmar Garcia, senior manager of marketing data management at Citrix.  In this interview, Dagmar explains how Citrix is achieving a measurable impact on marketing results by improving the quality of customer information and prospect information.

To improve marketing campaign effectiveness, Dagmar Garcia delivers clean, consistent and connected customer information.

Q: What is Citrix’s mission?
A: Citrix is a $2.6 billion company. We help people work and collaborate from anywhere by easily accessing enterprise applications and data from any device. More than 250,000 organizations around the globe use our solutions and we have over 10,000 partners in 100 countries who resell Citrix solutions.  

Q: What are marketing’s goals?
A: We operate in a hyper-competitive market. It’s critical to retain and expand relationships with existing enterprise and SMB customers and attract new ones. The marketing team’s goals are to boost campaign effectiveness and lead-to-opportunity conversion rates, while improving operational efficiencies.

But it’s difficult to create meaningful customer segments and target them with relevant cross-sell and up-sell offers if marketing lacks access to clean, consistent and connected customer information and visibility into the total customer relationship across product lines.

Q: What is your role in achieving these goals?
A: I’ve been responsible for global marketing data management at Citrix for six years. My role is to identify, implement and maintain technical and business data management processes. I work with marketing leadership, GEO-based team members, sales operations, and operational experts to understand requirements, develop solutions and communicate results. I strive to create innovative solutions to improve the quality of master data at Citrix, including the roll-out and successful adoption of data governance and stewardship practices within Marketing and across other departments.

Q: What drove the decision to tackle inaccurate, inconsistent and disconnected customer and prospect information?
A: In 2011, the quality of customer information and prospect information was identified as the #1 problem by our sales and marketing teams. Account and contact information was incomplete, inaccurate and duplicated in our CRM system.

Another challenge was fragmented and inconsistent master account information scattered across the organization’s multiple applications.  It was difficult to know which source had the most accurate and up-to-date customer and prospect information.

To be successful, we needed a single source of the truth, one system of reference where data management best practices were centralized and consistent. This was a requirement to understand the total customer relationship across product lines. We asked ourselves:

  1. How can we improve campaign effectiveness if more than 40% of the contacts in our customer relationship management system (CRM) are inactive?
  2. How can we create meaningful customer segments for targeted cross-sell and up-sell offers when we don’t have visibility into all the products they already have?
  3. How can we improve lead to opportunity conversion rates if we have incomplete prospect data?
  4. How can we improve operational efficiencies if our rate of duplicate customer and prospect information is double the industry standard?
  5. How can we maintain high data quality standards in our global operations if we lack the data quality technology and processes needed to be successful?
Citrix is investing in a marketing data management foundation to deliver better quality customer information to operational and analytical applications.

Q: How are you managing customer and prospect information now?
A: We built a marketing data management foundation. We centralized our data management and reduced manual, error-prone and time-consuming data quality efforts. To decrease the duplicate account and contact rate, we focused on managing the quality of our data as close to the source as possible by improving data validation at points of entry.
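As a flavor of what “data validation at points of entry” can look like, here is a minimal Python sketch. The field names and rules are illustrative, not Citrix’s actual checks:

```python
import re

# Deliberately simple email shape check -- real systems use richer validation.
EMAIL_PATTERN = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def validate_contact(record):
    """Return a list of problems with a new contact record; an empty list
    means the record may enter the system. Rules are illustrative only."""
    problems = []
    if not record.get("name", "").strip():
        problems.append("missing name")
    if not EMAIL_PATTERN.match(record.get("email", "")):
        problems.append("invalid email")
    phone_digits = re.sub(r"\D", "", record.get("phone", ""))
    if len(phone_digits) < 7:
        problems.append("phone too short")
    return problems
```

Rejecting or flagging bad records at the point of entry like this is far cheaper than de-duplicating and correcting them downstream.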

Q: What role does Informatica play?
A: We use master data management (MDM) to:

  • pull together fragmented customer, prospect and partner information scattered across applications into one central, trusted location where it can be mastered, managed and shared on an ongoing basis,
  • organize customer, prospect and partner information so we know how companies and people are related to each other, which hierarchies and networks they belong to, including their roles and organizations, and
  • syndicate clean, consistent and connected customer, partner and product information to applications, such as CRM and data warehouses for analytics.

Q: Why did you choose Informatica?
A: After completing a thorough analysis of our gaps, we knew the best solution was a combination of MDM technology and a data governance process. We wanted to empower the business to manage customer information, navigate multiple hierarchies, handle exceptions and make changes with a transparent process through an easy-to-use interface.

At the same time, we did extensive industry research and learned Informatica MDM was ranked as a visionary and thought leader in the master data management solution space and could support our data governance process.

Q: Can you share some of the results you’ve achieved?
A: Now that marketing uses clean, consistent and connected customer and prospect information and has an understanding of the total customer relationship, we’ve seen a positive impact on these key metrics:

↑ 20% lead-to-opportunity conversion rates
↑ 20% operational efficiency
↑ 50% quality data at point of entry
↓ 50% in prospect accounts duplication rate
↓ 50% in creation of duplicate prospect accounts and contacts
↓ 50% in junk data rate

Posted in Customer Acquisition & Retention, Data Governance, Master Data Management | Tagged , , , , , , , , | Leave a comment

Murphy’s First Law of Bad Data – If You Make A Small Change Without Involving Your Client – You Will Waste Heaps Of Money

I have not used my personal encounter with bad data management for over a year, but a couple of weeks ago I was compelled to revive it.  Why, you ask? Well, a complete stranger started to receive text messages intended for a friend of mine – including messages from me – and it took days for him to detect it; a week later, nobody at this North American wireless operator had been able to fix it.  This coincided with a meeting I had with a European telco’s enterprise architecture team.  There was no better way to illustrate to them how a customer reacts, and the risk to their operations, when communication breaks down due to just one tiny thing changing – say, his address (or in the SMS case, some random SIM mapping – another type of address).

Imagine the cost of other bad data (thecodeproject.com)

In my case, I moved about 250 miles within the United States a couple of years ago, and this seemingly common experience triggered a plethora of communication screw-ups across every merchant a residential household engages with frequently, e.g. your bank, your insurer, your wireless carrier, your average retail clothing store, etc.

For more than two full years after my move to a new state, the following things continued to pop up on a monthly basis due to my incorrect customer data:

  • In the case of my old satellite TV provider, they got to me (the correct person) but with a misspelled last name, at my correct new address.
  • My bank put me in a bit of a pickle as they sent “important tax documentation”, which I did not want to open, as my new tenants’ names (in the house I just vacated) were on the letter but with my new home’s address.
  • My mortgage lender sends me a refinancing offer to my new address (right person and right address) but with my wife’s name as well as mine completely butchered.
  • My wife’s airline, where she enjoys the highest level of frequent flyer status, continually mails her offers duplicating her last name as her first name.
  • A high-end furniture retailer sends two 100-page glossy catalogs probably costing $80 each to our address – one for me, one for her.
  • A national health insurer sends “sensitive health information” (disclosed on envelope) to my new residence’s address but for the prior owner.
  • My legacy operator turns on the wrong premium channels on half my set-top boxes.
  • The same operator sends me an SMS the next day thanking me for switching to electronic billing as part of my move, which I did not sign up for, followed by payment notices (as I did not get my invoice in the mail).  When I called this error out to their contact center over the next three months, pointing out how much revenue I generate for them across all services, they countered with “sorry, we don’t have access to the wireless account data”, “you will see it change on the next bill cycle” and “you show as paper billing in our system today”.

Ignoring the potential for data privacy lawsuits, you start wondering how long you have to be a customer, and how much money you need to spend with a merchant (and they need to waste), before they take changes to your data more seriously.  And these are not even merchants to whom I am brand new – these guys have known me and taken my money for years!

One thing I nearly forgot… these mailings all happened at least once a month on average, sometimes twice, over two years.  If I do some pigeon math here, I would estimate the postage and production cost alone to run into the hundreds of dollars.

However, the most egregious trespass belonged to my homeowner’s insurance carrier (HOI), which was also my mortgage broker.  They had a double whammy in store for me.  First, I received a cancellation notice from the HOI for my old residence indicating they had cancelled my policy because the last payment was not received, and that any claims would be denied as a consequence.  Then, my new residence’s HOI advised that they had added my old home’s HOI policy to my account.

After wondering what I could have possibly done to trigger this, I called all four parties (not three as the mortgage firm did not share data with the insurance broker side – surprise, surprise) to find out what had happened.

It turns out that I had to explain and prove to all of them how one party’s data change during my move erroneously exposed me to liability.  It felt like the old days, when seedy telco sales people needed only your name and phone number, tied to some promotion you never took part in (the back of a raffle card to win a new car), to switch your long distance carrier and present you with a $400 bill the coming month.  Yes, that also happened to me…many years ago.  Here again, the consumer had to do all the legwork when someone (not an automatic process!) switched some entry without any oversight or review, triggering hours of wasted effort on their side and mine.

We can argue all day long about whether these screw-ups are due to bad processes or bad data, but in reality even processes are triggered by some sort of underlying event, which can be something as mundane as a database field’s flag being updated when your last purchase puts you in a new marketing segment.

Now imagine you get married and your wife changes her name. With all these company-internal (CRM, Billing, ERP), free public (property tax), commercial (credit bureaus, mailing lists) and social media data sources out there, you would think such everyday changes could get picked up quicker and automatically.  If not automatically, then should there not be some sort of trigger to kick off a “governance” process; something along the lines of “email/call the customer if attribute X has changed” or “please log into your account and update your information – we heard you moved”?  If American Express was able to detect ten years ago that someone purchased $500 worth of product with your credit card at a gas station or some lingerie website known for fraudulent activity, why not your bank or insurer, who know even more about you? And yes, that happened to me as well.
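That governance trigger – “kick off a review if attribute X has changed” – is conceptually simple. Here is a hedged Python sketch of the comparison that could sit behind such a trigger; the attribute names are illustrative, not any vendor’s actual schema:

```python
WATCHED = ("address", "last_name", "email")  # illustrative attribute names

def detect_changes(old, new, watched=WATCHED):
    """Return the watched attributes whose values differ between two
    versions of a customer record."""
    return [field for field in watched if old.get(field) != new.get(field)]

def governance_tasks(old, new):
    """One verification task per changed watched attribute -- e.g. a prompt
    to email or call the customer to confirm the change."""
    return [f"verify '{field}' change with customer"
            for field in detect_changes(old, new)]
```

A change to the address field, for instance, would surface a single verification task instead of silently propagating through billing and marketing.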

Tell me about one of your own “data-driven” horror scenarios.

Posted in Banking & Capital Markets, Business Impact / Benefits, Business/IT Collaboration, Complex Event Processing, Customer Acquisition & Retention, Customer Services, Customers, Data Aggregation, Data Governance, Data Privacy, Data Quality, Enterprise Data Management, Financial Services, Governance, Risk and Compliance, Healthcare, Master Data Management, Retail, Telecommunications, Uncategorized, Vertical | Tagged , , , , , , , , , | Leave a comment

Understand Customer Intentions To Manage The Experience

I recently had a lengthy conversation with a business executive at a European telco.  His biggest concern was not only to understand the motivations and related characteristics of consumers, but to gain this insight much faster than before.  Given available resources and current priorities, this is unattainable for many operators.

Unlike a few years ago – remember the time before the iPad – his organization today is awash with data points from millions of devices, hundreds of device types and many applications.

What will he do next?

One way for him to understand consumer motivation, and therefore intentions, is to get a better view of a user’s network and all related interactions and transactions.  This includes his family household, friends and business network (also a type of household).  The purpose of householding is to capture social and commercial relationships in a grouping of individuals (or businesses, or both mixed together) in order to identify patterns (context), which can be exploited to better serve a customer: a new individual product or bundle upsell, or pushing relevant apps, audio and video content.
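At its core, householding is a grouping problem: declare links between individuals (shared address, shared bill payer, family tie) and let the groups emerge. Here is a toy Python sketch using a union-find structure, one common way to implement such grouping – not any vendor’s actual algorithm:

```python
class Households:
    """Toy union-find grouping: people linked by any declared relationship
    (shared address, shared payer, family tie) end up in one household."""

    def __init__(self):
        self.parent = {}

    def find(self, person):
        # Path-halving lookup of the group representative.
        self.parent.setdefault(person, person)
        while self.parent[person] != person:
            self.parent[person] = self.parent[self.parent[person]]
            person = self.parent[person]
        return person

    def link(self, a, b):
        # Declare a relationship; the two groups merge.
        self.parent[self.find(a)] = self.find(b)

    def same_household(self, a, b):
        return self.find(a) == self.find(b)
```

Linking subscribers pairwise as relationships are discovered is enough to answer the key question, “are these two people in the same household?”, without ever enumerating the households up front.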

Let’s add another layer of complexity by understanding not only who a subscriber is, who he knows and how often he interacts with these contacts and the services he has access to via one or more devices but also where he physically is at the moment he interacts.  You may also combine this with customer service and (summarized) network performance data to understand who is high-value, high-overhead and/or high in customer experience.  Most importantly, you will also be able to assess who will do what next and why.

Some of you may be thinking, “Oh gosh, the next NSA program in the making.”   Well, it may sound like it, but the reality is that this data is out there today, available and interpretable if cleaned up, structured, linked and served in real time.  Not only do data quality, ETL, analytical and master data systems provide the data backbone for this reality, but process-based systems dealing with the systematic real-time engagement of consumers are the tool to make it actionable.  If you add some sort of privacy rules using database- or application-level masking technologies, most of us would feel more comfortable about this proposition.
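As a taste of the field-level masking mentioned above, here is a minimal Python sketch. Real database- and application-level masking products are policy-driven and far more capable; this just shows the basic idea of hiding all but the tail of a sensitive value:

```python
def mask(value, keep=4, mask_char="*"):
    """Mask all but the last `keep` characters of a sensitive field,
    e.g. a card number or national ID."""
    if len(value) <= keep:
        return mask_char * len(value)
    return mask_char * (len(value) - keep) + value[-keep:]
```

Applied to a card number, only the last four digits remain visible to an application or analyst who does not need the full value.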

This may feel like a massive project but, as with many things in IT life, it depends on how you scope it.  I am a big fan of incrementally mastering more and more attributes of certain customer segments, business units and geographies, where lessons learnt can be replicated over and over to scale.  Moreover, I am a big fan of figuring out what you are trying to achieve before even attempting to tackle it.

The beauty behind a “small” data backbone – more about “small data” in a future post – is that if a certain concept does not pan out in terms of effort or result, you have wasted only a small pile of cash instead of the $2 million for a complete throw-away.  For example: if you initially decided that the central linchpin in your household hub & spoke model is the person who owns the most contracts with you, rather than the person who pays the bills every month or who has the largest average monthly bill, moving to an alternative perspective does not impact all services, all departments and all clients.  Nevertheless, the role of each user in the network must be defined over time to achieve context, i.e. who is a contract signee, who is a payer, who is a user, who is an influencer, who is an employer, etc.

Why is this important to a business? Because without the knowledge of who consumes, who pays for and who influences the purchase or change of a service or product, how can one create the right offers and target them to the right individual?

However, in order to make this initial call about household definition and scope, or to look at the options that are available and sensible, you have to consider social and cultural conventions, what you are trying to accomplish commercially, and your current data set’s ability to achieve anything without a massive enrichment program.  A couple of years ago, at a Middle Eastern operator, it was very clear that the local patriarchal society dictated that the center of this hub and spoke model was the oldest non-retired male in the household, as all contracts down to children of cousins would typically run under his name.  The goal was to capture extended family relationships more accurately and completely in order to create and sell new family-type bundles for greater market penetration and to maximize usage given new bandwidth capacity.

As a parallel track, aside from further rollout to other departments, customer segments and geographies, you may also want to start thinking like another European operator I engaged with a couple of years ago.  They were trying to outsource some data validation and enrichment to their subscribers, which allowed for a more accurate and timely capture of changes, often life-style changes (moves, marriages, new jobs).  The operator could then offer new bundles and roaming upsells. As a side effect, it also created a sense of empowerment and engagement in the client base.

I see bits and pieces of some of this being used when I switch on my home communication systems, running a broadband signal through my Xbox or set-top box into my TV for Netflix, Hulu and gaming.  Moreover, a US cable operator actively promotes a “moving” package to help make sure you do not miss a single minute of entertainment when relocating.

Every time I switch on my TV now, I get content suggested to me.  If telecommunication services were a bit more competitive in the US (an odd thing to say in every respect) and prices came down to European levels, I would actually take advantage of the offer.  And then there is the log-on pop-up asking me to subscribe to (or troubleshoot) a channel I have already subscribed to.  I wonder who, or what automated process, switched that flag.

Ultimately, there cannot be a good customer experience without understanding customer intentions.  I would love to hear stories from other practitioners on what they have seen in this respect.

Posted in Business Impact / Benefits, Complex Event Processing, Customer Acquisition & Retention, Customer Services, Customers, Data Integration, Data Quality, Master Data Management, Profiling, Real-Time, Telecommunications, Vertical | Tagged , , , , , , , , , | Leave a comment