Category Archives: Master Data Management

Right Product, Right Customer, Right Place – The Informed Purchase Journey

The Informed Purchase Journey

The way we shop has changed. It’s hard to keep up with customer demands in a single channel, much less across many. Selling products has changed and will keep changing. The video below shows how today’s customer takes The Informed Purchase Journey:

“Customers expect a seamless experience that makes it easy for them to engage at every touchpoint on their decision journey. Informatica PIM is a key component in the transformation from a product-centric view to consumer-experience-driven marketing with more efficiency.” – Heather Hanson, Global Head of Marketing Technology at Electrolux

Selling products today is:

  • Shopper-controlled. It’s never been easier for consumers to compare products and prices. This has eroded old customer loyalty and means you have to earn every sale.
  • Global. If you’re selling your products in different regions, you’re facing complex localization and supply chain coordination.
  • Fast. Product lifecycles are short. Time-to-market is critical (and gets tougher the more channels you’re selling through).
  • SKU-heavy. Endless-aisle assortments are great for margins and a huge opportunity, but the sheer number of SKUs and attributes creates product data overload and a heavy administrative burden.
  • Data driven. Product data alone is more than a handful to deal with. But you also need to know as much about your customers as you know about your products. And the explosion of channels and touch points doesn’t make it any easier to connect the dots.

Conversion Power – From Deal Breaker To Deal Maker

For years, a customer’s purchase journey was something of “An Unexpected Journey.” Lack of insight into the journey was a struggle for retailers and brands. The journey is fraught with more questions about products than ever before, even for fast-moving consumer goods.

Consumer behavior and the role of product information have changed with the advent of abundant bandwidth and social buying. To understand the shift, let’s examine the way shoppers buy today.

  • Shoppers consult 10.4 sources on average before making a purchase (Google’s Zero Moment of Truth, ZMOT, research).
  • Mobile shoppers who view customer content such as reviews show a 133% higher conversion rate.
  • Digital devices will influence 50% of in-store purchase behavior by the end of 2014 (Deloitte’s The Digital Divide).

How Informatica PIM 7.1 turns information from deal breaker to deal maker

PIM 7.1 comes with new data quality dashboards that help users such as category managers, marketing copywriters, managers and e-commerce specialists do the right things. The dashboards point users to what they need to do next to get the data correct, complete and ready for sales.
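
As a rough illustration of what such a readiness check boils down to, here is a minimal sketch; the channel and attribute names are hypothetical, not the actual PIM dashboard logic. A record is simply scored against the attributes each channel requires:

    # Hypothetical sketch of a channel-readiness check like the one a data
    # quality dashboard surfaces; channel and attribute names are invented.
    REQUIRED_BY_CHANNEL = {
        "webshop": ["title", "long_description", "image_url", "price", "gtin"],
        "print_catalog": ["title", "short_description", "price"],
    }

    def readiness(product: dict, channel: str) -> tuple[float, list[str]]:
        """Return the share of required attributes present, plus the missing ones."""
        required = REQUIRED_BY_CHANNEL[channel]
        missing = [attr for attr in required if not product.get(attr)]
        return 1 - len(missing) / len(required), missing

    product = {"title": "Cordless Drill X200", "price": 129.99, "gtin": "4006381333931"}
    score, missing = readiness(product, "webshop")
    print(f"{score:.0%} ready for webshop; still missing: {missing}")
    # -> 60% ready for webshop; still missing: ['long_description', 'image_url']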

Eliminate Shelf Lag: The Early Product Closes the Sale

For vendors, this effectively means time-to-market: the availability of a product plus the time it takes to collect all relevant product information so you can display it to the customer (product introduction time).

The biggest threat is not the competition – it’s your own time-consuming, internal processes. We call this Shelf Lag, and it’s a big inhibitor of retailer profits. Here’s why:

  • You can’t sell what you can’t display.
  • Be ready to spin up new channels.
  • Watch your margins.

How Informatica PIM 7.1 speeds up product introduction and customer experience

“By 2017… customer experience is what buyers are going to use to make purchase decisions.” (Source: Gartner’s Hype Cycle for E-Commerce, 2013) PIM 7.1 comes with new editable channel previews, which help business users such as marketers, translators, merchandisers and product managers envision how a product will look in the customer-facing webshop, catalog or other touchpoint. Getting products live online within seconds is key, because the customer always wants it now. For e-commerce product data, Informatica PIM is certified for IBM WebSphere Commerce, getting products ready for e-commerce within seconds.

The editable channel previews help professionals in product management, merchandising, marketing and e-commerce see their products as customers will see them. This “what you see is what you get” (WYSIWYG) approach to product data management improves the customer shopping experience with accurate, authentic information. With the new e-commerce integration, certified by IBM WebSphere Commerce, Informatica speeds up time to market in e-business by enabling live updates of eShops through real-time integration.

The growing need for fast and secure collaboration across globally operating enterprises is addressed by Informatica’s Business Process Management tool, which is now available to PIM customers.

Intelligent insights: How relevant is your offering to your customers?

This is the age of annoyance and information overload. Each day, the average person has to handle more than 7,000 pieces of information. Only 25% of Americans say they are brand loyal. That means brands and retailers have to earn every new sale in a transparent world. In this context, information needs to be relevant to the recipient.

  • Where does the data come from? How can product information be auto-cleansed and classified into a taxonomy?
  • Is supplier performance hitting our standards?
  • How can we mitigate risks like hidden costs and work with trusted suppliers only?
  • How can we build customer segmentations for marketing?
  • How can we personalize products and predict the customer’s next logical buy?

It is all about the Right Product. To the Right Person. In the Right Way. Learn more about the vision of the Intelligent Data Platform.

Informatica PIM Builds the Basis of Real Time Commerce Information

All these innovations speed up new product introduction and collaboration massively. As buyers today are always online and connected, PIM helps our customers serve the informed purchase journey with the right information, at the right touchpoint, in real time.

  1. Real-time commerce (certification with IBM WebSphere Commerce), which eliminates shelf lag
  2. Editable channel previews, which help envision how customers will view the product
  3. Data quality dashboards for improved conversion power, which means selling more with better information
  4. Business Process Management for better collaboration throughout the enterprise
  5. An accelerator for global data synchronization (GDSN networks such as GS1 for food and CPG), which helps improve data quality and fulfill legal requirements

All this makes merchandisers more productive and increases average spend per customer.

Find out how the new release of Informatica PIM 7.1 helps you to unleash conversion power on the customer’s informed purchase journey.


The Five C’s of Data Management


A few days ago, I came across a post, 5 C’s of MDM (Case, Content, Connecting, Cleansing, and Controlling), by Peter Krensky, Sr. Research Associate, Aberdeen Group and this response by Alan Duncan with his 5 C’s (Communicate, Co-operate, Collaborate, Cajole and Coerce). I like Alan’s list much better. Even though I work for a product company specializing in information management technology, the secret to successful enterprise information management (EIM) is in tackling the business and organizational issues, not the technology challenges. Fundamentally, data management at the enterprise level is an agreement problem, not a technology problem.

So, here I go with my 5 C’s: (more…)


The Information Difference Pegs Informatica as a Top MDM Vendor

This blog post feels a little bit like bragging… and OK, I guess it is pretty self-congratulatory to announce that this year, Informatica was again chosen as a leader in MDM and PIM by The Information Difference. As you may know, The Information Difference is an independent research firm that specializes in the MDM industry and each year surveys, analyzes and ranks MDM and PIM providers and customers around the world. This year, like last year, The Information Difference named Informatica tops in the space.

Why do I feel especially chuffed about this?  Because of our customers.

(more…)


One Search Procurement – For the Purchasing of Indirect Goods and Services


Informatica Procurement is the internal “Amazon” for purchasing MRO items, C-goods, indirect materials and services. It gives enterprise companies an industry-independent catalog procurement solution that enables fast, cost-efficient procurement of products and services, along with supplier integration, in an easy-to-use self-service concept.

Informatica Procurement at a glance


Informatica recently announced the availability of Informatica Procurement 7.3, the catalog procurement solution. I met with Melanie Kunz, our product manager, to learn from her what’s new.

Melanie, for our readers and followers: who is using Informatica Procurement, and for which purposes?

Melanie Kunz

Melanie Kunz: Informatica Procurement is industry-independent. Our customers come from different industries – from engineering and automotive to the public sector (e.g., cities). The responsibilities of the people who work with Informatica Procurement differ from company to company. For some customers, only employees from the purchasing department order items in Informatica Procurement. For others, all employees are allowed to order what they need themselves: for example, employees who need screws to complete their product, or office staff who order business cards for their manager.

What is the most important thing to know about Informatica Procurement 7.3?

Melanie Kunz: In companies that order a lot of IT equipment, it is important to always see current prices. Previously, every price change meant importing the catalog into Informatica Procurement again. With a punch-out to the IT equipment manufacturer’s online shop, this is much easier and more efficient: the catalog data is all available in Informatica Procurement, but the current price can be retrieved from the online shop on a daily basis.

Users no longer need to leave Informatica Procurement to order items from external online shops. Informatica Procurement now enables the user to locate internal and indexed external items in just one search. That means you do not have to use different eShops when you order new office stationery, IT equipment or services.

Great, what is the value for enterprise users and purchasing departments?

Melanie Kunz: All items in Informatica Procurement carry the negotiated prices. Informatica Procurement is so simple and intuitive that every employee can use the system without training. The view concept restricts which products are visible: for each employee (or each department), the administrator can define a view that contains only the products that may be seen and ordered.

When you open the detail view of an indexed external item, the current price is retrieved from the external online shop. This price is then cached in the item detail view for a defined period, so the user always gets an up-to-date price for the item.
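
Conceptually, this punch-out pricing is a cached lookup with a time-to-live. The sketch below is purely illustrative, assumes a one-day cache period, and is not the product’s actual implementation or API:

    # Illustrative sketch only, not the product's implementation: retrieve an
    # external price via punch-out and cache it for a defined period.
    import time

    CACHE_TTL_SECONDS = 24 * 60 * 60                   # assume a one-day period
    _price_cache: dict[str, tuple[float, float]] = {}  # item_id -> (price, fetched_at)

    def fetch_external_price(item_id: str) -> float:
        """Placeholder for the punch-out call to the supplier's online shop."""
        return 42.00                                   # hypothetical price

    def current_price(item_id: str) -> float:
        cached = _price_cache.get(item_id)
        if cached and time.time() - cached[1] < CACHE_TTL_SECONDS:
            return cached[0]                           # cached price is still fresh
        price = fetch_external_price(item_id)          # otherwise ask the shop again
        _price_cache[item_id] = (price, time.time())
        return price

    print(current_price("ITM-1001"))                   # first call hits the shop
    print(current_price("ITM-1001"))                   # second call served from cache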

The newly designed detail view has an elegant, clear layout, which ensures a high-quality user experience. The same goes for the option to enlarge images in the search result list.

What if I order the same products frequently, like my business cards?

Melanie Kunz: The overview of recent shopping carts helps users reorder the same items quickly and easily. A shopping cart from a previous order can be used as the basis for a new order.

Large organizations with thousands of employees may have very different needs for their daily business, possibly depending on role or career level. How do you address this?

Melanie Kunz: The standard assortment feature has been enhanced in Informatica Procurement 7.3. Administrators can define the assortment per user. Furthermore, it is possible to specify whether users must search the standard assortment first and only search the entire assortment if they do not find the relevant item there.
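
Conceptually, the “standard assortment first” behavior is a two-stage search with a fallback. The sketch below is a simplified illustration using invented data structures, not Informatica Procurement’s actual search API:

    # Hypothetical sketch of the "standard assortment first" fallback search;
    # the data structures are invented, not Informatica Procurement's API.
    def search(query, standard_assortment, full_assortment, standard_first=True):
        def match(items):
            return [item for item in items if query.lower() in item["name"].lower()]

        if standard_first:
            hits = match(standard_assortment)
            if hits:                        # found in the user's standard assortment
                return hits
        return match(full_assortment)       # otherwise fall back to the full catalog

    standard = [{"name": "Ballpoint pen, blue"}]
    full = standard + [{"name": "Fountain pen, black"}]
    print(search("fountain", standard, full))   # falls back to the entire assortment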

All of these features, and many smaller ones, not only enhance the user experience but also drastically reduce the processing time of an order.

Informatica Procurement 7.3 “One Search” at a glance

One Search Procurement

 

Learn more about Informatica Procurement 7.3 with the latest webinar.


How Much is Poorly Managed Supplier Information Costing Your Business?

“Inaccurate, inconsistent and disconnected supplier information prohibits us from doing accurate supplier spend analysis, leveraging discounts, comparing and choosing the best prices, and enforcing corporate standards.”

This is a quotation from a manufacturing company executive. It illustrates the negative impact that poorly managed supplier information can have on a company’s ability to cut costs and achieve revenue targets.

Many supply chain and procurement teams at large companies struggle to see the total relationship they have with suppliers across product lines, business units and regions. Why? Supplier information is scattered across dozens or hundreds of Enterprise Resource Planning (ERP) and Accounts Payable (AP) applications. Too much valuable time is spent manually reconciling inaccurate, inconsistent and disconnected supplier information in an effort to see the big picture. All this manual effort results in back office administrative costs that are higher than they should be.

Do these quotations from supply chain leaders and their teams sound familiar?

  • “We have 500,000 suppliers. 15-20% of our supplier records are duplicates. 5% are inaccurate.”
  • “I get 100 e-mails a day questioning which supplier to use.”
  • “To consolidate vendor reporting for a single supplier between divisions is really just a guess.”
  • “Every year 1099 tax mailings get returned to us because of invalid addresses, and we pay a lot of Schedule B fines to the IRS.”
  • “Two years ago we spent a significant amount of time and money cleansing supplier data. Now we are back where we started.”

Join us for a Webinar to find out how to supercharge your supply chain applications with clean, consistent and connected supplier information

Please join me and Naveen Sharma, Director of the Master Data Management (MDM) Practice at Cognizant, for a Webinar, Supercharge Your Supply Chain Applications with Better Supplier Information, on Tuesday, July 29th at 11 am PT.

During the Webinar, we’ll explain how better managing supplier information can help you achieve the following goals:

  1. Accelerate supplier onboarding
  2. Mitigate the risk of supply disruption
  3. Better manage supplier performance
  4. Streamline billing and payment processes
  5. Improve supplier relationship management and collaboration
  6. Make it easier to evaluate non-compliance with Service Level Agreements (SLAs)
  7. Decrease costs by negotiating favorable payment terms and SLAs

I hope you can join us for this upcoming Webinar!

 

 


Master Data and Data Security …It’s Not Complicated


The statement below on master data and data security was well intended. I can certainly understand the angst around data security; especially after Target’s data breach, it is top of mind for IT and, increasingly, business executives. But the root of the statement was flawed, and it got me thinking about master data and data security.

“If I use master data technology to create a 360-degree view of my client and I have a data breach, then someone could steal all the information about my client.”

Um, wait, what? Insurance companies take personally identifiable information very seriously. The statement confuses the relationship between client master data and securing your client data. Let’s dissect the statement and see what master data and data security really mean for insurers. We’ll start by level-setting a few concepts.

What is your Master Client Record?

Your master client record is your 360-degree view of your client.  It represents everything about your client.  It uses Master Data Management technology to virtually integrate and syndicate all of that data into a single view.  It leverages identifiers to ensure integrity in the view of the client record.  And finally it makes an effort through identifiers to correlate client records for a network effect.

There are benefits to understanding everything about your client. The shape and view of each client is specific to your business. As an insurer looks at its policyholders, the view of “client” is based on the relationships and context that the client has with the insurer: policies, claims, family relationships, history of activities and relationships with agency channels.

And what about security?

Naturally there is private data in a client record, but the consolidated client record contains no more or less personally identifiable information than the source systems do. In fact, most of the data that a malicious party would be searching for can likely be found in just a handful of database locations, and breaches often happen “on the wire.” Policy numbers, credit card info, social security numbers and birth dates can be found in fewer than five database tables, and they can be found without a whole lot of intelligence or analysis.

That data should be secured. That means the data should be encrypted or masked so that it remains protected even if a breach occurs. Informatica’s data masking technology allows this data to be secured wherever it lives, and it provides access control so that only the right people and applications can see the data in an unsecured format. You could even go so far as to secure ALL of your client record data fields; that’s a business and application choice. Do not confuse field- or database-level security with a decision to NOT assemble your golden policyholder record.
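
To make the idea concrete, here is a minimal illustration of field-level masking in the spirit described above; it is not Informatica’s Data Masking product, and the field names and masking rule are assumptions:

    # Minimal illustration of field-level masking, not Informatica's Data Masking
    # product: the sensitive fields stay protected even if a record is exposed.
    SENSITIVE_FIELDS = {"ssn", "credit_card", "birth_date", "policy_number"}

    def mask(value: str) -> str:
        """Keep the last four characters, replace the rest with asterisks."""
        return "*" * max(len(value) - 4, 0) + value[-4:]

    def protect(record: dict) -> dict:
        return {k: (mask(v) if k in SENSITIVE_FIELDS else v) for k, v in record.items()}

    policyholder = {"name": "Jane Doe", "ssn": "123-45-6789", "policy_number": "PH-0042-XY"}
    print(protect(policyholder))
    # -> {'name': 'Jane Doe', 'ssn': '*******6789', 'policy_number': '******2-XY'}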

What to worry about?  And what not to worry about?

Do not succumb to fear of mastering your policyholder data. Master Data Management technology can provide a 360-degree view, but it is only meaningful within your enterprise and applications. The view of “client” is very contextual and coupled with your business practices, products and workflows. Even if someone breaches your defenses and grabs data, they are looking for the simple PII and financial data; they grab it and get out. If attackers could see your 360-degree view of a client, they wouldn’t understand it. So don’t overcomplicate the security of your golden policyholder record. As long as you have secured the necessary data elements, you’re good to go. The business opportunity cost of NOT mastering your policyholder data far outweighs any imagined risk of a PII breach.

So what does your Master Policyholder Data allow you to do?

Imagine knowing more about your policyholders. Let that soak in for a bit. It feels good to think that you can make it happen. And you can do it. For an insurer, Master Data Management provides powerful opportunities across everything from sales, marketing and product development to claims and agency engagement. Each channel and activity has discrete ROI, with a direct impact on revenue, policyholder satisfaction and market share. Let’s look at just a few very real examples that insurers are attempting to tackle today.

  1. For a policyholder of a certain demographic with an auto and home policy, what is the next product my agent should discuss?
  2. How many people live in a certain policyholder’s household?  Are there any upcoming teenage drivers?
  3. Does this personal lines policyholder own a small business?  Are they a candidate for a business packaged policy?
  4. What is your policyholder claims history?  What about prior carriers and network of suppliers?
  5. How many touch points have your agents had with your policyholders? Were they meaningful?
  6. How can you connect with your policyholders in social media settings and make an impact?
  7. What is your policyholders’ mobile usage, and what are they doing online that might interest your Marketing team?

These are just some of the examples of very streamlined connections that you can make with your policyholders once you have your 360-degree view. Imagine the heavy lifting required to do these things without a Master Policyholder record.

Fear is the enemy of innovation. In mastering policyholder data it is important to have two distinct work streams. First, secure the necessary data elements using data masking technology. Once that is done, gain understanding through the mastering of your policyholder record. Only then will you truly be able to take your clients’ experience to the next level. When that happens, watch your revenue grow in leaps and bounds.


How Much is Disconnected Well Data Costing Your Business?

“Not only do we underestimate the cost for projects up to 150%, but we overestimate the revenue it will generate.” This quotation from an Energy & Petroleum (E&P) company executive illustrates the negative impact of inaccurate, inconsistent and disconnected well data and asset data on revenue potential. 

“Operational Excellence” is a common goal of many E&P company executives pursuing higher growth targets. But inaccurate, inconsistent and disconnected well data and asset data may be holding them back. It obscures the complete picture of the well information lifecycle, making it difficult to maximize production efficiency, reduce Non-Productive Time (NPT), streamline the oilfield supply chain, calculate well-by-well profitability, and mitigate risk.

Well data expert, Stephanie Wilkin shares details about the award-winning collaboration between Noah Consulting and Devon Energy.


To explain how E&P companies can better manage well data and asset data, we hosted a webinar, “Attention E&P Executives: Streamlining the Well Information Lifecycle.” Our well data experts Stephanie Wilkin, Senior Principal Consultant at Noah Consulting, and Stephan Zoder, Director of Value Engineering at Informatica shared some advice. E&P companies should reevaluate “throwing more bodies at a data cleanup project twice a year.” This approach does not support the pursuit of operational excellence.

In this interview, Stephanie shares details about the award-winning collaboration between Noah Consulting and Devon Energy to create a single trusted source of well data, which is standardized and mastered.

Q. Congratulations on winning the 2014 Innovation Award, Stephanie!
A. Thanks Jakki. It was really exciting working with Devon Energy. Together we put the technology and processes in place to manage and master well data in a central location and share it with downstream systems on an ongoing basis. We were proud to win the 2014 Innovation Award for Best Enterprise Data Platform.

Q. What was the business need for mastering well data?
A. As E&P companies grow so do their needs for business-critical well data. All departments need clean, consistent and connected well data to fuel their applications. We implemented a master data management (MDM) solution for well data with the goals of improving information management, business productivity, organizational efficiency, and reporting.

Q. How long did it take to implement the MDM solution for well data?
A. The Devon Energy project kicked off in May of 2012. Within five months we built the complete solution from gathering business requirements to development and testing.

Q. What were the steps in implementing the MDM solution?
A: The first and most important step was securing buy-in on a common definition for master well data or Unique Well Identifier (UWI). The key was to create a definition that would meet the needs of various business functions. Then we built the well master, which would be consistent across various systems, such as G&G, Drilling, Production, Finance, etc. We used the Professional Petroleum Data Management Association (PPDM) data model and created more than 70 unique attributes for the well, including Lahee Class, Fluid Direction, Trajectory, Role and Business Interest.

As part of the original go-live, we had three source systems of well data and two target systems connected to the MDM solution. Over the course of the next year, we added three additional source systems and four additional target systems. We did a cross-system analysis to make sure every department has the right wells and the right data about those wells. Now the company uses MDM as the single trusted source of well data, which is standardized and mastered, to do analysis and build reports.
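
As a rough, hypothetical picture of what a mastered well record with cross-references back to its source systems might look like (the attribute values and system record IDs below are invented, not the actual PPDM-based model):

    # Hypothetical sketch of a mastered well record keyed by a Unique Well
    # Identifier (UWI), with cross-references back to each source system.
    # Attribute values and system record IDs are illustrative, not the actual model.
    well_master = {
        "uwi": "42-501-20098",
        "attributes": {
            "lahee_class": "Development",
            "fluid_direction": "Producer",
            "trajectory": "Horizontal",
        },
        "xref": {                       # where each contributing record came from
            "G&G": "GG-000123",
            "Drilling": "DRL-77812",
            "Finance": "FIN-55091",
        },
    }

    def source_record_id(master: dict, system: str):
        """Trace a mastered well back to its record in a given source system."""
        return master["xref"].get(system)

    print(source_record_id(well_master, "Drilling"))    # -> DRL-77812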

Q. What’s been the traditional approach for managing well data?
A. Typically, when a new well is created, employees spend time entering well data into their own systems. For example, one person enters well data into the G&G application. Another person enters the same well data into the Drilling application. A third person enters the same well data into the Finance application. According to statistics, it takes about 30 minutes to enter a single well into a particular financial application.

So imagine if you need to add 500 new wells to your systems. This is common after a merger or acquisition. That translates to roughly 250 hours or 6.25 weeks of employee time saved on the well create process! By automating across systems, you not only save time, you eliminate redundant data entry and possible errors in the process.
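
For reference, the arithmetic behind those figures, assuming a 40-hour work week and counting the manual entry avoided in each downstream application:

    # Back-of-the-envelope check of the figures quoted above.
    wells = 500                       # e.g. wells added after a merger or acquisition
    minutes_per_well = 30             # manual entry into one application
    hours_saved = wells * minutes_per_well / 60
    weeks_saved = hours_saved / 40    # assuming a 40-hour work week
    print(hours_saved, weeks_saved)   # 250.0 hours, 6.25 weeks per application avoided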

Q. That sounds like a painfully slow and error-prone process.
A. It is! But that’s only half the problem. Without a single trusted source of well data, how do you get a complete picture of your wells? When you compare the well data in the G&G system to the well data in the Drilling or Finance systems, it’s typically inconsistent and difficult to reconcile. This leads to the question, “Which one of these systems has the best version of the truth?” Employees spend too much time manually reconciling well data for reporting and decision-making.

Q. So there is a lot to be gained by better managing well data.
A. That’s right. The CFO typically loves the ROI on a master well data project. It’s a huge opportunity to save time and money, boost productivity and get more accurate reporting.

Q: What were some of the business requirements for the MDM solution?
A: We couldn’t build a solution that was narrowly focused on meeting the company’s needs today. We had to keep the future in mind. Our goal was to build a framework that was scalable and supportable as the company’s business environment changed. This allows the company to add additional data domains or attributes to the well data model at any time.

Noah Consulting's MDM Trust Framework for well data

The Noah Consulting MDM Trust Framework was used to build a single trusted source of well data

Q: Why did you choose Informatica MDM?
A: The decision to use Informatica MDM for the MDM Trust Framework came down to the following capabilities:

  • Match and Merge: With Informatica, we get a lot of flexibility. Some systems carry the API or well government ID, but some don’t. We can match and merge records differently based on the system.
  • X-References: We keep a cross-reference between all the systems. We can go back to the master well data and find out where that data came from and when. We can see where changes have occurred because Informatica MDM tracks the history and lineage.
  • Scalability: This was a key requirement. While we went live after only 5 months, we’ve been continually building out the well master based on the requirements of the target systems.
  • Flexibility: Down the road, if we want to add an additional facet or classification to the well master, the framework allows for that.
  • Simple Integration: Instead of building point-to-point integrations, we use the hub model.

In addition to Informatica MDM, our Noah Consulting MDM Trust Framework includes Informatica PowerCenter for data integration, Informatica Data Quality for data cleansing and Informatica Data Virtualization.
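
To make the match-and-merge and cross-reference capabilities above a little more concrete, here is a deliberately simplified sketch; the match rules, threshold and identifiers are invented for illustration and are not the MDM Trust Framework’s actual configuration:

    # Deliberately simplified sketch of system-specific matching plus a
    # cross-reference; rules, thresholds and identifiers are invented.
    from difflib import SequenceMatcher

    def matches(system: str, incoming: dict, candidate: dict) -> bool:
        """Systems that carry a government/API number match on it exactly;
        the others fall back to fuzzy well-name matching within the same field."""
        if system in ("Finance", "Production"):
            return incoming.get("api_number") == candidate.get("api_number")
        similarity = SequenceMatcher(None, incoming["well_name"], candidate["well_name"]).ratio()
        return similarity > 0.9 and incoming["field"] == candidate["field"]

    def register(xref: dict, uwi: str, system: str, source_id: str) -> None:
        """Record which source-system record fed which mastered well (UWI)."""
        xref.setdefault(uwi, {})[system] = source_id

    print(matches("G&G", {"well_name": "Smith 1H", "field": "Permian"},
                         {"well_name": "Smith 1-H", "field": "Permian"}))   # True

    xref: dict[str, dict[str, str]] = {}
    register(xref, "42-501-20098", "Drilling", "DRL-77812")
    register(xref, "42-501-20098", "Finance", "FIN-55091")
    print(xref)   # {'42-501-20098': {'Drilling': 'DRL-77812', 'Finance': 'FIN-55091'}}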

Q: Can you give some examples of the business value gained by mastering well data?
A: One person said to me, “I’m so overwhelmed! We’ve never had one place to look at this well data before.” With MDM centrally managing master well data and fueling key business applications, many upstream processes can be optimized to achieve their full potential value.

People spend less time entering well data on the front end and reconciling well data on the back end. Well data is entered once and it’s automatically shared across all systems that need it. People can trust that it’s consistent across systems. Also, because the data across systems is now tied together, it provides business value they were unable to realize before, such as predictive analytics. 

Q. What’s next?
A. There’s a lot of insight that can be gained by understanding the relationships between the well, and the people, equipment and facilities associated with it. Next, we’re planning to add the operational hierarchy. For example, we’ll be able to identify which production engineer, reservoir engineer and foreman are working on a particular well.

We’ve also started gathering business requirements for equipment and facilities to be tied to each well. There’s a lot more business value on the horizon as the company streamlines their well information lifecycle and the valuable relationships around the well.

If you missed the webinar, you can watch the replay now: Attention E&P Executives: Streamlining the Well Information Lifecycle.


To Engage Business, Focus on Information Management rather than Data Management


IT professionals have been pushing an Enterprise Data Management agenda for decades, rather than Information Management, and are frustrated with the lack of business engagement. So what exactly is the difference between Data Management and Information Management, and why does it matter? (more…)


Don’t Rely on CRM as Your Single Source of Trusted Customer Data

Step 1: Determine if you have a customer data problem

A statement I often hear from marketing and sales leaders unfamiliar with the concept of mastering customer data is, “My CRM application is our single source of trusted customer data.” They use CRM to onboard new customers, collecting addresses, phone numbers and email addresses. They append a DUNS number. So it’s no surprise they may expect they can master their customer data in CRM. (To learn more about the basics of managing trusted customer data, read this: How much does bad data cost your business?)

It may seem logical to expect your CRM investment to be your customer master – especially since so many CRM vendors promise a “360 degree view of your customer.” But you should only consider your CRM system as the source of truth for trusted customer data if:

For most large enterprises, CRM never delivered on that promise of a trusted 360-degree customer view.

  • You have only a single instance of Salesforce.com, Siebel CRM, or other CRM
  • You have only one sales organization (vs. distributed across regions and LOBs)
  • Your CRM manages all customer-focused processes and interactions (marketing, service, support, order management, self-service, etc.)
  • The master customer data in your CRM is clean, complete, fresh, and free of duplicates


Unfortunately, most mid-to-large companies cannot claim such simple operations. For most large enterprises, CRM never delivered on that promise of a trusted 360-degree customer view. That’s what prompted Gartner analysts Bill O’Kane and Kimberly Collins to write the report MDM is Critical to CRM Optimization in February 2014.

“The reality is that the vast majority of the Fortune 2000 companies we talk to are complex,” says Christopher Dwight, who leads a team of master data management (MDM) and product information management (PIM) sales specialists for Informatica. Christopher and team spend each day working with retailers, distributors and CPG companies to help them get more value from their customer, product and supplier data. “Business-critical customer data doesn’t live in one place. There’s no clear and simple source. Functional organizations, processes, and systems landscapes are much more complicated. Typically they have multiple selling organizations across business units or regions.”

As an example, listed below are typical functional organizations, and common customer master data-dependent applications they rely upon, to support the lead-to-cash process within a typical enterprise:

  • Marketing: marketing automation, campaign management and customer analytics systems.
  • Ecommerce: e-commerce storefront and commerce applications.
  • Sales: sales force automation and quote management systems.
  • Fulfillment: ERP, shipping and logistics systems.
  • Finance: order management and billing systems.
  • Customer Service: CRM, IVR and case management systems.

The fragmentation of critical customer data across multiple organizations and applications is further exacerbated by the explosive adoption of Cloud applications such as Salesforce.com and Marketo. Merger and acquisition (M&A) activity is common among many larger organizations where additional legacy customer applications must be onboarded and reconciled. Suddenly your customer data challenge grows exponentially.  

Step 2: Measure how customer data fragmentation impacts your business

Ask yourself: if your customer data is inaccurate, inconsistent and disconnected, can you:

Customer data is fragmented across multiple applications used by business units, product lines, functions and regions.

  • See the full picture of a customer’s relationship with the business across business units, product lines, channels and regions?
  • Better understand and segment customers for personalized offers, improving lead conversion rates and boosting cross-sell and up-sell success?
  • Deliver an exceptional, differentiated customer experience?
  • Leverage rich sources of third-party data, as well as big data such as social, mobile and sensor data, to enrich customer insights?

“One company I recently spoke with was having a hard time creating a single consolidated invoice for each customer that included all the services purchased across business units,” says Dwight. “When they investigated, they were shocked to find that 80% of their consolidated invoices contained errors! The root cause was inaccurate, inconsistent and disconnected customer data. This was a serious business problem costing the company a lot of money.”

Let’s do a quick test right now. Are any of these companies your customers: GE, Coke, Exxon, AT&T or HP? Do you know the legal company names for any of these organizations? Most people don’t. I’m willing to bet there are at least a handful of variations of these company names, such as Coke, Coca-Cola and The Coca Cola Company, in your CRM application. Chances are there are dozens of variations across the numerous applications where business-critical customer data lives, and these customer profiles are tied to transactions. That’s hard to clean up. You can’t just merge records, because you need to maintain the transaction history and audit history. So you can’t simply clean up the customer data in this system and merge the duplicates.
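
As a toy illustration of why such name variations resist manual cleanup, simple normalization catches some variants and misses others; this is not a production matching engine, and the names and suffix list are illustrative:

    # Toy illustration: naive normalization catches some company-name variants
    # and misses others, which is why manual cleanup inside one application
    # falls short. Names and suffix list are illustrative only.
    import re

    LEGAL_SUFFIXES = {"inc", "corp", "company", "co", "ltd", "the"}

    def normalize(name: str) -> str:
        tokens = re.sub(r"[^a-z0-9 ]", " ", name.lower()).split()
        return " ".join(t for t in tokens if t not in LEGAL_SUFFIXES)

    for name in ["Coke", "Coca-Cola", "The Coca Cola Company", "Coca-Cola Co."]:
        print(f"{name!r:26} -> {normalize(name)!r}")
    # The last three collapse to 'coca cola', but 'Coke' still does not match,
    # and none of this addresses the transaction history tied to each profile.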

The same holds true for B2C customers. In fact, I’m a nightmare for a large marketing organization. I get multiple offers and statements addressed to different versions of my name: Jakki Geiger, Jacqueline Geiger, Jackie Geiger and J. Geiger. But my personal favorite is when I get an offer from a company I do business with addressed to “Resident”. Why don’t they know I live here? They certainly know where to find me when they bill me!

Step 3: Transform how you view, manage and share customer data

Why do so many businesses that try to master customer data in CRM fail? Let’s be frank. CRM systems such as Salesforce.com and Siebel CRM were purpose built to support a specific set of business processes, and for the most part they do a great job. But they were never built with a focus on mastering customer data for the business beyond the scope of their own processes.

But perhaps you disagree with everything discussed so far. Or you’re a risk-taker and want to take on the challenge of bringing all master customer data that exists across the business into your CRM app. Be warned, you’ll likely encounter four major problems:

1) Your master customer data in each system has a different data model with different standards and requirements for capture and maintenance. Good luck reconciling them!

2) To be successful, your customer data must be clean and consistent across all your systems, which is rarely the case.

3) Even if you use DUNS numbers, some systems use the global DUNS number; others use a regional DUNS number. Some manage customer data at the legal entity level, others at the site level. How do you connect those?

4) If there are duplicate customer profiles in CRM tied to transactions, you can’t just merge the profiles because you need to maintain the transactional integrity and audit history. In this case, you’re dead on arrival.

There is a better way! Customer-centric, data-driven companies recognize these obstacles and they don’t rely on CRM as the single source of trusted customer data. Instead, they are transforming how they view, manage and share master customer data across the critical applications their businesses rely upon. They embrace master data management (MDM) best practices and technologies to reconcile, merge, share and govern business-critical customer data. 

More and more B2B and B2C companies are investing in MDM capabilities to manage customer households and multiple views of customer account hierarchies (e.g. a legal view can be shared with finance, a sales territory view can be shared with sales, or an industry view can be shared with a business unit).
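
Here is a hypothetical sketch of what “multiple views over the same customer master” means in practice; the account records and rollup attributes are invented for illustration:

    # Hypothetical sketch of multiple hierarchy views over the same customer
    # master records; the accounts and rollup attributes are invented.
    accounts = {
        "ACME-US-WEST": {"legal_parent": "ACME Inc.", "territory": "US West", "industry": "Manufacturing"},
        "ACME-US-EAST": {"legal_parent": "ACME Inc.", "territory": "US East", "industry": "Manufacturing"},
        "ACME-EMEA": {"legal_parent": "ACME Holdings BV", "territory": "EMEA", "industry": "Manufacturing"},
    }

    def rollup(by: str) -> dict:
        """Group the same account records into a view for a given audience."""
        view = {}
        for account_id, attrs in accounts.items():
            view.setdefault(attrs[by], []).append(account_id)
        return view

    print(rollup("legal_parent"))   # legal view, e.g. shared with finance
    print(rollup("territory"))      # sales-territory view, e.g. shared with sales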

 

Gartner Report, MDM is Critical to CRM Optimization, Bill O’Kane & Kimberly Collins, February 7 2014.

In the report MDM is Critical to CRM Optimization (February 2014), Gartner analysts Bill O’Kane and Kimberly Collins write, “Through 2017, CRM leaders who avoid MDM will derive erroneous results that annoy customers, resulting in a 25% reduction in potential revenue gains.”

Are you ready to reassess your assumptions about mastering customer data in CRM?

Get the Gartner report now: MDM is Critical to CRM Optimization.


World Cup of Data: The Early Bird Closes the Sale

Did you know the 2014 Brasil World Cup is actually the World Cup of Data? In addition to the visible matches played on the pitch, eShops will be in a simultaneous struggle to win real-time online merchandise customers.

Let me explain. Jogi Löw, the manager of the German team, is known for his stylish attire. At every major event, each European Cup and World Cup, he wears newly designed shirts and suits. As a result, when television audiences see each new article of clothing, there is a corresponding increase in related online retail activity. When Löw began this tradition, people didn’t know that his outfits were made by Strenesse. As a result, people searched using the keywords “Jogi Löw Shirt.” This drove traffic to the eShop with the best search engine optimization, giving them more conversions and more revenue.

If a manager’s attire drives online retail sales, imagine how much demand there is for the jerseys worn by the most visible World Cup athletes. Many of these players have huge social media followings. Consider the size of the followings of Ronaldo, Kaká, Neymar, Ronaldinho and Wayne Rooney:

Top five footballers by social media followers:

(Source: http://fanpagelist.com/category/athletes/soccer/view/list/sort/followers/page1)

There is huge demand for these players’ jerseys, and this demand will only increase as the games progress. Once the winner is decided, Google searches will rise for phrases like “World Cup Winner Jersey 2014 of xxx”. Some refer to this as the super long tail. And research does show that search queries with 3 or more words have better conversion rates than queries with only 1 or 2 words.


(Source: http://www.conductor.com/resource-center/research/long-tail-search )

Who can predict the winners?

What happens if a fairly unknown player scores the last goal in overtime? How will that event impact social media activity and search volumes? Who will be able to leverage this activity to sell the relevant merchandise fast enough? The eShop with the best data will have the quickest response. And the eShop with the quickest response will get the traffic and the revenue.

The World Cup is a battle. The early bird closes the sale. It’s time to play the World Cup of Data.
