Category Archives: Master Data Management

Brandspiration – The Story of the Gap Stick and an Award Winner

This is the story of a great speaker, a simple but funny product, and the idea behind a Ventana Award-winning company that practices “Brandspiration”.

When I invited Dale Denham, CIO of Geiger, to speak at Informatica World this year, I was not sure what I would get. I only knew that Dale is known as an entertaining speaker. What could we expect from a person who calls himself “the selling CIO”?

And Dale delivered. He opened his session “How product information in ecommerce improved Geiger’s ability to promote and sell promotional products” with a video.

What I liked about it: it is a simple product, addressing an everyday problem everybody knows. And this is the business of Geiger & Crestline, two brands in one company that sell promotional products to help companies inspire with their brand. They call it “Brandspiration”.

What does this have to do with PIM?

Well, the business need for Geiger was to sell hundreds of thousands of products more efficiently, which includes updating products faster and more accurately and adding more products. Geiger was also planning to:

  • Eliminate reliance on ERP
  • Launch new web properties
  • Improve SEO
  • Centralize product management & control
  • Standardize business processes & workflows
  • Produce the print catalog faster

Before working with Informatica PIM, it took a week to launch a new product. Geiger/Crestline also has complex price management for bundles, brands, packages and more under its two brands, which serve two different target groups: low-price products with aggressive pricing, and higher-quality promotional products.

Business Outcomes

With PIM, product entry time could be reduced by about an hour. Geiger achieved a 25% time saving for catalog creation and implemented PIM in about six months (with the integrator “Ideosity”, by the way). Another fact that made me proud of our offering: Dale told me his company was able to upgrade to the latest PIM version within hours.

“PIM has allowed us to be more proactive instead of being handcuffed to a system that made us reactive. A great investment for this company. I can’t believe we survived for as long as we did without this software.” 
Dale Denham, CIO

Watch the video of Dale and how his company Geiger realizes Brandspiration with Informatica PIM. Did you know that Geiger is a proud winner of the Ventana Research Innovation Award for its PIM initiative?

Posted in Master Data Management, PiM, Product Information Management, Retail

We Are Sports – SportScheck Omnichannel Retail

Are you a manager dedicated to fashion, B2C or retail? This blog post provides an overview of what companies can learn about omnichannel from SportScheck.

SportScheck is one of Germany’s most successful multichannel businesses. SportScheck (by the way, a Ventana Research Innovation Award winner) is an equipment and clothing specialist for almost every sport, and its website gets over 52 million hits per year, making it one of the most successful online stores in Germany.

Each year, more than a million customers sign up for the mail-order business while over 17 million customers visit its brick-and-mortar stores (Source). These figures undoubtedly describe the success of SportScheck’s multichannel strategy. SportScheck also strives to deliver innovative concepts in all of its sales channels, while always aiming to provide customers with the best shopping experience possible. This philosophy can be carried out only in conjunction with modern systems landscapes and optimized processes.

Complete, reliable, and attractive information – across every channel – is the key to a great customer experience and better sales. It’s hard to keep up with customer demands in a single channel, much less multiple channels. Download The Informed Purchase Journey. The informed purchase journey requires the right product, for the right customer, at the right place. Enjoy the video!

What is the Business Initiative at SportScheck?

  • Provide customers with the same deals across all sales channels through a centralized location for all product information
  • Improve customer service in all sales channels with perfect product data
  • Make sure customers have enough product information to make a purchase without the order being returned

Intelligent and Agile Processes are Key to Success

“Good customer service, whether online, in-store, or in print, needs perfect product data,” said Alexander Pischetsrieder in an interview. At the Munich-based sporting goods retailer, there had been no centralized system for product data until now. After extensive research and evaluation, the company decided to implement the product information management (PIM) system from Informatica.

The main reason for the introduction of the Informatica Product Information Management (PIM) solution was its support for a true multichannel strategy. Customers should have access to the same deals across all sales channels. In addition to making a breadth of information available, customer service remains key.

In times where information is THE killer app, the key challenges are keeping information up to date and ensuring efficient processes. In a retail scenario, product catalog onboarding starts with PIM to get the latest product information. An always up-to-date dataset in the relevant systems is a further prerequisite; it allows companies to react immediately to market movements and implement marketing requirements as quickly as possible. Data must be exchanged between the systems practically in real time. If you want to learn more about how SportScheck solved the technical integration between SAP ERP and Informatica PIM, ask me on Twitter @benrund.

Product Data Equals Demonstrated Expertise

“I am convinced that a well-presented product with lots of pictures and details sells better. For us, this signals knowing our product. That sets us apart from the large discount stores,” notes Alexander Pischetsrieder. “In the end, we have to ask: who is the customer going to trust? We gain trust here with our product knowledge and our love of sports in general. Just like our motto says, ‘We get our fans excited.’ By offering a professional search engine, product comparisons, and many other features, PIM adds value not only in ecommerce – and that gets us excited!”

Benefits for SportScheck

  • Centralized location for all product information across all sales channels
  • An agile system that is capable of interweaving the different retail processes across sales channels into a smooth, cross-channel function
  • Self-Service portal for agencies and suppliers with direct upload to the PIM system

For German readers I can highly recommend this video on the customer use case. If you are interested in more details, ask me on Twitter @benrund.

PS: This blog is based on the PIM case study on SportScheck.

Posted in Master Data Management, PiM, Product Information Management, Real-Time, Retail, Uncategorized

8 Information Management Challenges for UDI Compliance

“My team spends far too much time pulling together medical device data that’s scattered across different systems and reconciling it in spreadsheets to create compliance reports.” This quotation from a regulatory affairs leader at a medical device manufacturer highlights the impact of poorly managed medical device data on compliance reporting, such as the reports needed for the FDA’s Unique Device Identification (UDI) regulation. In fact, an overreliance on manual, time-consuming processes brings an increased risk of human error in UDI compliance reports.


Is your compliance team manually reconciling data for UDI compliance reports?

If you are an information management leader working for a medical device manufacturer, and your compliance team needs quick and easy access to medical device data for UDI compliance reporting, I have five questions for you:

1) How many Class III and Class II devices do you have?
2) How many systems or reporting data stores contain data about these medical devices?
3) How much time do employees spend manually fixing data errors before the data can be used for reporting?
4) How do you plan to manage medical device data so the compliance team can quickly and easily produce accurate reports for UDI Compliance?
5) How do you plan to help the compliance team manage the multi-step submission process?

For some helpful advice from data management experts, watch this on-demand webinar “3 Enterprise Information Management (EIM) Best Practices for UDI Compliance.”

The deadline to submit the first UDI compliance report to the FDA for Class III devices is September 24, 2014. But the medical device data needed to produce the report is typically scattered among different internal systems – such as Enterprise Resource Planning (ERP) systems (e.g., SAP and JD Edwards), Product Lifecycle Management (PLM), and Manufacturing Execution Systems (MES) – as well as external third-party device identifiers.

The traditional approach to dealing with poorly managed data is for the compliance team to burn the midnight oil bringing together and then manually reconciling all the medical device data in a spreadsheet. And they have to do this each and every time a compliance report is due. The good news is that your compliance team doesn’t have to.

Many medical device manufacturers are leveraging their existing data governance programs, supported by a combination of data integration, data quality and master data management (MDM) technology, to eliminate the need for manual data reconciliation. They are centralizing their medical device data management so they have a single source of trusted medical device data for UDI compliance reporting as well as other compliance and revenue-generating initiatives.
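
To make the contrast with spreadsheet reconciliation concrete, here is a minimal, purely illustrative sketch of the kind of automated matching a centralized approach performs. It is not Informatica’s product – just a generic Python/pandas example – and the file and column names are invented assumptions.

```python
# Illustrative only: reconcile device records from two hypothetical source
# extracts (ERP and PLM) instead of doing it by hand in a spreadsheet.
# File and column names are invented for this sketch.
import pandas as pd

erp = pd.read_csv("erp_devices.csv")   # e.g. device_id, device_class, brand_name
plm = pd.read_csv("plm_devices.csv")   # e.g. device_id, device_class, model_number

def normalize_id(s: pd.Series) -> pd.Series:
    """Strip whitespace and unify case so identifiers match across systems."""
    return s.astype(str).str.strip().str.upper()

for df in (erp, plm):
    df["device_id"] = normalize_id(df["device_id"])

# Join the two views of the same device on the normalized identifier.
merged = erp.merge(plm, on="device_id", how="outer",
                   suffixes=("_erp", "_plm"), indicator=True)

# Devices present in only one system, plus attribute conflicts, go to a
# data steward for review rather than being patched ad hoc per report.
missing = merged[merged["_merge"] != "both"]
conflicts = merged[(merged["_merge"] == "both") &
                   (merged["device_class_erp"] != merged["device_class_plm"])]

missing.to_csv("devices_missing_in_one_system.csv", index=False)
conflicts.to_csv("device_class_conflicts_for_review.csv", index=False)
print(f"{len(missing)} unmatched records, {len(conflicts)} class conflicts flagged")
```

The point is not the tooling but the pattern: normalize identifiers once, match automatically, and route only the exceptions to people.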

Get UDI data management advice from data experts Kelle O’Neal, Managing Partner at First San Francisco Partners and Bryan Balding, MDM Specialist at Informatica

During this on-demand webinar, Kelle O’Neal, Managing Partner at First San Francisco Partners, covers the eight information management challenges for UDI compliance as well as best practices for medical device data management.

Bryan Balding, MDM Solution Specialist at Informatica, shows you how to apply these best practices with the Informatica UDI Compliance Solution.

You’ll learn how to automate the process of capturing, managing and sharing medical device data to make it quicker and easier to create the reports needed for UDI compliance on an ongoing basis.

 

 


Also, we just published a joint whitepaper with First San Francisco Partners, Information Management FAQ for UDI: 20 Questions & Answers about Complying with the FDA Requirement for Unique Device Identification (UDI). Get answers to questions such as:

  • What is needed to support an EIM strategy for UDI compliance?
  • What role does data governance play in UDI compliance?
  • What are the components of a successful data governance program?
  • Why should I centralize my business-critical medical device data?
  • What does the architecture of a UDI compliance solution look like?

I invite you to download the UDI compliance FAQ now and share your feedback in the comments section below.

Posted in Data Governance, Data Integration, Data Quality, Enterprise Data Management, Life Sciences, Manufacturing, Master Data Management, Vertical

CSI: “Enter Location Here”

Last time I talked about how benchmark data can be used in IT and business use cases to illustrate the financial value of data management technologies.  This time, let’s look at additional use cases, and at how to philosophically interpret the findings.

We have all philosophies covered

So here are some additional areas of investigation for justifying a data quality based data management initiative:

  • Compliance or audit data and report preparation and rebuttal (FTE cost, as above)
  • Excess insurance premiums on incorrect asset or party information
  • Excess tax payments due to incorrect asset configuration or location
  • Excess travel or idle time between jobs due to incorrect location information
  • Excess equipment downtime (not revenue generating) or MTTR due to incorrect asset profile or misaligned reference data not triggering timely repairs
  • Incorrect equipment location or ownership data splitting service costs or revenues incorrectly
  • Party relationship data not tied together, creating duplicate contacts, less relevant offers and lower response rates
  • Lower than industry average cross-sell conversion ratio due to inability to match and link departmental customer records and underlying transactions and expose them to all POS channels
  • Lower than industry average customer retention rate due to lack of full client transactional profile across channels or product lines to improve service experience or apply discounts
  • Low annual supplier discounts due to incorrect or missing alternate product data or aggregated channel purchase data

I could go on forever, but allow me to touch on a sensitive topic – fines. Fines, or performance penalties by private or government entities, only make sense to bake into your analysis if they happen repeatedly in fairly predictable intervals and are “relatively” small per incidence.  They should be treated like M&A activity. Nobody will buy into cost savings in the gazillions if a transaction only happens once every ten years. That’s like building a business case for a lottery win or a life insurance payout with a sample size of a family.  Sure, if it happens you just made the case but will it happen…soon?

Use benchmarks and ranges wisely but don’t over-think the exercise either.  It will become paralysis by analysis.  If you want to make it super-scientific, hire an expensive consulting firm for a 3 month $250,000 to $500,000 engagement and have every staffer spend a few days with them away from their day job to make you feel 10% better about the numbers.  Was that worth half a million dollars just in 3rd party cost?  You be the judge.

In the end, you are trying to find out and position whether a technology will fix a $50,000, $5 million or $50 million problem.  You are also trying to gauge where the key areas of improvement are in terms of value, and correlate the associated cost (higher value normally equals higher cost due to higher complexity) and risk.  After all, who wants to stand before a budget committee, prophesy massive savings in one area and then fail, when it would have been smarter to start with a simpler, quicker win to build upon?

The secret sauce to avoiding this consulting expense and risk is a natural curiosity, willingness to do the legwork of finding industry benchmark data, knowing what goes into them (process versus data improvement capabilities) to avoid inappropriate extrapolation and using sensitivity analysis to hedge your bets.  Moreover, trust an (internal?) expert to indicate wider implications and trade-offs.  Most importantly, you have to be a communicator willing to talk to many folks on the business side and have criminal interrogation qualities, not unlike in your run-of-the-mill crime show.  Some folks just don’t want to talk, often because they have ulterior motives (protecting their legacy investment or process) or hiding skeletons in the closet (recent bad performance).  In this case, find more amenable people to quiz or pry the information out of these tough nuts, if you can.
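
To illustrate the “ranges plus sensitivity analysis” point, here is a tiny back-of-the-envelope sketch in Python. Every figure in it is an invented assumption you would replace with your own benchmarks and staffing numbers.

```python
# Sensitivity analysis for a data quality business case: apply a low /
# expected / high improvement range to one driver (analyst time spent on
# data rework) instead of betting on a single point estimate.
fte_count = 40            # staff touching the affected data (assumption)
loaded_cost = 90_000      # fully loaded annual cost per FTE in $ (assumption)
time_on_rework = 0.20     # share of their time spent fixing data (assumption)
project_cost = 400_000    # estimated cost of the initiative (assumption)

for label, improvement in (("low", 0.20), ("expected", 0.40), ("high", 0.60)):
    annual_benefit = fte_count * loaded_cost * time_on_rework * improvement
    three_year_roi = (3 * annual_benefit - project_cost) / project_cost
    print(f"{label:>8}: ${annual_benefit:,.0f}/yr, 3-year ROI {three_year_roi:.0%}")
```

The spread between the low and high scenarios is exactly what you want to show a budget committee: it hedges your bet without pretending to precision you do not have.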

CSI: "Enter Location Here"

CSI: “Enter Location Here”

Lastly, if you find ROI numbers that appear astronomical at first, remember that leverage is a key factor.  If a technical capability touches one application (credit risk scoring engine), one process (quotation), one type of transaction (talent management self-service) or a limited set of people (procurement), the ROI will be lower than for a technology touching multiples of each of the aforementioned.  If your business model drives thousands of high-value (thousands of dollars) transactions versus ten twenty-million-dollar ones or twenty million one-dollar ones, your ROI will be higher.  After all, consider this: retail e-mail marketing campaigns average an ROI of 578% (softwareprojects.com) – and that is with really bad data.  Imagine what improved data can do on that front alone.

I found massive differences between what improved asset data can deliver in a petrochemical or utility company versus product data in a fashion retailer or customer (loyalty) data in a hospitality chain.  The assertion of cum hoc ergo propter hoc – that the correlation implies causation – is a key assumption in how technology delivers financial value.  As long as the business folks agree, or can fence in the relationship, you are on the right path.

What is the best or worst case you have had to make to justify someone giving you money to invest? Share that story.

Posted in Business Impact / Benefits, Business/IT Collaboration, Data Integration, Data Quality, Governance, Risk and Compliance, Master Data Management, Mergers and Acquisitions

Is the Internet of Things relevant for the government?

Get connected. Be connected. Make connections. Find connections. The Internet of Things (IoT) is all about connecting people, processes, data and, as the name suggests, things. The recent social media frenzy surrounding the ALS Ice Bucket Challenge has certainly reminded everyone of the power of social media, the Internet and a willingness to answer a challenge. Fueled by personal and professional connections, the craze has transformed fundraising for at least one charity. Similarly, IoT may prove transformational to the business of the public sector, should government step up to the challenge.

Government is struggling with the concept and reality of how IoT really relates to the business of government, and perhaps rightfully so. For commercial enterprises, IoT is far more tangible and simply more fun. Gaming, televisions, watches, Google Glass, smartphones and tablets are all about delivering over-the-top, new and exciting consumer experiences. Industry is delivering transformational innovations that connect people to places, data and other people at a record pace.

It’s time to accept the challenge. Government agencies need to keep pace with their commercial counterparts and harness the power of the Internet of Things. The end game is not to deliver new, faster, smaller, cooler electronics; the end game is to create solutions that let devices connected to the Internet interact and share data – regardless of their location, manufacturer or format – and make or find connections that may previously have been undetectable. For some, this concept is as foreign or scary as pouring ice water over their heads. For others, the new opportunity to transform policy, service delivery, leadership, legislation and regulation is fueling a transformation in government. And it starts with one connection.

One way to start could be linking previously siloed systems together or creating a golden record of all citizen interactions through a Master Data Management (MDM) initiative. It could start with a big data and analytics project to determine and mitigate risk factors in education or linking sensor data across multiple networks to increase intelligence about potential hacking or breaches. Agencies could stop waste, fraud and abuse before it happens by linking critical payment, procurement and geospatial data together in real time.

This is the Internet of Things for government. This is the challenge. This is transformation.

This article was originally published on www.federaltimes.com. Please view the original listing here.

 

Posted in Big Data, Business Impact / Benefits, Data Integration, Data Security, Master Data Management, Public Sector, Uncategorized

The Catalog is Dead – Long Live the Catalog?

The Catalog is Dead.

Print solution provider Werk II came up with a provocative marketing campaign in 2012. Their ads were designed like an obituary notice for the “Main Catalog”, which is “no longer with us”…

According to the Multi Channel Merchant Outlook 2014 survey, the eCommerce website (not a surprise ;-) ) is the top channel through which merchants market (90%). The social media (87.2%) and email (83%) channels follow close behind. Although the catalog may have declined as a marketing tool, 51.7% of retailers said they still use it to market their brands.

importance of channels chart

Source: MCM Outlook 2014

The Changing Role of the Catalog

Merchants are still using catalogs to sell products. However, their role has changed from a transactional vehicle to a sales tool. On a scale of 1 to 10, with 10 being the most important, merchant respondents said that using catalogs as mobile traffic drivers and customer retention tools were the most important activities (both scored an 8.25). At 7.85, web traffic driver was a close third.

methods of prospecting chart

Source: MCM Outlook 2014

Long Live the Catalog: Prospecting 

More than three-quarters of merchant respondents said catalogs were the top choice for the method of prospecting they will use in the next 12 months (77.7%). Catalog was the most popular answer, followed by Facebook (68%), email (66%), Twitter (42.7%) and Pinterest (40.8%).

What is your point of view?

How have catalogs changed in your business? What are your plans and outlook for 2015? It would be very interesting to hear points of view from different industries and countries… I’d be happy to discuss here or on Twitter @benrund. My favorite fashion retailer keeps sending me a stylish catalog, which makes me order online. Brands, retailers, consumers – how do you act, and what do you expect?

Posted in B2B, Manufacturing, Master Data Management, PiM, Product Information Management, Retail, Uncategorized

What’s In A Name?

Sometimes, the choice of a name has unexpected consequences. Often these consequences aren’t fair, but they exist nonetheless. For an example of this, consider the well-known National Bureau of Economic Research study that compared the hiring prospects of candidates with identical resumes but different names. In the study, titled a “Field Experiment on Labor Market Discrimination,” employers were found to be more likely to reply to candidates with popular, traditionally Caucasian names than to candidates with either unique, eclectic names or traditionally African-American names. Though these biases are clearly unfair to the candidates, they do illustrate a key point: one’s choice when naming something can come with perceptions that influence outcomes.

For an example from the IT world, consider my recent engagement at a regional retail bank. In this engagement, half of the meeting time was consumed by IT and business leaders debating how to label their Master Data Management (MDM) Initiative.  Consider these excerpts:

  • Should we even call it MDM? Answer: No. Why? Because nobody on the business side will understand what that means. Also, as we just implemented a Data Warehouse/Mart last year and we are in the middle of our new CRM roll-out, everybody in business and retail banking will assume their data is already mastered in both of these. On a side note: telcos understand MDM as Mobile Device Management.
  • Should we call it “Enterprise Data Master”? Answer: No. Why? Because unless you roll out all data domains and all functionality (standardization, matching, governance, hierarchy management, etc.) to the whole enterprise, you cannot. And doing so is a bad idea, as it is with every IT project: boiling the ocean and going live with a big bang is high cost and high risk, and given shifting organizational strategies and leadership, quick successes are needed to sustain the momentum.
  • Should we call it “Data Warehouse – Release 2”? Answer: No. Why? Because it is neither a data warehouse nor version 2 of one. It is a backbone component required to manage a key organizational ingredient – data – in a way that makes it useful to many use cases, processes, applications and people, not just analytics, although analytics is often the starting block. Data warehouses were neither conceived nor designed to facilitate data quality (they assume it is already there), nor are they designed for real-time interactions. Did anybody ask if ETL is “Pneumatic Tubes – Version 2”?
  • Should we call it “CRM Plus”? Answer: No. Why? Because it was never intended or designed to handle the transactional volume and attribute breadth of high-volume use cases, which are driven by complex business processes. Also, if it were a CRM system, it would have more intricate UI capabilities beyond comparatively simple data governance workflows and UIs.

Consider this: any data quality solution, like MDM, makes an existing workflow or application better at what it does best: managing customer interactions, creating orders, generating correct invoices, etc.  To quote a colleague, “we are the BASF of software.”  Few people understand what a chemical looks like or does, but it makes a plastic container sturdy, transparent, flexible and light.

I also explained hierarchy management in a similar way. Consider it the LinkedIn network of your company, to which you can attach every interaction and transaction.  I see one view, people in my network see a different one, and LinkedIn probably has the most comprehensive view, but ultimately we are all looking at the same core data and structures.

So let’s call the “use” of your MDM “Mr. Clean”, aka Meister Proper, because it keeps everything clean.

While naming is definitely a critical point to consider, given the expectations, fears and reservations that come with MDM and the underlying change management, it was hilarious to see how important it suddenly was.  However, it was puzzling to me (maybe a naïve perspective) why mostly recent IT hires had to categorize everything into new, unique functional boxes, while business and legacy IT people wanted to re-purpose existing boxes.  I guess recent IT hires used their approach to showcase that they were familiar with new technologies and techniques, which was likely a reason for their employment.  Business leaders, often with the exception of highly accomplished and well-regarded ones, as well as legacy IT leaders, needed to reassure continuity and signal no threat of disruption or change.  Moreover, they also needed to justify the value proposition of their prior software investments.

Aside from company financial performance and regulatory screw-ups, legions of careers will be decided by whether, how and how successfully this initiative proceeds.

Naming a new car model for a 100,000-unit production run, or a shampoo for worldwide sales, could not face much more scrutiny.  Software vendors give their future releases internal names of cities like Atlanta or famous people like Socrates instead of descriptive terms like “Gamification User Interface Release” or “Unstructured Content Miner”. This may be a good avenue for banks and retailers to explore.  It would avoid the expectation pitfalls associated with names like “Customer Success Data Mart”, “Enterprise Data Factory”, “Data Aggregator” or “Central Property Repository”.  In reality, there will be many applications that can claim bits and pieces of the same data, data volume or functionality.  Who will make the call on which one gets renamed or replaced, and who will explain to the various consumers what happened to it and why?

You can surely name a customer-facing app something more descriptive like “Payment Central” or “Customer Success Point”, but the reason you can do this is that the user will only have one, or maybe two, points of interface with the organization. Internal data consumers will interact with many more repositories.  Similarly, I guess this is the reason why I call my kids by their first names while strangers label them by their full names, “Junior”, “Butter Fingers” or “The Fast Runner”.

I would love to hear some other good reasons why naming conventions should be more scrutinized.  Maybe you have some guidance on what should and should not be done and the reasons for it?

Posted in Business/IT Collaboration, CIO, Data Quality, Master Data Management

The King of Benchmarks Rules the Realm of Averages

A mid-sized insurer recently approached our team for help. They wanted to understand how they fell short in making their case to their executives. Specifically, they proposed that fixing their customer data was key to supporting the executive team’s highly aggressive 3-year growth plan (3x today’s revenue).  Given this core organizational mission – aside from being a warm and fuzzy place to work supporting its local community – the slam-dunk solution to help here is simple: just reducing the data migration effort around the next acquisition, or avoiding the ritual annual, one-off data clean-up project, already pays for any tool set enhancing data acquisition, integration and hygiene.  Will it get you to 3x today’s revenue?  It probably won’t.  What will help are the following:

Making the Math Work (courtesy of Scott Adams)

Hard cost avoidance via software maintenance or consulting elimination is the easy part of the exercise. That is why CFOs love it and focus so much on it.  It is easy to grasp and immediate (aka next quarter).

Soft cost reductions, like staff redundancies, are a bit harder.  Despite being viable, in my experience very few decision makers want to work on a business case to lay off staff; my team has had one so far. Instead, they look at these savings as freed-up capacity, which can be redeployed more productively.   Productivity is also a bit harder to quantify, as you typically have to understand how data travels and gets worked on between departments.

However, revenue effects are even harder, and esoteric to many people, as they include projections.  They are often considered “soft” benefits, although they outweigh the other areas by 2-3 times in terms of impact.  Ultimately, every organization runs its strategy based on projections (see the insurer in my first paragraph).

The hardest to quantify is risk. Not only is it based on projections – often from a third party (Moody’s, TransUnion, etc.) – but few people understand it. More often, clients don’t even accept you investigating this area if you don’t have an advanced degree in insurance math. Nevertheless, risk can generate extra “soft” cost avoidance (a beefed-up reserve account balance creates opportunity cost) but also revenue (realizing a risk premium previously ignored).  Often risk profiles change due to relationships, which can be links to new “horizontal” information (transactional attributes) or “vertical” (hierarchical) information from an entity’s parent-child relationships and the parent’s or children’s transactions.

Given the above, my initial advice to the insurer would be to look at the heartache of their last acquisition, use a benchmark for IT productivity gains from improved data management capabilities (typically 20-26% – Yankee Group), and there you go.  This is just the IT side, so consider increasing the upper range by 1.4x (Harvard Business School), as every attribute change (e.g., last mobile view date) requires additional meetings at the manager, director and VP level, and these people’s time gets increasingly more expensive.  You could also use Aberdeen’s benchmark of 13 hours per average master data attribute fix instead.

You can also look at productivity areas, which are typically overly measured.  Let’s assume a call center rep spends 20% of the average call time of 12 minutes (depending on the call type – account or bill inquiry, dispute, etc.) understanding:

  • Who the customer is
  • What he bought online and in-store
  • If he tried to resolve his issue on the website or store
  • How he uses equipment
  • What he cares about
  • If he prefers call backs, SMS or email confirmations
  • His response rate to offers
  • His/her value to the company

If he spends that 20% of every call stringing together insights from five applications and twelve screens, instead of seeing the same information in one frame within seconds in whichever application he touches, you have just freed up 20% of his hourly compensation.
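
As a back-of-the-envelope version of that call-center math (all input figures below are invented assumptions, not benchmarks):

```python
# 20% of every call spent piecing data together, valued at the rep's
# fully loaded hourly cost. Inputs are illustrative placeholders.
calls_per_year = 250_000
avg_call_minutes = 12
share_spent_gathering = 0.20      # the 20% from the example above
loaded_cost_per_hour = 30.0       # $ per rep hour (assumption)

hours_freed = calls_per_year * avg_call_minutes * share_spent_gathering / 60
print(f"~{hours_freed:,.0f} rep hours, ~${hours_freed * loaded_cost_per_hour:,.0f} per year")
```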

Then look at the software, hardware, maintenance and ongoing management of the likely customer record sources (pick the worst- and best-quality ones based on your current understanding), which will end up in a centrally governed instance.  Per DAMA, every duplicate record will cost you between $0.45 (party) and $0.85 (product) per transaction (edit touch).  At the very least, each record will be touched once a year (likely 3-5 times), so multiply your duplicate record count by that and you have your savings from de-duplication alone.  You can also use Aberdeen’s benchmark of 71 serious errors per 1,000 records, meaning the chance of transactional failure and the required effort (a percentage of one or more FTEs’ daily workday) to fix them is high.  If this does not work for you, run a data profile with one of the many tools out there.
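
The same de-duplication arithmetic as a quick sketch, using the DAMA cost-per-touch range quoted above; the duplicate record count is a placeholder you would replace with your own profiling results:

```python
# Annual cost avoidance from removing duplicate records, using the DAMA
# $0.45-$0.85 per edit touch and 1-5 touches per record per year quoted above.
duplicate_records = 40_000            # placeholder: take this from a data profile
cost_per_touch = (0.45, 0.85)         # $ per transaction touch (party .. product)
touches_per_year = (1, 5)             # touched at least once, likely 3-5 times

low = duplicate_records * cost_per_touch[0] * touches_per_year[0]
high = duplicate_records * cost_per_touch[1] * touches_per_year[1]
print(f"De-duplication saving: ${low:,.0f} to ${high:,.0f} per year")
```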

If the sign says it – do it!

If standardization of records (zip codes, billing codes, currency, etc.) is the problem, ask your business partner how many customer contacts (calls, mailings, emails, orders, invoices or account statements) fail outright and/or require validation because of these attributes.  Once again, if you apply the productivity gains mentioned earlier, there are your savings.  If you look at the number of orders whose payment or revenue recognition gets delayed by a week or a month, and at the average order amount, you have just quantified how much profit (multiply by operating margin) you would be able to pull into the current financial year from the next one.

The same is true for speeding up the introduction of a new product, or a change to one, so it generates profits earlier.  Note that the time value of funds realized earlier is too small to matter in most instances, especially in the current interest rate environment.

If emails bounce back or snail mail gets returned (no such address, no such name at this address, no such domain, no such user at this domain), (e)mail verification tools can help reduce the bounces. If every mail piece (forget email, due to its minuscule cost) costs $1.25 – and this will vary by type of mailing (catalog, promotional post card, statement letter) – incorrect or incomplete records are wasted cost.  If you can, use the fully loaded print cost, including third-party data prep and returns handling.  You will never capture all cost inputs, but take a conservative stab.

If it was an offer, reduced bounces should also improve your response rate (now also true for email). Prospect mail response rates are typically around 1.2% (Direct Marketing Association), whereas phone response rates are around 8.2%.  If you know that your current response rate is half that (for argument’s sake) and you send out 100,000 emails of which 1.3% (Silverpop) have customer data issues, then fixing 81-93% of them (our experience) will drop the bounce rate to under 0.3%, meaning more emails will arrive and be relevant. This, in turn, multiplied by a standard conversion rate of 3% (MarketingSherpa; industry- and channel-specific) and the average order value (your data), multiplied by operating margin, gets you a benefit value for revenue.
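
Put together as a quick sketch, that email arithmetic looks like this. The benchmark rates are the ones quoted above; the volume, order value and margin are invented placeholders, and the result is per campaign.

```python
# Fewer bad records -> fewer bounces -> more delivered emails -> more orders.
# Benchmark rates from the text; volume, order value and margin are assumptions.
emails_sent = 100_000
data_issue_rate = 0.013        # 1.3% of records have data issues (Silverpop)
fix_rate = 0.87                # 81-93% of issues fixable; midpoint used here
conversion_rate = 0.03         # standard conversion rate (MarketingSherpa)
avg_order = 120.0              # average order value in $ (your data; assumption here)
operating_margin = 0.10        # assumption

recovered = emails_sent * data_issue_rate * fix_rate     # emails that now arrive
extra_profit = recovered * conversion_rate * avg_order * operating_margin
print(f"~{recovered:,.0f} more delivered emails, ~${extra_profit:,.0f} extra profit per campaign")
```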

If product data and inventory carrying cost or supplier spend are your issue, find out how many supplier shipments you receive every month and the average cost of a part (or the cost range), and apply the Aberdeen master data failure rate (71 in 1,000) to use cases around missing or incorrect supersession or alternate part data, to assess the value of a single shipment’s overspend.  You can also just use the ending inventory amount from the 10-K report and apply a 3-10% improvement (Aberdeen) in a top-down approach. Alternatively, apply 3.2-4.9% to your annual supplier spend (KPMG).
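
And the top-down version as a sketch, with placeholder financials you would pull from your own 10-K and procurement data:

```python
# Top-down application of the inventory and supplier-spend benchmarks quoted
# above. Both base figures are invented placeholders.
ending_inventory = 25_000_000        # $ from the 10-K (placeholder)
annual_supplier_spend = 60_000_000   # $ (placeholder)

inventory_saving = (ending_inventory * 0.03, ending_inventory * 0.10)          # Aberdeen 3-10%
spend_saving = (annual_supplier_spend * 0.032, annual_supplier_spend * 0.049)  # KPMG 3.2-4.9%
print(f"Inventory: ${inventory_saving[0]:,.0f}-${inventory_saving[1]:,.0f}; "
      f"supplier spend: ${spend_saving[0]:,.0f}-${spend_saving[1]:,.0f} per year")
```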

You could also investigate the expediting or return cost of shipments in a period due to incorrectly aggregated customer forecasts, wrong or incomplete product information or wrong shipment instructions in a product or location profile. Apply Aberdeen’s 5% improvement rate and there you go.

Consider that a North American utility told us that just fixing their 200 Tier 1 suppliers’ product information achieved an increase in discounts from $14 million to $120 million. They also found that fixing one basic attribute out of sixty in one part category saves them over $200,000 annually.

So what ROI percentages would you find tolerable or justifiable for, say an EDW project, a CRM project, a new claims system, etc.? What would the annual savings or new revenue be that you were comfortable with?  What was the craziest improvement you have seen coming to fruition, which nobody expected?

Next time, I will add some more “use cases” to the list and look at some philosophical implications of averages.

Posted in Business Impact / Benefits, Business/IT Collaboration, Data Integration, Data Migration, Data Quality, Enterprise Data Management, Master Data Management