Tag Archives: Business Case

The King of Benchmarks Rules the Realm of Averages

A mid-sized insurer recently approached our team for help. They wanted to understand why they fell short in making their case to their executives. Specifically, they had proposed that fixing their customer data was key to supporting the executive team's highly aggressive three-year growth plan (3x today's revenue). Given this core organizational mission – aside from being a warm and fuzzy place to work that supports its local community – the slam-dunk pitch is simple: just reducing the data migration effort around the next acquisition, or avoiding the ritual annual one-off data clean-up project, already pays for any tool set that improves data acquisition, integration and hygiene. Will that get you to 3x today's revenue? Probably not. What will help are the following:


Making the Math Work (courtesy of Scott Adams)

Hard cost avoidance – eliminating software maintenance or consulting fees – is the easy part of the exercise. That is why CFOs love it and focus so much on it. It is easy to grasp and immediate (aka next quarter).

Soft cost reductions, like staff redundancies, are a bit harder. While they are viable, in my experience very few decision makers want to build a business case around laying off staff – my team has seen exactly one so far. Instead, they treat these savings as freed-up capacity that can be redeployed more productively. Productivity is also harder to quantify, since you typically have to understand how data travels and gets worked on between departments.

Revenue effects are harder still, and esoteric to many people, because they involve projections. They are often dismissed as "soft" benefits, even though they outweigh the other areas by two to three times in terms of impact. Ultimately, every organization runs its strategy on projections (see the insurer in my first paragraph).

The hardest to quantify is risk. Not only is it based on projections – often from a third party (Moody's, TransUnion, etc.) – but few people understand it. Often, clients won't even let you investigate this area unless you have an advanced degree in insurance math. Nevertheless, risk can generate extra "soft" cost avoidance (a beefed-up reserve account balance carries an opportunity cost) as well as revenue (realizing a risk premium previously ignored). Risk profiles frequently change because of relationships, which can surface as new "horizontal" information (transactional attributes) or as vertical (hierarchical) information from an entity's parent-child relationships and the parent's or children's transactions.

Given the above, my initial advice to the insurer would be to look at the heartache of their last acquisition, apply a benchmark for IT productivity gains from improved data management capabilities (typically 20-26% – Yankee Group), and there you go. This covers only the IT side, so consider increasing the upper range by 1.4x (Harvard Business School), since every attribute change (e.g., last mobile view date) triggers additional meetings at the manager, director and VP level – and these people's time gets progressively more expensive. Alternatively, you could use Aberdeen's benchmark of 13 hours per average master data attribute fix.
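To make the arithmetic concrete, here is a minimal Python sketch of that bottom-up estimate. Only the benchmark figures (the 20-26% Yankee Group range and the 1.4x uplift) come from the text above; the baseline spend number is a hypothetical placeholder you would replace with your own.

```python
# Hypothetical back-of-envelope estimate of data-management productivity savings.
# Only the benchmark ranges (20-26% IT productivity, 1.4x business-side uplift)
# come from the post; the baseline spend figure below is a made-up placeholder.

annual_it_data_mgmt_spend = 2_000_000   # hypothetical: IT labor tied to data wrangling
it_productivity_gain = (0.20, 0.26)     # Yankee Group benchmark cited in the post
business_uplift_factor = 1.4            # Harvard Business School multiplier (upper range)

low = annual_it_data_mgmt_spend * it_productivity_gain[0]
high = annual_it_data_mgmt_spend * it_productivity_gain[1] * business_uplift_factor

print(f"Estimated annual savings: ${low:,.0f} - ${high:,.0f}")
```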

You can also look at productivity areas, which are typically already heavily measured.  Let's assume a call center rep spends 20% of the average call time of 12 minutes (depending on the call type – account or bill inquiry, dispute, etc.) understanding:

  • Who the customer is
  • What he bought online and in-store
  • If he tried to resolve his issue on the website or in a store
  • How he uses equipment
  • What he cares about
  • If he prefers call backs, SMS or email confirmations
  • His response rate to offers
  • His/her value to the company

If he spends that 20% of every call stringing together insights from five applications and twelve screens – the same information duplicated in every application he touches – instead of seeing it in a single frame within seconds, then consolidating it frees up 20% of his hourly compensation.
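A rough sketch of that call-center math in Python: the 20% share and 12-minute average call come from the example above, while the call volume and loaded hourly cost are hypothetical placeholders.

```python
# Back-of-envelope call-center savings from consolidating customer data into one view.
# The 20% share and 12-minute average call come from the post; call volume and
# loaded hourly compensation are hypothetical placeholders.

calls_per_year = 500_000          # hypothetical annual call volume
avg_call_minutes = 12             # from the post
lookup_share = 0.20               # share of each call spent piecing together customer data
loaded_hourly_cost = 35.0         # hypothetical fully loaded rep cost per hour

wasted_hours = calls_per_year * avg_call_minutes * lookup_share / 60
annual_savings = wasted_hours * loaded_hourly_cost
print(f"Hours freed up: {wasted_hours:,.0f}, worth ${annual_savings:,.0f} per year")
```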

Then look at the software, hardware, maintenance and ongoing management costs of the likely customer record sources (pick the worst- and best-quality ones based on your current understanding), which will end up in a centrally governed instance.  Per DAMA, every duplicate record will cost you between $0.45 (party) and $0.85 (product) per transaction (edit touch).  Each record will be touched at least once a year (likely 3-5 times), so multiply your duplicate record count by that and you have your savings from de-duplication alone.  You can also use Aberdeen's benchmark of 71 serious errors per 1,000 records, which means the chance of transactional failure – and the effort required to fix it (a percentage of one or more FTEs' daily workday) – is high.  If none of this works for you, run a data profile with one of the many tools out there.
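As an illustration, here is a minimal sketch of the de-duplication math using the DAMA per-touch range and the 1-5 touches per year noted above; the duplicate count is a hypothetical placeholder you would take from your own profiling.

```python
# De-duplication savings using the DAMA per-touch cost range cited in the post.
# The duplicate count is a hypothetical placeholder; run a data profile to get yours.

duplicate_records = 250_000           # hypothetical duplicate count from profiling
cost_per_touch = (0.45, 0.85)         # DAMA: party vs. product record, per edit touch
touches_per_year = (1, 5)             # at least once, likely 3-5 times per year

low = duplicate_records * cost_per_touch[0] * touches_per_year[0]
high = duplicate_records * cost_per_touch[1] * touches_per_year[1]
print(f"Annual de-duplication savings: ${low:,.0f} - ${high:,.0f}")
```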


If the sign says it – do it!

If standardization of records (zip codes, billing codes, currency, etc.) is the problem, ask your business partner how many customer contacts (calls, mailings, emails, orders, invoices or account statements) fail outright and/or require validation because of these attributes.  Once again, apply the productivity gains mentioned earlier and there are your savings.  If you look at the number of orders whose payment or revenue recognition gets delayed by a week or a month, and multiply by the average order amount and your operating margin, you have just quantified how much profit you could pull into the current financial year from the next one.
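Here is a short sketch of that pull-forward calculation; the delayed-order count, average order value and operating margin are hypothetical placeholders, while the logic (orders x average order x margin) follows the paragraph above.

```python
# Profit pulled into the current financial year by fixing standardization failures.
# Delayed-order count, average order value and operating margin are hypothetical;
# the multiplication itself follows the post.

delayed_orders_per_year = 1_200    # hypothetical orders delayed by bad zip/billing codes
avg_order_value = 4_500.0          # hypothetical average order amount
operating_margin = 0.12            # hypothetical operating margin

profit_pulled_forward = delayed_orders_per_year * avg_order_value * operating_margin
print(f"Profit recognized earlier: ${profit_pulled_forward:,.0f}")
```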

The same is true for speeding up the introduction of a new product, or a change to an existing one, so it generates profits earlier.  Note that the time value of funds realized earlier is too small to matter in most instances, especially in the current interest rate environment.

If emails bounce back or snail mail gets returned (no such address, no such name at this address, no such domain, no such user at this domain), (e)mail verification tools can help reduce the bounces. If every mail piece costs $1.25 (forget email, given its minuscule cost) – and this will vary by type of mailing (catalog, promotional postcard, statement letter) – then every incorrect or incomplete record is wasted cost.  If you can, use the fully loaded print cost, including third-party data prep and returns handling.  You will never capture all cost inputs, but take a conservative stab.

If it was an offer, reduced bounces should also improve your response rate (now true for email as well). Prospect mail response rates are typically around 1.2% (Direct Marketing Association), whereas phone response rates are around 8.2%.  Say your current response rate is half that (for argument's sake) and you send out 100,000 emails, of which 1.3% (Silverpop) have customer data issues; fixing 81-93% of them (our experience) will drop the bounce rate to under 0.3%, meaning more emails arrive and stay relevant. Multiply that in turn by a standard conversion rate of 3% (MarketingSherpa – industry and channel specific) and your average order value (your data), then by operating margin, and you get a benefit value for revenue.
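A minimal sketch of that chain of multiplications follows; the volume, issue rate, fix rate and conversion rate are the figures cited above, while the average order value and operating margin are hypothetical placeholders.

```python
# Revenue effect of fixing e-mail data issues, following the example in the post.
# Volume, issue rate, fix rate and conversion rate are the post's figures;
# average order value and operating margin are hypothetical placeholders.

emails_sent = 100_000
issue_rate = 0.013                 # Silverpop: share of records with data issues
fix_rate = (0.81, 0.93)            # share of issues typically fixable (per the post)
conversion_rate = 0.03             # MarketingSherpa standard conversion rate
avg_order_value = 150.0            # hypothetical
operating_margin = 0.12            # hypothetical

low = emails_sent * issue_rate * fix_rate[0] * conversion_rate * avg_order_value * operating_margin
high = emails_sent * issue_rate * fix_rate[1] * conversion_rate * avg_order_value * operating_margin
print(f"Incremental annual profit: ${low:,.0f} - ${high:,.0f}")
```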

If product data and inventory carrying cost or supplier spend are your issue, find out how many supplier shipments you receive every month and the average cost of a part (or a cost range), then apply the Aberdeen master data failure rate (71 in 1,000) to use cases around missing or incorrect supersession or alternate part data to assess the overspend on a single shipment.  You can also simply take the ending inventory amount from the 10-K report and apply a 3-10% improvement (Aberdeen) in a top-down approach. Alternatively, apply 3.2-4.9% to your annual supplier spend (KPMG).
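A quick top-down sketch using the Aberdeen and KPMG ranges above; the ending inventory and supplier spend figures are hypothetical placeholders you would pull from the 10-K and your spend reports.

```python
# Top-down product-data estimate using the benchmarks cited in the post.
# Ending inventory and annual supplier spend are hypothetical placeholders.

ending_inventory = 50_000_000        # hypothetical, taken from the 10-K
inventory_improvement = (0.03, 0.10) # Aberdeen: 3-10% improvement
annual_supplier_spend = 120_000_000  # hypothetical
spend_improvement = (0.032, 0.049)   # KPMG: 3.2-4.9% of supplier spend

inv_low, inv_high = (ending_inventory * r for r in inventory_improvement)
spend_low, spend_high = (annual_supplier_spend * r for r in spend_improvement)
print(f"Inventory benefit: ${inv_low:,.0f} - ${inv_high:,.0f}")
print(f"Supplier spend benefit: ${spend_low:,.0f} - ${spend_high:,.0f}")
```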

You could also investigate the expediting or return cost of shipments in a period due to incorrectly aggregated customer forecasts, wrong or incomplete product information or wrong shipment instructions in a product or location profile. Apply Aberdeen’s 5% improvement rate and there you go.

Consider that a North American utility told us that just fixing the product information of their 200 Tier 1 suppliers increased discounts from $14 million to $120 million. They also found that fixing one basic attribute out of sixty in a single part category saves them over $200,000 annually.

So what ROI percentages would you find tolerable or justifiable for, say, an EDW project, a CRM project or a new claims system? What annual savings or new revenue would you be comfortable with?  What is the craziest improvement you have seen come to fruition that nobody expected?

Next time, I will add some more “use cases” to the list and look at some philosophical implications of averages.


Where Is My Broadband Insurance Bundle?

As I continue to counsel insurers about master data, they all agree immediately that it is something they need to get their arms around fast.  Ask participants in a workshop at any carrier – life, P&C, health or excess – and they all raise their hands when I ask, "Do you have a broadband bundle at home for internet, voice and TV as well as wireless voice and data?", followed by "Would you want your company to be the insurance version of this?"

Buying insurance like broadband


Now let me be clear: while communication service providers offer very sophisticated bundles, they are still grappling with a comprehensive view of a client across all services (data, voice, text, residential, business, international, TV, mobile, etc.) and each of their touch points (website, call center, local store).  They are also miles away from including any sort of meaningful network data (jitter, dropped calls, failed call setups, etc.).

Similarly, my insurance investigations typically touch most of the frontline consumer (business and personal) contact points, including agencies, marketing (incl. CEM and VOC) and the service center.  Across all of these we typically see a significant lack of productivity, because policy, billing, payments and claims systems are service-line specific, while supporting functions – from lead development and underwriting to claims adjudication – often handle more than one type of claim.

This lack of performance is compounded by sub-optimal campaign response and conversion rates.  Because touchpoint-enabling CRM applications also suffer from incomplete or inconsistent contact preference information, interactions may violate local privacy regulations. In addition, service centers may capture leads only to log them into a black-box AS/400 policy system, where they disappear.

Here again we often hear that the fix is simply to scrub the data before it goes into the data warehouse.  However, that data typically does not sync back to the source systems, so any interaction with a client via chat, phone or face-to-face will not have the real-time, accurate information needed to execute a flawless transaction.

On the insurance IT side we also see enormous overhead: from scrubbing every database – from source via staging to the analytical reporting environment – every month or quarter, to one-off clean-up projects for the next acquired book of business.  For a mid-sized, regional carrier (ca. $6B net premiums written) we find an average of $13.1 million in annual benefits from a central customer hub.  This figure translates into an ROI of between 600% and 900%, depending on requirement complexity, distribution model, IT infrastructure and service lines, and includes baseline revenue improvements, productivity gains, and cost avoidance as well as cost reduction.
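For readers who want to sanity-check a figure like that, here is a simple ROI sketch; the $13.1 million annual benefit is the number cited above, while the project cost and the three-year horizon are hypothetical assumptions, since the post does not spell out how the 600-900% range was derived.

```python
# Simple ROI sketch for a central customer hub. The $13.1M annual benefit is the
# figure cited in the post; the project cost and 3-year horizon are hypothetical
# assumptions, not the post's actual inputs.

annual_benefit = 13_100_000
project_cost = 5_500_000      # hypothetical total cost over the evaluation horizon
years = 3                     # hypothetical evaluation horizon

roi_pct = (annual_benefit * years - project_cost) / project_cost * 100
print(f"Simple {years}-year ROI: {roi_pct:.0f}%")
```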

On the health insurance side, my clients have complained about regional data sources contributing incomplete (often driven by local process and law) and incorrect data (name, address, etc.) to untrusted reports from membership, claims and sales data warehouses.  This makes budgeting items like nurse-staffed medical advice lines, sales compensation planning and even identifying high-risk members (now driven by the Affordable Care Act) a true mission impossible – and the life of the pricing teams that much harder.

Over at the life insurers, whole and universal life plans now face a situation where high-value clients first saw lower-than-expected yields – a consequence of the low interest rate environment on top of front-loaded fees and the front-loaded cost of the term component.  Now, as bonds are forecast to decrease in value in the near future, publicly traded carriers will likely be forced to sell bonds before maturity to make good on term life commitments and whole life minimum yield commitments and keep policies in force.

This means that insurers need a full profile of clients as they go through life changes like a move, the loss of a job, a promotion or a birth.   Each of these changes calls for the proper mitigation strategy to protect a baseline of coverage while maintaining or improving the premium.  That can range from splitting the term from the whole life component to using managed investment portfolio yields to temporarily pad premium shortfalls.

Overall, without a true, timely and complete picture of a client and his/her personal and professional relationships over time – and of which strategies were presented, found appealing and ultimately put in force – how will margins improve?  Social media data can certainly help here, but it should be a second step after mastering what is already available in-house.  What are some of your experiences with how carriers have tried to collect and use core customer data?

Disclaimer:
Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica's control, and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized; no warranty or representation of success, either express or implied, is made.

Build A Prioritized Data Management Roadmap

In my recent white paper, "Holistic Data Governance: A Framework for Competitive Advantage", I aspirationally state that data governance should be managed as a self-sustaining business function, no different than Finance.  With this in mind, last year I chased down Earl Fry, Informatica's Chief Financial Officer, and asked him how his team helps our company prioritize investments and resources.  Earl suggested I speak with the head of our enterprise risk management group – and I left inspired!  I was shown a portfolio management-style approach to prioritizing risk management investment.  It used an easy-to-understand, executive-friendly "heat map" dashboard that aggregates and summarizes the multiple dimensions we use to model risk.  I asked myself: if an extremely mature and universally relevant business function like Finance manages its business this way, can't the emerging discipline of data governance learn from it? Here's what I've developed… (more…)
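For illustration only, here is a minimal Python sketch of what a portfolio-style heat-map score might look like; the dimensions, weights and candidate initiatives are all hypothetical and are not taken from the white paper or the risk management group's actual model.

```python
# A minimal sketch of a portfolio-style "heat map" score, in the spirit of the
# approach described above. Dimensions, weights and initiatives are hypothetical.

weights = {"business_value": 0.4, "risk_exposure": 0.35, "effort": 0.25}

# Each initiative is scored 1 (low) to 5 (high) per dimension; lower effort is better.
initiatives = {
    "Customer master data":  {"business_value": 5, "risk_exposure": 4, "effort": 3},
    "Product data quality":  {"business_value": 4, "risk_exposure": 3, "effort": 2},
    "Archive legacy claims": {"business_value": 2, "risk_exposure": 5, "effort": 4},
}

def heat_score(scores):
    # Invert effort so that lower effort raises the priority score.
    return (weights["business_value"] * scores["business_value"]
            + weights["risk_exposure"] * scores["risk_exposure"]
            + weights["effort"] * (6 - scores["effort"]))

for name, scores in sorted(initiatives.items(), key=lambda kv: -heat_score(kv[1])):
    print(f"{name:22s} priority score: {heat_score(scores):.2f}")
```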


Ten Facets of Data Governance

In this video, Rob Karel, vice president of product strategy, Informatica, outlines the Informatica Data Governance Framework, highlighting the 10 facets that organizations need to focus on for an effective data governance initiative:

  • Vision and Business Case to deliver business value
  • People
  • Tools and Architecture to support the architectural scope of data governance
  • Policies that make up the data governance function (security, archiving, etc.)
  • Measurement: gauging both the level of influence of a data governance initiative and its effectiveness (business value and ROI metrics such as increasing revenue, improving operational efficiency, reducing risk, reducing cost or improving customer satisfaction)
  • Change Management: incentives for the workforce, partners and customers to get better-quality data in, and potential repercussions if data quality falls short
  • Organizational Alignment: how the organization will work together across silos
  • Dependent Processes: identifying the data lifecycle (capturing, reporting, purchasing and updating data in your environment), all processes consuming the data, and the processes that store and manage the data
  • Program Management: effective program management skills to build out a communication strategy, a measurement strategy and a focal point for escalating issues to senior management when necessary
  • Defined Processes that make up the data governance function (discovery, definition, application, and measurement and monitoring)

For more information from Rob Karel on the Informatica Data Governance Framework, visit his Perspectives blogs.


Build Your PIM Business Case – 5 Things to Consider

Do you have to justify your MDM or PIM investment? Does your CEO ask for ROI and business cases? Justify the ROI and gain quick wins with these key steps, which will help you build your individual MDM or PIM business case.

Gartner analysts Andrew White and Bill O'Kane wrote in their 2012 Magic Quadrant for Master Data Management of Product Data Solutions report about "the usage and focus (that is, MDM use case) of the product master data — ranging across use cases for design (information architecture), construction ('building the business'), operations ('running the business') and analytics ('reporting the business')."

In response to the initial questions, I have recently published a white paper, "Build Your PIM Business Case", which summarizes the key factors based on ten years' experience in this market, analyst reports, and recent research. It aims to support project managers when they have to define their MDM business case for product data. Here is a brief outline:

1. Understand the big MDM picture: Define a first milestone and data domain to start with. Product data has a direct impact on business results – conversion rates and product returns, to name two examples.

2. Think strategically, but act operationally: Define a target for a quick win. An example may be updating your online store with the most important product category, or expanding your multichannel strategy by adding a new channel. Get management support or a CEO commitment; at least one member of the management team should be an official sponsor of the project and the strategy. MDM is a strategic undertaking, and data is a competitive advantage.

3. Follow steps recommended by analysts and experienced consultants. Gartner Research developed eight steps for building a PIM business case. I attended a Gartner session on it at the latest MDM summit.

4. Focus on implementation style and methodology: A successful PIM project relies on a specialized consulting methodology covering three key areas: business processes, technical implementation, and professional project management. More details from our Senior PIM Consultant Michael Weiss, along with the Heiler PIM Implementation Methodology, can be found in the white paper "Build Your PIM Business Case" (available via the form at the top right).

5. Measure the performance and results of PIM: Choose KPIs that match your company and industry, based on the best-practice KPI list from the PIM ROI research. Ask your vendor or integrator for examples, work with a few important benchmarks, and pick those that best align with your vertical.


The Inches We Need Are Everywhere

So goes the line in the 1999 Oliver Stone film, Any Given Sunday. In the film, Al Pacino plays Tony D’Amato, a “been there, done that” football coach who, faced with a new set of challenges, has to re-evaluate his tried and true assumptions about everything he had learned through his career. In an attempt to rally his troops, D’Amato delivers a wonderful stump speech challenging them to look for ways to move the ball forward, treating every inch of the field as something sacred and encouraging them to think differently about how to do so.

(more…)


The Value of Knowing the Value

Ever wondered whether an initiative is worth the effort?  Ever wondered how to quantify its worth?  This is a loaded question, as you may suspect, but I wanted to ask it nevertheless, because my team of Global Industry Consultants works with clients around the world to do just that (aka a Business Value Assessment, or BVA) for solutions anchored around Informatica's products.

How far will your investment stretch?

As these solutions typically involve multiple core business processes stretching over multiple departments and leveraging a legion of technology components – ETL, metadata management, business glossary, BPM, data virtualization, legacy ERP, CRM and billing systems – they initially sound like a daunting level of complexity.  Opening this can of worms may end in measurement fatigue (I think I just discovered a new medical malaise). (more…)


Building A Business Case For Data Quality: Create The Business Case

Building A Business Case For Data Quality, part 7 of a 7-part series

Finally, you need to create a business case and present the findings of the data quality checkup. Two levels of presentation typically take place after the data quality assessment. The first is a technical presentation to IT, giving all the details on the completeness, conformity, consistency, accuracy, duplication and integrity characteristics of the data. IT needs to understand the types of issues in order to figure out what needs to be repaired, what can realistically be fixed, and what it might cost.

The more important presentation covers the impact these issues are having on the business. Does the lack of accuracy in the data affect the accuracy of business decisions? How does the completeness of the data affect insurance ratings, loan applications or well-drilling decisions?  Are your customers committing a crime? (more…)


Understanding The Need for Quality Master Data

Of my recent series of papers on the value of data quality improvement, the first focused on the economic and financial aspects. One of the paper's goals was to show that if you iteratively drill down along the different economic value dimensions and look at how information contributes to organizational success, you can establish a link between data failures and business or operational process success. For example, taking cost reduction as the high-level value dimension: when attempting to reduce the spend on particular products through better negotiations with vendors, duplicate product entries in the supplier catalog reduced the ability to accurately review the costs of individual items as well as classes of items. That inconsistency undermined the ability to achieve the cost reductions.

(more…)


Customer Data Forum Off To A Great Start Featuring MDM

We launched a coast-to-coast Customer Data Forum road show with visits to Atlanta and Washington, D.C., that attracted business and IT professionals interested in using master data management (MDM) to attract and retain customers.

From the business side, our guests consisted of analysts, sales operations personnel, and business liaisons to IT, while the IT side was represented by enterprise and data architects, IT directors, and business intelligence and data warehousing professionals. In Washington, about half the audience was from the public sector and government agencies. (more…)
