Tag Archives: ROI

The King of Benchmarks Rules the Realm of Averages

A mid-sized insurer recently approached our team for help. They wanted to understand why they had fallen short in making their case to their executives. Specifically, they had proposed that fixing their customer data was key to supporting the executive team’s highly aggressive three-year growth plan (3x today’s revenue).  Given this core organizational mission – aside from being a warm and fuzzy place to work that supports its local community – the slam-dunk solution here seems simple: just reducing the data migration effort around the next acquisition, or avoiding the ritual annual one-off data clean-up project, already pays for any tool set enhancing data acquisition, integration and hygiene.  Will it get you to 3x today’s revenue?  Probably not.  What will help are the following:

Making the Math Work (courtesy of Scott Adams)

Hard cost avoidance via eliminating software maintenance or consulting spend is the easy part of the exercise. That is why CFOs love it and focus so much on it.  It is easy to grasp and immediate (aka next quarter).

Soft cost reduction, like staff redundancies, is a bit harder.  Despite such cases being viable, in my experience very few decision makers want to work on a business case to lay off staff.  My team has had one so far. Instead, they look at these savings as freed-up capacity, which can be re-deployed more productively.   Productivity is also a bit harder to quantify, as you typically have to understand how data travels and gets worked on between departments.

Revenue effects, however, are even harder and more esoteric to many people because they include projections.  They are often considered “soft” benefits, although they outweigh the other areas by two to three times in terms of impact.  Ultimately, every organization runs its strategy based on projections (see the insurer in my first paragraph).

The hardest to quantify is risk. Not only is it based on projections – often from a third party (Moody’s, TransUnion, etc.) – but few people understand it. More often than not, clients won’t even accept you investigating this area unless you have an advanced degree in insurance math. Nevertheless, risk can generate extra “soft” cost avoidance (the opportunity cost of beefing up a reserve account balance) as well as revenue (realizing a risk premium previously ignored).  Risk profiles often change due to relationships, which can be links to new “horizontal” information (transactional attributes) or “vertical” (hierarchical) information derived from an entity’s parent-child relationships and the parent’s or children’s transactions.

Given the above, my initial advice to the insurer would be to look at the heartache of their last acquisition, apply a benchmark for IT productivity gains from improved data management capabilities (typically 20-26% – Yankee Group) and there you go.  This covers just the IT side, so consider increasing the upper range by 1.4x (Harvard Business School), as every attribute change (e.g. last mobile view date) requires additional meetings at the manager, director and VP level, and these people’s time gets increasingly more expensive.  You could also use Aberdeen’s benchmark of 13 hours per average master data attribute fix instead.
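To make the arithmetic concrete, here is a minimal sketch of that benchmark math. The spend figure below is a made-up placeholder, not the insurer’s actuals; only the benchmark ranges come from the sources cited above.

```python
# Hypothetical illustration of the benchmark math above.
# The spend figure is a placeholder; the ranges are the cited benchmarks.
it_data_mgmt_spend = 2_000_000        # assumed annual IT spend tied to data handling
it_productivity_gain = (0.20, 0.26)   # Yankee Group benchmark range
business_uplift_factor = 1.4          # HBS factor for business-side meeting overhead

low = it_data_mgmt_spend * it_productivity_gain[0]
high = it_data_mgmt_spend * it_productivity_gain[1] * business_uplift_factor

print(f"Estimated annual productivity benefit: ${low:,.0f} to ${high:,.0f}")
```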

You can also look at productivity areas, which are typically measured to excess.  Let’s assume a call center rep spends 20% of the average call time of 12 minutes (depending on the call type – account or bill inquiry, dispute, etc.) understanding:

  • Who the customer is
  • What he bought online and in-store
  • If he tried to resolve his issue on the website or store
  • How he uses equipment
  • What he cares about
  • If he prefers call backs, SMS or email confirmations
  • His response rate to offers
  • His/her value to the company

If he spends that 20% of every call stringing together insights from five applications and twelve screens – the same information in every application he touches – instead of seeing it in one frame within seconds, you have just freed up 20% of his hourly compensation.
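A rough sketch of that call-center math, with every volume and rate below assumed purely for illustration:

```python
# Hypothetical call-center sketch: value of removing the 20% lookup overhead.
# Every figure here is an assumption for illustration only.
reps = 100                      # number of call center reps (assumed)
calls_per_rep_per_day = 40      # assumed call volume
avg_call_minutes = 12           # from the example above
lookup_share = 0.20             # share of each call spent stitching screens together
loaded_hourly_rate = 35.0       # assumed fully loaded cost per rep-hour
working_days = 240              # assumed working days per year

wasted_hours = reps * working_days * calls_per_rep_per_day * avg_call_minutes * lookup_share / 60
annual_saving = wasted_hours * loaded_hourly_rate
print(f"Freed-up capacity worth roughly ${annual_saving:,.0f} per year")
```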

Then look at the software, hardware, maintenance and ongoing management cost of the likely customer record sources (pick the worst- and best-quality ones based on your current understanding), which will end up in a centrally governed instance.  Per DAMA, every duplicate record will cost you between $0.45 (party) and $0.85 (product) per transaction (edit touch).  At the very least each record will be touched once a year (likely 3-5 times), so multiply your duplicate record count by that and you have your savings from de-duplication alone.  You can also use Aberdeen’s benchmark of 71 serious errors per 1,000 records, meaning the chance of transactional failure – and the effort required to fix it (a percentage of one or more FTEs’ daily workday) – is high.  If this does not work for you, run a data profile with one of the many tools out there.
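A minimal sketch of the de-duplication math, assuming a hypothetical duplicate count; the per-touch costs and touch frequency are the figures cited above:

```python
# Hypothetical de-duplication savings sketch using the DAMA per-touch cost range.
duplicate_records = 250_000       # assumed duplicate count from a data profile
cost_per_touch = (0.45, 0.85)     # DAMA: party vs. product record, per edit touch
touches_per_year = (1, 5)         # at least once a year, likely 3-5 times

low = duplicate_records * cost_per_touch[0] * touches_per_year[0]
high = duplicate_records * cost_per_touch[1] * touches_per_year[1]
print(f"De-duplication savings estimate: ${low:,.0f} to ${high:,.0f} per year")
```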

If the sign says it – do it!

If standardization of records (zip codes, billing codes, currency, etc.) is the problem, ask your business partner how many customer contacts (calls, mailings, emails, orders, invoices or account statements) fail outright and/or require validation because of these attributes.  Once again, apply the productivity gains mentioned earlier and there are your savings.  If you look at the number of orders whose payment or revenue recognition gets delayed by a week or a month, and the average order amount, you can quantify how much profit (multiply by operating margin) you would be able to pull into the current financial year from the next one.
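A quick sketch of that pull-forward calculation; all figures below are assumptions for illustration:

```python
# Hypothetical sketch: profit pulled into the current year by removing
# standardization-related order delays. All inputs are assumptions.
delayed_orders_per_year = 1_200   # orders delayed past year-end (assumed)
avg_order_amount = 5_000.0        # assumed average order value
operating_margin = 0.12           # assumed operating margin

profit_pulled_forward = delayed_orders_per_year * avg_order_amount * operating_margin
print(f"Profit recognized a year earlier: ${profit_pulled_forward:,.0f}")
```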

The same is true for speeding up the introduction of a new product, or a change to one, so that it generates profits earlier.  Note that the time value of funds realized earlier is too small to matter in most instances, especially in the current interest environment.

If emails bounce back or snail mail gets returned (no such address, no such name at this address, no such domain, no such user at this domain), (e)mail verification tools can help reduce the bounces. If every mail piece costs $1.25 (forget email due to its minuscule cost) – and this will vary by type of mailing (catalog, promotional postcard, statement letter) – then incorrect or incomplete records are wasted cost.  If you can, use the fully loaded print cost including third-party data prep and returns handling.  You will never capture all cost inputs, but take a conservative stab.

If it was an offer, reduced bounces should also improve your response rate (also true for email now). Prospect mail response rates are typically around 1.2% (Direct Marketing Association), whereas phone response rates are around 8.2%.  If you know that your current response rate is half that (for argument’s sake) and you send out 100,000 emails of which 1.3% (Silverpop) have customer data issues, then fixing 81-93% of them (our experience) will drop the bounce rate to under 0.3%, meaning more emails will arrive and be relevant. Multiply this by a standard conversion rate of 3% (MarketingSherpa – industry- and channel-specific) and the average order value (your data), then by operating margin, and you get a benefit value for revenue.
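Here is a minimal sketch of that campaign math. The benchmark rates are the ones cited above; the order value and margin are placeholders:

```python
# Hypothetical campaign sketch; order value and margin are assumed placeholders.
emails_sent = 100_000
bad_record_rate = 0.013        # Silverpop: 1.3% of records have data issues
fix_rates = (0.81, 0.93)       # share of bad records we can fix (our experience)
conversion_rate = 0.03         # MarketingSherpa standard conversion rate
avg_order = 250.0              # assumed average order value
operating_margin = 0.12        # assumed operating margin

for fix_rate in fix_rates:
    recovered_emails = emails_sent * bad_record_rate * fix_rate
    benefit = recovered_emails * conversion_rate * avg_order * operating_margin
    print(f"Fixing {fix_rate:.0%} of bad records adds roughly ${benefit:,.0f} in margin")
```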

If product data and inventory carrying cost or supplier spend are your issue, find out how many supplier shipments you receive every month and the average cost of a part (or a cost range), then apply the Aberdeen master data failure rate (71 in 1,000) to use cases around missing or incorrect supersession or alternate part data to assess the overspend on a single shipment.  You can also just take the ending inventory amount from the 10-K report and apply a 3-10% improvement (Aberdeen) in a top-down approach. Alternatively, apply 3.2-4.9% to your annual supplier spend (KPMG).
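A top-down sketch of those two benchmark approaches; the balance-sheet and spend figures below are assumed for illustration:

```python
# Hypothetical top-down sketch using the Aberdeen and KPMG benchmark ranges.
ending_inventory = 50_000_000         # from the 10-K (assumed figure)
annual_supplier_spend = 120_000_000   # assumed annual supplier spend

inventory_benefit = [ending_inventory * r for r in (0.03, 0.10)]      # Aberdeen: 3-10%
spend_benefit = [annual_supplier_spend * r for r in (0.032, 0.049)]   # KPMG: 3.2-4.9%

print(f"Inventory improvement: ${inventory_benefit[0]:,.0f} to ${inventory_benefit[1]:,.0f}")
print(f"Supplier spend improvement: ${spend_benefit[0]:,.0f} to ${spend_benefit[1]:,.0f}")
```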

You could also investigate the expediting or return cost of shipments in a period due to incorrectly aggregated customer forecasts, wrong or incomplete product information or wrong shipment instructions in a product or location profile. Apply Aberdeen’s 5% improvement rate and there you go.

Consider that a North American utility told us that just fixing the product information of their 200 Tier 1 suppliers increased the discounts they achieved from $14 million to $120 million. They also found that fixing one basic attribute out of sixty in a single part category saves them over $200,000 annually.

So what ROI percentages would you find tolerable or justifiable for, say, an EDW project, a CRM project, or a new claims system? What level of annual savings or new revenue would you be comfortable with?  And what was the craziest improvement you have seen come to fruition that nobody expected?

Next time, I will add some more “use cases” to the list and look at some philosophical implications of averages.


Becoming a Revenue Driven Business Model through Data is Painful for Government Agencies

Recently, I presented a Business Value Assessment to a client.  The findings were based on a revenue-generating state government agency. Everyone at the presentation was stunned to find out how much money was left on the table by not basing their activities on transactions that could be cleanly tied to the participating citizenry and a variety of channel partners. The assessment identified over $38 million in annual benefits, including partially recovered lost revenue, cost avoidance and cost reduction. Giving data a bigger role in this revenue-driven business model could have prevented this.

Should government leaders go to the “School of Data” to understand where more revenue can be created without necessary tax hikes? (Source:creativecommons.org)

Given the total revenue volume, this may seem small. However, after factoring in the modest technology effort required to “collect and connect” data from existing transactions, it is actually extremely high.

The real challenge for this organization will be the policy transformation required to turn it from “data-starved” into “data-intensive”. This would end the practice of basing strategic decisions about new products, locations and customers on surveys, which suffer from sampling errors, biases and the like. Surveys are also often delayed, making them practically ineffective in the real-time world we live in today.

Despite no applicable legal restrictions, the leadership’s main concern was that gathering more data would erode the public’s trust and positive image of the organization.

To be clear: by “more” data being collected by this type of government agency, I mean literally 10% of what any commercial retail entity has gathered on all of us for decades.  This is not the next NSA revelation, as any conspiracy theorist may fear.

While I respect their culturally driven self-censorship in the absence of legal barriers, it invites concern from their stakeholders (the state’s citizenry) over the agency’s performance.  To be clear, there would be no additional revenue for the state’s programs without more citizen data.  You may believe that they already know everything about you, including your income, property value and tax information. However, inter-departmental sharing of criminally non-relevant information is legally constrained.

Another interesting finding from this evaluation was that they had no sense of the conversion rate of their email and social media campaigns. Impressions and click-throughs, as well as hard/soft bounces, were considered more important than tracking who actually generated revenue.

This is a very market-driven organization compared to other agencies. It actually does try to measure itself like a commercial enterprise and attempts to change in order to generate additional revenue for state programs benefiting the citizenry. I can only imagine what non-revenue-generating agencies (local, state or federal) do in this respect.  Is revenue-oriented thinking something the DoD, DoJ or Social Security should subscribe to?

Think tanks and political pundits are now looking at the trade-off between bringing democracy to every backyard on our globe and its long-term budget ramifications. Given the U.S. federal debt level, the DoD is looking to reduce its active component to the lowest level in decades.

Putting the data bits and pieces together for revenue

A recent article in HBR explains that cost cutting has never sustained an organization’s growth over a longer period of time, but new revenue sources have. Is your company or government agency only looking at cost and personnel productivity?

Disclaimer:

Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.

 


Are You Getting an EPIC ROI? Retire Legacy Healthcare Applications!

Healthcare organizations are currently engaged in major transformative initiatives. The American Recovery and Reinvestment Act of 2009 (ARRA) provided the healthcare industry incentives for the adoption and modernization of point-of-care computing solutions including electronic medical and health records (EMRs/EHRs).   Funds have been allocated, and these projects are well on their way.  In fact, the majority of hospitals in the US are engaged in implementing EPIC, a software platform that is essentially the ERP for healthcare.

These Cadillac systems are being deployed from scratch, with very little data being ported from the old systems into the new.  The result is a glut of legacy applications running in aging hospital data centers, consuming every last penny of HIS budgets.  Because the data still resides on those systems, hospital staff continues to use them, making them difficult to shut down or retire.

Most of these legacy systems do not run on modern technology platforms – they run on the likes of HP Turbo Image, Intercache Mumps, and embedded proprietary databases.  Finding people who know how to manage and maintain these systems is costly and risky – risky in that the data residing in those applications may be subject to retention requirements (patient records, etc.) yet become inaccessible.

A different challenge for the CFOs of these hospitals is the ROI on these EPIC implementations.  Because these projects are multi-phased and multi-year, boards of directors are asking about the value realized from the investments.  Many are coming up short because they are maintaining the old and new applications in parallel.  Relief will come when the legacy systems can be retired – but getting hospital staff and regulators to approve a retirement project requires evidence that they can still access the data while adhering to compliance requirements.

Many providers have overcome these hurdles by successfully implementing an application retirement strategy based on the Informatica Data Archive platform.  Several of the largest children’s hospitals in the US are either already saving, or expecting to save, $2 million or more annually from retiring legacy applications.  The savings come from:

  • Eliminating software maintenance and license costs
  • Eliminating hardware dependencies and costs
  • Reducing storage requirements by 95% (archived data is stored in a highly compressed, accessible format)
  • Improving IT efficiency by eliminating specialized processes or skills associated with legacy systems
  • Freeing IT resources – teams can spend more of their time on innovations and new projects

Informatica Application Retirement Solutions for Healthcare give hospitals the ability to completely retire legacy applications while maintaining hospital staff access to the archived data.  And with built-in security and retention management, records managers and legal teams can satisfy compliance requirements.   Contact your Informatica Healthcare team for more information on how you can get the EPIC ROI the board of directors is asking for.


Helping Information Leaders Put Their Potential To Work

During Informatica World in early June, we were excited to announce our new Potential at Work Community.  You can read Jakki Geiger’s blog introducing the Community to learn more about the goals for this great resource.  (more…)


Two Sci-Fi Big Data Predictions From Over 50 Years Ago

Science fiction represents some of the most impactful stories I’ve read throughout my life.  By impactful, I mean the ideas have stuck with me 30 years since I last read them.  I recently recalled two of these stories and realized they represent two very different paths for Big Data.  One path, quite literally, was towards enlightenment.  Let’s just say the other path went in a different direction.  The amazing thing is that both of these stories were written between 50-60 years ago. (more…)


A key growth milestone for the Informatica Marketplace

Recently, the Informatica Marketplace reached a major milestone: we exceeded 1,000 Blocks (Apps). Looking back to three years ago when we started with 70 Blocks from a handful of partners, it’s an amazing achievement to have reached this volume of solutions in such a short time. For me, it speaks to the tremendous value that the Marketplace brings not only to our customers who download more than 10,000 Blocks per month, but also to our partners who have found in the Marketplace a viable route to market and a great awareness and monetization vehicle for their solutions.

There has been a lot of discussion around the explosion of data and what it means to companies trying to leverage this extremely valuable resource. Informatica has a huge part to play in helping customers solve those problems, not only through the technologies we provide directly, but through the tremendous ecosystem we have built with our partners. The Marketplace has grown to more than 165 unique partner companies, and we’re adding more every day. Blocks such as BI & Analytics using Social Media Data from Deloitte, and Interstage XWand – XBRL Processor from Fujitsu, represent offerings from large, established software companies, while Blocks such as Skybot Enterprise Job Scheduler and Undraleu Code Review Tool from Coeurdata are solutions contributed by earlier-stage companies that have experienced significant success and growth. It has been a pleasure helping these companies grow and reach new customers through the Marketplace.

One of the most exciting things about reaching the 1K Block milestone is not just the number of companies on the Marketplace, but the number of solutions contributed by our developer community. Blocks such as Autotype Excel Macro, Execute Workflow, and iExportNormalizer are all solutions that Informatica developers built because they help in their daily activities, and through the Marketplace they have found a way to share these valuable assets with the community. In fact, over half of our solutions are free to use, which is a ringing endorsement of the power of the community and a great way to try out any number of useful solutions at no risk. By leveraging enabling technologies such as Informatica’s Cloud Platform as a Service, developers can create and share solutions more quickly and easily than ever before.

Overall, it has been an exciting ride as the Marketplace has rocketed to 1,000 Blocks in under three years, and I look forward to what the next three years have in store!


Why Big Data Leaders are Made and Not Born

There are organizations truly reaping the rewards of Big Data, and then there are those who are just trying to catch up. What are the Big Data “leaders” doing that the “laggards” are missing? (more…)


Asset in, Garbage Out: Measuring data degradation

Following up on the discussion I started on GovernYourData.com (thanks to all who provided great feedback), here’s my full proposal on this topic: 

We all know about the “Garbage In/Garbage Out” reality that data quality and data governance practitioners have been fighting against for decades.   If you don’t trust data when it’s initially captured, how can you trust it when it’s time to consume or analyze it?  But I’m also looking at the tougher problem of data degradation.  The data comes into your environment just fine, but any number of actions, events – or inactions – turns that “good” data “bad”.   

So far I’ve been able to hypothesize eight root causes of data degradation.  I’d really love your feedback on both the validity and completeness of these categories.   I’ve used similar examples across a number of these to simplify. (more…)


9 KPIs for Top Retailers’ Multichannel Commerce – Infographic driven by PIM

When seeking to justify an investment in Product Information Management (PIM) and building the business case, companies can investigate which key performance indicators are impacted by PIM. The results of the international PIM study demonstrate that a return on investment (ROI) from an Enterprise Product Information Management (PIM) solution is possible within the framework of a multichannel commerce strategy.

1. Search engines
60% of web users use search engines to search for products. (Source: Searchengineland.com)

2. Time-to-market
Over 300 retailers and manufacturers from 17 countries participated in the extensive study by Heiler Software, which delivers more than 30 pages of measurable results. One such result is that PIM leads to a 25% faster time-to-market thanks to search-engine-optimized product content. (Source: www.pim-roi.com)

3. Shopping cart abandonment
90% of shopping cart abandonments occur because of poor product information. (Source: Internet World Business Magazine)

4. Product returns
40 is the critical number. 40% of buyers intend to return a product when they order it. 40% order more variations of a product. 40% of all product returns are due to poor product information. (Source: Magazine Wirtschaftswoche 7.1.2013 and Return Research  – average German mail-order market)

5. Print impacts online
Printed catalogs lead to a 30% boost in online sales. (Source: ECC multichannel survey)

6. Cost savings in print catalog publishing
PIM enables a saving of USD 280,000 by automating manual print catalog production. (Source: LNC PIM survey 2007)

7. In-store sales and customer service
61% of retail managers believe that shoppers are better connected to product information than in-store associates. (Source: Motorola Holiday Shopping Study 2012)

8. Margins with niche products
80% of Heiler PIM customers say they sell at higher margins by pursuing a long tail strategy and increasing assortment size. (Source: www.pim-roi.com)

9. Social sharing
Social sharing generates value. On average across all social networks, a social share drives $3.23 in additional revenue for an event each time someone shares. (Source: Social commerce numbers, October 23, 2012)



Build A Prioritized Data Management Roadmap

In my recent white paper, “Holistic Data Governance: A Framework for Competitive Advantage”, I aspirationally state that data governance should be managed as a self-sustaining business function, no different than Finance.  With this in mind, last year I chased down Earl Fry, Informatica’s Chief Financial Officer, and asked him how his team helps our company prioritize investments and resources.  Earl suggested I speak with the head of our enterprise risk management group – and I left inspired!   I was shown a portfolio management-style approach to prioritizing risk management investment.  It used an easy-to-understand, executive-friendly “heat map” dashboard that aggregates and summarizes the multiple dimensions we use to model risk.    I asked myself: if an extremely mature and universally relevant business function like Finance manages its business this way, can’t the emerging discipline of data governance learn from it? Here’s what I’ve developed… (more…)
