Tag Archives: Business Case
As I continue to counsel insurers about master data, they all agree immediately that it is something they need to get their hands around fast. If you ask participants in a workshop at any carrier, whether life, P&C, health or excess lines, they all raise their hands when I ask, “Do you have a broadband bundle at home for internet, voice and TV as well as wireless voice and data?”, followed by “Would you want your company to be the insurance version of this?”
Now let me be clear: while communication service providers offer very sophisticated bundles, they are still grappling with a comprehensive view of a client across all services (data, voice, text, residential, business, international, TV, mobile, etc.) and across each of their touch points (website, call center, local store). They are also miles away from including any sort of meaningful network data (jitter, dropped calls, failed call setups, etc.).
Similarly, my insurance investigations typically touch most of the frontline consumer (business and personal) contact points, including agencies, marketing (incl. CEM & VOC) and the service center. Across all of these we typically see a significant lack of productivity, because policy, billing, payments and claims systems are specific to a single service line, while supporting functions from lead development and underwriting to claims adjudication often handle more than one type of claim.
This lack of performance is made worse by sub-optimal campaign response and conversion rates. Because the CRM applications enabling each touchpoint also lack complete or consistent contact preference information, interactions may violate local privacy regulations. In addition, service centers may capture leads only to log them into a black-box AS/400 policy system, where they disappear.
Here again we often hear that the fix could be as simple as scrubbing data before it goes into the data warehouse. However, that cleansed data typically does not sync back to the source systems, so anyone interacting with a client via chat, phone or face-to-face will not have real-time, accurate information to execute a flawless transaction.
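As a toy illustration of this one-way scrubbing problem (the record fields, system names and values here are all invented for demonstration), cleansing only the warehouse copy leaves the operational record the agent actually sees untouched:

```python
# Hypothetical record in an operational policy system (the "source").
source_record = {"name": "J. Smith", "phone": "555-0123", "do_not_call": None}

def scrub(record):
    """Warehouse-side cleansing: standardize fields, resolve missing values."""
    cleaned = dict(record)                  # copy; the source is never touched
    cleaned["name"] = "John Smith"          # e.g. matched against a reference list
    cleaned["do_not_call"] = False          # e.g. resolved from a consent file
    return cleaned

# The clean copy feeds reports, but nothing flows back to the source.
warehouse_record = scrub(source_record)

# A call-center application still reads the source system directly:
agent_view = source_record
print(agent_view["do_not_call"])  # still None: the agent cannot tell if a call is permitted
```

A flawless interaction would require the cleansed values (or a shared master record) to be pushed back to, or served in real time to, the systems at each touchpoint.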
On the insurance IT side we also see enormous overhead, from scrubbing every database from source via staging to the analytical reporting environment every month or quarter, to one-off clean-up projects for the next acquired book of business. For a mid-sized, regional carrier (ca. $6B net premiums written) we find an average of $13.1 million in annual benefits from a central customer hub. This figure translates to an ROI of between 600% and 900%, depending on requirement complexity, distribution model, IT infrastructure and service lines, and it includes baseline revenue improvements, productivity gains, and cost avoidance as well as cost reduction.
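The ROI arithmetic behind a range like this can be sketched as follows. Only the $13.1M annual benefit figure comes from the text; the three-year horizon and the program cost range are hypothetical numbers chosen purely to illustrate how such a range arises:

```python
# Illustrative ROI arithmetic for a central customer hub.
ANNUAL_BENEFIT = 13_100_000   # average annual benefit for a ~$6B NPW carrier (from the text)
YEARS = 3                     # assumed benefit horizon
LOW_COST, HIGH_COST = 4_000_000, 5_600_000  # hypothetical total program cost range

def roi_pct(annual_benefit: float, cost: float, years: int = YEARS) -> float:
    """ROI as cumulative benefit net of cost, expressed as a percentage of cost."""
    return (annual_benefit * years - cost) / cost * 100

# With these assumed costs, the result lands roughly in the 600-900% band:
print(f"{roi_pct(ANNUAL_BENEFIT, HIGH_COST):.0f}% to {roi_pct(ANNUAL_BENEFIT, LOW_COST):.0f}%")
```

The spread in the range comes entirely from the cost side: the more complex the requirements, distribution model and IT landscape, the higher the program cost against the same benefit stream.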
On the health insurance side, my clients have complained about regional data sources contributing incomplete (often driven by local process and law) and incorrect data (name, address, etc.) to untrusted reports from membership, claims and sales data warehouses. This makes budgeting for items like nurse-staffed medical advice lines, sales compensation planning and even identifying high-risk members (now driven by the Affordable Care Act) a true mission impossible, and makes life challenging for the pricing teams.
Over in the life insurance category, whole and universal life plans now face a situation where high-value clients first encountered lower-than-expected yields, driven by the low-interest-rate environment on top of front-loaded fees and the front-loaded cost of the term component. Now, as bonds are forecast to decrease in value in the near future, publicly traded carriers will likely be forced to sell bonds before maturity to make good on term life commitments and whole life minimum-yield commitments and keep policies in force.
This means that insurers need a full profile of clients as they experience life changes like a move, a job loss, a promotion or a birth. Each such change calls for the proper mitigation strategy to protect a baseline of coverage while maintaining or improving the premium. This can range from splitting term from whole life to using managed investment portfolio yields to temporarily pad premium shortfalls.
Overall, without a true, timely and complete picture of a client and his or her personal and professional relationships over time, and of which strategies were presented, considered appealing and ultimately put in force, how will margins improve? Surely social media data can help here, but it should be a second step after mastering what is already available in-house. What are some of your experiences with how carriers have tried to collect and use core customer data?
Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations. While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer depends upon a variety of factors, many of which are not under Informatica’s control. Nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized, and no warranty or representation of success, either express or implied, is made.
In my recent white paper, “Holistic Data Governance: A Framework for Competitive Advantage”, I aspirationally state that data governance should be managed as a self-sustaining business function, no different from Finance. With this in mind, last year I chased down Earl Fry, Informatica’s Chief Financial Officer, and asked him how his team helps our company prioritize investments and resources. Earl suggested I speak with the head of our enterprise risk management group … and I left inspired! I was shown a portfolio-management-style approach to prioritizing risk management investment. It used an easy-to-understand, executive-friendly “heat map” dashboard that aggregates and summarizes the multiple dimensions we use to model risk. I asked myself: if an extremely mature and universally relevant business function like Finance manages its business this way, can’t the emerging discipline of data governance learn from it? Here’s what I’ve developed… (more…)
In this video, Rob Karel, vice president of product strategy, Informatica, outlines the Informatica Data Governance Framework, highlighting the 10 facets that organizations need to focus on for an effective data governance initiative:
- Vision and Business Case to deliver business value
- Tools and Architecture to support architectural scope of data governance
- Policies that make up the data governance function (security, archiving, etc.)
- Measurement: measuring the level of influence of a data governance initiative and measuring its effectiveness (business value metrics, ROI metrics, such as increasing revenue, improving operational efficiency, reducing risk, reducing cost or improving customer satisfaction)
- Change Management: incentives for the workforce, partners and customers to get better-quality data in, and potential repercussions if data quality falls short
- Organizational Alignment: how the organization will work together across silos
- Dependent Processes: identifying data lifecycles (how data is captured, purchased, updated and reported in your environment), all processes consuming the data, and the processes that store and manage it
- Program Management: effective program management skills to build out communication strategy, measurement strategy and a focal point to escalate issues to senior management when necessary
- Define Processes that make up the data governance function (discovery, definition, application and measuring and monitoring).
For more information from Rob Karel on the Informatica Data Governance Framework, visit his Perspectives blogs.
Do you have to justify your MDM or PIM investment? Does your CEO ask for ROI and business cases? Justify ROI & gain quick wins with these key steps that will help you build your individual MDM or PIM business case.
Gartner analysts Andrew White and Bill O’Kane wrote in their 2012 Magic Quadrant for Master Data Management of Product Data Solutions report: “The usage and focus (that is, MDM use case) of the product master data — ranging across use cases for design (information architecture), construction (‘building the business’), operations (‘running the business’) and analytics (‘reporting the business’).”
In response to the initial questions, I have recently published a white paper, “Build Your PIM Business Case”, which summarizes the key factors based on ten years’ experience in this market, analyst reports, and recent research. It aims to support project managers when they have to define their MDM business case for product data. Here is a brief outline:
1. Understand the big MDM picture: Define a first milestone and data domain to start with. Product data has direct impact on business results like conversion rates and product returns, to name two examples.
2. Think strategically, but act operationally: Define a target for a quick win. An example may be to update your online store with the most important product category, or to expand your multichannel strategy by adding a new channel. Get management support or a CEO commitment; at least one member of the management team should be an official sponsor of the project and the strategy. MDM is a strategic initiative, and data is a competitive advantage.
3. Follow steps recommended by analysts and experienced consultants. Gartner Research developed eight steps for building a PIM business case. I attended a Gartner session on it at the latest MDM summit.
4. Focus on implementation style and methodology: A successful PIM project relies on a specialized consulting methodology covering three key areas: business processes, technical implementation, and professional project management. More details from our Senior PIM Consultant Michael Weiss, along with the Heiler PIM Implementation Methodology, can be found in the white paper “Build Your PIM Business Case”.
5. Measure performance and results of PIM: Choose KPIs that match your company and industry, based on the best-practice KPI list from the PIM ROI research, and ask your vendor or integrator for examples and relevant benchmarks.
So goes the line in the 1999 Oliver Stone film, Any Given Sunday. In the film, Al Pacino plays Tony D’Amato, a “been there, done that” football coach who, faced with a new set of challenges, has to re-evaluate his tried and true assumptions about everything he had learned through his career. In an attempt to rally his troops, D’Amato delivers a wonderful stump speech challenging them to look for ways to move the ball forward, treating every inch of the field as something sacred and encouraging them to think differently about how to do so.
Ever wondered if an initiative is worth the effort? Ever wondered how to quantify its worth? This is a loaded question, as you may suspect, but I wanted to ask it nevertheless, as my team of Global Industry Consultants works with clients around the world to do just that (aka a Business Value Assessment, or BVA) for solutions anchored around Informatica’s products.
As these solutions typically involve multiple core business processes stretching over multiple departments and leveraging a legion of technology components like ETL, metadata management, business glossary, BPM, data virtualization, legacy ERP, CRM and billing systems, it initially sounds like a daunting level of complexity. Opening this can of worms may end in measurement fatigue (I think I just discovered a new medical malaise). (more…)
Finally, you need to create a business case and present the findings of the data quality checkup. There are two levels of presentation that typically take place after a data quality assessment. The first is a technical presentation to IT covering all the details of the completeness, conformity, consistency, accuracy, duplication, and integrity characteristics of the data. IT needs to understand the types of issues in order to determine what needs to be repaired, what can actually be fixed, and what it might cost.
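As a minimal sketch of what such a technical assessment measures, the snippet below computes two of the dimensions named above (completeness and duplication) over a handful of hypothetical customer records; the field names, match key and sample data are invented for illustration:

```python
# Hypothetical customer records for a data-quality profiling sketch.
records = [
    {"id": 1, "name": "Ann Lee",  "zip": "30301", "email": "ann@example.com"},
    {"id": 2, "name": "Bob Cruz", "zip": None,    "email": "bob@example.com"},
    {"id": 3, "name": "Ann Lee",  "zip": "30301", "email": "ann@example.com"},  # duplicate
]

def completeness(records, field):
    """Share of records with a non-empty value for `field`."""
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def duplication_rate(records, key_fields=("name", "zip")):
    """Share of records whose match key has already been seen earlier."""
    seen, dupes = set(), 0
    for r in records:
        key = tuple(r.get(f) for f in key_fields)
        if key in seen:
            dupes += 1
        seen.add(key)
    return dupes / len(records)

print(f"zip completeness: {completeness(records, 'zip'):.0%}")  # 67%
print(f"duplication rate: {duplication_rate(records):.0%}")     # 33%
```

A real assessment tool profiles the remaining dimensions (conformity, consistency, accuracy, integrity) the same way: each becomes a measurable rate that IT can track and cost out.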
The more important presentation is about the impact these issues are having on the business. Does the lack of accuracy in the data affect the accuracy of business decisions? How does the completeness of the data affect insurance ratings, loan applications, or well drilling decisions? Are your customers committing a crime? (more…)
Of my recent series of papers on the value of data quality improvement, the first focused on the economic and financial aspects of data quality improvement. One goal of the paper was to show that if you iteratively drill down along the different economic value dimensions and look at how information contributes to organizational success, you can establish a link between data failures and the success of business and operational processes. For example, when looking at cost reduction as the high-level value dimension, we saw that in an attempt to reduce the spend on particular products through better vendor negotiations, duplicate product entries in the supplier catalog prevented an accurate review of the costs of individual items as well as classes of items. This inconsistency undermined the ability to achieve the cost reductions.
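The catalog example can be sketched concretely. The product descriptions, spend figures and the crude match rule below are all invented; a real MDM match engine is far more sophisticated, but the mechanism is the same:

```python
from collections import defaultdict

# Hypothetical supplier-catalog purchase lines: the same bolt was
# entered twice under slightly different descriptions.
purchases = [
    ("HEX BOLT M8 X 40", 1200.0),
    ("Hex Bolt M8x40",    800.0),   # duplicate entry for the same item
    ("WASHER M8",         300.0),
]

def spend_by_item(rows, normalize=lambda s: s):
    """Aggregate spend per (optionally normalized) item description."""
    totals = defaultdict(float)
    for desc, spend in rows:
        totals[normalize(desc)] += spend
    return dict(totals)

# Raw view: the bolt's spend is split across two lines, so neither line
# looks large enough to prioritize in vendor negotiations.
raw = spend_by_item(purchases)          # reports 3 "items"

# A crude match rule (lowercase, drop whitespace) stands in for a real
# match engine; the duplicate now rolls up into one negotiable total.
crude_match = lambda s: "".join(s.lower().split())
deduped = spend_by_item(purchases, normalize=crude_match)  # reports 2 items
```

Fragmented spend totals are exactly what "reduced the ability to do an accurate review of costs" looks like in practice: the negotiating leverage exists in the data but is invisible until the duplicates are resolved.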
We launched a coast-to-coast Customer Data Forum road show with visits to Atlanta and Washington, D.C., that attracted business and IT professionals interested in using master data management (MDM) to attract and retain customers.
From the business side, our guests consisted of analysts, sales operations personnel, and business liaisons to IT, while the IT side was represented by enterprise and data architects, IT directors, and business intelligence and data warehousing professionals. In Washington, about half the audience was from public sector and government agencies. (more…)
Building a business case for data quality is a waste of time. Nobody really cares. Improving data quality for quality’s sake is a waste of money. Sounds funny coming from a data quality specialist, someone who has spent the last decade preaching data profiling and data quality. But the fact is people from the business side do not care about data quality. What they care about is the impact poor data quality has on their line of business.
When you look at how the business measures itself (after you get past revenue and profit), the talk is about key performance indicators (KPI). What are some of the KPIs for a call center? You will hear about goals of reducing talk time. The business wants to lower costs. You will hear about goals of decreasing hold times. The business wants to improve the customer experience. (more…)