Category Archives: Customers

Understand Customer Intentions To Manage The Experience

I recently had a lengthy conversation with a business executive at a European telco.  His biggest concern was not only understanding the motivations and related characteristics of consumers, but gaining that insight much faster than before.  Given available resources and current priorities, this remains out of reach for many operators.

Unlike a few years ago – remember the time before the iPad – his organization today is awash with data points from millions of devices, hundreds of device types and many applications.

What will he do next?

One way for him to understand consumer motivation, and therefore intentions, is to get a better view of a user’s network and all related interactions and transactions.  This includes the user’s family household, friends and business network (also a type of household).  The purpose of householding is to capture social and commercial relationships among a group of individuals (or businesses, or both) in order to identify patterns (context), which can then be exploited to serve a customer the right individual product or bundle upsell, and to push relevant apps, audio and video content.
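To make householding less abstract, here is a minimal sketch of what such a grouping might look like once contact and interaction data have been linked. All records, field names and the profile function are illustrative assumptions for this post, not an actual operator data model.

```python
from collections import defaultdict

# Illustrative subscriber records; in practice these would come from CRM,
# billing and network (CDR) sources after cleansing and matching.
subscribers = [
    {"id": "S1", "name": "Anna Meyer",  "household": "H1", "role": "payer"},
    {"id": "S2", "name": "Jonas Meyer", "household": "H1", "role": "user"},
    {"id": "S3", "name": "Lena Meyer",  "household": "H1", "role": "user"},
]

# Interactions (calls, texts, app usage) between subscribers.
interactions = [
    {"from": "S1", "to": "S2", "type": "call", "minutes": 12},
    {"from": "S2", "to": "S3", "type": "text", "minutes": 0},
    {"from": "S1", "to": "S3", "type": "call", "minutes": 4},
]

def household_profile(household_id):
    """Aggregate members and their intra-household interaction volume."""
    members = [s for s in subscribers if s["household"] == household_id]
    member_ids = {s["id"] for s in members}
    volume = defaultdict(int)
    for i in interactions:
        if i["from"] in member_ids and i["to"] in member_ids:
            volume[(i["from"], i["to"])] += 1
    return {"members": members, "intra_household_contacts": dict(volume)}

print(household_profile("H1"))
```

Even a toy profile like this exposes the context the post is after: who belongs together, in which role, and how intensively they interact.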

Let’s add another layer of complexity: understand not only who a subscriber is, who he knows, how often he interacts with these contacts, and which services he can access via one or more devices, but also where he physically is at the moment he interacts.  You may also combine this with customer service and (summarized) network performance data to understand who is high-value, high-overhead and/or high in customer experience.  Most importantly, you will also be able to assess who will do what next, and why.

Some of you may be thinking “Oh gosh, the next NSA program in the making”.  Well, it may sound like it, but the reality is that this data is already out there, available and interpretable if it is cleaned up, structured, linked and served in real time.  Data quality, ETL, analytical and master data systems provide the data backbone for this reality, while process-based systems that handle the systematic, real-time engagement of consumers are the tools that make it actionable.  If you add some sort of privacy rules using database- or application-level masking technologies, most of us would feel more comfortable about this proposition.
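As a rough illustration of the application-level masking idea, the sketch below replaces direct identifiers with salted one-way hashes before records leave the data backbone. The field names, salt handling and token length are assumptions for the example only, not a recommended masking design.

```python
import hashlib

def mask_record(record, mask_fields=("name", "msisdn"), salt="demo-salt"):
    """Return a copy of the record with direct identifiers replaced by
    salted one-way hashes, so records can still be joined but not read."""
    masked = dict(record)
    for field in mask_fields:
        if field in masked and masked[field] is not None:
            digest = hashlib.sha256((salt + str(masked[field])).encode()).hexdigest()
            masked[field] = digest[:12]  # short token for readability
    return masked

record = {"name": "Anna Meyer", "msisdn": "+491701234567", "avg_bill": 54.90}
print(mask_record(record))  # identifiers are tokenized, usage data stays usable
```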

This may feel like a massive project but, as with many things in IT life, it depends on how you scope it.  I am a big fan of incrementally mastering more and more attributes for certain customer segments, business units and geographies, so that lessons learned can be replicated over and over to scale.  Moreover, I am a big fan of figuring out what you are trying to achieve before even attempting to tackle it.

The beauty of a “small” data backbone – more about “small data” in a future post – is that if a certain concept does not pan out in terms of effort or result, you have only wasted a small pile of cash instead of $2 million on a complete throw-away.  For example: if you initially decided that the central linchpin in your household hub-and-spoke is the person who owns the most contracts with you, rather than the person who pays the bills every month or the one with the largest average monthly bill, moving to an alternative perspective does not impact all services, all departments and all clients.  Nevertheless, the role of each user in the network must be defined over time to achieve context, i.e. who is a contract signee, who is a payer, who is a user, who is an influencer, who is an employer, etc.
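The point that the hub definition is a swappable rule rather than a structural commitment can be shown with a small sketch; the member attributes and selection rules below are illustrative assumptions.

```python
members = [
    {"id": "S1", "contracts": 1, "pays_bill": True,  "avg_monthly_bill": 95.0},
    {"id": "S2", "contracts": 3, "pays_bill": False, "avg_monthly_bill": 40.0},
    {"id": "S3", "contracts": 1, "pays_bill": False, "avg_monthly_bill": 20.0},
]

# Alternative hub definitions; switching rules should not require remodeling
# the whole household, only re-running the selection.
hub_rules = {
    "most_contracts":   lambda m: m["contracts"],
    "bill_payer":       lambda m: 1 if m["pays_bill"] else 0,
    "highest_avg_bill": lambda m: m["avg_monthly_bill"],
}

def pick_hub(members, rule):
    """Pick the household hub according to the chosen rule."""
    return max(members, key=hub_rules[rule])["id"]

for rule in hub_rules:
    print(rule, "->", pick_hub(members, rule))
```

Switching from “most contracts” to “bill payer” is then a matter of re-running the selection, not rebuilding the household model.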

Why is this important to a business? Because without knowing who consumes, who pays for, and who influences the purchase or change of a service or product, how can one create the right offers and target them to the right individual?

However, in order to make this initial call about household definition and scope, or even to assess which options are available and sensible, you have to look at social and cultural conventions, what you are trying to accomplish commercially, and your current data set’s ability to achieve anything without a massive enrichment program.  A couple of years ago, at a Middle Eastern operator, it was very clear that the local patriarchal society dictated that the center of this hub-and-spoke model was the oldest, non-retired male in the household, as all contracts down to those of children of cousins would typically run under his name.  The goal was to capture extended family relationships more accurately and completely in order to create and sell new family-type bundles for greater market penetration and to maximize usage of new bandwidth capacity.

As a parallel track, aside from further rollout to other departments, customer segments and geographies, you may also want to start thinking like another European operator I engaged a couple of years ago.  They were trying to outsource some data validation and enrichment to their subscribers, which allowed for a more accurate and timely capture of changes, often lifestyle changes (moves, marriages, a new job).  The operator could then offer new bundles and roaming upsells.  As a side effect, it also created a sense of empowerment and engagement in the client base.

I see bits and pieces of this in use when I switch on my home communication systems, running a broadband signal through my Xbox or set-top box into my TV for Netflix, Hulu and gaming.  Moreover, a US cable operator actively promotes a “moving” package to help make sure you do not miss a single minute of entertainment when relocating.

Every time I switch on my TV now, I get content suggested to me.  If telecommunication services were a bit more competitive in the US (an odd thing to say in every respect) and prices came down to European levels, I would actually take advantage of the offer.  And then there is the log-on pop-up asking me to subscribe to (or troubleshoot) a channel I have already subscribed to.  I wonder who, or what automated process, switched that flag.

Ultimately, there cannot be a good customer experience without understanding customer intentions.  I would love to hear stories from other practitioners on what they have seen in this respect.


Where Is My Broadband Insurance Bundle?

As I continue to counsel insurers about master data, they all agree immediately that it is something they need to get their hands around fast.  Ask workshop participants at any carrier, whether life, P&C, health or excess, and they all raise their hands when I ask, “Do you have a broadband bundle at home for internet, voice and TV, as well as wireless voice and data?”, followed by “Would you want your company to be the insurance version of this?”

Buying insurance like broadband

Now let me be clear: while communication service providers offer very sophisticated bundles, they are still grappling with a comprehensive view of a client across all services (data, voice, text, residential, business, international, TV, mobile, etc.) and each of their touch points (website, call center, local store).  They are also miles away from including any sort of meaningful network data (jitter, dropped calls, failed call setups, etc.).

Similarly, my insurance investigations typically touch most of the frontline consumer (business and personal) contact points, including agencies, marketing (incl. CEM & VOC) and the service center.  Across all of these we typically see a significant lack of productivity, given that policy, billing, payments and claims systems are service-line specific, while supporting functions, from developing leads and underwriting to claims adjudication, often handle more than one type of claim.

This lack of performance is compounded by sub-optimal campaign response and conversion rates.  Because touchpoint-enabling CRM applications also lack complete or consistent contact preference information, interactions may violate local privacy regulations. In addition, service centers may capture leads only to log them into a black-box AS/400 policy system, never to be seen again.
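A hedged sketch of the kind of consent check a touchpoint application would need before any outreach; the preference store, customer ids and channel names are invented for illustration.

```python
# Illustrative contact-preference store; in reality this would be mastered
# centrally and synchronized to every touchpoint (CRM, call center, web).
preferences = {
    "C100": {"email": True,  "phone": False, "sms": False},
    "C200": {"email": False, "phone": False, "sms": False},  # full opt-out
}

def may_contact(customer_id, channel):
    """Return True only if an explicit opt-in exists for this channel."""
    return preferences.get(customer_id, {}).get(channel, False)

campaign_targets = [("C100", "email"), ("C100", "sms"), ("C200", "email")]
allowed = [(cid, ch) for cid, ch in campaign_targets if may_contact(cid, ch)]
print(allowed)  # only ('C100', 'email') survives the consent check
```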

Here again we often hear that the fix could simply be to scrub data before it goes into the data warehouse.  However, the data typically does not sync back to the source systems, so any interaction with a client via chat, phone or face-to-face will not have real-time, accurate information with which to execute a flawless transaction.

On the insurance IT side we also see enormous overhead: from scrubbing every database (from source via staging to the analytical reporting environment) every month or quarter, to one-off clean-up projects for the next acquired book of business.  For a mid-sized, regional carrier (ca. $6B net premiums written) we find an average of $13.1 million in annual benefits from a central customer hub.  This translates into an ROI of between 600% and 900%, depending on requirement complexity, distribution model, IT infrastructure and service lines.  The number includes some baseline revenue improvements, productivity gains, and cost avoidance as well as cost reduction.
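The 600-900% range follows from simple arithmetic once a project cost is assumed against the $13.1 million annual benefit; the cost figures below are purely illustrative assumptions used to show the calculation, not numbers from any engagement.

```python
annual_benefit = 13_100_000  # from the mid-sized carrier example above

# Illustrative project cost assumptions; actual cost depends on requirement
# complexity, distribution model, IT infrastructure and service lines.
for project_cost in (1_310_000, 1_870_000):
    roi = (annual_benefit - project_cost) / project_cost
    print(f"cost ${project_cost:,.0f} -> ROI {roi:.0%}")
# cost $1,310,000 -> ROI 900%
# cost $1,870,000 -> ROI 601%
```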

On the health insurance side, my clients have complained about regional data sources contributing incomplete (often driven by local process and law) and incorrect data (name, address, etc.) to untrusted reports from membership, claims and sales data warehouses.  This makes budgeting for items like nurse-staffed medical advice lines, sales compensation planning, and even identifying high-risk members (now driven by the Affordable Care Act) a true mission impossible, and makes the life of the pricing teams challenging.

Over in the life insurance category, whole and universal life plans now face a situation where high-value clients first saw lower-than-expected yields due to the low interest rate environment, on top of front-loaded fees and the front-loading of the cost of the term component.  Now, as bonds are forecast to decrease in value in the near future, publicly traded carriers will likely be forced to sell bonds before maturity to make good on term life commitments and whole life minimum yield commitments and keep policies in force.

This means that insurers need a full profile of clients as they experience life changes like a move, the loss of a job, a promotion or a birth.  Such changes require a proper mitigation strategy that can be employed to protect a baseline of coverage in order to maintain or improve the premium.  This can range from splitting term from whole life to using managed investment portfolio yields to temporarily pad premium shortfalls.

Overall, without a true, timely and complete picture of a client and his or her personal and professional relationships over time, and of which strategies were presented, considered appealing and ultimately put in force, how will margins improve?  Surely social media data can help here, but it should be a second step, taken after mastering what is already available in-house.  What are some of your experiences with how carriers have tried to collect and use core customer data?

Disclaimer:
Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer depends upon a variety of factors, many of which are not under Informatica’s control; nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized, and no warranty or representation of success, either express or implied, is made.

Inside Dreamforce ’13: Why sales may be building 360-degree customer views outside Salesforce

I had a disturbing conversation at Dreamforce. Long story short, thousands of highly skilled and highly paid financial advisors (read sales reps) at a large financial services company are spending most of their day pulling together information about their clients in a spreadsheet, leaving only a few hours to engage with clients and generate revenue.

Not all valuable customer information is in Salesforce


Are you in sales? Your time is too valuable to squander on jobs technology can do. Stop building 360-degree customer views in spreadsheets. Get it delivered in Salesforce.

Why? They don’t have a 360-degree customer view within Salesforce.

Why not? Not all client information that’s valuable to the financial advisors is in Salesforce.  Important client information is in other applications too, such as:

  • Marketing automation application
  • Customer support application
  • Account management applications
  • Finance applications
  • Business intelligence applications

Are you in sales? Do you work for a company that has multiple products or lines of business? Then you can probably relate. In my 15 years of experience working with sales, I’ve found this to be a harsh reality. You have to manually pull together customer information, which is a time-consuming process that doesn’t boost job satisfaction.

Stop building 360-degree customer views in spreadsheets 

So what can you do about it? Stop building 360-degree customer views in spreadsheets. There is a better way and your sales operations leader can help.

One of my favorite customer success stories is about one of the world’s leading wealth management companies, with 16,000 financial advisors globally.  Like most companies, their goal is to increase revenue by understanding their customers’ needs and making relevant cross-sell and up-sell offers.

But, the financial advisors needed an up-to-date view of the “total customer relationship” with the bank before they talked to their high net-worth clients. They wanted to appear knowledgeable and offer a product the client might actually want.

Can you guess what was holding them back? The bank operated in an account-centric world. Each line of business had its own account management application. To get a 360-degree customer view, the financial advisors spent 70% of their time pulling important client information from different applications into spreadsheets. Sound familiar?
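For illustration only, here is a minimal sketch of the consolidation the advisors were doing by hand; the system extracts and the shared customer id are assumptions, and the real solution described next delivers this view through MDM inside Salesforce rather than ad-hoc code.

```python
# Illustrative extracts from line-of-business systems, keyed by a master
# customer id that an MDM hub would normally assign and maintain.
marketing = {"C42": {"last_campaign": "Retirement webinar", "responded": True}}
support   = {"C42": {"open_cases": 1, "last_case": "Online access locked"}}
accounts  = {"C42": {"products": ["Brokerage", "IRA"], "aum": 1_250_000}}

def customer_360(customer_id, *sources):
    """Fold per-system records into one flat view for the advisor."""
    view = {"customer_id": customer_id}
    for source in sources:
        view.update(source.get(customer_id, {}))
    return view

print(customer_360("C42", marketing, support, accounts))
```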

Once the head of sales realized this, he decided to invest in information management technology that provides clean, consistent and connected customer information and delivers a 360-degree customer view within Salesforce.

The result? They’ve had a $50 million impact annually and a 30% increase in productivity. In fact, word spread to other banks and the 360-degree customer view in Salesforce became an incentive to attract top talent in the industry.

Ask sales operations to give you 360-degree customer views within Salesforce

I urge you to take action. In particular, talk to your sales operations leader if he or she is at all interested in improving performance and productivity, acquiring and retaining top sales talent, and cutting costs.

Want to see how you can get 360-degree customer views in Salesforce? Check out this demo: Enrich Customer Data in Your CRM Application with MDM. Then schedule a meeting with your sales operations leader.

Have a similar experience to share? Please share it in the comments below.


Squeezing the Value out of the Old Annoying Orange

I believe most in the software business will agree that it is tough enough to calculate, and hence financially justify, the purchase or build of an application – especially middleware – to a business leader or even a CIO.  Most business-centric IT initiatives involve improving processes (order, billing, service) and visualization (scorecarding, trending) so that end users can be more efficient in engaging accounts.  Some of these have actually migrated to targeting improvements at customers rather than their logical placeholders, like accounts.  Similar strides have been made in the realm of other party types (vendor, employee) as well as product data.  These initiatives also tackle analyzing larger or smaller data sets and providing a visual set of clues on how to interpret historical or predictive trends in orders, bills, usage, clicks, conversions, etc.

Squeeze that Orange

If you think this is a tough enough proposition in itself, imagine the challenge of quantifying the financial benefit derived from understanding where your “hardware” is physically located, how it is configured, who maintained it, when and how.  Depending on the business model, you may even have to figure out who built it or who owns it.  All of this has bottom-line effects on how, by whom and when expenses are paid and revenues are realized and recognized.  And then there is the added complication that these dimensions of hardware are often fairly dynamic: assets can change ownership and/or physical location and, with that, tax treatment, insurance risk, etc.

Such hardware could be a pump, a valve, a compressor, a substation, a cell tower, a truck, or components within these assets.  Over time, as new technologies and acquisitions come about, the systems that plan for, install and maintain these assets become very departmentalized in scope and specialized in function.  The application that designs an asset for department A or region B is not the one accounting for its value, which is not the one reading its operational status, which is not the one scheduling maintenance, which is not the one billing for any repairs or replacement.  The same folks who said the data warehouse was the “Golden Copy” now say the “new ERP system” is the new central source for everything.  Practitioners know that this is either naiveté or maliciousness.  And then there are manual adjustments…

Moreover, to truly squeeze value out of these assets as they are installed and upgraded, the massive amounts of data they generate in a myriad of formats and intervals need to be understood, moved, formatted, fixed, interpreted at the right time, and stored for future use in a cost-sensitive, easy-to-access and contextually meaningful way.
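A small sketch of what “understood, moved, formatted, fixed” can mean in practice: two feeds for the same asset arrive in different formats and units and are normalized into one structure. The feed layouts, field names and the psi-to-kPa conversion scenario are illustrative assumptions.

```python
import csv, io, json

# Readings for the same pump arrive as JSON from a newer feed and as CSV
# from a legacy historian; units and field names differ (illustrative).
json_feed = '{"asset": "PUMP-17", "ts": "2014-01-06T10:00:00Z", "pressure_kpa": 412}'
csv_feed = "asset_id,timestamp,pressure_psi\nPUMP-17,2014-01-06 10:05:00,60.1\n"

def from_json(raw):
    r = json.loads(raw)
    return {"asset": r["asset"], "ts": r["ts"], "pressure_kpa": float(r["pressure_kpa"])}

def from_csv(raw):
    rows = csv.DictReader(io.StringIO(raw))
    return [{"asset": r["asset_id"], "ts": r["timestamp"],
             "pressure_kpa": float(r["pressure_psi"]) * 6.895}  # psi -> kPa
            for r in rows]

normalized = [from_json(json_feed)] + from_csv(csv_feed)
print(normalized)  # one consistent structure and unit for downstream use
```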

I wish I could tell you that one application does it all, but the unsurprising reality is that it takes a concoction of several.  Few if any legacy applications supporting the asset life cycle will be retired, as they often house data in formats commensurate with the age of the assets they were built for.  It makes little financial sense to shut down these systems in a big-bang approach; rather, migrate region after region and process after process to the new system.  After all, some of the assets have been in service for 50 or more years, and the institutional knowledge tied to them is nearly as old.  It is also probably easier to make the often-required manual data fixes (hopefully only outliers) bit by bit, especially to accommodate imminent audits.

So what do you do in the meantime, before all the relevant data is in a single system, to get an enterprise-level way to fix your asset tower of Babel and leverage the data volume rather than treat it like an unwanted stepchild?  Most companies operating asset-heavy, fixed-cost business models do not want to create a disruption but rather a steady tuning effect (squeezing the data orange), something rather unsexy in this internet day and age.  This is especially true in “older” industries where data is still considered a necessary evil, not an opportunity ready to be exploited.  The fact is, though, that in order to improve the bottom line we had better get going, even if it is with baby steps.

If you are aware of business models that struggle to leverage their data, write to me.  If you know of an annoying, peculiar or esoteric data “domain” that does not lend itself to being easily leveraged, share your thoughts.  Next time, I will share some examples of how certain industries try to work in this environment, what they envision and how they go about getting there.


Improving CMS Star Ratings… The Secret Sauce

Many of our customers are Medicare health plans and one thing that keeps coming up in conversation is how they can transform business processes to improve star ratings. For plans covering health services, the overall score for quality of those services covers 36 different topics in 5 categories:

1. Staying healthy: screenings, tests, and vaccines

2. Managing chronic (long-term) conditions

3. Member experience with the health plan

4. Member complaints, problems getting services, and improvement in the health plan’s performance

5. Health plan customer service

Based on member feedback and activity in each of these areas, the health plans receive a rating (1-5 stars), which is published and made available to consumers. These ratings play a critical role in plan selection each fall. The rating holds obvious value, as consumers are increasingly “Yelp-minded,” meaning they look to online reviews from peer groups to make buying decisions. Even with this realization, though, improving ratings is a challenge. There are the typical complexities of any survey: capturing a representative respondent pool, members who may be negatively swayed by a single event, and common emotional biases. There are also less obvious challenges associated with the data.

For example, a member with CHF may visit north of eight providers in a month and may or may not follow through on prescribed preventive care measures. How does CMS successfully capture the clinical and administrative data on each of these visits when patient information may be captured differently at each location? How does the health plan ensure that the CMS interpretation matches its own interpretation of the visit data? In many cases, our customers have implemented an enterprise data warehouse and are doing some type of claims analysis, but this analysis requires capturing new data and analyzing data in new ways.
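To make the record-matching challenge concrete, here is a deliberately crude sketch of linking the same member across providers; the records and the match rule (normalized name similarity plus exact date of birth) are illustrative assumptions, not how CMS or any plan actually matches members.

```python
import re
from difflib import SequenceMatcher

# The same CHF member as captured by three different providers (illustrative).
records = [
    {"source": "cardiology", "name": "Robert T. Smith", "dob": "1948-03-02"},
    {"source": "primary",    "name": "SMITH, ROBERT",   "dob": "1948-03-02"},
    {"source": "pharmacy",   "name": "Rob Smith",       "dob": "1948-02-03"},  # transposed DOB
]

def normalize(name):
    name = name.lower()
    if "," in name:                      # "last, first" -> "first last"
        last, first = [p.strip() for p in name.split(",", 1)]
        name = f"{first} {last}"
    return re.sub(r"[^a-z ]", "", name)

def likely_same(a, b):
    """Crude match rule: similar normalized names and identical birth dates."""
    name_score = SequenceMatcher(None, normalize(a["name"]), normalize(b["name"])).ratio()
    return name_score > 0.7 and a["dob"] == b["dob"]

print(likely_same(records[0], records[1]))  # True  -> treated as the same member
print(likely_same(records[0], records[2]))  # False -> flagged for review
```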

We hear that those responsible for member ratings, retention and acquisition routinely wait more than six months to have a source or new data added to a reporting database. The cycle time is too long to make a quick and meaningful impact on the ratings.

Let’s continue this discussion next week during your morning commute.

Join me as I talk with Frank Norman, a Healthcare Partner at Knowledgent.

During this “drive time” webinar series, health plans will learn how to discover insights to improve CMS Star ratings.

Part 1 of the webinar series: Top 5 Reasons Why Improving CMS Star Ratings is a Challenge

Part 2 of the webinar series: Using Your Data to Improve CMS Star Ratings

Part 3 of the webinar series: Automating Insights into CMS Star Ratings


Data Integration and Enterprise Success, a Winning Combination

Interesting that I found this one.  Informatica announced that two Informatica customers were named Leaders in the Ventana Research 2013 Leadership Awards, which honor the leaders and pioneers who have contributed to their organizations’ successes.  While many of you may think that I’m shilling for our host, these stories are actually hard to come by, and when I find them I love to make hay.

This is not for lack of interest; it’s just that those who succeed with data integration projects are typically the unsung heroes of enterprise IT.  There are almost never awards.  However, those who count on enterprise IT to provide optimal data flow in support of business processes should understand that it was the data integration that got them there.  In this case, one of the more interesting stories was around UMASS Memorial Health Care: the Leadership Award for CIO George Brenckle.

“The Ventana Research Leadership Awards recognize organizations and supporting technology vendors that have effectively achieved superior results through using people, processes, information and technology while applying best practices within specific business and technology categories.”  Those receiving these awards leverage the Informatica Platform, which is why the awards are being promoted here.  However, I find the approaches and the way the technology is leveraged most interesting.

Just as a bit of background: UMASS Memorial Health Care undertook the Cornerstone initiative to transform the way data is used across its medical center, four community hospitals, more than 60 outpatient clinics, and the University of Massachusetts Medical School.  The geographical distribution of these entities, and the different ways in which they store data, is always the challenge.

When approaching these problems you need two things: first, a well-defined plan for how you will approach the problem, including the consumption of information from the source, the processing of that information, and the production of that information to the target; and second, integration technology flexible enough to execute that plan.  Cornerstone implements common patient, clinical and financial systems and drives information across these systems to optimize healthcare delivery, improve patient outcomes, grow the patient population and increase efficiency.

UMASS Memorial Health Care used Informatica to establish a data integration and data quality initiative incorporating data from its clinical, financial and administrative sources and targets.  Using the Informatica technology, they are able to place volatility in the domain of the integration layer.  This allows the integration administrator to add or delete systems as needed, and brings agility to a rapidly growing hospital system.
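As a conceptual sketch only (not Informatica’s actual tooling or API), the idea of placing volatility in the integration layer can be pictured as source systems registered as configuration, so that adding or retiring one does not touch the pipeline logic. All names and the toy extract/transform/load functions below are assumptions for illustration.

```python
# Illustrative stand-in for a config-driven integration layer: sources are
# registered as data, so adding or retiring a system changes configuration,
# not code. This is a sketch of the idea, not Informatica's API.
sources = [
    {"name": "clinical_emr", "kind": "hl7",  "enabled": True},
    {"name": "billing_erp",  "kind": "jdbc", "enabled": True},
    {"name": "legacy_labs",  "kind": "csv",  "enabled": False},  # retired system
]

def run_pipeline(extract, transform, load):
    """Consume from each enabled source, process, and produce to the target."""
    for src in (s for s in sources if s["enabled"]):
        for record in extract(src):
            load(transform(record, src))

# Toy implementations to keep the sketch runnable.
extract   = lambda src: [{"source": src["name"], "patient_id": "P1"}]
transform = lambda rec, src: {**rec, "quality_checked": True}
load      = print

run_pipeline(extract, transform, load)
```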

“The success of Cornerstone has resulted in primary patient panel analytics, online diabetes care, a command center for the ICU, and compliance with Medicare programs.”  Indeed, the business results of these data integration approaches and technology are apparent, including the ability to trace the project back to an immediate and definable business benefit.


Informatica World 2013: What’s Happening in the World of Data Governance?

It’s never been a more exciting time to be in the data management industry.  With more and more organizations looking to make the most of their enterprise data, the need to do things better, faster and cheaper is ever increasing.  It should come as no surprise, then, that data governance continues to be an important initiative surfacing across industries of all types.  Through data governance, organizations are looking to unleash the true potential of their data and leverage it for competitive advantage.

Since this is such an important and relevant topic these days, we have a number of sessions next week at Informatica World designed to help your organization drive success with your data governance initiative, regardless of whether you’re just getting started or are looking to drive improvement in your current program.  Here are just a few of the highlights surrounding data governance next week:

 Holistic Data Governance: A Framework for Competitive Advantage

  • Learn how to make data governance a competitive differentiator that identifies critical business processes, decisions, and interactions and establishes policies, processes, roles, responsibilities, and architectures to support them with trusted, secure data.

Holistic Data Governance: Customer Roundtable

  • Rob Karel, VP of Strategy at Informatica, moderates this panel discussion featuring customers detailing their successful implementations of holistic data governance.

Data Governance in Action at Wells Fargo

  • Learn about the bank’s journey to a successful data governance program, the supporting role Informatica has played, and what lessons other organizations can take away from Wells Fargo’s experiences.

This is just a sample of what’s in store.  In addition to compelling sessions, you’ll also have the opportunity to hear and talk with Informatica executives and several of our customers who can help you on your journey to data governance success.

It’s not too late to register for Informatica World; check out the registration page for full program details.  I hope to see you there next week in Las Vegas!


A key growth milestone for the Informatica Marketplace

Recently, the Informatica Marketplace reached a major milestone: we exceeded 1,000 Blocks (Apps). Looking back to three years ago when we started with 70 Blocks from a handful of partners, it’s an amazing achievement to have reached this volume of solutions in such a short time. For me, it speaks to the tremendous value that the Marketplace brings not only to our customers who download more than 10,000 Blocks per month, but also to our partners who have found in the Marketplace a viable route to market and a great awareness and monetization vehicle for their solutions.

There has been a lot of discussion around the explosion of data and what it means to companies trying to leverage this extremely valuable resource. Informatica has a huge part to play in helping customers solve those problems, not only through the technologies we provide directly, but through the tremendous ecosystem that we have built with our partners. The Marketplace has grown to more than 165 unique partner companies, and we’re adding more every day. Blocks such as BI & Analytics using Social Media Data from Deloitte, and Interstage XWand – XBRL Processor from Fujitsu represent offerings from large, established software companies, while Blocks such as Skybot Enterprise Job Scheduler and Undraleu Code Review Tool from Coeurdata are solutions contributed by earlier-stage companies that have experienced significant success and growth. It has been a pleasure helping these companies grow and reach new customers through the Marketplace.

One of the most exciting things about reaching the 1K Block milestone is not just the number of companies on the Marketplace, but the number of solutions that have been contributed by our developer community. Blocks such as Autotype Excel Macro, Execute Workflow, and iExportNormalizer are all solutions that Informatica developers built because they help them in their daily activities, and through the Marketplace they have found a way to share these valuable assets with the community. In fact, over half of our solutions are free to use, which is a ringing endorsement of the power of the community and a great way to try any number of useful solutions at no risk. By leveraging enabling technologies such as Informatica’s Cloud Platform as a Service, developers can create and share solutions more quickly and easily than ever before.

Overall, it has been an exciting ride as the Marketplace has rocketed to 1,000 Blocks in under three years, and I look forward to what the next three years have in store!


How to Improve Your Application Performance with the Hardware You Already Have

Most application owners know that as data volumes accumulate, application performance can take a major hit if the underlying infrastructure is not aligned to keep up with demand. The problem is that constantly adding hardware to manage data growth can get costly – stealing budgets away from needed innovation and modernization initiatives.

Join Julie Lockner as she reviews the Cox Communications case study on how they solved an application performance problem caused by too much data, using the hardware they already had, with Informatica Data Archive and Smart Partitioning. Source: TechValidate. TVID: 3A9-97F-577
