Category Archives: Enterprise Data Management

Non-Clonetroversial Oracle Data Cloning

Cloning

When the average person hears of cloning, my bet is that they think of the controversy and ethical issues surrounding cloning, such as the cloning of Dolly the sheep, or the possible cloning of humans by a mad geneticist in a rogue nation state. I would also put money down that when an Informatica blog reader thinks of cloning they think of “The Matrix” or “Star Wars” (that dreadful Episode II: Attack of the Clones).   I did.  Unfortunately.

But my pragmatic expectation is that when Informatica customers think of cloning, they also think of Data Cloning software.  Data Cloning software clones terabytes of database data into a host of other databases, data warehouses, analytical appliances, and Big Data stores such as Hadoop.  And just for hoots and hollers, you should know that almost half of all Data Integration efforts involve replication, be it snapshot or real-time, according to TDWI survey data. Survey also says… replication is the second most popular — or second most used — data integration tool, behind ETL.

Cloning should be easy and very natural. It’s an important part of life (at your job). However, we can all admit that it is also a process that causes many a headache and ruins many a relationship.

Do your company’s cloning tools work with non-standard types? Know that Informatica cloning tools can reproduce Oracle data to just about anything on 2 tuples (or more).  We do non-discriminatory duplication, so it’s no wonder we especially fancy cloning the Oracle!  (a thousand apologies for the bad “Matrix” pun)

Just remember that data clones are an important and natural component of business continuity, and the use cases span both operational and analytic applications.  So if you’re not cloning your Oracle data safely and securely with the quality results that you need and deserve, it’s high time that you get some better tools.

Send in the Clones

With that in mind, if you haven’t tried to clone before, for a limited time, Informatica is making its Fast Clone database cloning software available as a free trial download. Click here to get it now.


Architects: 8 Great Reasons To Be At Informatica World 2014

For years, corporations have tried to solve business problems by “throwing new enterprise applications at them.” This strategy has created ‘silos of data’ that are attached to these applications and databases. As a result, it is now increasingly difficult to find, access and use the correct data for new projects.

Data fragmentation leads to slow project delivery and “shadow IT.” Without a change in enterprise data strategy, it is unlikely this problem will improve. On the contrary, the growth of cloud applications and platforms, mobile applications, NoSQL, and the “Internet of Things” creates increasing urgency.  Unless a new approach to enterprise data management is taken, the house of cards is going to crumble.

I seriously doubt that this is a surprise to most of you. The question is, “What should we do about it?” The Informatica World 2014 event is a perfect place to find answers. Here are eight benefits architects will enjoy at Informatica World 2014:

  1. A dedicated track of breakout sessions for architects. This track explores reference architectures, design patterns, best practices and real-world examples for building and sustaining your next-generation information architecture. The track will begin with a keynote on the broader issues of enterprise data architecture. It will also include a panel of architects from leading companies.
  2. Inspiration to participate in defining the enterprise data architecture of the future. (I can’t spoil it by divulging the details here, but I promise that it will be interesting and worth your while!)
  3. New insights on how to manage information for the data-centric enterprise. These will expand your thinking about data architecture and platforms.
  4. Chances to network with architect peers.
  5. A Hands-on Lab, where you can talk to Informatica experts about their products and solutions.
  6. Updates on what Informatica is doing to bring business subject matter experts in as full partners in the co-development of data-related projects.
  7. A chance to win a one-hour one-on-one session with an Informatica architect at Informatica World.
  8. A chance to learn to control your biggest enterprise system: The collection of all data-moving resources across your company.

We believe that architecture is the key to unleashing information potential. To compete in today’s global 24x7x365 economy, businesses require well-designed information architectures that can continuously evolve to support new heights of flexibility, efficiency, responsiveness, and trust. I hope you will join us in Las Vegas, May 12-15!


The Surprising Link Between Hurricanes and Strawberry Pop-Tarts: Brought to you by Clean, Consistent and Connected Data

What do you think Wal-Mart’s best-seller is right before a hurricane? If you guessed water like I did, you’d be wrong. According to this New York Times article, “What Wal-Mart Knows About Customers’ Habits,” the retailer sells 7X more strawberry Pop-Tarts in Florida right before a hurricane than at any other time. Armed with predictive analytics and a solid information management foundation, the team stocks up on strawberry Pop-Tarts to make sure they have enough supply to meet demand.

Andrew Donaher advises IT leaders to ask business leaders how much bad data is costing them.

I learned this fun fact from Andrew Donaher, Director of Information Management Strategy at Groundswell Group, a consulting firm based in western Canada that specializes in information management services. In this interview, Andy and I discuss how IT leaders can increase the value of data to drive business value, explain how some IT leaders are collaborating with business leaders to improve predictive analytics, and share advice about how to talk to business leaders, such as the CFO, about investing in an information management strategy.

Q. Andy, what can IT leaders do to increase the value of data to drive business value?

A. Simply put, each business leader in a company needs to focus on achieving their goals. The first step IT leaders should take is to engage with each business leader to understand their long and short-term goals and ask some key questions, such as:

  • What type of information is critical to achieving their goals?
  • Do they have the information they need to make the next decision or take the next best action?
  • Is all the data they need in house? If not, where is it?
  • What challenges are they facing when it comes to their data?
  • How much time are people spending trying to pull together the information they need?
  • How much time are people spending fixing bad data?
  • How much is this costing them?
  • What opportunities exist if they had all the information they need and could trust it?
You need a solid information management strategy to make the shift from looking into the rear-view mirror to realizing the potential business value of predictive analytics.

If you want to get the business value you’re expecting by shifting from rear-view mirror style reporting to predictive analytics, you need to use clean, consistent and connected data.

Q. How are IT leaders collaborating with business partners to improve predictive analytics?

A. Wal-Mart’s IT team collaborated with the business to improve the forecasting and demand planning process. Once they found out what was important, IT figured out how to gather, store and seamlessly integrate external data like historical weather and future weather forecasts into the process. This enabled the business to get more valuable insights, tailor product selections at particular stores, and generate more revenue.
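
To make that integration step concrete, here is a minimal sketch of the kind of blending described, assuming pandas and two hypothetical CSV extracts (the file names, column names and the hurricane_warning flag are invented for illustration, not from the article):

```python
# Minimal sketch: blend historical sales with weather data to flag demand spikes.
# File names, column names, and the "hurricane_warning" flag are hypothetical.
import pandas as pd

sales = pd.read_csv("sales_history.csv", parse_dates=["date"])      # store_id, date, sku, units_sold
weather = pd.read_csv("weather_history.csv", parse_dates=["date"])  # store_id, date, hurricane_warning

# Join the two sources on store and day -- the "seamless integration" step.
history = sales.merge(weather, on=["store_id", "date"])

# Compare average demand on hurricane-warning days vs. normal days, per SKU.
uplift = (history.groupby(["sku", "hurricane_warning"])["units_sold"]
                 .mean()
                 .unstack())
uplift["demand_multiplier"] = uplift[True] / uplift[False]

# SKUs whose demand multiplies ahead of a storm (Pop-Tarts territory).
print(uplift.sort_values("demand_multiplier", ascending=False).head())
```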

Q. Why is it difficult for IT leaders to convince business leaders to invest in an information management strategy?

A. In most cases, business leaders don’t see the value in an information management strategy or they haven’t seen value before. Unfortunately this often happens because IT isn’t able to connect the dots between the information management strategy and the outcomes that matter to the business.

Business leaders see value in having control over their business-critical information, being able to access it quickly and to allocate their resources to get any additional information they need. Relinquishing control takes a lot of trust. When IT leaders want to get buy-in from business leaders to invest in an information management strategy they need to be clear about how it will impact business priorities. Data integration, data quality and master data management (MDM) should be built into the budget for predictive or advanced analytics initiatives to ensure the data the business is relying on is clean, consistent and connected.

Q: You liked this quotation from an IT leader at a beer manufacturing company, “We don’t just make beer. We make beer and data. We need to manage our product supply chain and information supply chain equally efficiently.”

A. What I like about that quote is that the IT leader was able to connect the dots between the primary revenue generator for the company and the role data plays in improving organizational performance. That’s something that a lot of IT leaders struggle with. IT leaders should always be thinking about what’s the next thing they can do to increase business value with the data they have in house and other data that the company may not yet be tapping into.

Q. According to a recent survey by Gartner and the Financial Executives Research Foundation, 60% of Chief Financial Officers (CFOs) are investing in analytics and improved decision-making as their #1 IT priority. What’s your advice for IT Leaders who need to get buy-in from the CFO to invest in information management?

A. Read your company’s financial statements, especially the Management Discussion and Analysis section. You’ll learn about the company’s direction, what the stakeholders are looking for, and what the CFO needs to deliver. Offer to get your CFO the information s/he needs to make decisions and to deliver. When you talk to a CFO about investing in information management, focus on the two things that matter most:

  1. Risk mitigation: CFOs know that bad decisions based on bad information can negatively impact revenue, expenses and market value. If you have to caveat all your decisions because you can’t trust the information, or it isn’t current, then you have problems. CFOs need to trust their information. They need to feel confident they can use it to make important financial decisions and deliver accurate reports for compliance.
  2. Opportunity: Once you have mitigated the risk and can trust the data, you can take advantage of predictive analytics. Wal-Mart doesn’t just do forecasting and demand planning. They do “demand shaping.” They use accurate, consistent and connected data to plan events and promotions not just to drive inventory turns, but to optimize inventory and the supply chain process. Some companies in the energy market are using accurate, consistent and connected data for predictive asset maintenance. By preventing unplanned maintenance they are saving millions of dollars, protecting revenue streams, and gaining health and safety benefits.

To do either of these things you need a solid information management plan to manage clean, consistent and connected information.  It takes a commitment, but the payoffs can be very significant.

Q. What are the top three business requirements when building an information management and integration strategy?
A: In my experience, IT leaders should focus on:

  1. Business value: A solid information management and integration strategy that has a chance of getting funded must be focused on delivering business value. Otherwise, your strategy will lack clarity and won’t drive priorities. If you focus on business value, it will be much easier to gain organizational buy-in. Get that dollar figure before you start anything. Whether it is risk mitigation, time savings, revenue generation or cost savings, you need to calculate that value to the business and get their buy-in.
  2. Trust: When people know they can trust the information they are getting it liberates them to explore new ideas and not have to worry about issues in the data itself.
  3. Flexibility: Flexibility should be baked right into the strategy. Business drivers will evolve and change. You must be able to adapt to change. One of the most neglected, and I would argue most important, parts of a solid strategy is the ability to make continuous small improvements that may require more effort than a typical maintenance event, but don’t create long delays. This will be very much appreciated by the business. We work with our clients to ensure that this is addressed.

Would YOU Buy a Ford Pinto Just To Get the Fuzzy Dice?

Today, I am going to take a stab at rationalizing why one could even consider solving a problem with a solution that is well-known to be sub-par. Consider the Ford Pinto: Would you choose this car for your personal, land-based transportation simply because of the new fuzzy dice in the window? For my European readers, replace the Pinto with the infamous Trabant and you get my meaning.  The fact is, both of these vehicles made the list of the “worst cars ever built” due to their mediocre design, environmental hazards or plain poor safety records.

What is a Pinto-like buying decision in information technology procurement? (source: msn autos)

Rational people would never choose a vehicle this way. So I always ask myself, “How can IT organizations rationalize buying product X just because product Y is thrown in for free?” Consider the case in which an organization chooses its CRM or BPM system simply because the vendor throws in an MDM or Data Quality solution for free: Can this be done with a straight face?  You often hear vendors claim that “everything in our house is pre-integrated”, “plug & play” or “we have accelerators for this.” I would hope that IT procurement officers have come to understand that these phrases don’t close a deal in a cloud-based environment, and even less so in an on-premise construct, which can never achieve this Nirvana unless it is customized to client requirements.

Anyone can see the logic in getting “2 for the price of 1.” However, as IT procurement organizations seek to save a percentage of money on every deal, they can’t lose sight of this key fact:

Standing up software (configuring, customizing, maintaining) and operating it over several years requires CLOSE inspection and scrutiny.

Like a Ford Pinto, software cannot just be driven off the lot without a care, leaving you only to worry about changing the oil and filters at recommended intervals. Customization, operational risk and maintenance are a significant cost, as all my seasoned padawans will know. Had Pinto buyers understood the Total Cost of Ownership before they made their purchase, they would have opted for Toyotas instead. Here is the bottom line:

If less than 10% of the overall requirements are solved by the free component
AND (and this is a big AND)
If less than 12% of the overall financial value is provided by the free component
Then it makes ZERO sense to select a solution based on freebie add-ons.
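
Expressed as code, the rule of thumb looks something like this (a sketch of my own heuristic with the thresholds as stated above, not a standard formula):

```python
# A sketch of the "10% requirements / 12% value" rule of thumb from this post.
def freebie_justifies_choice(req_coverage: float, value_share: float) -> bool:
    """req_coverage: fraction of overall requirements the free add-on solves.
    value_share: fraction of overall financial value it provides."""
    # If BOTH fall below the thresholds, the freebie should not drive the deal.
    return not (req_coverage < 0.10 and value_share < 0.12)

# A free MDM add-on covering 5% of requirements and 8% of value:
print(freebie_justifies_choice(0.05, 0.08))  # False -- don't pick the suite for the freebie
```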

When an add-on component is of significantly lower quality than industry-leading solutions, it becomes even more illogical to rely on it simply because it’s “free.” If analysts have affirmed that the leading solutions have stronger capabilities, flexibility and scalability, what does an IT department truly “save” by choosing an inferior “free” add-on?

So just why DO procurement officers gravitate toward “free” add-ons, rather than high quality solutions? As a former procurement manager, I remember the motivations perfectly. Procurement teams are often measured by, and rewarded for, the savings they achieve. Because their motivation is near-term savings, long term quality issues are not the primary decision driver. And, if IT fails to successfully communicate the risks, cost drivers and potential failure rates to Procurement, the motivation to save up-front money will win every time.

Both sellers and buyers need to avoid these dances of self-deception, the “Pre-Integration Tango” and the “Freebie Cha-Cha”.  No matter how much you loved driving that Pinto or Trabant off the dealer lot, your opinion changed after you drove it for 50,000 miles.

I’ve been in procurement. I’ve built, sold and implemented “accelerators” and “blueprints.” In my opinion, 2-for-1 is usually a bad idea in software procurement. The best software is designed to make 1+1=3. I would love to hear from you if you agree with my above “10% requirements/12% value” rule-of-thumb.  If not, let me know what your decision logic would be.


If you Want Business to “Own” the Data, You Need to Build An Architecture For the Business

If you build an IT Architecture, it will be a constant uphill battle to get business users and executives engaged and taking ownership of data governance and data quality. In short, you will struggle to maximize the information potential in your enterprise. But if you develop an Enterprise Architecture that starts with a business and operational view, the dynamics change dramatically. To make this point, let’s take a look at a case study from Cisco. (more…)


Death of the Data Scientist: Silver Screen Fiction?

Maybe the word “death” is a bit strong, so let’s say “demise” instead.  Recently I read an article in the Harvard Business Review about how Big Data and Data Scientists will rule the world of the 21st-century corporation and how they have to operate for maximum value.  The thing I found rather disturbing was that it takes a PhD – probably a few of them – in a variety of math areas to give executives the necessary insight to make better decisions, ranging from what product to develop next to who to sell it to and where.

Who will walk the next long walk…. (source: Wikipedia)

Don’t get me wrong – this is mixed news for any enterprise software firm helping businesses locate, acquire, contextually link, understand and distribute high-quality data.  The existence of such a high-value role validates product development but it also limits adoption.  It is also great news that data has finally gathered the attention it deserves.  But I am starting to ask myself why it always takes individuals with a “one-in-a-million” skill set to add value.  What happened to the democratization of software?  Why is the design starting point for enterprise software not always similar to B2C applications, like an iPhone app, i.e., simpler is better?  Why is it always such a gradual “Cold War” evolution instead of a near-instant French Revolution?

Why do development environments for Big Data not accommodate limited or existing skills but always accommodate the most complex scenarios?  Well, the answer could be that the first customers will be very large, very complex organizations with super complex problems, which they were unable to solve so far.  If analytical apps have become a self-service proposition for business users, data integration should be as well.  So why does access to a lot of fast-moving and diverse data require scarce Pig or Cassandra developers to get the data into an analyzable shape and a PhD to query and interpret patterns?

I realize new technologies start with a foundation, and as they spread, supply will attempt to catch up to create an equilibrium.  However, this is a problem which has existed for decades in many industries, such as the oil & gas, telecommunications, public and retail sectors. Whenever I talk to architects and business leaders in these industries, they chuckle at “Big Data” and tell me “yes, we got that – and by the way, we have been dealing with this reality for a long time”.  By now I would have expected that the skill (cost) side of turning data into meaningful insight would have been driven down more significantly.

Informatica has made a tremendous push in this regard with its “Map Once, Deploy Anywhere” paradigm.  I cannot wait to see what’s next – and I just saw something recently that got me very excited.  Why, you ask? Because at some point I would like to have at least a business super-user pummel terabytes of transaction and interaction data into an environment (Hadoop cluster, in-memory DB…) and massage it so that a self-created dashboard gets him or her where (s)he needs to go.  This should include concepts like: “Where is the data I need for this insight?”, “What is missing and how do I get to that piece in the best way?”, “How do I want it to look to share it?”  All that should be required is a semi-experienced knowledge of Excel and PowerPoint to get your hands on advanced Big Data analytics.  Don’t you think?  Do you believe that this role will disappear as quickly as it has surfaced?


And now for the rest of the data…

In the first two issues I spent time looking at the need for states to pay attention to the digital health and safety of their citizens, followed by the oft-forgotten need to understand and protect non-production data. This is data that has often proliferated, yet is ignored or forgotten about.

In many ways, non-production data is simpler to protect. Development and test systems can usually work effectively with realistic but not real PII data and realistic but not real volumes of data. On the other hand, production systems need the real production data complete with the wealth of information that enables individuals to be identified – and therefore presents a huge risk. If and when that data is compromised, either deliberately or accidentally, the consequences can be enormous, both in the impact on individual citizens and in the cost of remediation to the state. Many will remember the massive South Carolina data breach of late 2012, when over the course of 2 days a 74 GB database was downloaded and stolen: around 3.8 million taxpayers and 1.9 million dependents had their social security information stolen, and 3.3 million bank account details were “lost.” The citizens’ pain didn’t end there, as the company South Carolina picked to help its citizens seems to have tried to exploit the situation.
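
As an illustration of “realistic but not real” test data, here is a minimal masking sketch in Python using the third-party faker library; the record layout and field names are hypothetical, and real masking tools do this declaratively and at scale:

```python
# A minimal sketch of masking production PII into realistic-but-fake test data.
import hashlib
from faker import Faker

fake = Faker()

def mask_row(row: dict) -> dict:
    """Replace identifying fields with consistent, realistic fakes."""
    # Seed the generator from a hash of the real value so the same person
    # always maps to the same fake -- preserving referential integrity
    # across tables without exposing the real identity.
    seed = int(hashlib.sha256(row["ssn"].encode()).hexdigest(), 16) % (2**32)
    Faker.seed(seed)
    return {**row,
            "name": fake.name(),
            "ssn": fake.ssn(),
            "account_no": fake.bban()}

prod_row = {"name": "Jane Doe", "ssn": "123-45-6789", "account_no": "9876543210"}
print(mask_row(prod_row))  # same input always yields the same fake identity
```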

Encryption protects against theft – unless the key is stolen too.

The biggest problem with securing production data is that there are numerous legitimate users and uses of that data, and most often just a small number of potentially malicious or accidental attempts at inappropriate or dangerous access. So the question is… how does a state agency protect its citizens’ sensitive data while at the same time ensuring that legitimate uses and users continue – without performance impacts or any disruption of access? Obviously each state needs to make its own determination as to what approach works best for them.

This video does a good job at explaining the scope of the overall data privacy/security problems and also reviews a number of successful approaches to protecting sensitive data in both production and non-production environments. What you’ll find is that database encryption is just the start, and is fine if the database is “stolen” (unless of course the key is stolen along with the data!). Encryption locks the data away in the same way that a safe protects physical assets – but the same problem exists: if the key is stolen with the safe, then all bets are off. Legitimate users are usually easily able to deliberately breach and steal the sensitive contents, and it’s these occasions we need to understand and protect against. Given that the majority of data breaches are “inside jobs,” we need to ensure that authorized users (end-users, DBAs, system administrators and so on) who have legitimate access get access only to the data they absolutely need – no more and no less.
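
That “no more and no less” principle can be pictured as masking at read time, based on who is asking. Below is a toy sketch of the idea – the roles, field names and redaction marker are all hypothetical, not any product’s behavior:

```python
# A sketch of least-privilege reads: sensitive fields are redacted unless
# the caller's role genuinely needs the raw values. All names are hypothetical.
SENSITIVE = {"ssn", "bank_account"}
NEEDS_RAW = {"benefits_officer"}  # e.g., the person processing this citizen's claim

def read_record(record: dict, role: str) -> dict:
    if role in NEEDS_RAW:
        return dict(record)
    # Everyone else -- including DBAs and sysadmins with legitimate system
    # access -- sees masked values by default.
    return {k: ("***" if k in SENSITIVE else v) for k, v in record.items()}

citizen = {"name": "J. Smith", "ssn": "123-45-6789",
           "bank_account": "000123", "status": "active"}
print(read_record(citizen, "dba"))               # masked
print(read_record(citizen, "benefits_officer"))  # raw
```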

So we have reached the end of the first series. In the first blog we looked at the need for states to place the same emphasis on the digital health and welfare of their citizens as they do on their physical and mental health. In the second we looked at the oft-forgotten area of non-production (development, testing, QA etc.) data. In this third and final piece we looked at the need for, and some options for providing, complete protection of production data.


Hospitality Execs: Invest in Great Customer Information to Support A Customer-Obsessed Culture

I love exploring new places. I’ve had exceptional experiences at the W in Hong Kong, El Dorado Royale in the Riviera Maya and Ventana Inn in Big Sur. I belong to almost every loyalty program under the sun, but not all hospitality companies are capitalizing on the potential of my customer information. Imagine if employees had access to it so they could personalize their interactions with me and send me marketing offers that appeal to my interests.

Do I have high expectations? Yes. But so do many travelers. This puts pressure on marketing and sales executives who want to compete to win. According to Deloitte’s report, “Hospitality 2015: Game changers or spectators?,” hospitality companies need to adapt to meet consumers’ increasing expectations to know their preferences and tastes and to customize packages that suit individual needs.

Jeff Klagenberg helps companies use data as a strategic asset and get the most value out of it.

In this interview, Jeff Klagenberg, senior principal at Myers-Holum, explains how one of the largest, most customer-focused companies in the hospitality industry is investing in better customer, product, and asset information. Why? To personalize customer interactions, bundle appealing promotion packages and personalize marketing offers across channels.

Q: What are the company’s goals?
A: The executive team at one of the world’s leading providers of family travel and leisure experiences is focused on achieving excellence in quality and guest services. They generate revenues from the sales of room nights at hotels, food and beverages, merchandise, admissions and vacation club properties. The executive team believes their future success depends on stronger execution based on better measurement and a better understanding of customers.

Q: What role does customer, product and asset information play in achieving these goals?
A: Without the highest quality business-critical data, how can employees continually improve customer interactions? How can they bundle appealing promotional packages or personalize marketing offers? How can they accurately measure the impact of sales and marketing efforts? The team recognized the powerful role of high quality information in their pursuit of excellence.

Q: What are they doing to improve the quality of this business-critical information?
A: To get the most value out of their data and deliver the highest quality information to business and analytical applications, they knew they needed to invest in an integrated information management infrastructure to support their data governance process. Now they use the Informatica Total Customer Relationship Solution, which combines data integration, data quality, and master data management (MDM). It pulls together fragmented customer information, product information, and asset information scattered across hundreds of applications in their global operations into one central, trusted location where it can be managed and shared with analytical and operational applications on an ongoing basis.
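
For readers curious what pulling fragmented records into one trusted location involves, here is a toy match-and-merge sketch. It is emphatically not Informatica’s implementation – the similarity test and survivorship rule below are simplistic stand-ins for what MDM products do at scale:

```python
# Toy illustration of the match-and-merge step behind customer consolidation.
from difflib import SequenceMatcher

def similar(a: str, b: str, threshold: float = 0.85) -> bool:
    """Crude fuzzy match on names; real MDM uses far richer matching rules."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

def merge(records: list[dict]) -> dict:
    """Survivorship rule: keep the most recently updated non-empty value per field."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):
        golden.update({k: v for k, v in rec.items() if v})
    return golden

crm = {"name": "Jon Smyth", "email": "jon@example.com", "phone": "", "updated": "2013-01-05"}
web = {"name": "John Smith", "email": "jon@example.com", "phone": "555-0100", "updated": "2014-02-01"}

# Match on fuzzy name OR exact email, then merge into one golden record.
if similar(crm["name"], web["name"]) or crm["email"] == web["email"]:
    print(merge([crm, web]))  # shared with downstream analytical and operational apps
```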

Many marketers overlook the importance of using high quality customer information in their investments in personalization.

Q: How will this impact marketing and sales?
A: With clean, consistent and connected customer information, product information, and asset information in the company’s applications, they are optimizing marketing, sales and customer service processes. They get limitless insights into who their customers are and their valuable relationships, including households, corporate hierarchies and influencer networks. They see which products and services customers have purchased in the past, their preferences and tastes. High quality information enables the marketing and sales team to personalize customer interactions across touch points, bundle appealing promotional packages, and personalize marketing offers across channels. They have a better understanding of which marketing, advertising and promotional programs work and which don’t.

Q: What role did the marketing and sales leaders play in this initiative?
A: The marketing leaders and sales leaders played a key role in getting this initiative off the ground. With an integrated information management infrastructure in place, they’ll benefit from better integration between business-critical master data about customers, products and assets and transaction data.

Q. How will this help them gain customer insights from “Big Data”?
A. We helped the business leaders understand that getting customer insights from “Big Data” such as weblogs, call logs, social and mobile data requires a strong backbone of integrated business-critical data. By investing in a data-centric approach, they future-proofed their business. They are ready to incorporate any type of data they will want to analyze, such as interaction data. A key realization was that there is no such thing as “Small Data.” The future is about getting every bit of understanding out of every data source.

Q: What advice do you have for hospitality industry executives?
A: Ask yourself, “Which of our strategic initiatives can be achieved with inaccurate, inconsistent and disconnected information?” Most executives know that the business-critical data in their applications, used by employees across the globe, is not the highest quality. But they are shocked to learn how much this is costing the company. My advice is talk to IT about the current state of your customer, product and asset information. Find out if it is holding you back from achieving your strategic initiatives.

Also, many business executives are excited about the prospect of analyzing “Big Data” to gain revenue-generating insights about customers. But the business-critical data about customers, products and assets is often in terrible shape. To use an analogy: look at a wheat field and imagine the bread it will yield. But don’t forget if you don’t separate the grain from the chaff you’ll be disappointed with the outcome. If you are working on a Big Data initiative, don’t forget to invest in the integrated information management infrastructure required to give you the clean, consistent and connected information you need to achieve great things.


Murphy’s First Law of Bad Data – If You Make A Small Change Without Involving Your Client – You Will Waste Heaps Of Money

I have not used my personal encounter with bad data management for over a year, but a couple of weeks ago I was compelled to revive it.  Why, you ask? Well, a complete stranger started to receive text messages meant for one of my friends – including mine – and it took days for him to detect it; a week later, nobody at this North American wireless operator had been able to fix it.  This coincided with a meeting I had with a European telco’s enterprise architecture team.  There was no better way to illustrate to them how a customer reacts, and the risk to their operations, when communication breaks down because just one tiny thing changes – say, an address (or, in the SMS case, some random SIM mapping – another type of address).

Imagine the cost of other bad data (thecodeproject.com)

In my case, I moved about 250 miles within the United States a couple of years ago, and this seemingly common experience triggered a plethora of communication screw-ups across every merchant a residential household engages with frequently, e.g. your bank, your insurer, your wireless carrier, your average retail clothing store, etc.

For more than two full years after my move to a new state, the following things continued to pop up on a monthly basis due to my incorrect customer data:

  • My old satellite TV provider got to me (correct person), but with a misspelled last name, at my correct, new address.
  • My bank put me in a bit of a pickle as they sent “important tax documentation,” which I did not want to open as my new tenants’ names (in the house I just vacated) were on the letter, but with my new home’s address.
  • My mortgage lender sends me a refinancing offer to my new address (right person & right address), but with both my wife’s name and mine completely butchered.
  • My wife’s airline, where she enjoys the highest level of frequent flyer status, continually mails her offers duplicating her last name as her first name.
  • A high-end furniture retailer sends two 100-page glossy catalogs probably costing $80 each to our address – one for me, one for her.
  • A national health insurer sends “sensitive health information” (disclosed on envelope) to my new residence’s address but for the prior owner.
  • My legacy operator turns on the wrong premium channels on half my set-top boxes.
  • The same operator sends me an SMS the next day thanking me for switching to electronic billing as part of my move, which I did not sign up for, followed by payment notices (as I did not get my invoice in the mail).  When I called this error out over the next three months via their contact center, indicating how much revenue I generate for them across all services, they countered with “sorry, we don’t have access to the wireless account data”, “you will see it change on the next bill cycle” and “you show as paper billing in our system today”.

Ignoring the potential for data privacy lawsuits, you start wondering how long you have to be a customer, and how much money you need to spend with a merchant (and they need to waste), for them to take changes to your data more seriously.  And these are not even merchants to whom I am brand new – these guys have known me and taken my money for years!

One thing I nearly forgot: these mailings all happened at least once a month on average, sometimes twice, over two years.  If I do some pigeon math here, I would estimate the postage and production cost alone to run into the hundreds of dollars.

However, the most egregious trespass belonged to my homeowner’s insurance carrier (HOI), who was also my mortgage broker.  They had a double whammy in store for me.  First, I received a cancellation notice from the HOI for my old residence indicating they had cancelled my policy because the last payment was not received, and that any claims would be denied as a consequence.  Then, my new residence’s HOI advised that they had added my old home’s HOI to my account.

After wondering what I could have possibly done to trigger this, I called all four parties (not three as the mortgage firm did not share data with the insurance broker side – surprise, surprise) to find out what had happened.

It turns out that I had to explain and prove to all of them how one party’s data change during my move erroneously exposed me to liability.  It felt like the old days, when seedy telco salespeople needed only your name and phone number, associated with some sort of promotion you never took part in (the back of a raffle card to win a new car), to switch your long-distance carrier and present you with a $400 bill the coming month.  Yes, that also happened to me…many years ago.  Here again, the consumer had to do all the legwork when someone (not an automatic process!) switched some entry without any oversight or review, triggering hours of wasted effort on their side and mine.

We can argue all day long about whether these screw-ups are due to bad processes or bad data, but in all reality, even processes are triggered by some sort of underlying event – something as mundane as a database field’s flag being updated when your last purchase puts you in a new marketing segment.

Now imagine you get married and your wife changes her name. With all these company-internal (CRM, billing, ERP), free public (property tax), commercial (credit bureaus, mailing lists) and social media data sources out there, you would think such everyday changes could get picked up more quickly and automatically.  If not automatically, then should there not be some sort of trigger to kick off a “governance” process – something along the lines of “email/call the customer if attribute X has changed” or “please log into your account and update your information – we heard you moved”?  If American Express was able to detect ten years ago that someone purchased $500 worth of product with your credit card at a gas station or some lingerie website known for fraudulent activity, why not your bank or insurer, who know even more about you? And yes, that happened to me as well.
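
For what it’s worth, the trigger I’m asking for is not hard to sketch. Here is a minimal, hypothetical version – the watched attributes, event shape and notify() helper are my own illustration, not any vendor’s API; a real system would hang this off database triggers or change data capture:

```python
# A sketch of "email/call the customer if attribute X has changed."
# Attribute names, event shape, and notify() are hypothetical.
WATCHED = {"last_name", "address", "sim_mapping"}

def notify(customer_id: str, message: str) -> None:
    # Stand-in for an email/SMS/call-center workflow.
    print(f"[to customer {customer_id}] {message}")

def on_update(customer_id: str, old: dict, new: dict) -> None:
    """Fire a governance step instead of silently propagating the edit."""
    changed = {k for k in WATCHED if old.get(k) != new.get(k)}
    if changed:
        notify(customer_id,
               f"We noticed a change to {', '.join(sorted(changed))}. "
               "Please log in and confirm your information.")

on_update("C-1001",
          {"last_name": "Meier", "address": "12 Old Rd"},
          {"last_name": "Meier", "address": "98 New Ave"})
```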

Tell me about one of your “data-driven” horror scenarios.
