Tag Archives: Master Data Management
According to Accenture's 2013 Global Consumer Pulse Survey, "85 percent of customers are frustrated by dealing with a company that does not make it easy to do business with them, 84 percent by companies promising one thing but delivering another, and 58 percent are frustrated with inconsistent experiences from channel to channel."
Consumers expect more from the companies they do business with. In response, many companies are shifting from managing their business based on an application-, account- or product-centric approach to a customer-centric approach. And this is one of the main drivers for master data management (MDM) adoption. According to a VP of Data Strategy & Services at one of the largest insurance companies in the world, “Customer data is the lifeblood of a company that is serious about customer-centricity.” So, better managing customer data, which is what MDM enables you to do, is a key to the success of any customer-centricity initiative. MDM provides a significant competitive differentiation opportunity for any organization that’s serious about improving customer experience. It enables customer-facing teams to assess the value of any customer, at the individual, household or organization level.
Amongst the myriad business drivers of a customer-centricity initiative, key benefits include delivering an enhanced customer experience – leading to higher customer loyalty and greater share of wallet, more effective cross-sell and upsell targeting to increase revenue, and improved regulatory compliance.
To truly achieve all the benefits expected from a customer-first, customer-centric strategy, we need to look beyond the traditional approaches of data quality and MDM implementations, which often consider only one foundational (yet important) aspect of the technology solution. The primary focus has always been to consolidate and reconcile internal sources of customer data, in the hope that bringing this information under the single umbrella of a database and a service layer will provide the desired single view of the customer. But in reality, this data integration mindset misses the goal of creating quality customer data that is free from duplication and enriched to deliver significant value to the business.
Today's MDM implementations need to take their focus beyond mere data integration to be successful. In the following sections, I will explain three levels of customer views that can be built incrementally to make the most of your MDM solution. When implemented fully, these customer views act as key ingredients for improving the execution of your customer-centric business functions.
Trusted Customer View
The first phase of the solution should cover creation of trusted customer view. This view empowers your organization with an ability to see complete, accurate and consistent customer information.
In this stage, you take the best information from all your applications and compile it into a single golden profile. You not only use data integration technology for this, but also employ data quality tools to ensure the correctness and completeness of the customer data. Advanced matching, merging and a trust framework are used to derive the most up-to-date information about your customer. You also guarantee that the golden record you create is accessible to the business applications and systems of choice, so everyone with the authority can leverage the single version of the truth.
At the end of this stage, you will be able to say with confidence that John D., who lives at 123 Main St, and Johnny Doe at 123 Main Street, who both do business with you, are not really two different individuals.
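For illustration only, here is a minimal Python sketch of the kind of match rule such a framework might apply. The abbreviation table, threshold and field names are all hypothetical; a production MDM hub weighs many more attributes, along with the trust it assigns to each source system.

```python
from difflib import SequenceMatcher

# Hypothetical abbreviation table; real address standardization is far richer.
ABBREVIATIONS = {"st": "street", "rd": "road", "ave": "avenue"}

def normalize(text):
    """Lowercase, strip punctuation, and expand common abbreviations."""
    tokens = text.lower().replace(".", "").replace(",", "").split()
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

def similarity(a, b):
    """Fuzzy similarity score between 0.0 (no match) and 1.0 (identical)."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

def likely_same_customer(rec_a, rec_b, threshold=0.7):
    """Flag two records as a probable match if name and address both score high."""
    return (similarity(rec_a["name"], rec_b["name"]) >= threshold
            and similarity(rec_a["address"], rec_b["address"]) >= threshold)

a = {"name": "John D.", "address": "123 Main St"}
b = {"name": "Johnny Doe", "address": "123 Main Street"}
print(likely_same_customer(a, b))  # True: one customer, two records
```

The point is not the specific score but the approach: normalize first, then compare fuzzily, so superficial differences like "St" versus "Street" no longer hide a match.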
Customer Relationships View
The next level of visibility is about providing a view into the customer’s relationships. It takes advantage of the single customer view and layers in all valuable family and business relationships as well as account and product information. Revealing these relationships is where the real value of multidomain MDM technology comes into action.
At the end of this phase, you not only see John Doe's golden profile, but also the products he holds. He might have a personal checking account from the Retail Bank, a mortgage from the Mortgage line of business, and brokerage and trust accounts with the Wealth Management division. You can see that John has his own consulting firm, and that he has a corporate credit card and checking account with the Commercial division under the name John Doe Consulting Company.
At the end of this phase, you will have a consolidated view of all important relationship information that will help you evaluate the true value of each customer to your organization.
Customer Interactions and Transactions View
The third level of visibility is in the form of your customer’s interactions and transactions with your organization.
During this phase, you tie together transactional information, historical data and the social interactions your customer has with your organization to further enhance the system. Building this view opens up a whole new world of opportunities because you can see everything related to your customer in one central place. Once you have this comprehensive view, when John Doe calls your call center, you know how valuable he is to your business, which product he just bought from you (transactional data), and what problem he is facing (social interactions).
A widely accepted rule of thumb holds that 80 percent of your company's future revenue will come from 20 percent of your existing customers. Many organizations are trying to ensure they are doing everything they can to retain existing customers and grow wallet share. Starting with the Trusted Customer View is the first step toward making your existing customers stay. Once you have established all three views discussed here, you can arm your customer-facing teams with a comprehensive view of customers so they can:
- Deliver the best customer experiences possible at every touch point,
- Improve customer segmentation for tailored offers, boost marketing and sales productivity,
- Increase cross-sell and up-sell success, and
- Streamline regulatory reporting.
Achieving the three views discussed here requires a solid data management platform. You not only need an industry-leading multidomain MDM technology, but also tools that help you integrate data, control quality and connect all the dots. These technologies should work together seamlessly to make your implementation easier and help you gain rapid benefits. Therefore, choose your data management platform carefully. To learn more about MDM vendors, read the recently released Gartner Magic Quadrant for MDM of Customer Data Solutions.
Citrix: You may not realize you know them, but chances are pretty good that you do. And chances are also good that we marketers can learn something about achieving fortune teller-like marketing from them!
Citrix is the company that brought you GoToMeeting and a whole host of other mobile workspace solutions that provide virtualization, networking and cloud services. Their goal is to give their 100 million users in 260,000 organizations across the globe “new ways to work better with seamless and secure access to the apps, files and services they need on any device, wherever they go.”
Citrix is a company that has been imagining and innovating for over 25 years, and over that time has seen a complete transformation in their market: virtual solutions and cloud services didn't even exist when the company was founded, and now they are the backbone of their business. Their corporate video proudly states that the only constant in this world is change, and that they strive to embrace the "yet to be discovered."
Having worked with them quite a bit over the past few years, we have seen first-hand how Citrix has demonstrated their ability to embrace change.
Back in 2011, it became clear to Citrix that they had a data problem, and that they would have to make some changes to stay ahead in this hyper competitive market. Sales & Marketing had identified data as their #1 concern – their data was incomplete, inaccurate, and duplicated in their CRM system. And with so many different applications in the organization, it was quite difficult to know which application or data source had the most accurate and up-to-date information. They realized they needed a single source of the truth – one system of reference where all of their global data management practices could be centralized and consistent.
The marketing team realized that they needed to take control of the solution to their data concerns, as their success truly depended upon it. They brought together their IT department and their systems integration partner, Cognizant, to determine a course of action. Together they forged an overall data governance strategy that would empower the marketing team to manage data centrally and to be responsible for their own success.
As a key element of that data governance and management strategy, they determined that they needed a Master Data Management (MDM) solution to serve as their Single Trusted Source of Customer & Prospect Data. They did a great deal of research into industry best practices and technology solutions, and decided to select Informatica as their MDM partner. As you can see, Citrix's environment is not unlike that of most marketing organizations. The difference is that they are now able to capture and distribute better customer and prospect data to and from these systems to achieve even better results. They are leveraging internal data sources and systems like CRM (Salesforce) and marketing automation (Marketo). Their systems live all over the enterprise, both on premises and in the cloud. And they leverage analytical tools to analyze and dashboard their results.
Citrix strategized and implemented their Single Trusted Source of Customer & Prospect solution in a phased approach throughout 2013 and 2014, and we believe that what they've been able to accomplish during that short period of time has been nothing short of phenomenal. Here are the highlights:
- Used Informatica MDM to provide clean, consistent and connected channel partner, customer and prospect data and the relationships between them for use in operational applications (SFDC, BI Reporting and Predictive Analytics)
- Recognized 20% increase in lead-to-opportunity conversion rates
- Realized 20% increase in marketing team’s operational efficiency
- Achieved 50% increase in quality of data at the point of entry, and a 50% reduction in the rate of junk and duplicate data for prospects, existing accounts and contacts
- Delivered a better channel partner and customer experience by renewing all of a customer's user licenses across product lines at one time and making it easy to identify whitespace opportunities to up-sell more user licenses
That is huge! Can you imagine the impact on your own marketing organization of a 20% increase in lead-to-opportunity conversion? Can you imagine the impact of spending 20% less time questioning and manually massaging data to get the information you need? That’s game changing!
Because Citrix now has great data and great resulting insight, they have been able to take the next step and embark on new fortune teller-like marketing strategies. As Citrix’s Dagmar Garcia discussed during a recent webinar, “We monitor implicit and explicit behavior of transactional leads and accounts, and then we leverage these insights and previous behaviors to offer net new offers and campaigns to our customers and prospects… And it’s all based on the quality of data we have within our database.”
I encourage you to take a few minutes to listen to Dagmar discuss Citrix's project on a recent webinar. In the webinar, she dives deeper into their project, the project scope and timeline, and what she means by "fortune telling abilities". Also, take a look at the customer story section of the Informatica.com website for the PDF case study. And, if you're in the mood to learn more, you can download a complimentary copy of the 2014 Gartner Magic Quadrant for MDM of Customer Data Solutions.
Hats off to you, Citrix, and we look forward to working with you to continue to change the game even more in the coming months and years!
“Raw materials costs are the company’s single largest expense category,” said Steve Jenkins, Global IT Director at Valspar, at MDM Day in London. “Data management technology can help us improve business process efficiency, manage sourcing risk and reduce RFQ cycle times.”
Valspar is a $4 billion global manufacturing company that produces a portfolio of leading paint and coating brands. At the end of 2013, the 200-year-old company celebrated record sales and earnings, and also completed two acquisitions. Valspar now has 10,000 employees operating in 25 countries.
As is the case for many global companies, growth creates complexity. “Valspar has multiple business units with varying purchasing practices. We source raw materials from 1,000s of vendors around the globe,” shared Steve.
“We want to achieve economies of scale in purchasing to control spending,” Steve said as he shared Valspar’s improvement objectives. “We want to build stronger relationships with our preferred vendors. Also, we want to develop internal process efficiencies to realize additional savings.”
Poorly managed vendor and raw materials data was impacting Valspar’s buying power
The Valspar team, who sharply focuses on productivity, had an “Aha” moment. “We realized our buying power was limited by the age and quality of available vendor data and raw materials data,” revealed Steve.
The core vendor data and raw materials data that should have been the same across multiple systems wasn’t. Data was often missing or wrong. This made it difficult to calculate the total spend on raw materials. It was also hard to calculate the total cost of expedited freight of raw materials. So, employees used a manual, time-consuming and error-prone process to consolidate vendor data and raw materials data for reporting.
These data issues were getting in the way of achieving their improvement objectives. Valspar needed a data management solution.
Valspar needed a single trusted source of vendor and raw materials data
The team chose Informatica MDM, a master data management (MDM) technology, as their enterprise hub for vendor and raw materials data. It will manage this data centrally on an ongoing basis, giving Valspar a single trusted source of vendor and raw materials data.
Informatica PowerCenter will access data from multiple source systems. Informatica Data Quality will profile the data before it goes into the hub. Then, after Informatica MDM does its magic, PowerCenter will deliver clean, consistent, connected and enriched data to target systems.
Better vendor and raw materials data management results in cost savings
Valspar expects to gain the following business benefits:
- Streamline the RFQ process to accelerate raw materials cost savings
- Reduce the total number of raw materials SKUs and vendors
- Increase productivity of staff focused on pulling and maintaining data
- Leverage consistent global data visibility to:
  - increase leverage during contract negotiations
  - improve acquisition due diligence reviews
  - facilitate process standardization and reporting
Valspar's vision is to transform data and information into trusted organizational assets
“Mastering vendor and raw materials data is Phase 1 of our vision to transform data and information into trusted organizational assets,” shared Steve. In Phase 2 the Valspar team will master customer data so they have immediate access to the total purchases of key global customers. In Phase 3, Valspar’s team will turn their attention to product or finished goods data.
Steve ended his presentation with some advice. “First, include your business counterparts in the process as early as possible. They need to own and drive the business case as well as the approval process. Also, master only the vendor and raw materials attributes required to realize the business benefit.”
Want more? Download the Total Supplier Information Management eBook. It covers:
- Why your fragmented supplier data is holding you back
- The cost of supplier data chaos
- The warning signs you need to be looking for
- How you can achieve Total Supplier Information Management
In my last blog, I promised I would report back on my experience using Informatica Data Quality, a software tool that helps automate the hectic, tedious data plumbing task that routinely consumes more than 80% of an analyst's time. Today, I am happy to share what I've learned over the past couple of months.
But first, let me confess something. The reason it took me so long to get here was that I dreaded trying the software. Never a savvy computer programmer, I was convinced that I would not be technical enough to master the tool and that it would turn into a lengthy learning experience. This mental barrier dragged me down for a couple of months until I finally bit the bullet and got my hands on the software. I am happy to report that my fear was truly unnecessary. It took me half a day to get a good handle on most features in the Analyst Tool, a component of Data Quality designed for analysts and business users. I then spent three days figuring out how to maneuver the Developer Tool, another key piece of the Data Quality offering used mostly by, you guessed it, developers and technical users. I have to admit that I am no master of the Developer Tool after three days of wrestling with it, but I got the basics. More importantly, my hands-on interaction with the software helped me understand the logic behind the overall design, and see for myself how analysts and business users can easily collaborate with their IT counterparts within our Data Quality environment.
To break it all down, first comes profiling. As analysts, we understand all too well the importance of profiling, as it provides an anatomy of the raw data we collect. In many cases, it is a must-have first step in data preparation (especially when our raw data comes from different places and can carry different formats). A heavy user of Excel, I used to rely on all the tricks available in the spreadsheet to gain visibility into my data. I would filter, sort, build pivot tables and make charts to learn what was in my raw data. Depending on how many columns were in my data set, it could take hours, sometimes days, just to figure out whether the data I received was any good at all, and how good it was.
Switching to the Analyst Tool in Data Quality, learning my raw data becomes a task of a few clicks (six at most, if I am picky about how I want it done). Basically, I load my data, click on a couple of options, and let the software do the rest. A few seconds later I am able to visualize the statistics of the data fields I choose to examine, and I can also measure the quality of the raw data using the Scorecard feature in the software. No more fiddling with spreadsheets and staring at busy rows and columns. Take a look at the above screenshots and let me know your preference.
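To make that concrete, here is a rough Python sketch (not Informatica's implementation, just an illustration with made-up sample data) of the kind of column statistics a profiling step surfaces in seconds:

```python
from collections import Counter

def profile(rows, column):
    """Minimal column profile: the kind of statistics a profiling tool surfaces."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v not in (None, "")]
    counts = Counter(non_null)
    return {
        "rows": len(values),                 # total records seen
        "nulls": len(values) - len(non_null),  # missing or blank entries
        "distinct": len(counts),             # unique non-null values
        "top_value": counts.most_common(1)[0][0] if counts else None,
    }

rows = [
    {"state": "CA"}, {"state": "CA"}, {"state": "California"},
    {"state": ""}, {"state": "NY"},
]
print(profile(rows, "state"))
# {'rows': 5, 'nulls': 1, 'distinct': 3, 'top_value': 'CA'}
```

Even this toy version exposes the issues profiling is meant to catch: a blank entry, and "CA" and "California" recorded as two distinct values for the same state.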
Once I decide that my raw data is adequate to use after profiling, I still need to clean up the nonsense in it before performing any analysis work; otherwise bad things can happen: garbage in, garbage out. Again, to clean and standardize my data, Excel came to the rescue in the past. I would play with different functions and learn new ones, write macros, or simply do it by hand. It was tedious, but it worked as long as I was working on a static data set. The problem, however, was that when I needed to incorporate new data sources in a different format, many of the previously built formulas would break and become inapplicable, and I would have to start all over again. Spreadsheet tricks simply don't scale in those situations.
With the Data Quality Analyst Tool, I can use the Rule Builder to create a set of logical rules in a hierarchical manner based on my objectives, and test those rules to see immediate results. The nice thing is that those rules are not tied to data format, location or size, so I can reuse them when new data comes in. Profiling can be done at any time, so I can re-examine my data after applying the rules, as many times as I like. Once I am satisfied with the rules, they are passed on to my peers in IT, who create executable rules based on the logic I define and run them automatically in production. No more worrying about differences in format, volume or other discrepancies in the data sets; all that complexity is taken care of by the software, and all I need to do is build meaningful rules that transform the data into the appropriate condition so I have good-quality data to work with for my analysis. Best part? I can do all of the above without hassling IT. Feeling empowered is awesome!
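As a loose analogy, in Python rather than the Rule Builder itself and with hypothetical rule names, reusable cleansing rules composed into a pipeline look something like this:

```python
import re

# Hypothetical cleansing rules in the spirit of a rule builder: each rule is a
# small function, so the same logic can be reused as new data sources arrive.
def trim(value):
    return value.strip()

def collapse_spaces(value):
    return re.sub(r"\s+", " ", value)

def title_case(value):
    return value.title()

def clean_phone(value):
    """Keep only digits; reject anything that is not a 10-digit number."""
    digits = re.sub(r"\D", "", value)
    return digits if len(digits) == 10 else None

def apply_rules(value, rules):
    """Run a value through an ordered pipeline of cleansing rules."""
    for rule in rules:
        value = rule(value)
    return value

name = apply_rules("  jOHN   doe ", [trim, collapse_spaces, title_case])
print(name)                            # John Doe
print(clean_phone("(555) 123-4567"))   # 5551234567
```

Because the rules know nothing about where a value came from, the same pipeline applies unchanged whether the next batch arrives as a CSV export or a CRM feed, which is exactly the reusability the paragraph above describes.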
Using the right tool for the right job improves our results, saves us time, and makes our jobs much more enjoyable. For me, there will be no more Excel for data cleansing after trying our Data Quality software, because now I can get more done in less time, and I am no longer stressed out by the lengthy process.
I encourage my analyst friends to try Informatica Data Quality, or at least the Analyst Tool within it. If you are like me, wary of a steep learning curve, then fear no more. Besides, if Data Quality can cut your data cleansing time in half (mind you, our customers have reported higher numbers), imagine how many more predictive models you could build, how much more you would learn, and how much faster you could build your reports in Tableau, with more confidence.
Get connected. Be connected. Make connections. Find connections. The Internet of Things (IoT) is all about connecting people, processes, data and, as the name suggests, things. The recent social media frenzy surrounding the ALS Ice Bucket Challenge has certainly reminded everyone of the power of social media, the Internet and a willingness to answer a challenge. Fueled by personal and professional connections, the craze has transformed fund raising for at least one charity. Similarly, IoT may potentially be transformational to the business of the public sector, should government step up to the challenge.
Government is struggling with the concept and reality of how IoT really relates to the business of government, and perhaps rightfully so. For commercial enterprises, IoT is far more tangible and simply more fun. Gaming, televisions, watches, Google glasses, smartphones and tablets are all about delivering over-the-top, new and exciting consumer experiences. Industry is delivering transformational innovations, which are connecting people to places, data and other people at a record pace.
It’s time to accept the challenge. Government agencies need to keep pace with their commercial counterparts and harness the power of the Internet of Things. The end game is not to deliver new, faster, smaller, cooler electronics; the end game is to create solutions that let devices connecting to the Internet interact and share data, regardless of their location, manufacturer or format and make or find connections that may have been previously undetectable. For some, this concept is as foreign or scary as pouring ice water over their heads. For others, the new opportunity to transform policy, service delivery, leadership, legislation and regulation is fueling a transformation in government. And it starts with one connection.
One way to start could be linking previously siloed systems together or creating a golden record of all citizen interactions through a Master Data Management (MDM) initiative. It could start with a big data and analytics project to determine and mitigate risk factors in education or linking sensor data across multiple networks to increase intelligence about potential hacking or breaches. Agencies could stop waste, fraud and abuse before it happens by linking critical payment, procurement and geospatial data together in real time.
This is the Internet of Things for government. This is the challenge. This is transformation.
This blog post feels a little bit like bragging… and OK, I guess it is pretty self-congratulatory to announce that this year, Informatica was again chosen as a leader in MDM and PIM by The Information Difference. As you may know, The Information Difference is an independent research firm that specializes in the MDM industry and each year surveys, analyzes and ranks MDM and PIM providers and customers around the world. This year, like last year, The Information Difference named Informatica tops in the space.
Why do I feel especially chuffed about this? Because of our customers.
“Inaccurate, inconsistent and disconnected supplier information prohibits us from doing accurate supplier spend analysis, leveraging discounts, comparing and choosing the best prices, and enforcing corporate standards.”
This is a quotation from a manufacturing company executive. It illustrates the negative impact that poorly managed supplier information can have on a company's ability to cut costs and achieve revenue targets.
Many supply chain and procurement teams at large companies struggle to see the total relationship they have with suppliers across product lines, business units and regions. Why? Supplier information is scattered across dozens or hundreds of Enterprise Resource Planning (ERP) and Accounts Payable (AP) applications. Too much valuable time is spent manually reconciling inaccurate, inconsistent and disconnected supplier information in an effort to see the big picture. All this manual effort results in back office administrative costs that are higher than they should be.
Do these quotations from supply chain leaders and their teams sound familiar?
“We have 500,000 suppliers. 15-20% of our supplier records are duplicates. 5% are inaccurate.”
“I get 100 e-mails a day questioning which supplier to use.”
“To consolidate vendor reporting for a single supplier between divisions is really just a guess.”
“Every year 1099 tax mailings get returned to us because of invalid addresses, and we pay a lot of Schedule B fines to the IRS.”
“Two years ago we spent a significant amount of time and money cleansing supplier data. Now we are back where we started.”
Please join me and Naveen Sharma, Director of the Master Data Management (MDM) Practice at Cognizant, for a Webinar, Supercharge Your Supply Chain Applications with Better Supplier Information, on Tuesday, July 29th at 11 am PT.
During the Webinar, we’ll explain how better managing supplier information can help you achieve the following goals:
- Accelerate supplier onboarding
- Mitigate the risk of supply disruption
- Better manage supplier performance
- Streamline billing and payment processes
- Improve supplier relationship management and collaboration
- Make it easier to evaluate non-compliance with Service Level Agreements (SLAs)
- Decrease costs by negotiating favorable payment terms and SLAs
I hope you can join us for this upcoming Webinar!