Tag Archives: MDM
If your organization works with a large number of vendors, you know the challenge of successfully managing your suppliers’ information across the entire enterprise.
It starts with quickly onboarding the right suppliers at the right location to introduce a new product or initiate a project – a process that can take several weeks if it isn’t automated.
Once qualified and onboarded, you need to be able to assess your suppliers’ risk and compliance and analyze their performance. Do my suppliers deliver on time? Do they meet our quality standards? Do we benefit from negotiated corporate discounts and payment terms? A trusted Supplier360 view of your supplier relationships worldwide helps answer these questions.
In order to enable organizations to more effectively manage vendors and their performance, negotiate better prices and payment terms, and reduce operating costs and risk, Informatica just launched the Total Supplier Relationship Application (TSR), a master data-fueled solution that optimizes supplier information management with a Supplier360 view across the enterprise.
Sodexo is a multinational food services and facilities management corporation. Frank Griffith, Senior Director of Integrated Procurement Systems at Sodexo says: “The Informatica Total Supplier Relationship application will empower Sodexo to manage our supplier master data and vendor items more effectively and streamline our current supply chain processes. Based on one, single trusted view of our vendor and product data, we will be able to better track our vendors’ compliance, and manage vendor performance and certifications, while leveraging the GDSN (Global Data Synchronization Network) for nutritional product attribute data.”
Combining the industry-leading Informatica Master Data Management (MDM) hub with Informatica Data Integration and Informatica Data Quality, TSR is a master data-fueled application with a configurable business-friendly user interface. It can be enhanced with Informatica Product Information Management (PIM) and Data-as-a-Service for contact data verification and enrichment.
If you’ve spent some time studying and practicing data governance, you’ll agree that data governance is a challenging yet rewarding endeavor. Across industries, a growing number of organizations have put data governance programs in place so they can more effectively manage their data to drive business value. But the reality is that data governance is a complex process, and most companies practicing data governance today are still in the early phase of this very long journey. In fact, according to the results of over 240 completed data governance assessments on http://governyourdata.com/, a community website dedicated to everything data governance, the average score for data governance maturity is only 1.6 out of 5. It’s no surprise that data governance was a hot topic at last week’s Informatica World 2015. Over a dozen presentations and panel discussions on data governance were delivered; practitioners across various industries shared their real-world stories on topics ranging from how to kick-start a data governance program, how to build business cases for data governance, frameworks and stewardship management, to the choice of technologies. For me, the key takeaways are:
- Old but still true – To do data governance the right way, you must start small and focus on achieving tangible results. Leverage the small victories to advance to the next phase.
- Be prepared to fail more than once while building a data governance program. But don’t quit, because your data will not.
- One-size doesn’t fit all when it comes to building a data governance framework, which is a challenge for organizations, as there is no magic formula that companies can immediately adopt. Should you build a centralized or federated data governance operation? Well, that really depends on what works within your existing environment.
In fact, when asked “what’s the most challenging area for your data governance effort” in our recent survey conducted at Informatica World 2015, “Identify roles and responsibilities” got the most mentions. The basic principle: choose a framework that blends well with your company’s culture.
- Let’s face it, data governance is not an IT project, nor is it about fixing data problems. It is a business function that calls for people, process and technology working together to obtain the most value from your data. Our seasoned practitioners recommend a systematic approach. Your first priority should be gathering people: identifying the right people with the right skills and, most importantly, those who have a passion for data. Next is figuring out the process. Things to consider include: What are the requirements for data quality? What metrics and measurements should be used to examine the data? How should exceptions be handled and data issues remediated? How can security measures be quickly identified and applied to the various data sets? The third priority is selecting the right technologies to implement and facilitate those processes so the data can be transformed and used to help meet business goals.
- “Engage your business early on” is another important tip from our customers who have achieved early success with their data governance program. A data governance program will not be sustainable without participation from the business. The reason is simple – the business owns the data, they are the consumers of the data and have specific requirements for the data they want to use. IT needs to work collaboratively with business to meet those requirements so the data is fit for use, and provides good value for the business.
- Scalability, flexibility and interoperability should be the key considerations when it comes to selecting data governance technologies. Your technology platform should be able to easily adapt to the new requirements arising from the changes in your data environment. A Big Data project, for example, introduces new data types, increased data speed and volume. Your data management solution should be agile enough to address those new challenges with minimum disruption to your workflow.
Data governance is HOT! The well-attended sessions at Informatica World, as well as some of our previously hosted webinars, are a testament to the enthusiasm among our customers, partners, and our own employees on this topic. It’s an exciting time for us at Informatica because we are in a great position to help companies build an effective data governance program. In fact, many of our customers have been relying on our industry-leading data management tools to support their data governance programs, and have achieved results in many business areas such as meeting compliance requirements, improving customer centricity and enabling advanced analytics projects. To continue the dialogue and facilitate further learning, I’d like to invite you to an upcoming webinar on May 28 to hear some insightful, pragmatic tips and tricks for building a holistic data governance program from industry expert David Loshin, Principal at Knowledge Integrity, Inc., and Informatica’s own data governance guru Rob Karel.
“Better data is everyone’s job” – well said by Terri Mikol, director of Data Governance at University of Pittsburgh Medical Center. For companies striving to leverage data to deliver business value, everyone within the company should treat data as a strategic asset and take on responsibilities for delivering clean, connected and safe data. Only then can your organization be considered truly “Data Ready”.
In the age of engagement, customers expect a seamless, integrated and consistent customer experience. But most companies can’t meet those expectations today. This video explains what holds many companies back. Then, check out this eBook, Mastering the Chaos of Customer Experience. It explains the 7 steps to becoming a customer-ready enterprise. A customer-ready enterprise has a total customer relationship view: a trusted customer profile that’s accurate, up-to-date, complete and consistent across the company.
Find out how you can stop your customer data chaos and start using great customer data to fuel great customer experiences.
If you are unable to view the video, click here.
What a day! This should go down as one of the best days in this entire year.
At the sold out Informatica World 2015 All Things Data conference today, Sohaib Abbasi, CEO of Informatica outlined the move to the “age of engagement”. The main highlight for the MDMer in me was the announcement of Social360 for Internet of Master Data.
Informatica MDM now offers Social360 to view master data, relationships and associated interactions, including social media feeds and media mentions. By matching transactions with social relationships, Informatica MDM can identify the top influencers among the most valuable customers. Designed to run natively on Hadoop, Informatica Big Data Relationship Management identifies household and social relationships across billions of records representing millions of people.
Suresh Menon, VP of Product Management at Informatica, opened the MDM Day sessions with a one-liner: “we make great data ready to use”.
Companies are trying to master the digital transformation successfully. In order to accelerate this transformation, Suresh talked about 3 key aspects that will drive the new age:
New Age of Relationships: Relationships between parties, places, things, households and families, and their interactions in a socially rich world, will continue to grow at a fast pace.
New Age of Bimodal Governance: Information governance has become a democratic exercise that works both top-down and bottom-up. Bimodal is an ENABLER for business and IT, managing how we collaborate and how processes are executed to do the right things right.
New Age of Trusted Master Data Fueled Apps: We are entering a new era of master data-fueled applications which deliver clean, consistent data. For example: Product 360 for retail, Total Supplier Relationship for supply chain optimization, and Big Data Relationship Management to address use cases such as effective campaign management to increase your marketing ROI. These apps run on Informatica’s powerful multidomain platform to deliver a wide range of use case- and industry-specific solutions.
Following this was Devon Energy and Noah Consulting keynote where Devon shared how they could deliver fast, accurate Big Data to make smart and quick decisions. Devon uses Informatica MDM and Big Data solutions to provide authoritative and trusted data as it relates to wells, suppliers, and other key master and reference data. This allows them to build data pipelines in Hadoop that transform and prepare data for Big Data analytics.
I attended a number of sessions that were part of the MDM Day agenda. These included:
- Symantec & EMC using Informatica MDM and Informatica Data Quality to deliver trusted customer views.
- eBay, one of the world’s largest online marketplaces, led a financial transformation with data to maximize the value of finance and drive functional effectiveness.
- A large global company utilizing MDM to enable a successful global Workday implementation for HR across 60 countries, with 100+ HR applications and varying levels of data governance across 275 companies.
- GE Aviation shared their new approach to scaling MDM to the enterprise, architecture and concepts on data lakes, and how MDM plays into their large data volumes.
A great day overall, with a lot of discussion on topics that are near and dear to me. I also got a chance to sit (and have a drink) with some amazing folks who focus on MDM, data quality and data governance.
With that, I will sign off as the clock hits midnight. I will bring you more news from Informatica World, specifically from the Information Quality and Governance track sessions happening tomorrow. Stay tuned, and as always, keep an eye on @MDMGeek on Twitter.
If you are as long in the tooth as I am – you are familiar with Willy Wonka and the Chocolate Factory…one of the major plot points revolves around Charlie getting the “golden ticket” which allows him access to Willy Wonka’s factory…but there are only 5 golden tickets available.
Within the world of Master Data Management (MDM) – there is a concept of a “golden record.” This is (hopefully) not as elusive as the Wonka Golden Ticket – but equally as important. This golden record gives you access to the most pure, validated and complete picture of your individual records in your domain.
Let’s start with defining a lot of the terms from the previous paragraph:
- Golden record (according to WhatIs) is “a single, well-defined version of all the data entities in an organizational ecosystem. In this context, a golden record is sometimes called the ‘single version of the truth,’ where ‘truth’ is understood to mean the reference to which data users can turn when they want to ensure that they have the correct version of a piece of information. The golden record encompasses all the data in every system of record (SOR) within a particular organization.”
- Domain is an area of control or knowledge. In the context of MDM – it refers to the type of data you want to master. For the payer market – it is typical to start with a member or provider domain, but there can be many, many different types of domains.
One of the trickiest parts of implementing an MDM solution is creating the workflow around this golden record. You need to consider all of your data sources, which fields from which data sources tend to be more reliable, and what the criteria are for allowing a field from one system to populate an MDM field over another. In other words, if you have an enrollment system that captures the member’s name and a claims system that also captures the member’s name, which of these two systems tends to have the most correct member name? Is there another source system that is particularly reliable for capturing addresses, but whose member names tend to be off?
One of the main considerations in the creation and maintenance of the golden record is matching and merging records. If there are two records that are pretty similar, what is the process for inclusion in the golden record? For instance, consider the two records below:
| Last Name | First Name | Member # | Phone Number | Street Address | City, State |
|-----------|------------|----------|--------------|------------------|---------------|
| Wayland | Jennifer | 201215 | 7065842 | 123 Maine Street | Camden, Maine |
| Wayland | Jenn | 201211 | 2078675309 | 123 Main Street | Camden, Maine |
The last names are the same, as are the City, State fields. All of the other fields are different, so in the world of a golden record they don’t create an automatic match/merge. For the sake of this example, let’s say that we know that the first record comes from a source that has great reliability for names and addresses, while the second record comes from a source that is known for highly accurate member numbers and phone numbers. A good MDM solution should provide a toolset that allows you to automate the merge functionality as much as possible. For this example, you could set up the workflow to obtain the records from the two sources and set up the criteria for merging/matching (take the name from the first record, the member number and phone number from the second record, and the address from the first record). The following could be the golden record for this member:
| Last Name | First Name | Member # | Phone Number | Street Address | City, State |
|-----------|------------|----------|--------------|------------------|---------------|
| Wayland | Jennifer | 201211 | 2078675309 | 123 Maine Street | Camden, Maine |
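Survivorship rules like the ones just described can be sketched in a few lines of code. The following is a minimal illustration, not how any particular MDM product implements it; the field names and the per-field source ranking are assumptions made for this example.

```python
# Record from the source trusted for names and addresses
source_a = {
    "last_name": "Wayland", "first_name": "Jennifer", "member_num": "201215",
    "phone": "7065842", "street": "123 Maine Street", "city_state": "Camden, Maine",
}
# Record from the source trusted for member numbers and phone numbers
source_b = {
    "last_name": "Wayland", "first_name": "Jenn", "member_num": "201211",
    "phone": "2078675309", "street": "123 Main Street", "city_state": "Camden, Maine",
}

# Survivorship rules: which source "wins" for each field (hypothetical ranking)
field_winner = {
    "last_name": "a", "first_name": "a", "street": "a", "city_state": "a",
    "member_num": "b", "phone": "b",
}

def build_golden_record(rec_a, rec_b, winners):
    """Assemble the golden record by taking each field from its winning source."""
    sources = {"a": rec_a, "b": rec_b}
    return {field: sources[src][field] for field, src in winners.items()}

golden = build_golden_record(source_a, source_b, field_winner)
print(golden["first_name"], golden["member_num"], golden["phone"])
# → Jennifer 201211 2078675309
```

The resulting dictionary matches the golden record in the table above: name and address survive from the first source, member number and phone from the second.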
Where matching and merging get interesting is when the source fields are not clear “winners.” There will be situations where manual intervention is necessary to determine which record should take precedence. For this, a workflow manager toolkit is very helpful. It will assign records to a data steward, who can then make the judgment, based on their specific experience and knowledge of the data set, as to which field from which record should take precedence. It can also enforce approval mechanisms before a record is finally and truly merged, resulting in a modification to the existing golden record for a specific member.
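The decision between an automatic merge and steward review can be pictured as a scoring threshold. The sketch below is a toy model: the similarity rule, the field weights, and the threshold values are all illustrative assumptions, and real MDM engines use far more sophisticated fuzzy-matching algorithms.

```python
def field_score(a: str, b: str) -> float:
    """Crude similarity: exact match scores 1.0, a prefix match
    (e.g. 'Jenn' vs. 'Jennifer') scores 0.5, anything else 0."""
    if a == b:
        return 1.0
    if a and b and (a.startswith(b) or b.startswith(a)):
        return 0.5
    return 0.0

def match_score(rec1: dict, rec2: dict, weights: dict) -> float:
    """Weighted average of per-field similarity, normalized to [0, 1]."""
    total = sum(weights.values())
    return sum(w * field_score(rec1[f], rec2[f]) for f, w in weights.items()) / total

def route(score: float, auto_merge_at: float = 0.9, review_at: float = 0.6) -> str:
    """Clear winners merge automatically; borderline pairs go to a data steward."""
    if score >= auto_merge_at:
        return "auto-merge"
    if score >= review_at:
        return "steward-review"
    return "no-match"

# The two example records, compared on name and address fields only
# (weights are a made-up reliability guess, not a product default)
rec1 = {"last_name": "Wayland", "first_name": "Jennifer",
        "street": "123 Maine Street", "city_state": "Camden, Maine"}
rec2 = {"last_name": "Wayland", "first_name": "Jenn",
        "street": "123 Main Street", "city_state": "Camden, Maine"}
weights = {"last_name": 3, "first_name": 2, "street": 1, "city_state": 2}

score = match_score(rec1, rec2, weights)
print(round(score, 2), route(score))
# → 0.75 steward-review
```

With these toy weights the example pair lands between the two thresholds, so it is routed to a data steward rather than merged automatically, exactly the kind of manual intervention described above.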
Given the complexity of implementing a Master Data Management solution, it helps to start by picturing your golden record. Can you answer the following questions?
- What information needs to be captured in your golden record?
- Related to this: is there any information that is not necessarily specific to the domain but may be interesting when attaching relationships to a record (for example, attaching a provider to a specific member)?
- What are all the sources of data for the record?
- Are all the sources currently integrated? How easily can new records or updated records be shared?
- Which source is the best source for which fields?
- What is the threshold your company can tolerate for automatic merges?
- What approval process needs to be in place before a merge takes place? Who needs to look at the record/recommendation before the merge is complete?
Analyst research shows that only half of enterprises do an ROI analysis of their Information Management investment. Does the other half go on faith?
In the digital transformation to become a data-ready enterprise, information changes from maintained information to managed information. The goal of MDM DAY (May 12 at Informatica World) is to help companies change the way they measure success from “on time and on budget” to using revenue, time-to-market, and customer retention as key values.
13 customers across different industries (retail, distribution, manufacturing, healthcare, and more) will share their experience on how they master customer, product, patient, supplier or other domains to become a data-ready enterprise. Get a sneak preview of MDM Day below and register now. And there are many more sessions…
Spreading Like Wildfire: Fast, Accurate Big Data for Smart, Quick Decisions
Devon Energy is among the largest U.S.-based independent natural gas and oil producers. In this session, you’ll learn how to use Informatica’s MDM and Big Data solutions to provide authoritative and trusted data as it relates to wells, suppliers, and other key master and reference data and to prepare data for Big Data analytics.
Symantec and EMC: Using MDM and DQ to Deliver Trusted Customer Views
Follow Symantec’s journey from realizing it needed to ensure its enterprise ERP system contains trustworthy and relevant data to establishing processes based on Informatica technology and methodology. EMC’s innovation pushed it a step ahead in its MDM journey with a full-blown consolidation hub that generates, references, and publishes single customer ID in the enterprise. EMC was also able to operationalize the MDM with real-time cloud integration services. Their presentations will cover the business case, the establishment of data management processes, key lessons learned, and best practices within the context of both technical architecture and business change.
PartsSource Improves Customer Experience with Product Information
PartsSource will discuss how it uses Informatica PIM as the foundation for its omnichannel strategy and to support its industry-first one-stop shop online catalog for medical parts. PartsSource will explain how Informatica helped reduce the time needed to launch and update products from 120 minutes to 2 minutes using fewer employees.
Horizon BCBS and Sutter Health: Using MDM for Reference Data and More
Horizon’s master data is consumed by a diverse set of applications ranging from transactional care management systems to analytics systems that compile member risk scores. Its MDM based RDM initiative greatly mitigates, if not eliminates, the pain points of a legacy solution that had limited integration capabilities, data model constraints, data divergence, and governance enforcement.
St. Joseph Health initiated an enterprise information management initiative in support of its analytics program in 2014. Kim Jackson, Director of Analytics, Data Warehousing and Special Applications at St. Joseph Health, spoke about how she and her team developed a business case and initiated a data governance program.
CHRISTUS Health: Recovering Revenue through Supply Chain Optimization
The Director of Business Intelligence at CHRISTUS Health explains how a supply chain business intelligence initiative allowed the organization to create a supply chain dashboard that quickly revealed cost containment opportunities.
JDA Software: MDM Journey Soup to Nuts
JDA Software will share its entire MDM lifecycle, starting with identifying MDM need within a large organization and including MDM product selection, vendor selection, team selection, support model, handling ongoing business change requests, upgrading the platform to 9.7, and integrating with cloud offerings like SFDC and Workday as well as legacy transactional systems like PeopleSoft Financials, all using the MDM hub and PowerCenter.
Customer 360 Solution at Nissan Improving Return on Marketing Spend
Discover how Nissan reached this milestone by implementing its Nissan Customer Database, the foundation for a truly sustainable Customer360 view. Learn how to bring data together from multiple heterogeneous sources around Europe (24 countries), ensure data quality, and comply with local regulations.
Johnson & Johnson: Using MDM to Fast Track a Global Workday Implementation
In this session, HighPoint Solutions and Johnson & Johnson discusses how they leveraged Informatica’s MDM hub technology to complete this initiative for a Top Five pharmaceutical company, including techniques used to source country data, cleanse the source data, and integrate it into the Workday instance.
GE Aviation: Scaling MDM to the Enterprise
Master data has quickly become critical to both application and business transformation, but issues arise when large volumes of data need to be loaded into the MDM solution. GE Aviation will share their new approach for mastering data.
Elkjop and Monsanto: Two Best Practices Managing Product and Customer Data
Find out how Elkjop (the Nordics’ No. 1 consumer electronics retailer) increased both its product range offering and the quality of its product information across its entire portfolio by leveraging the strong embedded Data Quality tools found in the Informatica PIM. Elkjop will also discuss how they reduced their yearly development costs and operating costs.
Monsanto developed a strategy to address customer data quality issues, clearly articulated the business value of implementing the strategy, and successfully implemented the solution leveraging Informatica MDM and a new data governance program.
UPMC and Cambia: Data Governance as an Imperative for Master Data Quality
MDM Implementation Tips & Tricks
Learn from a best practice MDM implementation at Kroton.
The New Age of Proactive Data Governance
Overall, Informatica World provides more than 120 sessions to learn from experts how to make great data ready to use. Master Data Management is not complete without rules and their enforcement. In the age of proactive governance, data governance becomes the ENABLER for business and IT, orchestrating how humans collaborate and how processes are executed. Data governance becomes more AGILE.
Therefore, you may also want to check out the “Information Quality & Governance Track,” with more than 20 sessions on Wednesday, May 13, including customer use cases, user group discussions, roadmaps, what’s-new sessions and much more. Don’t forget to register now and learn more about the dependencies between data governance, stewardship and business roles.
If you’ve been contemplating transforming your business and making it a priority to embrace digital transformation, this year’s Informatica World 2015, in Las Vegas, has a series of B2B and B2C sessions for you. Here are some I’d like to recommend:
B2B Commerce Ready Data
PartsSource Improves Customer Experience with Product Information in Medical Parts
Brian Thomas, Director of Application, PartsSource
A leading provider of medical replacement parts solutions, PartsSource will discuss how it uses Informatica PIM as the foundation for its omnichannel strategy and to support its industry-first one-stop shop online catalog for medical parts. PartsSource will explain how Informatica helped reduce the time needed to launch and update products from 120 minutes to 2 minutes using fewer employees, making it easier than ever to connect over 3,300 hospitals to thousands of OEMs and suppliers.
B2C Commerce Ready Data
Elkjop and Monsanto: PIM and MDM to Manage Product and Customer Data
Thomas Thykjer, Master Data Architect, Elkjop Nordics AS
Jim Stellern, US Commercial Data Management and BI IT Lead, Monsanto
Elkjop, the largest consumer electronics retailer in the Nordic countries, increased both its product range offering and the quality of its product information across its entire portfolio by leveraging the strong embedded Data Quality tools found in the Informatica PIM. Elkjop will also discuss how they reduced their yearly development costs and operating costs.
Monsanto developed a strategy to address customer data quality issues, clearly articulated the business value of implementing the strategy, and successfully implemented the solution leveraging Informatica MDM and a new data governance program.
Omnichannel Ready: What’s New in PIM 8?
Latest Features, Demo and Roadmap
Stefan Reinhardt, Senior Product Manager PIM, R&D DEU PIM, Informatica
Ben Rund, Sr. Director Product Marketing, Information Quality Solutions, Informatica
Are you commerce ready? Can you say the same about your data? Are you focused on the betterment of your supply chain, marketing, ecommerce, merchandising and category management? This session is aimed at those wanting to sell more products faster.
Learn what’s new with PIM 8:
- Kits and bundles for superior cross-selling
- High-data-volume architecture
- Role- and task-based web interfaces for increased efficiency on product content collaboration
- Business user dashboard
- 1WorldSync data pool syndication for compliant CPG data
- Product Data as a Service (DaaS) for price intelligence and content benchmarking.
We will be covering all these new features during the session in a live demo.
Furthermore, we will also cover the PIM roadmap, showcasing how PIM is evolving and gaining MDM-like features by fueling product data apps for different use cases and industries, leveraging the Intelligent Data Platform.
User Group Session for Omnichannel
Are you looking to talk to the experts? Don’t miss out on our user group sessions where you will be able to discuss and work directly with the R&D and product management experts. They will be there to answer your questions as well as hear your thoughts and feedback. It’s your opportunity to be heard and get an answer to your question, don’t miss out!
37 Sessions to Master the Digital Transformation
A total of 37 sessions, focused on Master Data Management and Information Governance, key disciplines and the foundation of successfully mastering digital transformation, will be held on May 12 and 13.
The Good, The Bad, and The Ugly! Experiencing Great (and Not So Great) Data-Driven Marketing in Day-to-Day Life
I spend my professional life helping marketers get the most out of their data. So it really hits me when I experience real life case studies in my day-to-day personal life. When I do run across these great (and not so great) experiences, it broadens my perspective and helps me figure out new ways that I can help our customers use clean, safe, and connected data to revolutionize their marketing efforts.
I feel compelled to blog about these great customer experiences in hopes that other marketers can learn from these encounters like I did. So this is the first installment of an ongoing series of blog posts about where I’ve experienced great (and not so great) data-driven marketing experiences.
I love my insurance company for many reasons, but the last experience I had with them was truly exceptional. I had been on their website getting an insurance quote for a new home. I still had a few questions, so I initiated an online chat. The online representative there quickly and efficiently answered my question and I left the experience with a thorough understanding of the potential policy. About a week later, seemingly unrelated, I put in a claim through a local vendor for a chipped windshield repair. Then two weeks later, I called back and the phone system recognized my mobile number and through the automated system asked me if I was calling about the homeowner’s policy quote I had received a few weeks ago. I pressed 1 for yes, and here’s where the impressive part begins…
A representative quickly answered the phone. He was able to view all of the information I had put into the online quote tool. He then referenced the question I had asked the representative in the online chat, and asked if I had any further questions about it. Within minutes I had my new policy set up, but it didn’t stop there. He asked if the windshield chip I had the previous week had been repaired to my satisfaction. Then he noticed that although I do have a credit card through them, I haven’t used it in some time, and offered me a 0% balance transfer offer for 36 months with no transfer fee on that card. Heck, we just bought a new house and I know there will be plenty of expenses, so sign me up! Finally, he noticed that because I was about to move to a new zip code, my car insurance rates would be going down slightly and offered to send me to the automobile insurance team to make the change to my policy.
The system clearly tied all of my recent and past activities from various channels together, analyzed them, and leveraged some sort of recommendation engine to guide the customer support representative to provide truly customized service. You can be assured I love my insurance company even more, and I will be dusting off a stagnant account that I hadn’t used in years. A+.
Oh, how I wish my bank would embrace Total Customer Relationship. My husband, my children, and I have far too many accounts at our bank. Each child has a savings account, we have a joint checking account, we have a savings account, he has a personal savings account, I have a personal savings account, and then there’s the cash reserve line, the credit card and a few CDs. We recently sold a home and the now-paid-off mortgage was through them. Plus, we’ve been customers for almost 20 years. So suffice it to say, we have been loyal customers.
Well, the other day I had to go get a cashier’s check for a school activity for one of my children. I know, it’s pretty strange that they needed a cashier’s check, but I digress. I went to pull half out of my son’s minor savings account and half out of my individual checking account. Neither of those accounts has much money in it, nor has either been open for very long. Call me spoiled, but I’m used to getting these types of service fees waived because of my long tenure and deep relationship with our bank. But because the accounts I had pulled the money from didn’t have that kind of history, they weren’t willing to waive the fees. I was in a hurry because I was running late from shopping at a shoe store (see “The Ugly” below), so I didn’t have the time or energy to fight it. I paid the darn $5 fee, but I was irritated. Clearly they couldn’t easily see the total customer relationship I have with their institution. They aren’t leveraging their data to tell a complete story, and they missed an opportunity to give a loyal customer a great experience.
And The Ugly
A few days ago, I was at a shoe store picking up some new soccer cleats for one of my children. I had gotten an email offer for 30% off, so I pulled up the email and prepared to use it at the cash register. For whatever reason, the email had a blank where a code was supposed to be, and the woman at the register, despite her best efforts, couldn’t use it. So, trying to be helpful, she looked at my loyalty account and, lo and behold, I had $40 worth of rewards points that I didn’t even know existed. But I had to first download an app to try to issue a coupon using those points. I downloaded the app, put in my loyalty number, and no points were available.
Turns out, I had two accounts, but they weren’t linked despite having the same name and phone number. One had an address that was misspelled in the system, so it apparently wouldn’t merge with the other account. Oh, data quality and address validation, how I missed you at that moment! She corrected the address, informed me that it would now merge the two accounts, and told me to try logging in again. Of course, I knew there was no way this was a real-time, or even near-real-time, process, but she was insistent. So I tried again. Nothing.
The woman couldn’t have been nicer, but poor data quality processes and long batch windows had her hands tied. I was advised to call the customer support line, but of course it was a Sunday afternoon and nobody was there to pick up. So 45 minutes later, I left the store, irritated and very late, without the shoes I had planned to purchase, on principle. In the future, I’ll be going down the street and shopping at another shoe store. It’s my own personal strike against antiquated, inaccurate, incomplete, and painfully slow data processes that result in bad customer experiences!
In The End…
In the greater scheme of things, these varying degrees of customer experience “misses” aren’t exactly a crisis. It’s not curing cancer or solving world hunger, but to consumers, having a great customer experience is really important. Wouldn’t you rather have your customers raving about a great experience than grumbling about a bad one, or losing a customer to an ugly one?
We marketers can make the difference! We own that end-to-end omnichannel experience. We need to make sure that our data is clean, safe, and connected so we can provide our customers what they expect and frankly deserve from us.
Informatica’s Total Customer Relationship Solution gives organizations confidence that they have the kind of great customer data that allows them to surpass customer acquisition and retention goals by providing consistent, integrated, and seamless customer experiences across channels. The end result? Great experiences that customers are inspired to share with their family and friends at dinner parties and in blog posts like this one.
Want to learn more? Check out these webinars to see how Informatica and our customers and partners are revolutionizing the customer experience.
Improving Order to Cash matters regardless of market cycle
Order to Cash (OTC) matters to today’s CFOs and their financial teams, even as CFOs shift from a cost-reduction footing to a growth-management footing. As the business process concerned with receiving and processing customer sales, a well-functioning OTC process is not only about preserving cash; it is also about improving the working capital delivered to the bottom line, which successful growth strategies require. Specifically, OTC provides the cash flow needed to speed collections and improve working capital turnover.
This drives concrete improvements to finance metrics, including Days Sales Outstanding (a measure of the average number of days that a company takes to collect revenue after a sale has been made) and the overall cost of collections. It should be clear that a poorly running order-to-cash process can create tangible business issues. These include, but are not limited to, the following:
- Accuracy of purchase orders
- Accuracy of invoices
- Volume of customer disputes
- Incorrect application of payment terms
- Approval of inappropriate orders
- Errors in order fulfillment
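As a concrete illustration of the Days Sales Outstanding metric defined above, here is a minimal calculation sketch in Python. The dollar figures are invented for the example, not drawn from the text.

```python
# Days Sales Outstanding (DSO): the average number of days a company
# takes to collect revenue after a sale has been made.
# Standard formula: DSO = (accounts receivable / total credit sales) * days in period.

def days_sales_outstanding(accounts_receivable: float,
                           total_credit_sales: float,
                           period_days: int = 365) -> float:
    """Return DSO in days for the given period."""
    return accounts_receivable / total_credit_sales * period_days

# Illustrative example: $1.2M outstanding receivables against
# $10M in annual credit sales.
dso = days_sales_outstanding(1_200_000, 10_000_000)
print(round(dso, 1))  # 43.8
```

Lowering DSO through cleaner invoicing and fewer disputes directly frees working capital, which is why the list of issues above shows up on the CFO’s radar.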
CFOs tell us that it is critical that they make sure that “good cash management is occurring in compliance with regulation”. It is important as well to recognize that OTC cuts across many of the primary activities of the business value chain—especially those related to sales, marketing, and services.
How do you improve your order to cash process?
So how do financial leaders go about improving OTC? They can start by looking at the entire OTC process, from quoting and order processing through fulfillment, invoicing, cash receipt and application, and credit and collections management. The diagram below shows the specific touch points where the process can be improved—each of these should be examined for process or technology improvement.
However, the starting point is where the most concrete action can be taken. Fixing customer data fixes the data used by each succeeding process improvement area. This is where a single, connected view of the customer can be established. It improves the OTC process by doing the following:
- Fixes your customer data
- Establishes the right relationships between customers
- Establishes tax and statutory registrations and credit limits
- Prevents penalties for delivering tax documents to the wrong place

Customer Data Mastering (CDM) does this by providing a single view of customers, a 360-degree view of relationships, and a complete view of customer relationships including interactions and transactions.
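To make “a single view of customers” concrete, here is a minimal record-consolidation sketch in Python. The matching rule (exact match on a normalized name and phone key) and all record fields are illustrative assumptions; production MDM matching, such as Informatica’s, uses far richer probabilistic and fuzzy rules.

```python
# Minimal sketch of customer-record consolidation: normalize key fields
# and group records that share the same normalized identity.
# NOTE: exact-match keys are a simplification for illustration only.
import re
from collections import defaultdict

def match_key(record: dict) -> tuple:
    """Build a normalized (name, phone) key: strip punctuation and case."""
    name = re.sub(r"\W+", "", record["name"]).lower()
    phone = re.sub(r"\D", "", record["phone"])
    return (name, phone)

def consolidate(records: list) -> list:
    """Group records that resolve to the same normalized key."""
    groups = defaultdict(list)
    for r in records:
        groups[match_key(r)].append(r)
    return list(groups.values())

records = [
    {"name": "ACME Corp.", "phone": "(555) 010-2000", "address": "1 Main St"},
    {"name": "Acme Corp",  "phone": "555-010-2000",   "address": "One Main Street"},
    {"name": "Globex",     "phone": "555-010-3000",   "address": "9 Elm Ave"},
]
print([len(g) for g in consolidate(records)])  # [2, 1]
```

The two “ACME” records collapse into one customer despite differences in punctuation and formatting, which is exactly the duplication that inflates mailing costs and splits credit histories across accounts.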
CDM matters to the CFO and the Business as a whole
It turns out that CDM does not just matter to OTC and the finance organization. It also matters to the corporate value chain, impacting primary activities including outbound logistics, marketing and sales, and service. Specifically, CDM accomplishes the following:
- It reduces costs by reducing invoicing and billing inaccuracies, customer disputes, mailing unconsolidated statements, sending duplicate mail, and dealing with returned mail
- Increases revenue by boosting marketing effectiveness at segmenting customers for more personalized offers
- Increases revenue by boosting sales effectiveness by making more relevant cross-sell and up-sell offers
- Reduces costs by boosting call center effectiveness by resolving customer issues more quickly
- Improves customer satisfaction, customer loyalty and retention because employees are empowered with complete customer information to deliver great customer experiences across channels and touch points
So as we have discussed, today’s CFOs are finding that they need to become more and more growth oriented. This transition is made more effective with a well-functioning OTC process. While OTC impacts other elements of the business too, the starting point for fixing OTC is customer data mastering, because it fixes the data that the rest of the process depends on.
The Oil and Gas (O&G) industry is an important backbone of every economy. It is also an industry that has weathered the storm of constantly changing economic trends, regulations and technological innovations. O&G companies by nature have complex and data-intensive processes. To operate profitably under changing trends, policies and guidelines, O&G companies need to manage these data processes really well.
The industry is subject to pricing volatility based on patterns of supply and demand affected by geopolitical developments, economic downturns and public scrutiny. Competition from other sources such as cheap natural gas, together with low margins, adds fuel to the fire, making it hard for O&G companies to achieve sustainable and predictable outcomes.
A recent PWC survey of oil and gas CEOs similarly concluded that “energy CEOs can’t control market factors such as the world’s economic health or global oil supply, but can change how they respond to market conditions, such as getting the most out of technology investments, more effective use of partnerships and diversity strategies.” The survey also revealed that nearly 80% of respondents agreed that digital technologies are creating value for their companies when it comes to data analysis and operational efficiency.
O&G firms run three distinct business operations: upstream exploration & production (E&P), midstream (storage & transportation) and downstream (refining & distribution). All of these operations need a few core data domains standardized for every major business process. However, a key challenge faced by O&G companies is that this critical core information is often spread across multiple disparate systems, making it hard to make timely decisions. To ensure effective operations and to grow their asset base, it is vital for these companies to capture and manage critical data related to these domains.
E&P’s core data domains include wellhead/materials (asset), geospatial location data and engineers/technicians (associate). Midstream’s include trading partners and distribution, and downstream’s include commercial and residential customers. Classic distribution use cases range from shipping locations and large-scale clients, such as airlines and logistics providers buying millions of gallons of fuel and industrial lube products, down to individual gas station customers. The industry also relies heavily on reference data and charts of accounts for financial cost and revenue roll-ups.
The main E&P asset, the well, goes through its life cycle and changes characteristics (location, ID, name, physical characterization, depth, crew, ownership, etc.), which are all master data aspects to consider for this baseline entity. If we master this data and create a consistent representation across the organization, it can then be linked to transaction and interaction data so that O&G companies can drive their investment decisions and split cost and revenue through reporting and real-time processes around
- Crew allocation
- Royalty payments
- Safety and environmental inspections
- Maintenance and overall production planning
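As a sketch of what a wellhead master record might capture, here is a minimal Python data structure covering the lifecycle attributes listed above (location, ID, name, depth, crew, ownership, status). The class and field names are illustrative assumptions, not any vendor’s actual data model.

```python
# Illustrative wellhead master record. A real MDM hub would also track
# survivorship rules, source-system cross-references and history;
# this sketch only shows the core master attributes.
from dataclasses import dataclass, field

@dataclass
class WellMaster:
    well_id: str                 # cross-enterprise master identifier
    name: str
    latitude: float              # geospatial location data
    longitude: float
    depth_m: float
    operator: str                # current ownership
    crew: list = field(default_factory=list)
    status: str = "exploration"  # e.g. exploration -> drilling -> producing

    def reassign_crew(self, new_crew: list) -> None:
        """Crew allocation changes over the well's life cycle."""
        self.crew = list(new_crew)

# Hypothetical well record, purely for illustration.
well = WellMaster("W-1001", "Eagle Ford 7H", 28.9, -98.1, 3_400.0, "ExampleCo")
well.reassign_crew(["drill-team-a"])
print(well.status, well.crew)  # exploration ['drill-team-a']
```

Keeping a single mastered record like this, rather than divergent copies per system, is what lets crew allocation, royalty payments and inspections all reference the same well identity.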
E&P firms need a solution that allows them to:
- Have a flexible multidomain platform that permits easier management of different entities under one solution
- Create a single, cross-enterprise instance of a wellhead master
- Capture and master the relationships between the well, equipment, associates, land and location
- Govern end-to-end management of assets, facilities, equipment and sites throughout their life cycle
The upstream O&G industry is uniquely positioned to take advantage of the vast amounts of data from its operations. Thousands of sensors at the wellhead, millions of parts in the supply chain, global capital projects and many highly trained staff create a data-rich environment. A well-implemented MDM solution provides a strong foundation for this data-driven industry, delivering clean, consistent and connected core master data so these companies can cut material costs, reduce IT maintenance and increase margins.
To learn more about how you can achieve upstream operational excellence with Informatica Master Data Management, check out this recorded webinar with @OilAndGasData