Tag Archives: Master Data Management
The Oil and Gas (O&G) industry is an important backbone of every economy. It is also an industry that has weathered the storm of constantly changing economic trends, regulations and technological innovations. O&G companies by nature have complex and data-intensive processes. To operate profitably under changing trends, policies and guidelines, O&G companies need to manage these data processes really well.
The industry is subject to pricing volatility driven by microeconomic patterns of supply and demand, which are in turn affected by geopolitical developments, economic downturns and public scrutiny. Competition from other sources, such as cheap natural gas, and low margins add fuel to the fire, making it hard for O&G companies to achieve sustainable, predictable outcomes.
A recent PwC survey of oil and gas CEOs similarly concluded that “energy CEOs can’t control market factors such as the world’s economic health or global oil supply, but can change how they respond to market conditions, such as getting the most out of technology investments, more effective use of partnerships and diversity strategies.” The survey also revealed that nearly 80% of respondents agreed that digital technologies are creating value for their companies when it comes to data analysis and operational efficiency.
O&G firms run three distinct business operations: upstream exploration & production (E&P), midstream (storage & transportation) and downstream (refining & distribution). All of these operations need a few core data domains standardized for every major business process. However, a key challenge O&G companies face is that this critical core information is often spread across multiple disparate systems, making it hard to make timely decisions. To ensure effective operations and to grow their asset base, it is vital for these companies to capture and manage critical data related to these domains.
Upstream core data domains include wellhead/materials (asset), geospatial location data and engineers/technicians (associate). Midstream adds trading partners and distribution, and downstream adds commercial and residential customers. Classic distribution use cases range from shipping locations and large-scale clients, such as airlines and other logistics providers buying millions of gallons of fuel and industrial lube products, down to gas station customers. The industry also relies heavily on reference data and charts of accounts for financial cost and revenue roll-ups.
The main E&P asset, the well, goes through a life cycle in which its characteristics change (location, ID, name, physical characterization, depth, crew, ownership, etc.), all of which are master data aspects to consider for this baseline entity. If we master this data and create a consistent representation across the organization, it can then be linked to transaction and interaction data so that O&G companies can drive investment decisions and split cost and revenue through reporting and real-time processes around
- Crew allocation
- Royalty payments
- Safety and environmental inspections
- Maintenance and overall production planning
E&P firms need a solution that allows them to:
- Have a flexible multidomain platform that permits easier management of different entities under one solution
- Create a single, cross-enterprise instance of a wellhead master
- Capture and master the relationships between the well, equipment, associates, land and location
- Govern end-to-end management of assets, facilities, equipment and sites throughout their life cycle
The upstream O&G industry is uniquely positioned to take advantage of the vast amount of data from its operations. Thousands of sensors at the wellhead, millions of parts in the supply chain, global capital projects and many highly trained staff create a data-rich environment. A well-implemented MDM solution provides a strong foundation for this data-driven industry, delivering the clean, consistent and connected information about core master data that lets these companies cut material and IT maintenance costs and increase margins.
To learn more about how you can achieve upstream operational excellence with Informatica Master Data Management, check out this recorded webinar with @OilAndGasData.
I recently got to meet with a very enlightened insurance company that was actively turning its SWOTT analysis (with the second T being trends) into concrete action. They shared with me that they view their go-forward “right to win” as determined by the quality of the customer experience they deliver through their traditional channels and, increasingly, through digital channels. One marketing leader joked early on that “it’s no longer about the money; it is about the experience.” The marketing and business leaders I met with made it extremely clear that they feel a sense of urgency to respond to what they saw as significant market changes on the horizon. What this company wanted to achieve was a single view of the customer across each of its distribution channels as well as its agent population. Typical of many businesses today, they had determined that they needed an automated, holistic view into things like customer history. Smartly, this business wanted to bring together its existing customer data with its customer leads.
Using Data to Grow the Percentage of Customers That Are Cross-Sold
Taking this step was seen as allowing them to understand when an existing customer is also a lead for another product. With this knowledge, they wanted to provide special offers to accelerate the conversion from lead to customer with more than one product. What they wanted to do here reminded me of the capabilities of 58.com, eBay, and other Internet pure plays. The reason for doing this well was described recently by Gartner, which suggests that business success is increasingly determined by what it calls “business moments.” Without a first-rate experience that builds on what it already knows about its customers, this insurance company worries it could be increasingly at risk from Internet pure plays. Just as important, the degree of cross-sell is for many businesses a major determinant of whether a customer is profitable or not.
Getting Customer Data Right is Key to Developing a Winning Digital Experience
To drive a first rate digital experience, this insurance company wanted to apply advanced analytics to a single view of customer and prospect data. This would allow them to do things like conduct nearest neighbor predictive analysis and modeling. In this form of analysis, “the goal is to predict whether a new customer will respond to an offer based on how other similar customers have responded” (Data Science for Business, Foster Provost, O’Reilly, 2013, page 147).
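As a rough sketch of the idea (not this insurer's actual system), nearest-neighbor prediction ranks past customers by similarity to a new one and takes a majority vote among the closest matches. The customer features and data below are entirely hypothetical:

```python
import math
from collections import Counter

def knn_predict(history, new_customer, k=3):
    """Predict whether a new customer will respond to an offer by
    majority vote among the k most similar past customers."""
    # Euclidean distance over numeric customer features
    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    # Rank past customers by similarity to the new one
    neighbors = sorted(history, key=lambda rec: distance(rec[0], new_customer))
    votes = Counter(responded for _, responded in neighbors[:k])
    return votes.most_common(1)[0][0]

# Hypothetical features: (age, policies held, years as customer)
history = [
    ((34, 1, 2), True),
    ((36, 1, 3), True),
    ((58, 3, 20), False),
    ((61, 2, 25), False),
    ((33, 1, 1), True),
]
print(knn_predict(history, (35, 1, 2)))  # nearest neighbors all responded
```

A production system would use a library implementation with proper feature scaling, but the principle is the same: similar customers, similar responses.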
What has been limiting this business, like so many others, is that its customer data is scattered across many enterprise systems. For just one division, they have more than one Salesforce instance. Yet this company’s marketing team knew that to keep its customers, it needed to serve them omnichannel and establish a single, unified customer experience. To make this happen, they needed, for the first time, to share holistic customer information across their ecosystems. At the same time, they knew they would need to protect their customers’ privacy; that is, only certain people would be able to see certain information. They wanted the ability to selectively mask data by role, protecting consumers in particular by allowing only users with, in defense parlance, a need to know to see a subset of the holistic set of information collected. When asked about the need for a single view of the customer, the digital marketing folks openly shared that they perceived the potential for external digital market entrants, à la Porter’s five forces of competition. This firm saw such entrants as either taking market share or, over time, disintermediating it from its customers as more and more purchasing of insurance moves to the Web. Given the risk, their competitive advantage needed to shift to knowing their customers better and responding to them better on the Web. This clearly included the new customers they are trying to win, in the language of Theodore Levitt.
Competing on Customer Experience
In sum, this insurance company smartly felt that it needed to “compete on customer experience” (a new phrase for me), and this requires superior knowledge of existing and new customers. This means they needed as complete and correct a view of customers as possible, including addresses, connection preferences, and, increasingly, social media responses. It also means competing directly with those that have honed their skills in web design, social presence, and advanced analytics. To do this, they will create predictive capabilities that make use of their superior customer data. With this prescience, this moment need not become another strategic collision like that of Starbucks and fast food vendors, where the desire to grow forced competition between the existing player and new entrants wanting to claim a portion of the existing player’s business.
Blogs and Articles
Let’s face it, building a Data Governance program is no overnight task. As one CDO puts it: “data governance is a marathon, not a sprint.” Why? Because data governance is a complex business function that encompasses technology, people and process, all of which have to work together effectively to ensure the success of the initiative. Because of the scope of the program, Data Governance often calls for participants from different business units within an organization, and it can be disruptive at first.
Why bother, then, given that data governance is complex, disruptive, and could potentially introduce additional cost to a company? Well, the drivers for data governance vary across organizations. Let’s take a close look at some of the motivations behind a data governance program.
For companies in heavily regulated industries, establishing a formal data governance program is a mandate. When a company is not compliant, the consequences can be severe. Penalties can include hefty fines, brand damage, lost revenue, and even potential jail time for the person held accountable for the noncompliance. To meet ongoing regulatory requirements and adhere to data security policies and standards, companies need to rely on clean, connected and trusted data that enables transparency and auditability in their reporting, so they can meet mandatory requirements and answer critical questions from auditors. Without a dedicated data governance program in place, a compliance initiative can become an ongoing nightmare for companies in regulated industries.
A data governance program can also be established to support a customer-centricity initiative. To make effective cross-sells and up-sells to your customers and grow your business, you need clear visibility into customer purchasing behaviors across multiple shopping channels and touch points. Customers’ shopping behaviors and attributes are captured in the data; therefore, to gain a thorough understanding of your customers and boost your sales, a holistic data governance program is essential.
Other reasons for companies to start a data governance program include improving efficiency, reducing operational cost, supporting better analytics and driving more innovation. As long as it’s a business-critical area with data at the core of the process, and the business case is sound, there is a compelling reason to launch a data governance program.
Now that we have identified the drivers for data governance, how do we start? This rather loaded question gets into the details of the implementation. A few critical elements come into consideration, including: identifying and establishing task forces such as the steering committee, data governance team and business sponsors; defining roles and responsibilities for the stakeholders involved in the program; and defining metrics for tracking results. Soon you will find that, on top of everything, communications, communications and more communications is probably the most important tactic of all for driving the initial success of the program.
A rule of thumb? Start small, take one step at a time and focus on producing something tangible.
Sounds easy, right? Well, let’s hear what the real-world practitioners have to say. Join us at this Informatica webinar to hear Michael Wodzinski, Director of Information Architecture, Lisa Bemis, Director of Master Data, Fabian Torres, Director of Project Management from Houghton Mifflin Harcourt, global leader in publishing, as well as David Lyle, VP of product strategy from Informatica to discuss how to implement a successful data governance practice that brings business impact to an enterprise organization.
If you are currently kicking the tires on setting up a data governance practice in your organization, I’d like to invite you to visit a member-only website dedicated to data governance: http://governyourdata.com/. The site currently has over 1,000 members and is designed to foster open communication on everything data governance. There you will find conversations on best practices, methodologies, frameworks, tools and metrics. I would also encourage you to take a data governance maturity assessment to see where you currently stand on the data governance maturity curve and compare your result against industry benchmarks. More than 200 members have taken the assessment to gain a better understanding of their current data governance programs, so why not give it a shot?
Data governance is a journey, and likely a never-ending one. We wish you the best of luck on this effort and a joyful ride! We’d love to hear your stories.
Achieving and maintaining a single, semantically consistent version of master data is crucial for every organization. As many companies move from an account- or product-centric approach to a customer-centric model, master data management is becoming an important part of their enterprise data management strategy. MDM provides the clean, consistent and connected information your organization needs to –
- Empower customer facing teams to capitalize on cross-sell and up-sell opportunities
- Create trusted information to improve employee productivity
- Be agile with data management so you can make confident decisions in a fast changing business landscape
- Improve information governance and be compliant with regulations
But there are challenges ahead for organizations. As Andrew White of Gartner very aptly wrote in a blog post, we are only half pregnant with Master Data Management. In his post, Andrew talked about the increasing number of inquiries he gets from organizations that are making some pretty simple mistakes in their approach to MDM without realizing the long-run impact of those decisions.
Over the last 10 years, I have seen many organizations struggle to implement MDM in the right way. Some MDM implementations have failed, and many have taken more time and incurred more cost than planned before showing value.
So, what is the secret sauce?
A key factor for a successful MDM implementation lies in mapping your business objectives to the features and functionality offered by the product you are selecting. It is the phase where you ask the right questions and get them answered. There are a few great ways for organizations to get this done, and talking to analysts is one of them. Another option is to attend MDM-focused events that allow you to talk to experts, learn from other customers’ experiences and hear about best practices.
We at Informatica have been working hard to deliver a flexible MDM platform that provides complete capabilities out of the box. But the MDM journey is about more than technology and product features, as we have learned over the years. To ensure our customers’ success, we are sharing the knowledge and best practices we have gained from hundreds of successful MDM and PIM implementations. Informatica MDM Day is a great opportunity for organizations, where we will –
- Share best practices and demonstrate our latest features and functionality
- Show product capabilities that address your current and future master data challenges
- Provide you the opportunity to learn from other customers’ MDM and PIM journeys
- Share knowledge about MDM-powered applications that can help you realize early benefits
- Share our product roadmap and vision
- Provide you an opportunity to network with other like-minded MDM and PIM experts and practitioners
So, join us by registering today for our MDM Day event in New York on February 24. We are excited to see you all there and to walk with you toward MDM nirvana.
That’s right, Valentine’s Day is upon us, the day that symbolizes the power of love and has the ability to strengthen relationships between people. I’ve personally experienced 53 Valentine’s Days so I believe I speak with no small measure of authority on the topic of how to make the best of it. Here are my top five suggestions for having a great day:
- Know everything you can about the people you have relationships with
- Quality matters
- ALL your relationships matter
- Uncover your hidden or anonymous relationships
- Treat your relationships with respect all year long
OK, I admit, this is not the most romantic list ever and might get you in more trouble with your significant other than actually forgetting Valentine’s Day altogether! But, what did you expect? I work for a software company, not eHarmony!
Right. Software. Let’s put this list into the context of government agencies.
- Know everything – If your agency’s mission involves delivering services to citizens, likely, you have multiple “systems of record”, each with a supposed accurate record of all the people being tracked by each system. In reality though, it’s rare that the data about individuals is consistently accurate and complete from system to system. The ability to centralize all the data about individuals into a single, authoritative “record” is key to improving service delivery. Such a record will enable you to ensure the citizens you serve are able to take full advantage of all the services available to them. Further, having a single record for each citizen has the added benefit of reducing fraud, waste and abuse.
- Quality matters – Few things hinder the delivery of services more than bad data, data with errors, inconsistencies and gaps in completeness. It is difficult, at best, to make sound business decisions with bad data. At the individual level and at the macro level, agency decision makers need complete and accurate data to ensure each citizen is fully served.
- All relationships matter – In this context, going beyond having single records to represent people, it’s also important to have single, authoritative views of other entities – programs, services, providers, deliverables, places, etc.
- Uncover hidden relationships – Too often, in the complex eco-system of government programs and services, the inability to easily recognize relationships between people and the additional entities mentioned above creates inefficiencies in the “system”. For example, it can go unnoticed that a single parent is not enrolled in a special program designed for their unique life circumstances. Flipping the coin, not having a full view of hidden relationships also opens the door for the less scrupulous in society, giving them the ability to hide their fraudulent activities in plain sight.
- Treat relationships respectfully all year – Data hygiene is not a one-time endeavor. Having the right mindset, processes and tools to implement and automate the process of “mastering” data as an on-going process will better ensure the relationship between your agency and those it serves will remain positive and productive.
I may not win the “Cupid of the Year” award, but, I hope my light-hearted Valentine’s Day message has given you a thing or two to think about. Maybe Lennon and McCartney are right, between people, “Love is All You Need”. But, we at Informatica believe for Government-Citizen relationships, a little of the right software can go a long way.
The signs that healthcare is becoming a more consumer (think patients, members, providers) driven industry are evident all around us. I see provider and payer organizations clamoring for more data, specifically data that is actionable, relatable and has integrity. Armed with this data, healthcare organizations are able to differentiate around a member/patient-centric view.
These consumer-centric views convey the total value of the relationships healthcare organizations have with consumers. Understanding this total value creates a more comprehensive picture of consumers because it captures an individual’s critical relationships, including patient to primary care provider, member to household, provider to network and even member to legacy plans. This is the type of knowledge that informs new treatments, targets preventative care programs and improves outcomes.
Payer organizations are collecting and analyzing data to identify opportunities for more informed care management and for segmentation that reaches new, high-value customers in individual markets. By segmenting populations and targeting messaging to them, health plans increase member satisfaction and cost-effectively expand and manage provider networks.
How will they accomplish this? By enabling members to interact in health and wellness forums, analyzing member behavior and trends, and informing care management programs with a 360-degree view of members, to name a few. Payers will also drive new member programs, member retention and member engagement marketing and sales programs by investigating complete views of member households and market segments.
In the provider space, this relationship building can be a little more challenging because often consumers as patients do not interact with their doctor unless they are sick, creating gaps in data. When provider organizations have a better understanding of their patients and providers, they can increase patient satisfaction and proactively offer preventative care to the sickest (and most likely to engage) of patients before an episode occurs. These activities result in increased market share and improved outcomes.
Where can providers start? By creating a 360 view of the patient, organizations can now improve care coordination, open new patient service centers and develop patient engagement programs.
Analyzing populations of patients, and fostering patient engagement based on Meaningful Use requirements or Accountable Care requirements, building out referral networks and developing physician relationships are essential ingredients in consumer engagement. Knowing your patients and providing a better patient experience than your competition will differentiate provider organizations.
You may say, “This all sounds great, but how does it work?” An essential ingredient is clean, safe and connected data, and that requires an investment in data as an asset: just as you invest in real estate and human capital, you must invest in the accessibility and quality of your data. To be successful, arm your team with tools to govern data, ensuring ongoing data integrity and quality, removing duplicate records and dynamically incorporating data validation and quality rules. These tools include master data management, data quality and metadata management, and together they support a total customer relationship view of members, patients and providers.
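To make the idea of data validation and quality rules concrete, here is a minimal sketch of how a governance team might codify such rules as small checks applied to each record. The field names and rule patterns are purely illustrative assumptions, not the rules of any particular MDM product:

```python
import re

# Hypothetical quality rules; each maps a field name to a predicate.
RULES = {
    "member_id": lambda v: bool(re.fullmatch(r"M\d{6}", v or "")),
    "email":     lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
    "zip_code":  lambda v: bool(re.fullmatch(r"\d{5}(-\d{4})?", v or "")),
}

def validate(record):
    """Return the list of fields that fail their quality rule."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

record = {"member_id": "M123456", "email": "pat@example.com", "zip_code": "9410"}
print(validate(record))  # ['zip_code']
```

Codifying rules this way lets them run automatically at every data entry and integration point, rather than as a one-time cleanup.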
This blog post initially appeared on CMSwire.com and is reblogged here with their consent.
Friends of mine were remodeling their master bath. After searching for a claw foot tub in stores and online, they found the perfect one that fit their space. Because it was only available for purchase on the retailer’s e-commerce site, they bought it online.
When it arrived, the tub was too big. The dimensions online were incorrect. They went to return it to the closest store, but were told they couldn’t — because it was purchased online, they had to ship it back.
The retailer didn’t have a total customer relationship view or a single view of product information or inventory across channels and touch points. This left the customer representative working with a system that was a silo of limited information. She didn’t have access to a rich customer profile. She didn’t know that Joe and his wife spent almost $10,000 with the brand in the last year. She couldn’t see the products they bought online and in stores. Without this information, she couldn’t deliver a great customer experience.
It was a terrible customer experience. My friends share it with everyone who asks about their remodel. They name the retailer when they tell the story. And, they don’t shop there anymore. This terrible customer experience is negatively impacting the retailer’s revenue and brand reputation.
Bad customer experiences happen a lot. Companies in the US lose an estimated $83 billion each year due to defections and abandoned purchases as a direct result of a poor experience, according to a Datamonitor/Ovum report.
Customer Experience is the New Marketing
Gartner believes that by 2016, companies will compete primarily on the customer experiences they deliver. So who should own customer experience?
Twenty-five percent of CMOs say that their CEOs expect them to lead customer experience. What’s their definition of customer experience? “The practice of centralizing customer data in an effort to provide customers with the best possible interactions with every part of the company, from marketing to sales and even finance.”
Mercedes Benz USA President and CEO, Steve Cannon said, “Customer experience is the new marketing.”
The Gap Between Customer Expectations + Your Ability to Deliver
My previous post, 3 Barriers to Delivering Omnichannel Experiences, explained how omnichannel is all about seeing your business through the eyes of your customer. Customers don’t think in terms of channels and touch points, they just expect a seamless, integrated and consistent customer experience. It’s one brand to the customer. But there’s a gap between customer expectations and what most businesses can deliver today.
Most companies who sell through multiple channels operate in silos. They are channel-centric rather than customer-centric. This business model doesn’t empower employees to deliver seamless, integrated and consistent customer experiences across channels and touch points. Different leaders manage each channel and are held accountable to their own P&L. In most cases, there’s no incentive for leaders to collaborate.
Old Navy’s CMO, Ivan Wicksteed got it right when he said,
“Seventy percent of searches for Old Navy are on a mobile device. Consumers look at the product online and often want to touch it in the store. The end goal is not to get them to buy in the store. The end goal is to get them to buy.”
The end goal is what incentives should be based on.
Executives at most organizations I’ve spoken with admit they are at the very beginning stages of their journey to becoming omnichannel retailers. They recognize that empowering employees with a total customer relationship view and a single view of product information and inventory across channels are critical success factors.
Becoming an omnichannel business is not an easy transition. It forces executives to rethink their definition of customer-centricity and whether their business model supports it. “Now that we need to deliver seamless, integrated and consistent customer experiences across channels and touch points, we realized we’re not as customer-centric as we thought we were,” admitted an SVP of marketing at a financial services company.
You Have to Transform Your Business
“We’re going through a transformation to empower our employees to deliver great customer experiences at every stage of the customer journey,” said Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts. “Our competitive differentiation comes from knowing our customers better than our competitors. We manage our customer data like a strategic asset so we can use that information to serve customers better and build loyalty for our brand.”
Hyatt uses data integration, data quality and master data management (MDM) technology to connect the numerous applications that contain fragmented customer data including sales, marketing, e-commerce, customer service and finance. It brings the core customer profiles together into a single, trusted location, where they are continually managed. Now its customer profiles are clean, de-duplicated, enriched and validated. Members of a household as well as the connections between corporate hierarchies are now visible. Business and analytics applications are fueled with this clean, consistent and connected information so customer-facing teams can do their jobs more effectively.
When he first joined Hyatt, Brogan did a search for his name in the central customer database and found 13 different versions of himself. This included the single Chris Brogan who lived across the street from Wrigley Field with his buddies in his 20s and the Chris Brogan who lives in the suburbs with his wife and two children. “I can guarantee those two guys want something very different from a hotel stay,” he joked. Those guest profiles have now been successfully consolidated.
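The consolidation Hyatt performed can be sketched in simplified form as match-and-merge: group profiles on a normalized match key, then merge each group into a "golden" record. Real MDM platforms use far richer probabilistic matching; the profiles and merge policy below are hypothetical:

```python
from collections import defaultdict

def normalize(profile):
    """Simplified match key: whitespace-collapsed lowercase name plus email."""
    name = " ".join(profile["name"].lower().split())
    email = profile["email"].strip().lower()
    return (name, email)

def consolidate(profiles):
    """Group profiles sharing a match key and merge each group into one
    'golden' record, preferring the most recently updated values."""
    groups = defaultdict(list)
    for p in profiles:
        groups[normalize(p)].append(p)
    golden = []
    for recs in groups.values():
        recs.sort(key=lambda p: p["updated"])    # oldest first
        merged = {}
        for p in recs:                           # later non-empty values win
            merged.update({k: v for k, v in p.items() if v})
        golden.append(merged)
    return golden

profiles = [
    {"name": "Chris  Brogan", "email": "CB@example.com", "city": "Chicago", "updated": 2005},
    {"name": "chris brogan", "email": "cb@example.com", "city": "Oak Park", "updated": 2014},
]
print(len(consolidate(profiles)))  # 1
```

The "most recent value wins" policy is one assumption among many possible survivorship rules; production systems typically rank sources by trust as well as recency.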
According to Brogan,
“Successful marketing, sales and customer experience initiatives need to be built on a solid customer data foundation. It’s much harder to execute effectively and continually improve if your customer data is a mess.”
Improving How You Manage, Use and Analyze Data is More Important Than Ever
Some companies lack a single view of product information across channels and touch points. About 60 percent of retail managers believe that shoppers are better connected to product information than in-store associates. That’s a problem. The same challenges exist for product information as customer information. How many different systems contain valuable product information?
Harrods overcame this challenge. The retailer has a strategic initiative to transform from a single iconic store to an omnichannel business. In the past, Harrods’ merchants managed information for about 500,000 products for the store point of sale system and a few catalogs. Now they are using product information management technology (PIM) to effectively manage and merchandise 1.7 million products in the store and online.
Because they are managing product information centrally, they can fuel the ERP system and e-commerce platform with full, searchable multimedia product information. Harrods has also reduced the time it takes to introduce new products and generate revenue from them. In less than one hour, buyers complete the process from sourcing to market readiness.
It Ends with Satisfied Customers
By 2016, you will need to be ready to compete primarily on the customer experiences you deliver across channels and touch points. This means really knowing who your customers are so you can serve them better. Many businesses will transform from a channel-centric business model to a truly customer-centric business model. They will no longer tolerate messy data. They will recognize the importance of arming marketing, sales, e-commerce and customer service teams with the clean, consistent and connected customer, product and inventory information they need to deliver seamless, integrated and consistent experiences across touch points. And all of us will be more satisfied customers.
A friend of mine recently reached out for advice on CRM solutions in the market. Though I have never worked for a CRM vendor, I have both direct experience working for companies that implemented such solutions and, in my current role, experience interacting with large and small organizations across industries about the data requirements that support their ongoing application investments. As we spoke, memories surfaced from when he and I worked on implementing Salesforce.com (SFDC) many years ago: memories we wanted to forget, but important to call out given his new situation.
We worked together at a large mortgage lending software vendor selling loan origination solutions to brokers and small lenders, mainly through email and snail mail marketing. He was responsible for Marketing Operations, and I ran Product Marketing. The company looked at Salesforce.com to help streamline our sales operations and improve how we marketed to and serviced our customers. The existing CRM system dated from the early '90s, and though it did what the company needed it to do, it was heavily customized, costly to operate, and had served its useful life. It was time to upgrade to help grow the business, improve productivity, and enhance customer relationships.
Within 90 days of rolling out SFDC, we ran into some old familiar problems across the business. Sales reps continued to struggle to know who was a current customer using our software, marketing managers could not create quality mailing lists for prospecting, and call center reps could not tell whether the person on the other end was a customer or a prospect. Everyone wondered why this was happening given that we had adopted the best CRM solution on the market. You can imagine the heartburn and ulcers we all had after making such a huge investment in our new CRM solution. C-level executives were questioning our decisions and blaming the applications. The truth was, the issues were not caused by SFDC but by the data we had migrated into the system, the lack of proper governance, and the absence of an information architecture capable of supporting the required data integration between systems.
During the implementation phase, IT imported our entire customer database of 200K+ unique customer entities from the old system into SFDC. Unfortunately, the mortgage industry was very transient: on average there were roughly 55K licensed mortgage brokers and lenders in the market, and because no one ever validated who was really a current customer vs. someone who had ever bought our product, we had serious data quality issues, including:
- Trial users whose evaluation copies of our products had expired were tagged as current customers
- Duplicate records caused by manual data entry errors – companies with similar names entered slightly differently at the same business address – were tagged as unique customers
- Subsidiaries of parent companies in different parts of the country were each tagged as unique customers
- Lastly, we imported the marketing contact database of prospects, who were incorrectly accounted for as customers in the new system
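The duplicate-record problem above is detectable before migration. Here is a minimal sketch (not the company's actual cleanup code; the record layout and suffix list are assumptions for illustration) that normalizes company name and address, then groups records whose keys collide:

```python
import re
from collections import defaultdict

def normalize(text):
    """Lowercase, strip punctuation, and drop common suffixes like Inc/LLC."""
    text = re.sub(r"[^\w\s]", "", text.lower())
    text = re.sub(r"\b(inc|llc|corp|co|ltd)\b", "", text)
    return " ".join(text.split())

def find_duplicates(records):
    """Group records whose normalized (name, address) keys collide."""
    groups = defaultdict(list)
    for rec in records:
        key = (normalize(rec["name"]), normalize(rec["address"]))
        groups[key].append(rec["id"])
    return [ids for ids in groups.values() if len(ids) > 1]

records = [
    {"id": 1, "name": "Acme Lending, Inc.", "address": "12 Main St."},
    {"id": 2, "name": "ACME Lending Inc",   "address": "12 Main St"},
    {"id": 3, "name": "Bay Mortgage Co.",   "address": "9 Oak Ave."},
]
print(find_duplicates(records))  # → [[1, 2]]
```

Real MDM tools use far more sophisticated probabilistic matching, but even a crude key-collision pass like this would have flagged the "similar name, same address" duplicates before they ever reached SFDC.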
We also failed to integrate real-time purchasing data and information from our procurement systems so that sales and support could handle customer requests. Instead of integrating that data in real time with proper technology, IT manually loaded these records at the end of each week via FTP, resulting in incorrect billing information, broken statement processing, and a ton of complaints from customers through our call center. The price we paid for not paying attention to our data quality and integration requirements before we rolled out Salesforce.com was significant for a company of our size. For example:
- Marketing got hit pretty hard. Each quarter we mailed evaluation copies of new products to our customer database of 200K, each costing the company $12 to produce and mail. Total cost = $2.4M annually. Because we had such bad data, 60% of our mailings were returned due to invalid addresses or wrong contact information. The cost of bad data to marketing = $1.44M annually.
- Next, Sales struggled miserably when trying to upgrade customers by running cold call campaigns using the names in the database. As a result, sales productivity dropped by 40% and the team experienced over 35% turnover that year. Within a year of using SFDC, our head of sales was let go. Not good!
- Customer support used SFDC to service customers, and our average call times were 40 minutes per service ticket. We believed that was "business as usual" until we surveyed reps on how they spent their time each day; over 50% said most of it went to dealing with billing issues caused by bad contact information in the CRM system.
At the end of our conversation, this was my advice to my friend:
- Conduct a data quality audit of the systems that will interact with the CRM system. Audit how complete your critical master and reference data is, including names, addresses, customer IDs, etc.
- Do this before you invest in a new CRM system. You may find that many of the challenges with your existing applications are caused by data gaps rather than by the legacy application itself.
- If your company has a data governance program, involve that team in the CRM initiative so they understand your requirements and can see how they can help.
- However, if you do decide to modernize, collaborate and involve your IT teams, especially between your Application Development teams and your Enterprise Architects to ensure all of the best options are considered to handle your data sharing and migration needs.
- Lastly, consult with your technology partners, including your new CRM vendor; they may be working with solution providers to help address these data issues, as you are probably not the only one in this situation.
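The "data quality audit" in the first piece of advice can start very simply: measure the fill rate of the critical fields before anything is migrated. A hedged sketch, assuming an illustrative record layout (field names here are not from the article):

```python
def completeness_report(records, critical_fields):
    """Return the fill rate (0.0-1.0) of each critical field."""
    total = len(records)
    report = {}
    for field in critical_fields:
        filled = sum(1 for r in records if r.get(field) not in (None, ""))
        report[field] = filled / total if total else 0.0
    return report

customers = [
    {"customer_id": "C1", "name": "Acme Lending", "address": "12 Main St", "email": ""},
    {"customer_id": "C2", "name": "Bay Mortgage", "address": "", "email": "ops@bay.example"},
]
print(completeness_report(customers, ["customer_id", "name", "address", "email"]))
# → {'customer_id': 1.0, 'name': 1.0, 'address': 0.5, 'email': 0.5}
```

A report like this, run against every system feeding the CRM, quickly shows whether the migration will carry usable contact data or propagate the gaps.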
CRM systems have come a long way in today's Big Data and Cloud era. Many firms are adopting more flexible solutions offered through the cloud, like Salesforce.com, Microsoft Dynamics, and others. Regardless of how old or new, on premise or in the cloud, companies invest in CRM not just to serve their sales teams or increase marketing conversion rates, but to improve their business relationships with customers. Period! It's about ensuring the data in these systems is trustworthy, complete, up to date, and actionable, so you can improve customer service and help drive sales of new products and services to increase wallet share. So how do you maximize the business potential of these critical business applications?
Whether you are adopting your first CRM solution or upgrading an existing one, keep in mind that Customer Relationship Management is a business strategy, not just a software purchase. It’s also about having a sound and capable data management and governance strategy supported by people, processes, and technology to ensure you can:
- Access and migrate data from old systems to new while avoiding development cost overruns and project delays.
- Identify, detect, and distribute transactional and reference data from existing systems into your front line business application in real-time!
- Manage data quality errors, including duplicate records and invalid names and contact information, through proper data governance and proactive data quality monitoring and measurement during and after deployment
- Govern and share authoritative master records of customer, contact, product, and other master data between systems in a trusted manner.
Will your data be ready for your new CRM investments? To learn more:
- Download Salesforce Integration for Dummies
- Download a new Whitepaper on how to Maximize Integration ROI with a Hybrid Approach
- Consolidating Multiple Salesforce Orgs: A Best Practice Guide
- Sign up for a 30 Day Trial of Informatica Cloud Integration
Follow me on Twitter @DataisGR8
I recently refinanced an existing mortgage on an investment property with my bank. Like most folks these days, I went to their website from my iPad, filled out an online application form, and received a pre-approval decision. Like any mortgage application, we stated our liabilities and assets, including credit cards, auto loans, and investment accounts, some of which were with this bank. During the process I also entered a new contact email address, as my email service had been hacked over the summer. The whole process took quite a bit of time, and being an impatient person I ended up logging off and coming back to the application over the weekend.
I walked into my local branch the following week to make a withdrawal and asked the teller how my mortgage application was going. She had no clue what I was talking about, as though I was a complete stranger. When I asked whether they had the updated email address I had entered online, she was equally puzzled, stating that any update would require me to contact each of the other groups that held my brokerage, credit card, and mortgage services. That experience was extremely frustrating, and I felt like my bank had no idea who I was as a customer despite the fact that my ATM card has "Customer Since 1989" printed on it! Even worse, after I had entered my entire financial history on my mortgage application, I expected someone to reach out about moving my investment accounts to their bank; no one contacted me about any new offers or services. (Did they really want my business?)
2015 will continue to be a challenge for banks large and small trying to grow revenue amid low interest rates, increasing competition from non-traditional segments, and lower customer loyalty to existing institutions. The biggest opportunity for banks to grow revenue is to expand wallet share with existing customers. Tough times are ahead, as many bank customers continue to do business with a multitude of different financial institutions.
The average U.S. consumer owns between 8-12 financial products, ranging from basic checking, credit cards, and mortgages to a wider range of products, from IRAs to 401Ks, as they get closer to retirement. On the flip side, the average institution has between 2-3 products per customer relationship. So why do banks continue to struggle to gain more wallet share from existing customers? Based on my experience and research, it comes down to two key reasons:
- Traditional product-centric business silos and systems
- Lack of a single trusted source of customer, account, household, and other shared data syndicated and governed across the enterprise
The first reason is the way banks are set up to do business. Back in the day, you would walk into your local branch office. As you entered the doors, the bank tellers behind the counter were ready to handle your deposits, withdrawals, and payments. If you needed to open a new account, you would talk to the new accounts manager sitting at their desk, waiting to offer you a cookie. For mortgages and auto loans, that would be someone else on the far side of the building, equally eager to sign new customers. As banks diversified their businesses with new products, including investments, credit cards, insurance, etc., each product got its own operating unit. The advent of the internet did not really change the traditional "brick and mortar" business model. Instead, one would go to the bank's website to transact or sign up for a new product, but on the back end the systems, people, and incentives to sell each product did not change, creating the same disconnected customer experience. Fast forward to today: these product-centric silos continue to exist in big and small banks across the globe, despite CEOs saying they are focused on delivering a better customer experience.
Why is that the case? Another cause is the systems within these product silos, including core banking, loan origination, loan servicing, brokerage systems, etc., which were never designed to share common information with each other. Traditional retail and consumer banks maintained customer, account, and household information in the Customer Information File (CIF), often part of the core banking system. Primary and secondary account holders would be grouped into a household based on the same last name and mailing address. Unfortunately, CIF systems were mainly used within retail banking. The problem grows exponentially as more systems are adopted to run the business across core business functions and traditional product silos. Each group and its systems managed their own version of the truth, and these environments were never set up to share common data between them.
This is where Master Data Management technology can help. “Master Data” is defined as a single source of basic business data used across multiple systems, applications, and/or processes. In banking that traditionally includes information such as:
- Customer name
- Account numbers
- Household members
- Employees of the bank
Master Data Management technology has evolved over the years, starting with Customer Data Integration (CDI) solutions that provided match and merge capabilities between systems and maturing into platforms that govern consistent records and leverage inference analytics to determine relationships between entities across systems within an enterprise. Depending on your business need, there are core capabilities one should consider when investing in an MDM platform. They include:
| Key functions | What to look for in an MDM solution |
| --- | --- |
| Capturing existing master data from two or more systems, regardless of source, and creating a single source of truth for all systems to share. | To do this right, you need seamless access to data regardless of source, format, and system, in real time. |
| Defining relationships between entities based on business rules. For example: "Household = same last name, address, and account number." | These relationship definitions can be complex and change over time, so giving business users the ability to create and modify the rules will help grow adoption and scalability across the enterprise. |
| Governing consistency across systems by identifying changes to this common business information, determining whether each change is a unique record, a duplicate, or an update to an existing record, and updating the other systems that use and rely on that information. | As with the first, you need the ability to easily deliver updates to dependent systems across the enterprise in real time. A flexible, user-friendly way of managing the master-record rules that avoids heavy IT development is also important to consider. |
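The "Household = same last name, address" business rule in the table can be sketched in a few lines. This is an illustrative simplification (the record layout is assumed, and real MDM rules also handle misspellings, nicknames, and address standardization):

```python
from collections import defaultdict

def build_households(customers):
    """Group customer records into households by (last name, address)."""
    households = defaultdict(list)
    for c in customers:
        key = (c["last_name"].lower(), c["address"].lower())
        households[key].append(c["customer_id"])
    return dict(households)

customers = [
    {"customer_id": "C1", "last_name": "Singh", "address": "44 Elm St"},
    {"customer_id": "C2", "last_name": "Singh", "address": "44 elm st"},
    {"customer_id": "C3", "last_name": "Rao",   "address": "7 Pine Rd"},
]
print(build_households(customers))
# → {('singh', '44 elm st'): ['C1', 'C2'], ('rao', '7 pine rd'): ['C3']}
```

The point of the table's second row is that this rule should live where business users can change it: if the bank redefines "household" to also require a shared account number, the grouping key changes without an IT development cycle.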
Now, what would my experience have been if my bank had a capable Master Data Management solution? Let's take a look:
| Process | Without MDM | With MDM | Benefit with MDM |
| --- | --- | --- | --- |
| Start a new mortgage application online | Customer is required to fill out the usual information (name, address, employer, email, phone, existing accounts, etc.) | The online banking system references the MDM solution, which delivers the most recent master record for this customer from the bank's core banking and brokerage systems and pre-populates the form, including details of their existing savings and credit card accounts with that bank. | |
| New email address from customer | Customer enters it on the mortgage application and it lands only in the bank's loan origination system | MDM recognizes that the email address differs from what exists in other systems and asks the customer to confirm the change. The master record is then updated and shared across the bank's other systems in real time, including the downstream data warehouse Marketing uses to drive cross-sell campaigns. | |
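The "new email address" row describes the hub pattern at the heart of MDM: compare an incoming attribute against the master record and, only if it differs, update the master and fan the change out to subscribing systems. A minimal sketch, assuming an in-memory hub and callback-style subscribers (the system names and shapes here are illustrative, not any vendor's API):

```python
class MasterRecordHub:
    def __init__(self):
        self.master = {}        # customer_id -> attributes
        self.subscribers = []   # callbacks standing in for downstream systems

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def propose_update(self, customer_id, field, value):
        """Apply a change only if it differs, then fan out to subscribers."""
        record = self.master.setdefault(customer_id, {})
        if record.get(field) == value:
            return False  # no change; downstream systems stay untouched
        record[field] = value
        for notify in self.subscribers:
            notify(customer_id, field, value)
        return True

hub = MasterRecordHub()
updates = []
# Stand-in for, e.g., the marketing data warehouse receiving changes
hub.subscribe(lambda cid, field, value: updates.append((cid, field, value)))

hub.propose_update("C1", "email", "new@example.com")
hub.propose_update("C1", "email", "new@example.com")  # duplicate, ignored
print(updates)  # → [('C1', 'email', 'new@example.com')]
```

The duplicate-suppression check is what keeps a confirmed-but-unchanged value from triggering a pointless cascade of updates across every connected system.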
The banking industry continues to face headwinds from a revenue, risk, and regulatory standpoint. Traditional product-centric silos will not go away anytime soon, and new CRM and client onboarding solutions may help improve customer engagement and productivity within a firm; however, front-office business applications are not designed to manage and share critical master data across your enterprise. Anyhow, I decided to bank with another institution that I know has Master Data Management. Are you ready for a new bank too?
For more information on Informatica’s Master Data Management:
The last few months have allowed me to see MDM, data quality and data governance from a completely different perspective. I sat with other leaders here at Informatica and with analysts who focus on information quality, and I spent time talking to our partners who work closely with customers on data management initiatives. As we collectively attempted to peer into the crystal ball and forecast what will be hot – and what will not – this year and beyond for MDM and data quality, here are a few top predictions that stood out.
1. MDM will become a single platform for all master entities
“The classical notion of boundaries that existed where we would say, this is MDM versus this is not MDM is going to get blurred,” says Dennis Moore – SVP, Information Quality Solutions (IQS), Informatica. “Today, we master a fairly small number of attributes in MDM. Rather than only mastering core attributes, we need to master business level entities, like customer, product, location, assets, things, etc., and combine all relevant attributes into a single platform which can be used to develop new “data fueled” applications. This platform will allow mastering of data, aggregate data from other sources, and also syndicate that data out into other systems.”
Traditionally MDM was an invisible hub that was connected to all the spokes. Instead, Dennis says – “MDM will become more visible and will act as an application development platform.”
2. PIM is becoming a more integrated environment that covers all product information and related data in a single place
More and more customers want a single interface that allows them to manage all product information. Along with a product's length, width, height, color, cost, etc., they probably want to see data about its history, credit rating, previous quality rating, sustainability scorecard, returns, credits and so on. Dennis says – "All the product information in one place helps make better decisions with embedded analytics, giving answers to questions such as:
- What were my sales last week?
- Which promotions are performing well and poorly?
- Which suppliers are not delivering on their SLAs?
- Which stores aren’t selling according to plan?
- How are the products performing in specific markets?”
Essentially, PIM will become the authoritative source of the product data that goes into your catalog and e-commerce systems and is used by merchandisers, buyers, and product and category managers. It will become the buyer's guide and a desktop for the person whose job is to figure out how to effectively promote products to meet sales targets.
3. MDM will become an integral part of big data analytics projects
“Big data analytics suffers from the same challenges as traditional data warehouses – bad data quality produces sub-optimal intelligence. MDM has traditionally enabled better analysis and reporting with high quality master data. Big data analytics will also immensely benefit from MDM’s most trustworthy information.” – Said Ravi Shankar – VP of Product Marketing, MDM, Informatica
Naveen Sharma, who heads the Enterprise Data Management practice at Cognizant, reemphasized what I heard from Dennis. He says – "With big data and information quality coming together, some of the boundaries between a pure MDM system and a pure analytical system will start to soften." Naveen explains – "MDM is now seen as an integral part of big data analytics projects, and that's a huge change from a couple of years ago. Two of the large retailers we work with are going down the path of bringing not only the customer dimension but also the associated transactional data into an extended MDM platform to derive meaning. I see this trend continuing in 2015 and beyond in other verticals as well."
4. Business requirements are leading to the creation of solutions
There are several business problems being solved by MDM, such as improving supplier spend management and collaboration with better supplier data. Supply chain, sourcing and procurement teams gain significant cost savings and a boost in productivity by mastering supplier, raw materials and product information and fueling their business and analytical applications with that clean, consistent and connected information. Jakki Geiger, Senior Director of IQS Solutions Marketing at Informatica says, “Business users want more than just the underlying infrastructure to manage business-critical data about suppliers, raw materials, and products. They want to access this information directly through a business-friendly user interface. They want a business process-driven workflow to manage the full supplier lifecycle, including: supplier registration, qualification, verification, onboarding and off-boarding. Instead of IT building these business-user focused solutions on top of an MDM foundation, vendors are starting to build ready-to-use MDM solutions like the Total Supplier Relationship solution.” Read more about Valspar’s raw materials spend management use case.
5. Increased adoption of matching and linking capabilities on Hadoop
"Many of our customers have significantly increased the amount of data they want to master," says Dennis Moore. The days when tens of millions of master records were a lot are long gone; hundreds of millions of master records and billions of source records are becoming almost common. An increasing number of master data sources – internal and external to the organization – are contributing significantly to the rise in data volumes. To accommodate these increasing volumes, Dennis predicts that large enterprises will look at running complex matching and linking capabilities on Hadoop – a cost-effective and flexible way to analyze large amounts of data.
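Why does matching at this scale need a platform like Hadoop? Naively comparing every record with every other is O(N²), which is hopeless at hundreds of millions of records. The standard trick, blocking, groups records by a cheap key so expensive pairwise comparison happens only within blocks. A small single-machine sketch of the idea (the blocking key here, first three letters of last name plus ZIP code, is an assumption for illustration; the Hadoop distribution mechanics are out of scope):

```python
from collections import defaultdict
from itertools import combinations

def blocking_key(record):
    """Cheap key: first 3 letters of last name + ZIP (illustrative choice)."""
    return record["last_name"][:3].lower() + record["zip"]

def candidate_pairs(records):
    """Only records sharing a blocking key are compared pairwise."""
    blocks = defaultdict(list)
    for r in records:
        blocks[blocking_key(r)].append(r["id"])
    pairs = []
    for ids in blocks.values():
        pairs.extend(combinations(ids, 2))
    return pairs

records = [
    {"id": 1, "last_name": "Johnson",  "zip": "94107"},
    {"id": 2, "last_name": "Johnsen",  "zip": "94107"},
    {"id": 3, "last_name": "Martinez", "zip": "10001"},
]
print(candidate_pairs(records))  # → [(1, 2)]
```

On Hadoop, the blocking key naturally becomes the shuffle key: records with the same key land on the same reducer, where the detailed fuzzy comparison runs, which is what makes billions of source records tractable.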
6. Master insight management is going to be the next big step
"MDM will evolve into master insight management as organizations try to relate the trusted data they created in MDM with transactional and social interaction data," said Rob Karel – VP of Product Strategy and Product Marketing, IQS, Informatica. "Innovations in machine and deep learning techniques will help organizations in healthcare prescribe the next best treatment based on patient history, retailers suggest the best offers based on customer interest and behavior, public sector agencies take big steps in social services, and more."
Rob sees MDM at the heart of this innovation bringing together relevant information about multiple master entities and acting as a core system for insight management innovations.
7. MDM and Data Governance
Aaron Zornes – Chief research officer at the MDM Institute predicts that in 2014-15, vendor MDM solutions will move from “passive-aggressive” mode to “proactive” data governance mode. Data governance for MDM will move beyond simple stewardship to convergence of task management, workflow, policy management and enforcement according to Aaron.
8. The market will solidify for cloud based MDM adoption
Aaron says – "Cloud-innate services for DQ and DG will be more prevalent; however, enterprise MDM will remain on premise with increasing integration to cloud applications in 2015."
Naveen sees a lot of synergy around cloud-based MDM offerings and says – "The market is solidifying for MDM in the cloud, but the flood gates are yet to open." Naveen does not see any reason why the MDM market will not move to the cloud and gives the example of CRM, which was at a similar junction before Salesforce came into play. Naveen sees a similar shift for MDM and says – "The fears companies have about their data security in the cloud are eventually going to fade. If you look closely at any of the recent breaches, they all involved hacks into company networks, not into cloud provider networks. The fact that cloud service providers spend more on data security than any one company can spend on its on-premise security layer will be a major factor in the transition." Naveen expects the big players in MDM to include cloud offerings as part of their toolkits in the coming years.
Ravi also predicts an increase in cloud adoption for MDM as concerns about placing master data in the cloud diminish, given the security investments cloud vendors make.
So, what do you predict? I would love to hear your opinions and comments.