
Are You Ready to Compete on Customer Experience?

 

This blog post initially appeared on CMSwire.com and is reblogged here with their consent.


Friends of mine were remodeling their master bath. After searching for a claw foot tub in stores and online, they found the perfect one that fit their space. Because it was only available for purchase on the retailer’s e-commerce site, they bought it online.

When it arrived, the tub was too big. The dimensions online were incorrect. They went to return it to the closest store, but were told they couldn’t — because it was purchased online, they had to ship it back.

The retailer didn’t have a total customer relationship view or a single view of product information or inventory across channels and touch points. This left the customer service representative working in a silo of limited information. She didn’t have access to a rich customer profile. She didn’t know that my friends had spent almost $10,000 with the brand in the last year. She couldn’t see the products they had bought online and in stores. Without this information, she couldn’t deliver a great customer experience.

It was a terrible customer experience. My friends share it with everyone who asks about their remodel. They name the retailer when they tell the story. And they don’t shop there anymore. This one bad experience is negatively impacting the retailer’s revenue and brand reputation.

Bad customer experiences happen a lot. Companies in the US lose an estimated $83 billion each year due to defections and abandoned purchases as a direct result of a poor experience, according to a Datamonitor/Ovum report.

Customer Experience is the New Marketing

Gartner believes that by 2016, companies will compete primarily on the customer experiences they deliver. So who should own customer experience?

Twenty-five percent of CMOs say that their CEOs expect them to lead customer experience. What’s their definition of customer experience? “The practice of centralizing customer data in an effort to provide customers with the best possible interactions with every part of the company, from marketing to sales and even finance.”

Mercedes-Benz USA President and CEO Steve Cannon said, “Customer experience is the new marketing.”

The Gap Between Customer Expectations + Your Ability to Deliver

My previous post, 3 Barriers to Delivering Omnichannel Experiences, explained how omnichannel is all about seeing your business through the eyes of your customer. Customers don’t think in terms of channels and touch points; they just expect a seamless, integrated and consistent customer experience. It’s one brand to the customer. But there’s a gap between customer expectations and what most businesses can deliver today.

Most companies that sell through multiple channels operate in silos. They are channel-centric rather than customer-centric. This business model doesn’t empower employees to deliver seamless, integrated and consistent customer experiences across channels and touch points. Different leaders manage each channel and are held accountable to their own P&L. In most cases, there’s no incentive for leaders to collaborate.

Old Navy’s CMO, Ivan Wicksteed, got it right when he said,

“Seventy percent of searches for Old Navy are on a mobile device. Consumers look at the product online and often want to touch it in the store. The end goal is not to get them to buy in the store. The end goal is to get them to buy.”

The end goal is what incentives should be based on.

Executives at most organizations I’ve spoken with admit they are at the very beginning stages of their journey to becoming omnichannel retailers. They recognize that empowering employees with a total customer relationship view and a single view of product information and inventory across channels are critical success factors.

Becoming an omnichannel business is not an easy transition. It forces executives to rethink their definition of customer-centricity and whether their business model supports it. “Now that we need to deliver seamless, integrated and consistent customer experiences across channels and touch points, we realized we’re not as customer-centric as we thought we were,” admitted an SVP of marketing at a financial services company.

You Have to Transform Your Business

“We’re going through a transformation to empower our employees to deliver great customer experiences at every stage of the customer journey,” said Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts. “Our competitive differentiation comes from knowing our customers better than our competitors. We manage our customer data like a strategic asset so we can use that information to serve customers better and build loyalty for our brand.”

Hyatt uses data integration, data quality and master data management (MDM) technology to connect the numerous applications that contain fragmented customer data including sales, marketing, e-commerce, customer service and finance. It brings the core customer profiles together into a single, trusted location, where they are continually managed. Now its customer profiles are clean, de-duplicated, enriched and validated. Members of a household as well as the connections between corporate hierarchies are now visible. Business and analytics applications are fueled with this clean, consistent and connected information so customer-facing teams can do their jobs more effectively.
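To make the match-and-merge idea concrete, here is a minimal Python sketch of consolidating duplicate profiles into a golden record. It is purely illustrative, not Hyatt’s or Informatica’s implementation; matching on a shared email and keeping the most recently updated value per field are simplifying assumptions.

    from datetime import date

    # Two fragments of the same guest, pulled from different systems
    profiles = [
        {"source": "e-commerce", "name": "Chris Brogan", "email": "chris@example.com",
         "address": "123 Waveland Ave", "updated": date(2009, 5, 1)},
        {"source": "loyalty", "name": "Christopher Brogan", "email": "chris@example.com",
         "address": "42 Maple Ln", "updated": date(2014, 8, 15)},
    ]

    # Match rule (simplified): records sharing an email belong to the same person
    by_email = {}
    for rec in profiles:
        by_email.setdefault(rec["email"].lower(), []).append(rec)

    def merge(dupes):
        """Survivorship rule: for each field, keep the most recently updated value."""
        golden = {}
        for rec in sorted(dupes, key=lambda r: r["updated"]):
            for field in ("name", "email", "address"):
                golden[field] = rec[field]  # newer records overwrite older values
        return golden

    golden_records = [merge(dupes) for dupes in by_email.values()]
    print(golden_records)  # one consolidated profile instead of two fragments

Production MDM platforms use probabilistic matching across many attributes and configurable survivorship and trust rules, but the shape of the problem is the same.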

When he first joined Hyatt, Brogan did a search for his name in the central customer database and found 13 different versions of himself. This included the single Chris Brogan who lived across the street from Wrigley Field with his buddies in his 20s and the Chris Brogan who lives in the suburbs with his wife and two children. “I can guarantee those two guys want something very different from a hotel stay,” he joked. Those guest profiles have now been successfully consolidated.

According to Brogan,

“Successful marketing, sales and customer experience initiatives need to be built on a solid customer data foundation. It’s much harder to execute effectively and continually improve if your customer data is a mess.”

Improving How You Manage, Use and Analyze Data is More Important Than Ever


Some companies lack a single view of product information across channels and touch points. About 60 percent of retail managers believe that shoppers are better connected to product information than in-store associates. That’s a problem. The same challenges exist for product information as customer information. How many different systems contain valuable product information?

Harrods overcame this challenge. The retailer has a strategic initiative to transform from a single iconic store to an omnichannel business. In the past, Harrods’ merchants managed information for about 500,000 products for the store point of sale system and a few catalogs. Now they are using product information management technology (PIM) to effectively manage and merchandise 1.7 million products in the store and online.

Because they are managing product information centrally, they can fuel the ERP system and e-commerce platform with full, searchable multimedia product information. Harrods has also reduced the time it takes to introduce new products and generate revenue from them. In less than one hour, buyers complete the process from sourcing to market readiness.

It Ends with Satisfied Customers

By 2016, you will need to be ready to compete primarily on the customer experiences you deliver across channels and touch points. This means really knowing who your customers are so you can serve them better. Many businesses will transform from a channel-centric business model to a truly customer-centric business model. They will no longer tolerate messy data. They will recognize the importance of arming marketing, sales, e-commerce and customer service teams with the clean, consistent and connected customer, product and inventory information they need to deliver seamless, integrated and consistent experiences across touch points. And all of us will be more satisfied customers.


Is Your Data Ready to Maximize Value from Your CRM Investments?


A friend of mine recently reached out to me for advice on CRM solutions in the market. Though I have never worked for a CRM vendor, my experience ranges from working for companies that implemented such solutions to my current role interacting with large and small organizations about the data requirements that support ongoing application investments across industries. As we spoke, memories surfaced from when he and I implemented Salesforce.com (SFDC) together many years ago. They were memories we both wanted to forget, but worth calling out given his new situation.

We had worked together at a large mortgage lending software vendor that sold loan origination solutions to brokers and small lenders, mainly through email and direct mail marketing. He was responsible for Marketing Operations, and I ran Product Marketing. The company looked at Salesforce.com to help streamline our sales operations and improve how we marketed to and serviced our customers. The existing CRM system dated from the early 90s, and though it did what the company needed it to do, it was heavily customized, costly to operate, and had served its useful life. It was time to upgrade to help grow the business, improve business productivity, and enhance customer relationships.

Ninety days after rolling out SFDC, we ran into some old familiar problems across the business. Sales reps still struggled to know who was a current customer using our software, marketing managers could not create quality mailing lists for prospecting, and call center reps could not tell whether the person on the other end was a customer or a prospect. Everyone wondered why this was happening given that we had adopted the best CRM solution in the market. You can imagine the heartburn and ulcers we all had after making such a huge investment in our new CRM solution. C-level executives questioned our decisions and blamed the applications. The truth was, the issues were not caused by SFDC but by the data we had migrated into the system, the lack of proper governance, and the absence of an information architecture capable of supporting the required data integration between systems.

During the implementation phase, IT imported our entire customer database of 200K+ unique customer entities from the old system into SFDC. Unfortunately, the mortgage industry was very transient: there were on average only about 55K licensed mortgage brokers and lenders in the market, and because no one had ever validated who was really a customer versus someone who had ever bought our product, we had serious data quality issues, including the following (a quick audit sketch follows the list):

  • Trial users whose evaluation copies of our products had expired were tagged as current customers
  • Duplicate records caused by manual data entry errors, where companies with similar names entered slightly differently but with the same business address were tagged as unique customers
  • Subsidiaries of parent companies in different parts of the country that were each tagged as unique customers
  • The marketing contact database of prospects we imported, whose records were incorrectly counted as customers in the new system
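Here is the kind of quick audit, sketched in Python, that would have flagged those records before the migration. The field names and rules are hypothetical, chosen to mirror the issues above.

    from datetime import date

    TODAY = date(2015, 1, 1)

    def classify(record):
        """Illustrative status check that would have flagged our bad 'customers'."""
        if record.get("license_type") == "trial" and record["expires"] < TODAY:
            return "expired trial - not a customer"
        if not record.get("purchases"):
            return "prospect - imported from the marketing list"
        return "customer"

    def dedupe_key(record):
        # Same normalized company name + address => same account, not a new customer
        name = record["company"].lower().replace(",", "").replace(" inc", "")
        return (name, record["address"].lower())

    records = [
        {"company": "Acme Mortgage, Inc", "address": "1 Elm St",
         "license_type": "trial", "expires": date(2003, 6, 1), "purchases": []},
        {"company": "Acme Mortgage Inc", "address": "1 Elm St",
         "purchases": ["LOS v7"]},
    ]
    for r in records:
        print(dedupe_key(r), "->", classify(r))  # both records share one dedupe key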

We also failed to integrate real-time purchasing data and information from our procurement systems so that sales and support could handle customer requests. Instead of integrating that data in real time with proper technology, IT manually loaded these records at the end of each week via FTP, resulting in incorrect billing information, statement processing errors, and a ton of complaints from customers through our call center. The price we paid for not paying attention to our data quality and integration requirements before rolling out Salesforce.com was significant for a company of our size. For example:

  • Marketing got hit pretty hard. Each quarter we mailed evaluation copies of new products to our customer database of 200K, each costing the company $12 to produce and mail. Total cost = $2.4M annually. Because we had such bad data, 60% of our mailings came back due to invalid addresses or wrong contact information. The cost of bad data to marketing = $1.44M annually (see the quick check after this list).
  • Next, Sales struggled miserably when trying to upsell customers through cold call campaigns using the names in the database. As a result, sales productivity dropped by 40%, and we experienced over 35% sales turnover that year. Within a year of using SFDC, our head of sales was let go. Not good!
  • Customer support used SFDC to service customers, and our average call times were 40 minutes per service ticket. We believed that was “business as usual” until we surveyed how reps were spending their time each day, and over 50% said it went to billing issues caused by bad contact information in the CRM system.
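The marketing arithmetic above is easy to verify, using only the numbers from the story:

    # Using the numbers from the story above
    database_size = 200_000   # records mailed
    cost_per_piece = 12       # dollars to produce and mail each evaluation copy
    bad_data_rate = 0.60      # share of mailings returned (bad address or contact)

    annual_cost = database_size * cost_per_piece
    wasted = annual_cost * bad_data_rate
    print(f"${annual_cost:,} spent, ${wasted:,.0f} wasted on bad data")
    # -> $2,400,000 spent, $1,440,000 wasted on bad data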

At the end of our conversation, this was my advice to my friend:

  • Conduct a data quality audit of the systems that will interact with the CRM system. Audit how complete your critical master and reference data is, including names, addresses, customer IDs, etc. (a simple completeness check is sketched after this list).
  • Do this before you invest in a new CRM system. You may find that many of the challenges with your existing applications are caused by data gaps rather than by the legacy application itself.
  • If your company has a data governance program, involve that team in the CRM initiative so they understand your requirements and can see how they can help.
  • If you do decide to modernize, collaborate with and involve your IT teams, especially your Application Development teams and Enterprise Architects, to ensure all of the best options are considered for your data sharing and migration needs.
  • Lastly, consult with your technology partners, including your new CRM vendor; they may be working with solution providers to address these data issues, as you are probably not the only one in this situation.
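As a starting point for the audit in the first bullet, a completeness profile can be a few lines of pandas. The extract and field names here are hypothetical:

    import pandas as pd

    # Hypothetical extract of customer master data from a legacy CRM
    crm = pd.DataFrame([
        {"customer_id": "C001", "name": "Acme Corp", "address": "1 Elm St", "email": None},
        {"customer_id": None, "name": "Beta LLC", "address": None, "email": "ops@beta.example"},
    ])

    critical_fields = ["customer_id", "name", "address", "email"]
    completeness = crm[critical_fields].notna().mean() * 100
    print(completeness.round(1))  # percent of records populated, per critical field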

Looking Ahead!

CRM systems have come a long way in today’s Big Data and Cloud era. Many firms are adopting more flexible solutions offered through the cloud, like Salesforce.com, Microsoft Dynamics, and others. Regardless of how old or new, on premise or in the cloud, companies invest in CRM not just to serve their sales teams or increase marketing conversion rates, but to improve their business relationships with their customers. Period! It’s about ensuring the data in these systems is trustworthy, complete, up to date, and actionable, so you can improve customer service and help drive sales of new products and services to increase wallet share. So how do you maximize your business potential from these critical business applications?

Whether you are adopting your first CRM solution or upgrading an existing one, keep in mind that Customer Relationship Management is a business strategy, not just a software purchase. It’s also about having a sound and capable data management and governance strategy supported by people, processes, and technology to ensure you can:

  • Access and migrate data from old systems to new, avoiding development cost overruns and project delays.
  • Identify, detect, and distribute transactional and reference data from existing systems into your front-line business applications in real time!
  • Manage data quality errors, including duplicate records and invalid names and contact information, through proper data governance and proactive data quality monitoring and measurement during and after deployment.
  • Govern and share authoritative master records of customer, contact, product, and other master data between systems in a trusted manner.

Will your data be ready for your new CRM investments? To learn more:

Follow me on Twitter @DataisGR8


How to Improve Cross Sell and Customer Experience in Banking


I recently refinanced an existing mortgage on an investment property with my bank. Like most folks these days, I went to the bank’s website from my iPad, filled out an online application form, and received a pre-approval decision. Like any mortgage application, we stated our liabilities and assets, including credit cards, auto loans, and investment accounts, some of which were with this bank. During the process I also entered a new contact email address, after my email service was hacked over the summer. The whole process took quite a bit of time, and being an impatient person, I ended up logging off and coming back to the application over the weekend.

I walked into my local branch the following week to make a withdrawal and asked the teller how my mortgage application was going. She had no clue what I was talking about, as though I were a complete stranger. When I asked her whether they had the updated email address I entered online, she was equally puzzled, stating that any update to that information would require me to contact all the other groups that held my brokerage, credit card, and mortgage services to make the change. That experience was extremely frustrating. I felt like my bank had no idea who I was as a customer, despite the fact that my ATM card has “Customer Since 1989” printed on it! Even worse, after entering my entire financial history on my mortgage application, I expected someone to reach out about moving my investment accounts to their bank, yet no one contacted me about any new offers or services. (I wondered whether they really wanted my business.)

2015 will continue to be a challenging year for banks large and small to grow revenue, squeezed by low interest rates, increasing competition from non-traditional segments, and lower customer loyalty toward existing institutions. The biggest opportunity for banks to grow revenue is to expand wallet share with existing customers. Tough times are ahead, as many bank customers continue to do business with a multitude of different financial institutions.

The average U.S. consumer owns between 8 and 12 financial products, ranging from basics like checking accounts, credit cards, and mortgages to a wider range of products, from IRAs to 401(k)s, as they get closer to retirement. On the flip side, the average institution has only 2 to 3 products per customer relationship. So why do banks continue to struggle to gain more wallet share from existing customers? Based on my experience and research, it comes down to two key reasons:

  • Traditional product-centric business silos and systems
  • Lack of a single trusted source of customer, account, household, and other shared data syndicated and governed across the enterprise

The first reason is the way banks are set up to do business. Back in the day, you would walk into your local branch office. As you entered the doors, the bank tellers behind the counter were ready to handle your deposits, withdrawals, and payments. If you needed to open a new account, you would talk to the new accounts manager sitting at their desk, ready to offer you a cookie. For mortgages and auto loans, that would be someone else sitting on the far side of the building, equally eager to sign new customers. As banks diversified their businesses with new products, including investments, credit cards, and insurance, each product got its own operating unit. The advent of the internet did not really change this traditional “brick and mortar” business model. Instead, one would go to the bank’s website to transact or sign up for a new product; on the back end, however, the systems, people, and incentives to sell each product did not change, creating the same disconnected customer experience. Fast forward to today: these product-centric silos continue to exist in big and small banks across the globe, despite CEOs saying they are focused on delivering a better customer experience.

Why is that the case? Another cause is the systems within these product silos, including core banking, loan origination, loan servicing, and brokerage systems, which were never designed to share common information with each other. Traditional retail and consumer banks maintained customer, account, and household information within the Customer Information File (CIF), often part of the core banking system. Primary and secondary account holders would be grouped into a household based on the same last name and mailing address. Unfortunately, CIF systems were mainly used within retail banking. The problem grows exponentially as more systems are adopted to run the business across core business functions and traditional product silos. Each group and its systems managed their own version of the truth, and these environments were never set up to share common data between them.

This is where Master Data Management technology can help.  “Master Data” is defined as a single source of basic business data used across multiple systems, applications, and/or processes.  In banking that traditionally includes information such as:

  • Customer name
  • Address
  • Email
  • Phone
  • Account numbers
  • Employer
  • Household members
  • Employees of the bank
  • Etc.

Master Data Management technology has evolved over the years, starting with Customer Data Integration (CDI) solutions that provided merge and match capabilities between systems, to more modern platforms that govern consistent records and leverage inference analytics to determine relationships between entities across systems within an enterprise. Depending on your business needs, there are core capabilities one should consider when investing in an MDM platform. They include:

Key functions – what to look for in an MDM solution:

  • Capturing existing master data from two or more systems, regardless of source, and creating a single source of the truth for all systems to share. To do this right, you need seamless access to data regardless of source, format, and system, in real time.
  • Defining relationships between entities based on business rules. For example: “Household = same last name, address, and account number” (sketched in code after this list). These relationship definitions can be complex and can change over time, so the ability for business users to create and modify those rules will help grow adoption and scalability across the enterprise.
  • Governing consistency across systems by identifying changes to this common business information, determining whether each change is a unique record, a duplicate, or an update to an existing record, and updating the other systems that use and rely on that information. As with the first capability, you need the ability to easily deliver updates to dependent systems across the enterprise in real time. A flexible, user-friendly way of managing those master record rules, without heavy IT development, is also important to consider.
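The household rule quoted in the second bullet maps naturally onto a grouping key. Below is a minimal Python sketch assuming exact matches only; real MDM platforms let business users define and evolve these rules, and apply fuzzy matching rather than string equality.

    # Illustrative business rule from the example above:
    # Household = same last name + same address + same account number
    def household_key(record):
        return (
            record["last_name"].strip().lower(),
            record["address"].strip().lower(),
            record["account_number"],
        )

    members = [
        {"first": "John", "last_name": "Doe", "address": "123 Main St", "account_number": "ACCT-42"},
        {"first": "Jane", "last_name": "Doe", "address": "123 Main St", "account_number": "ACCT-42"},
    ]

    households = {}
    for m in members:
        households.setdefault(household_key(m), []).append(m["first"])
    print(households)  # {('doe', '123 main st', 'ACCT-42'): ['John', 'Jane']}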

Now, what would my experience have been if my bank had a capable Master Data Management solution? Let’s take a look:

Process: Start a new mortgage application online
  • Without MDM: The customer is required to fill out the usual information (name, address, employer, email, phone, existing accounts, etc.).
  • With MDM: The online banking system references the MDM solution, which delivers the most recent master record of the customer based on existing data from the bank’s core banking and brokerage systems, and pre-populates the form with those details, including information for the customer’s existing savings and credit card accounts with the bank.
  • Benefits with MDM: Accelerates new customer on-boarding and mitigates the risk of a competitor grabbing the customer’s attention and business.

Process: Customer provides a new email address
  • Without MDM: The customer enters it on the mortgage application and it lands only in the bank’s loan origination system.
  • With MDM: MDM recognizes that the email address differs from what exists in other systems and asks the customer to confirm the change. The master record is updated and shared across the bank’s other systems in real time, including the downstream data warehouse used by Marketing to drive cross-sell campaigns.
  • Benefits with MDM: Ensures every part of the bank shares the latest information about the customer and avoids disruptions in future communications about new products and offers to grow wallet share.

The banking industry continues to face headwinds from a revenue, risk, and regulatory standpoint. Traditional product-centric silos will not go away anytime soon, and new CRM and client onboarding solutions may help improve customer engagement and productivity within a firm; however, front-office business applications are not designed to manage and share critical master data across your enterprise. As for me, I decided to bank with another institution that I know has Master Data Management. Are you ready for a new bank too?

For more information on Informatica’s Master Data Management:


8 Information Quality Predictions for 2015 And Beyond


Andy Hayler of The Information Difference wrote in October last year that it has been 10 years since the master data management (MDM) industry emerged. Andy sees MDM technology maturing and project success rates rising. He concluded that MDM has moved past its infancy and, approaching its teenage years, has a promising future.

The last few months have allowed me to see MDM, data quality and data governance from a completely different perspective. I sat with other leaders here at Informatica, with analysts who focus on information quality, and spent time talking to our partners who work closely with customers on data management initiatives. As we collectively attempted to peer into the crystal ball and forecast what will be hot – and what will not – this year and beyond for MDM and data quality, here are a few top predictions that stood out.

1. MDM will become a single platform for all master entities
“The classical notion of boundaries that existed where we would say, this is MDM versus this is not MDM is going to get blurred,” says Dennis Moore – SVP, Information Quality Solutions (IQS), Informatica. “Today, we master a fairly small number of attributes in MDM. Rather than only mastering core attributes, we need to master business level entities, like customer, product, location, assets, things, etc., and combine all relevant attributes into a single platform which can be used to develop new “data fueled” applications. This platform will allow mastering of data, aggregate data from other sources, and also syndicate that data out into other systems.”

Traditionally MDM was an invisible hub that was connected to all the spokes. Instead, Dennis says – “MDM will become more visible and will act as an application development platform.”

2. PIM is becoming a more integrated environment that covers all product information and related data in a single place
More and more customers want a single interface that allows them to manage all product information. Along with managing a product’s length, width, height, color, cost, etc., they probably want to see data about its history, credit rating, previous quality rating, sustainability scorecard, returns, credits and so on. Dennis says – “All the product information in one place helps make better decisions with embedded analytics, giving answers to questions such as:

  • What were my sales last week?
  • Which promotions are performing well and poorly?
  • Which suppliers are not delivering on their SLAs?
  • Which stores aren’t selling according to plan?
  • How are the products performing in specific markets?”

Essentially, PIM will become the sovereign supplier of the product data that goes into your catalog and e-commerce system, used by merchandisers, buyers, and product and category managers. It will become the buyer’s guide and a desktop for the person whose job is to figure out how to effectively promote products to meet sales targets.

3. MDM will become an integral part of big data analytics projects
“Big data analytics suffers from the same challenges as traditional data warehouses – bad data quality produces sub-optimal intelligence. MDM has traditionally enabled better analysis and reporting with high quality master data. Big data analytics will also benefit immensely from MDM’s trustworthy information,” said Ravi Shankar – VP of Product Marketing, MDM, Informatica.

Naveen Sharma, who heads the Enterprise Data Management practice at Cognizant, reemphasized what I heard from Dennis. He says – “With big data and information quality coming together, some of the boundaries between a pure MDM system and a pure analytical system will start to soften.” Naveen explains – “MDM is now seen as an integral part of big data analytics projects, and that’s a huge change from a couple of years ago. Two of the large retailers we work with are going down the path of bringing not only the customer dimension but the associated transactional data into an extended MDM platform to derive meaning. I see this trend continuing in 2015 and beyond with other verticals as well.”

4. Business requirements are leading to the creation of solutions
There are several business problems being solved by MDM, such as improving supplier spend management and collaboration with better supplier data. Supply chain, sourcing and procurement teams gain significant cost savings and a boost in productivity by mastering supplier, raw materials and product information and fueling their business and analytical applications with that clean, consistent and connected information. Jakki Geiger, Senior Director of IQS Solutions Marketing at Informatica says, “Business users want more than just the underlying infrastructure to manage business-critical data about suppliers, raw materials, and products. They want to access this information directly through a business-friendly user interface. They want a business process-driven workflow to manage the full supplier lifecycle, including: supplier registration, qualification, verification, onboarding and off-boarding. Instead of IT building these business-user focused solutions on top of an MDM foundation, vendors are starting to build ready-to-use MDM solutions like the Total Supplier Relationship solution.” Read more about Valspar’s raw materials spend management use case.

5. Increased adoption of matching and linking capabilities on Hadoop 
“Many of our customers have significantly increased the amount of data they want to master,” says Dennis Moore. The days when tens of millions of master records were considered a lot are long gone; hundreds of millions of master records and billions of source records are becoming almost common. An increasing number of master data sources – internal and external to the organization – are contributing significantly to the rise in data volumes. To accommodate these increasing volumes, Dennis predicts that large enterprises will look at running complex matching and linking capabilities on Hadoop – a cost-effective and flexible way to analyze large amounts of data.
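A common way to make matching tractable at these volumes is blocking: records are compared pairwise only when they share a cheap key, such as a postal code, which is exactly the kind of work that distributes well on Hadoop or Spark as a group-by followed by per-group comparisons. A single-machine Python sketch of the idea, for illustration only:

    from itertools import combinations

    def block_key(rec):
        # Cheap blocking key: records in different postal codes are never compared
        return rec["postal_code"]

    def candidate_pairs(records):
        blocks = {}
        for rec in records:
            blocks.setdefault(block_key(rec), []).append(rec)
        for block in blocks.values():
            # The expensive fuzzy comparison runs only within each block
            yield from combinations(block, 2)

    records = [
        {"id": 1, "name": "Jon Smith", "postal_code": "60613"},
        {"id": 2, "name": "John Smith", "postal_code": "60613"},
        {"id": 3, "name": "Ann Lee", "postal_code": "94105"},
    ]
    print([(a["id"], b["id"]) for a, b in candidate_pairs(records)])  # [(1, 2)]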

6. Master insight management is going to be the next big step
“MDM will evolve into master insight management as organizations try to relate trusted data they created in MDM with transactional and social interaction data,” said Rob Karel – VP of Product Strategy and Product Marketing, IQS, Informatica. “The innovations in machine and deep learning techniques will help organizations such as healthcare prescribe next best treatment based on history of patients, retailers suggest best offers based on customer interest and behavior, public sector companies will see big steps in social services, etc.”

Rob sees MDM at the heart of this innovation bringing together relevant information about multiple master entities and acting as a core system for insight management innovations.

7. MDM and Data Governance
Aaron Zornes – chief research officer at the MDM Institute – predicts that in 2014-15, vendor MDM solutions will move from “passive-aggressive” mode to “proactive” data governance mode. According to Aaron, data governance for MDM will move beyond simple stewardship to a convergence of task management, workflow, policy management and enforcement.

8. The market will solidify for cloud based MDM adoption
Aaron says – “Cloud-innate services for DQ and DG will be more prevalent; however, enterprise MDM will remain on premise with increasing integration to cloud applications in 2015.”

Naveen sees a lot of synergy around cloud-based MDM offerings and says – “The market is solidifying for MDM on cloud but the flood gates are yet to open.” Naveen does not see any reason why the MDM market will not go to the cloud, and gives the example of CRM, which was at a similar juncture before Salesforce came into play. Naveen sees a similar shift for MDM and says – “The fears companies have about their data security in the cloud are eventually going to fade. If you look closely at any of the recent breaches, they all involved hacks into company networks, not into cloud provider networks. The fact that cloud service providers spend more on data security than any one company can spend on its on-premise security layer will be a major factor in the transition.” Naveen expects the big players in MDM to include cloud offerings as part of their toolkits in the coming years.

Ravi also predicts an increase in cloud adoption for MDM as concerns about placing master data in the cloud diminish, given the security investments cloud vendors make.

So, what do you predict? I would love to hear your opinions and comments.

~Prash
@MDMGeek
www.mdmgeek.com


The 3 Little Architects and the Big Bad Mr. Wolf – A Data Parody for today’s Financial Industry


Once upon a time, there were three Information Architects working in the financial services industry, each at a different firm with a different background, but all responsible for recommending the right technology solutions to help their firms comply with industry regulations, including ongoing bank stress testing across the globe. Since 2008, bank regulators have focused on measuring systemic risk and requiring banks to provide transparency into how risk is measured and reported to support their capital adequacy needs.

The first architect grew up through the ranks, starting as a Database Administrator, a black belt in SQL and COBOL programming. Hand coding had been in the team’s DNA for many years and was thought of as the best approach, given how customized their business and systems were compared with other organizations. As such, Architect #1 and the team went down the path of building their data management capabilities through custom hand-coded scripts, manual data extractions and transformations, and dealing with data quality issues in the business organizations after the data was delivered. Though this approach delivered on short-term needs, the firm came to realize the overhead required to make changes and respond to new requests driven by new industry regulations and changing market conditions.

The second architect is a “gadget guy” at heart who grew up using off-the-shelf tools rather than hand coding to manage data. He and his team decided not to hand code their data management processes, and instead built their solution from best-of-breed tools, some open source, others from existing solutions the company had acquired in previous projects for data integration, data quality, and metadata management. Though the tools helped automate much of the “heavy lifting,” he and his IT team were still responsible for integrating these point solutions to work together, which required ongoing support and change management.

The last architect is as technically competent as his peers; however, he understood the value of building something once to use across the business. His approach was a little different from the first two. Understanding the risks and costs of hand coding or using one-off tools, he decided to adopt an integrated platform designed to handle the complexities, sources, and volumes of data required by the business. The platform also incorporated shared metadata, reusable data transformation rules and mappings, a single source of required master and reference data, and agile development capabilities to reduce the cost of implementation and ongoing change management. Though this approach was more expensive to implement, the long-term cost and performance benefits made the decision a “no brainer.”

Lurking in the woods is Mr. Wolf. Mr. Wolf is not your typical antagonist; he is a regulatory auditor whose responsibility is to ensure these banks can explain how the risk they report to the regulatory authorities is calculated. His job isn’t to shut these banks down, but to make sure the financial industry is able to measure risk across the enterprise, explain how risk is measured, and ensure firms are adequately capitalized as mandated by new and existing industry regulations.

Mr. Wolf visits the first bank for an annual stress test audit. Looking at the results of the stress test, he asks the compliance teams to explain how their data was produced, transformed, and calculated to support the risk measurements they reported as part of the audit. Unfortunately, owing to the first architect’s recommendation to hand code their data management processes, IT failed to provide explanations and documentation of what they did; the developers who created the systems were no longer with the firm. As a result, the bank failed miserably, resulting in stiff penalties and higher audit costs.

Architect #2’s bank was next. Having heard in the news what happened to their peer, the architect and IT teams were confident they were in good shape to pass their stress test audit. After digging into the risk reports, Mr. Wolf questioned the validity of the data used to calculate Value at Risk (VaR). Unfortunately, the tools that had been adopted were never designed nor guaranteed by their vendors to work with each other, resulting in invalid data mappings and data quality rules and gaps in the technical metadata documentation. As a result, bank #2 also failed its audit, left with a ton of one-off tools that helped automate its data management processes but lacked the integration and shared rules and metadata needed to satisfy the regulator’s demand for risk transparency.

Finally, Mr. Wolf investigated Architect #3’s firm. Having seen the results at the first two banks, Mr. Wolf was leery of their ability to pass the stress test audit. He presented similar demands; this time, however, Bank #3 provided detailed and comprehensive metadata documentation of their risk data measurements, descriptions of the data used in each report, a comprehensive report of each data quality rule used to cleanse their data, and detailed information on each counterparty and legal entity used to calculate VaR. Unable to find gaps in the audit, Mr. Wolf, who had expected to “blow” the house down, delivered a passing grade to Bank #3 and its management team, thanks to the right investments they had made to support their enterprise risk data management needs.

The moral of this story, like that of the familiar one involving the three little pigs, is the importance of having a solid foundation to weather market and regulatory storms, or the violent bellow of a big bad wolf. That foundation must include the required data integration, data quality, master data management, and metadata management capabilities, but it must also support collaboration and visibility into how data is produced, used, and performing across the business. Ensuring current and future compliance in today’s financial services industry requires firms to have a solid data management platform, one that is intelligent and comprehensive, and that allows Information Architects to mitigate the risks and costs of hand coding or using point tools that only get by in the short term.

Are you prepared to meet Mr. Wolf?


Achieving Great Data in the Oil and Gas Industry

Have you noticed something different this winter season that most people are cheery about? I’ll give you a hint. It’s not the great sales going on at your local shopping mall, but something that makes getting to the mall a lot more affordable than last year: the extremely low gas prices across the globe, fueled by an over-supply of oil relative to demand, driven by geopolitics and the boom in shale oil production in North America and abroad. Like any other commodity, it’s impossible to predict where oil prices are headed; however, one thing is sure: Oil and Gas companies will need timely, quality data as they invest in new technologies to become more agile, innovative, efficient, and competitive, as reported in a recent IDC Energy Insights Predictions report for 2015.

The report predicts:

  1. 80% of the top O&G companies will reengineer processes and systems to optimize logistics, hedge risk and efficiently and safely deliver crude, LNG, and refined products by the end of 2017.
  2. Over the next 3 years, 40% of O&G majors and all software divisions of oilfield services (OFS) will co-innovate on domain specific technical projects with IT professional service firms.
  3. The CEO will expect immediate and accurate information about top Shale Plays to be available by the end of 2015 to improve asset value by 30%.
  4. By 2016, 70 percent of O&G companies will have invested in programs to evolve the IT environment to a third platform driven architecture to support agility and readily adapt to change.
  5. With continued labor shortages and over 1/3 of the O&G workforce under 45 in three years, O&G companies will turn to IT to meet productivity goals.
  6. By the end of 2017, 100% of the top 25 O&G companies will apply modeling and simulation tools and services to optimize oil field development programs and 25% will require these tools.
  7. Spending on connectivity related technologies will increase by 30% between 2014 and 2016, as O&G companies demand vendors provide the right balance of connectivity for a more complex set of data sources.
  8. In 2015, mergers, acquisitions and divestitures, plus new integrated capabilities, will drive 40% of O&G companies to re-evaluate their current deployments of ERP and hydrocarbon accounting.
  9. With a business case built on predictive analytics and optimization in drilling, production and asset integrity, 50% of O&G companies will have advanced analytics capabilities in place by 2016.
  10. With pressures on capital efficiency, by 2015, 25% of the Top 25 O&G companies will apply integrated planning and information to large capital projects, speeding up delivery and reducing over-budget risks by 30%.

Realizing value from these investments will also require Oil and Gas firms to modernize and improve their data management infrastructure and technologies to deliver great data, whether fueling actionable insights from Big Data technology or facilitating post-merger application consolidation and integration activities. Great data is only achievable through great design, supported by capable solutions built to access and deliver timely, trusted, and secure data to those who need it most.

A lack of proper data management investments and competencies has long plagued the oil and gas sector with less-than-acceptable data and higher operating costs. According to the “Upstream Data and Information Management Survey” conducted by Wipro Technologies, 56% of those surveyed felt that business users spent a quarter or more of their time on low-value activities caused by existing data issues (e.g., accessing, cleansing, and preparing data) rather than on high-value activities (e.g., analysis, planning, and decision making). The same survey showed the biggest data management issues were timely access to required data and data quality issues in source systems.

So what can Oil and Gas CIOs and Enterprise Architects do to prepare for the future? Here are some tips for consideration:

  • Migrate and automate legacy hand-coded data transformation processes by adopting tools that streamline the development, testing, deployment, and maintenance of these complex tasks, so developers can build, maintain, and monitor data transformation rules once and deploy them across the enterprise.
  • Simplify how data is distributed across systems with more modern architectures and solutions, and avoid the cost and complexity of point-to-point integrations.
  • Manage data quality upstream at the source and throughout the data life cycle, rather than having end users fix unforeseen data quality errors manually.
  • Create a centralized source of shared business reference and master data that can manage a consistent record across heterogeneous systems, covering well asset/material information (wellhead, field, pump, valve, etc.), employee data (drill/reservoir engineers, technicians), location data (often geo-spatial), and accounting data (for financial roll-ups of cost and production data).
  • Establish standards and repeatable best practices by adopting an Integration Competency Center framework to support the integration and sharing of data between operational and analytical systems.

In summary, low oil prices have a direct and positive impact on consumers, especially during the winter season and holidays, and I personally hope they continue for the foreseeable future, given that prices were double just a year ago. Unfortunately, no one can predict future energy prices; one thing is for sure, though: the demand for great data at Oil and Gas companies will continue to grow. As such, CIOs and Enterprise Architects will need to recognize the importance of improving their data management capabilities and technologies to ensure success in 2015. How ready are you?

Click to learn more about Informatica in today’s Energy Sector:


Empowering Your Organization with 3 Views of Customer Data

According to Accenture’s 2013 Global Consumer Pulse Survey, “85 percent of customers are frustrated by dealing with a company that does not make it easy to do business with them, 84 percent by companies promising one thing, but delivering another; and 58 percent are frustrated with inconsistent experiences from channel to channel.”

Consumers expect more from the companies they do business with. In response, many companies are shifting from managing their business based on an application-, account- or product-centric approach to a customer-centric approach. And this is one of the main drivers for master data management (MDM) adoption. According to a VP of Data Strategy & Services at one of the largest insurance companies in the world, “Customer data is the lifeblood of a company that is serious about customer-centricity.” So, better managing customer data, which is what MDM enables you to do, is a key to the success of any customer-centricity initiative. MDM provides a significant competitive differentiation opportunity for any organization that’s serious about improving customer experience. It enables customer-facing teams to assess the value of any customer, at the individual, household or organization level.

Amongst the myriad business drivers of a customer-centricity initiative, key benefits include delivering an enhanced customer experience – leading to higher customer loyalty and greater share of wallet, more effective cross-sell and upsell targeting to increase revenue, and improved regulatory compliance.

To truly achieve all the benefits expected from a customer-first, customer-centric strategy, we need to look beyond the traditional approaches of data quality and MDM implementations, which often consider only one foundational (yet important) aspect of the technology solution. The primary focus has always been to consolidate and reconcile internal sources of customer data with the hope that this information brought under a single umbrella of a database and a service layer will provide the desired single view of customer. But in reality, this data integration mindset misses the goal of creating quality customer data that is free from duplication and enriched to deliver significant value to the business.

Today’s MDM implementations need to take their focus beyond mere data integration to be successful. In the following sections, I will explain 3 levels of customer views, which can be built incrementally to make the most of your MDM solution. When implemented fully, these customer views act as key ingredients for improving the execution of your customer-centric business functions.

Trusted Customer View

The first phase of the solution should cover the creation of a trusted customer view. This view empowers your organization with the ability to see complete, accurate and consistent customer information.

In this stage, you take the best information from all the applications and compile it into a single golden profile. You not only use data integration technology for this, but also employ data quality tools to ensure the correctness and completeness of the customer data. Advanced matching, merging and trust frameworks are used to derive the most up-to-date information about your customer. You also guarantee that the golden record you create is accessible to the business applications and systems of choice, so everyone with the authority can leverage the single version of the truth.

At the end of this stage, you will be able to clearly say that John D., who lives at 123 Main St, and Johnny Doe at 123 Main Street, who are both doing business with you, are not really two different individuals.
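As an illustration, here is a toy Python matching rule that reconciles exactly this example; the normalization and alias table are hypothetical stand-ins for the probabilistic matching a real MDM hub performs:

    def match_key(record):
        """Toy normalization: canonical first name, surname initial, normalized address."""
        name = record["name"].lower().replace(".", "")
        first, last = name.split()[0], name.split()[-1]
        nicknames = {"johnny": "john", "jon": "john"}  # tiny illustrative alias table
        first = nicknames.get(first, first)
        addr = record["address"].lower().replace("street", "st")
        return (first, last[0], addr)  # surname initial tolerates "D." vs. "Doe"

    a = {"name": "John D.", "address": "123 Main St"}
    b = {"name": "Johnny Doe", "address": "123 Main Street"}
    print(match_key(a) == match_key(b))  # True: one customer, not two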


Customer Relationships View

The next level of visibility is about providing a view into the customer’s relationships. It takes advantage of the single customer view and layers in all valuable family and business relationships as well as account and product information. Revealing these relationships is where the real value of multidomain MDM technology comes into action.

At the end of this phase, you not only see John Doe’s golden profile, but also the products he has. He might have a personal checking account from the Retail Bank, a mortgage from the Mortgage line of business, and brokerage and trust accounts with the Wealth Management division. You can see that John has his own consulting firm. You can see he has a corporate credit card and checking account with the Commercial division under the name John Doe Consulting Company.

At the end of this phase, you will have a consolidated view of all important relationship information that will help you evaluate the true value of each customer to your organization.

Customer Interactions and Transactions View

The third level of visibility is in the form of your customer’s interactions and transactions with your organization.

During this phase, you tie the transactional information, historical data and social interactions your customer has with your organization into the system to further enhance it. Building this view opens a whole new world of opportunities because you can see everything related to your customer in one central place. Once you have this comprehensive view, when John Doe calls your call center, you know how valuable he is to your business, which product he just bought from you (transactional data), and what problem he is facing (social interactions).
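Conceptually, this phase is a set of joins on the golden customer ID established in phase one. A pandas sketch, with hypothetical feeds and column names:

    import pandas as pd

    # Hypothetical feeds, all keyed by the golden customer ID created in phase one
    golden = pd.DataFrame([{"cust_id": "G-100", "name": "John Doe", "tier": "high value"}])
    transactions = pd.DataFrame([{"cust_id": "G-100", "product": "Premium Checking",
                                  "date": "2014-12-01"}])
    interactions = pd.DataFrame([{"cust_id": "G-100", "channel": "twitter",
                                  "note": "complaint about mobile app"}])

    # Everything the call center needs when John calls, in one place
    view_360 = golden.merge(transactions, on="cust_id").merge(interactions, on="cust_id")
    print(view_360)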

A widely accepted rule of thumb holds that 80 percent of your company’s future revenue will come from 20 percent of your existing customers. Many organizations are trying to ensure they are doing everything they can to retain existing customers and grow wallet share. Starting with the Trusted Customer View is the first step toward making your existing customers stay. Once you have established all three views discussed here, you can arm your customer-facing teams with a comprehensive view of customers so they can:

  • Deliver the best customer experiences possible at every touch point,
  • Improve customer segmentation for tailored offers and boost marketing and sales productivity,
  • Increase cross-sell and up-sell success, and
  • Streamline regulatory reporting.

Achieving the 3 views discussed here requires a solid data management platform. You not only need industry-leading multidomain MDM technology, but also tools that help you integrate data, control its quality and connect all the dots. These technologies should work together seamlessly to make your implementation easier and help you gain rapid benefits. Therefore, choose your data management platform carefully. To learn more about MDM vendors, read the recently released Gartner Magic Quadrant for MDM of Customer Data Solutions.

-Prash (@MDMGeek)

www.mdmgeek.com


How Citrix is Using Great Data to Build Fortune Teller-Like Marketing

Citrix: You may not realize you know them, but chances are pretty good that you do. And chances are also good that we marketers can learn something about achieving fortune teller-like marketing from them!

Citrix is the company that brought you GoToMeeting and a whole host of other mobile workspace solutions that provide virtualization, networking and cloud services.  Their goal is to give their 100 million users in 260,000 organizations across the globe “new ways to work better with seamless and secure access to the apps, files and services they need on any device, wherever they go.”

Citrix is a company that has been imagining and innovating for over 25 years, and over that time has seen a complete transformation in its market – virtual solutions and cloud services didn’t even exist when the company was founded. Now they’re the backbone of its business. Their corporate video proudly states that the only constant in this world is change, and that they strive to embrace the “yet to be discovered.”

Having worked with them quite a bit over the past few years, we have seen first-hand how Citrix has demonstrated their ability to embrace change.

The Problem:

Back in 2011, it became clear to Citrix that they had a data problem, and that they would have to make some changes to stay ahead in this hyper-competitive market. Sales & Marketing had identified data as their #1 concern – their data was incomplete, inaccurate, and duplicated in their CRM system. And with so many different applications in the organization, it was quite difficult to know which application or data source had the most accurate and up-to-date information. They realized they needed a single source of the truth – one system of reference where all of their global data management practices could be centralized and consistent.

The Solution:

The marketing team realized that they needed to take control of the solution to their data concerns, as their success truly depended upon it. They brought together their IT department and their systems integration partner, Cognizant, to determine a course of action. Together they forged an overall data governance strategy that would empower the marketing team to manage data centrally and be responsible for their own success.

As a key element of that data governance and management strategy, they determined that they needed a Master Data Management (MDM) solution to serve as their single trusted source of customer and prospect data. They did a great deal of research into industry best practices and technology solutions, and selected Informatica as their MDM partner. Citrix’s environment is not unlike that of most marketing organizations. The difference is that they are now able to capture and distribute better customer and prospect data to and from these systems to achieve even better results. They leverage internal data sources and systems like CRM (Salesforce) and marketing automation (Marketo). Their systems live all over the enterprise, both on premises and in the cloud. And they leverage analytical tools to analyze and dashboard their results.

The Results:

Citrix strategized and implemented their Single Trusted Source of Customer & Prospect Data solution in a phased approach throughout 2013 and 2014, and we believe that what they’ve been able to accomplish in that short period of time has been nothing short of phenomenal. Here are the highlights:

Citrix Achieved Tremendous Results

  • Used Informatica MDM to provide clean, consistent and connected channel partner, customer and prospect data and the relationships between them for use in operational applications (SFDC, BI Reporting and Predictive Analytics)
  • Recognized 20% increase in lead-to-opportunity conversion rates
  • Realized 20% increase in marketing team’s operational efficiency
  • Achieved 50% increase in quality of data at the point of entry, and a 50% reduction in the rate of junk and duplicate data for prospects, existing accounts and contacts
  • Delivered a better channel partner and customer experience by renewing all of a customer’s user licenses across product lines at one time and making it easy to identify whitespace opportunities to up-sell more user licenses

That is huge! Can you imagine the impact on your own marketing organization of a 20% increase in lead-to-opportunity conversion? Can you imagine the impact of spending 20% less time questioning and manually massaging data to get the information you need? That’s game-changing!

Because Citrix now has great data and great resulting insight, they have been able to take the next step and embark on new fortune teller-like marketing strategies.   As Citrix’s Dagmar Garcia discussed during a recent webinar, “We monitor implicit and explicit behavior of transactional leads and accounts, and then we leverage these insights and previous behaviors to offer net new offers and campaigns to our customers and prospects…  And it’s all based on the quality of data we have within our database.”

I encourage you to take a few minutes to listen to Dagmar discuss Citrix’s project on a recent webinar. In it, she dives deeper into the project’s scope and timeline, and explains what she means by “fortune telling abilities”. Also, take a look at the customer story section of the Informatica.com website for the PDF case study. And, if you’re in the mood to learn more, you can download a complimentary copy of the 2014 Gartner Magic Quadrant for MDM of Customer Data Solutions.

Hats off to you, Citrix – we look forward to working with you to continue to change the game even more in the coming months and years!

Posted in CMO, Customers, Master Data Management

At Valspar, Data Management is Key to Controlling Purchasing Costs

Steve Jenkins, Global IT Director, is working to improve information management maturity at Valspar

“Raw materials costs are the company’s single largest expense category,” said Steve Jenkins, Global IT Director at Valspar, at MDM Day in London. “Data management technology can help us improve business process efficiency, manage sourcing risk and reduce RFQ cycle times.”

Valspar is a $4 billion global manufacturing company that produces a portfolio of leading paint and coating brands. At the end of 2013, the 200-year-old company celebrated record sales and earnings. They also completed two acquisitions. Valspar now has 10,000 employees operating in 25 countries.

As is the case for many global companies, growth creates complexity. “Valspar has multiple business units with varying purchasing practices. We source raw materials from 1,000s of vendors around the globe,” shared Steve.

“We want to achieve economies of scale in purchasing to control spending,” Steve said as he shared Valspar’s improvement objectives. “We want to build stronger relationships with our preferred vendors. Also, we want to develop internal process efficiencies to realize additional savings.”

Poorly managed vendor and raw materials data was impacting Valspar’s buying power

The Valspar team, which focuses sharply on productivity, had an “aha” moment. “We realized our buying power was limited by the age and quality of available vendor data and raw materials data,” revealed Steve.

The core vendor data and raw materials data that should have been the same across multiple systems wasn’t. Data was often missing or wrong. This made it difficult to calculate the total spend on raw materials. It was also hard to calculate the total cost of expedited freight of raw materials. So, employees used a manual, time-consuming and error-prone process to consolidate vendor data and raw materials data for reporting.
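
To see why that manual consolidation is so error-prone, here is a minimal sketch (with hypothetical vendor names and amounts, not Valspar’s data) of a total-spend rollup. Without a master cross-reference, the same vendor spelled two ways counts as two vendors and per-vendor spend is understated:

    from collections import defaultdict

    # Hypothetical purchase records from two ERP systems. The same vendor
    # appears under different spellings, so a naive group-by splits its spend.
    purchases = [
        {"system": "ERP-A", "vendor": "Acme Chemical Co.", "amount": 120000},
        {"system": "ERP-B", "vendor": "ACME CHEMICAL",     "amount": 80000},
        {"system": "ERP-A", "vendor": "Baker Resins Inc.", "amount": 50000},
    ]

    # The cross-reference a master data hub maintains: every source spelling
    # maps to one golden vendor identifier.
    master_xref = {
        "Acme Chemical Co.": "VEND-001",
        "ACME CHEMICAL":     "VEND-001",
        "Baker Resins Inc.": "VEND-002",
    }

    naive, mastered = defaultdict(int), defaultdict(int)
    for p in purchases:
        naive[p["vendor"]] += p["amount"]
        mastered[master_xref[p["vendor"]]] += p["amount"]

    print(len(naive))            # 3 vendors - Acme is double-counted
    print(len(mastered))         # 2 vendors - the true picture
    print(mastered["VEND-001"])  # 200000 - Acme's real total spend

Maintaining that cross-reference by hand across thousands of vendors is exactly the manual process Valspar wanted to retire; a hub is designed to keep it current automatically.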

These data issues were getting in the way of achieving their improvement objectives. Valspar needed a data management solution.

Valspar needed a single trusted source of vendor and raw materials data

The team chose Informatica MDM, master data management (MDM) technology, as their enterprise hub for vendors and raw materials. It will manage this data centrally on an ongoing basis, giving Valspar a single trusted source of vendor and raw materials data.

Informatica PowerCenter will access data from multiple source systems. Informatica Data Quality will profile the data before it goes into the hub. Then, after Informatica MDM works its magic, PowerCenter will deliver clean, consistent, connected and enriched data to target systems.
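
For readers who want a mental model of that flow, here is a minimal sketch in Python of the profile, cleanse, match-and-merge, deliver sequence. It is a conceptual stand-in under assumed record shapes, not the actual behavior or APIs of PowerCenter, Data Quality or MDM:

    import re

    def profile(records):
        # Report field completeness, as a profiling step would before load.
        fields = {k for r in records for k in r}
        return {f: sum(1 for r in records if r.get(f)) / len(records) for f in fields}

    def cleanse(record):
        # Standardize values: collapse whitespace, normalize name casing.
        return {k: re.sub(r"\s+", " ", v).strip().title() if isinstance(v, str) else v
                for k, v in record.items()}

    def match_and_merge(records, key="tax_id"):
        # Collapse records sharing a match key into one golden record,
        # letting later non-empty values survive.
        golden = {}
        for r in records:
            golden.setdefault(r[key], {}).update({k: v for k, v in r.items() if v})
        return list(golden.values())

    sources = [
        {"name": "acme  chemical co", "tax_id": "77-001", "city": ""},
        {"name": "ACME Chemical Co.", "tax_id": "77-001", "city": "Chicago"},
    ]
    print(profile(sources))  # shows 'city' is only 50% populated at the source
    golden = match_and_merge([cleanse(r) for r in sources])
    print(golden)            # one merged vendor record, ready for delivery

In production these stages run continuously, in both directions, and at far larger scale, but the order of operations is the part worth internalizing.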

Better vendor and raw materials data management results in cost savings

Valspar expects to gain the following business benefits:

  • Streamline the RFQ process to accelerate raw materials cost savings
  • Reduce the total number of raw materials SKUs and vendors
  • Increase productivity of staff focused on pulling and maintaining data
  • Leverage consistent global data visibility to:
    • increase leverage during contract negotiations
    • improve acquisition due diligence reviews
    • facilitate process standardization and reporting


Valspar’s vision is to transform data and information into trusted organizational assets

“Mastering vendor and raw materials data is Phase 1 of our vision to transform data and information into trusted organizational assets,” shared Steve. In Phase 2, the Valspar team will master customer data so they have immediate access to the total purchases of key global customers. In Phase 3, Valspar’s team will turn their attention to product (finished goods) data.

Steve ended his presentation with some advice. “First, include your business counterparts in the process as early as possible. They need to own and drive the business case as well as the approval process. Also, master only the vendor and raw materials attributes required to realize the business benefit.”

Want more? Download the Total Supplier Information Management eBook. It covers:

  • Why your fragmented supplier data is holding you back
  • The cost of supplier data chaos
  • The warning signs you need to be looking for
  • How you can achieve Total Supplier Information Management


Posted in Business/IT Collaboration, Data Integration, Data Quality, Manufacturing, Master Data Management, Operational Efficiency, PowerCenter, Vertical