Jakki Geiger

Jakki Geiger is the Senior Director of Master Data Management (MDM) Solutions Marketing at Informatica, responsible for the go-to-market strategies of solutions such as: Total Customer Relationship, Total Supplier Relationship, and Financial Hierarchies Management, among others. Over the course of her 15 year career spearheading marketing for companies of all sizes — from 10 employees to 50,000 employees — Jakki has experienced the pain of not having clean, consistent and connected customer information to drive marketing programs and reporting. Of course, this was before joining Informatica. Jakki joined Informatica following the Siperian acquisition, where she was responsible for solutions marketing. Before Informatica, Jakki held marketing leadership positions at several venture-funded startups resulting in exponential growth and successful acquisitions by Oracle, Thomson Reuters and Informatica. She graduated magna cum laude with an M.A. in integrated marketing communications from Emerson College in Boston and earned a B.A. with honors in Psychology from McGill University in Montreal, Canada. Jakki is a published author and frequent speaker on the business value that can be gained when people have access to clean, consistent and connected information about the things that matter most to their business. She’s based at Informatica’s headquarters in Redwood City, California.

At Valspar, Data Management is Key to Controlling Purchasing Costs

Steve Jenkins, Global IT Director at Valspar

Steve Jenkins is working to improve information management maturity at Valspar

“Raw materials costs are the company’s single largest expense category,” said Steve Jenkins, Global IT Director at Valspar, at MDM Day in London. “Data management technology can help us improve business process efficiency, manage sourcing risk and reduce RFQ cycle times.”

Valspar is a $4 billion global manufacturing company, which produces a portfolio of leading paint and coating brands. At the end of 2013, the 200-year-old company celebrated record sales and earnings. They also completed two acquisitions. Valspar now has 10,000 employees operating in 25 countries.

As is the case for many global companies, growth creates complexity. “Valspar has multiple business units with varying purchasing practices. We source raw materials from thousands of vendors around the globe,” shared Steve.

“We want to achieve economies of scale in purchasing to control spending,” Steve said as he shared Valspar’s improvement objectives. “We want to build stronger relationships with our preferred vendors. Also, we want to develop internal process efficiencies to realize additional savings.”

Poorly managed vendor and raw materials data was impacting Valspar’s buying power

Data management at Valspar

“We realized our buying power was limited by the age and quality of available vendor and raw materials data.”

The Valspar team, which is sharply focused on productivity, had an “Aha” moment. “We realized our buying power was limited by the age and quality of available vendor data and raw materials data,” revealed Steve.

The core vendor data and raw materials data that should have been the same across multiple systems wasn’t. Data was often missing or wrong. This made it difficult to calculate the total spend on raw materials. It was also hard to calculate the total cost of expedited freight of raw materials. So, employees used a manual, time-consuming and error-prone process to consolidate vendor data and raw materials data for reporting.

These data issues were getting in the way of achieving their improvement objectives. Valspar needed a data management solution.

Valspar needed a single trusted source of vendor and raw materials data

Informatica MDM supports vendor and raw materials data management at Valspar

The team chose Informatica MDM as their enterprise hub for vendors and raw materials

The team chose Informatica MDM, master data management (MDM) technology, as their enterprise hub for vendors and raw materials. It will manage this data centrally on an ongoing basis. With Informatica MDM, Valspar will have a single trusted source of vendor and raw materials data.

Informatica PowerCenter will access data from multiple source systems. Informatica Data Quality will profile the data before it goes into the hub. Then, after Informatica MDM does its magic, PowerCenter will deliver clean, consistent, connected and enriched data to target systems.
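The flow described above (pull data from source systems, profile and cleanse it, master it centrally, then publish it to targets) is the classic hub pattern. As a rough illustration of the idea only, here is a plain Python sketch, not Informatica code; the vendor fields and the simplistic name-based match rule are hypothetical:

```python
# Illustrative hub-style flow: extract -> cleanse -> match & merge -> publish.
# The record fields and the match rule are hypothetical; real MDM tools use
# configurable, probabilistic matching and richer survivorship rules.

def cleanse(record):
    """Standardize fields before mastering (a stand-in for data quality rules)."""
    return {
        "name": record["name"].strip().upper(),
        "country": record.get("country", "").strip().upper() or "UNKNOWN",
    }

def master(records):
    """Naive match & merge: group cleansed records by standardized name."""
    golden = {}
    for rec in map(cleanse, records):
        golden.setdefault(rec["name"], rec)  # first record survives (simplistic)
    return list(golden.values())

sources = [
    {"name": " Acme Chemicals ", "country": "us"},  # from one ERP
    {"name": "ACME CHEMICALS", "country": "US"},    # duplicate from another ERP
    {"name": "Baltic Resins", "country": "de"},     # from an AP system
]

golden_records = master(sources)
print(len(golden_records))  # 2 golden records consolidated from 3 source records
```

The point of the hub model is that this consolidation happens once, centrally, and the "target" systems are then fed from the golden records rather than each keeping its own divergent copy.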

Better vendor and raw materials data management results in cost savings


Valspar will gain benefits by fueling applications with clean, consistent, connected and enriched data

Valspar expects to gain the following business benefits:

  • Streamline the RFQ process to accelerate raw materials cost savings
  • Reduce the total number of raw materials SKUs and vendors
  • Increase productivity of staff focused on pulling and maintaining data
  • Leverage consistent global data visibility to:
    • increase leverage during contract negotiations
    • improve acquisition due diligence reviews
    • facilitate process standardization and reporting

 

Valspar’s vision is to transform data and information into trusted organizational assets

“Mastering vendor and raw materials data is Phase 1 of our vision to transform data and information into trusted organizational assets,” shared Steve. In Phase 2 the Valspar team will master customer data so they have immediate access to the total purchases of key global customers. In Phase 3, Valspar’s team will turn their attention to product or finished goods data.

Steve ended his presentation with some advice. “First, include your business counterparts in the process as early as possible. They need to own and drive the business case as well as the approval process. Also, master only the vendor and raw materials attributes required to realize the business benefit.”


Want more? Download the Total Supplier Information Management eBook. It covers:

  • Why your fragmented supplier data is holding you back
  • The cost of supplier data chaos
  • The warning signs you need to be looking for
  • How you can achieve Total Supplier Information Management

 


8 Information Management Challenges for UDI Compliance

“My team spends far too much time pulling together medical device data that’s scattered across different systems and reconciling it in spreadsheets to create compliance reports.” This quotation from a regulatory affairs leader at a medical device manufacturer highlights the impact of poorly managed medical device data on compliance reporting, such as the reports needed for the FDA’s Unique Device Identification (UDI) regulation. In fact, an overreliance on manual, time-consuming processes brings an increased risk of human error in UDI compliance reports.

frustrated_man_computer

Is your compliance team manually reconciling data for UDI compliance reports?

If you are an information management leader working for a medical device manufacturer, and your compliance team needs quick and easy access to medical device data for UDI compliance reporting, I have five questions for you:

1) How many Class III and Class II devices do you have?
2) How many systems or reporting data stores contain data about these medical devices?
3) How much time do employees spend manually fixing data errors before the data can be used for reporting?
4) How do you plan to manage medical device data so the compliance team can quickly and easily produce accurate reports for UDI Compliance?
5) How do you plan to help the compliance team manage the multi-step submission process?


Watch this on-demand webinar “3 EIM Best Practices for UDI Compliance”

For some helpful advice from data management experts, watch this on-demand webinar “3 Enterprise Information Management (EIM) Best Practices for UDI Compliance.”

The deadline to submit the first UDI compliance report to the FDA for Class III devices is September 24, 2014. But the medical device data needed to produce the report is typically scattered among different internal systems, such as Enterprise Resource Planning (ERP) systems (e.g., SAP and JD Edwards), Product Lifecycle Management (PLM) and Manufacturing Execution Systems (MES), as well as external third-party device identifiers.

The traditional approach to dealing with poorly managed data is for the compliance team to burn the midnight oil, pulling together and manually reconciling all the medical device data in a spreadsheet. And they have to do this each and every time a compliance report is due. The good news is your compliance team doesn’t have to.

Many medical device manufacturers are leveraging their existing data governance programs, supported by a combination of data integration, data quality and master data management (MDM) technology, to eliminate the need for manual data reconciliation. They are centralizing their medical device data management so they have a single source of trusted medical device data for UDI compliance reporting as well as other compliance and revenue-generating initiatives.

Get UDI data management advice from data experts Kelle O’Neal, Managing Partner at First San Francisco Partners and Bryan Balding, MDM Specialist at Informatica

During this on-demand webinar, Kelle O’Neal, Managing Partner at First San Francisco Partners, covers the eight information management challenges for UDI compliance as well as best practices for medical device data management.

Bryan Balding, MDM Solution Specialist at Informatica, shows you how to apply these best practices with the Informatica UDI Compliance Solution.

You’ll learn how to automate the process of capturing, managing and sharing medical device data to make it quicker and easier to create the reports needed for UDI compliance on an ongoing basis.

 

 

20 Questions & Answers about Complying with the FDA Requirement for Unique Device Identification (UDI)


Also, we just published a joint whitepaper with First San Francisco Partners, Information Management FAQ for UDI: 20 Questions & Answers about Complying with the FDA Requirement for Unique Device Identification (UDI). Get answers to questions such as:

  • What is needed to support an EIM strategy for UDI compliance?
  • What role does data governance play in UDI compliance?
  • What are the components of a successful data governance program?
  • Why should I centralize my business-critical medical device data?
  • What does the architecture of a UDI compliance solution look like?

I invite you to download the UDI compliance FAQ now and share your feedback in the comments section below.


3 Barriers to Delivering Omnichannel Experiences

 

This blog post initially appeared on CMSwire.com and is reblogged here with their consent.


I was recently searching for fishing rods for my 5-year-old son and his friends to use at our neighborhood pond. I know nothing about fishing, so I needed to get educated. First up, a Google search on my laptop at home. Then, I jostled between my phone, tablet and laptop visiting websites, reading descriptions, looking at photos and reading reviews. Offline, I talked to friends and visited local stores.

The product descriptions weren’t very helpful. What is a “practice casting plug”? Turns out, this was a great feature! Instead of a hook, the rod had a rubber fish to practice casting safely. What a missed opportunity for the retailers who didn’t share this information. I bought the fishing rods from the retailer that educated me with valuable product information and offered free three-to-five-day shipping.

What does this mean for companies who sell products across multiple channels?

Virtually everyone is a cross-channel shopper: 95 percent of consumers frequently or at least occasionally shop a retailer’s website and store, according to the “Omni-Channel Insights” study by CFI Group. In the report, “The Omnichannel Opportunity: Unlocking the Power of the Connected Customer,” Deloitte predicts more than 50 percent of in-store purchases will be influenced digitally by the end of 2014.

Because of all this cross-channel activity, a new term is trending: omnichannel.

What Does Omnichannel Mean?

Let’s take a look back in time. Retailers started with one channel — the brick-and-mortar store. Then they introduced the catalog and call center. Then they built another channel — e-Commerce. Instead of making it an extension of the brick-and-mortar experience, many implemented an independent strategy, including operations, resources, technology and inventory. Retailers recently started integrating brick-and-mortar and e-Commerce channels, but it’s not always consistent. And now they are building another channel — mobile sites and apps.

Multichannel is a retailer-centric, transaction-focused view of operations. Each channel operates and aims to boost sales independently. Omnichannel is a customer-centric view. The goal is to understand through which channels customers want to engage at each stage of the shopping journey and enable a seamless, integrated and consistent brand experience across channels and devices.

Shoppers expect an omnichannel experience, but delivering it efficiently isn’t easy. Those responsible for enabling an omnichannel experience are encountering barriers. Let’s look at the three barriers most relevant for marketing, merchandising, sales, customer experience and information management leaders.

Barrier #1: Shift from product-centric to customer-centric view

Many retailers focus on how many products are sold by channel. Three key questions are:

  1. How can we drive store sales growth?
  2. How can we drive online sales growth?
  3. What’s our mobile strategy?

This is the old way of running a retail business. The new way is analyzing customer data to understand how customers are engaging and transacting across channels.

Why is this difficult? At the Argyle eCommerce Leadership Forum, Jason Allen, Vice President of Multichannel at GameStop Corp., shared the $8.8 billion video game retailer’s approach to overcoming this barrier. While online represented 3 percent of sales, no one had measured how much the online channel was influencing the overall business.

They started by collecting customer data for analytics to find out who their customers were and how they interacted with GameStop online and in 6,600 stores across 15 countries. The analysis revealed customers used multiple channels: 60 percent engaged on the web, and 26 percent of web visitors who didn’t buy online bought in-store within 48 hours.

This insight changed the perception of the online channel as a small contributor. Now they use two metrics to measure performance. While the online channel delivers 3 percent of sales, it influences 22 percent of overall business.

Take Action: Start collecting customer data. Analyze it. Learn who your customers are. Find out how they engage and transact with your business across channels.

Barrier #2: Shift from fragmented customer data to centralized customer data everyone can use

Nikki Baird, Managing Partner at Retail Systems Research (RSR), told me she believes the fundamentals of retail are changing from “right product, right price, right place, right time” to:

  1. Who is my customer?
  2. What are they trying to accomplish?
  3. How can we help?

According to RSR, creating a consistent customer experience remains the most valued capability for retailers, but 54 percent indicated their biggest inhibitor was not having a single view of the customer across channels.

Why is this difficult? A $12 billion specialty retailer known for its relentless focus on customer experience, with 200 stores and an online channel, had to overcome this barrier. To deliver a high-touch omnichannel experience, they needed to replace the many views of the customer with one unified customer view. They invested in master data management (MDM) technology and competencies.


 

Now they bring together customer, employee and product data scattered across 30 applications (e.g., e-Commerce, POS, clienteling, customer service, order management) into a central location, where it’s managed and shared on an ongoing basis. Employees’ applications are fueled with clean, consistent and connected customer data. They are able to deliver a high-touch omnichannel experience because they can answer important questions about customers and their valuable relationships, such as:

  • Who is this customer and who’s in their household?
  • Who do they buy for, what do they buy, where do they buy?
  • Which employees do they typically buy from in store?

Take Action: Think of the valuable information customers share when they interact with different parts of your business. Tap into it by bridging customer information silos. Bring fragmented customer information together in one central location. Make it universally accessible. Don’t let it remain locked up in departmental applications. Keep it up-to-date. Automate the process of updating customer information across departmental applications.

Barrier #3: Shift from fragmented product data to centralized product data everyone can use

Two-thirds of purchase journeys start with a Google search. To have a fighting chance, retailers need rich and high quality product information to rank higher than the competition.

Take a look at the image on the left. Would you buy this product? Probably not. One-third of shoppers who leave without buying say they didn’t have enough information to make a purchase decision. What product information does a shopper need to convert in the moment? Rich, high quality information has conversion power.

Consumers return about 40 percent of all fashion and 15 percent of electronics purchases. That’s not good for retailers or shoppers. Minimize costly returns with complete product information so shoppers can make more informed purchase decisions. Jason Allen’s advice is, “Focus less on the cart and check out. Focus more on search, product information and your store locator. Eighty percent of customers are coming to the web for research.”

Why is this difficult? Crestline is a multichannel direct marketing firm selling promotional products through direct mail and e-Commerce. The barrier to quickly bringing products to market and updating product information across channels was fragmented and complex product information. To replace the manual, time-consuming spreadsheet process for managing product information, they invested in product information management (PIM) technology.


Now Crestline’s product introduction and update process is 300 percent more efficient. Because they are 100 percent current on top products and over 50 percent current for all products, the company is boosting margins and customer service.

Take Action: Think about all the product information shoppers need to research and make a decision. Tap into it by bridging product information silos. Bring fragmented product information together in one central location. Make it universally usable, not channel-specific. Keep it up-to-date. Automate the process of publishing product information across channels, including the applications used by customer service and store associates.

Key Takeaways

Delivering an omnichannel experience efficiently isn’t easy. The GameStop team collected and analyzed customer data to learn more about who their customers are and how they interact with the company. A specialty retailer centralized fragmented customer data. Crestline centralized product information to accelerate their ability to bring products to market and make updates across channels. Which of these barriers are holding you back from delivering an omnichannel experience?

Title image by Lars Plougmann (Flickr) via a CC BY-SA 2.0 license

 

 


How Much is Poorly Managed Supplier Information Costing Your Business?

“Inaccurate, inconsistent and disconnected supplier information prohibits us from doing accurate supplier spend analysis, leveraging discounts, comparing and choosing the best prices, and enforcing corporate standards.”

This is a quotation from a manufacturing company executive. It illustrates the negative impact that poorly managed supplier information can have on a company’s ability to cut costs and achieve revenue targets.

Many supply chain and procurement teams at large companies struggle to see the total relationship they have with suppliers across product lines, business units and regions. Why? Supplier information is scattered across dozens or hundreds of Enterprise Resource Planning (ERP) and Accounts Payable (AP) applications. Too much valuable time is spent manually reconciling inaccurate, inconsistent and disconnected supplier information in an effort to see the big picture. All this manual effort results in back office administrative costs that are higher than they should be.

Do these quotations from supply chain leaders and their teams sound familiar?

  • “We have 500,000 suppliers. 15-20% of our supplier records are duplicates. 5% are inaccurate.”
  • “I get 100 e-mails a day questioning which supplier to use.”
  • “To consolidate vendor reporting for a single supplier between divisions is really just a guess.”
  • “Every year 1099 tax mailings get returned to us because of invalid addresses, and we pay a lot of Schedule B fines to the IRS.”
  • “Two years ago we spent a significant amount of time and money cleansing supplier data. Now we are back where we started.”

Join us for a Webinar to find out how to supercharge your supply chain applications with clean, consistent and connected supplier information

Please join me and Naveen Sharma, Director of the Master Data Management (MDM) Practice at Cognizant for a Webinar, Supercharge Your Supply Chain Applications with Better Supplier Information, on Tuesday, July 29th at 11 am PT.

During the Webinar, we’ll explain how better managing supplier information can help you achieve the following goals:

  1. Accelerate supplier onboarding
  2. Mitigate the risk of supply disruption
  3. Better manage supplier performance
  4. Streamline billing and payment processes
  5. Improve supplier relationship management and collaboration
  6. Make it easier to evaluate non-compliance with Service Level Agreements (SLAs)
  7. Decrease costs by negotiating favorable payment terms and SLAs

I hope you can join us for this upcoming Webinar!

 

 


How Much is Disconnected Well Data Costing Your Business?

“Not only do we underestimate the cost of projects by up to 150%, but we overestimate the revenue they will generate.” This quotation from an Energy & Petroleum (E&P) company executive illustrates the negative impact of inaccurate, inconsistent and disconnected well data and asset data on revenue potential.

“Operational Excellence” is a common goal of many E&P company executives pursuing higher growth targets. But inaccurate, inconsistent and disconnected well data and asset data may be holding them back. It obscures the complete picture of the well information lifecycle, making it difficult to maximize production efficiency, reduce Non-Productive Time (NPT), streamline the oilfield supply chain, calculate well-by-well profitability, and mitigate risk.


Well data expert, Stephanie Wilkin shares details about the award-winning collaboration between Noah Consulting and Devon Energy.

To explain how E&P companies can better manage well data and asset data, we hosted a webinar, “Attention E&P Executives: Streamlining the Well Information Lifecycle.” Our well data experts Stephanie Wilkin, Senior Principal Consultant at Noah Consulting, and Stephan Zoder, Director of Value Engineering at Informatica shared some advice. E&P companies should reevaluate “throwing more bodies at a data cleanup project twice a year.” This approach does not support the pursuit of operational excellence.

In this interview, Stephanie shares details about the award-winning collaboration between Noah Consulting and Devon Energy to create a single trusted source of well data, which is standardized and mastered.

Q. Congratulations on winning the 2014 Innovation Award, Stephanie!
A. Thanks Jakki. It was really exciting working with Devon Energy. Together we put the technology and processes in place to manage and master well data in a central location and share it with downstream systems on an ongoing basis. We were proud to win the 2014 Innovation Award for Best Enterprise Data Platform.

Q. What was the business need for mastering well data?
A. As E&P companies grow so do their needs for business-critical well data. All departments need clean, consistent and connected well data to fuel their applications. We implemented a master data management (MDM) solution for well data with the goals of improving information management, business productivity, organizational efficiency, and reporting.

Q. How long did it take to implement the MDM solution for well data?
A. The Devon Energy project kicked off in May of 2012. Within five months we built the complete solution from gathering business requirements to development and testing.

Q. What were the steps in implementing the MDM solution?
A: The first and most important step was securing buy-in on a common definition for master well data or Unique Well Identifier (UWI). The key was to create a definition that would meet the needs of various business functions. Then we built the well master, which would be consistent across various systems, such as G&G, Drilling, Production, Finance, etc. We used the Professional Petroleum Data Management Association (PPDM) data model and created more than 70 unique attributes for the well, including Lahee Class, Fluid Direction, Trajectory, Role and Business Interest.

As part of the original go-live, we had three source systems of well data and two target systems connected to the MDM solution. Over the course of the next year, we added three additional source systems and four additional target systems. We did a cross-system analysis to make sure every department has the right wells and the right data about those wells. Now the company uses MDM as the single trusted source of well data, which is standardized and mastered, to do analysis and build reports.

Q. What’s been the traditional approach for managing well data?
A. Typically when a new well is created, employees spend time entering well data into their own systems. For example, one person enters well data into the G&G application. Another person enters the same well data into the Drilling application. A third person enters the same well data into the Finance application. According to statistics, it takes about 30 minutes to enter wells into a particular financial application.

So imagine if you need to add 500 new wells to your systems. This is common after a merger or acquisition. That translates to roughly 250 hours or 6.25 weeks of employee time saved on the well create process! By automating across systems, you not only save time, you eliminate redundant data entry and possible errors in the process.
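The arithmetic behind that estimate is easy to verify (a quick sketch, assuming a 40-hour work week):

```python
# Back-of-the-envelope check of the well-create savings quoted above.
wells = 500                      # wells added after a merger or acquisition
minutes_per_well = 30            # manual entry into one financial application
hours_saved = wells * minutes_per_well / 60
weeks_saved = hours_saved / 40   # assumes a 40-hour work week

print(hours_saved)  # 250.0 hours
print(weeks_saved)  # 6.25 weeks
```

And that is the savings for a single downstream application; with well data re-keyed into G&G, Drilling and Finance systems, the manual cost multiplies.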

Q. That sounds like a painfully slow and error-prone process.
A. It is! But that’s only half the problem. Without a single trusted source of well data, how do you get a complete picture of your wells? When you compare the well data in the G&G system to the well data in the Drilling or Finance systems, it’s typically inconsistent and difficult to reconcile. This leads to the question, “Which one of these systems has the best version of the truth?” Employees spend too much time manually reconciling well data for reporting and decision-making.

Q. So there is a lot to be gained by better managing well data.
A. That’s right. The CFO typically loves the ROI on a master well data project. It’s a huge opportunity to save time and money, boost productivity and get more accurate reporting.

Q: What were some of the business requirements for the MDM solution?
A: We couldn’t build a solution that was narrowly focused on meeting the company’s needs today. We had to keep the future in mind. Our goal was to build a framework that was scalable and supportable as the company’s business environment changed. This allows the company to add additional data domains or attributes to the well data model at any time.


The Noah Consulting MDM Trust Framework was used to build a single trusted source of well data

Q: Why did you choose Informatica MDM?
A: The decision to use Informatica MDM for the MDM Trust Framework came down to the following capabilities:

  • Match and Merge: With Informatica, we get a lot of flexibility. Some systems carry the API or well government ID, but some don’t. We can match and merge records differently based on the system.
  • X-References: We keep a cross-reference between all the systems. We can go back to the master well data and find out where that data came from and when. We can see where changes have occurred because Informatica MDM tracks the history and lineage.
  • Scalability: This was a key requirement. While we went live after only 5 months, we’ve been continually building out the well master based on the requirements of the target systems.
  • Flexibility: Down the road, if we want to add an additional facet or classification to the well master, the framework allows for that.
  • Simple Integration: Instead of building point-to-point integrations, we use the hub model.

In addition to Informatica MDM, our Noah Consulting MDM Trust Framework includes Informatica PowerCenter for data integration, Informatica Data Quality for data cleansing and Informatica Data Virtualization.
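A per-system match rule of the kind Stephanie describes, matching on the API/government well ID where a source system carries it and falling back to standardized attributes where it doesn't, can be sketched in a few lines. This is illustrative Python only, not Informatica configuration; the field names and the fallback rule are hypothetical:

```python
def well_match_key(record):
    """Choose a match key per record: prefer the API/government well ID
    when the source system carries it, otherwise fall back to a composite
    of standardized attributes (a hypothetical rule for illustration)."""
    if record.get("api_number"):
        return ("api", record["api_number"])
    # Fallback: standardized well name + operator.
    return ("name", record["well_name"].strip().upper(),
            record["operator"].strip().upper())

records = [
    {"api_number": "42-501-20098", "well_name": "Smith 1H", "operator": "Devon"},
    {"api_number": "42-501-20098", "well_name": "SMITH #1H", "operator": "DEVON"},
    {"api_number": None, "well_name": " Smith 1H ", "operator": "devon"},
    {"api_number": None, "well_name": "smith 1h", "operator": "Devon"},
]

# Cluster records that share a match key, the first step before merging.
groups = {}
for rec in records:
    groups.setdefault(well_match_key(rec), []).append(rec)

print(len(groups))  # 2: one cluster matched on API number, one on the fallback key
```

The cross-reference capability mentioned above is what lets you trace each merged record back to the source rows in a cluster like this.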

Q: Can you give some examples of the business value gained by mastering well data?
A: One person said to me, “I’m so overwhelmed! We’ve never had one place to look at this well data before.” With MDM centrally managing master well data and fueling key business applications, many upstream processes can be optimized to achieve their full potential value.

People spend less time entering well data on the front end and reconciling well data on the back end. Well data is entered once and it’s automatically shared across all systems that need it. People can trust that it’s consistent across systems. Also, because the data across systems is now tied together, it provides business value they were unable to realize before, such as predictive analytics. 

Q. What’s next?
A. There’s a lot of insight that can be gained by understanding the relationships between the well, and the people, equipment and facilities associated with it. Next, we’re planning to add the operational hierarchy. For example, we’ll be able to identify which production engineer, reservoir engineer and foreman are working on a particular well.

We’ve also started gathering business requirements for equipment and facilities to be tied to each well. There’s a lot more business value on the horizon as the company streamlines their well information lifecycle and the valuable relationships around the well.

If you missed the webinar, you can watch the replay now: Attention E&P Executives: Streamlining the Well Information Lifecycle.

Posted in Business Impact / Benefits, Data Integration, Data Quality, Enterprise Data Management, Master Data Management, Operational Efficiency, PowerCenter, Utilities & Energy

Don’t Rely on CRM as Your Single Source of Trusted Customer Data

Step 1: Determine if you have a customer data problem

A statement I often hear from marketing and sales leaders unfamiliar with the concept of mastering customer data is, “My CRM application is our single source of trusted customer data.” They use CRM to onboard new customers, collecting addresses, phone numbers and email addresses. They append a DUNS number. So it’s no surprise they may expect they can master their customer data in CRM. (To learn more about the basics of managing trusted customer data, read this: How much does bad data cost your business?)

It may seem logical to expect your CRM investment to be your customer master – especially since so many CRM vendors promise a “360 degree view of your customer.” But you should only consider your CRM system as the source of truth for trusted customer data if:


  • You have only a single instance of Salesforce.com, Siebel CRM, or other CRM
  • You have only one sales organization (vs. distributed across regions and LOBs)
  • Your CRM manages all customer-focused processes and interactions (marketing, service, support, order management, self-service, etc.)
  • The master customer data in your CRM is clean, complete, fresh, and free of duplicates

Unfortunately, most mid-to-large companies cannot claim such simple operations. For most large enterprises, CRM never delivered on that promise of a trusted 360-degree customer view. That’s what prompted Gartner analysts Bill O’Kane and Kimberly Collins to write the February 2014 report, MDM is Critical to CRM Optimization.

“The reality is that the vast majority of the Fortune 2000 companies we talk to are complex,” says Christopher Dwight, who leads a team of master data management (MDM) and product information management (PIM) sales specialists for Informatica. Christopher and team spend each day working with retailers, distributors and CPG companies to help them get more value from their customer, product and supplier data. “Business-critical customer data doesn’t live in one place. There’s no clear and simple source. Functional organizations, processes, and systems landscapes are much more complicated. Typically they have multiple selling organizations across business units or regions.”

As an example, listed below are typical functional organizations, and common customer master data-dependent applications they rely upon, to support the lead-to-cash process within a typical enterprise:

  • Marketing: marketing automation, campaign management and customer analytics systems.
  • Ecommerce: e-commerce storefront and commerce applications.
  • Sales: sales force automation and quote management systems.
  • Fulfillment: ERP, shipping and logistics systems.
  • Finance: order management and billing systems.
  • Customer Service: CRM, IVR and case management systems.

The fragmentation of critical customer data across multiple organizations and applications is further exacerbated by the explosive adoption of Cloud applications such as Salesforce.com and Marketo. Merger and acquisition (M&A) activity is common among larger organizations, where additional legacy customer applications must be onboarded and reconciled. Suddenly your customer data challenge grows exponentially.

Step 2: Measure how customer data fragmentation impacts your business

Ask yourself: if your customer data is inaccurate, inconsistent and disconnected, can you:

Customer data is fragmented across multiple applications used by business units, product lines, functions and regions.


  • See the full picture of a customer’s relationship with the business across business units, product lines, channels and regions?
  • Better understand and segment customers for personalized offers, improving lead conversion rates and boosting cross-sell and up-sell success?
  • Deliver an exceptional, differentiated customer experience?
  • Leverage rich sources of 3rd party data as well as big data such as social, mobile and sensor data to enrich customer insights?

“One company I recently spoke with was having a hard time creating a single consolidated invoice for each customer that included all the services purchased across business units,” says Dwight. “When they investigated, they were shocked to find that 80% of their consolidated invoices contained errors! The root cause was inaccurate, inconsistent and disconnected customer data. This was a serious business problem costing the company a lot of money.”

Let’s do a quick test right now. Are any of these companies your customers: GE, Coke, Exxon, AT&T or HP? Do you know the legal company names for any of these organizations? Most people don’t. I’m willing to bet there are at least a handful of variations of these company names, such as Coke, Coca-Cola, The Coca Cola Company, etc., in your CRM application. Chances are there are dozens of variations across the numerous applications where business-critical customer data lives, and these customer profiles are tied to transactions. That’s hard to clean up. You can’t simply merge the records, because you need to maintain the transaction history and audit history.
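The name-variation problem can be made concrete with a toy normalization pass. This is a hypothetical sketch, not a recommended matching algorithm; the stop-word list is an assumption for the example.

```python
# Toy sketch of the company-name-variation problem: strip legal-entity
# noise before comparing, rather than matching raw CRM strings.
# The LEGAL_NOISE list is an illustrative assumption, not a standard.
import re

LEGAL_NOISE = {"the", "company", "co", "corp", "corporation", "inc", "llc"}

def normalize(name):
    """Lowercase, split on punctuation, and drop legal-entity noise words."""
    tokens = re.split(r"[^a-z0-9]+", name.lower())
    return " ".join(t for t in tokens if t and t not in LEGAL_NOISE)

variants = ["Coke", "Coca-Cola", "The Coca Cola Company", "Coca-Cola Co."]
print({v: normalize(v) for v in variants})
```

Three of the four variants collapse to the same key, but "Coke" still stands apart: connecting nicknames and aliases to legal names takes the kind of probabilistic matching and reference data an MDM hub provides, which is exactly why this is hard to clean up inside CRM.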

The same holds true for B2C customers. In fact, I’m a nightmare for a large marketing organization. I get multiple offers and statements addressed to different versions of my name: Jakki Geiger, Jacqueline Geiger, Jackie Geiger and J. Geiger. But my personal favorite is when I get an offer from a company I do business with addressed to “Resident”. Why don’t they know I live here? They certainly know where to find me when they bill me!

Step 3: Transform how you view, manage and share customer data

Why do so many businesses that try to master customer data in CRM fail? Let’s be frank. CRM systems such as Salesforce.com and Siebel CRM were purpose built to support a specific set of business processes, and for the most part they do a great job. But they were never built with a focus on mastering customer data for the business beyond the scope of their own processes.

But perhaps you disagree with everything discussed so far. Or you’re a risk-taker and want to take on the challenge of bringing all master customer data that exists across the business into your CRM app. Be warned, you’ll likely encounter four major problems:

1) Your master customer data in each system has a different data model with different standards and requirements for capture and maintenance. Good luck reconciling them!

2) To be successful, your customer data must be clean and consistent across all your systems, which is rarely the case.

3) Even if you use DUNS numbers, some systems use the global DUNS number; others use a regional DUNS number. Some manage customer data at the legal entity level, others at the site level. How do you connect those?

4) If there are duplicate customer profiles in CRM tied to transactions, you can’t just merge the profiles because you need to maintain the transactional integrity and audit history. In this case, you’re dead on arrival.
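Problem 4 above is worth a concrete illustration. The sketch below, with invented identifiers, shows the non-destructive alternative an MDM hub takes: duplicate source profiles are left in place and linked to one golden record through a cross-reference, so transaction and audit history stays intact.

```python
# Hypothetical illustration (invented ids): duplicates are linked,
# not physically merged, so history keyed on source ids stays valid.

golden = {"master_id": "M-001", "legal_name": "The Coca-Cola Company"}

# Cross-reference: each duplicate source profile points at the golden record.
xref = [
    {"master_id": "M-001", "system": "CRM", "source_id": "crm-88"},
    {"master_id": "M-001", "system": "ERP", "source_id": "erp-1042"},
]

def transactions_for_master(master_id, xref, transactions):
    """Roll up transactions from every source profile linked to a master."""
    linked = {(x["system"], x["source_id"])
              for x in xref if x["master_id"] == master_id}
    return [t for t in transactions if (t["system"], t["account_id"]) in linked]

transactions = [
    {"system": "CRM", "account_id": "crm-88", "amount": 1200},
    {"system": "ERP", "account_id": "erp-1042", "amount": 3400},
    {"system": "ERP", "account_id": "erp-9999", "amount": 10},  # different customer
]
print(transactions_for_master("M-001", xref, transactions))
```

The golden record gives you the single customer view, while the untouched source rows keep transactional integrity and the audit trail, sidestepping the dead-on-arrival merge.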

There is a better way! Customer-centric, data-driven companies recognize these obstacles and they don’t rely on CRM as the single source of trusted customer data. Instead, they are transforming how they view, manage and share master customer data across the critical applications their businesses rely upon. They embrace master data management (MDM) best practices and technologies to reconcile, merge, share and govern business-critical customer data. 

More and more B2B and B2C companies are investing in MDM capabilities to manage customer households and multiple views of customer account hierarchies (e.g. a legal view can be shared with finance, a sales territory view can be shared with sales, or an industry view can be shared with a business unit).

 

Gartner Report, MDM is Critical to CRM Optimization, Bill O’Kane & Kimberly Collins, February 7, 2014.

According to the February 2014 Gartner report, MDM is Critical to CRM Optimization, by analysts Bill O’Kane and Kimberly Collins: “Through 2017, CRM leaders who avoid MDM will derive erroneous results that annoy customers, resulting in a 25% reduction in potential revenue gains.”

Are you ready to reassess your assumptions about mastering customer data in CRM?

Get the Gartner report now: MDM is Critical to CRM Optimization.

Posted in CMO, Customer Acquisition & Retention, Customers, Data Governance, Master Data Management, Mergers and Acquisitions

A Data-Driven Healthcare Culture is Foundational to Delivering Personalized Medicine in Healthcare

According to a recent article in the LA Times, healthcare costs in the United States far exceed those in other countries. For example, heart bypass surgery costs an average of $75,345 in the U.S., compared to $15,742 in the Netherlands and $16,492 in Argentina. Healthcare accounts for 18% of U.S. GDP, and that share is increasing.

Michelle Blackmer is a healthcare industry expert at Informatica. In this interview, she explains why business as usual isn’t good enough anymore. Healthcare organizations are rethinking how they do business in an effort to improve outcomes, reduce costs, and comply with regulatory pressures such as the Affordable Care Act (ACA). Michelle believes a data-driven healthcare culture is foundational to personalized medicine and discusses the importance of clean, safe and connected data in executing a successful transformation.

Q. How is the healthcare industry responding to the rising costs of healthcare?
In response to the rising costs of healthcare, regulatory pressures such as the Affordable Care Act (ACA), and the need to improve patient outcomes at lower costs, the U.S. healthcare industry is transforming from a volume-based to a value-based model. In this new model, healthcare organizations need to invest in delivering personalized medicine.

To appreciate the potential of personalized medicine, think about your own healthcare experience. It’s typically reactive. You get sick, you go to the doctor, the doctor issues a prescription and you wait a couple of days to see if that drug works. If it doesn’t, you call the doctor and she tries another drug. This process is tedious, painful and costly.

Now imagine if you had a chronic disease like depression or cancer. On average, any given prescription drug only works for half of those who take it. Among cancer patients, the rate of ineffectiveness jumps to 75 percent. Anti-depressants are effective in only 62 percent of those who take them.

Organizations like MD Anderson and UPMC aim to put an end to cancer. They are combining scientific research with access to clean, safe and connected data of all types, including genomic data. The insights revealed will empower personalized chemotherapies. Personalized medicine offers customized treatments based on patient history and best practices, and it will transform healthcare delivery. Click on the links to watch videos about their transformational work.

Q. What role does data play in enabling personalized medicine?
Data is foundational to value-based care and personalized medicine. Not just any data will do. It needs to be clean, safe and connected data. It needs to be delivered rapidly across hallways and across networks.

As an industry, healthcare is at a stage where meaningful electronic data is being generated. Now you need to ensure that the data is accessible and trustworthy so that it can be rapidly analyzed. As data is aggregated across the ecosystem and married with financial and genomic data, data quality issues become more obvious. It’s vital that you address the data issues so that people can spend their time analyzing the data to gain insights instead of wading through and manually resolving data quality problems.

The ability to trust data will differentiate leaders from the followers. Leaders will advance personalized medicine because they rely on clean, safe and connected data to:

1) Practice analytics as a core competency
2) Define evidence, deliver best-practice care and personalize medicine
3) Engage patients and collaborate to foster strong, actionable relationships

Take a look at this healthcare eBook for more on this topic: Potential Unlocked: Transforming Healthcare by Putting Information to Work.

Q. What is holding healthcare organizations back from managing their healthcare data like other mission-critical assets?
When you say other mission-critical assets, I think of facilities, equipment, etc. Each of these assets has people and money assigned to manage and maintain them. The healthcare organizations I talk to who are highly invested in personalized medicine recognize that data is mission-critical. They are investing in the people, processes and technology needed to ensure data is clean, safe and connected. The technology includes data integration, data quality and master data management (MDM).

What’s holding other healthcare organizations back is that while they realize they need data governance, they wrongly believe they need to hire big teams of “data stewards” to be successful. In reality, you don’t need to hire a big team. Use the people you already have doing data governance. You may not have made this a formal part of their job description and they might not have data governance technologies yet, but they do have the skillset and they are already doing the work of a data steward.

So while a technology investment is required and you need people who can use the technology, start by formalizing the data stewardship work people are doing already as part of their current job. This way you have people who understand the data, taking an active role in the management of the data and they even get excited about it because their work is being recognized. IT takes on the role of enabling these people instead of having responsibility for all things data.

Q. Can you share examples of how immature information governance is a serious impediment to healthcare payers and providers?
Sure. Without information governance, data is not harmonized across sources, so it is hard to make sense of it. This isn’t a problem when you are one business unit or one department, but when you want a comprehensive view, or a view that incorporates external sources of information, this approach falls apart.

For example, let’s say the cardiology department in a healthcare organization implements a dashboard. The dashboard looks impressive. Then a group of physicians sees the dashboard, points out errors and asks where the information (i.e., diagnosis or attending physician) came from. If you can’t answer these questions or trace the data back to its sources, or if you have data inconsistencies, the dashboard loses credibility. This is how analytics fail to gain adoption and fail to foster innovation.

Q. Can you share examples of what data-driven healthcare organizations are doing differently?
Certainly. While many are just getting started on their journey to becoming data-driven, I’m seeing some inspiring examples, including:

  • Implementing data governance for healthcare analytics. The program and data is owned by the business and enabled by IT and supported by technology such as data integration, data quality and MDM.
  • Connecting information from across the entire healthcare ecosystem including 3rd party sources like payers, state agencies, and reference data like credit information from Equifax, firmographics from Dun & Bradstreet or NPI numbers from the national provider registry.
  • Establishing consistent data definitions and parameters
  • Thinking about the internet of things (IoT) and how to incorporate device data into analysis
  • Engaging patients through non-traditional channels including loyalty programs and social media; tracking this information in a customer relationship management (CRM) system
  • Fostering collaboration by understanding the relationships between patients, providers and the rest of the ecosystem
  • Analyzing data to understand what is working and what is not, so that they can drive out unwanted variations in care

Q. What advice can you give healthcare provider and payer employees who want access to high quality healthcare data?
As with other organizational assets that deliver value—like buildings and equipment—data requires a foundational investment in people and systems to maximize return. In other words, institutions and individuals must start managing their mission-critical data with the same rigor they manage other mission-critical enterprise assets.

Q. Anything else you want to add?
Yes, I wanted to thank our 14 visionary customer executives at data-driven healthcare organizations such as MD Anderson, UPMC, Quest Diagnostics, Sutter Health, St. Joseph Health, Dallas Children’s Medical Center and Navinet for taking time out of their busy schedules to share their journeys toward becoming data-driven at Informatica World 2014.  In our next post, I’ll share some highlights about how they are using data, how they are ensuring it is clean, safe and connected and a few data management best practices. InformaticaWorld attendees will be able to download presentations starting today! If you missed InformaticaWorld 2014, stay tuned for our upcoming webinars featuring many of these examples.

Posted in Business Impact / Benefits, Business/IT Collaboration, Customers, Data Governance, Data Integration, Data Quality, Enterprise Data Management, Healthcare, Informatica World 2014, Master Data Management, Vertical

MDM Day Advice: Connect MDM to a Tangible Business Outcome or You Will Fail

“Start your master data management (MDM) journey knowing how it will deliver a tangible business outcome. Will it help your business generate revenue or cut costs? Focus on the business value you plan to deliver with MDM and revisit it often,” advises Michael Delgado, Information Management Director at Citrix, during his presentation at MDM Day, the InformaticaWorld 2014 pre-conference program. MDM Day focused on driving value from business-critical information and attracted 500 people.

A record 500 people attended MDM Day in Las Vegas


In Ravi Shankar’s recent MDM Day preview blog, Part 2: All MDM, All Day at Pre-Conference Day at InformaticaWorld, he highlights the amazing lineup of master data management (MDM) and product information management (PIM) customer speakers, Informatica experts and our talented partner sponsors.

Here are my MDM Day fun facts and key takeaways:

  • Did you know that every 2 seconds an aircraft with GE engine technology is taking off somewhere in the world?

    Ginny Walker, Chief Enterprise Architect at GE Aviation


    GE Aviation’s Chief Enterprise Architect, Ginny Walker, presented “Operationalizing Critical Business Processes: GE Aviation’s MDM Story.” GE Aviation is a $22 billion company and a leading provider of jet engines, systems and services. Ginny shared the company’s multi-year journey to improve installed-base asset data management. She explained how the combination of data, analytics, and connectivity results in productivity improvements such as cutting the annual fuel bill by up to 2% and reducing delays. The keys to GE Aviation’s analytical MDM success were: 1) tying MDM to business metrics, 2) starting with a narrow scope, and 3) appointing data stewards. Ginny believes that MDM is an enabler for the Industrial Internet and Big Data because it empowers companies to get insights from multiple sources of data.

  • Did you know that EMC has made a $17 billion investment in acquisitions and is integrating more than 70 technology companies?

    Barbara Latulippe, Senior Director, Enterprise Information Management at EMC

    EMC’s Barbara Latulippe, aka “The Data Diva,” is the Senior Director of Enterprise Information Management (EIM). EMC is a $21.7 billion company that has grown through acquisition and has 60,000 employees worldwide. In her presentation, “Formula for Success: EMC MDM Best Practices,” Barbara warns that if you don’t have a data governance program in place, you’re going to have a hard time getting an MDM initiative off the ground. She stressed the importance of building a data governance council and involving the business as early as possible to agree on key definitions such as “customer.” Barbara and her team focused on the financial impact of higher quality data to build a business case for operational MDM. She asked her business counterparts, “Imagine if you could onboard a customer in 3 minutes instead of 15 minutes?”

  • Did you know that Citrix is enabling the mobile workforce by uniting apps, data and services on any device over any network and cloud?


    Michael Delgado, Information Management Director at Citrix

    Citrix’s Information Management Director, Michael Delgado, presented “Citrix MDM Case Study: From Partner 360 to Customer 360.” Citrix is a $2.9 billion Cloud software company that embarked on a multi-domain MDM and data governance journey for channel partner, hierarchy and customer data. Because 90% of the company’s product bookings are fulfilled by channel partners, Citrix started its MDM journey to better understand the total channel partner relationship, to make it easier to do business with Citrix and boost revenue. Once they were successful with partner data, they turned to customer data. They wanted to boost customer experience by understanding the total customer relationship across product lines and regions. Armed with this information, Citrix employees can engage customers in one product renewal process for all products. MDM also helps Citrix’s sales team with white space analysis to identify opportunities to sell more user licenses in existing customer accounts.

  • Did you know Quintiles helped develop or commercialize all of the top 5 best-selling drugs on the market?


    John Poonnen, Director Infosario Data Factory at Quintiles

    Quintiles’ Director of the Infosario Data Factory, John Poonnen, presented “Using Multi-domain MDM to Gain Information Insights: How Quintiles Efficiently Manages Complex Clinical Trials.” Quintiles is the world’s largest provider of biopharmaceutical development and commercial outsourcing services, with more than 27,000 employees. John explained how the company leverages a tailored, multi-domain MDM platform to gain a holistic view of business-critical entities such as investigators, research facilities, clinical studies, study sites and subjects to cut costs, improve quality, improve productivity and meet regulatory and patient needs. “Although information needs to flow throughout the process – it tends to get stuck in different silos and must be manually manipulated to get meaningful insights,” said John. He believes master data is foundational — combining it with other data, capabilities and expertise makes it transformational.

While I couldn’t attend the PIM customer presentations below, I heard they were excellent. I look forward to watching the videos:

  • Crestline/Geiger: Dale Denham, CIO, presented “How Product Information in eCommerce improved Geiger’s Ability to Promote and Sell Promotional Products.”
  • Murdoch’s Ranch and Home Supply: Director of Marketing, Kitch Walker presented, “Driving Omnichannel Customer Engagement – PIM Best Practices.”

I also had the opportunity to speak with some of our knowledgeable and experienced MDM Day partner sponsors. Go to Twitter and search for #MDM and #DataQuality to see their advice on what it takes to successfully kick off and implement an MDM program.

There are more thought-provoking MDM and PIM customer presentations taking place this week at InformaticaWorld 2014. To join or follow the conversation, use #INFA14 #MDM or #INFA14 #PIM.

Posted in Business Impact / Benefits, Business/IT Collaboration, CIO, CMO, Customer Acquisition & Retention, Customers, Data Governance, Data Integration, Data Quality, Enterprise Data Management, Informatica World 2014, Master Data Management, Partners, PIM, Product Information Management

Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data

“Trying to improve the quality of asset data when you don’t have a solid data management infrastructure in place is like trying to save a sinking boat with a bailing bucket,” explained Dean Balog, a senior principal consultant at Noah Consulting, in this webinar, Attention Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data.

Dean Balog from Noah Consulting explains how to improve the quality of mission-critical asset data for asset management / equipment maintenance and regulatory reporting, such as rate case submissions.

Dean has 15 years of experience in information management in the utilities industry. In this interview, Dean and I discuss the top issues facing utility executives and how to improve the quality of mission-critical asset data for asset management / equipment maintenance and regulatory reporting, such as rate case submissions.

Q: Dean, what are the top issues facing utility executives?
A: The first issue is asset management / equipment maintenance. Knowing where to invest precious dollars is critical. Utility executives are engaged in a constant tug of war between two competing priorities: replacing aging infrastructure and regular maintenance.

Q. How are utility executives determining that balance?
A. You need to start with facts – the real costs and reliability information for each asset in your infrastructure. Without it, you are guessing. Basically, it is a data problem. Utility executives should ask themselves these questions:

  • Do we have the ability to capture and combine cost and reliability information from multiple sources? Is it granular enough to be useful?
  • Do we know the maintenance costs of eight-year-old breakers versus three-year-old breakers?
  • Do our meters start failing around the average lifespan? For this example, let us say that is five years. Rather than falling uniformly into that average, do 30% of our meters fail in the first year and the rest last eight years? Those three extra years of life can certainly help out the bottom line.
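The meter question above can be checked with quick arithmetic. The sketch below uses the illustrative 30/70 split and eight-year figure from the question; everything else is an assumption for the example.

```python
# Quick arithmetic for the meter example: compare a uniform five-year
# population with a 30/70 mixture (30% fail at year one, 70% last 8 years).
def mixture_mean(segments):
    """segments: list of (share_of_population, lifespan_in_years) pairs."""
    return sum(share * life for share, life in segments)

uniform = mixture_mean([(1.0, 5)])           # every meter at the 5-year average
skewed = mixture_mean([(0.3, 1), (0.7, 8)])  # the 30/70 split from the text
print(uniform, skewed)                       # roughly 5.0 vs 5.9
```

The two populations have similar average lifespans, but the skewed one implies replacing 30% of meters almost immediately while the rest deliver those three extra years. That is why granular, observed reliability data, not just the average, drives the right capital plan.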

Knowing your data makes all the difference. The right capital investment strategy requires combining performance, reliability, and cost data.

Quotations about data challenges faced by utility companies


Q. Why is it difficult for utility executives to understand the real costs and reliability of assets?
A. I know this does not come as a shock, but most companies do not trust their data. Asset data is often inaccurate, inconsistent, and disconnected. Even the most basic data may not be available. For example, manufacture dates on breakers should be filled in, but they are not. If less than 50% of your breakers have manufacture dates, how can you build a preventative maintenance program? You do not even know what to address first!

A traditional approach to solving this data problem is to do a big data cleanup. You clean the data, and then before you know it, errors creep back in, and the trust in the data you have worked so hard to establish is lost.

I like to illustrate the pain of this issue by using the sinking boat analogy. Data cleanup is like bailing out the water collecting in the bottom of the boat. You think you are solving the problem but more water still seeps into the boat. You cannot stop bailing or you will sink. What you need to do is fix the leaks, and then bail out the boat. But, if you do not lift up your head from bailing long enough to see the leaks and make the right investments, you are fighting a losing battle.

Q. What can utility executives do to improve the quality of asset data?
A. First of all, you need to develop a data governance framework. Going back to the analogy, a data governance framework gives you the structure to find the leaks, fix the leaks, and monitor how much of the water has been bailed out. If the water level is still rising, you have not fixed all the leaks. But having a data governance framework is not the be-all and end-all.

You also need to appoint data stewards to be accountable for establishing and maintaining high quality asset data. The job of a data steward would be easy if there was only one system where all asset data resided. But the fact of the matter is that asset data is fragmented – scattered across multiple systems. Data stewards have a huge responsibility and they need to be supported by a solid data management infrastructure to ease the burden of managing business-critical asset information.


If you are responsible for asset management / equipment maintenance or regulatory reporting, particularly rate case submissions, check out this webinar.

Master Data Management (MDM) ensures business-critical asset data is consistent everywhere by pulling together data that is scattered across multiple applications. It manages and masters it in a central location on a continuous basis and shares it with any applications that need that data. MDM provides a user interface and workflow for data stewards to manage the tangled web of names and IDs these assets are known by across systems. It also gives utilities a disciplined approach to manage important relationships between the asset data, such as an asset’s performance reliability and its cost.

Q. Any other pressing issues facing utilities?
A. Yes. Another big issue is tightening regulations that consume investment dollars and become key inputs into rate case submissions and defenses. One of the complicating factors is the number of regulations is not only increasing, but the regulators are also requiring faster implementation times than ever before. So, utilities cannot just do what they have done in the past: throw more people at the problem in the short-term and resolve to fix it later by automating it “when things slow down.” That day never comes.

Q. How can utilities deal with these regulatory pressures?
A. Utilities need a new approach to deal with regulations. Start with the assumption that all data is fair game for regulators. All data must be accessible. You need to be able to report on it, not only to comply with regulations, but for competitive advantage. This requires the high quality asset information we talked about earlier, and an analytical application to:

  • Perform what-if analyses for your asset investment program;
  • Develop regulatory compliance or environmental reports quickly, because the hard work (integrating the data within your MDM program) has already been done; and
  • Get access to granular, observed reliability and cost information using your own utility’s data – not benchmark data that is already a couple of years old and highly summarized.

Q. What is your advice for utility company executives?
A. If you are the one responsible for signing off on regulatory reports and you do not look good in an orange jumpsuit, you need to invest in a plan that includes people, process, and technology to support regulatory reporting and asset management / equipment maintenance.

  • People – Data stewards have clear accountability for the quality of asset data.
  • Process – Data governance is your game plan.
  • Technology – A solid data management infrastructure consisting of data integration, data quality, and master data management is your means.

If you are responsible for asset management / equipment maintenance or regulatory reporting, particularly rate case submissions, check out this webinar: Attention Utility Executives: Don't Waste Millions in Operating Costs Due to Bad Asset Data.

Our panel of utility data experts:

  • Reveal the five toughest business challenges facing utility industry executives;
  • Explain how bad asset data could be costing you millions of dollars in operating costs;
  • Share three best practices for optimizing asset management / equipment maintenance and regulatory reporting with accurate, consistent, and connected asset information; and
  • Show you how to implement these best practices with a demonstration.

 


How Much Does Bad Data Cost Your Business?

Bad data is bad for business. Ovum Research reported that poor quality data is costing businesses at least 30% of revenues. Never before have business leaders across such a broad range of roles recognized the importance of using high quality information to drive business success. Leaders in functions ranging from marketing and sales to risk management and compliance have invested in world-class applications, Six Sigma processes, and the most advanced predictive analytics. So why are you not seeing more return on that investment? Simply put, if your business-critical data is a mess, the rest doesn't matter.

Not all business leaders know there’s a better way to manage their business-critical data. So, I asked Dennis Moore, the senior vice president and general manager of Informatica’s MDM business, who clocked hundreds of thousands of airline miles last year visiting business leaders around the world, to talk about the impact of using accurate, consistent and connected data and the value business leaders can gain through master data management (MDM).

Q. Why are business leaders focusing on business-critical data now?
A. Leaders have always cared about their business-critical data, the master data on which their enterprises depend most — their customers, suppliers, the products they sell, the locations where they do business, the assets they manage, the employees who make the business perform. Leaders see the value of having a clear picture, or “best version of the truth,” describing these “master data” entities. But, this is hard to come by with competing priorities, mergers and acquisitions and siloed systems.

As companies grow, business leaders start realizing there is a huge gap between what they do know and what they should know about their customers, suppliers, products, assets and employees. Even worse, most businesses have lost their ability to understand the relationships between business-critical data so they can improve business outcomes. Line of business leaders have been asking questions such as:

  • How can we optimize sales across channels when we don’t know which customers bought which products from which stores, sites or suppliers?
  • How can we quickly execute a recall when we don’t know which supplier delivered a defective part to which factory and where those products are now?
  • How can we accelerate time-to-market for a new drug, when we don’t know which researcher at which site used which combination of compounds on which patients?
  • How can we meet regulatory reporting deadlines, when we don’t know which model of a product we manufactured in which lot on which date?

Q. What is the crux of the problem?
A. The crux of the problem is that as businesses grow, their business-critical data becomes fragmented. There is no big picture because it’s scattered across applications, including on premise applications (such as SAP, Oracle and PeopleSoft) and cloud applications (such as Salesforce, Marketo, and Workday). But it gets worse. Business-critical data changes all the time. For example,

  • a customer moves, changes jobs, gets married, or changes their purchasing habits;
  • a supplier moves, goes bankrupt, or acquires a competitor;
  • you discontinue a product or launch a new one; or
  • you onboard a new asset or retire an old one.

As all this change occurs, business-critical data becomes inconsistent, and no one knows which application has the most up-to-date information. This costs companies money. It saps productivity and forces people to do a lot of manual work outside their best-in-class processes and world-class applications. One question I always ask business leaders is, “Do you know how much bad data is costing your business?”

Q. What can business leaders do to deal with this issue?
A. First, find out where bad data is having the most significant impact on the business. It’s not hard – just about any employee can share stories of how bad data led to a lost sale, an extra “truck roll,” lost leverage with suppliers, or a customer service problem. From the call center to the annual board planning meeting, bad data results in sub-optimal decisions and lost opportunities. Work with your line of business partners to reach a common understanding of where an improvement can really make a difference. Bad master data is everywhere, but bad master data that has material costs to the business is a much more pressing and constrained problem. Don’t try to boil the ocean or bring a full-blown data governance maturity level 5 approach to your organization if it’s not already seeing success from better data!

Second, focus on the applications and processes used to create, share, and use master data. Many times, some training, a tweak to a process, or a new interface can be created between systems, resulting in very significant improvements for the users without major IT work or process changes.

Lastly, look for a technology that is purpose-built to deal with this problem.  Master data management (MDM) helps companies better manage business-critical data in a central location on an ongoing basis and then share that “best version of the truth” with all on premise and cloud applications that need it.

Let's use customer data as an example. If valuable customer data is located in applications such as Salesforce, Marketo, Siebel CRM, and SAP, MDM brings together all the business-critical data, the core that's the same across all those applications, and creates the "best version of the truth." It also creates the total customer relationship view across functions, product lines and regions, which CRM promised but never delivered.
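One common way a hub computes that "best version of the truth" is field-level survivorship: for each attribute, keep the non-null value from the most trusted source. The sketch below is a minimal illustration under assumed trust rankings and sample records, not a description of Informatica's actual match-and-merge rules.

```python
# Hypothetical field-level survivorship merge. The trust ranking and
# the sample source records are assumptions for illustration only.

TRUST = {"SAP": 3, "Salesforce": 2, "Marketo": 1}  # higher rank wins

records = [
    {"source": "Salesforce", "name": "Jakki Geiger",
     "email": "jg@example.com", "phone": None},
    {"source": "SAP", "name": "Jacqueline Geiger",
     "email": None, "phone": "+1-650-555-0100"},
    {"source": "Marketo", "name": "J. Geiger",
     "email": "jgeiger@example.com", "phone": None},
]

def golden_record(recs):
    """Merge source records field by field, preferring trusted sources."""
    fields = {k for r in recs for k in r if k != "source"}
    best = {}
    for field in fields:
        candidates = [r for r in recs if r.get(field) is not None]
        if candidates:
            winner = max(candidates, key=lambda r: TRUST[r["source"]])
            best[field] = winner[field]
    return best

golden = golden_record(records)
# SAP supplies name and phone; Salesforce supplies email (SAP has none)
```

The merged record can then be shared back with every connected application, so each one sees the same consolidated view.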

MDM then shares that “mastered” customer data and the total customer relationship view with the applications that want it. MDM can be used to master the relationships between customers, such as legal entity hierarchies. This helps sales and customer service staff be more productive, while also improving legal compliance and management decision making. Advanced MDM products can also manage relationships across different types of master data. For example, advanced MDM enables you to relate an employee to a project to a contract to an asset to a commission plan. This ensures accurate and timely billing, effective expense management, managed supplier spend, and even improved workforce deployment.

When your sales team has the best possible customer information in Salesforce and the finance team has the best possible customer information in SAP, no one wastes time pulling together spreadsheets of information outside of their world-class applications. Your global workforce doesn't waste time investigating whether Jacqueline Geiger in one system and Jakki Geiger in another are one customer or two, sending multiple bills and marketing offers at needless cost in postage and customer satisfaction. All employees who have access to mastered customer information can be confident they have the best possible customer information available across the organization to do their jobs. And with the most advanced and intelligent data platform, all this information can be secured so only authorized employees, partners, and systems have access.
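A duplicate such as Jacqueline vs. Jakki Geiger is caught by a match rule. The toy rule below uses the Python standard library's difflib as a stand-in similarity measure; real MDM match engines apply weighted, probabilistic rules across many attributes, and the threshold and fields here are illustrative assumptions.

```python
# Toy duplicate-detection rule for the Jacqueline/Jakki case, using the
# standard library's SequenceMatcher as a simple similarity measure.
from difflib import SequenceMatcher

def name_similarity(a, b):
    """Return a 0..1 similarity score between two names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def likely_same_customer(rec_a, rec_b, threshold=0.6):
    # Require a shared attribute (postal code) plus a fuzzy name match
    return (rec_a["zip"] == rec_b["zip"]
            and name_similarity(rec_a["name"], rec_b["name"]) >= threshold)

a = {"name": "Jacqueline Geiger", "zip": "94063"}
b = {"name": "Jakki Geiger", "zip": "94063"}
assert likely_same_customer(a, b)  # flagged as a probable match
```

In practice, candidate pairs above the threshold would typically be queued for a data steward to review rather than merged automatically.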

Q. Which industries stand to gain the most from mastering their data?
A. In every industry there is some transformation going on that’s driving the need to know people, places and things better. Take insurance for example. Similar to the transformation in the travel industry that reduced the need for travel agents, the insurance industry is experiencing a shift from the agent/broker model to a more direct model. Traditional insurance companies now have an urgent need to know their customers so they can better serve them across all channels and across multiple lines of business.

In other industries, there is an urgent need to get a lot better at supply-chain management or to accelerate new product introductions  to compete better with an emerging rival. Business leaders are starting to make the connection between transformation failures and a more critical need for the best possible data, particularly in industries undergoing rapid transformation, or with rapidly changing regulatory requirements.

Q. Which business functions seem most interested in mastering their business-critical data?
A. It varies by industry, but there are three common threads that seem to span most industries:

  • MDM can help the marketing team optimize the cross-sell and up-sell process with high quality data about customers, their households or company hierarchies, the products and services they have purchased through various channels, and the interactions their organizations have had with these customers.
  • MDM can help the procurement team optimize strategic sourcing, including supplier spend management and supplier risk management, with high quality data about suppliers, company hierarchies, contracts and the products they supply.
  • MDM can help the compliance teams manage all the business-critical data they need to create regulatory reports on time without burning the midnight oil.

Q. How is the use of MDM evolving?
A. When MDM technology was first introduced a decade ago, it was used as a filter. It cleaned up business-critical data on its way to the data warehouse so you'd have clean, consistent, and connected information ("conformed dimensions") for reporting. Now business leaders are investing in MDM technology to ensure that all of their global employees have access to high quality business-critical data across all applications. They believe high quality data is mission-critical to their operations. High quality data is viewed as the lifeblood of the company and will enable the next frontier of innovation.

Second, many companies mastered data in only one or two domains (customer and product), and used separate MDM systems for each. One system was dedicated to mastering customer data; you may recall the term Customer Data Integration (CDI). Another was dedicated to mastering product data. Because the two systems were siloed and business-critical data about customers and products wasn't connected, they delivered limited business value. Since then, business leaders have questioned this approach, because business problems don't confine themselves to one type of data, such as customer or product, and many of the benefits of mastering data come from other domains, including supplier, chart of accounts, employee and other master or reference data shared across systems.

The relationships between data matter to the business. Knowing which customer bought which product from which store or site is more valuable than just knowing your customer. The business insights you can gain from these relationships are limitless. Over 90% of our customers last year bought MDM because they wanted to master multiple types of data. Our customers value having all types of business-critical data in one system to deliver clean, consistent and connected data to their applications to fuel business success.

One last evolution we’re seeing a lot involves the types and numbers of systems connecting to the master data management system. In the past, there were a small number of operational systems pushing data through the MDM system into a data warehouse used for analytical purposes. Today, we have customers with hundreds of operational systems communicating with each other via an MDM system that has just a few milliseconds to respond, and which must maintain the highest levels of availability and reliability of any system in the enterprise. For example, one major retailer manages all customer information in the MDM system, using the master data to drive real-time recommendations as well as a level of customer service in every interaction that remains the envy of their industry.

Q. Dennis, why should business leaders consider attending MDM Day?
A. Business leaders should consider attending MDM Day at InformaticaWorld 2014 on Monday, May 12, 2014. You can hear first-hand the business value companies are gaining by using clean, consistent and connected information in their operations. We’re excited to have fantastic customers who are willing to share their stories and lessons learned. We have presenters from St. Jude Medical, Citrix, Quintiles and Crestline Geiger and panelists from Thomson Reuters, Accenture, EMC, Jones Lang Lasalle, Wipro, Deloitte, AutoTrader Group, McAfee-Intel, Abbvie, Infoverity, Capgemini, and Informatica among others.

Last year's Las Vegas event, and the events we held in London, New York and São Paulo, were extremely well received. This year's event is packed with even more customer sessions and opportunities to learn and to influence our product road map. MDM Day is one day before InformaticaWorld and is included in the cost of your InformaticaWorld registration. We'd love to see you there!

See the MDM Day Agenda.
