I was recently searching for fishing rods for my 5-year-old son and his friends to use at our neighborhood pond. I know nothing about fishing, so I needed to get educated. First up, a Google search on my laptop at home. Then I jostled between my phone, tablet and laptop, visiting websites, reading descriptions, looking at photos and reading reviews. Offline, I talked to friends and visited local stores.
This blog post initially appeared on CMSwire.com and is reblogged here with their consent.
The product descriptions weren’t very helpful. What is a “practice casting plug”? Turns out, this was a great feature! Instead of a hook, the rod had a rubber fish to practice casting safely. What a missed opportunity for the retailers who didn’t share this information. I bought the fishing rods from the retailer that educated me with valuable product information and offered free three to five day shipping.
What does this mean for companies who sell products across multiple channels?
Virtually everyone is a cross-channel shopper: 95 percent of consumers frequently or at least occasionally shop a retailer’s website and store, according to the “Omni-Channel Insights” study by CFI Group. In the report, “The Omnichannel Opportunity: Unlocking the Power of the Connected Customer,” Deloitte predicts more than 50 percent of in-store purchases will be influenced digitally by the end of 2014.
Because of all this cross-channel activity, a new term is trending: omnichannel.
What Does Omnichannel Mean?
Let’s take a look back in time. Retailers started with one channel — the brick-and-mortar store. Then they introduced the catalog and call center. Then they built another channel — e-Commerce. Instead of making it an extension of the brick-and-mortar experience, many implemented an independent strategy, including operations, resources, technology and inventory. Retailers recently started integrating brick-and-mortar and e-Commerce channels, but it’s not always consistent. And now they are building another channel — mobile sites and apps.
Multichannel is a retailer-centric, transaction-focused view of operations. Each channel operates and aims to boost sales independently. Omnichannel is a customer-centric view. The goal is to understand through which channels customers want to engage at each stage of the shopping journey and enable a seamless, integrated and consistent brand experience across channels and devices.
Shoppers expect an omnichannel experience, but delivering it efficiently isn’t easy. Those responsible for enabling an omnichannel experience are encountering barriers. Let’s look at the three barriers most relevant for marketing, merchandising, sales, customer experience and information management leaders.
Barrier #1: Shift from product-centric to customer-centric view
Many retailers focus on how many products are sold by channel. Three key questions are:
- How can we drive store sales growth?
- How can we drive online sales growth?
- What’s our mobile strategy?
This is the old way of running a retail business. The new way is analyzing customer data to understand how they are engaging and transacting across channels.
Why is this difficult? At the Argyle eCommerce Leadership Forum, Jason Allen, Vice President of Multichannel at GameStop Corp., shared the $8.8 billion video game retailer’s approach to overcoming this barrier. While online represented 3 percent of sales, no one had measured how much the online channel was influencing the overall business.
They started by collecting customer data for analytics to find out who their customers were and how they interacted with GameStop online and in 6,600 stores across 15 countries. The analysis revealed customers used multiple channels: 60 percent engaged on the web, and 26 percent of web visitors who didn’t buy online bought in-store within 48 hours.
This insight changed the perception of the online channel as a small contributor. Now they use two metrics to measure performance. While the online channel delivers 3 percent of sales, it influences 22 percent of overall business.
Take Action: Start collecting customer data. Analyze it. Learn who your customers are. Find out how they engage and transact with your business across channels.
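As a rough illustration of the kind of analysis GameStop ran, here is a hypothetical sketch of the 48-hour cross-channel influence metric. The data and the simple one-visit-per-customer model are invented for illustration; a real analysis would join full clickstream and POS histories.

```python
# Hypothetical sketch: of web visitors who did NOT buy online, how many
# bought in a store within 48 hours of their visit? Data is invented.
from datetime import datetime, timedelta

web_visits = {              # customer_id -> last web visit (no online purchase)
    "c1": datetime(2014, 6, 1, 10),
    "c2": datetime(2014, 6, 1, 12),
    "c3": datetime(2014, 6, 2, 9),
}
store_purchases = {         # customer_id -> in-store purchase time
    "c1": datetime(2014, 6, 2, 15),   # 29 hours later -> influenced
    "c3": datetime(2014, 6, 5, 9),    # 72 hours later -> outside window
}

window = timedelta(hours=48)
influenced = sum(
    1 for cid, visit in web_visits.items()
    if cid in store_purchases and store_purchases[cid] - visit <= window
)
rate = influenced / len(web_visits)
print(f"web-influenced in-store conversion: {rate:.0%}")  # -> 33%
```

The point of the metric is that it requires joining web and store data on a common customer identity, which is exactly what a channel-siloed view cannot do.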
Barrier #2: Shift from fragmented customer data to centralized customer data everyone can use
Nikki Baird, Managing Partner at Retail Systems Research (RSR), told me she believes the fundamentals of retail are changing from “right product, right price, right place, right time” to:
- Who is my customer?
- What are they trying to accomplish?
- How can we help?
According to RSR, creating a consistent customer experience remains the most valued capability for retailers, but 54 percent indicated their biggest inhibitor was not having a single view of the customer across channels.
Why is this difficult? A $12 billion specialty retailer known for its relentless focus on customer experience, with 200 stores and an online channel, had to overcome this barrier. To deliver a high-touch omnichannel experience, they needed to replace the many views of the customer with one unified customer view. They invested in master data management (MDM) technology and competencies.
Now they bring together customer, employee and product data scattered across 30 applications (e.g., e-Commerce, POS, clienteling, customer service, order management) into a central location, where it’s managed and shared on an ongoing basis. Employees’ applications are fueled with clean, consistent and connected customer data. They are able to deliver a high-touch omnichannel experience because they can answer important questions about customers and their valuable relationships, such as:
- Who is this customer and who’s in their household?
- Who do they buy for, what do they buy, where do they buy?
- Which employees do they typically buy from in store?
Take Action: Think of the valuable information customers share when they interact with different parts of your business. Tap into it by bridging customer information silos. Bring fragmented customer information together in one central location. Make it universally accessible. Don’t let it remain locked up in departmental applications. Keep it up-to-date. Automate the process of updating customer information across departmental applications.
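To make that take-action list concrete, here is a minimal sketch of consolidating fragmented records into one golden record. The field names, the email-based match rule, and the "later sources fill gaps" survivorship rule are illustrative assumptions, not a full MDM implementation.

```python
# Minimal sketch: consolidate customer records from departmental
# applications into one "golden record" per customer. The match key
# (normalized email) and field names are illustrative assumptions.
from collections import defaultdict

def normalize_email(email):
    return email.strip().lower()

def consolidate(records):
    """Group records by match key and merge, preferring non-empty values."""
    grouped = defaultdict(list)
    for rec in records:
        grouped[normalize_email(rec["email"])].append(rec)

    golden = []
    for recs in grouped.values():
        merged = {}
        for rec in recs:  # later sources fill gaps left by earlier ones
            for field, value in rec.items():
                if value and not merged.get(field):
                    merged[field] = value
        merged["source_count"] = len(recs)
        golden.append(merged)
    return golden

crm = {"email": "Pat@Example.com", "name": "Pat Smith", "phone": ""}
pos = {"email": "pat@example.com", "name": "", "phone": "555-0100"}
print(consolidate([crm, pos]))  # one record, name and phone both filled
```

Even this toy version shows why centralization pays off: neither source system alone has the complete profile, but the merged record does.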
Barrier #3: Shift from fragmented product data to centralized product data everyone can use
Two-thirds of purchase journeys start with a Google search. To have a fighting chance, retailers need rich and high quality product information to rank higher than the competition.
Imagine a product listing with a thin description and no helpful details. Would you buy this product? Probably not. One-third of shoppers who don’t make a purchase say they didn’t have enough information to make a purchase decision. What product information does a shopper need to convert in the moment? Rich, high-quality information has conversion power.
Consumers return about 40 percent of all fashion purchases and 15 percent of electronics purchases. That’s not good for retailers or shoppers. Minimize costly returns with complete product information so shoppers can make more informed purchase decisions. Jason Allen’s advice: “Focus less on the cart and checkout. Focus more on search, product information and your store locator. Eighty percent of customers are coming to the web for research.”
Why is this difficult? Crestline is a multichannel direct marketing firm selling promotional products through direct mail and e-Commerce. The barrier to quickly bringing products to market and updating product information across channels was fragmented and complex product information. To replace the manual, time consuming spreadsheet process to manage product information, they invested in product information management (PIM) technology.
Now Crestline’s product introduction and update process is 300 percent more efficient. Because they are 100 percent current on top products and over 50 percent current for all products, the company is boosting margins and customer service.
Take Action: Think about all the product information shoppers need to research and make a decision. Tap into it by bridging product information silos. Bring fragmented product information together in one central location. Make it universally usable, not channel-specific. Keep it up-to-date. Automate the process of publishing product information across channels, including the applications used by customer service and store associates.
Delivering an omnichannel experience efficiently isn’t easy. The GameStop team collected and analyzed customer data to learn more about who their customers are and how they interact with the company. A specialty retailer centralized fragmented customer data. Crestline centralized product information to accelerate its ability to bring products to market and make updates across channels. Which of these barriers are holding you back from delivering an omnichannel experience?
“Inaccurate, inconsistent and disconnected supplier information prohibits us from doing accurate supplier spend analysis, leveraging discounts, comparing and choosing the best prices, and enforcing corporate standards.”
This is a quotation from a manufacturing company executive. It illustrates the negative impact that poorly managed supplier information can have on a company’s ability to cut costs and achieve revenue targets.
Many supply chain and procurement teams at large companies struggle to see the total relationship they have with suppliers across product lines, business units and regions. Why? Supplier information is scattered across dozens or hundreds of Enterprise Resource Planning (ERP) and Accounts Payable (AP) applications. Too much valuable time is spent manually reconciling inaccurate, inconsistent and disconnected supplier information in an effort to see the big picture. All this manual effort results in back office administrative costs that are higher than they should be.
Do these quotations from supply chain leaders and their teams sound familiar?
“We have 500,000 suppliers. 15-20% of our supplier records are duplicates. 5% are inaccurate.”
“I get 100 e-mails a day questioning which supplier to use.”
“To consolidate vendor reporting for a single supplier between divisions is really just a guess.”
“Every year 1099 tax mailings get returned to us because of invalid addresses, and we pay a lot of Schedule B fines to the IRS.”
“Two years ago we spent a significant amount of time and money cleansing supplier data. Now we are back where we started.”
Please join me and Naveen Sharma, Director of the Master Data Management (MDM) Practice at Cognizant, for a Webinar, Supercharge Your Supply Chain Applications with Better Supplier Information, on Tuesday, July 29th at 11 am PT.
During the Webinar, we’ll explain how better managing supplier information can help you achieve the following goals:
- Accelerate supplier onboarding
- Mitigate the risk of supply disruption
- Better manage supplier performance
- Streamline billing and payment processes
- Improve supplier relationship management and collaboration
- Make it easier to evaluate non-compliance with Service Level Agreements (SLAs)
- Decrease costs by negotiating favorable payment terms and SLAs
I hope you can join us for this upcoming Webinar!
“Not only do we underestimate the cost for projects up to 150%, but we overestimate the revenue it will generate.” This quotation from an Energy & Petroleum (E&P) company executive illustrates the negative impact of inaccurate, inconsistent and disconnected well data and asset data on revenue potential.
“Operational Excellence” is a common goal of many E&P company executives pursuing higher growth targets. But, inaccurate, inconsistent and disconnected well data and asset data may be holding them back. It obscures the complete picture of the well information lifecycle, making it difficult to maximize production efficiency, reduce Non-Productive Time (NPT), streamline the oilfield supply chain, calculate well by-well profitability, and mitigate risk.
To explain how E&P companies can better manage well data and asset data, we hosted a webinar, “Attention E&P Executives: Streamlining the Well Information Lifecycle.” Our well data experts Stephanie Wilkin, Senior Principal Consultant at Noah Consulting, and Stephan Zoder, Director of Value Engineering at Informatica shared some advice. E&P companies should reevaluate “throwing more bodies at a data cleanup project twice a year.” This approach does not support the pursuit of operational excellence.
In this interview, Stephanie shares details about the award-winning collaboration between Noah Consulting and Devon Energy to create a single trusted source of well data, which is standardized and mastered.
Q. Congratulations on winning the 2014 Innovation Award, Stephanie!
A. Thanks Jakki. It was really exciting working with Devon Energy. Together we put the technology and processes in place to manage and master well data in a central location and share it with downstream systems on an ongoing basis. We were proud to win the 2014 Innovation Award for Best Enterprise Data Platform.
Q. What was the business need for mastering well data?
A. As E&P companies grow so do their needs for business-critical well data. All departments need clean, consistent and connected well data to fuel their applications. We implemented a master data management (MDM) solution for well data with the goals of improving information management, business productivity, organizational efficiency, and reporting.
Q. How long did it take to implement the MDM solution for well data?
A. The Devon Energy project kicked off in May of 2012. Within five months we built the complete solution from gathering business requirements to development and testing.
Q. What were the steps in implementing the MDM solution?
A: The first and most important step was securing buy-in on a common definition for master well data or Unique Well Identifier (UWI). The key was to create a definition that would meet the needs of various business functions. Then we built the well master, which would be consistent across various systems, such as G&G, Drilling, Production, Finance, etc. We used the Professional Petroleum Data Management Association (PPDM) data model and created more than 70 unique attributes for the well, including Lahee Class, Fluid Direction, Trajectory, Role and Business Interest.
As part of the original go-live, we had three source systems of well data and two target systems connected to the MDM solution. Over the course of the next year, we added three additional source systems and four additional target systems. We did a cross-system analysis to make sure every department has the right wells and the right data about those wells. Now the company uses MDM as the single trusted source of well data, which is standardized and mastered, to do analysis and build reports.
Q. What’s been the traditional approach for managing well data?
A. Typically, when a new well is created, employees spend time entering well data into their own systems. For example, one person enters well data into the G&G application. Another person enters the same well data into the Drilling application. A third person enters the same well data into the Finance application. On average, it takes about 30 minutes to enter a single well into a particular financial application.
So imagine if you need to add 500 new wells to your systems. This is common after a merger or acquisition. That translates to roughly 250 hours, or 6.25 weeks, of employee time saved on the well-creation process! By automating across systems, you not only save time, you eliminate redundant data entry and possible errors in the process.
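The arithmetic behind that estimate is easy to verify, assuming a 40-hour work week:

```python
# Sanity-check the estimate: 500 wells x 30 minutes of redundant data
# entry, converted to hours and then to 40-hour work weeks.
wells = 500
minutes_per_well = 30
hours = wells * minutes_per_well / 60   # 250.0 hours
weeks = hours / 40                      # 6.25 weeks
print(f"{hours:.0f} hours, {weeks} weeks")  # -> 250 hours, 6.25 weeks
```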
Q. That sounds like a painfully slow and error-prone process.
A. It is! But that’s only half the problem. Without a single trusted source of well data, how do you get a complete picture of your wells? When you compare the well data in the G&G system to the well data in the Drilling or Finance systems, it’s typically inconsistent and difficult to reconcile. This leads to the question, “Which one of these systems has the best version of the truth?” Employees spend too much time manually reconciling well data for reporting and decision-making.
Q. So there is a lot to be gained by better managing well data.
A. That’s right. The CFO typically loves the ROI on a master well data project. It’s a huge opportunity to save time and money, boost productivity and get more accurate reporting.
Q: What were some of the business requirements for the MDM solution?
A: We couldn’t build a solution that was narrowly focused on meeting the company’s needs today. We had to keep the future in mind. Our goal was to build a framework that was scalable and supportable as the company’s business environment changed. This allows the company to add additional data domains or attributes to the well data model at any time.
Q: Why did you choose Informatica MDM?
A: The decision to use Informatica MDM for the MDM Trust Framework came down to the following capabilities:
- Match and Merge: With Informatica, we get a lot of flexibility. Some systems carry the API number or government well ID, but some don’t. We can match and merge records differently based on the system.
- X-References: We keep a cross-reference between all the systems. We can go back to the master well data and find out where that data came from and when. We can see where changes have occurred because Informatica MDM tracks the history and lineage.
- Scalability: This was a key requirement. While we went live after only 5 months, we’ve been continually building out the well master based on the requirements of the target systems.
- Flexibility: Down the road, if we want to add an additional facet or classification to the well master, the framework allows for that.
- Simple Integration: Instead of building point-to-point integrations, we use the hub model.
In addition to Informatica MDM, our Noah Consulting MDM Trust Framework includes Informatica PowerCenter for data integration, Informatica Data Quality for data cleansing and Informatica Data Virtualization.
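To make the cross-reference and hub ideas concrete, here is a minimal, hypothetical sketch of a hub-style xref table: each source system keeps its own well ID, and the hub maps every source ID to one master ID so lineage can be traced in either direction. The system names and identifiers are illustrative, not Devon Energy's actual data.

```python
# Minimal sketch of a cross-reference ("xref") table in a hub-style MDM
# setup. Each source system keeps its own ID; the hub links them all to
# a single master well ID. IDs and system names are invented.

class XrefHub:
    def __init__(self):
        self.to_master = {}     # (system, source_id) -> master_id
        self.from_master = {}   # master_id -> [(system, source_id), ...]

    def link(self, system, source_id, master_id):
        self.to_master[(system, source_id)] = master_id
        self.from_master.setdefault(master_id, []).append((system, source_id))

    def master_of(self, system, source_id):
        """Resolve a source-system record to its master well."""
        return self.to_master[(system, source_id)]

    def sources_of(self, master_id):
        """Trace a master well back to every contributing source record."""
        return self.from_master[master_id]

hub = XrefHub()
hub.link("G&G", "GG-1042", "WELL-001")
hub.link("Drilling", "DRL-77", "WELL-001")
hub.link("Finance", "FIN-9", "WELL-001")

print(hub.master_of("Drilling", "DRL-77"))  # -> WELL-001
print(hub.sources_of("WELL-001"))           # all three source records
```

This is why a hub avoids point-to-point integrations: each system integrates once with the hub, and the xref answers both "which master well is this?" and "where did this master record come from?"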
Q: Can you give some examples of the business value gained by mastering well data?
A: One person said to me, “I’m so overwhelmed! We’ve never had one place to look at this well data before.” With MDM centrally managing master well data and fueling key business applications, many upstream processes can be optimized to achieve their full potential value.
People spend less time entering well data on the front end and reconciling well data on the back end. Well data is entered once and it’s automatically shared across all systems that need it. People can trust that it’s consistent across systems. Also, because the data across systems is now tied together, it provides business value they were unable to realize before, such as predictive analytics.
Q. What’s next?
A. There’s a lot of insight that can be gained by understanding the relationships between the well, and the people, equipment and facilities associated with it. Next, we’re planning to add the operational hierarchy. For example, we’ll be able to identify which production engineer, reservoir engineer and foreman are working on a particular well.
We’ve also started gathering business requirements for equipment and facilities to be tied to each well. There’s a lot more business value on the horizon as the company streamlines their well information lifecycle and the valuable relationships around the well.
If you missed the webinar, you can watch the replay now: Attention E&P Executives: Streamlining the Well Information Lifecycle.
Step 1: Determine if you have a customer data problem
A statement I often hear from marketing and sales leaders unfamiliar with the concept of mastering customer data is, “My CRM application is our single source of trusted customer data.” They use CRM to onboard new customers, collecting addresses, phone numbers and email addresses. They append a DUNS number. So it’s no surprise they may expect they can master their customer data in CRM. (To learn more about the basics of managing trusted customer data, read this: How much does bad data cost your business?)
It may seem logical to expect your CRM investment to be your customer master – especially since so many CRM vendors promise a “360 degree view of your customer.” But you should only consider your CRM system as the source of truth for trusted customer data if:
· You have only a single instance of Salesforce.com, Siebel CRM, or other CRM
· You have only one sales organization (vs. distributed across regions and LOBs)
· Your CRM manages all customer-focused processes and interactions (marketing, service, support, order management, self-service, etc.)
· The master customer data in your CRM is clean, complete, fresh, and free of duplicates
Unfortunately, most mid-to-large companies cannot claim such simple operations. For most large enterprises, CRM never delivered on that promise of a trusted 360-degree customer view. That’s what prompted Gartner analysts Bill O’Kane and Kimberly Collins to write the report, MDM is Critical to CRM Optimization, in February 2014.
“The reality is that the vast majority of the Fortune 2000 companies we talk to are complex,” says Christopher Dwight, who leads a team of master data management (MDM) and product information management (PIM) sales specialists for Informatica. Christopher and team spend each day working with retailers, distributors and CPG companies to help them get more value from their customer, product and supplier data. “Business-critical customer data doesn’t live in one place. There’s no clear and simple source. Functional organizations, processes, and systems landscapes are much more complicated. Typically they have multiple selling organizations across business units or regions.”
As an example, listed below are typical functional organizations, and common customer master data-dependent applications they rely upon, to support the lead-to-cash process within a typical enterprise:
· Marketing: marketing automation, campaign management and customer analytics systems.
· Ecommerce: e-commerce storefront and commerce applications.
· Sales: sales force automation and quote management systems.
· Fulfillment: ERP, shipping and logistics systems.
· Finance: order management and billing systems.
· Customer Service: CRM, IVR and case management systems.
The fragmentation of critical customer data across multiple organizations and applications is further exacerbated by the explosive adoption of Cloud applications such as Salesforce.com and Marketo. Merger and acquisition (M&A) activity is common among many larger organizations where additional legacy customer applications must be onboarded and reconciled. Suddenly your customer data challenge grows exponentially.
Step 2: Measure how customer data fragmentation impacts your business
Ask yourself: if your customer data is inaccurate, inconsistent and disconnected, can you:
· See the full picture of a customer’s relationship with the business across business units, product lines, channels and regions?
· Better understand and segment customers for personalized offers, improving lead conversion rates and boosting cross-sell and up-sell success?
· Deliver an exceptional, differentiated customer experience?
· Leverage rich sources of third-party data, as well as big data such as social, mobile and sensor data, to enrich customer insights?
“One company I recently spoke with was having a hard time creating a single consolidated invoice for each customer that included all the services purchased across business units,” says Dwight. “When they investigated, they were shocked to find that 80% of their consolidated invoices contained errors! The root cause was inaccurate, inconsistent and disconnected customer data. This was a serious business problem costing the company a lot of money.”
Let’s do a quick test right now. Are any of these companies your customers: GE, Coke, Exxon, AT&T or HP? Do you know the legal company names for any of these organizations? Most people don’t. I’m willing to bet there are at least a handful of variations of these company names, such as Coke, Coca-Cola and The Coca Cola Company, in your CRM application. Chances are there are dozens of variations across the numerous applications where business-critical customer data lives, and those customer profiles are tied to transactions. That’s hard to clean up: you can’t simply merge records, because you need to maintain the transaction history and audit history.
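To see why those name variations defeat naive matching, here is a hypothetical sketch using normalization plus fuzzy string similarity. The suffix list, the stop word, and the use of a simple sequence ratio are assumptions for illustration, not a production matching rule.

```python
# Hypothetical sketch: company-name variations that fail exact equality
# can still be caught by normalizing (case, punctuation, legal suffixes)
# and comparing with a fuzzy similarity score. Not a production rule.
import difflib
import re

LEGAL_SUFFIXES = {"inc", "corp", "co", "company", "ltd", "llc"}

def normalize(name):
    # Lowercase, strip punctuation, drop "the" and common legal suffixes.
    words = re.sub(r"[^a-z0-9 ]", " ", name.lower()).split()
    return " ".join(w for w in words if w not in LEGAL_SUFFIXES and w != "the")

def similarity(a, b):
    return difflib.SequenceMatcher(None, normalize(a), normalize(b)).ratio()

variants = ["Coca-Cola", "The Coca Cola Company", "Coca Cola Co."]
for v in variants:
    print(f"{v!r} vs 'Coca-Cola': {similarity(v, 'Coca-Cola'):.2f}")
    # all three normalize to "coca cola" and score 1.00
```

Real MDM matching engines layer many more techniques (phonetic keys, token weighting, reference data such as DUNS), but even this sketch shows why the work belongs in a dedicated matching layer rather than ad hoc cleanup inside CRM.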
The same holds true for B2C customers. In fact, I’m a nightmare for a large marketing organization. I get multiple offers and statements addressed to different versions of my name: Jakki Geiger, Jacqueline Geiger, Jackie Geiger and J. Geiger. But my personal favorite is when I get an offer from a company I do business with addressed to “Resident”. Why don’t they know I live here? They certainly know where to find me when they bill me!
Step 3: Transform how you view, manage and share customer data
Why do so many businesses that try to master customer data in CRM fail? Let’s be frank. CRM systems such as Salesforce.com and Siebel CRM were purpose built to support a specific set of business processes, and for the most part they do a great job. But they were never built with a focus on mastering customer data for the business beyond the scope of their own processes.
But perhaps you disagree with everything discussed so far. Or you’re a risk-taker and want to take on the challenge of bringing all master customer data that exists across the business into your CRM app. Be warned, you’ll likely encounter four major problems:
1) Your master customer data in each system has a different data model with different standards and requirements for capture and maintenance. Good luck reconciling them!
2) To be successful, your customer data must be clean and consistent across all your systems, which is rarely the case.
3) Even if you use DUNS numbers, some systems use the global DUNS number; others use a regional DUNS number. Some manage customer data at the legal entity level, others at the site level. How do you connect those?
4) If there are duplicate customer profiles in CRM tied to transactions, you can’t just merge the profiles because you need to maintain the transactional integrity and audit history. In this case, you’re dead on arrival.
There is a better way! Customer-centric, data-driven companies recognize these obstacles and they don’t rely on CRM as the single source of trusted customer data. Instead, they are transforming how they view, manage and share master customer data across the critical applications their businesses rely upon. They embrace master data management (MDM) best practices and technologies to reconcile, merge, share and govern business-critical customer data.
More and more B2B and B2C companies are investing in MDM capabilities to manage customer households and multiple views of customer account hierarchies (e.g. a legal view can be shared with finance, a sales territory view can be shared with sales, or an industry view can be shared with a business unit).
In the Gartner report MDM is Critical to CRM Optimization (February 2014), analysts Bill O’Kane and Kimberly Collins predict, “Through 2017, CRM leaders who avoid MDM will derive erroneous results that annoy customers, resulting in a 25% reduction in potential revenue gains.”
Are you ready to reassess your assumptions about mastering customer data in CRM?
Get the Gartner report now: MDM is Critical to CRM Optimization.
According to a recent article in the LA Times, healthcare costs in the United States far exceed those in other countries. For example, heart bypass surgery costs an average of $75,345 in the U.S., compared to $15,742 in the Netherlands and $16,492 in Argentina. In the U.S., healthcare accounts for 18% of GDP, and that share is increasing.
Michelle Blackmer is a healthcare industry expert at Informatica. In this interview, she explains why business as usual isn’t good enough anymore. Healthcare organizations are rethinking how they do business in an effort to improve outcomes, reduce costs, and comply with regulatory pressures such as the Affordable Care Act (ACA). Michelle believes a data-driven healthcare culture is foundational to personalized medicine and discusses the importance of clean, safe and connected data in executing a successful transformation.
Q. How is the healthcare industry responding to the rising costs of healthcare?
In response to the rising costs of healthcare, regulatory pressures (i.e., the Affordable Care Act (ACA)) and the need to improve patient outcomes at lower costs, the U.S. healthcare industry is transforming from a volume-based to a value-based model. In this new model, healthcare organizations need to invest in delivering personalized medicine.
To appreciate the potential of personalized medicine, think about your own healthcare experience. It’s typically reactive. You get sick, you go to the doctor, the doctor issues a prescription and you wait a couple of days to see if that drug works. If it doesn’t, you call the doctor and she tries another drug. This process is tedious, painful and costly.
Now imagine if you had a chronic disease like depression or cancer. On average, any given prescription drug only works for half of those who take it. Among cancer patients, the rate of ineffectiveness jumps to 75 percent. Anti-depressants are effective in only 62 percent of those who take them.
Organizations like MD Anderson and UPMC aim to put an end to cancer. They are combining scientific research with access to clean, safe and connected data (data of all types, including genomic data). The insights revealed will empower personalized chemotherapies. Personalized medicine offers customized treatments based on patient history and best practices, and it will transform healthcare delivery.
Q. What role does data play in enabling personalized medicine?
Data is foundational to value-based care and personalized medicine. Not just any data will do. It needs to be clean, safe and connected data. It needs to be delivered rapidly across hallways and across networks.
As an industry, healthcare is at a stage where meaningful electronic data is being generated. Now you need to ensure that the data is accessible and trustworthy so that it can be rapidly analyzed. As data is aggregated across the ecosystem and married with financial and genomic data, data quality issues become more obvious. It’s vital that you can identify and resolve data issues quickly, so people can spend their time analyzing the data to gain insights instead of wading through and manually fixing data quality problems.
The ability to trust data will differentiate leaders from the followers. Leaders will advance personalized medicine because they rely on clean, safe and connected data to:
1) Practice analytics as a core competency
2) Define evidence, deliver best practice care and personalize medicine
3) Engage patients and collaborate to foster strong, actionable relationships
Take a look at this Healthcare eBook for more on this topic: Potential Unlocked: Transforming Healthcare by Putting Information to Work.
Q. What is holding healthcare organizations back from managing their healthcare data like other mission-critical assets?
When you say other mission-critical assets, I think of facilities, equipment, etc. Each of these assets has people and money assigned to manage and maintain them. The healthcare organizations I talk to who are highly invested in personalized medicine recognize that data is mission-critical. They are investing in the people, processes and technology needed to ensure data is clean, safe and connected. The technology includes data integration, data quality and master data management (MDM).
What’s holding other healthcare organizations back is that while they realize they need data governance, they wrongly believe they need to hire big teams of “data stewards” to be successful. In reality, you don’t need to hire a big team. Use the people you already have doing data governance. You may not have made this a formal part of their job description and they might not have data governance technologies yet, but they do have the skillset and they are already doing the work of a data steward.
So while a technology investment is required and you need people who can use the technology, start by formalizing the data stewardship work people are already doing as part of their current jobs. This way, the people who understand the data take an active role in managing it, and they even get excited because their work is being recognized. IT takes on the role of enabling these people instead of having responsibility for all things data.
Q. Can you share examples of how immature information governance is a serious impediment to healthcare payers and providers?
Sure, without information governance, data is not harmonized across sources and so it is hard to make sense of it. This isn’t a problem when you are one business unit or one department, but when you want to get a comprehensive view or a view that incorporates external sources of information, this approach falls apart.
For example, let’s say the cardiology department in a healthcare organization implements a dashboard. The dashboard looks impressive. Then a group of physicians sees the dashboard, points out errors and asks where the information (i.e., diagnosis or attending physician) came from. If you can’t answer these questions or trace the data back to its sources, or if you have data inconsistencies, the dashboard loses credibility. This is an example of how analytics fail to gain adoption and fail to foster innovation.
Q. Can you share examples of what data-driven healthcare organizations are doing differently?
Certainly, while many are just getting started on their journey to becoming data-driven, I’m seeing some inspiring examples, including:
- Implementing data governance for healthcare analytics. The program and data are owned by the business, enabled by IT, and supported by technology such as data integration, data quality and MDM.
- Connecting information from across the entire healthcare ecosystem, including third-party sources like payers and state agencies, and reference data like credit information from Equifax, firmographics from Dun & Bradstreet, or NPI numbers from the national provider registry.
- Establishing consistent data definitions and parameters
- Thinking about the internet of things (IoT) and how to incorporate device data into analysis
- Engaging patients through non-traditional channels including loyalty programs and social media; tracking this information in a customer relationship management (CRM) system
- Fostering collaboration by understanding the relationships between patients, providers and the rest of the ecosystem
- Analyzing data to understand what is working and what is not working so that they can drive out unwanted variations in care
Q. What advice can you give healthcare provider and payer employees who want access to high quality healthcare data?
As with other organizational assets that deliver value—like buildings and equipment—data requires a foundational investment in people and systems to maximize return. In other words, institutions and individuals must start managing their mission-critical data with the same rigor they manage other mission-critical enterprise assets.
Q. Anything else you want to add?
Yes, I wanted to thank our 14 visionary customer executives at data-driven healthcare organizations such as MD Anderson, UPMC, Quest Diagnostics, Sutter Health, St. Joseph Health, Dallas Children’s Medical Center and Navinet for taking time out of their busy schedules to share their journeys toward becoming data-driven at Informatica World 2014. In our next post, I’ll share some highlights about how they are using data, how they are ensuring it is clean, safe and connected and a few data management best practices. InformaticaWorld attendees will be able to download presentations starting today! If you missed InformaticaWorld 2014, stay tuned for our upcoming webinars featuring many of these examples.
“Start your master data management (MDM) journey knowing how it will deliver a tangible business outcome. Will it help your business generate revenue or cut costs? Focus on the business value you plan to deliver with MDM and revisit it often,” advises Michael Delgado, Information Management Director at Citrix during his presentation at MDM Day, the InformaticaWorld 2014 pre-conference program. MDM Day focused on driving value from business-critical information and attracted 500 people.
In Ravi Shankar’s recent MDM Day preview blog, Part 2: All MDM, All Day at Pre-Conference Day at InformaticaWorld, he highlights the amazing lineup of master data management (MDM) and product information management (PIM) customer speakers, Informatica experts as well as our talented partner sponsors.
Here are my MDM Day fun facts and key takeaways:
- Did you know that every 2 seconds an aircraft with GE engine technology is taking off somewhere in the world?
GE Aviation’s Chief Enterprise Architect, Ginny Walker, presented “Operationalizing Critical Business Processes: GE Aviation’s MDM Story.” GE Aviation is a $22 billion company and a leading provider of jet engines, systems and services. Ginny shared the company’s multi-year journey to improve installed-base asset data management. She explained how the combination of data, analytics, and connectivity results in productivity improvements such as reducing up to 2% of the annual fuel bill and reducing delays. The keys to GE Aviation’s analytical MDM success were: 1) tying MDM to business metrics, 2) starting with a narrow scope, and 3) data stewards. Ginny believes that MDM is an enabler for the Industrial Internet and Big Data because it empowers companies to get insights from multiple sources of data.
- Did you know that EMC has made a $17 billion investment in acquisitions and is integrating more than 70 technology companies?
EMC’s Barbara Latulippe, aka “The Data Diva,” is the Senior Director of Enterprise Information Management (EIM). EMC is a $21.7 billion company that has grown through acquisition and has 60,000 employees worldwide. In her presentation, “Formula for Success: EMC MDM Best Practices,” Barbara warns that if you don’t have a data governance program in place, you’re going to have a hard time getting an MDM initiative off the ground. She stressed the importance of building a data governance council and involving the business as early as possible to agree on key definitions such as “customer.” Barbara and her team focused on the financial impact of higher quality data to build a business case for operational MDM. She asked her business counterparts, “Imagine if you could onboard a customer in 3 minutes instead of 15 minutes?”
- Did you know that Citrix is enabling the mobile workforce by uniting apps, data and services on any device over any network and cloud?
Citrix’s Information Management Director, Michael Delgado, presented “Citrix MDM Case Study: From Partner 360 to Customer 360.” Citrix is a $2.9 billion Cloud software company that embarked on a multi-domain MDM and data governance journey for channel partner, hierarchy and customer data. Because 90% of the company’s product bookings are fulfilled by channel partners, Citrix started their MDM journey to better understand their total channel partner relationship to make it easier to do business with Citrix and boost revenue. Once they were successful with partner data, they turned to customer data. They wanted to boost customer experience by understanding the total customer relationship across product lines and regions. Armed with this information, Citrix employees can engage customers in one product renewal process for all products. MDM also helps Citrix’s sales team with white space analysis to identify opportunities to sell more user licenses in existing customer accounts.
- Did you know Quintiles helped develop or commercialize all of the top 5 best-selling drugs on the market?
Quintiles’ Director of the Infosario Data Factory, John Poonnen, presented “Using Multi-domain MDM to Gain Information Insights: How Quintiles Efficiently Manages Complex Clinical Trials.” Quintiles is the world’s largest provider of biopharmaceutical development and commercial outsourcing services with more than 27,000 employees. John explained how the company leverages a tailored, multi-domain MDM platform to gain a holistic view of business-critical entities such as investigators, research facilities, clinical studies, study sites and subjects to cut costs, improve quality, improve productivity and to meet regulatory and patient needs. “Although information needs to flow throughout the process – it tends to get stuck in different silos and must be manually manipulated to get meaningful insights,” said John. He believes master data is foundational — combining it with other data, capabilities and expertise makes it transformational.
While I couldn’t attend the PIM customer presentations below, I heard they were excellent. I look forward to watching the videos:
- Crestline/Geiger: Dale Denham, CIO, presented “How Product Information in eCommerce Improved Geiger’s Ability to Promote and Sell Promotional Products.”
- Murdoch’s Ranch and Home Supply: Director of Marketing, Kitch Walker presented, “Driving Omnichannel Customer Engagement – PIM Best Practices.”
I also had the opportunity to speak with some of our knowledgeable and experienced MDM Day partner sponsors. Go to Twitter and search for #MDM and #DataQuality to see their advice on what it takes to successfully kick-off and implement an MDM program.
There are more thought-provoking MDM and PIM customer presentations taking place this week at InformaticaWorld 2014. To join or follow the conversation, use #INFA14 #MDM or #INFA14 #PIM.
“Trying to improve the quality of asset data when you don’t have a solid data management infrastructure in place is like trying to save a sinking boat with a bailing bucket,” explained Dean Balog, a senior principal consultant at Noah Consulting, in this webinar, Attention Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data.
Dean has 15 years of experience in information management in the utilities industry. In this interview, Dean and I discuss the top issues facing utility executives and how to improve the quality of mission-critical asset data for asset management / equipment maintenance and regulatory reporting, such as rate case submissions.
Q: Dean, what are the top issues facing utility executives?
A: The first issue is asset management / equipment maintenance. Knowing where to invest precious dollars is critical. Utility executives are engaged in a constant tug of war between two competing priorities: replacing aging infrastructure and regular maintenance.
Q. How are utility executives determining that balance?
A. You need to start with facts – the real costs and reliability information for each asset in your infrastructure. Without it, you are guessing. Basically, it is a data problem. Utility executives should ask themselves these questions:
- Do we have the ability to capture and combine cost and reliability information from multiple sources? Is it granular enough to be useful?
- Do we know the maintenance costs of eight-year-old breakers versus three-year-old breakers?
- Do our meters start failing around the average lifespan? For this example, let us say that is five years. Or, rather than clustering around that average, do 30% of our meters fail in the first year while the rest last eight years? Those three extra years of life can certainly help out the bottom line.
Knowing your data makes all the difference. The right capital investment strategy requires combining performance, reliability, and cost data.
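The averages-versus-distribution point can be made concrete with a quick calculation. Here is a minimal sketch with invented fleet numbers (none of these figures come from the interview): two fleets can share a similar average lifespan while implying very different maintenance strategies.

```python
# Two hypothetical meter fleets with similar average lifespans
# but very different failure patterns.
uniform_fleet = [5] * 100             # every meter lasts ~5 years
bimodal_fleet = [1] * 30 + [8] * 70   # 30% fail in year one, 70% last 8 years

def mean_life(fleet):
    """Average observed lifespan in years."""
    return sum(fleet) / len(fleet)

print(mean_life(uniform_fleet))  # 5.0
print(mean_life(bimodal_fleet))  # 5.9 -- a similar-looking average...

# ...but the share of early failures, which drives replacement strategy,
# differs sharply:
early_failures = sum(1 for life in bimodal_fleet if life <= 1) / len(bimodal_fleet)
print(early_failures)  # 0.3
```

Only granular, per-asset data reveals which pattern your fleet actually follows; the average alone hides it.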
Q. Why is it difficult for utility executives to understand the real costs and reliability of assets?
A. I know this does not come as a shock, but most companies do not trust their data. Asset data is often inaccurate, inconsistent, and disconnected. Even the most basic data may not be available. For example, manufacture dates on breakers should be filled in, but they are not. If less than 50% of your breakers have manufacture dates, how can you build a preventative maintenance program? You do not even know what to address first!
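A basic data-profiling pass is how you surface a gap like “less than 50% of breakers have manufacture dates.” Here is a minimal sketch with made-up records; the field names are illustrative, not from any particular utility system.

```python
# Profile completeness of a critical field across asset records.
breakers = [
    {"id": "BRK-001", "manufacture_date": "2006-03-14"},
    {"id": "BRK-002", "manufacture_date": None},
    {"id": "BRK-003", "manufacture_date": ""},
    {"id": "BRK-004", "manufacture_date": "2011-07-02"},
]

def completeness(records, field):
    """Fraction of records where `field` is present and non-empty."""
    filled = sum(1 for r in records if r.get(field))
    return filled / len(records)

pct = completeness(breakers, "manufacture_date")
print(f"manufacture_date completeness: {pct:.0%}")  # 50%
```

Running checks like this across every critical attribute tells you what to address first.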
A traditional approach to solving this data problem is to do a big data cleanup. You clean the data, and then before you know it, errors creep back in, and the trust in the data you have worked so hard to establish is lost.
I like to illustrate the pain of this issue by using the sinking boat analogy. Data cleanup is like bailing out the water collecting in the bottom of the boat. You think you are solving the problem but more water still seeps into the boat. You cannot stop bailing or you will sink. What you need to do is fix the leaks, and then bail out the boat. But, if you do not lift up your head from bailing long enough to see the leaks and make the right investments, you are fighting a losing battle.
Q. What can utility executives do to improve the quality of asset data?
A. First of all, you need to develop a data governance framework. Going back to the analogy, a data governance framework gives you the structure to find the leaks, fix the leaks, and monitor how much of the water has been bailed out. If the water level is still rising, you have not fixed all the leaks. But having a data governance framework is not the be-all and end-all.
You also need to appoint data stewards to be accountable for establishing and maintaining high quality asset data. The job of a data steward would be easy if there was only one system where all asset data resided. But the fact of the matter is that asset data is fragmented – scattered across multiple systems. Data stewards have a huge responsibility and they need to be supported by a solid data management infrastructure to ease the burden of managing business-critical asset information.
Master Data Management (MDM) ensures business-critical asset data is consistent everywhere by pulling together data that is scattered across multiple applications. It manages and masters it in a central location on a continuous basis and shares it with any applications that need that data. MDM provides a user interface and workflow for data stewards to manage the tangled web of names and IDs these assets are known by across systems. It also gives utilities a disciplined approach to manage important relationships between the asset data, such as an asset’s performance reliability and its cost.
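The “tangled web of names and IDs” is typically untangled with a cross-reference: each source system’s local ID resolves to a single master record. A toy sketch follows; the system names, IDs and record fields are invented for illustration, and a real MDM hub does far more (matching, workflow, history).

```python
# Toy cross-reference: local IDs from several systems map to one master ID.
xref = {
    ("GIS", "XFMR-0042"): "MASTER-7",
    ("EAM", "TX42-A"):    "MASTER-7",
    ("SCADA", "T-42"):    "MASTER-7",
}

master_records = {
    "MASTER-7": {"type": "transformer", "site": "Substation 12"},
}

def resolve(system, local_id):
    """Look up the golden record for a system-specific asset ID."""
    master_id = xref.get((system, local_id))
    return master_records.get(master_id)

print(resolve("SCADA", "T-42"))  # {'type': 'transformer', 'site': 'Substation 12'}
```

Every system keeps its own identifier, but analysis and reporting always land on the same consolidated record.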
Q. Any other pressing issues facing utilities?
A. Yes. Another big issue is tightening regulations that consume investment dollars and become key inputs into rate case submissions and defenses. One of the complicating factors is the number of regulations is not only increasing, but the regulators are also requiring faster implementation times than ever before. So, utilities cannot just do what they have done in the past: throw more people at the problem in the short-term and resolve to fix it later by automating it “when things slow down.” That day never comes.
Q. How can utilities deal with these regulatory pressures?
A. Utilities need a new approach to deal with regulations. Start with the assumption that all data is fair game for regulators. All data must be accessible. You need to be able to report on it, not only to comply with regulations, but for competitive advantage. This requires the high quality asset information we talked about earlier, and an analytical application to:
- Perform what-if analyses for your asset investment program;
- Develop regulatory compliance or environmental reports quickly, because the hard work (integrating the data within your MDM program) has already been done; and
- Get access to granular, observed reliability and cost information using your own utility’s data – not benchmark data that is already a couple of years old and highly summarized.
Q. What is your advice for utility company executives?
A. If you are the one responsible for signing off on regulatory reports and you do not look good in an orange jumpsuit, you need to invest in a plan that includes people, process, and technology to support regulatory reporting and asset management / equipment maintenance.
- People – Data stewards have clear accountability for the quality of asset data.
- Process – Data governance is your game plan.
- Technology – A solid data management infrastructure consisting of data integration, data quality, and master data management is your means.
If you are responsible for asset management / equipment maintenance or regulatory reporting, particularly rate case submissions, check out this webinar, Attention Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data.
Our panel of utility data experts:
- Reveal the five toughest business challenges facing utility industry executives;
- Explain how bad asset data could be costing you millions of dollars in operating costs;
- Share three best practices for optimizing asset management / equipment maintenance and regulatory reporting with accurate, consistent, and connected asset information; and
- Show you how to implement these best practices with a demonstration.
Bad data is bad for business. Ovum Research reported that poor quality data is costing businesses at least 30% of revenues. Business leaders across a broad range of roles recognize, now more than ever, the importance of using high quality information to drive business success. Leaders in functions ranging from marketing and sales to risk management and compliance have invested in world-class applications, six sigma processes, and the most advanced predictive analytics. So why are you not seeing more return on that investment? Simply put, if your business-critical data is a mess, the rest doesn’t matter.
Not all business leaders know there’s a better way to manage their business-critical data. So, I asked Dennis Moore, the senior vice president and general manager of Informatica’s MDM business, who clocked hundreds of thousands of airline miles last year visiting business leaders around the world, to talk about the impact of using accurate, consistent and connected data and the value business leaders can gain through master data management (MDM).
Q. Why are business leaders focusing on business-critical data now?
A. Leaders have always cared about their business-critical data, the master data on which their enterprises depend most — their customers, suppliers, the products they sell, the locations where they do business, the assets they manage, the employees who make the business perform. Leaders see the value of having a clear picture, or “best version of the truth,” describing these “master data” entities. But, this is hard to come by with competing priorities, mergers and acquisitions and siloed systems.
As companies grow, business leaders start realizing there is a huge gap between what they do know and what they should know about their customers, suppliers, products, assets and employees. Even worse, most businesses have lost their ability to understand the relationships between business-critical data so they can improve business outcomes. Line of business leaders have been asking questions such as:
- How can we optimize sales across channels when we don’t know which customers bought which products from which stores, sites or suppliers?
- How can we quickly execute a recall when we don’t know which supplier delivered a defective part to which factory and where those products are now?
- How can we accelerate time-to-market for a new drug, when we don’t know which researcher at which site used which combination of compounds on which patients?
- How can we meet regulatory reporting deadlines, when we don’t know which model of a product we manufactured in which lot on which date?
Q. What is the crux of the problem?
A. The crux of the problem is that as businesses grow, their business-critical data becomes fragmented. There is no big picture because it’s scattered across applications, including on premise applications (such as SAP, Oracle and PeopleSoft) and cloud applications (such as Salesforce, Marketo, and Workday). But it gets worse. Business-critical data changes all the time. For example,
- a customer moves, changes jobs, gets married, or changes their purchasing habits;
- a supplier moves, goes bankrupt or acquires a competitor;
- you discontinue a product or launch a new one; or
- you onboard a new asset or retire an old one.
As all this change occurs, business-critical data becomes inconsistent, and no one knows which application has the most up-to-date information. This costs companies money. It saps productivity and forces people to do a lot of manual work outside their best-in-class processes and world-class applications. One question I always ask business leaders is, “Do you know how much bad data is costing your business?”
Q. What can business leaders do to deal with this issue?
A. First, find out where bad data is having the most significant impact on the business. It’s not hard – just about any employee can share stories of how bad data led to a lost sale, an extra “truck roll,” lost leverage with suppliers, or a customer service problem. From the call center to the annual board planning meeting, bad data results in sub-optimal decisions and lost opportunities. Work with your line of business partners to reach a common understanding of where an improvement can really make a difference. Bad master data is everywhere, but bad master data that has material costs to the business is a much more pressing and constrained problem. Don’t try to boil the ocean or bring a full-blown data governance maturity level 5 approach to your organization if it’s not already seeing success from better data!
Second, focus on the applications and processes used to create, share, and use master data. Many times, some training, a tweak to a process, or a new interface can be created between systems, resulting in very significant improvements for the users without major IT work or process changes.
Lastly, look for a technology that is purpose-built to deal with this problem. Master data management (MDM) helps companies better manage business-critical data in a central location on an ongoing basis and then share that “best version of the truth” with all on premise and cloud applications that need it.
Let’s use customer data as an example. If valuable customer data is located in applications such as Salesforce, Marketo, Siebel CRM, and SAP, MDM brings together all the business-critical data, the core that’s the same across all those applications, and creates the “best version of the truth.” It also creates the total customer relationship view across functions, product lines and regions, which CRM promised but never delivered.
MDM then shares that “mastered” customer data and the total customer relationship view with the applications that want it. MDM can be used to master the relationships between customers, such as legal entity hierarchies. This helps sales and customer service staff be more productive, while also improving legal compliance and management decision making. Advanced MDM products can also manage relationships across different types of master data. For example, advanced MDM enables you to relate an employee to a project to a contract to an asset to a commission plan. This ensures accurate and timely billing, effective expense management, managed supplier spend, and even improved workforce deployment.
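The “best version of the truth” is typically assembled with survivorship rules, for example taking each attribute from the most trusted source that has a value. Here is a minimal sketch; the source names, trust ordering and record fields are invented for illustration, and commercial MDM products apply far richer, configurable rules.

```python
# Merge one customer's records from several systems into a golden record.
# Survivorship rule: for each attribute, take the value from the most
# trusted source that actually has one.
TRUST_ORDER = ["SAP", "Salesforce", "Marketo"]  # most to least trusted

records = {
    "Salesforce": {"name": "Jakki Geiger", "email": "jg@example.com", "phone": None},
    "SAP":        {"name": "Jacqueline Geiger", "email": None, "phone": "555-0142"},
    "Marketo":    {"name": "J. Geiger", "email": "jgeiger@example.com", "phone": None},
}

def golden_record(records):
    merged = {}
    fields = sorted({f for r in records.values() for f in r})
    for field in fields:
        for source in TRUST_ORDER:
            value = records.get(source, {}).get(field)
            if value:
                merged[field] = value
                break
    return merged

print(golden_record(records))
# {'email': 'jg@example.com', 'name': 'Jacqueline Geiger', 'phone': '555-0142'}
```

No single source had the complete picture; the merged record does, and it can then be shared back to every application.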
When your sales team has the best possible customer information in Salesforce and the finance team has the best possible customer information in SAP, no one wastes time pulling together spreadsheets of information outside of their world-class applications. Your global workforce doesn’t waste time trying to investigate whether Jacqueline Geiger in one system and Jakki Geiger in another system is one or two customers, sending multiple bills and marketing offers at high cost in postage and customer satisfaction. All employees who have access to mastered customer information can be confident they have the best possible customer information available across the organization to do their jobs. And with the most advanced and intelligent data platform, all this information can be secured so only the authorized employees, partners, and systems have access.
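The Jacqueline/Jakki question is at heart a record-matching problem. A minimal sketch using only the standard library’s difflib follows; the nickname table is a tiny illustrative stand-in for the large curated lists and probabilistic matching that production MDM engines use.

```python
from difflib import SequenceMatcher

# Tiny illustrative nickname table; real matchers use large curated lists.
NICKNAMES = {"jakki": "jacqueline", "jackie": "jacqueline"}

def normalize(name):
    """Lowercase and expand a known nickname in the first name."""
    first, last = name.lower().split()
    return NICKNAMES.get(first, first) + " " + last

def similarity(a, b):
    """String similarity in [0, 1] after normalization."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

print(similarity("Jacqueline Geiger", "Jakki Geiger"))  # 1.0 after normalization
```

With a match score and a threshold, the two records can be flagged as one customer instead of generating duplicate bills and mailings.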
Q. Which industries stand to gain the most from mastering their data?
A. In every industry there is some transformation going on that’s driving the need to know people, places and things better. Take insurance for example. Similar to the transformation in the travel industry that reduced the need for travel agents, the insurance industry is experiencing a shift from the agent/broker model to a more direct model. Traditional insurance companies now have an urgent need to know their customers so they can better serve them across all channels and across multiple lines of business.
In other industries, there is an urgent need to get a lot better at supply-chain management or to accelerate new product introductions to compete better with an emerging rival. Business leaders are starting to make the connection between transformation failures and a more critical need for the best possible data, particularly in industries undergoing rapid transformation, or with rapidly changing regulatory requirements.
Q. Which business functions seem most interested in mastering their business-critical data?
A. It varies by industry, but there are three common threads that seem to span most industries:
- MDM can help the marketing team optimize the cross-sell and up-sell process with high quality data about customers, their households or company hierarchies, the products and services they have purchased through various channels, and the interactions their organizations have had with these customers.
- MDM can help the procurement team optimize strategic sourcing including supplier spend management and supplier risk management with high quality data about suppliers, company hierarchies, contracts and the products they supply.
- MDM can help the compliance teams manage all the business-critical data they need to create regulatory reports on time without burning the midnight oil.
Q. How is the use of MDM evolving?
A. When MDM technology was first introduced a decade ago, it was used as a filter. It cleaned up business-critical data on its way to the data warehouse so you’d have clean, consistent, and connected information (“conformed dimensions”) for reporting. Now business leaders are investing in MDM technology to ensure that all of their global employees have access to high quality business-critical data across all applications. They believe high quality data is mission-critical to their operations. High quality data is viewed as the lifeblood of the company and will enable the next frontier of innovation.
Second, many companies mastered data in only one or two domains (customer and product), and used separate MDM systems for each. One system was dedicated to mastering customer data. You may recall the term Customer Data Integration (CDI). Another system was dedicated to mastering product data. Because the two systems were in silos and business-critical data about customers and products wasn’t connected, they delivered limited business value. Since that time, business leaders have questioned this approach because business problems don’t contain themselves to one type of data, such as customer or product, and many of the benefits of mastering data come from mastering other domains including supplier, chart of accounts, employee and other master or reference data shared across systems.
The relationships between data matter to the business. Knowing what customer bought from which store or site is more valuable than just knowing your customer. The business insights you can gain from these relationships is limitless. Over 90% of our customers last year bought MDM because they wanted to master multiple types of data. Our customers value having all types of business-critical data in one system to deliver clean, consistent and connected data to their applications to fuel business success.
One last evolution we’re seeing a lot involves the types and numbers of systems connecting to the master data management system. In the past, there were a small number of operational systems pushing data through the MDM system into a data warehouse used for analytical purposes. Today, we have customers with hundreds of operational systems communicating with each other via an MDM system that has just a few milliseconds to respond, and which must maintain the highest levels of availability and reliability of any system in the enterprise. For example, one major retailer manages all customer information in the MDM system, using the master data to drive real-time recommendations as well as a level of customer service in every interaction that remains the envy of their industry.
Q. Dennis, why should business leaders consider attending MDM Day?
A. Business leaders should consider attending MDM Day at InformaticaWorld 2014 on Monday, May 12, 2014. You can hear first-hand the business value companies are gaining by using clean, consistent and connected information in their operations. We’re excited to have fantastic customers who are willing to share their stories and lessons learned. We have presenters from St. Jude Medical, Citrix, Quintiles and Crestline Geiger and panelists from Thomson Reuters, Accenture, EMC, Jones Lang Lasalle, Wipro, Deloitte, AutoTrader Group, McAfee-Intel, Abbvie, Infoverity, Capgemini, and Informatica among others.
Last year’s Las Vegas event, and the events we held in London, New York and São Paulo, were extremely well received. This year’s event is packed with even more customer sessions and opportunities to learn and to influence our product road map. MDM Day is one day before InformaticaWorld and is included in the cost of your InformaticaWorld registration. We’d love to see you there!
See the MDM Day Agenda.
The Surprising Link Between Hurricanes and Strawberry Pop-Tarts: Brought to you by Clean, Consistent and Connected Data
What do you think Wal-Mart’s best-seller is right before a hurricane? If you guessed water like I did, you’d be wrong. According to the New York Times article “What Wal-Mart Knows About Customers’ Habits,” the retailer sells 7X more strawberry Pop-Tarts in Florida right before a hurricane than at any other time. Armed with predictive analytics and a solid information management foundation, the team stocks up on strawberry Pop-Tarts to make sure there is enough supply to meet demand.
I learned this fun fact from Andrew Donaher, Director of Information Management Strategy at Groundswell Group, a consulting firm based in western Canada that specializes in information management services. In this interview, Andy and I discuss how IT leaders can increase the value of data to drive business value, explain how some IT leaders are collaborating with business leaders to improve predictive analytics, and share advice about how to talk to business leaders, such as the CFO about investing in an information management strategy.
Q. Andy, what can IT leaders do to increase the value of data to drive business value?
A. Simply put, each business leader in a company needs to focus on achieving their goals. The first step IT leaders should take is to engage with each business leader to understand their long and short-term goals and ask some key questions, such as:
- What type of information is critical to achieving their goals?
- Do they have the information they need to make the next decision or take the next best action?
- Is all the data they need in house? If not, where is it?
- What challenges are they facing when it comes to their data?
- How much time are people spending trying to pull together the information they need?
- How much time are people spending fixing bad data?
- How much is this costing them?
- What opportunities exist if they had all the information they need and could trust it?
Q. How are IT leaders collaborating with business partners to improve predictive analytics?
A. Wal-Mart’s IT team collaborated with the business to improve the forecasting and demand planning process. Once they found out what was important, IT figured out how to gather, store and seamlessly integrate external data like historical weather and future weather forecasts into the process. This enabled the business to get more valuable insights, tailor product selections at particular stores, and generate more revenue.
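The article doesn’t describe Wal-Mart’s actual implementation, but the idea of joining internal sales history with an external weather feed can be sketched in a few lines. The data values and field names below are entirely hypothetical; the point is only to show how a warning-day uplift factor might be computed once the two data sets are integrated.

```python
# Illustrative sketch only: hypothetical daily unit sales of one SKU at one
# store, plus a hypothetical external weather feed of hurricane-warning dates.
from statistics import mean

sales = {
    "2004-08-10": 120, "2004-08-11": 130, "2004-08-12": 840,
    "2004-08-13": 910, "2004-08-14": 150,
}

hurricane_warning = {"2004-08-12", "2004-08-13"}

def demand_uplift(sales, warning_dates):
    """Compare average sales on warning days vs. normal days."""
    warned = [units for day, units in sales.items() if day in warning_dates]
    normal = [units for day, units in sales.items() if day not in warning_dates]
    return mean(warned) / mean(normal)

print(round(demand_uplift(sales, hurricane_warning), 1))  # prints 6.6
```

With these made-up numbers the uplift works out to roughly the “7X” figure the article cites; in practice the forecasting model would of course be far richer, folding in forecast data, seasonality and store-level attributes.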
Q. Why is it difficult for IT leaders to convince business leaders to invest in an information management strategy?
A. In most cases, business leaders don’t see the value in an information management strategy, or they haven’t seen value from one before. Unfortunately, this often happens because IT isn’t able to connect the dots between the information management strategy and the outcomes that matter to the business.
Business leaders see value in having control over their business-critical information, being able to access it quickly and to allocate their resources to get any additional information they need. Relinquishing control takes a lot of trust. When IT leaders want to get buy-in from business leaders to invest in an information management strategy they need to be clear about how it will impact business priorities. Data integration, data quality and master data management (MDM) should be built into the budget for predictive or advanced analytics initiatives to ensure the data the business is relying on is clean, consistent and connected.
Q. You liked this quotation from an IT leader at a beer manufacturing company: “We don’t just make beer. We make beer and data. We need to manage our product supply chain and information supply chain equally efficiently.” Why?
A. What I like about that quote is that the IT leader was able to connect the dots between the primary revenue generator for the company and the role data plays in improving organizational performance. That’s something a lot of IT leaders struggle with. IT leaders should always be thinking about the next thing they can do to increase business value with the data they have in house, and with other data the company may not yet be tapping into.
Q. According to a recent survey by Gartner and the Financial Executives Research Foundation, 60% of Chief Financial Officers (CFOs) are investing in analytics and improved decision-making as their #1 IT priority. What’s your advice for IT Leaders who need to get buy-in from the CFO to invest in information management?
A. Read your company’s financial statements, especially the Management Discussion and Analysis section. You’ll learn about the company’s direction, what the stakeholders are looking for, and what the CFO needs to deliver. Offer to get your CFO the information s/he needs to make decisions and to deliver. When you talk to a CFO about investing in information management, focus on the two things that matter most:
- Risk mitigation: CFOs know that bad decisions based on bad information can negatively impact revenue, expenses and market value. If you have to caveat all your decisions because you can’t trust the information, or it isn’t current, then you have problems. CFOs need to trust their information. They need to feel confident they can use it to make important financial decisions and deliver accurate reports for compliance.
- Opportunity: Once you have mitigated the risk and can trust the data, you can take advantage of predictive analytics. Wal-Mart doesn’t just do forecasting and demand planning. They do “demand shaping.” They use accurate, consistent and connected data to plan events and promotions not just to drive inventory turns, but to optimize inventory and the supply chain process. Some companies in the energy market are using accurate, consistent and connected data for predictive asset maintenance. By preventing unplanned maintenance they are saving millions of dollars, protecting revenue streams, and gaining health and safety benefits.
To do either of these things you need a solid information management plan to manage clean, consistent and connected information. It takes commitment, but the payoff can be very significant.
Q. What are the top three business requirements when building an information management and integration strategy?
A. In my experience, IT leaders should focus on:
- Business value: A solid information management and integration strategy that has a chance of getting funded must be focused on delivering business value. Otherwise, your strategy will lack clarity and won’t drive priorities. If you focus on business value, it will be much easier to gain organizational buy-in. Get that dollar figure before you start anything. Whether it is risk mitigation, time savings, revenue generation or cost savings, you need to calculate that value to the business and get their buy-in.
- Trust: When people know they can trust the information they are getting, it frees them to explore new ideas without worrying about issues in the data itself.
- Flexibility: Flexibility should be baked right into the strategy. Business drivers will evolve and change, and you must be able to adapt. One of the most neglected, and I would argue most important, parts of a solid strategy is the ability to make continuous small improvements that may require more effort than a typical maintenance event but don’t create long delays. The business will very much appreciate this. We work with our clients to ensure that this is addressed.