Category Archives: Data Quality

Becoming a Revenue-Driven Business Model through Data is Painful for Government Agencies

Recently, I presented a Business Value Assessment to a client. The findings were based on a revenue-generating state government agency. Everyone at the presentation was stunned to learn how much money was left on the table because the agency did not base its activities on transactions that could be cleanly tied to the participating citizenry and a variety of channel partners. The assessment identified over $38 million in annual benefits, including partially recovered lost revenue, cost avoidance and cost reduction. A business model that put its data to greater use could have prevented this.

Should government leaders go to the “School of Data” to understand where more revenue can be created without necessary tax hikes? (Source: creativecommons.org)

Given the agency’s total revenue volume, this may seem small. However, considering the small technology effort required to “collect and connect” data from existing transactions, the return is actually extremely high.

The real challenge for this organization will be the policy transformation required to turn it from “data-starved” into “data-intensive”. Today, strategic decisions about new products, locations and customers rely on surveys, which suffer from sampling errors and biases. Surveys are also often delayed, making them practically ineffective in the real-time world we live in today.

Although no legal restrictions applied, the leadership’s main concern was that gathering more data would erode the public’s trust in, and positive image of, the organization.

To be clear: by “more” data collected by this type of government agency, I mean literally 10% of what any commercial retail entity has gathered on all of us for decades. This is not the next NSA revelation, as a conspiracy theorist may fear.

While I respect this culturally driven self-censorship in the absence of legal barriers, it invites concern from the agency’s stakeholders, the state’s citizenry, over its performance. To be clear, there would be no additional revenue for the state’s programs without more citizen data. You may believe the state already knows everything about you, including your income, property value and tax information. However, inter-departmental sharing of information that is not relevant to criminal matters is legally constrained.

Another interesting finding from this evaluation was that the agency had no sense of the conversion rate of its email and social media campaigns. Impressions, click-throughs and hard/soft bounces were tracked more closely than who actually generated revenue.

This is a very market-driven organization compared to other agencies. It actually does try to measure itself like a commercial enterprise and attempts to change in order to generate additional revenue for state programs benefiting the citizenry. I can only imagine what non-revenue-generating agencies (local, state or federal) do in this respect.  Is revenue-oriented thinking something the DoD, DoJ or Social Security should subscribe to?

Think tanks and political pundits are now weighing the trade-off between bringing democracy to every backyard on our globe and its long-term budget ramifications. The DoD is looking to reduce its active component to the lowest level in decades, given the U.S. federal debt.

Putting the data bits and pieces together for revenue

A recent article in HBR explains that cost cutting has never sustained an organization’s growth over the long term, but new revenue sources have. Is your company or government agency looking only at cost and personnel productivity?

Disclaimer:

Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.

 


Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data

“Trying to improve the quality of asset data when you don’t have a solid data management infrastructure in place is like trying to save a sinking boat with a bailing bucket,” explained Dean Balog, a senior principal consultant at Noah Consulting, in this webinar, Attention Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data.

Dean Balog from Noah Consulting explains how to improve the quality of mission-critical asset data for asset management / equipment maintenance and regulatory reporting, such as rate case submissions.

Dean has 15 years of experience in information management in the utilities industry. In this interview, Dean and I discuss the top issues facing utility executives and how to improve the quality of mission-critical asset data for asset management / equipment maintenance and regulatory reporting, such as rate case submissions.

Q: Dean, what are the top issues facing utility executives?
A: The first issue is asset management / equipment maintenance. Knowing where to invest precious dollars is critical. Utility executives are engaged in a constant tug of war between two competing priorities: replacing aging infrastructure and regular maintenance.

Q. How are utility executives determining that balance?
A. You need to start with facts – the real costs and reliability information for each asset in your infrastructure. Without it, you are guessing. Basically, it is a data problem. Utility executives should ask themselves these questions:

  • Do we have the ability to capture and combine cost and reliability information from multiple sources?  Is it granular enough to be useful?
  • Do we know the maintenance costs of eight-year-old breakers versus three-year-old breakers?
  • Do our meters start failing around the average lifespan? For this example, let us say that is five years. Rather than falling uniformly around that average, do 30% of our meters fail in the first year while the rest last eight years? Those three extra years of life can certainly help out the bottom line (see the sketch after this list).
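To make that arithmetic concrete, here is a minimal Python sketch; the fleet size and per-meter replacement cost are entirely hypothetical numbers, not utility benchmarks:

```python
FLEET_SIZE = 100_000          # hypothetical meter count
REPLACEMENT_COST = 85.0       # hypothetical cost per replacement, in dollars

def annualized_replacement_cost(failure_profile):
    """failure_profile: list of (share_of_fleet, lifespan_years) pairs."""
    total = 0.0
    for share, lifespan in failure_profile:
        # Each cohort is replaced once per `lifespan` years on average.
        total += FLEET_SIZE * share * REPLACEMENT_COST / lifespan
    return total

uniform = [(1.0, 5)]            # every meter lasts the 5-year average
bimodal = [(0.3, 1), (0.7, 8)]  # 30% fail in year one, the rest last eight

print(f"Uniform 5-year life:     ${annualized_replacement_cost(uniform):,.0f}/year")
print(f"30/70 early-failure mix: ${annualized_replacement_cost(bimodal):,.0f}/year")
```

Under these assumed numbers, the early-failure mix almost doubles the annualized replacement cost even though its average lifespan (5.9 years) is close to the five-year average. The distribution matters far more than the average, and you can only see it with granular data.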

Knowing your data makes all the difference. The right capital investment strategy requires combining performance, reliability, and cost data.

Quotations about data challenges faced by utility companies

Q. Why is it difficult for utility executives to understand the real costs and reliability of assets?
A. I know this does not come as a shock, but most companies do not trust their data. Asset data is often inaccurate, inconsistent, and disconnected. Even the most basic data may not be available. For example, manufacture dates on breakers should be filled in, but they are not. If less than 50% of your breakers have manufacture dates, how can you build a preventative maintenance program? You do not even know what to address first!

A traditional approach to solving this data problem is to do a big data cleanup. You clean the data, and then before you know it, errors creep back in, and the trust in the data you have worked so hard to establish is lost.

I like to illustrate the pain of this issue by using the sinking boat analogy. Data cleanup is like bailing out the water collecting in the bottom of the boat. You think you are solving the problem but more water still seeps into the boat. You cannot stop bailing or you will sink. What you need to do is fix the leaks, and then bail out the boat. But, if you do not lift up your head from bailing long enough to see the leaks and make the right investments, you are fighting a losing battle.

Q. What can utility executives do to improve the quality of asset data?
A. First of all, you need to develop a data governance framework. Going back to the analogy, a data governance framework gives you the structure to find the leaks, fix the leaks, and monitor how much of the water has been bailed out. If the water level is still rising, you have not fixed all the leaks. But having a data governance framework is not the be-all and end-all.

You also need to appoint data stewards to be accountable for establishing and maintaining high quality asset data. The job of a data steward would be easy if there was only one system where all asset data resided. But the fact of the matter is that asset data is fragmented – scattered across multiple systems. Data stewards have a huge responsibility and they need to be supported by a solid data management infrastructure to ease the burden of managing business-critical asset information.

Webinar: Attention Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data

If you are responsible for asset management / equipment maintenance or regulatory reporting, particularly rate case submissions, check out this webinar.

Master Data Management (MDM) ensures business-critical asset data is consistent everywhere by pulling together data that is scattered across multiple applications, mastering it in a central location on a continuous basis, and sharing it with any application that needs it. MDM provides a user interface and workflow for data stewards to manage the tangled web of names and IDs these assets are known by across systems. It also gives utilities a disciplined approach to managing important relationships within the asset data, such as an asset’s performance reliability and its cost.
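As a rough illustration of that “tangled web of names and IDs,” here is a toy sketch of a mastered cross-reference record. The systems, IDs, and the simple survivorship rule are invented for the example and are not drawn from any particular MDM product:

```python
# Toy golden record: one physical asset known by different IDs in three systems.
source_records = [
    {"system": "GIS",   "id": "BRK-00417", "manufacture_date": "2006-03-14"},
    {"system": "EAM",   "id": "EQ-93311",  "manufacture_date": None},
    {"system": "SCADA", "id": "FDR7-B17",  "manufacture_date": "2006-03-14"},
]

golden_record = {
    "master_id": "ASSET-000042",
    # Simple survivorship rule for the sketch: first non-null value wins.
    "manufacture_date": next(
        r["manufacture_date"] for r in source_records
        if r["manufacture_date"] is not None
    ),
    # The cross-reference: which ID this asset goes by in each system.
    "xref": {r["system"]: r["id"] for r in source_records},
}

print(golden_record["xref"])
# {'GIS': 'BRK-00417', 'EAM': 'EQ-93311', 'SCADA': 'FDR7-B17'}
```

The point of the sketch is the xref and the survivorship rule: once those exist in one governed place, a missing manufacture date in one system can be filled from another instead of staying blank forever.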

Q. Any other pressing issues facing utilities?
A. Yes. Another big issue is tightening regulations that consume investment dollars and become key inputs into rate case submissions and defenses. One of the complicating factors is the number of regulations is not only increasing, but the regulators are also requiring faster implementation times than ever before. So, utilities cannot just do what they have done in the past: throw more people at the problem in the short-term and resolve to fix it later by automating it “when things slow down.” That day never comes.

Q. How can utilities deal with these regulatory pressures?
A. Utilities need a new approach to deal with regulations. Start with the assumption that all data is fair game for regulators. All data must be accessible. You need to be able to report on it, not only to comply with regulations, but for competitive advantage. This requires the high quality asset information we talked about earlier, and an analytical application to:

  • Perform what-if analyses for your asset investment program;
  • Develop regulatory compliance or environmental reports quickly, because the hard work (integrating the data within your MDM program) has already been done; and
  • Get access to granular, observed reliability and cost information using your own utility’s data – not benchmark data that is already a couple of years old and highly summarized.

Q. What is your advice for utility company executives?
A. If you are the one responsible for signing off on regulatory reports and you do not look good in an orange jumpsuit, you need to invest in a plan that includes people, process, and technology to support regulatory reporting and asset management / equipment maintenance.

  • People – Data stewards have clear accountability for the quality of asset data.
  • Process – Data governance is your game plan.
  • Technology – A solid data management infrastructure consisting of data integration, data quality, and master data management is your means.

If you are responsible for asset management / equipment maintenance or regulatory reporting, particularly rate case submissions, check out this webinar, Attention Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data.

 Our panel of utility data experts:

  • Reveal the five toughest business challenges facing utility industry executives;
  • Explain how bad asset data could be costing you millions of dollars in operating costs;
  • Share three best practices for optimizing asset management / equipment maintenance and regulatory reporting with accurate, consistent, and connected asset information; and
  • Show you how to implement these best practices with a demonstration.

 


How Much Does Bad Data Cost Your Business?

Bad data is bad for business. Ovum Research reported that poor quality data costs businesses at least 30% of revenues. Business leaders across a broad range of roles now recognize, more than ever, the importance of using high quality information to drive business success. Leaders in functions ranging from marketing and sales to risk management and compliance have invested in world-class applications, six sigma processes, and the most advanced predictive analytics. So why are you not seeing more return on that investment? Simply put, if your business-critical data is a mess, the rest doesn’t matter.

Dennis Moore explains the impact of using accurate, consistent and connected data and the value business leaders can gain through master data management (MDM).

Not all business leaders know there’s a better way to manage their business-critical data. So, I asked Dennis Moore, the senior vice president and general manager of Informatica’s MDM business, who clocked hundreds of thousands of airline miles last year visiting business leaders around the world, to talk about the impact of using accurate, consistent and connected data and the value business leaders can gain through master data management (MDM).

Q. Why are business leaders focusing on business-critical data now?
A. Leaders have always cared about their business-critical data, the master data on which their enterprises depend most — their customers, suppliers, the products they sell, the locations where they do business, the assets they manage, the employees who make the business perform. Leaders see the value of having a clear picture, or “best version of the truth,” describing these “master data” entities. But, this is hard to come by with competing priorities, mergers and acquisitions and siloed systems.

As companies grow, business leaders start realizing there is a huge gap between what they do know and what they should know about their customers, suppliers, products, assets and employees. Even worse, most businesses have lost the ability to see the relationships between business-critical data that would let them improve business outcomes. Line of business leaders have been asking questions such as:

  • How can we optimize sales across channels when we don’t know which customers bought which products from which stores, sites or suppliers?
  • How can we quickly execute a recall when we don’t know which supplier delivered a defective part to which factory and where those products are now?
  • How can we accelerate time-to-market for a new drug, when we don’t know which researcher at which site used which combination of compounds on which patients?
  • How can we meet regulatory reporting deadlines, when we don’t know which model of a product we manufactured in which lot on which date?

Q. What is the crux of the problem?
A. The crux of the problem is that as businesses grow, their business-critical data becomes fragmented. There is no big picture because it’s scattered across applications, including on premise applications (such as SAP, Oracle and PeopleSoft) and cloud applications (such as Salesforce, Marketo, and Workday). But it gets worse. Business-critical data changes all the time. For example,

  • a customer moves, changes jobs, gets married, or changes their purchasing habits;
  • a supplier moves, goes bankrupt or acquires a competitor;
  • you discontinue a product or launch a new one; or
  • you onboard a new asset or retire an old one.

As all this change occurs, business-critical data becomes inconsistent, and no one knows which application has the most up-to-date information. This costs companies money. It saps productivity and forces people to do a lot of manual work outside their best-in-class processes and world-class applications. One question I always ask business leaders is, “Do you know how much bad data is costing your business?”

Q. What can business leaders do to deal with this issue?
A. First, find out where bad data is having the most significant impact on the business. It’s not hard – just about any employee can share stories of how bad data led to a lost sale, an extra “truck roll,” lost leverage with suppliers, or a customer service problem. From the call center to the annual board planning meeting, bad data results in sub-optimal decisions and lost opportunities. Work with your line of business partners to reach a common understanding of where an improvement can really make a difference. Bad master data is everywhere, but bad master data that has material costs to the business is a much more pressing and constrained problem. Don’t try to boil the ocean or bring a full-blown data governance maturity level 5 approach to your organization if it’s not already seeing success from better data!

Second, focus on the applications and processes used to create, share, and use master data. Many times, some training, a tweak to a process, or a new interface between systems can deliver very significant improvements for users without major IT work or process changes.

Lastly, look for a technology that is purpose-built to deal with this problem.  Master data management (MDM) helps companies better manage business-critical data in a central location on an ongoing basis and then share that “best version of the truth” with all on premise and cloud applications that need it.

Master data management (MDM) helps manage business-critical customer data and creates the total customer relationship view across functions, product lines and regions, which CRM promised but never delivered.

Let’s use customer data as an example. If valuable customer data is located in applications such as Salesforce, Marketo, Siebel CRM, and SAP, MDM brings together all the business-critical data, the core that’s the same across all those applications, and creates the “best version of the truth.” It also creates the total customer relationship view across functions, product lines and regions, which CRM promised but never delivered.

MDM then shares that “mastered” customer data and the total customer relationship view with the applications that want it. MDM can be used to master the relationships between customers, such as legal entity hierarchies. This helps sales and customer service staff be more productive, while also improving legal compliance and management decision making. Advanced MDM products can also manage relationships across different types of master data. For example, advanced MDM enables you to relate an employee to a project to a contract to an asset to a commission plan. This ensures accurate and timely billing, effective expense management, managed supplier spend, and even improved workforce deployment.

When your sales team has the best possible customer information in Salesforce and the finance team has the best possible customer information in SAP, no one wastes time pulling together spreadsheets of information outside of their world-class applications. Your global workforce doesn’t waste time investigating whether Jacqueline Geiger in one system and Jakki Geiger in another system are one customer or two, or sending multiple bills and marketing offers at a high cost in postage and customer satisfaction. All employees who have access to mastered customer information can be confident they have the best customer information available across the organization to do their jobs. And with the most advanced and intelligent data platform, all this information can be secured so only authorized employees, partners, and systems have access.
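To give a feel for that duplicate check, here is a toy sketch using only the Python standard library. A real MDM match engine applies far richer, configurable match rules; the 0.6 threshold and the postcode tie-breaker here are arbitrary assumptions, and only the names echo the example above:

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Crude string similarity score between 0 and 1."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

record_a = {"name": "Jacqueline Geiger", "postcode": "94063"}
record_b = {"name": "Jakki Geiger",      "postcode": "94063"}

score = name_similarity(record_a["name"], record_b["name"])
same_postcode = record_a["postcode"] == record_b["postcode"]

# Flag as a potential duplicate for a data steward to review, rather
# than letting two customer records send two bills.
if score > 0.6 and same_postcode:
    print(f"Possible duplicate (name similarity {score:.2f}) - route to steward")
```

The value is not the string comparison itself; it is that the candidate pair is surfaced centrally, resolved once, and the answer shared with every downstream application.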

Q. Which industries stand to gain the most from mastering their data?
A. In every industry there is some transformation going on that’s driving the need to know people, places and things better. Take insurance for example. Similar to the transformation in the travel industry that reduced the need for travel agents, the insurance industry is experiencing a shift from the agent/broker model to a more direct model. Traditional insurance companies now have an urgent need to know their customers so they can better serve them across all channels and across multiple lines of business.

In other industries, there is an urgent need to get a lot better at supply-chain management or to accelerate new product introductions  to compete better with an emerging rival. Business leaders are starting to make the connection between transformation failures and a more critical need for the best possible data, particularly in industries undergoing rapid transformation, or with rapidly changing regulatory requirements.

Q. Which business functions seem most interested in mastering their business-critical data?
A. It varies by industry, but there are three common threads that seem to span most industries:

Business leaders are starting to make the connection between transformation failures and a more critical need for the best possible data.

  • MDM can help the marketing team optimize the cross-sell and up-sell process with high quality data about customers, their households or company hierarchies, the products and services they have purchased through various channels, and the interactions their organizations have had with these customers.
  • MDM can help the procurement team optimize strategic sourcing including supplier spend management and supplier risk management with high quality data about suppliers, company hierarchies,  contracts and the products they supply.
  • MDM can help the compliance teams manage all the business-critical data they need to create regulatory reports on time without burning the midnight oil.

Q. How is the use of MDM evolving?
A. When MDM technology was first introduced a decade ago, it was used as a filter. It cleaned up business-critical data on its way to the data warehouse so you’d have clean, consistent, and connected information (“conformed dimensions”) for reporting. Now business leaders are investing in MDM technology to ensure that all of their global employees have access to high quality business-critical data across all applications. They believe high quality data is mission-critical to their operations. High quality data is viewed as the lifeblood of the company and will enable the next frontier of innovation.

Second, many companies mastered data in only one or two domains (customer and product), and used separate MDM systems for each. One system was dedicated to mastering customer data; you may recall the term Customer Data Integration (CDI). Another was dedicated to mastering product data. Because the two systems were siloed and business-critical data about customers and products wasn’t connected, they delivered limited business value. Since then, business leaders have questioned this approach, because business problems don’t confine themselves to one type of data, and many of the benefits of mastering data come from other domains, including supplier, chart of accounts, employee and other master or reference data shared across systems.

The relationships between data matter to the business. Knowing which customer bought from which store or site is more valuable than just knowing your customer. The business insights you can gain from these relationships are limitless. Over 90% of our customers last year bought MDM because they wanted to master multiple types of data. Our customers value having all types of business-critical data in one system to deliver clean, consistent and connected data to their applications to fuel business success.

One last evolution we’re seeing a lot involves the types and numbers of systems connecting to the master data management system. In the past, there were a small number of operational systems pushing data through the MDM system into a data warehouse used for analytical purposes. Today, we have customers with hundreds of operational systems communicating with each other via an MDM system that has just a few milliseconds to respond, and which must maintain the highest levels of availability and reliability of any system in the enterprise. For example, one major retailer manages all customer information in the MDM system, using the master data to drive real-time recommendations as well as a level of customer service in every interaction that remains the envy of their industry.

Q. Dennis, why should business leaders consider attending MDM Day?
A. Business leaders should consider attending MDM Day at InformaticaWorld 2014 on Monday, May 12, 2014. You can hear first-hand the business value companies are gaining by using clean, consistent and connected information in their operations. We’re excited to have fantastic customers who are willing to share their stories and lessons learned. We have presenters from St. Jude Medical, Citrix, Quintiles and Crestline Geiger and panelists from Thomson Reuters, Accenture, EMC, Jones Lang LaSalle, Wipro, Deloitte, AutoTrader Group, McAfee-Intel, AbbVie, Infoverity, Capgemini, and Informatica, among others.

Last year’s Las Vegas event, and the events we held in London, New York and São Paulo, were extremely well received. This year’s event is packed with even more customer sessions and opportunities to learn and to influence our product road map. MDM Day is one day before InformaticaWorld and is included in the cost of your InformaticaWorld registration. We’d love to see you there!

See the MDM Day Agenda.


Data: The Unsung Hero (or Villain) of every Communications Service Provider

The faceless hero of CSPs: Data

Analyzing current business trends illustrates how difficult and complex the Communications Service Provider (CSP) business environment has become. CSPs face many challenges. Clients expect high quality, affordable content that can move between devices with minimal advertising or privacy concerns. To illustrate this phenomenon, here are a few recent examples:

  • Apple is working with Comcast/NBC Universal on a new converged offering
  • Vodafone purchased the Spanish cable operator, Ono, having to quickly separate the wireless customers from the cable ones and cross-sell existing products
  • Net neutrality has been scuttled in the US and upheld in the EU so now a US CSP can give preferential bandwidth to content providers, generating higher margins
  • Microsoft’s Xbox community collects terabytes of data every day making effective use, storage and disposal based on local data retention regulation a challenge
  • Expensive 4G LTE infrastructure investment by operators such as Reliance is bringing streaming content to tens of millions of new consumers

To quickly capitalize on “new” (often old, but unknown) data sources, there has to be a common understanding of:

  • Where the data is
  • What state it is in
  • What it means
  • What volume and attributes are required to accommodate a one-off project vs. a recurring one

When a multitude of departments request data for analytical projects with their one-off, IT-unsanctioned on-premise or cloud applications, how will you go about it? The average European operator has between 400 and 1,500 (known) applications. Imagine what the unknown count is.

A European operator with 20-30 million subscribers incurs an average of $3 million per month in unpaid invoices, often the result of incorrect or incomplete contact information. Imagine how much you would have to add for lost productivity, including gathering, re-formatting, enriching, checking and re-sending invoices. And this does not even account for late invoice payments or incorrectly extended credit terms.

Think about all the wrong long-term conclusions being drawn from this bad data. This single data problem creates indirect costs in excess of three times the initial, direct impact of the unpaid invoices, as sketched below.
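A quick back-of-the-envelope run of those figures (the 3x indirect multiplier is the estimate from the paragraph above, not a measured value):

```python
unpaid_per_month = 3_000_000            # direct impact cited above
direct_annual = unpaid_per_month * 12   # annualized unpaid invoices
indirect_annual = direct_annual * 3     # productivity, rework, bad decisions

print(f"Direct annual cost:   ${direct_annual:,}")                    # $36,000,000
print(f"Indirect annual cost: ${indirect_annual:,}")                  # $108,000,000
print(f"Total exposure:       ${direct_annual + indirect_annual:,}")  # $144,000,000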

Want to fix your data and overcome the accelerating cost of change? Involve your marketing, CEM, strategy, finance and sales leaders to help them understand data’s impact on the bottom line.

Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks. While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.


Is your social media investment hampered by your “data poverty”?

Why is nobody measuring the “bag of money” portion? (Source: emediate.com)

Recently, I talked with a company that had allocated millions of dollars for paid social media promotion. Their hope was that a massive investment in Twitter and Facebook campaigns would lead to “more eyeballs” for their online gambling sites. Although they had internal social media expertise, they lacked a comprehensive partnership with IT. In addition, they lacked a properly funded policy vision. As a result, when asked how much of their socially-driven traffic resulted in actual sales, their answer was a resounding “No Idea.” I attribute this to “data poverty.”

There is a key reason that they were unable to quantify the ROI of their promotion: Their business model is, by design, “data poor.” Although a great deal of customer data was available to them, they chose not to use it. They could have used available data to identify “known players” as well as individuals with “playing potential.” There was no law prohibiting them from acquiring this data. However, they were uncomfortable obtaining a higher degree of attribution beyond name, address, e-mail and age. They feared that customers would view them as a commercial counterpart to the NSA. As a result, key data elements like net worth, lifetime value, credit risk, location, marital status, employment status, number of friends/followers and property value were not considered when targeting potential users on social media. So, though the Social Media team considered this granular targeting to be a “dream-come-true,” others within the organization considered it to be too “1984.”

In addition to a hesitation to leverage available data, they were also limited by their dependence on a 3rd party IT provider. This lack of self-sufficiency created data quality issues, which limited their productivity. Ultimately, this dependency prevented them from capitalizing on new market opportunities in a timely way.

It should have been possible for them to craft a multi-channel approach. They ought to have been able to serve up promoted Tweets, banner ads and mobile application ads. They should have been able to track the click-through, IP and timestamp information from each one. They should have been able to make a BCR for redeeming a promotional offer at a retail location.

Strategic channel allocation would certainly have triggered additional sales. In fact, when we applied click-through, CAC and conversion benchmarks to their available transactional information, we modeled over $8 million in additional sales and $3 million in customer acquisition cost savings. In addition to the financial benefits, strategic channel allocation would have generated more data (and resulting insights) about their prospects and customers than they had when they began.
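For illustration only, a funnel model of the kind used in such an assessment might look like the sketch below. Every rate and dollar figure is a hypothetical benchmark chosen to land in the same ballpark as the modeled result, not the client’s actual data:

```python
impressions = 50_000_000        # hypothetical paid campaign volume
click_rate = 0.004              # assumed click-through benchmark
conversion_rate = 0.02          # assumed click-to-customer benchmark
revenue_per_customer = 2_100.0  # assumed revenue per acquired player

clicks = impressions * click_rate
customers = clicks * conversion_rate
modeled_sales = customers * revenue_per_customer

print(f"{clicks:,.0f} clicks -> {customers:,.0f} customers "
      f"-> ${modeled_sales:,.0f} in modeled additional sales")
```

With these assumptions, the funnel yields roughly $8.4 million in modeled sales; swap in your own benchmarks and transaction history to run the same exercise against your channels.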

But, because they were hesitant to use all the data available to them, they failed to capitalize on their opportunities. Don’t let this happen to you. Make a strategic policy change to become a data-driven company.

Beyond the revenue gains of targeted social marketing, there are other reasons to become a data-driven company. Clean data can help you correctly identify ideal channel partners. This company failed to use sufficient data to properly select and retain its partners. Hundreds of channel partners were removed without proper, data-driven confirmation, for reasons like “death of owner,” “fire,” and “unknown.” To ensure more thorough vetting, the company could have used data points like the owner’s age, past business endeavors and legal proceedings. They could also have studied location-fenced attributes like footfall, click-throughs and sales cannibalization risk. In fact, when we modeled the potential overall annual savings, across all business scenarios, of becoming a data-driven company, the amount approached $40 million.

Would a $40 million savings inspire you to invest in your data? Would that amount be enough to motivate you to acquire, standardize, deduplicate, link, hierarchically structure and enrich YOUR data? It’s a no-brainer. But it requires a policy shift to make your data work for you. Without this, it’s all just “potential”.

Do you have stories about companies that recently switched from traditional operations to smarter, data-driven operations? If so, I’d love to hear from you.

Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer  and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.


The Surprising Link Between Hurricanes and Strawberry Pop-Tarts: Brought to you by Clean, Consistent and Connected Data

What do you think Wal-Mart’s best-seller is right before a hurricane? If you guessed water like I did, you’d be wrong. According to the New York Times article “What Wal-Mart Knows About Customers’ Habits,” the retailer sells 7X more strawberry Pop-Tarts in Florida right before a hurricane than at any other time. Armed with predictive analytics and a solid information management foundation, the team stocks up on strawberry Pop-Tarts to make sure supply meets demand.

Andrew Donaher advises IT leaders to ask business leaders how much bad data is costing them.

I learned this fun fact from Andrew Donaher, Director of Information Management Strategy at Groundswell Group, a consulting firm based in western Canada that specializes in information management services. In this interview, Andy and I discuss what IT leaders can do to increase the value of data to the business, how some IT leaders are collaborating with business leaders to improve predictive analytics, and how to talk to business leaders, such as the CFO, about investing in an information management strategy.

Q. Andy, what can IT leaders do to increase the value of data to drive business value?

A. Simply put, each business leader in a company needs to focus on achieving their goals. The first step IT leaders should take is to engage with each business leader to understand their long and short-term goals and ask some key questions, such as:

  • What type of information is critical to achieving their goals?
  • Do they have the information they need to make the next decision or take the next best action?
  • Is all the data they need in house? If not, where is it?
  • What challenges are they facing when it comes to their data?
  • How much time are people spending trying to pull together the information they need?
  • How much time are people spending fixing bad data?
  • How much is this costing them?
  • What opportunities exist if they had all the information they need and could trust it?
If you want to get the business value you’re expecting by shifting from rear-view mirror style reporting to predictive analytics, you need to use clean, consistent and connected data.

Q. How are IT leaders collaborating with business partners to improve predictive analytics?

A. Wal-Mart’s IT team collaborated with the business to improve the forecasting and demand planning process. Once they found out what was important, IT figured out how to gather, store and seamlessly integrate external data like historical weather and future weather forecasts into the process. This enabled the business to get more valuable insights, tailor product selections at particular stores, and generate more revenue.
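As a hedged illustration of that kind of integration (not Wal-Mart’s actual system), folding an external weather feed into a demand-planning dataset is, in principle, a join that adds a weather feature per store and day; the columns and values below are invented:

```python
import pandas as pd

# Hypothetical internal sales history...
sales = pd.DataFrame({
    "store": ["FL-101", "FL-101"],
    "date": ["2014-08-01", "2014-08-02"],
    "units_sold": [120, 840],
})

# ...and an external weather feed for the same stores and dates.
weather = pd.DataFrame({
    "store": ["FL-101", "FL-101"],
    "date": ["2014-08-01", "2014-08-02"],
    "hurricane_warning": [False, True],
})

# The join gives the forecasting model a weather feature per store-day.
training_frame = sales.merge(weather, on=["store", "date"])
print(training_frame)
```

The hard part in practice is not the join; it is reliably gathering, storing and standardizing the external feed so the keys actually line up, which is exactly the information management work described above.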

Q. Why is it difficult for IT leaders to convince business leaders to invest in an information management strategy?

A. In most cases, business leaders don’t see the value in an information management strategy or they haven’t seen value before. Unfortunately this often happens because IT isn’t able to connect the dots between the information management strategy and the outcomes that matter to the business.

Business leaders see value in having control over their business-critical information, being able to access it quickly and to allocate their resources to get any additional information they need. Relinquishing control takes a lot of trust. When IT leaders want to get buy-in from business leaders to invest in an information management strategy they need to be clear about how it will impact business priorities. Data integration, data quality and master data management (MDM) should be built into the budget for predictive or advanced analytics initiatives to ensure the data the business is relying on is clean, consistent and connected.

Q: You liked this quotation from an IT leader at a beer manufacturing company, “We don’t just make beer. We make beer and data. We need to manage our product supply chain and information supply chain equally efficiently.”

A. What I like about that quote is that the IT leader was able to connect the dots between the company’s primary revenue generator and the role data plays in improving organizational performance. That’s something a lot of IT leaders struggle with. IT leaders should always be thinking about the next thing they can do to increase business value with the data they have in house, and with other data the company may not yet be tapping into.

Q. According to a recent survey by Gartner and the Financial Executives Research Foundation, 60% of Chief Financial Officers (CFOs) are investing in analytics and improved decision-making as their #1 IT priority. What’s your advice for IT Leaders who need to get buy-in from the CFO to invest in information management?

A. Read your company’s financial statements, especially the Management Discussion and Analysis section. You’ll learn about the company’s direction, what the stakeholders are looking for, and what the CFO needs to deliver. Offer to get your CFO the information s/he needs to make decisions and to deliver. When you talk to a CFO about investing in information management, focus on the two things that matter most:

  1. Risk mitigation: CFOs know that bad decisions based on bad information can negatively impact revenue, expenses and market value. If you have to caveat all your decisions because you can’t trust the information, or it isn’t current, then you have problems. CFOs need to trust their information. They need to feel confident they can use it to make important financial decisions and deliver accurate reports for compliance.
  2. Opportunity: Once you have mitigated the risk and can trust the data, you can take advantage of predictive analytics. Wal-Mart doesn’t just do forecasting and demand planning. They do “demand shaping.” They use accurate, consistent and connected data to plan events and promotions not just to drive inventory turns, but to optimize inventory and the supply chain process. Some companies in the energy market are using accurate, consistent and connected data for predictive asset maintenance. By preventing unplanned maintenance they are saving millions of dollars, protecting revenue streams, and gaining health and safety benefits.

To do either of these things, you need a solid information management plan to manage clean, consistent and connected information. It takes commitment, but the payoffs can be very significant.

Q. What are the top three business requirements when building an information management and integration strategy?
A: In my experience, IT leaders should focus on:

  1. Business value: A solid information management and integration strategy that has a chance of getting funded must be focused on delivering business value. Otherwise, your strategy will lack clarity and won’t drive priorities. If you focus on business value, it will be much easier to gain organizational buy-in. Get that dollar figure before you start anything. Whether it is risk mitigation, time savings, revenue generation or cost savings, you need to calculate that value to the business and get their buy-in.
  2. Trust: When people know they can trust the information they are getting, they are free to explore new ideas rather than worry about issues in the data itself.
  3. Flexibility: Flexibility should be baked right into the strategy. Business drivers will evolve and change, and you must be able to adapt. One of the most neglected, and I would argue most important, parts of a solid strategy is the ability to make continuous small improvements that may require more effort than a typical maintenance event but don’t create long delays. The business will very much appreciate this. We work with our clients to ensure that this is addressed.

INFAgraphic: The Healthcare Organization of the Future will be Data-Driven



Holistic Data Stewardship

Nine years ago, when I started in the data integration and quality space, data quality was all about algorithms and cleansing technology. Data went in, and the “best” solution was the one that could do the best job of fuzzy matching and clean more data than the other products. Of course, no data quality solution could clean 100% of the data, so “exceptions” were dumped into a file and left as an “exercise for the user.” That usually meant falling back on the data management product of choice when there is nothing else: the spreadsheet. Data goes into a spreadsheet, users remediate the mistakes by hand, and then someone writes an SQL query to push the corrections back into the database. In the end, managing exceptions was a very manual process with little to no governance.
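Made concrete, that legacy remediation loop looked something like the sketch below; the file, table, and column names are all hypothetical:

```python
import csv
import sqlite3

# Exceptions were dumped to a file, fixed by hand in a spreadsheet,
# then pushed back into the database with ad hoc SQL.
conn = sqlite3.connect("crm.db")  # stand-in for the real database
with open("corrected_exceptions.csv", newline="") as f:
    for row in csv.DictReader(f):  # columns: record_id, corrected_email
        conn.execute(
            "UPDATE customers SET email = ? WHERE id = ?",
            (row["corrected_email"], row["record_id"]),
        )
conn.commit()
# No approval step, no audit trail, no way to tell management what
# changed: exactly the governance gap described below.
```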

The problem, of course, is that at many companies data stewardship is not anyone’s day job. If stewards have to spend time checking whether someone else has already corrected an error, getting approval to make a data change, consolidating all the manual changes they made, and then communicating those changes to management, they don’t have much time left to sleep, much less eat. The business of creating quality data just doesn’t get done, or doesn’t get done well. Data quality is a business issue, supported by IT, but the business-facing part of the solution has been missing.

But that is about to change. Informatica already provides the most scalable data quality product for handling the automated portion of the data quality process. And now, in the latest release of Informatica Data Quality 9.6, we have created a new edition, the Data Quality Governance Edition, to fully manage the exception process. This edition provides a completely governed process for remediating data exceptions by business data stewards. It allows organizations to create their own customized process with different levels of review. Additionally, it makes it possible for business users to create their own data quality rules, describing the rules in plain language, with no coding necessary.

Of course, every organization wants to track how it is improving. Informatica Data Quality 9.6 includes embeddable dashboards that show how data quality is improving and positively impacting the business.

Great data isn’t an accident.  Great data happens by design.  And for the first time, data cleansing has been combined with a holistic data stewardship process, allowing business and IT to collaborate to create quality data that supports critical business processes.

 

 


Business Agility Depends on Data Integration Agility and Your Skills

Over the last 40 years, data has become increasingly distributed. It used to all sit on storage connected to a mainframe, and the application of computing power to business problems was limited by the availability of CPU, memory, network and disk. Those limitations are no longer the big inhibitors. Data fragmentation is the new inhibitor to business agility. Data is now generated from distributed sources not just within a corporation, but from business partners, from device sensors and from consumers Facebook-ing and tweeting away on the internet.

So to solve any interesting business problem in today’s fragmented data world, you have to pull data together from a wide variety of sources. That means business agility depends 100% on data integration agility. But how do you deliver that agility in a way that is not just fast, but reliable, and delivers high quality data?

First, to achieve data integration agility, you need to move from a traditional waterfall development process to an agile development process.

Second, if you need reliability, you have to think about how you start treating your data integration process as a critical business process.  That means thinking about how you will make your integration processes highly available.  It also means you need to monitor and validate your operational data integration processes on an ongoing basis.   The good news is that the capabilities you need for data validation as well as operational monitoring and alerting for your data integration process are now built into Informatica’s newest PowerCenter Edition, PowerCenter Premium Edition.

Lastly, the days when you could just move data from A to B without including a data quality process are over. Great data doesn’t happen by accident; it happens by design. That means building data quality directly into your data integration process.
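Here is a minimal sketch of what building quality checks into the integration flow itself can look like; the validation rules and record fields are invented for the example:

```python
def validate(record: dict) -> list:
    """Hypothetical rules; a real process would load these from configuration."""
    errors = []
    if not record.get("customer_id"):
        errors.append("missing customer_id")
    if record.get("email", "").count("@") != 1:
        errors.append("malformed email")
    return errors

def integrate(records, load, quarantine):
    for record in records:
        errors = validate(record)
        if errors:
            quarantine(record, errors)  # route to stewards, never load bad rows
        else:
            load(record)                # only validated data reaches the target

integrate(
    [{"customer_id": "C1", "email": "a@b.com"}, {"email": "not-an-email"}],
    load=lambda r: print("loaded:", r),
    quarantine=lambda r, e: print("quarantined:", r, e),
)
```

The design point is that validation runs inside the pipeline, so bad records are quarantined for remediation instead of silently landing in the target system.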

Great businesses depend on great data: data that is delivered on time, with confidence and with high quality. So think about how your understanding of data integration can make your career. Great businesses depend on people like you who have the skills to make a difference. As a data professional, the time has never been better for you to contribute to the greatness of your organization. You have the opportunity to make an impact, because your skills and your understanding of data integration have never been more critical.

 
