Tag Archives: Data Quality

Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data

“Trying to improve the quality of asset data when you don’t have a solid data management infrastructure in place is like trying to save a sinking boat with a bailing bucket,” explained Dean Balog, a senior principal consultant at Noah Consulting, in the webinar Attention Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data.

Dean Balog from Noah Consulting explains how to improve the quality of mission-critical asset data for asset management / equipment maintenance and regulatory reporting, such as rate case submissions.

Dean has 15 years of experience in information management in the utilities industry. In this interview, Dean and I discuss the top issues facing utility executives and how to improve the quality of mission-critical asset data for asset management / equipment maintenance and regulatory reporting, such as rate case submissions.

Q: Dean, what are the top issues facing utility executives?
A: The first issue is asset management / equipment maintenance. Knowing where to invest precious dollars is critical. Utility executives are engaged in a constant tug of war between two competing priorities: replacing aging infrastructure and regular maintenance.

Q. How are utility executives determining that balance?
A. You need to start with facts – the real costs and reliability information for each asset in your infrastructure. Without it, you are guessing. Basically, it is a data problem. Utility executives should ask themselves these questions:

  • Do we have the ability to capture and combine cost and reliability information from multiple sources?  Is it granular enough to be useful?
  • Do we know the maintenance costs of eight-year-old breakers versus three-year-old breakers?
  • Do our meters actually fail around the average lifespan? For this example, let us say that is five years. Rather than clustering around that average, do 30% of our meters fail in the first year while the rest last eight years? Those three extra years of life can certainly help the bottom line (see the arithmetic sketched after this list).
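
To see why the distribution matters more than the headline average, here is a minimal sketch of the arithmetic. All figures are hypothetical, echoing the question above; none come from a real meter fleet.

```python
# Hypothetical figures only, taken from the question above.
fleet_size = 100_000          # assumed number of meters in service
early_share = 0.30            # 30% of meters fail in their first year
early_life, late_life = 1, 8  # years of service for each group

blended_mean = early_share * early_life + (1 - early_share) * late_life
print(f"Blended average lifespan: {blended_mean:.1f} years")  # ~5.9 years

# Steady-state replacement demand: each group turns over at 1/lifespan per year.
naive = fleet_size / 5  # planning against a flat five-year average
actual = fleet_size * (early_share / early_life + (1 - early_share) / late_life)
print(f"Naive plan: {naive:,.0f} meters/yr; distribution-aware: {actual:,.0f} meters/yr")
```

With these illustrative numbers, planning against the flat average would understate annual replacement demand by nearly half, which is exactly the kind of insight that granular cost and reliability data makes possible.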

Knowing your data makes all the difference. The right capital investment strategy requires combining performance, reliability, and cost data.

Quotations about data challenges faced by utility companies

Q. Why is it difficult for utility executives to understand the real costs and reliability of assets?
A. I know this does not come as a shock, but most companies do not trust their data. Asset data is often inaccurate, inconsistent, and disconnected. Even the most basic data may not be available. For example, manufacture dates on breakers should be filled in, but often they are not. If fewer than 50% of your breakers have manufacture dates, how can you build a preventive maintenance program? You do not even know what to address first!

A traditional approach to solving this data problem is to do a big data cleanup. You clean the data, and then before you know it, errors creep back in, and the trust in the data you have worked so hard to establish is lost.

I like to illustrate the pain of this issue by using the sinking boat analogy. Data cleanup is like bailing out the water collecting in the bottom of the boat. You think you are solving the problem but more water still seeps into the boat. You cannot stop bailing or you will sink. What you need to do is fix the leaks, and then bail out the boat. But, if you do not lift up your head from bailing long enough to see the leaks and make the right investments, you are fighting a losing battle.

Q. What can utility executives do to improve the quality of asset data?
A. First of all, you need to develop a data governance framework. Going back to the analogy, a data governance framework gives you the structure to find the leaks, fix the leaks, and monitor how much of the water has been bailed out. If the water level is still rising, you have not fixed all the leaks. But having a data governance framework is not the be-all and end-all.

You also need to appoint data stewards to be accountable for establishing and maintaining high quality asset data. The job of a data steward would be easy if there was only one system where all asset data resided. But the fact of the matter is that asset data is fragmented – scattered across multiple systems. Data stewards have a huge responsibility and they need to be supported by a solid data management infrastructure to ease the burden of managing business-critical asset information.

Webinar: Attention Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data

Master Data Management (MDM) ensures business-critical asset data is consistent everywhere by pulling together data that is scattered across multiple applications. It manages and masters it in a central location on a continuous basis and shares it with any applications that need that data. MDM provides a user interface and workflow for data stewards to manage the tangled web of names and IDs these assets are known by across systems. It also gives utilities a disciplined approach to manage important relationships between the asset data, such as an asset’s performance reliability and its cost.
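
Informatica MDM is far richer than any snippet can show, but a minimal sketch of the cross-reference pattern described above, with all names, systems and IDs invented for illustration, might look like this:

```python
from dataclasses import dataclass, field

@dataclass
class MasterAsset:
    """One golden record per physical asset, plus its source-system aliases."""
    master_id: str
    name: str
    xrefs: dict[str, str] = field(default_factory=dict)  # source system -> local ID

hub: dict[str, MasterAsset] = {}

def register(master_id: str, name: str, source: str, source_id: str) -> None:
    """Attach a source record to the golden record, creating it if needed."""
    asset = hub.setdefault(master_id, MasterAsset(master_id, name))
    asset.xrefs[source] = source_id

# The same breaker is known by different IDs in different applications.
register("A-001", "12kV feeder breaker", "EAM", "BRKR-0042")
register("A-001", "12kV feeder breaker", "GIS", "EQ-9913")
print(hub["A-001"].xrefs)  # {'EAM': 'BRKR-0042', 'GIS': 'EQ-9913'}
```

The point of the central hub is that downstream applications ask one place for the asset and its aliases, instead of each application keeping its own partial copy.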

Q. Any other pressing issues facing utilities?
A. Yes. Another big issue is tightening regulations that consume investment dollars and become key inputs into rate case submissions and defenses. One of the complicating factors is the number of regulations is not only increasing, but the regulators are also requiring faster implementation times than ever before. So, utilities cannot just do what they have done in the past: throw more people at the problem in the short-term and resolve to fix it later by automating it “when things slow down.” That day never comes.

Q. How can utilities deal with these regulatory pressures?
A. Utilities need a new approach to deal with regulations. Start with the assumption that all data is fair game for regulators. All data must be accessible. You need to be able to report on it, not only to comply with regulations, but for competitive advantage. This requires the high quality asset information we talked about earlier, and an analytical application to:

  • Perform what-if analyses for your asset investment program;
  • Develop regulatory compliance or environmental reports quickly, because the hard work (integrating the data within your MDM program) has already been done; and
  • Get access to granular, observed reliability and cost information using your own utility’s data – not benchmark data that is already a couple of years old and highly summarized.

Q. What is your advice for utility company executives?
A. If you are the one responsible for signing off on regulatory reports and you do not look good in an orange jumpsuit, you need to invest in a plan that includes people, process, and technology to support regulatory reporting and asset management / equipment maintenance.

  • People – Data stewards have clear accountability for the quality of asset data.
  • Process – Data governance is your game plan.
  • Technology – A solid data management infrastructure consisting of data integration, data quality, and master data management is your means.

If you are responsible for asset management / equipment maintenance or regulatory reporting, particularly rate case submissions, check out this webinar, Attention Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data.

 Our panel of utility data experts:

  • Reveal the five toughest business challenges facing utility industry executives;
  • Explain how bad asset data could be costing you millions of dollars in operating costs;
  • Share three best practices for optimizing asset management / equipment maintenance and regulatory reporting with accurate, consistent, and connected asset information; and
  • Show you how to implement these best practices with a demonstration.

Is your social media investment hampered by your “data poverty”?

Why is nobody measuring the “bag of money” portion? Source: emediate.com

Recently, I talked with a company that had allocated millions of dollars for paid social media promotion. Their hope was that a massive investment in Twitter and Facebook campaigns would lead to “more eyeballs” for their online gambling sites. Although they had internal social media expertise, they lacked a comprehensive partnership with IT. In addition, they lacked a properly funded policy vision. As a result, when asked how much of their socially-driven traffic resulted in actual sales, their answer was a resounding “No Idea.” I attribute this to “data poverty.”

There is a key reason that they were unable to quantify the ROI of their promotion: Their business model is, by design, “data poor.” Although a great deal of customer data was available to them, they didn’t elect to use it. They could have used available data to identify “known players” as well as individuals with “playing potential.” There was no law prohibiting them from acquiring this data. However, they were uncomfortable obtaining a higher degree of attribution beyond name, address, e-mail and age. They feared that customers would view them as a commercial counterpart to the NSA. As a result, key data elements like net worth, lifetime value, credit risk, location, marital status, employment status, number of friends/followers and property value were not considered when targeting potential users on social media. So, though the Social Media team considered this granular targeting to be a “dream-come-true,” others within the organization considered it to be too “1984.”

In addition to a hesitation to leverage available data, they were also limited by their dependence on a 3rd party IT provider. This lack of self-sufficiency created data quality issues, which limited their productivity. Ultimately, this dependency prevented them from capitalizing on new market opportunities in a timely way.

It should have been possible for them to craft a multi-channel approach. They ought to have been able to serve up promoted Tweets, banner ads and mobile application ads. They should have been able to track the click-through, IP and timestamp information from each one. They should have been able to make a BCR for redeeming a promotional offer at a retail location.

Strategic channel allocation would certainly have triggered additional sales. In fact, when we applied click-through, CAC and conversion benchmarks to their available transactional information, we modeled over $8 million in additional sales and $3 million in customer acquisition cost savings. In addition to the financial benefits, strategic channel allocation would have generated more data (and resulting insights) about their prospects and customers than they had when they began.
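
As a rough illustration of that kind of modeling, here is a sketch of a simple channel funnel. Every rate and dollar figure below is an invented placeholder, not the client’s data or our benchmarks:

```python
# Hypothetical funnel model; all figures are invented placeholders.
impressions  = 50_000_000  # paid social impressions purchased
ctr          = 0.008       # click-through rate
conversion   = 0.02        # click -> acquired customer
avg_value    = 180.0       # assumed first-year revenue per acquired customer
blended_cac  = 45.0        # acquisition cost with untargeted campaigns
targeted_cac = 30.0        # acquisition cost with data-driven targeting

acquired = impressions * ctr * conversion
print(f"Customers acquired: {acquired:,.0f}")
print(f"Incremental revenue: ${acquired * avg_value:,.0f}")
print(f"Acquisition cost savings: ${acquired * (blended_cac - targeted_cac):,.0f}")
```

Swap in real click-through, CAC and conversion benchmarks per channel and the same arithmetic yields the kind of revenue and savings estimates described above.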

But, because they were hesitant to use all the data available to them, they failed to capitalize on their opportunities. Don’t let this happen to you. Make a strategic policy change to become a data-driven company.

Beyond the revenue gains of targeted social marketing, there are other reasons to become a data-driven company. Clean data can help you correctly identify ideal channel partners. This company failed to use sufficient data to properly select and retain their partners. Hundreds of channel partners were removed without proper, data-driven confirmation. Reasons for this removal included things like “death of owner,” “fire,” and “unknown.” To ensure more thorough vetting, the company could have used data points like the owner’s age, past business endeavors and legal proceedings. They could also have studied location-fenced attributes like footfall, click-throughs and sales cannibalization risk. In fact, when we modeled the potential overall annual savings, across all business scenarios, of becoming a data-driven company, the amount approached $40 million.

Would a $40 million savings inspire you to invest in your data? Would that amount be enough to motivate you to acquire, standardize, deduplicate, link, hierarchically structure and enrich YOUR data? It’s a no-brainer. But it requires a policy shift to make your data work for you. Without this, it’s all just “potential.”

Do you have stories about companies that recently switched from traditional operations to smarter, data-driven operations? If so, I’d love to hear from you.

Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer  and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.


The Surprising Link Between Hurricanes and Strawberry Pop-Tarts: Brought to you by Clean, Consistent and Connected Data

What do you think Wal-Mart’s best-seller is right before a hurricane? If you guessed water like I did, you’d be wrong. According to this New York Times article, “What Wal-Mart Knows About Customers’ Habits” the retailer sells 7X more strawberry Pop-Tarts in Florida right before a hurricane than any other time. Armed with predictive analytics and a solid information management foundation, the team stocks up on strawberry Pop-Tarts to make sure they have enough supply to meet demand.

Andrew Donaher advises IT leaders to ask business leaders how much bad data is costing them.

I learned this fun fact from Andrew Donaher, Director of Information Management Strategy at Groundswell Group, a consulting firm based in western Canada that specializes in information management services. In this interview, Andy and I discuss how IT leaders can increase the value of data to the business, how some IT leaders are collaborating with business leaders to improve predictive analytics, and how to talk to business leaders, such as the CFO, about investing in an information management strategy.

Q. Andy, what can IT leaders do to increase the value of data to drive business value?

A. Simply put, each business leader in a company needs to focus on achieving their goals. The first step IT leaders should take is to engage with each business leader to understand their long and short-term goals and ask some key questions, such as:

  • What type of information is critical to achieving their goals?
  • Do they have the information they need to make the next decision or take the next best action?
  • Is all the data they need in house? If not, where is it?
  • What challenges are they facing when it comes to their data?
  • How much time are people spending trying to pull together the information they need?
  • How much time are people spending fixing bad data?
  • How much is this costing them?
  • What opportunities would exist if they had all the information they needed and could trust it?
You need a solid information management strategy to make the shift from looking into the rear-view mirror to realizing the potential business value of predictive analytics.

If you want to get the business value you’re expecting by shifting from rear-view mirror style reporting to predictive analytics, you need to use clean, consistent and connected data.

Q. How are IT leaders collaborating with business partners to improve predictive analytics?

A. Wal-Mart’s IT team collaborated with the business to improve the forecasting and demand planning process. Once they found out what was important, IT figured out how to gather, store and seamlessly integrate external data like historical weather and future weather forecasts into the process. This enabled the business to get more valuable insights, tailor product selections at particular stores, and generate more revenue.
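
The integration pattern is simple to sketch. Here is a minimal, hypothetical version using pandas; the column names and data are invented, not Wal-Mart’s actual schema:

```python
import pandas as pd

# Invented demand history and external weather data.
sales = pd.DataFrame({
    "store_id": [101, 101, 102],
    "date": pd.to_datetime(["2013-08-01", "2013-08-02", "2013-08-01"]),
    "units_poptarts": [40, 310, 55],
})
weather = pd.DataFrame({
    "store_id": [101, 101, 102],
    "date": pd.to_datetime(["2013-08-01", "2013-08-02", "2013-08-01"]),
    "hurricane_warning": [False, True, False],
})

# Integrating external data = joining it onto the demand history so the
# forecasting model can learn event-driven demand spikes.
training = sales.merge(weather, on=["store_id", "date"], how="left")
print(training.groupby("hurricane_warning")["units_poptarts"].mean())
```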

Q. Why is it difficult for IT leaders to convince business leaders to invest in an information management strategy?

A. In most cases, business leaders don’t see the value in an information management strategy or they haven’t seen value before. Unfortunately this often happens because IT isn’t able to connect the dots between the information management strategy and the outcomes that matter to the business.

Business leaders see value in having control over their business-critical information, being able to access it quickly and to allocate their resources to get any additional information they need. Relinquishing control takes a lot of trust. When IT leaders want to get buy-in from business leaders to invest in an information management strategy they need to be clear about how it will impact business priorities. Data integration, data quality and master data management (MDM) should be built into the budget for predictive or advanced analytics initiatives to ensure the data the business is relying on is clean, consistent and connected.

Q: You liked this quotation from an IT leader at a beer manufacturing company, “We don’t just make beer. We make beer and data. We need to manage our product supply chain and information supply chain equally efficiently.”

A. What I like about that quote is that the IT leader was able to connect the dots between the primary revenue generator for the company and the role data plays in improving organizational performance. That’s something a lot of IT leaders struggle with. IT leaders should always be thinking about the next thing they can do to increase business value with the data they have in house, and with other data the company may not yet be tapping into.

Q. According to a recent survey by Gartner and the Financial Executives Research Foundation, 60% of Chief Financial Officers (CFOs) are investing in analytics and improved decision-making as their #1 IT priority. What’s your advice for IT Leaders who need to get buy-in from the CFO to invest in information management?

A. Read your company’s financial statements, especially the Management Discussion and Analysis section. You’ll learn about the company’s direction, what the stakeholders are looking for, and what the CFO needs to deliver. Offer to get your CFO the information s/he needs to make decisions and to deliver. When you talk to a CFO about investing in information management, focus on the two things that matter most:

  1. Risk mitigation: CFOs know that bad decisions based on bad information can negatively impact revenue, expenses and market value. If you have to caveat all your decisions because you can’t trust the information, or it isn’t current, then you have problems. CFOs need to trust their information. They need to feel confident they can use it to make important financial decisions and deliver accurate reports for compliance.
  2. Opportunity: Once you have mitigated the risk and can trust the data, you can take advantage of predictive analytics. Wal-Mart doesn’t just do forecasting and demand planning. They do “demand shaping.” They use accurate, consistent and connected data to plan events and promotions not just to drive inventory turns, but to optimize inventory and the supply chain process. Some companies in the energy market are using accurate, consistent and connected data for predictive asset maintenance. By preventing unplanned maintenance they are saving millions of dollars, protecting revenue streams, and gaining health and safety benefits.

To do either of these things you need a solid information management plan to manage clean, consistent and connected information. It takes commitment, but the payoffs can be very significant.

Q. What are the top three business requirements when building an information management and integration strategy?
A: In my experience, IT leaders should focus on:

  1. Business value: A solid information management and integration strategy that has a chance of getting funded must be focused on delivering business value. Otherwise, your strategy will lack clarity and won’t drive priorities. If you focus on business value, it will be much easier to gain organizational buy-in. Get that dollar figure before you start anything. Whether it is risk mitigation, time savings, revenue generation or cost savings, you need to calculate that value to the business and get their buy-in.
  2. Trust: When people know they can trust the information they are getting, it frees them to explore new ideas without worrying about issues in the data itself.
  3. Flexibility: Flexibility should be baked right into the strategy. Business drivers will evolve and change, and you must be able to adapt. One of the most neglected, and I would argue most important, parts of a solid strategy is the ability to make continuous small improvements that may require more effort than a typical maintenance event, but don’t create long delays. The business will very much appreciate this. We work with our clients to ensure that this is addressed.

Holistic Data Stewardship

Nine years ago, when I started in the data integration and quality space, data quality was all about algorithms and cleansing technology. Data went in, and the “best” solution was the one that could do the best job of fuzzy matching the data and clean more of it than the other products. Of course, no data quality solution could clean 100% of the data, so “exceptions” were dumped into a file and left as an “exercise for the user.” This usually meant using the data management product of choice when there is nothing else: the data went into a spreadsheet, users remediated the mistakes by hand, and then someone wrote a SQL query to push the corrections back into the database. In the end, managing exceptions was a very manual process with little to no governance.

The problem with this, of course, is that for many companies data stewardship is not anyone’s day job. If stewards have to spend time checking whether someone else has corrected an error in the data, getting approval to make a data change, consolidating all the manual changes they made and then communicating those changes to management, they don’t have much time left to sleep, much less eat. The business of creating quality data just doesn’t get done, or doesn’t get done well. Data quality is a business issue, supported by IT, but the business-facing part of the solution has been missing.

But that is about to change. Informatica already provides the most scalable data quality product for handling the automated portion of the data quality process. And now, in the latest release of Informatica Data Quality 9.6, we have created a new edition, the Data Quality Governance Edition, to fully manage the exception process. This edition provides a completely governed process for business data stewards to remediate data exceptions. It allows organizations to create their own customized process with different levels of review. Additionally, it makes it possible for business users to create their own data quality rules, describing the rules in plain language: no coding necessary.

And of course, every organization wants to be able to track how it is improving. Informatica Data Quality 9.6 includes embeddable dashboards that show how data quality is improving and impacting the business in a positive way.

Great data isn’t an accident. Great data happens by design. And for the first time, data cleansing has been combined with a holistic data stewardship process, allowing business and IT to collaborate to create quality data that supports critical business processes.

Business Agility Depends on Data Integration Agility and Your Skills

Over the last 40 years, data has become increasingly distributed. It used to all sit on storage connected to a mainframe, and the application of computing power to business problems was limited by the availability of CPU, memory, network and disk. Those limitations are no longer the big inhibitors. Data fragmentation is the new inhibitor to business agility. Data is now generated from distributed sources not just within a corporation, but from business partners, from device sensors and from consumers Facebook-ing and tweeting away on the internet.

So to solve any interesting business problem in today’s fragmented data world, you have to pull data together from a wide variety of sources. That means business agility depends 100% on data integration agility. But how do you deliver that agility in a way that is not just fast, but reliable, and delivers high quality data?

First, to achieve data integration agility, you need to move from a traditional waterfall development process to an agile development process.

Second, if you need reliability, you have to think about how you start treating your data integration process as a critical business process.  That means thinking about how you will make your integration processes highly available.  It also means you need to monitor and validate your operational data integration processes on an ongoing basis.   The good news is that the capabilities you need for data validation as well as operational monitoring and alerting for your data integration process are now built into Informatica’s newest PowerCenter Edition, PowerCenter Premium Edition.

Lastly, the days when you could just move data from A to B without including a data quality process are over. Great data doesn’t happen by accident; it happens by design. And that means you also have to build data quality directly into your data integration process.

Great businesses depend on great data, and great data means data that is delivered on time, with confidence and with high quality. So think about how your understanding of data integration can make your career. As a data professional, the time has never been better for you to contribute to the greatness of your organization. You have the opportunity to make a difference, because your skills and your understanding of data integration have never been more critical.

Hospitality Execs: Invest in Great Customer Information to Support A Customer-Obsessed Culture

I love exploring new places. I’ve had exceptional experiences at the W in Hong Kong, El Dorado Royale in the Riviera Maya and Ventana Inn in Big Sur. I belong to almost every loyalty program under the sun, but not all hospitality companies are capitalizing on the potential of my customer information. Imagine if employees had access to it so they could personalize their interactions with me and send me marketing offers that appeal to my interests.

Do I have high expectations? Yes. But so do many travelers. This puts pressure on marketing and sales executives who want to compete to win. According to Deloitte’s report, “Hospitality 2015: Game changers or spectators?,” hospitality companies need to adapt to meet consumers’ increasing expectations to know their preferences and tastes and to customize packages that suit individual needs.

Jeff Klagenberg helps companies use data as a strategic asset and get the most value out of it.

In this interview, Jeff Klagenberg, senior principal at Myers-Holum, explains how one of the largest, most customer-focused companies in the hospitality industry is investing in better customer, product, and asset information. Why? To personalize customer interactions, bundle appealing promotion packages and personalize marketing offers across channels.

Q: What are the company’s goals?
A: The executive team at one of the world’s leading providers of family travel and leisure experiences is focused on achieving excellence in quality and guest services. They generate revenues from the sales of room nights at hotels, food and beverages, merchandise, admissions and vacation club properties. The executive team believes their future success depends on stronger execution based on better measurement and a better understanding of customers.

Q: What role does customer, product and asset information play in achieving these goals?
A: Without the highest quality business-critical data, how can employees continually improve customer interactions? How can they bundle appealing promotional packages or personalize marketing offers? How can they accurately measure the impact of sales and marketing efforts? The team recognized the powerful role of high quality information in their pursuit of excellence.

Q: What are they doing to improve the quality of this business-critical information?
A: To get the most value out of their data and deliver the highest quality information to business and analytical applications, they knew they needed to invest in an integrated information management infrastructure to support their data governance process. Now they use the Informatica Total Customer Relationship Solution, which combines data integration, data quality, and master data management (MDM). It pulls together fragmented customer information, product information, and asset information scattered across hundreds of applications in their global operations into one central, trusted location where it can be managed and shared with analytical and operational applications on an ongoing basis.

Many marketers overlook the importance of using high quality customer information in their investments in personalization.

Q: How will this impact marketing and sales?
A: With clean, consistent and connected customer information, product information, and asset information in the company’s applications, they are optimizing marketing, sales and customer service processes. They get limitless insights into who their customers are and their valuable relationships, including households, corporate hierarchies and influencer networks. They see which products and services customers have purchased in the past, their preferences and tastes. High quality information enables the marketing and sales team to personalize customer interactions across touch points, bundle appealing promotional packages, and personalize marketing offers across channels. They have a better understanding of which marketing, advertising and promotional programs work and which don’t.

Q: What role did the marketing and sales leaders play in this initiative?
A: The marketing leaders and sales leaders played a key role in getting this initiative off the ground. With an integrated information management infrastructure in place, they’ll benefit from better integration between business-critical master data about customers, products and assets and transaction data.

Q. How will this help them gain customer insights from “Big Data”?
A. We helped the business leaders understand that getting customer insights from “Big Data” such as weblogs, call logs, social and mobile data requires a strong backbone of integrated business-critical data. By investing in a data-centric approach, they future-proofed their business. They are ready to incorporate any type of data they may want to analyze, such as interaction data. A key realization was that there is no such thing as “Small Data.” The future is about getting every bit of understanding out of every data source.

Q: What advice do you have for hospitality industry executives?
A: Ask yourself, “Which of our strategic initiatives can be achieved with inaccurate, inconsistent and disconnected information?” Most executives know that the business-critical data in their applications, used by employees across the globe, is not the highest quality. But they are shocked to learn how much this is costing the company. My advice is talk to IT about the current state of your customer, product and asset information. Find out if it is holding you back from achieving your strategic initiatives.

Also, many business executives are excited about the prospect of analyzing “Big Data” to gain revenue-generating insights about customers. But the business-critical data about customers, products and assets is often in terrible shape. To use an analogy: you can look at a wheat field and imagine the bread it will yield, but if you don’t separate the grain from the chaff, you’ll be disappointed with the outcome. If you are working on a Big Data initiative, don’t forget to invest in the integrated information management infrastructure required to give you the clean, consistent and connected information you need to achieve great things.


All Aboard! Engineer Your Mission-Critical Asset Information So It Doesn’t Go Off The Rails

Do you know what year the first steam engine locomotive was invented? 1804. It traveled 9 miles in two hours. Now, you and I would be pretty upset if we boarded a train and it took 2 hours to go 9 miles. But, 200 years ago, this was a huge innovation that led to the modern-day train and railway.

Tremendous Growth In Demand for Rail Travel Puts Pressure on Rail Infrastructure
Today, Britain is experiencing tremendous growth in demand for rail travel. One million more trains and 500 million more passengers travel by train than just 5 years ago. Over the next 30 years passenger demand for rail will more than double and freight demand is expected to go up by 140%. This puts tremendous pressure on the rail infrastructure.

Is it hard to make sense of your mission-critical asset information because it’s scattered across multiple applications?

Network Rail is in the modern-day rail business. Employees work day and night running, maintaining and updating Britain’s rail infrastructure, including millions of assets, such as 22,000 miles of track, 6,500 crossings, 43,000 bridges, viaducts and tunnels. Improving the rail network provides faster, more frequent and more reliable journeys between Britain’s towns and cities.

Network Rail is investing more in the rail infrastructure than in Victorian times. In the last six months, they spent about $25 million a day! In a recent news release, Patrick Bucher, group finance director, said, “We continue to invest record amounts to deliver a bigger, better railway for passengers and businesses across Britain. We are also driving down the cost of running Britain’s railway to help make it more affordable in the years ahead.”

Employees Need to Trust Asset Information to Pinpoint and Fix Problems Quickly
To pinpoint and fix problems quickly, keep operating costs low and maintain a strong safety record, Network Rail’s employees need to trust their mission-critical asset information to answer questions such as:

  1. What is the problem?
  2. Where is it?
  3. What equipment, tools and skills are needed to fix it?
  4. Who is closest to the problem that could fix it?

Difficult to Make Sense of Asset Information Scattered across Applications
Similar to many companies their size, Network Rail’s mission-critical asset information was scattered across many applications, which made it difficult for employees to make sense of asset information and the interaction between assets.

The asset information team recognized the limitations of employees depending on an application-centric view of their business. To operate more efficiently and effectively, they needed clean asset information, consistent asset information, and connected asset information.

Investing in Rail Infrastructure AND the Information Infrastructure to Support It
Network Rail now uses a combination of data integration, data quality, and master data management (MDM) to manage their mission-critical asset information in a central location on an ongoing basis, to:

  1. make sense of asset information,
  2. understand the relationships between assets, and
  3. track changes to asset information (a minimal sketch of the last two follows the list).
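
Here is that minimal sketch. It is a toy illustration of relationship tracking and change history, not Network Rail’s actual asset model; all IDs and attributes are invented:

```python
from datetime import date

# Invented assets and a relationship between them.
assets = {
    "TRK-0451": {"type": "track section", "miles": 1.2},
    "BRG-0077": {"type": "bridge"},
}
relations = [("BRG-0077", "carries", "TRK-0451")]

# Change history: every update appends the old and new values, so employees
# can see how an asset's record evolved over time.
history = []

def update(asset_id: str, attr: str, value, when: date) -> None:
    history.append((when, asset_id, attr, assets[asset_id].get(attr), value))
    assets[asset_id][attr] = value

update("TRK-0451", "condition", "worn", date(2014, 3, 1))
update("TRK-0451", "condition", "renewed", date(2014, 9, 15))

for a, rel, b in relations:
    print(f"{a} {rel} {b}")  # which bridge carries which track section
for entry in history:
    print(entry)             # full audit trail of changes
```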

In a news release, Patrick Bossert, director of Network Rail’s Asset Information services business, said, “With more accurate and reliable information about assets and their condition our team can make better business decisions, enable innovation in our asset management policy, planning and execution, and improve rail-system-wide investment decisions that benefit the rail industry as a whole.”

If you work for a company that revolves around mission-critical asset information, ask yourself these questions:

  1. Can our employees make sense of our asset information?
  2. Can they easily see relationships between assets and how they interact?
  3. Can they see the history of changes to asset information over time?

Or are they limited by an application-centric view of the business because asset information is scattered across multiple systems?

Have a similar story about how you are managing your mission-critical asset information? Please share it in the comments below.


Sunshine Act Spotlights Requirement for Accurate Physician Information

The Physician Payments Sunshine Act shines a spotlight on the disorganized state of physician information at most pharmaceutical and medical device manufacturing companies: scattered across systems, and often incomplete, inaccurate and inconsistent.

According to the recent Wall Street Journal article Doctors Face New Scrutiny over Gifts, “Drug companies collectively pay hundreds of millions of dollars in fees and gifts to doctors every year. In 2012, Pfizer Inc., the biggest drug maker by sales, paid $173.2 million to U.S. health-care professionals.”

The Risks of Creating Reports with Inaccurate Physician Information


Failure to comply with the federal Sunshine Act opens companies up to damaging relationships with physicians they’ve spent years cultivating.

There are serious risks of filing inaccurate reports. Just imagine dealing with:

  • An angry call from a physician who received a $25 meal that was inaccurately reported as $250, or who reportedly received a gift that actually went to someone with a similar name.
  • Hefty fines and increased scrutiny from the Centers for Medicare and Medicaid Services (CMS). Fines range from $1,000 to $10,000 for each transaction, with a maximum penalty of $1.15 million.
  • Negative media attention. Reports will be available for anyone to access on a publicly accessible website.

How prepared are manufacturers to track and report physician payment information?

One of the major obstacles is getting a complete picture of the total payments made to one physician. Manufacturers need to know if Dr. Sriram Mennon and Dr. Sri Menon are one and the same.

On top of that, they need to understand the complicated connections between Dr. Sriram Menon, sales representatives’ expense report spreadsheets (T&E), marketing and R&D expenses, event data, and accounts payable data.
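
A toy sketch shows why resolving those name variants has to come before aggregation. The records, mapping and amounts below are invented; in a real solution the master-ID mapping comes from identity resolution and MDM software rather than being hand-built:

```python
from collections import defaultdict

# Invented records standing in for T&E, marketing and accounts payable feeds.
payments = [
    {"source": "T&E",       "physician": "Dr. Sriram Menon", "amount": 25.00},
    {"source": "Marketing", "physician": "Dr. Sri Menon",    "amount": 1200.00},
    {"source": "AP",        "physician": "Menon, Sriram MD", "amount": 500.00},
]

# Hand-built here for illustration; normally produced by matching software.
master_ids = {
    "Dr. Sriram Menon": "PHY-001",
    "Dr. Sri Menon": "PHY-001",
    "Menon, Sriram MD": "PHY-001",
}

totals = defaultdict(float)
for p in payments:
    totals[master_ids[p["physician"]]] += p["amount"]

print(dict(totals))  # {'PHY-001': 1725.0} -- one reportable total, not three partials
```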

3 Steps to Ensure Physician Information is Accurate

In recent years, some pharmaceutical manufacturers and medical device manufacturers were required to respond to “Sunshine Act” type laws in states like California and Massachusetts. To simplify, automate and ensure physician payment reports are filed correctly and on time, they use an Aggregate Spend Repository or Physician Spend Management solution.

They also use these solutions to proactively track and review physician payments on a regular basis to ensure mandated thresholds are met before reports are due. Aggregate Spend Repository and Physician Spend Management solutions rely on a foundation of  data integration, data quality, and master data management (MDM) software to better manage physician information.

For those manufacturers who want to avoid the risk of losing valuable physician relationships, paying hefty fines, and receiving scrutiny from CMS and negative media attention, here are three steps to ensure accurate physician information:

  1. Bring all your scattered physician information, including identifiers, addresses and specialties into a central place to fix incorrect, missing or inconsistent information and uniquely identify each physician.
  2. Identify connections between physicians and the hospitals and clinics where they work to help aggregate accurate payment information for each physician.
  3. Standardize transaction information so it’s easy to identify the purpose of payments and related products and link transaction information to physician information.

Physicians Will Review Reports for Accuracy in January 2014

In January 2014, after physicians review the federally mandated financial disclosures, they may question the accuracy of reported payments. Within two months, manufacturers will need to fix any discrepancies and file their Sunshine Act reports, which will become part of a permanent archive. Time is precious for those companies that haven’t yet built an Aggregate Spend Repository or Physician Spend Management solution to drive their Sunshine Act compliance reports.

If you work for one of the pharmaceutical or medical device manufacturing companies already using an Aggregate Spend Repository or Physician Spend Management solution, please share your tips and tricks with others who are behind.

Tick tock, tick tock….

Garbage In, Treasure Out: Real-World Use Cases

Last week I described how Informatica Identity Resolution (IIR) can be used to match data from different lists or databases even when the data includes typos, translation mistakes, transcription errors, invalid abbreviations, and other errors. IIR has a wide range of use cases. Here are a few.
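
IIR’s matching algorithms are proprietary and far more sophisticated, but the basic idea of scoring candidate records despite typos can be sketched with Python’s standard-library difflib:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Crude normalized similarity; real identity resolution also handles
    phonetics, transliteration, abbreviations and field-aware weighting."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

query = "John Smith, 42 Oxford Street"
candidates = ["Jon Smyth, 42 Oxfrd St", "Jane Smith, 7 Elm Road"]

for c in candidates:
    print(f"{c!r}: {similarity(query, c):.2f}")
# The typo-laden first record still scores well above the unrelated second one.
```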


What’s worse, a disaster or your company’s delayed response?

Just in time for Halloween, I’m sharing a scary story. Warning: this is a true story. You may wonder:

  • Could this happen to me?
  • Can this situation be avoided?
  • How can I prevent this from happening to me?

Last summer, the worst wildfire in Colorado history burned hundreds of acres, destroyed 360 homes, killed two people and forced 38,000 people to evacuate the area.

When disaster strikes, how long will it take your organization to identify, contact and communicate with employees who need to be evacuated or deployed elsewhere? It took one company 2 days. Photo credit: Helen H. Richardson, The Denver Post

Unfortunately, it was during the Colorado wildfire that a large integrated healthcare provider with hospitals, doctors, healthcare providers and employees located throughout the United States (who shall remain nameless) realized they had a problem. They couldn’t respond in real time to the disaster by mobilizing their workforce quickly. They struggled to identify, contact and communicate with doctors, healthcare providers and employees located at the disaster area to warn them not to go to the hospital or redirect them to alternative sites where they could help.

This healthcare provider’s inability to respond to this disaster in real time was an “Aha” moment. What was holding them back was a major information problem. Because their employee information was scattered across hundreds of systems, they couldn’t pull a single, comprehensive and accurate list of doctors, healthcare providers and employees in the disaster area. They didn’t know which employees needed to be evacuated or which could be sent to assist people in other locations. So, they had to email everyone in the company.

The good news is that we’re in the process of helping them create and maintain a central location called an “employee master” built on our data integration, data quality, and master data management (MDM) software. This will be their “go-to” place for an up-to-date, complete and accurate list of employees and their contact information, such as work email, phone, pager (doctors still use them), home phone and personal email, as well as their location, so they know exactly who is working where and how best to contact them.
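
Conceptually, the payoff is that one trusted list answers the disaster question with a single query. A toy sketch, with invented records rather than the provider’s actual employee master:

```python
# Invented records standing in for the consolidated employee master.
employees = [
    {"name": "A. Rivera", "role": "RN",  "site": "Colorado Springs", "mobile": "555-0101"},
    {"name": "B. Chen",   "role": "MD",  "site": "Denver",           "mobile": "555-0102"},
    {"name": "C. Okafor", "role": "EMT", "site": "Colorado Springs", "mobile": "555-0103"},
]

def contact_list(site: str) -> list[dict]:
    """Who is in the affected area, and how do we reach them?"""
    return [e for e in employees if e["site"] == site]

for e in contact_list("Colorado Springs"):
    print(f"Notify {e['name']} ({e['role']}) at {e['mobile']}")
```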

This healthcare provider will no longer be held back by an information problem. In three months, they’ll be able to respond to disasters in real time by mobilizing their workforce quickly.

An interesting side note: Immediately before our Informatica team of experts arrived to talk to this healthcare provider about how we can help them, there was a power outage in the building.  They struggled to alert the employees who were impacted. So our team personally experienced the pain of this organization’s employee information problem.

When disaster strikes, will you be ready to respond in real time? Or do you have an information problem that could hold you back from mobilizing your own employees?

I want your opinion. Are you interested in more scary stories? Let me know in the comments below. I’m thinking about making this a regular series.