Tag Archives: Data Integration

When It Comes to Data Integration Skills, Big Data and Cloud Projects Need the Most Expertise

Looking for a data integration expert? Join the club. As cloud computing and big data become more desirable within the Global 2000, an abundance of data integration talent is required to make both cloud and big data work properly.

The fact of the matter is that you can’t deploy a cloud-based system without some sort of data integration as part of the solution. Whether it’s on-premises to cloud, cloud-to-cloud, or even intra-company use of private clouds, these projects need someone who knows what they are doing when it comes to data integration.

While many cloud projects were launched without a clear understanding of the role of data integration, most people understand it now. As companies become more familiar with the cloud, they learn that data integration is key to the solution. For this reason, it’s important for teams to have at least some data integration talent.

The same goes for big data projects. Massive amounts of data need to be loaded into massive databases. You can’t do these projects using ad-hoc technologies anymore. The team needs someone with integration knowledge, including what technologies to bring to the project.

Generally speaking, big data systems are built around data integration solutions. Similar to cloud, the use of data integration architectural expertise should be a core part of the project. I see big data projects succeed and fail, and the biggest cause of failure is the lack of data integration expertise.

The demand for data integration talent has exploded with the growth of both big data and cloud computing. A week does not go by that I’m not asked for the names of people who have data integration, cloud computing and big data systems skills. I know several people who fit that bill; however, they all have jobs and recently got raises.

The scary thing is, if these jobs go unfilled by qualified personnel, project directors may hire individuals without the proper skills and experience. Or worse, they may not hire anyone at all. If they plod along without the expertise required, in a year they’ll wonder why the systems are not sharing data the way they should, resulting in a big failure.

So, what can organizations do? You can find or build the talent you need before starting important projects. Thus, now is the time to begin the planning process, including how to find and hire the right resources. This might even mean internal training, hiring mentors or outside consultants, or working with data integration technology providers. Do everything necessary to make sure you get data integration done right the first time.

Posted in Big Data, Business Impact / Benefits, Data Integration, Operational Efficiency

Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data

“Trying to improve the quality of asset data when you don’t have a solid data management infrastructure in place is like trying to save a sinking boat with a bailing bucket,” explained Dean Balog, a senior principal consultant at Noah Consulting, in this webinar, Attention Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data.

Dean has 15 years of experience in information management in the utilities industry. In this interview, Dean and I discuss the top issues facing utility executives and how to improve the quality of mission-critical asset data for asset management / equipment maintenance and regulatory reporting, such as rate case submissions.

Q: Dean, what are the top issues facing utility executives?
A: The first issue is asset management / equipment maintenance. Knowing where to invest precious dollars is critical. Utility executives are engaged in a constant tug of war between two competing priorities: replacing aging infrastructure and regular maintenance.

Q. How are utility executives determining that balance?
A. You need to start with facts – the real costs and reliability information for each asset in your infrastructure. Without it, you are guessing. Basically, it is a data problem. Utility executives should ask themselves these questions:

  • Do we have the ability to capture and combine cost and reliability information from multiple sources?  Is it granular enough to be useful?
  • Do we know the maintenance costs of eight-year-old breakers versus three-year-old breakers?
  • Do our meters start failing around the average lifespan? For this example, let us say that is five years. Rather than falling uniformly into that average, do 30% of our meters fail in the first year and the rest last eight years? Those three extra years of life can certainly help out the bottom line.
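
Using the hypothetical figures above (a five-year average, versus 30% of meters failing in year one and the rest lasting eight years), a quick back-of-the-envelope calculation shows how broadly similar averages can hide very different failure patterns:

```python
# Hypothetical meter populations from the example above: a uniform
# population where every meter lasts about five years, versus a bimodal
# split where 30% fail in year one and 70% last eight years.

def mean_lifespan(fractions_and_years):
    """Expected lifespan given (fraction_of_population, years) pairs."""
    return sum(frac * years for frac, years in fractions_and_years)

uniform = [(1.0, 5)]                # every meter lasts ~5 years
bimodal = [(0.30, 1), (0.70, 8)]    # 30% fail early, 70% last 8 years

print(mean_lifespan(uniform))   # 5.0
print(mean_lifespan(bimodal))   # close to the same mean, very different risk
```

The point of the question survives the arithmetic: two meter populations with comparable average lifespans can demand completely different replacement strategies, and only granular data reveals which one you have.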

Knowing your data makes all the difference. The right capital investment strategy requires combining performance, reliability, and cost data.

Quotations about data challenges faced by utility companies

Q. Why is it difficult for utility executives to understand the real costs and reliability of assets?
A. I know this does not come as a shock, but most companies do not trust their data. Asset data is often inaccurate, inconsistent, and disconnected. Even the most basic data may not be available. For example, manufacture dates on breakers should be filled in, but they are not. If less than 50% of your breakers have manufacture dates, how can you build a preventative maintenance program? You do not even know what to address first!
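
A field-completeness check like the one Dean describes is straightforward to sketch. The breaker records below are invented for illustration; real asset data would come from an EAM or ERP system:

```python
# Minimal data-profiling sketch (hypothetical records): measure how
# complete a critical field like manufacture_date is before building
# a preventative maintenance program on top of it.

def completeness(records, field):
    """Fraction of records with a non-empty value for `field`."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

breakers = [
    {"id": "BRK-001", "manufacture_date": "2006-03-14"},
    {"id": "BRK-002", "manufacture_date": ""},
    {"id": "BRK-003", "manufacture_date": None},
    {"id": "BRK-004", "manufacture_date": "2011-07-02"},
]

score = completeness(breakers, "manufacture_date")
print(f"manufacture_date completeness: {score:.0%}")  # 50%
```

Running this kind of profile across every critical field is how you find out what to address first.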

A traditional approach to solving this data problem is to do a big data cleanup. You clean the data, and then before you know it, errors creep back in, and the trust in the data you have worked so hard to establish is lost.

I like to illustrate the pain of this issue by using the sinking boat analogy. Data cleanup is like bailing out the water collecting in the bottom of the boat. You think you are solving the problem but more water still seeps into the boat. You cannot stop bailing or you will sink. What you need to do is fix the leaks, and then bail out the boat. But, if you do not lift up your head from bailing long enough to see the leaks and make the right investments, you are fighting a losing battle.

Q. What can utility executives do to improve the quality of asset data?
A. First of all, you need to develop a data governance framework. Going back to the analogy, a data governance framework gives you the structure to find the leaks, fix the leaks, and monitor how much of the water has been bailed out. If the water level is still rising, you have not fixed all the leaks. But having a data governance framework is not the be-all and end-all.

You also need to appoint data stewards to be accountable for establishing and maintaining high quality asset data. The job of a data steward would be easy if there was only one system where all asset data resided. But the fact of the matter is that asset data is fragmented – scattered across multiple systems. Data stewards have a huge responsibility and they need to be supported by a solid data management infrastructure to ease the burden of managing business-critical asset information.

If you are responsible for asset management / equipment maintenance or regulatory reporting, particularly rate case submissions, check out this webinar.

Master Data Management (MDM) ensures business-critical asset data is consistent everywhere by pulling together data that is scattered across multiple applications. It manages and masters it in a central location on a continuous basis and shares it with any applications that need that data. MDM provides a user interface and workflow for data stewards to manage the tangled web of names and IDs these assets are known by across systems. It also gives utilities a disciplined approach to manage important relationships between the asset data, such as an asset’s performance reliability and its cost.
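
The cross-referencing described above can be made concrete with a toy golden-record sketch. This illustrates the idea only, not Informatica MDM itself; the system names, IDs, and the simplified survivorship rule are all invented:

```python
# Toy master-data sketch (illustrative only): consolidate the different
# IDs and attributes one physical asset carries across source systems
# into a single master record with a cross-reference table.

source_records = [
    {"system": "EAM", "local_id": "TX-0042",    "name": "Transformer 42"},
    {"system": "GIS", "local_id": "ASSET-9917", "name": "Transformer #42"},
    {"system": "ERP", "local_id": "100-2231",   "name": "TRANSFORMER 42"},
]

def build_master(master_id, records):
    """Create one golden record with a cross-reference of source IDs."""
    return {
        "master_id": master_id,
        # Survivorship rule (deliberately simplified): trust the first source.
        "name": records[0]["name"],
        "xref": {r["system"]: r["local_id"] for r in records},
    }

golden = build_master("AST-000001", source_records)
print(golden["xref"]["GIS"])  # ASSET-9917
```

A real MDM platform adds matching, workflow, and continuous synchronization on top of this basic consolidation, but the "tangled web of names and IDs" it untangles looks much like the `xref` map here.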

Q. Any other pressing issues facing utilities?
A. Yes. Another big issue is tightening regulations that consume investment dollars and become key inputs into rate case submissions and defenses. One of the complicating factors is the number of regulations is not only increasing, but the regulators are also requiring faster implementation times than ever before. So, utilities cannot just do what they have done in the past: throw more people at the problem in the short-term and resolve to fix it later by automating it “when things slow down.” That day never comes.

Q. How can utilities deal with these regulatory pressures?
A. Utilities need a new approach to deal with regulations. Start with the assumption that all data is fair game for regulators. All data must be accessible. You need to be able to report on it, not only to comply with regulations, but for competitive advantage. This requires the high quality asset information we talked about earlier, and an analytical application to:

  • Perform what-if analyses for your asset investment program;
  • Develop regulatory compliance or environmental reports quickly, because the hard work (integrating the data within your MDM program) has already been done; and
  • Get access to granular, observed reliability and cost information using your own utility’s data – not benchmark data that is already a couple of years old and highly summarized.

Q. What is your advice for utility company executives?
A. If you are the one responsible for signing off on regulatory reports and you do not look good in an orange jumpsuit, you need to invest in a plan that includes people, process, and technology to support regulatory reporting and asset management / equipment maintenance.

  • People – Data stewards have clear accountability for the quality of asset data.
  • Process – Data governance is your game plan.
  • Technology – A solid data management infrastructure consisting of data integration, data quality, and master data management is your means.

If you are responsible for asset management / equipment maintenance or regulatory reporting, particularly rate case submissions, check out this webinar, Attention Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data.

Our panel of utility data experts:

  • Reveal the five toughest business challenges facing utility industry executives;
  • Explain how bad asset data could be costing you millions of dollars in operating costs;
  • Share three best practices for optimizing asset management / equipment maintenance and regulatory reporting with accurate, consistent, and connected asset information; and
  • Show you how to implement these best practices with a demonstration.

 

Posted in Data Governance, Data Integration, Data Quality, Enterprise Data Management, Master Data Management, Partners, Utilities & Energy, Vertical

Why Healthcare Data Professionals belong at Informatica World 2014

This year, over one dozen healthcare leaders will share their knowledge on data-driven insights at Informatica World 2014, across six tracks and over 100 breakout sessions during the conference. We are only five weeks away, and I am excited that the healthcare track has grown 220% from 2013!

Join us for these healthcare sessions:

  • Moving From Vision to Reality at UPMC : Structuring a Data Integration and Analytics Program: University of Pittsburgh Medical Center (UPMC) partnered with Informatica IPS to establish enterprise analytics as a core organizational competency through an Integration Competency Center engagement. Join IPS and UPMC to learn more.
  • HIPAA Validation for Eligibility and Claims Status in Real Time: Healthcare reform requires healthcare payers to exchange and process HIPAA messages in less time with greater accuracy. Learn how HealthNet tackled this challenge.
  • Application Retirement for Healthcare ROI : Dallas Children’s Hospital needed to retire outdated operating systems, hardware, and applications while retaining access to their legacy data for compliance purposes. Learn why application retirement is critical to the healthcare industry, how Dallas Children’s selected which applications to retire and the healthcare specific functionality that Informatica is delivering.
  • UPMC’s story of implementing a Multi-Domain MDM healthcare solution in support of Data Governance : This presentation will unfold the UPMC story of implementing a Multi-Domain MDM healthcare solution as part of an overall enterprise analytics / data warehousing effort. MDM is a vital part of the overall architecture needed to support UPMC’s efforts to improve the quality of patient care and help create methods for personalized medicine. The leading MDM solution developer will discuss how the team put together the roadmap, worked with domain-specific workgroups, and created the trust matrix, and he will share lessons learned along with what is planned for UPMC’s consolidated and trusted Patient, Provider, and Facility master data in this changing healthcare industry. The session will also explain how the MDM program fits into the ICC (Integration Competency Center) currently implemented at UPMC.
  • Enterprise Codeset Repositories for Healthcare: Controlling the Chaos: Learn the benefit of a centralized storage point to govern and manage codes (ICD-9/10, CPT, HCPCS, DRG, SNOMED, Revenue, TOS, POS, Service Category, etc.), mappings and artifacts that reference codes.
  • Christus Health Roadmap to Data Driven Healthcare : To organize information and effectively deliver services in a hypercompetitive market, healthcare organizations must deliver data in an accurate, timely, efficient way while ensuring its clarity. Learn how CHRISTUS Health is developing and pursuing its vision for data management, including lessons adopted from other industries and the business case used to fund data management as a strategic initiative.
  • Business Value of Data Quality : This customer panel will address why data quality is a business imperative which significantly affects business success.
  • MD Anderson – Foster Business and IT Collaboration to Reveal Data Insights with Informatica: Is your integration team intimidated by the new Informatica 9.6 tools? Do your analysts and business users require faster access to data and answers about where data comes from? If so, this session is a must-attend.
  • The Many Faces of the Healthcare Customer : In the healthcare industry, the customer paying for services (individuals, insurers, employers, the government) is not necessarily the decision-influencer (physicians) or even the patient — and the provider comes in just as many varieties. Learn how Quest, the world’s leading provider of diagnostic information, leverages master data management to resolve the chaos of serving 130M+ patients, 1200+ payers, and almost half of all US physicians and hospitals.
  • Lessons in Healthcare Enterprise Information Management from St. Joseph Health and Sutter Health : St. Joseph Health created a business case for enterprise information management, then built a future-proofed strategy and architecture to unlock, share, and use data. Sutter Health engaged the business, established a governance structure, and freed data from silos for better organizational performance and efficiency. Come hear these leading health systems share their best practices and lessons learned in making data-driven care a reality.
  • Navinet, Inc and Informatica – Delivering Network Intelligence, The Value to the Payer, Provider and Patient: Today, healthcare payers and providers must share information in unprecedented ways to reduce redundancy, cut costs, coordinate care, and drive positive outcomes. Learn how NaviNet’s vision of a “smart” communications network combines Big Data and network intelligence to share proactive real-time information between insurers and providers.
  • Providence Health Services takes a progressive approach to automating ETL development and documentation: A newly organized team of BI generalists, most with no ETL experience and fewer still with Informatica skills, was tasked with Informatica development when Providence migrated from Microsoft SSIS to Informatica. Learn how the team relied on Informatica to alleviate the burden of low-value tasks.
  • Using IDE for Data On-boarding Framework at HMS : HMS’s core business is to onboard large amounts of external data that arrive in different formats. HMS developed a framework using IDE to standardize the on-boarding process. This tool can be used by non-IT analysts and provides standard profiling reports and reusable mapping “templates,” which have improved the hand-off to IT and significantly reduced misinterpretations and errors.

Additionally, this year’s attendees are invited to:

  1. Over 100 breakout sessions: Customers from other industries, including financial services, insurance, retail, manufacturing, oil and gas will share their data driven stories.
  2. Healthcare networking reception on Wednesday, May 14th: Join your healthcare peers and Informatica’s healthcare team on Wednesday from 6-7:30pm in the Vesper bar of the Cosmopolitan Resort for a private Healthcare networking reception. Come and hear firsthand how others are achieving a competitive advantage by maximizing return on data while enjoying hors d’oeuvres and cocktails.
  3. Data Driven Healthcare Roundtable Breakfast on Wednesday, May 14th. Customer led roundtable discussion.
  4. Personal meetings: Since most of the Informatica team will be in attendance, this is a great opportunity to meet face to face with Informatica’s product, services and solution teams.
  5. Informatica Pavilion and Partner Expo: Interact with the latest offerings Informatica and our partners provide.
  6. An expanded “Hands-on-Lab”: Learn from real-life case studies and talk to experts about your unique environment.

The Healthcare industry is facing extraordinary changes and uncertainty — both from a business and a technology perspective. Join us to learn about key drivers for change and innovative uses of data technology solutions to discover sources for operational and process improvement. There is still time to Register now!

Posted in Big Data, Business/IT Collaboration, Data Integration, Healthcare, Informatica World 2014, Master Data Management

Data Ambition: What Does It Take To Become an Analytics Leader?

Which comes first: innovation or analytics?

Bain & Company released some survey findings a few months back that actually put a value on big data.  Companies with advanced analytic capabilities, the consultancy finds, are twice as likely to be in the top quartile of financial performance within their industries; five times as likely to make decisions much faster than market peers; three times as likely to execute decisions as intended; and twice as likely to use data very frequently when making decisions.

This is all good stuff, and the survey, which covered the input of 400 executives, makes a direct correlation between big data analytics efforts and the business’s bottom line. However, it raises a question: How does an organization become one of these analytics leaders? And there’s a more brain-twisting question as well: would the type of organization supporting an advanced analytics culture be more likely to be ahead of its competitors because its management tends to be more forward-thinking on a lot of fronts, and not just big data?

You just can’t throw a big data or analytics program or solution set on top of the organization (or drop in a data scientist) and expect to be dazzled with sudden clarity and insight. If an organization is dysfunctional, with a lot of silos, fiefdoms, or calcified and uninspired management, all the big data in the world isn’t going to lift its intelligence quotient.

The authors of the Bain & Company study, Travis Pearson and Rasmus Wegener, point out that “big data isn’t just one more technology initiative. In fact, it isn’t a technology initiative at all; it’s a business program that requires technical savvy.”

Succeeding with big data analytics requires a change in the organization’s culture, and the way it approaches problems and opportunities. The enterprise needs to be open to innovation and change. And, as Pearson and Wegener point out, “you need to embed big data deeply into your organization. It’s the only way to ensure that information and insights are shared across business units and functions. This also guarantees the entire company recognizes the synergies and scale benefits that a well-conceived analytics capability can provide.”

Pearson and Wegener also point to the following common characteristics of big data leaders they have studied:

Pick the “right angle of entry”: There are many areas of the business that can benefit from big data analytics, but just a few key areas that will really impact the business. It’s important to focus big data efforts on the right things. Pearson and Wegener say there are four areas where analytics can be relevant: “improving existing products and services, improving internal processes, building new product or service offerings, and transforming business models.”

Communicate big data ambition: Make it clear that big data analytics is a strategy that has the full commitment of management, and it’s a key part of the organization’s strategy. Messages that need to be communicated: “We will embrace big data as a new way of doing business. We will incorporate advanced analytics and insights as key elements of all critical decisions.” And, the co-authors add, “the senior team must also answer the question: To what end? How is big data going to improve our performance as a business? What will the company focus on?”

Sell and evangelize: Selling big data is a long-term process, not just one or two announcements at staff meetings. “Organizations don’t change easily and the value of analytics may not be apparent to everyone, so senior leaders may have to make the case for big data in one venue after another,” the authors caution. Big data leaders, they observe, have learned to take advantage of the tools at their disposal: they “define clear owners and sponsors for analytics initiatives. They provide incentives for analytics-driven behavior, thereby ensuring that data is incorporated into processes for making key decisions. They create targets for operational or financial improvements. They work hard to trace the causal impact of big data on the achievement of these targets.”

Find an organizational “home” for big data analysis: A common trend seen among big data leaders is that they have created an organizational home for their advanced analytics capability, “often a Center of Excellence overseen by a chief analytics officer,” according to Pearson and Wegener. This is where matters such as strategy, collection and ownership of data across business functions come into play. Organizations also need to plan how to generate insights, and prioritize opportunities and the allocation of data analysts’ and scientists’ time.

There is a hope and perception that adopting data analytics will open up new paths to innovation. But it often takes an innovative spirit to open up analytics.

Posted in Big Data, Business Impact / Benefits, Data Integration

The Links Between Health Information Exchanges and Data Integration

According to Health IT Portal, “Having an integrated health IT infrastructure allows a healthcare organization and its providers to streamline the flow of data from one department to the next. Not all health settings, however, find themselves in this situation. Either through business agreements or vendor selection processes, many a healthcare organization has to spend considerable time and resources getting their disparate health IT systems to talk to each other.”

In other words, you can’t leverage Health Information Exchanges (HIEs) without a sound data integration strategy. This is something I’ve ranted about for years. The foundation of any entity-to-entity exchange, health, finance, or other, is that all relevant systems freely communicate, and are thus able to consume and produce the information required by any information exchange.

The article cites the case of Memorial Healthcare, a community health care system in Owosso, MI. Memorial Healthcare has Meditech on the hospital side and Allscripts in its physician offices. Frank Fear, the CIO of Memorial Healthcare, has spent the last few years working on solutions that enable data integration between the two vendors’ offerings, as well as within the same system, spanning both an EHR and a practice management solution.
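
The mapping work at the heart of that kind of integration can be sketched in miniature. The field names below are hypothetical stand-ins, not the actual Meditech or Allscripts schemas: each source system's record shape is normalized onto one common patient schema before anything flows out through an exchange.

```python
# Sketch of the normalization step behind an HIE feed (field names are
# hypothetical, not real vendor schemas): map each source system's
# record layout onto a single common patient schema.

def from_hospital(rec):
    """Normalize a hospital-system record to the common schema."""
    return {"patient_id": rec["MRN"], "name": rec["PT_NAME"], "dob": rec["DOB"]}

def from_clinic(rec):
    """Normalize a physician-office record to the common schema."""
    return {"patient_id": rec["chart_no"], "name": rec["full_name"], "dob": rec["birth_date"]}

hospital_rec = {"MRN": "H12345", "PT_NAME": "DOE, JANE", "DOB": "1975-01-31"}
clinic_rec = {"chart_no": "C98765", "full_name": "DOE, JANE", "birth_date": "1975-01-31"}

common = [from_hospital(hospital_rec), from_clinic(clinic_rec)]
print(common[0]["patient_id"])  # H12345
```

Every additional source system only needs one new adapter onto the common schema, which is exactly why ad hoc point-to-point integration becomes unmanageable as the number of systems grows.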

Those in the world of healthcare are moving headlong into these exchanges. Most have no clue as to what must change within internal IT to get ahead of the need for the free flow of information. Moreover, there needs to be a good data governance strategy in place, as well as security, and a focus on compliance issues as well.

The reality is that, for the most part, data integration in the world of healthcare is largely ad hoc and tactical in nature. This has led to no standardized method for systems to talk to one another, and certainly no standard ways for data to flow out through exchanges. Think of plumbing that was built haphazardly over the years, with whatever was quick and easy. Now, you’ve finally turned on the water and there are many, many leaks.

In terms of data integration, healthcare has been underfunded for far too long. Now clear regulatory changes require better information management and security approaches. Unfortunately, healthcare IT is way behind, in terms of leveraging proper data integration approaches, as well as leveraging the right data integration technology.

As things change in the world of healthcare, including the move to HIEs, I suspect that data integration will finally get a hard look from those who manage IT in healthcare organizations. However, they need to do this with some sound planning, which should include an understanding of what the future holds in terms of information management, and how to create a common infrastructure that supports most of the existing and future use cases. Healthcare, you’re about 10 years behind, so let’s get moving this year.

Posted in Data Integration, Healthcare

Architects: 8 Great Reasons To Be At Informatica World 2014

For years, corporations have tried to solve business problems by “throwing new enterprise applications at them.” This strategy has created ‘silos of data’ that are attached to these applications and databases. As a result, it is now increasingly difficult to find, access and use the correct data for new projects.

Data fragmentation leads to slow project delivery and “shadow IT.” Without a change in enterprise data strategy, it is unlikely this problem will improve. On the contrary, the growth of cloud applications and platforms, mobile applications, NoSQL, and the “Internet of Things” creates increasing urgency. Unless a new approach to enterprise data management is taken, the house of cards is going to crumble.

I seriously doubt that this is a surprise to most of you. The question is, “What should we do about it?” The Informatica World 2014 event is a perfect place to find answers. Here are eight benefits architects will enjoy at Informatica World 2014:

  1. A dedicated track of breakout sessions for architects. This track explores reference architectures, design patterns, best practices and real-world examples for building and sustaining your next-generation information architecture. The track will begin with a keynote on the broader issues of enterprise data architecture. It will also include a panel of architects from leading companies.
  2. Inspiration to participate in defining the enterprise data architecture of the future. (I can’t spoil it by divulging the details here, but I promise that it will be interesting and worth your while!)
  3. New insights on how to manage information for the data-centric enterprise. These will expand your thinking about data architecture and platforms.
  4. Chances to network with architect peers.
  5. A Hands-on Lab, where you can talk to Informatica experts about their products and solutions.
  6. Updates on what Informatica is doing to bring business subject matter experts in as full partners in the co-development of data-related projects.
  7. A chance to win a one-hour one-on-one session with an Informatica architect at Informatica World.
  8. A chance to learn to control your biggest enterprise system: The collection of all data-moving resources across your company.

We believe that architecture is the key to unleashing information potential. To compete in today’s global 24x7x365 economy, business requires well-designed information architectures that can continuously evolve to support the new heights of flexibility, efficiency, responsiveness, and trust. I hope you will join us in Las Vegas, May 12-15!

Posted in Data Integration, Data Integration Platform, Enterprise Data Management, Informatica World 2014

Why the Government needs Data Integration

Loraine Lawson does an outstanding job of covering the issues around government use of “data heavy” projects.  This includes a report by the government IT site, MeriTalk.

“The report identifies five factors, which it calls the Big Five of IT, that will significantly affect the flow of data into and out of organizations: Big data, data center consolidation, mobility, security and cloud computing.”

MeriTalk surveyed 201 state and local government IT professionals, and found that, while the majority of organizations plan to deploy the Big Five, 94 percent of IT pros say their agency is not fully prepared.  “In fact, if Big Data, mobile, cloud, security and data center consolidation all took place today, 89 percent say they’d need additional network capacity to maintain service levels. Sixty-three percent said they’d face network bottleneck risks, according to the report.”

This report states what most who work with the government already know: the government is not ready for the influx of data.  Nor is it ready for the different uses of data, and thus there is a large amount of risk as the amount of data under management within the government explodes.

Add issues with the approaches and technologies leveraged for data integration to the list.  As cloud computing and mobile computing continue to rise in popularity, there is not a clear strategy and technology for syncing data in the cloud, or on mobile devices, with data that exists within government agencies.  Consolidation won’t be possible without a sound data integration strategy, nor will the proper use of big data technology.

The government sees a huge wave of data heading for it, as well as opportunities with new technology such as big data, cloud, and mobile.  However, there doesn’t seem to be an overall plan to surf this wave.  According to the report, if they do wade into the big data wave, they are likely to face much larger risks.

The answer to this problem is really rather simple.  As the government moves to take advantage of the rising tide of data, as well as new technologies, they need to be funded to get the infrastructure and the technology they need to be successful.  The use of data integration approaches and technologies, for example, will return the investment ten-fold, if properly introduced into the government problem domains.  This includes integration with big data systems, mobile devices, and, of course, the rising use of cloud-based platforms.

While data integration is not a magic bullet for the government, nor any other organization, the proper and planned use of this technology goes a long way toward reducing the inherent risks that the report identified.  Lacking that plan, I don’t think the government will get very far, very fast.

Posted in Data Integration, Public Sector

The Surprising Link Between Hurricanes and Strawberry Pop-Tarts: Brought to you by Clean, Consistent and Connected Data

What do you think Wal-Mart’s best-seller is right before a hurricane? If you guessed water like I did, you’d be wrong. According to this New York Times article, “What Wal-Mart Knows About Customers’ Habits,” the retailer sells 7X more strawberry Pop-Tarts in Florida right before a hurricane than at any other time. Armed with predictive analytics and a solid information management foundation, the team stocks up on strawberry Pop-Tarts to make sure they have enough supply to meet demand.

Andrew Donaher advises IT leaders to ask business leaders how much bad data is costing them.

I learned this fun fact from Andrew Donaher, Director of Information Management Strategy at Groundswell Group, a consulting firm based in western Canada that specializes in information management services. In this interview, Andy and I discuss how IT leaders can increase the value of data to drive business value, explain how some IT leaders are collaborating with business leaders to improve predictive analytics, and share advice about how to talk to business leaders, such as the CFO, about investing in an information management strategy.

Q. Andy, what can IT leaders do to increase the value of data to drive business value?

A. Simply put, each business leader in a company needs to focus on achieving their goals. The first step IT leaders should take is to engage with each business leader to understand their long and short-term goals and ask some key questions, such as:

  • What type of information is critical to achieving their goals?
  • Do they have the information they need to make the next decision or take the next best action?
  • Is all the data they need in house? If not, where is it?
  • What challenges are they facing when it comes to their data?
  • How much time are people spending trying to pull together the information they need?
  • How much time are people spending fixing bad data?
  • How much is this costing them?
  • What opportunities exist if they had all the information they need and could trust it?
You need a solid information management strategy to make the shift from looking into the rear-view mirror to realizing the potential business value of predictive analytics.

If you want to get the business value you’re expecting by shifting from rear-view mirror style reporting to predictive analytics, you need to use clean, consistent and connected data.

Q. How are IT leaders collaborating with business partners to improve predictive analytics?

A. Wal-Mart’s IT team collaborated with the business to improve the forecasting and demand planning process. Once they found out what was important, IT figured out how to gather, store and seamlessly integrate external data like historical weather and future weather forecasts into the process. This enabled the business to get more valuable insights, tailor product selections at particular stores, and generate more revenue.
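Andy’s Wal-Mart example boils down to a date-keyed join of internal sales history with external weather data before it feeds the forecasting process. A minimal Python sketch of that enrichment step (the store number, field names, and values are all illustrative assumptions, not Wal-Mart’s actual schema):

```python
# Hypothetical sketch: enrich daily sales history with an external
# weather feed, keyed by date, before it goes into demand planning.

daily_sales = [
    {"date": "2004-08-11", "store": "FL-0142", "units_poptarts": 310},
    {"date": "2004-08-12", "store": "FL-0142", "units_poptarts": 2170},
]

# External weather data, already landed and keyed by date.
weather_by_date = {
    "2004-08-11": {"event": "none"},
    "2004-08-12": {"event": "hurricane_warning"},
}

def enrich(sales, weather):
    """Attach the weather event for each record's date; 'unknown' if missing."""
    enriched = []
    for row in sales:
        merged = dict(row)
        merged["weather_event"] = weather.get(row["date"], {}).get("event", "unknown")
        enriched.append(merged)
    return enriched

for row in enrich(daily_sales, weather_by_date):
    print(row["date"], row["weather_event"], row["units_poptarts"])
```

In practice this join would run inside a data integration tool against far larger volumes, but the principle is the same: external data only adds insight once it is connected to internal data on a shared key.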

Q. Why is it difficult for IT leaders to convince business leaders to invest in an information management strategy?

A. In most cases, business leaders don’t see the value in an information management strategy or they haven’t seen value before. Unfortunately this often happens because IT isn’t able to connect the dots between the information management strategy and the outcomes that matter to the business.

Business leaders see value in having control over their business-critical information, being able to access it quickly and to allocate their resources to get any additional information they need. Relinquishing control takes a lot of trust. When IT leaders want to get buy-in from business leaders to invest in an information management strategy they need to be clear about how it will impact business priorities. Data integration, data quality and master data management (MDM) should be built into the budget for predictive or advanced analytics initiatives to ensure the data the business is relying on is clean, consistent and connected.

Q. You liked this quotation from an IT leader at a beer manufacturing company: “We don’t just make beer. We make beer and data. We need to manage our product supply chain and information supply chain equally efficiently.”

A. What I like about that quote is that the IT leader was able to connect the dots between the primary revenue generator for the company and the role data plays in improving organizational performance. That’s something that a lot of IT leaders struggle with. IT leaders should always be thinking about what’s the next thing they can do to increase business value with the data they have in house and other data that the company may not yet be tapping into.

Q. According to a recent survey by Gartner and the Financial Executives Research Foundation, 60% of Chief Financial Officers (CFOs) are investing in analytics and improved decision-making as their #1 IT priority. What’s your advice for IT Leaders who need to get buy-in from the CFO to invest in information management?

A. Read your company’s financial statements, especially the Management Discussion and Analysis section. You’ll learn about the company’s direction, what the stakeholders are looking for, and what the CFO needs to deliver. Offer to get your CFO the information s/he needs to make decisions and to deliver. When you talk to a CFO about investing in information management, focus on the two things that matter most:

  1. Risk mitigation: CFOs know that bad decisions based on bad information can negatively impact revenue, expenses and market value. If you have to caveat all your decisions because you can’t trust the information, or it isn’t current, then you have problems. CFOs need to trust their information. They need to feel confident they can use it to make important financial decisions and deliver accurate reports for compliance.
  2. Opportunity: Once you have mitigated the risk and can trust the data, you can take advantage of predictive analytics. Wal-Mart doesn’t just do forecasting and demand planning. They do “demand shaping.” They use accurate, consistent and connected data to plan events and promotions not just to drive inventory turns, but to optimize inventory and the supply chain process. Some companies in the energy market are using accurate, consistent and connected data for predictive asset maintenance. By preventing unplanned maintenance they are saving millions of dollars, protecting revenue streams, and gaining health and safety benefits.

To do either of these things you need a solid information management plan to manage clean, consistent and connected information.  It takes a commitment, but the payoffs can be very significant.

Q. What are the top three business requirements when building an information management and integration strategy?

A. In my experience, IT leaders should focus on:

  1. Business value: A solid information management and integration strategy that has a chance of getting funded must be focused on delivering business value. Otherwise, your strategy will lack clarity and won’t drive priorities. If you focus on business value, it will be much easier to gain organizational buy-in. Get that dollar figure before you start anything. Whether it is risk mitigation, time savings, revenue generation or cost savings, you need to calculate that value to the business and get their buy-in.
  2. Trust: When people know they can trust the information they are getting, it liberates them to explore new ideas without worrying about issues in the data itself.
  3. Flexibility: Flexibility should be baked right into the strategy. Business drivers will evolve and change. You must be able to adapt to change. One of the most neglected, and I would argue most important, parts of a solid strategy is the ability to make continuous small improvements that may require more effort than a typical maintenance event, but don’t create long delays. This will be very much appreciated by the business. We work with our clients to ensure that this is addressed.
Posted in Business/IT Collaboration, Data Integration, Data Quality, Enterprise Data Management, Master Data Management, Partners, Retail

Are you Ready for the Massive Wave of Data?

Leo Eweani makes the case that the data tsunami is coming.  “Businesses are scrambling to respond and spending accordingly. Demand for data analysts is up by 92%; 25% of IT budgets are spent on the data integration projects required to access the value locked up in this data “ore” – it certainly seems that enterprise is doing The Right Thing – but is it?”

Data is exploding within most enterprises.  However, most enterprises have no clue how to manage this data effectively.  While you would think that an investment in data integration would be an area of focus, many enterprises don’t have a great track record in making data integration work.  “Scratch the surface, and it emerges that 83% of IT staff expect there to be no ROI at all on data integration projects and that they are notorious for being late, over-budget and incredibly risky.”


The core message from me is that enterprises need to ‘up their game’ when it comes to data integration.  This recommendation is based upon the amount of data growth we’ve already experienced, and will experience in the near future.  Indeed, a “data tsunami” is on the horizon, and most enterprises are ill-prepared for it.

So, how do you get prepared?   While many would say it’s all about buying anything and everything when it comes to big data technology, the best approach is to splurge on planning.  This means defining exactly what data assets are in place now, and will be in place in the future, and how they should or will be leveraged.

To face the forthcoming wave of data, certain planning aspects and questions about data integration rise to the top:

  • Performance, including data latency.  Or, how quickly does the data need to flow from point or points A to point or points B?  As the volume of data quickly rises, the data integration engines have got to keep up.
  • Data security and governance.  Or, how will the data be protected both at-rest and in-flight, and how will the data be managed in terms of controls on use and change?
  • Abstraction, and removing data complexity.  Or, how will the enterprise remap and re-purpose key enterprise data that may not currently exist in a well-defined and functional structure?
  • Integration with cloud-based data.  Or, how will the enterprise link existing enterprise data assets with those that exist on remote cloud platforms?
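The abstraction question often comes down to remapping inconsistently named legacy fields into one canonical schema, so downstream consumers never see the mess. A minimal sketch of that remapping step (the mapping table and field names are hypothetical assumptions for illustration):

```python
# Hypothetical sketch of an abstraction layer: rename known legacy field
# names to a canonical schema; unknown fields pass through unchanged.

CANONICAL_MAP = {
    "cust_nm": "customer_name",
    "CUSTOMER": "customer_name",
    "addr1": "street_address",
    "zip_cd": "postal_code",
}

def to_canonical(record):
    """Return a copy of the record with legacy field names remapped."""
    return {CANONICAL_MAP.get(key, key): value for key, value in record.items()}

legacy_record = {"cust_nm": "Acme Corp", "zip_cd": "22102", "region": "East"}
print(to_canonical(legacy_record))
```

Real data integration platforms do this with declarative mappings rather than hand-written dictionaries, but the planning question is identical: decide on the canonical structure first, then map every source into it.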

While this may seem like a complex and risky process, think through the problems, leverage the right technology, and you can remove the risk and complexity.  The enterprises that seem to fail at data integration do not follow that advice.

I expect the explosion of data to be the biggest challenge enterprise IT will face in many years.  While a few will take advantage of their data, most will struggle, at least initially.  Which route will you take?

Posted in Big Data, Cloud Computing, Data Governance, Data Integration

Business Agility Depends on Data Integration Agility and Your Skills

Over the last 40 years, data has become increasingly distributed.  It used to all sit on storage connected to a mainframe.  It used to be that the application of computing power to solve business problems was limited by the availability of CPU, memory, network and disk.  Those limitations are no longer big inhibitors.  Data fragmentation is now the new inhibitor to business agility.   Data is now generated from distributed data sources not just within a corporation, but from business partners, from device sensors and from consumers Facebook-ing and tweeting away on the internet.

So to solve any interesting business problem in today’s fragmented data world, you now have to pull data together from a wide variety of data sources.  That means business agility 100% depends on data integration agility.  But how do you deliver that agility in a way that is not just fast, but reliable, and delivers high quality data?

First, to achieve data integration agility, you need to move from a traditional waterfall development process to an agile development process.

Second, if you need reliability, you have to think about how you start treating your data integration process as a critical business process.  That means thinking about how you will make your integration processes highly available.  It also means you need to monitor and validate your operational data integration processes on an ongoing basis.   The good news is that the capabilities you need for data validation as well as operational monitoring and alerting for your data integration process are now built into Informatica’s newest PowerCenter Edition, PowerCenter Premium Edition.

Lastly, the days where you can just move data from A to B without including a data quality process are over.  Great data doesn’t happen by accident; it happens by design.  And that means you also have to build data quality directly into your data integration process.
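Building quality into the integration process can be as simple as validating every record in-flight and routing rejects to an error queue instead of silently loading bad data. A minimal sketch of the idea (the rules and field names are hypothetical assumptions, not any particular product’s API):

```python
# Hypothetical sketch: a data integration step with quality rules built in.
# Clean records are loaded; failing records are rejected with their errors.

def quality_rules(record):
    """Return a list of rule violations for one record (empty = clean)."""
    errors = []
    if not record.get("customer_id"):
        errors.append("missing customer_id")
    if record.get("amount", 0) < 0:
        errors.append("negative amount")
    return errors

def integrate(records):
    """Split incoming records into a load set and a reject set."""
    loaded, rejected = [], []
    for rec in records:
        errs = quality_rules(rec)
        if errs:
            rejected.append((rec, errs))   # route to an error queue for review
        else:
            loaded.append(rec)             # safe to load downstream
    return loaded, rejected

loaded, rejected = integrate([
    {"customer_id": "C1", "amount": 100},
    {"customer_id": "", "amount": -5},
])
```

The design point is that rejection happens inside the integration flow itself, by design, rather than being discovered downstream after the bad data has already spread.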

Great businesses depend on great data.  And great data means data that is delivered on time, with confidence and with high quality.  So think about how your understanding of data integration and great data can make your career.  Great businesses depend on great data and people like you who have the skills to make a difference.  As a data professional, the time has never been better for you to make a contribution to the greatness of your organization.  You have the opportunity to make a difference and have an impact because your skills and your understanding of data integration have never been more critical.

 

Posted in Data Integration, Data Quality