Category Archives: Data Integration

Becoming a Revenue-Driven Business Model through Data is Painful for Government Agencies

Recently, I presented a Business Value Assessment to a client, a revenue-generating state government agency. Everyone at the presentation was stunned to learn how much money was being left on the table because the agency's activities were not based on transactions that could be cleanly tied to the participating citizenry and a variety of channel partners. Over $38 million in annual benefits went unrealized, spanning partially recovered lost revenue, cost avoidance and cost reduction. Giving data a bigger role in this revenue-driven business model could have prevented much of that loss.

Should government leaders go to the "School of Data" to understand where more revenue can be created without necessary tax hikes? (Source: creativecommons.org)

Given the agency's total revenue volume, this figure may seem small. However, considering how little technology effort is required to "collect and connect" data from existing transactions, the return is actually extremely high.

The real challenge for this organization will be the policy transformation required to turn it from "data-starved" into "data-intensive". Today, strategic decisions around new products, locations and customers rely on surveys, which suffer from sampling errors, biases and similar limitations. Surveys are also often delayed, making them practically ineffective in the real-time world we live in today.

Despite the absence of applicable legal restrictions, the leadership's main concern was that gathering more data would erode the public's trust and the organization's positive image.

To be clear: by "more" data being collected by this type of government agency, I mean literally 10% of what any commercial retail entity has gathered on all of us for decades. This is not the next NSA revelation, whatever conspiracy theorists may fear.

While I respect their culturally driven self-censorship in the absence of legal barricades, it raises concern among their stakeholders (the state's citizenry) over the agency's performance. To be clear, there would be no additional revenue for the state's programs without more citizen data. You may believe that the state already knows everything about you, including your income, property value and tax information. However, inter-departmental sharing of information that is not relevant to criminal matters is legally constrained.

Another interesting finding from this evaluation was that the agency had no sense of the conversion rate of its email and social media campaigns. Impressions, click-throughs and hard/soft bounces were treated as more important than tracking who actually generated revenue.
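
To make this concrete, here is a minimal sketch of the difference between counting clicks and tying campaign responses back to revenue; all names and figures below are invented for illustration:

```python
# Hypothetical illustration: tie campaign click-throughs back to actual
# revenue instead of stopping at impressions and bounce rates.
clicks = [
    {"campaign": "spring_email", "citizen_id": 101},
    {"campaign": "spring_email", "citizen_id": 102},
    {"campaign": "spring_email", "citizen_id": 103},
]
transactions = [
    {"citizen_id": 101, "amount": 40.00},  # purchase after clicking the email
    {"citizen_id": 999, "amount": 25.00},  # purchase unrelated to the campaign
]

clicked = {c["citizen_id"] for c in clicks}
attributed = [t for t in transactions if t["citizen_id"] in clicked]

conversion_rate = len({t["citizen_id"] for t in attributed}) / len(clicked)
attributed_revenue = sum(t["amount"] for t in attributed)
print(f"conversion: {conversion_rate:.0%}, revenue attributed: ${attributed_revenue:,.2f}")
# -> conversion: 33%, revenue attributed: $40.00
```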

This is a very market-driven organization compared to other agencies. It actually does try to measure itself like a commercial enterprise and attempts to change in order to generate additional revenue for state programs benefiting the citizenry. I can only imagine what non-revenue-generating agencies (local, state or federal) do in this respect.  Is revenue-oriented thinking something the DoD, DoJ or Social Security should subscribe to?

Think tanks and political pundits are now looking at the trade-off between bringing democracy to every backyard on our globe and its long-term budget ramifications. The DoD is looking to reduce its active component to the lowest level in decades, given the U.S. federal debt level.

Putting the data bits and pieces together for revenue

A recent article in HBR explains that cost cutting has never sustained an organization's growth over a longer period of time, but new revenue sources have. Is your company or government agency looking only at cost and personnel productivity?

Disclaimer:

Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.

 

Posted in Business Impact / Benefits, Data Integration, Data Quality, Operational Efficiency, Real-Time

When It Comes to Data Integration Skills, Big Data and Cloud Projects Need the Most Expertise

Looking for a data integration expert? Join the club. As cloud computing and big data become more desirable within the Global 2000, an abundance of data integration talent is required to make both cloud and big data work properly.

The fact of the matter is that you can't deploy a cloud-based system without some sort of data integration as part of the solution. Whether it's on-premise to cloud, cloud to cloud, or even intra-company use of private clouds, these projects need someone who knows what they are doing when it comes to data integration.


While many cloud projects were launched without a clear understanding of the role of data integration, most people understand it now. As companies become more familiar with the cloud, they learn that data integration is key to the solution. For this reason, it's important for teams to have at least some data integration talent.

The same goes for big data projects. Massive amounts of data need to be loaded into massive databases. You can’t do these projects using ad-hoc technologies anymore. The team needs someone with integration knowledge, including what technologies to bring to the project.

Generally speaking, big data systems are built around data integration solutions. Similar to cloud, the use of data integration architectural expertise should be a core part of the project. I see big data projects succeed and fail, and the biggest cause of failure is the lack of data integration expertise.

The demand for data integration talent has exploded with the growth of both big data and cloud computing. A week does not go by that I'm not asked for the names of people who have data integration, cloud computing and big data skills. I know several people who fit that bill; however, they all have jobs and recently got raises.

The scary thing is, if these jobs go unfilled by qualified personnel, project directors may hire individuals without the proper skills and experience. Or worse, they may not hire anyone at all. If they plod along without the expertise required, in a year they’ll wonder why the systems are not sharing data the way they should, resulting in a big failure.

So, what can organizations do? You can find or build the talent you need before starting important projects. Thus, now is the time to begin the planning process, including how to find and hire the right resources. This might even mean internal training, hiring mentors or outside consultants, or working with data integration technology providers. Do everything necessary to make sure you get data integration done right the first time.

Posted in Big Data, Business Impact / Benefits, Data Integration, Operational Efficiency

Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data

"Trying to improve the quality of asset data when you don't have a solid data management infrastructure in place is like trying to save a sinking boat with a bailing bucket," explained Dean Balog, a senior principal consultant at Noah Consulting, in the webinar, Attention Utility Executives: Don't Waste Millions in Operating Costs Due to Bad Asset Data.


Dean has 15 years of experience in information management in the utilities industry. In this interview, Dean and I discuss the top issues facing utility executives and how to improve the quality of mission-critical asset data for asset management / equipment maintenance and regulatory reporting, such as rate case submissions.

Q: Dean, what are the top issues facing utility executives?
A: The first issue is asset management / equipment maintenance. Knowing where to invest precious dollars is critical. Utility executives are engaged in a constant tug of war between two competing priorities: replacing aging infrastructure and regular maintenance.

Q. How are utility executives determining that balance?
A. You need to start with facts – the real costs and reliability information for each asset in your infrastructure. Without it, you are guessing. Basically, it is a data problem. Utility executives should ask themselves these questions:

  • Do we have the ability to capture and combine cost and reliability information from multiple sources?  Is it granular enough to be useful?
  • Do we know the maintenance costs of eight-year-old breakers versus three-year-old breakers?
  • Do our meters start failing around the average lifespan? For this example, let us say that is five years. Rather than failing uniformly around that average, do 30% of our meters fail in the first year while the rest last eight years? Those three extra years of life can certainly help the bottom line (see the sketch just after this list).
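
Here is the back-of-the-envelope math behind that last question, with invented fleet numbers. Two failure distributions with similar-looking averages produce very different replacement economics:

```python
# Back-of-the-envelope: two meter fleets with similar-looking averages but
# very different economics. All numbers are invented for illustration.
fleet_size = 100_000
unit_cost = 120  # hypothetical cost to replace one meter

# Scenario A: lifespans cluster tightly around the 5-year average.
avg_life_uniform = 5.0

# Scenario B: 30% fail after 1 year, the surviving 70% last 8 years.
avg_life_bimodal = 0.30 * 1 + 0.70 * 8  # = 5.9 years

for label, life in [("uniform", avg_life_uniform), ("bimodal", avg_life_bimodal)]:
    replacements_per_year = fleet_size / life
    print(f"{label}: ~{replacements_per_year:,.0f} replacements/yr "
          f"(~${replacements_per_year * unit_cost:,.0f}/yr)")

# Only granular, per-asset data reveals that Scenario B also concentrates
# 30,000 failures in year one - an early-failure problem the average hides.
```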

Knowing your data makes all the difference. The right capital investment strategy requires combining performance, reliability, and cost data.

Quotations about data challenges faced by utility companies

Q. Why is it difficult for utility executives to understand the real costs and reliability of assets?
A. I know this does not come as a shock, but most companies do not trust their data. Asset data is often inaccurate, inconsistent, and disconnected. Even the most basic data may not be available. For example, manufacture dates on breakers should be filled in, but they are not. If less than 50% of your breakers have manufacture dates, how can you build a preventative maintenance program? You do not even know what to address first!
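
A first check of the kind Dean describes can be as simple as profiling field completeness. A minimal sketch, with hypothetical records:

```python
# Hypothetical profiling step: how complete is the manufacture-date field?
breakers = [
    {"asset_id": "BRK-001", "manufacture_date": "2006-03-14"},
    {"asset_id": "BRK-002", "manufacture_date": None},
    {"asset_id": "BRK-003", "manufacture_date": ""},
    {"asset_id": "BRK-004", "manufacture_date": "2011-07-02"},
]

populated = sum(1 for b in breakers if b["manufacture_date"])
completeness = populated / len(breakers)
print(f"manufacture_date completeness: {completeness:.0%}")  # -> 50%

if completeness < 0.5:
    print("Too many gaps to age-rank breakers for preventative maintenance.")
```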

A traditional approach to solving this data problem is to do a big data cleanup. You clean the data, and then before you know it, errors creep back in, and the trust in the data you have worked so hard to establish is lost.

I like to illustrate the pain of this issue by using the sinking boat analogy. Data cleanup is like bailing out the water collecting in the bottom of the boat. You think you are solving the problem but more water still seeps into the boat. You cannot stop bailing or you will sink. What you need to do is fix the leaks, and then bail out the boat. But, if you do not lift up your head from bailing long enough to see the leaks and make the right investments, you are fighting a losing battle.

Q. What can utility executives do to improve the quality of asset data?
A. First of all, you need to develop a data governance framework. Going back to the analogy, a data governance framework gives you the structure to find the leaks, fix the leaks, and monitor how much of the water has been bailed out. If the water level is still rising, you have not fixed all the leaks. But having a data governance framework is not the be-all and end-all.

You also need to appoint data stewards to be accountable for establishing and maintaining high quality asset data. The job of a data steward would be easy if there was only one system where all asset data resided. But the fact of the matter is that asset data is fragmented – scattered across multiple systems. Data stewards have a huge responsibility and they need to be supported by a solid data management infrastructure to ease the burden of managing business-critical asset information.


If you are responsible for asset management / equipment maintenance or regulatory reporting, particularly rate case submissions, check out this webinar.

Master Data Management (MDM) ensures business-critical asset data is consistent everywhere by pulling together data that is scattered across multiple applications. It manages and masters it in a central location on a continuous basis and shares it with any applications that need that data. MDM provides a user interface and workflow for data stewards to manage the tangled web of names and IDs these assets are known by across systems. It also gives utilities a disciplined approach to manage important relationships between the asset data, such as an asset’s performance reliability and its cost.
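
Conceptually, the cross-reference a data steward maintains looks something like the sketch below. The identifiers and system names are invented, and a real MDM hub adds matching rules, survivorship and workflow on top:

```python
# One master asset record keyed to the different local IDs each system uses.
# Identifiers and system names are invented for illustration.
master_assets = {
    "MA-0001": {
        "description": "Substation breaker, Bay 4",
        "source_ids": {  # the "tangled web" of names and IDs across systems
            "EAM": "EQ-77821",
            "GIS": "BRKR_0042",
            "SCADA": "S4-B4-CB1",
        },
    },
}

def find_master(system, local_id):
    """Resolve a source system's local ID to the golden master record."""
    for master_id, record in master_assets.items():
        if record["source_ids"].get(system) == local_id:
            return master_id
    return None

print(find_master("GIS", "BRKR_0042"))  # -> MA-0001
```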

Q. Any other pressing issues facing utilities?
A. Yes. Another big issue is tightening regulations that consume investment dollars and become key inputs into rate case submissions and defenses. One of the complicating factors is the number of regulations is not only increasing, but the regulators are also requiring faster implementation times than ever before. So, utilities cannot just do what they have done in the past: throw more people at the problem in the short-term and resolve to fix it later by automating it “when things slow down.” That day never comes.

Q. How can utilities deal with these regulatory pressures?
A. Utilities need a new approach to deal with regulations. Start with the assumption that all data is fair game for regulators. All data must be accessible. You need to be able to report on it, not only to comply with regulations, but for competitive advantage. This requires the high quality asset information we talked about earlier, and an analytical application to:

  • Perform what-if analyses for your asset investment program;
  • Develop regulatory compliance or environmental reports quickly, because the hard work (integrating the data within your MDM program) has already been done; and
  • Get access to granular, observed reliability and cost information using your own utility’s data – not benchmark data that is already a couple of years old and highly summarized.

Q. What is your advice for utility company executives?
A. If you are the one responsible for signing off on regulatory reports and you do not look good in an orange jumpsuit, you need to invest in a plan that includes people, process, and technology to support regulatory reporting and asset management / equipment maintenance.

  • People – Data stewards have clear accountability for the quality of asset data.
  • Process – Data governance is your game plan.
  • Technology – A solid data management infrastructure consisting of data integration, data quality, and master data management is your means.

If you are responsible for asset management / equipment maintenance or regulatory reporting, particularly rate case submissions, check out this webinar, Attention Utility Executives: Don't Waste Millions in Operating Costs Due to Bad Asset Data.

 Our panel of utility data experts:

  • Reveal the five toughest business challenges facing utility industry executives;
  • Explain how bad asset data could be costing you millions of dollars in operating costs;
  • Share three best practices for optimizing asset management / equipment maintenance and regulatory reporting with accurate, consistent, and connected asset information; and
  • Show you how to implement these best practices with a demonstration.

 

Posted in Data Governance, Data Integration, Data Quality, Enterprise Data Management, Master Data Management, Partners, Utilities & Energy, Vertical

Data: The Unsung Hero (or Villain) of every Communications Service Provider

The faceless hero of CSPs: Data

Analyzing current business trends helps illustrate how difficult and complex the Communications Service Provider (CSP) business environment has become. CSPs face many challenges: clients expect high-quality, affordable content that can move between devices with minimal advertising or privacy concerns. To illustrate this phenomenon, here are a few recent examples:

  • Apple is working with Comcast/NBC Universal on a new converged offering
  • Vodafone purchased the Spanish cable operator, Ono, having to quickly separate the wireless customers from the cable ones and cross-sell existing products
  • Net neutrality has been scuttled in the US and upheld in the EU, so a US CSP can now give preferential bandwidth to content providers, generating higher margins
  • Microsoft's Xbox community collects terabytes of data every day, making effective use, storage and disposal under local data retention regulations a challenge
  • Expensive 4G LTE infrastructure investment by operators such as Reliance is bringing streaming content to tens of millions of new consumers

To quickly capitalize on "new" (often old, but unknown) data sources, there has to be a common understanding of the following (a quick sketch of such a shared catalog entry follows the list):

  • Where the data is
  • What state it is in
  • What it means
  • What volume and attributes are required to accommodate a one-off project vs. a recurring one
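
A minimal sketch of what one such shared catalog entry might capture, with invented names and values:

```python
# Hypothetical catalog entry answering the four questions above
# for one "new" (actually old, previously unknown) data source.
catalog_entry = {
    "dataset": "prepaid_topups_2009",              # where the data is
    "location": "billing-archive/topups",          # (names invented)
    "state": {"last_loaded": "2014-01-31", "completeness": 0.82},  # what state it is in
    "semantics": "one row per prepaid top-up, amounts in EUR",     # what it means
    "usage": {                                     # one-off vs. recurring needs
        "one_off":   {"volume": "full history", "attributes": ["msisdn", "amount"]},
        "recurring": {"volume": "daily delta",  "attributes": ["msisdn", "amount", "channel"]},
    },
}
print(catalog_entry["semantics"])
```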

When a multitude of departments request data for analytical projects with their one-off, IT-unsanctioned on-premise or cloud applications, how will you go about it? The average European operator has between 400 and 1,500 (known) applications. Imagine what the unknown count is.

A European operator with 20-30 million subscribers incurs an average of $3 million per month in losses from unpaid invoices, often the result of incorrect or incomplete contact information. Imagine how much you would have to add for lost productivity, including gathering, re-formatting, enriching, checking and re-sending invoices. And this does not even account for late invoice payments or incorrectly extended credit terms.

Think about all the wrong long-term conclusions being drawn from this bad data. This single data problem creates indirect costs in excess of three times the initial, direct impact of the unpaid invoices.
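
The arithmetic behind that claim, using the figures above (the 3x indirect multiplier is an estimate from this evaluation, not a universal constant):

```python
# Worked example from the figures above; the 3x indirect multiplier is
# this assessment's estimate, not a universal constant.
direct_monthly_loss = 3_000_000   # unpaid invoices per month
indirect_multiplier = 3           # wrong conclusions, rework, credit terms

direct_annual = direct_monthly_loss * 12               # $36M per year
indirect_annual = direct_annual * indirect_multiplier  # $108M per year
print(f"total annual exposure: ${direct_annual + indirect_annual:,}")
# -> total annual exposure: $144,000,000
```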

Want to fix your data and overcome the accelerating cost of change? Involve your marketing, CEM, strategy, finance and sales leaders to help them understand data’s impact on the bottom line.

Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks. While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.

Posted in Business Impact / Benefits, Business/IT Collaboration, Data Governance, Data Integration, Data Quality, Operational Efficiency

Why Healthcare Data Professionals belong at Informatica World 2014

This year, over one dozen healthcare leaders will share their knowledge on data-driven insights at Informatica World 2014, spread across six tracks and over 100 breakout sessions. We are only five weeks away, and I am excited that the healthcare path has grown 220% from 2013!


Join us for these healthcare sessions:

  • Moving From Vision to Reality at UPMC: Structuring a Data Integration and Analytics Program: University of Pittsburgh Medical Center (UPMC) partnered with Informatica IPS to establish enterprise analytics as a core organizational competency through an Integration Competency Center engagement. Join IPS and UPMC to learn more.
  • HIPAA Validation for Eligibility and Claims Status in Real Time: Healthcare reform requires healthcare payers to exchange and process HIPAA messages in less time with greater accuracy. Learn how HealthNet tackled this challenge.
  • Application Retirement for Healthcare ROI: Dallas Children's Hospital needed to retire outdated operating systems, hardware, and applications while retaining access to their legacy data for compliance purposes. Learn why application retirement is critical to the healthcare industry, how Dallas Children's selected which applications to retire, and the healthcare-specific functionality that Informatica is delivering.
  • UPMC's story of implementing a Multi-Domain MDM healthcare solution in support of Data Governance: This presentation will unfold the UPMC story of implementing a Multi-Domain MDM healthcare solution as part of an overall enterprise analytics / data warehousing effort. MDM is a vital part of the overall architecture needed to support UPMC's efforts to improve the quality of patient care and help create methods for personalized medicine. The leading MDM solution developer will discuss how the team put together the roadmap, worked with domain-specific workgroups and created the trust matrix, and will share his lessons learned. He will also share what the team has planned for its consolidated and trusted Patient, Provider and Facility master data in this changing healthcare industry, and explain how the MDM program fits into the ICC (Integration Competency Center) currently implemented at UPMC.
  • Enterprise Codeset Repositories for Healthcare: Controlling the Chaos: Learn the benefit of a centralized storage point to govern and manage codes (ICD-9/10, CPT, HCPCS, DRG, SNOMED, Revenue, TOS, POS, Service Category, etc.), mappings and artifacts that reference codes.
  • Christus Health Roadmap to Data Driven Healthcare: To organize information and effectively deliver services in a hypercompetitive market, healthcare organizations must deliver data in an accurate, timely, efficient way while ensuring its clarity. Learn how CHRISTUS Health is developing and pursuing its vision for data management, including lessons adopted from other industries and the business case used to fund data management as a strategic initiative.
  • Business Value of Data Quality: This customer panel will address why data quality is a business imperative that significantly affects business success.
  • MD Anderson – Foster Business and IT Collaboration to Reveal Data Insights with Informatica: Is your integration team intimidated by the new Informatica 9.6 tools? Do your analysts and business users require faster access to data and answers about where data comes from? If so, this session is a must-attend.
  • The Many Faces of the Healthcare Customer: In the healthcare industry, the customer paying for services (individuals, insurers, employers, the government) is not necessarily the decision-influencer (physicians) or even the patient, and the provider comes in just as many varieties. Learn how Quest, the world's leading provider of diagnostic information, leverages master data management to resolve the chaos of serving 130M+ patients, 1,200+ payers, and almost half of all US physicians and hospitals.
  • Lessons in Healthcare Enterprise Information Management from St. Joseph Health and Sutter Health: St. Joseph Health created a business case for enterprise information management, then built a future-proofed strategy and architecture to unlock, share, and use data. Sutter Health engaged the business, established a governance structure, and freed data from silos for better organizational performance and efficiency. Come hear these leading health systems share their best practices and lessons learned in making data-driven care a reality.
  • NaviNet, Inc. and Informatica – Delivering Network Intelligence: The Value to the Payer, Provider and Patient: Today, healthcare payers and providers must share information in unprecedented ways to reduce redundancy, cut costs, coordinate care, and drive positive outcomes. Learn how NaviNet's vision of a "smart" communications network combines Big Data and network intelligence to share proactive, real-time information between insurers and providers.
  • Providence Health Services takes a progressive approach to automating ETL development and documentation: A newly organized team of BI generalists, most of whom had no ETL experience and fewer still Informatica skills, was tasked with Informatica development when Providence migrated from Microsoft SSIS to Informatica. Learn how the team relied on Informatica to alleviate the burden of low-value tasks.
  • Using IDE for a Data On-boarding Framework at HMS: HMS's core business is to onboard large amounts of external data that arrive in different formats. HMS developed a framework using IDE to standardize the on-boarding process. The tool can be used by non-IT analysts and provides standard profiling reports and reusable mapping "templates," which have improved the hand-off to IT and significantly reduced misinterpretations and errors.

Additionally, this year's attendees can take advantage of:

  1. Over 100 breakout sessions: Customers from other industries, including financial services, insurance, retail, manufacturing, and oil and gas, will share their data-driven stories.
  2. Healthcare networking reception on Wednesday, May 14th: Join your healthcare peers and Informatica's healthcare team on Wednesday from 6-7:30pm in the Vesper bar of the Cosmopolitan Resort for a private healthcare networking reception. Come hear firsthand how others are achieving a competitive advantage by maximizing return on data while enjoying hors d'oeuvres and cocktails.
  3. Data Driven Healthcare Roundtable Breakfast on Wednesday, May 14th: A customer-led roundtable discussion.
  4. Personal meetings: Since most of the Informatica team will be in attendance, this is a great opportunity to meet face to face with Informatica's product, services and solution teams.
  5. Informatica Pavilion and Partner Expo: Interact with the latest offerings Informatica and our partners provide.
  6. An expanded "Hands-on Lab": Learn from real-life case studies and talk to experts about your unique environment.

The healthcare industry is facing extraordinary changes and uncertainty, both from a business and a technology perspective. Join us to learn about key drivers for change and innovative uses of data technology solutions to discover sources for operational and process improvement. There is still time to register now!

Posted in Big Data, Business/IT Collaboration, Data Integration, Healthcare, Informatica World 2014, Master Data Management

Data Ambition: What Does It Take To Become an Analytics Leader?


Which comes first: innovation or analytics?

Bain & Company released some survey findings a few months back that actually put a value on big data.  Companies with advanced analytic capabilities, the consultancy finds, are twice as likely to be in the top quartile of financial performance within their industries; five times as likely to make decisions much faster than market peers; three times as likely to execute decisions as intended; and twice as likely to use data very frequently when making decisions.

This is all good stuff, and the survey, which covered the input of 400 executives, draws a direct correlation between big data analytics efforts and the business's bottom line. However, it raises a question: How does an organization become one of these analytics leaders? And there is a more brain-twisting question as well: would the type of organization that supports an advanced analytics culture be more likely to be ahead of its competitors because its management tends to be more forward-thinking on a lot of fronts, not just big data?

You just can't throw a big data or analytics program or solution set on top of the organization (or drop in a data scientist) and expect to be dazzled with sudden clarity and insight. If an organization is dysfunctional, with a lot of silos, fiefdoms, or calcified and uninspired management, all the big data in the world isn't going to lift its intelligence quotient.

The authors of the Bain & Company study, Travis Pearson and Rasmus Wegener, point out that "big data isn't just one more technology initiative"; in fact, "it isn't a technology initiative at all; it's a business program that requires technical savvy."

Succeeding with big data analytics requires a change in the organization’s culture, and the way it approaches problems and opportunities. The enterprise needs to be open to innovation and change. And, as Pearson and Wegener point out, “you need to embed big data deeply into your organization. It’s the only way to ensure that information and insights are shared across business units and functions. This also guarantees the entire company recognizes the synergies and scale benefits that a well-conceived analytics capability can provide.”

Pearson and Wegener also point to the following common characteristics of big data leaders they have studied:

Pick the “right angle of entry”: There are many areas of the business that can benefit from big data analytics, but just a few key areas that will really impact the business. It’s important to focus big data efforts on the right things. Pearson and Wegener say there are four areas where analytics can be relevant: “improving existing products and services, improving internal processes, building new product or service offerings, and transforming business models.”

Communicate big data ambition: Make it clear that big data analytics is a strategy that has the full commitment of management, and it’s a key part of the organization’s strategy. Messages that need to be communicated: “We will embrace big data as a new way of doing business. We will incorporate advanced analytics and insights as key elements of all critical decisions.” And, the co-authors add, “the senior team must also answer the question: To what end? How is big data going to improve our performance as a business? What will the company focus on?”

Sell and evangelize: Selling big data is a long-term process, not just one or two announcements at staff meetings. “Organizations don’t change easily and the value of analytics may not be apparent to everyone, so senior leaders may have to make the case for big data in one venue after another,” the authors caution. Big data leaders, they observe, have learned to take advantage of the tools at their disposal: they “define clear owners and sponsors for analytics initiatives. They provide incentives for analytics-driven behavior, thereby ensuring that data is incorporated into processes for making key decisions. They create targets for operational or financial improvements. They work hard to trace the causal impact of big data on the achievement of these targets.”

Find an organizational "home" for big data analysis: A common trend among big data leaders is that they have created an organizational home for their advanced analytics capability, "often a Center of Excellence overseen by a chief analytics officer," according to Pearson and Wegener. This is where matters such as strategy, collection and ownership of data across business functions come into play. Organizations also need to plan how to generate insights, and how to prioritize opportunities and allocate data analysts' and scientists' time.

There is a hope and perception that adopting data analytics will open up new paths to innovation. But it often takes an innovative spirit to open up analytics.

Posted in Big Data, Business Impact / Benefits, Data Integration

The Need for Specialized SaaS Analytics


SaaS companies are growing rapidly and becoming the top priority for most CIOs. With such high growth expectations, many SaaS vendors are investing in sales and marketing to acquire new customers even if it means having a negative net profit margin as a result. Moreover, with the pressure to grow rapidly, there is an increased urgency to ensure that the Average Sales Price (ASP) of every transaction increases in order to meet revenue targets.

The nature of the cloud allows SaaS companies to release new features every few months, which sales reps can then promote to new customers. When new functionality is neither used nor understood, customers often feel that they have overpaid for a SaaS product. In such cases, customers usually downgrade to a lower-priced edition or, worse, leave the vendor entirely. To make up for this loss, sales representatives must work harder to acquire new leads, which leaves less attention for existing customers. Preventing customer churn is therefore critical, and the economics favor retention: the Cost to Acquire a Customer (CAC) for upsells is only 19% of the CAC to acquire new customer dollars, and the CAC to renew existing customers is just 15%.
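
Put concretely (the $1.00 baseline below is purely illustrative):

```python
# The CAC ratios above, made concrete. The $1.00 baseline is illustrative.
new_customer_cac = 1.00                # spend to acquire $1 of new-customer revenue
upsell_cac = 0.19 * new_customer_cac   # upsells: 19 cents per dollar
renewal_cac = 0.15 * new_customer_cac  # renewals: 15 cents per dollar

for label, cac in [("new customer", new_customer_cac),
                   ("upsell", upsell_cac),
                   ("renewal", renewal_cac)]:
    print(f"{label}: ${cac:.2f} per $1 of revenue")
```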

Accurate customer usage data helps determine which features customers use and which are underutilized. Gathering this data can help pinpoint high-value features that go unused, especially by customers who have recently upgraded to a higher edition. Collecting this data involves several touch points, from recording clicks within the app to analyzing the open rates of entire modules. This is where embedded cloud integration comes into play.

Embedding integration within a SaaS application allows vendors to gain operational insights into each aspect of how their app is being used. With this data, vendors are able to provide feedback to product management in regards to further improvements. Additionally, embedding integration can alert the customer success management team of potential churn, thereby allowing them to implement preventative measures.
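
A minimal sketch of the kind of churn signal embedded usage data makes possible; the thresholds, field names and accounts are all hypothetical:

```python
# Flag accounts that upgraded recently but never touched the features they
# paid for. Thresholds, field names, and accounts are all hypothetical.
accounts = [
    {"name": "Acme",   "days_since_upgrade": 45,  "premium_features_used": 0},
    {"name": "Globex", "days_since_upgrade": 200, "premium_features_used": 7},
]

def churn_risk(acct):
    # Rule of thumb: upgraded within 90 days, premium features still unused.
    return acct["days_since_upgrade"] <= 90 and acct["premium_features_used"] == 0

at_risk = [a["name"] for a in accounts if churn_risk(a)]
print("alert customer success about:", at_risk)  # -> ['Acme']
```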

To learn more about how a specialized analytics environment can be set up for SaaS apps, join Informatica and Gainsight on April 9th at 10am PDT for an informational webinar Powering Customer Analytics with Embedded Cloud Integration.

Posted in Cloud Computing, Data Integration, SaaS

Fire your Data Scientists – They Don’t Add Value

Years ago, I was on a project to improve production and product quality through data analysis. During the project, I heard one man say:

“If I had my way, I’d fire the statisticians – all of them – they don’t add value”. 

Surely not? Why would you fire the very people employed to make sense of the vast volumes of manufacturing data and to guide future production? But he was right. The problem was that, at the time, data management was so poor that the data simply was not available for the statisticians to analyze.

So, perhaps this title should be re-written to be: 

Fire your Data Scientists – They Aren’t Able to Add Value.

Although this statement is a bit extreme, the same situation may still exist. Data scientists frequently share frustrations such as:

  • “I’m told our data is 60% accurate, which means I can’t trust any of it.”
  • “We achieved our goal of an answer within a week by working 24 hours a day.”
  • “Each quarter we manually prepare 300 slides to anticipate all questions the CFO may ask.”
  • “Fred manually audits 10% of the invoices.  When he is on holiday, we just don’t do the audit.”

This is why I think the original quote is so insightful.  Value from data is not automatically delivered by hiring a statistician, analyst or data scientist. Even with the latest data mining technology, one person cannot positively influence a business without the proper data to support them.

Most organizations are unfamiliar with the structure required to deliver value from their data. New storage technologies will be introduced, and a variety of analytics tools will be tried and tested. This change is crucial to success. For statisticians to add value to a company, they must have access to high-quality data that is easily sourced and integrated, and that data must be available through the latest analytics technology. The new ecosystem should provide insights that can play a role in future production, and staff will need to be trained as this new data is incorporated into daily decision making.

With a rich 20-year history, Informatica understands data ecosystems. Employees become wasted investments when they do not have access to the trusted data they need in order to deliver their true value.

Who wants to spend their time recreating data sets to find a nugget of value only to discover it can’t be implemented?

Build an analytical ecosystem with a balanced focus on all aspects of data management, and value delivery will be limited only by the imagination of your employees. Rather than questioning the value of an analytics team, you will attract some of the best and the brightest. Then, you will finally be able to deliver on the promised value of your data.

Posted in Big Data, Business Impact / Benefits, Data Integration, Data Integration Platform, Data Warehousing

The Links Between Health Information Exchanges and Data Integration

According to Health IT Portal, "Having an integrated health IT infrastructure allows a healthcare organization and its providers to streamline the flow of data from one department to the next. Not all health settings, however, find themselves in this situation. Either through business agreements or vendor selection processes, many a healthcare organization has to spend considerable time and resources getting their disparate health IT systems to talk to each other."

In other words, you can't leverage Health Information Exchanges (HIEs) without a sound data integration strategy. This is something I've ranted about for years. The foundation of any entity-to-entity exchange, in health, finance, or any other domain, is that all relevant systems freely communicate and are thus able to consume and produce the information the exchange requires.
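
In practice, "freely communicate" means every system can map its local records into whatever common format the exchange expects. A deliberately simplified sketch, with invented field names (real exchanges rely on standards such as HL7):

```python
# Map a local EHR record into a simplified common exchange format.
# Field names are invented; real exchanges rely on standards such as HL7.
local_ehr_record = {
    "pt_first": "Ana", "pt_last": "Smith",
    "dob": "1972-04-09", "mrn": "MED-44812",
}

def to_exchange_format(rec):
    return {
        "patient_name": f"{rec['pt_first']} {rec['pt_last']}",
        "date_of_birth": rec["dob"],
        "local_identifier": {"system": "hospital-ehr", "value": rec["mrn"]},
    }

print(to_exchange_format(local_ehr_record))
```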

The article cites the case of Memorial Healthcare, a community health care system in Owosso, MI. Memorial Healthcare has Meditech on the hospital side and Allscripts in its physician offices. Frank Fear, the CIO of Memorial Healthcare, spent the last few years working on solutions to enable data integration. The resulting solution between the two vendors’ offerings, as well as within the same system, is made up of both an EHR and a practice management solution.

Those in the world of healthcare are moving headlong into these exchanges. Most have no clue as to what must change within internal IT to get ahead of the need for the free flow of information. Moreover, there needs to be a good data governance strategy in place, as well as security, and a focus on compliance issues as well.

The reality is that, for the most part, data integration in the world of healthcare is largely ad-hoc, and tactical in nature. This has led to no standardized method for systems to talk one-to-another, and certainly no standard ways for data to flow out through exchanges. Think of plumbing that was built haphazardly and ad hoc over the years, with whatever was quick and easy. Now, you’ve finally turned on the water and there are many, many leaks.

In terms of data integration, healthcare has been underfunded for far too long. Now clear regulatory changes require better information management and security approaches. Unfortunately, healthcare IT is way behind in leveraging both proper data integration approaches and the right data integration technology.

As things change in the world of healthcare, including the move to HIEs, I suspect that data integration will finally get a hard look from those who manage IT in healthcare organizations. However, they need to do this with some sound planning, which should include an understanding of what the future holds in terms of information management, and how to create a common infrastructure that supports most of the existing and future use cases. Healthcare, you’re about 10 years behind, so let’s get moving this year.

Posted in Data Integration, Healthcare

Architects: 8 Great Reasons To Be At Informatica World 2014

For years, corporations have tried to solve business problems by "throwing new enterprise applications at them." This strategy has created 'silos of data' that are attached to these applications and databases. As a result, it is now increasingly difficult to find, access and use the correct data for new projects.

Data fragmentation leads to slow project delivery and "shadow IT." Without a change in enterprise data strategy, it is unlikely this problem will improve. On the contrary, the growth of cloud applications and platforms, mobile applications, NoSQL, and the "Internet of Things" creates increasing urgency. Unless a new approach to enterprise data management is taken, the house of cards is going to crumble.

I seriously doubt that this is a surprise to most of you. The question is, “What should we do about it?” The Informatica World 2014 event is a perfect place to find answers. Here are eight benefits architects will enjoy at Informatica World 2014:

  1. A dedicated track of breakout sessions for architects. This track explores reference architectures, design patterns, best practices and real-world examples for building and sustaining your next-generation information architecture. The track will begin with a keynote on the broader issues of enterprise data architecture and will also include a panel of architects from leading companies.
  2. Inspiration to participate in defining the enterprise data architecture of the future. (I can’t spoil it by divulging the details here, but I promise that it will be interesting and worth your while!)
  3. New insights on how to manage information for the data-centric enterprise. These will expand your thinking about data architecture and platforms.
  4. Chances to network with architect peers.
  5. A Hands-on Lab, where you can talk to Informatica experts about their products and solutions.
  6. Updates on what Informatica is doing to bring business subject matter experts in as full partners in the co-development of data-related projects.
  7. A chance to win a one-hour one-on-one session with an Informatica architect at Informatica World.
  8. A chance to learn to control your biggest enterprise system: The collection of all data-moving resources across your company.

We believe that architecture is the key to unleashing information potential. To compete in today’s global 24x7x365 economy, business requires well-designed information architectures that can continuously evolve to support the new heights of flexibility, efficiency, responsiveness, and trust. I hope you will join us in Las Vegas, May 12-15!

Posted in Data Integration, Data Integration Platform, Enterprise Data Management, Informatica World 2014