Tag Archives: Informatica
Strata 2015 – Making Data Work for Everyone with Cloud Integration, Cloud Data Management and Cloud Machine Learning
Are you ready to answer “Yes” to the questions:
a) “Are you Cloud Ready?”
b) “Are you Machine Learning Ready?”
I meet with hundreds of Informatica Cloud customers and prospects every year. While they are investing in Cloud, and seeing the benefits, they also know that there is more innovation out there. They’re asking me, what’s next for Cloud? And specifically, what’s next for Informatica in regards to Cloud Data Integration and Cloud Data Management? I’ll share more about my response throughout this blog post.
The spotlight will be on Big Data and Cloud at the Strata + Hadoop World conference taking place in Silicon Valley from February 17-20 with the theme “Make Data Work”. I want to focus this blog post on two topics related to making data work and business insights:
- How existing cloud technologies, innovations and partnerships can help you get ready for the new era in cloud analytics.
- How you can make data work in new and advanced ways for every user in your company.
Today, Informatica is announcing the availability of its Cloud Integration Secure Agent on Microsoft Azure and Linux Virtual Machines as well as an Informatica Cloud Connector for Microsoft Azure Storage. Users of Azure data services such as Azure HDInsight, Azure Machine Learning and Azure Data Factory can make their data work with access to the broadest set of data sources including on-premises applications, databases, cloud applications and social data. Read more from Microsoft about their news at Strata, including their relationship with Informatica, here.
“Informatica, a leader in data integration, provides a key solution with its Cloud Integration Secure Agent on Azure,” said Joseph Sirosh, Corporate Vice President, Machine Learning, Microsoft. “Today’s companies are looking to gain a competitive advantage by deriving key business insights from their largest and most complex data sets. With this collaboration, Microsoft Azure and Informatica Cloud provide a comprehensive portfolio of data services that deliver a broad set of advanced cloud analytics use cases for businesses in every industry.”
Even more exciting is how quickly any user can deploy a broad spectrum of data services for cloud analytics projects. The fully managed cloud service for building predictive analytics solutions from Azure and the wizard-based, self-service cloud integration and data management user experience of Informatica Cloud help overcome the challenges most users face in making their data work effectively and efficiently for analytics use cases.
The new solution enables companies to bring in data from multiple sources for use in Azure data services including Azure HDInsight, Azure Machine Learning, Azure Data Factory and others – for advanced analytics.
The broad availability of Azure data services, and Azure Machine Learning in particular, is a game changer for startups and large enterprises. Startups can now access cloud-based advanced analytics with minimal cost and complexity and large businesses can use scalable cloud analytics and machine learning models to generate faster and more accurate insights from their Big Data sources.
Success in using machine learning requires not only great analytics models, but also end-to-end cloud integration and data management capabilities: bringing in a wide breadth of data sources, ensuring that data quality and data views match the requirements of machine learning modeling, and offering an ease of use that speeds iteration while providing high-performance, scalable data processing.
For example, the Informatica Cloud solution on Azure is designed to deliver on these critical requirements in a complementary approach and support advanced analytics and machine learning use cases that provide customers with key business insights from their largest and most complex data sets.
Using the Azure Storage connector with Informatica Cloud Data Integration enables optimized reads and writes of data to blobs in Azure Storage. Customers can use Azure Storage objects as sources, lookups, and targets in data synchronization tasks and advanced mapping configuration tasks for efficient data management using Informatica’s industry-leading cloud integration solution.
As Informatica fulfills the promise of “making great data ready to use” to our 5,500 customers globally, we continue to form strategic partnerships and develop next-generation solutions to stay one step ahead of the market with our Cloud offerings.
My goal in 2015 is to help each of our customers say that they are Cloud Ready! And collaborating with solutions such as Azure ensures that our joint customers are also Machine Learning Ready!
To learn more, try our free Informatica Cloud trial for Microsoft Azure data services.
The signs that healthcare is becoming a more consumer (think patients, members, providers) driven industry are evident all around us. I see provider and payer organizations clamoring for more data, specifically data that is actionable, relatable and has integrity. Armed with this data, healthcare organizations are able to differentiate around a member/patient-centric view.
These consumer-centric views convey the total value of the relationships healthcare organizations have with consumers. Understanding the total value creates a more comprehensive understanding of consumers because these views deliver a complete picture of an individual’s critical relationships, including patient to primary care provider, member to household, provider to network and even members to legacy plans. This is the type of knowledge that informs new treatments, targets preventative care programs and improves outcomes.
Payer organizations are collecting and analyzing data to identify opportunities for more informed care management and segmentation to reach new, high-value customers in individual markets. By segmenting and targeting messaging to specific populations, health plans increase member satisfaction and cost-effectively expand and manage provider networks.
How will they accomplish this? Enabling members to interact in health and wellness forums, analyzing member behavior and trends and informing care management programs with a 360 view of members… to name a few. Payers will also drive new member programs, member retention and member engagement marketing and sales programs by investigating complete views of member households and market segments.
In the provider space, this relationship building can be a little more challenging because often consumers as patients do not interact with their doctor unless they are sick, creating gaps in data. When provider organizations have a better understanding of their patients and providers, they can increase patient satisfaction and proactively offer preventative care to the sickest (and most likely to engage) of patients before an episode occurs. These activities result in increased market share and improved outcomes.
Where can providers start? By creating a 360 view of the patient, organizations can now improve care coordination, open new patient service centers and develop patient engagement programs.
Analyzing populations of patients, and fostering patient engagement based on Meaningful Use requirements or Accountable Care requirements, building out referral networks and developing physician relationships are essential ingredients in consumer engagement. Knowing your patients and providing a better patient experience than your competition will differentiate provider organizations.
You may say, “This all sounds great, but how does it work?” An essential ingredient is clean, safe and connected data, and that requires an investment in data as an asset. Just as you invest in real estate and human capital, you must invest in the accessibility and quality of your data. To be successful, arm your team with tools to govern data: ensuring the ongoing integrity and quality of data, removing duplicate records, and dynamically incorporating data validation/quality rules. These tools, which include master data management, data quality and metadata management, are focused on information quality and support a total customer relationship view of members, patients and providers.
“Raw materials costs are the company’s single largest expense category,” said Steve Jenkins, Global IT Director at Valspar, at MDM Day in London. “Data management technology can help us improve business process efficiency, manage sourcing risk and reduce RFQ cycle times.”
Valspar is a $4 billion global manufacturing company, which produces a portfolio of leading paint and coating brands. At the end of 2013, the 200-year-old company celebrated record sales and earnings. They also completed two acquisitions. Valspar now has 10,000 employees operating in 25 countries.
As is the case for many global companies, growth creates complexity. “Valspar has multiple business units with varying purchasing practices. We source raw materials from 1,000s of vendors around the globe,” shared Steve.
“We want to achieve economies of scale in purchasing to control spending,” Steve said as he shared Valspar’s improvement objectives. “We want to build stronger relationships with our preferred vendors. Also, we want to develop internal process efficiencies to realize additional savings.”
Poorly managed vendor and raw materials data was impacting Valspar’s buying power
The Valspar team, who sharply focuses on productivity, had an “Aha” moment. “We realized our buying power was limited by the age and quality of available vendor data and raw materials data,” revealed Steve.
The core vendor data and raw materials data that should have been the same across multiple systems wasn’t. Data was often missing or wrong. This made it difficult to calculate the total spend on raw materials. It was also hard to calculate the total cost of expedited freight of raw materials. So, employees used a manual, time-consuming and error-prone process to consolidate vendor data and raw materials data for reporting.
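The manual reconciliation described above usually starts with matching vendor names that are spelled differently across systems. As a toy illustration (this is not Informatica’s actual matching engine, and the vendor names are made up), a similarity-based duplicate check might look like this in Python:

```python
from difflib import SequenceMatcher

def normalize(name: str) -> str:
    """Crude normalization: lowercase, strip punctuation and common legal suffixes."""
    cleaned = "".join(ch for ch in name.lower() if ch.isalnum() or ch.isspace())
    for suffix in ("inc", "llc", "ltd", "corp", "co"):
        cleaned = cleaned.removesuffix(" " + suffix)
    return " ".join(cleaned.split())

def likely_duplicates(records, threshold=0.85):
    """Pair up vendor names whose normalized similarity exceeds the threshold."""
    pairs = []
    for i, a in enumerate(records):
        for b in records[i + 1:]:
            score = SequenceMatcher(None, normalize(a), normalize(b)).ratio()
            if score >= threshold:
                pairs.append((a, b))
    return pairs

vendors = ["Acme Chemicals Inc.", "ACME Chemicals", "Pacific Resins Ltd", "Atlantic Pigments"]
print(likely_duplicates(vendors))
```

Real MDM matching uses far richer rules (addresses, tax IDs, phonetic matching, survivorship), but even this sketch shows why exact-string comparison misses most duplicates.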
These data issues were getting in the way of achieving their improvement objectives. Valspar needed a data management solution.
Valspar needed a single trusted source of vendor and raw materials data
The team chose Informatica MDM, master data management (MDM) technology. It will be their enterprise hub for vendors and raw materials. It will manage this data centrally on an ongoing basis. With Informatica MDM, Valspar will have a single trusted source of vendor and raw materials data.
Informatica PowerCenter will access data from multiple source systems. Informatica Data Quality will profile the data before it goes into the hub. Then, after Informatica MDM does its magic, PowerCenter will deliver clean, consistent, connected and enriched data to target systems.
Better vendor and raw materials data management results in cost savings
Valspar expects to gain the following business benefits:
- Streamline the RFQ process to accelerate raw materials cost savings
- Reduce the total number of raw materials SKUs and vendors
- Increase productivity of staff focused on pulling and maintaining data
- Leverage consistent global data visibility to:
  - increase leverage during contract negotiations
  - improve acquisition due diligence reviews
  - facilitate process standardization and reporting
Valspar’s vision is to transform data and information into trusted organizational assets
“Mastering vendor and raw materials data is Phase 1 of our vision to transform data and information into trusted organizational assets,” shared Steve. In Phase 2 the Valspar team will master customer data so they have immediate access to the total purchases of key global customers. In Phase 3, Valspar’s team will turn their attention to product or finished goods data.
Steve ended his presentation with some advice. “First, include your business counterparts in the process as early as possible. They need to own and drive the business case as well as the approval process. Also, master only the vendor and raw materials attributes required to realize the business benefit.”
Want more? Download the Total Supplier Information Management eBook. It covers:
- Why your fragmented supplier data is holding you back
- The cost of supplier data chaos
- The warning signs you need to be looking for
- How you can achieve Total Supplier Information Management
In my recent blog post, What Do Millennials Want?, I mentioned that we are bringing you a special blog series, “Informatica Interns Ideas: 2014.” Today’s post, the first in the series, comes from Amitha Narayanan, a Technical Writing Documentation Intern located in our Redwood City HQ office. Amitha is pursuing her Masters in Technical Communication at North Carolina State University. When Amitha is not authoring technical content, you can find her outdoors, exploring the city, reading books on park benches, or doing yoga.
A whole new world in 81 days…
I was caught between two very distinct emotions as I started my technical documentation internship. On the one hand, I was excited about rejoining the workforce after a year of being away in grad school. On the other, I had the apprehension of a six-year-old on the first day of elementary school. I also had a nagging voice inside my head telling me that at 33, I might be the oldest intern in the group.
Well, seven weeks into the internship, I have kissed the voice good bye. And I can scream from any rooftop that this is by far the best decision I have made for myself.
If there is one phrase that sums up my intern experience, it is “a journey of (self) discovery.”
By the end of my first week here, I realized the following things:
- There is no such thing as a dumb question. This internship has taught me to ask without apology. The six words that made it possible are “that’s what we are here for.”
- It takes a leader to make another. I see that all around me in managers, mentors, and senior colleagues in the team, who go above and beyond to groom and instill in me the confidence to see myself becoming like them one day. Like they say, “it takes one to know one, show one, and grow one.”
- I am as important as any full-time employee. I might be here for just three months, but my growth and development are just as crucial as any full-time new hire’s.
- I won’t be spoon fed, which is the BEST part. Everyone is helpful and points me to where the information is or who the person is that can answer my questions. I love that it simply doesn’t come to me like breakfast in bed on a romantic weekend.
I also know myself better now. I am sure that I love being in an environment that expects me to hit the ground running; I work well with someone who is direct and simply cuts to the chase. I also now know that age is just a number and there is no better time than now to experience what it is like to be part of an organization that is truly young at heart.
“Inaccurate, inconsistent and disconnected supplier information prohibits us from doing accurate supplier spend analysis, leveraging discounts, comparing and choosing the best prices, and enforcing corporate standards.”
This is quotation from a manufacturing company executive. It illustrates the negative impact that poorly managed supplier information can have on a company’s ability to cut costs and achieve revenue targets.
Many supply chain and procurement teams at large companies struggle to see the total relationship they have with suppliers across product lines, business units and regions. Why? Supplier information is scattered across dozens or hundreds of Enterprise Resource Planning (ERP) and Accounts Payable (AP) applications. Too much valuable time is spent manually reconciling inaccurate, inconsistent and disconnected supplier information in an effort to see the big picture. All this manual effort results in back office administrative costs that are higher than they should be.
Do these quotations from supply chain leaders and their teams sound familiar?
“We have 500,000 suppliers. 15-20% of our supplier records are duplicates. 5% are inaccurate.”
“I get 100 e-mails a day questioning which supplier to use.”
“To consolidate vendor reporting for a single supplier between divisions is really just a guess.”
“Every year 1099 tax mailings get returned to us because of invalid addresses, and we pay a lot of Schedule B fines to the IRS.”
“Two years ago we spent a significant amount of time and money cleansing supplier data. Now we are back where we started.”
Please join me and Naveen Sharma, Director of the Master Data Management (MDM) Practice at Cognizant for a Webinar, Supercharge Your Supply Chain Applications with Better Supplier Information, on Tuesday, July 29th at 11 am PT.
During the Webinar, we’ll explain how better managing supplier information can help you achieve the following goals:
- Accelerate supplier onboarding
- Mitigate the risk of supply disruption
- Better manage supplier performance
- Streamline billing and payment processes
- Improve supplier relationship management and collaboration
- Make it easier to evaluate non-compliance with Service Level Agreements (SLAs)
- Decrease costs by negotiating favorable payment terms and SLAs
I hope you can join us for this upcoming Webinar!
I have a little fable to tell you…
This fable has nothing to do with Big Data, but instead deals with an Overabundance of Food and how to better digest it to make it useful.
And it all started when this SEO copywriter from IT Corporation walked into a bar, pub, grill, restaurant, liquor establishment, and noticed 2 large crowded tables. After what seemed like an endless loop, an SQL programmer sauntered in and contemplated the table problem. “Mind if I join you?” he asked. Since the tables were partially occupied and there were no virtual tables available, the host looked on the patio of the restaurant at 2 open tables. “Shall I do an outside join instead?” asked the programmer. The host considered their schema and assigned 2 seats to the space.
The writer told the programmer to look at the menu, bill of fare, blackboard – there were so many choices but not enough real nutrition. “Hmmm, I’m hungry for the right combination of food, grub, chow, to help me train for a triathlon” he said. With that contextual information, they thought about foregoing the menu items and instead getting in the all-you-can-eat buffer line. But there was too much food available and despite its appealing looks in its neat rows and columns, it seemed to be mostly empty calories. They both realized they had no idea what important elements were in the food, but came to the conclusion that this restaurant had a “Big Food” problem.
They scoped it out for a moment and then the writer did an about face, reversal, change in direction and the SQL programmer did a commit and quick pivot toward the buffer line where they did a batch insert of all of the food, even the BLOBS of spaghetti, mashed potatoes and jello. There was far too much and it was far too rich for their tastes and needs, but they binged and consumed it all. You should have seen all the empty dishes at the end – they even caused a stack overflow. Because it was a batch binge, their digestive tracts didn’t know how to process all of the food, so they got a stomach ache from “big food” ingestion – and it nearly caused a core dump – in which case the restaurant host would have assigned his most dedicated servers to perform a thorough cleansing and scrubbing. There was no way to do a rollback at this point.
It was clear they needed relief. The programmer did an ad hoc query to JSON, their Server who they thought was Active, for a response about why they were having such “big food” indigestion, and did they have packets of relief available. No response. Then they asked again. There was still no response. So the programmer said to the writer, “Gee, the Quality Of Service here is terrible!”
Just then, the programmer remembered a remedy he had heard about previously and so he spoke up. “Oh, it’s very easy just <SELECT>Vibe.Data.Stream from INFORMATICA where REAL-TIME is NOT NULL.”
Informatica’s Vibe Data Stream enables streaming food collection for real-time Big Food analytics, operational intelligence, and traditional enterprise food warehousing from a variety of distributed food sources at high scale and low latency. It enables the right food to be ingested at the right time, when nutrition is needed, without any need for binge or batch ingestion.
And so they all lived happily ever after and all was good in the IT Corporation once again.
Download Now and take your first steps to rapidly developing applications that sense and respond to streaming food (or data) in real-time.
“Start your master data management (MDM) journey knowing how it will deliver a tangible business outcome. Will it help your business generate revenue or cut costs? Focus on the business value you plan to deliver with MDM and revisit it often,” advises Michael Delgado, Information Management Director at Citrix during his presentation at MDM Day, the InformaticaWorld 2014 pre-conference program. MDM Day focused on driving value from business-critical information and attracted 500 people.
In Ravi Shankar’s recent MDM Day preview blog, Part 2: All MDM, All Day at Pre-Conference Day at InformaticaWorld, he highlights the amazing line up of master data management (MDM) and product information management (PIM) customers speakers, Informatica experts as well as our talented partner sponsors.
Here are my MDM Day fun facts and key takeaways:
- Did you know that every 2 seconds an aircraft with GE engine technology is taking off somewhere in the world?
GE Aviation’s Chief Enterprise Architect, Ginny Walker, presented “Operationalizing Critical Business Processes: GE Aviation’s MDM Story.” GE Aviation is a $22 billion company and a leading provider of jet engines, systems and services. Ginny shared the company’s multi-year journey to improve installed-base asset data management. She explained how the combination of data, analytics, and connectivity results in productivity improvements such as reducing up to 2% of the annual fuel bill and reducing delays. The keys to GE Aviation’s analytical MDM success were: 1) tying MDM to business metrics, 2) starting with a narrow scope, and 3) data stewards. Ginny believes that MDM is an enabler for the Industrial Internet and Big Data because it empowers companies to get insights from multiple sources of data.
- Did you know that EMC has made a $17 billion investment in acquisitions and is integrating more than 70 technology companies?
EMC’s Barbara Latulippe, aka “The Data Diva,” is the Senior Director of Enterprise Information Management (EIM). EMC is a $21.7 billion company that has grown through acquisition and has 60,000 employees worldwide. In her presentation, “Formula for Success: EMC MDM Best Practices,” Barbara warns that if you don’t have a data governance program in place, you’re going to have a hard time getting an MDM initiative off the ground. She stressed the importance of building a data governance council and involving the business as early as possible to agree on key definitions such as “customer.” Barbara and her team focused on the financial impact of higher quality data to build a business case for operational MDM. She asked her business counterparts, “Imagine if you could onboard a customer in 3 minutes instead of 15 minutes?”
- Did you know that Citrix is enabling the mobile workforce by uniting apps, data and services on any device over any network and cloud?
Citrix’s Information Management Director, Michael Delgado, presented “Citrix MDM Case Study: From Partner 360 to Customer 360.” Citrix is a $2.9 billion Cloud software company that embarked on a multi-domain MDM and data governance journey for channel partner, hierarchy and customer data. Because 90% of the company’s product bookings are fulfilled by channel partners, Citrix started their MDM journey to better understand their total channel partner relationship to make it easier to do business with Citrix and boost revenue. Once they were successful with partner data, they turned to customer data. They wanted to boost customer experience by understanding the total customer relationship across product lines and regions. Armed with this information, Citrix employees can engage customers in one product renewal process for all products. MDM also helps Citrix’s sales team with white space analysis to identify opportunities to sell more user licenses in existing customer accounts.
- Did you know Quintiles helped develop or commercialize all of the top 5 best-selling drugs on the market?
Quintiles’ Director of the Infosario Data Factory, John Poonnen, presented “Using Multi-domain MDM to Gain Information Insights: How Quintiles Efficiently Manages Complex Clinical Trials.” Quintiles is the world’s largest provider of biopharmaceutical development and commercial outsourcing services with more than 27,000 employees. John explained how the company leverages a tailored, multi-domain MDM platform to gain a holistic view of business-critical entities such as investigators, research facilities, clinical studies, study sites and subjects to cut costs, improve quality, improve productivity and to meet regulatory and patient needs. “Although information needs to flow throughout the process – it tends to get stuck in different silos and must be manually manipulated to get meaningful insights,” said John. He believes master data is foundational — combining it with other data, capabilities and expertise makes it transformational.
While I couldn’t attend the PIM customer presentations below, I heard they were excellent. I look forward to watching the videos:
- Crestline/ Geiger: Dale Denham, CIO presented, “How Product Information in eCommerce improved Geiger’s Ability to Promote and Sell Promotional Products.”
- Murdoch’s Ranch and Home Supply: Director of Marketing, Kitch Walker presented, “Driving Omnichannel Customer Engagement – PIM Best Practices.”
I also had the opportunity to speak with some of our knowledgeable and experienced MDM Day partner sponsors. Go to Twitter and search for #MDM and #DataQuality to see their advice on what it takes to successfully kick-off and implement an MDM program.
There are more thought-provoking MDM and PIM customer presentations taking place this week at InformaticaWorld 2014. To join or follow the conversation, use #INFA14 #MDM or #INFA14 #PIM.
“Trying to improve the quality of asset data when you don’t have a solid data management infrastructure in place is like trying to save a sinking boat with a bailing bucket,” explained Dean Balog, a senior principal consultant at Noah Consulting, in this webinar, Attention Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data.
Dean has 15 years of experience in information management in the utilities industry. In this interview, Dean and I discuss the top issues facing utility executives and how to improve the quality of mission-critical asset data for asset management / equipment maintenance and regulatory reporting, such as rate case submissions.
Q: Dean, what are the top issues facing utility executives?
A: The first issue is asset management / equipment maintenance. Knowing where to invest precious dollars is critical. Utility executives are engaged in a constant tug of war between two competing priorities: replacing aging infrastructure and regular maintenance.
Q. How are utility executives determining that balance?
A. You need to start with facts – the real costs and reliability information for each asset in your infrastructure. Without it, you are guessing. Basically, it is a data problem. Utility executives should ask themselves these questions:
- Do we have the ability to capture and combine cost and reliability information from multiple sources? Is it granular enough to be useful?
- Do we know the maintenance costs of eight-year-old breakers versus three-year-old breakers?
- Do our meters start failing around the average lifespan? For this example, let us say that is five years. Rather than falling uniformly into that average, do 30% of our meters fail in the first year and the rest last eight years? Those three extra years of life can certainly help out the bottom line.
Knowing your data makes all the difference. The right capital investment strategy requires combining performance, reliability, and cost data.
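Dean’s meter example can be made concrete with a few lines of arithmetic (the percentages and lifespans below are his illustrative numbers, not real utility data):

```python
# Two fleets can carry the same nominal "five-year" reputation yet behave
# very differently. Dean's skewed scenario: 30% of meters fail in year 1,
# the remaining 70% last eight years.
uniform_avg = 5.0                        # every meter lasts roughly 5 years
skewed_avg = 0.30 * 1 + 0.70 * 8         # = 5.9 years on average

# Early replacements expected per 100,000 meters under the skewed mix:
fleet = 100_000
early_failures_skewed = int(0.30 * fleet)  # 30,000 meters replaced in year 1
print(skewed_avg, early_failures_skewed)
```

Note that the skewed mix actually averages 5.9 years, not 5 — exactly the kind of discrepancy between a quoted average and the underlying failure distribution that only shows up when you have granular, trustworthy asset data.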
Q. Why is it difficult for utility executives to understand the real costs and reliability of assets?
A. I know this does not come as a shock, but most companies do not trust their data. Asset data is often inaccurate, inconsistent, and disconnected. Even the most basic data may not be available. For example, manufacture dates on breakers should be filled in, but they are not. If less than 50% of your breakers have manufacture dates, how can you build a preventative maintenance program? You do not even know what to address first!
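The 50% completeness figure Dean mentions is the output of a simple profiling check. A minimal sketch (the field names and records are hypothetical):

```python
# Toy completeness profile: what fraction of breaker records actually
# carry a manufacture date?
breakers = [
    {"id": "BRK-001", "manufacture_date": "2006-03-14"},
    {"id": "BRK-002", "manufacture_date": None},
    {"id": "BRK-003", "manufacture_date": "2011-07-02"},
    {"id": "BRK-004", "manufacture_date": None},
]

with_date = sum(1 for b in breakers if b["manufacture_date"])
completeness = with_date / len(breakers)
print(f"manufacture_date completeness: {completeness:.0%}")  # 50%
```

Running checks like this across every critical attribute is how you decide what to address first.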
A traditional approach to solving this data problem is to do a big data cleanup. You clean the data, and then before you know it, errors creep back in, and the trust in the data you have worked so hard to establish is lost.
I like to illustrate the pain of this issue by using the sinking boat analogy. Data cleanup is like bailing out the water collecting in the bottom of the boat. You think you are solving the problem but more water still seeps into the boat. You cannot stop bailing or you will sink. What you need to do is fix the leaks, and then bail out the boat. But, if you do not lift up your head from bailing long enough to see the leaks and make the right investments, you are fighting a losing battle.
Q. What can utility executives do to improve the quality of asset data?
A. First of all, you need to develop a data governance framework. Going back to the analogy, a data governance framework gives you the structure to find the leaks, fix the leaks, and monitor how much of the water has been bailed out. If the water level is still rising, you have not fixed all the leaks. But having a data governance framework is not the be-all and end-all.
You also need to appoint data stewards to be accountable for establishing and maintaining high quality asset data. The job of a data steward would be easy if there was only one system where all asset data resided. But the fact of the matter is that asset data is fragmented – scattered across multiple systems. Data stewards have a huge responsibility and they need to be supported by a solid data management infrastructure to ease the burden of managing business-critical asset information.
Master Data Management (MDM) ensures business-critical asset data is consistent everywhere by pulling together data that is scattered across multiple applications. It manages and masters it in a central location on a continuous basis and shares it with any applications that need that data. MDM provides a user interface and workflow for data stewards to manage the tangled web of names and IDs these assets are known by across systems. It also gives utilities a disciplined approach to manage important relationships between the asset data, such as an asset’s performance reliability and its cost.
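The “tangled web of names and IDs” can be pictured with a toy example: the same physical asset is known by a different local ID in each source system, and the hub keeps one master record plus the cross-reference map. (All system names, IDs, and the survivorship rule below are illustrative assumptions, not Informatica MDM’s actual behavior.)

```python
# The same transformer appears under three local IDs across systems.
source_records = [
    {"system": "EAM", "local_id": "TX-104",  "serial": "SN-998", "cost": 12500},
    {"system": "GIS", "local_id": "XFMR-77", "serial": "SN-998", "cost": None},
    {"system": "ERP", "local_id": "900412",  "serial": "SN-998", "cost": 12800},
]

def master_record(records):
    """Merge records sharing a serial number: keep an ID cross-reference
    and apply a simple survivorship rule (highest reported cost wins)."""
    return {
        "serial": records[0]["serial"],
        "xref": {r["system"]: r["local_id"] for r in records},
        "cost": max((r["cost"] for r in records if r["cost"] is not None),
                    default=None),
    }

golden = master_record(source_records)
print(golden["xref"])  # {'EAM': 'TX-104', 'GIS': 'XFMR-77', 'ERP': '900412'}
```

A real hub does this continuously, with configurable match and survivorship rules, and publishes the golden record back to the consuming applications.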
Q. Any other pressing issues facing utilities?
A. Yes. Another big issue is tightening regulations that consume investment dollars and become key inputs into rate case submissions and defenses. One of the complicating factors is the number of regulations is not only increasing, but the regulators are also requiring faster implementation times than ever before. So, utilities cannot just do what they have done in the past: throw more people at the problem in the short-term and resolve to fix it later by automating it “when things slow down.” That day never comes.
Q. How can utilities deal with these regulatory pressures?
A. Utilities need a new approach to deal with regulations. Start with the assumption that all data is fair game for regulators. All data must be accessible. You need to be able to report on it, not only to comply with regulations, but for competitive advantage. This requires the high quality asset information we talked about earlier, and an analytical application to:
- Perform what-if analyses for your asset investment program;
- Develop regulatory compliance or environmental reports quickly, because the hard work (integrating the data within your MDM program) has already been done; and
- Get access to granular, observed reliability and cost information using your own utility’s data – not benchmark data that is already a couple of years old and highly summarized.
Q. What is your advice for utility company executives?
A. If you are the one responsible for signing off on regulatory reports and you do not look good in an orange jumpsuit, you need to invest in a plan that includes people, process, and technology to support regulatory reporting and asset management / equipment maintenance.
- People – Data stewards have clear accountability for the quality of asset data.
- Process – Data governance is your game plan.
- Technology – A solid data management infrastructure consisting of data integration, data quality, and master data management is your means.
If you are responsible for asset management / equipment maintenance or regulatory reporting, particularly rate case submissions, check out this webinar: Attention Utility Executives: Don’t Waste Millions in Operating Costs Due to Bad Asset Data.
Our panel of utility data experts:
- Reveal the five toughest business challenges facing utility industry executives;
- Explain how bad asset data could be costing you millions of dollars in operating costs;
- Share three best practices for optimizing asset management / equipment maintenance and regulatory reporting with accurate, consistent, and connected asset information; and
- Show you how to implement these best practices with a demonstration.
Bad data is bad for business. Ovum Research reported that poor quality data is costing businesses at least 30% of revenues. Never before have business leaders across such a broad range of roles been so aware of the importance of using high quality information to drive business success. Leaders in functions ranging from marketing and sales to risk management and compliance have invested in world-class applications, six sigma processes, and the most advanced predictive analytics. So why are you not seeing more return on that investment? Simply put, if your business-critical data is a mess, the rest doesn’t matter.
Not all business leaders know there’s a better way to manage their business-critical data. So, I asked Dennis Moore, the senior vice president and general manager of Informatica’s MDM business, who clocked hundreds of thousands of airline miles last year visiting business leaders around the world, to talk about the impact of using accurate, consistent and connected data and the value business leaders can gain through master data management (MDM).
Q. Why are business leaders focusing on business-critical data now?
A. Leaders have always cared about their business-critical data, the master data on which their enterprises depend most — their customers, suppliers, the products they sell, the locations where they do business, the assets they manage, the employees who make the business perform. Leaders see the value of having a clear picture, or “best version of the truth,” describing these “master data” entities. But this is hard to come by amid competing priorities, mergers and acquisitions, and siloed systems.
As companies grow, business leaders start realizing there is a huge gap between what they do know and what they should know about their customers, suppliers, products, assets and employees. Even worse, most businesses have lost their ability to understand the relationships between business-critical data so they can improve business outcomes. Line of business leaders have been asking questions such as:
- How can we optimize sales across channels when we don’t know which customers bought which products from which stores, sites or suppliers?
- How can we quickly execute a recall when we don’t know which supplier delivered a defective part to which factory and where those products are now?
- How can we accelerate time-to-market for a new drug, when we don’t know which researcher at which site used which combination of compounds on which patients?
- How can we meet regulatory reporting deadlines, when we don’t know which model of a product we manufactured in which lot on which date?
Q. What is the crux of the problem?
A. The crux of the problem is that as businesses grow, their business-critical data becomes fragmented. There is no big picture because it’s scattered across applications, including on-premises applications (such as SAP, Oracle and PeopleSoft) and cloud applications (such as Salesforce, Marketo, and Workday). But it gets worse. Business-critical data changes all the time. For example:
- a customer moves, changes jobs, gets married, or changes their purchasing habits;
- a supplier moves, goes bankrupt, or acquires a competitor;
- you discontinue a product or launch a new one; or
- you onboard a new asset or retire an old one.
As all this change occurs, business-critical data becomes inconsistent, and no one knows which application has the most up-to-date information. This costs companies money. It saps productivity and forces people to do a lot of manual work outside their best-in-class processes and world-class applications. One question I always ask business leaders is, “Do you know how much bad data is costing your business?”
Q. What can business leaders do to deal with this issue?
A. First, find out where bad data is having the most significant impact on the business. It’s not hard – just about any employee can share stories of how bad data led to a lost sale, an extra “truck roll,” lost leverage with suppliers, or a customer service problem. From the call center to the annual board planning meeting, bad data results in sub-optimal decisions and lost opportunities. Work with your line of business partners to reach a common understanding of where an improvement can really make a difference. Bad master data is everywhere, but bad master data that has material costs to the business is a much more pressing and constrained problem. Don’t try to boil the ocean or bring a full-blown data governance maturity level 5 approach to your organization if it’s not already seeing success from better data!
Second, focus on the applications and processes used to create, share, and use master data. Often, a bit of training, a tweak to a process, or a new interface between systems can deliver very significant improvements for users without major IT work or process changes.
Lastly, look for a technology that is purpose-built to deal with this problem. Master data management (MDM) helps companies better manage business-critical data in a central location on an ongoing basis and then share that “best version of the truth” with all on-premises and cloud applications that need it.
Let’s use customer data as an example. If valuable customer data is located in applications such as Salesforce, Marketo, Siebel CRM, and SAP, MDM brings together all the business-critical data, the core that’s the same across all those applications, and creates the “best version of the truth.” It also creates the total customer relationship view across functions, product lines and regions, which CRM promised but never delivered.
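To make the “best version of the truth” idea concrete, here is a minimal sketch (with made-up records, field names, source systems, and a simple survivorship rule — not Informatica’s actual merge logic) of how an MDM hub might consolidate one customer’s fragmented records:

```python
from datetime import date

# Hypothetical records for the same customer, one per source system.
records = [
    {"source": "salesforce", "updated": date(2014, 3, 1),
     "name": "Jacqueline Geiger", "email": "jgeiger@example.com", "phone": None},
    {"source": "sap", "updated": date(2014, 1, 15),
     "name": "J. Geiger", "email": None, "phone": "555-0100"},
    {"source": "marketo", "updated": date(2013, 11, 2),
     "name": "Jakki Geiger", "email": "jakki@example.com", "phone": None},
]

# Survivorship rule (an assumption for this sketch): trust sources in this
# order, and break ties by preferring the most recently updated record.
SOURCE_PRIORITY = {"sap": 0, "salesforce": 1, "marketo": 2}

def golden_record(records, fields=("name", "email", "phone")):
    """Build a 'best version of the truth': for each field, take the first
    non-empty value from the highest-priority, freshest source."""
    ranked = sorted(
        records,
        key=lambda r: (SOURCE_PRIORITY[r["source"]], -r["updated"].toordinal()),
    )
    return {f: next((r[f] for r in ranked if r[f]), None) for f in fields}

print(golden_record(records))
# {'name': 'J. Geiger', 'email': 'jgeiger@example.com', 'phone': '555-0100'}
```

The consolidated record draws its name and phone from SAP and its email from Salesforce — no single source system held the complete picture, which is exactly the gap MDM fills before sharing the mastered record back out.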
MDM then shares that “mastered” customer data and the total customer relationship view with the applications that want it. MDM can be used to master the relationships between customers, such as legal entity hierarchies. This helps sales and customer service staff be more productive, while also improving legal compliance and management decision making. Advanced MDM products can also manage relationships across different types of master data. For example, advanced MDM enables you to relate an employee to a project to a contract to an asset to a commission plan. This ensures accurate and timely billing, effective expense management, managed supplier spend, and even improved workforce deployment.
When your sales team has the best possible customer information in Salesforce and the finance team has the best possible customer information in SAP, no one wastes time pulling together spreadsheets of information outside of their world-class applications. Your global workforce doesn’t waste time investigating whether Jacqueline Geiger in one system and Jakki Geiger in another system are one customer or two, or sending duplicate bills and marketing offers at a cost in both postage and customer satisfaction. All employees who have access to mastered customer information can be confident they have the best possible customer information available across the organization to do their jobs. And with the most advanced and intelligent data platform, all this information can be secured so only authorized employees, partners, and systems have access.
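The Jacqueline-vs-Jakki question is, at its core, a record-matching problem. Production MDM engines use probabilistic matching across many attributes (name, address, phone, date of birth, and so on); the toy sketch below, using only the Python standard library and thresholds chosen purely for illustration, shows the basic idea with name similarity alone:

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Case-insensitive string similarity in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def likely_same_customer(name_a, name_b):
    """Heuristic match on full names: surnames must agree closely, while
    given names are held to a looser bar, since nicknames ('Jakki' for
    'Jacqueline') diverge far more than surnames do."""
    first_a, last_a = name_a.lower().split()[0], name_a.lower().split()[-1]
    first_b, last_b = name_b.lower().split()[0], name_b.lower().split()[-1]
    return similarity(last_a, last_b) > 0.9 and similarity(first_a, first_b) > 0.35

print(likely_same_customer("Jacqueline Geiger", "Jakki Geiger"))  # True
print(likely_same_customer("Jacqueline Geiger", "John Smith"))    # False
```

A real matching engine would also weigh corroborating attributes and route borderline pairs to a data steward for review rather than deciding on a single score.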
Q. Which industries stand to gain the most from mastering their data?
A. In every industry there is some transformation going on that’s driving the need to know people, places and things better. Take insurance for example. Similar to the transformation in the travel industry that reduced the need for travel agents, the insurance industry is experiencing a shift from the agent/broker model to a more direct model. Traditional insurance companies now have an urgent need to know their customers so they can better serve them across all channels and across multiple lines of business.
In other industries, there is an urgent need to get a lot better at supply-chain management or to accelerate new product introductions to compete better with an emerging rival. Business leaders are starting to make the connection between transformation failures and a more critical need for the best possible data, particularly in industries undergoing rapid transformation, or with rapidly changing regulatory requirements.
Q. Which business functions seem most interested in mastering their business-critical data?
A. It varies by industry, but there are three common threads that seem to span most industries:
- MDM can help the marketing team optimize the cross-sell and up-sell process with high quality data about customers, their households or company hierarchies, the products and services they have purchased through various channels, and the interactions their organizations have had with these customers.
- MDM can help the procurement team optimize strategic sourcing including supplier spend management and supplier risk management with high quality data about suppliers, company hierarchies, contracts and the products they supply.
- MDM can help the compliance teams manage all the business-critical data they need to create regulatory reports on time without burning the midnight oil.
Q. How is the use of MDM evolving?
A. When MDM technology was first introduced a decade ago, it was used as a filter. It cleaned up business-critical data on its way to the data warehouse so you’d have clean, consistent, and connected information (“conformed dimensions”) for reporting. Now business leaders are investing in MDM technology to ensure that all of their global employees have access to high quality business-critical data across all applications. They believe high quality data is mission-critical to their operations. High quality data is viewed as the lifeblood of the company and will enable the next frontier of innovation.
Second, many companies mastered data in only one or two domains (customer and product), and used separate MDM systems for each. One system was dedicated to mastering customer data. You may recall the term Customer Data Integration (CDI). Another system was dedicated to mastering product data. Because the two systems were in silos and business-critical data about customers and products wasn’t connected, they delivered limited business value. Since that time, business leaders have questioned this approach because business problems don’t confine themselves to one type of data, such as customer or product, and many of the benefits of mastering data come from mastering other domains including supplier, chart of accounts, employee, and other master or reference data shared across systems.
The relationships between data matter to the business. Knowing which customer bought which product from which store or site is more valuable than just knowing your customer. The business insights you can gain from these relationships are limitless. Over 90% of our customers last year bought MDM because they wanted to master multiple types of data. Our customers value having all types of business-critical data in one system to deliver clean, consistent and connected data to their applications to fuel business success.
One last evolution we’re seeing a lot involves the types and numbers of systems connecting to the master data management system. In the past, there were a small number of operational systems pushing data through the MDM system into a data warehouse used for analytical purposes. Today, we have customers with hundreds of operational systems communicating with each other via an MDM system that has just a few milliseconds to respond, and which must maintain the highest levels of availability and reliability of any system in the enterprise. For example, one major retailer manages all customer information in the MDM system, using the master data to drive real-time recommendations as well as a level of customer service in every interaction that remains the envy of their industry.
Q. Dennis, why should business leaders consider attending MDM Day?
A. Business leaders should consider attending MDM Day at InformaticaWorld 2014 on Monday, May 12, 2014. You can hear first-hand the business value companies are gaining by using clean, consistent and connected information in their operations. We’re excited to have fantastic customers who are willing to share their stories and lessons learned. We have presenters from St. Jude Medical, Citrix, Quintiles and Crestline Geiger and panelists from Thomson Reuters, Accenture, EMC, Jones Lang Lasalle, Wipro, Deloitte, AutoTrader Group, McAfee-Intel, Abbvie, Infoverity, Capgemini, and Informatica among others.
Last year’s Las Vegas event, and the events we held in London, New York and São Paulo, were extremely well received. This year’s event is packed with even more customer sessions and opportunities to learn and to influence our product road map. MDM Day is one day before InformaticaWorld and is included in the cost of your InformaticaWorld registration. We’d love to see you there!
See the MDM Day Agenda.