Tag Archives: MDM

Master Data Management in the Oil and Gas Industry


The Oil and Gas (O&G) industry is a backbone of every economy. It is also an industry that has weathered the storm of constantly changing economic trends, regulations and technological innovations. O&G companies by nature run complex, data-intensive processes. To operate profitably under changing trends, policies and guidelines, O&G companies need to manage these data processes very well.

The industry is subject to price volatility driven by the microeconomics of supply and demand and shaped by geopolitical developments, economic downturns and public scrutiny. Competition from alternative sources such as cheap natural gas, combined with low margins, makes it hard for O&G companies to achieve sustainable, predictable results.

A recent PwC survey of oil and gas CEOs similarly concluded that “energy CEOs can’t control market factors such as the world’s economic health or global oil supply, but can change how they respond to market conditions, such as getting the most out of technology investments, more effective use of partnerships and diversity strategies.” The survey also revealed that nearly 80% of respondents agreed that digital technologies are creating value for their companies when it comes to data analysis and operational efficiency.

O&G firms run three distinct business operations: upstream exploration & production (E&P), midstream (storage & transportation) and downstream (refining & distribution). All of these operations need a few core data domains standardized for every major business process. However, a key challenge faced by O&G companies is that this critical core information is often spread across multiple disparate systems, making it hard to take timely decisions. To ensure effective operations and to grow their asset base, it is vital for these companies to capture and manage critical data related to these domains.

E&P core data domains include wellhead/materials (asset), geo-spatial location data and engineers/technicians (associate). Midstream adds trading partners and distribution, while downstream covers commercial and residential customers. Classic distribution use cases range from shipping locations and large-scale clients such as airlines and logistics providers buying millions of gallons of fuel and industrial lube products, down to individual gas station customers. The industry also relies heavily on reference data and the chart of accounts for financial cost and revenue roll-ups.

The main E&P asset, the well, goes through its life cycle and changes characteristics (location, ID, name, physical characterization, depth, crew, ownership, etc.), all of which are master data aspects to consider for this baseline entity. If we master this data and create a consistent representation across the organization, it can then be linked to transaction and interaction data so that O&G companies can drive investment decisions and allocate cost and revenue through reporting and real-time processes around the following (a simplified data-model sketch appears after the list):

  • Crew allocation
  • Royalty payments
  • Safety and environmental inspections
  • Maintenance and overall production planning
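
As a rough illustration of what linking mastered well data to transactional data can look like, here is a minimal sketch in Python. The entity names, fields and roll-up function (WellMaster, RoyaltyPayment, revenue_by_well) are hypothetical and only illustrate the idea of a consistent well identifier shared between master and transaction records; this is not an Informatica data model.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Dict, List, Optional

@dataclass
class WellMaster:
    """Golden record for a single well; attributes change over its life cycle."""
    well_id: str                      # stable master identifier shared across systems
    name: str
    latitude: float                   # geo-spatial location data
    longitude: float
    depth_m: Optional[float] = None
    operator: Optional[str] = None    # current ownership
    crew_ids: List[str] = field(default_factory=list)          # engineers/technicians
    source_ids: Dict[str, str] = field(default_factory=dict)   # cross-references to source systems

@dataclass
class RoyaltyPayment:
    """Transactional record linked back to the mastered well."""
    payment_id: str
    well_id: str                      # foreign key to WellMaster.well_id
    period_end: date
    amount_usd: float

def revenue_by_well(payments: List[RoyaltyPayment]) -> Dict[str, float]:
    """Roll up royalty transactions by the mastered well identifier."""
    totals: Dict[str, float] = {}
    for p in payments:
        totals[p.well_id] = totals.get(p.well_id, 0.0) + p.amount_usd
    return totals

payments = [RoyaltyPayment("RP-1", "W-100", date(2015, 1, 31), 12500.0),
            RoyaltyPayment("RP-2", "W-100", date(2015, 2, 28), 11800.0)]
print(revenue_by_well(payments))      # {'W-100': 24300.0}
```

The point of the sketch is simply that once every source system refers to the same well identifier, cost and revenue roll-ups like the one above become straightforward, whether they run in reports or in real-time processes.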

E&P firms need a solution that allows them to:

  • Have a flexible multidomain platform that permits easier management of different entities under one solution
  • Create a single, cross-enterprise instance of a wellhead master
  • Capture and master the relationships between the well, equipment, associates, land and location
  • Govern end-to-end management of assets, facilities, equipment and sites throughout their life cycle

The upstream O&G industry is uniquely positioned to take advantage of the vast amounts of data generated by its operations. Thousands of sensors at the wellhead, millions of parts in the supply chain, global capital projects and many highly trained staff create a data-rich environment. A well-implemented MDM program provides a strong foundation for this data-driven industry, delivering clean, consistent and connected core master data so these companies can cut material and IT maintenance costs and increase margins.

To learn more about how you can achieve upstream operational excellence with Informatica Master Data Management, check out this recorded webinar with @OilAndGasData.

~Prash
@MDMGeek
www.mdmgeek.com


IDMP: Do You Have Time to Implement MDM?

I have been looking into the Identification of Medicinal Products (IDMP) ISO standard, and the challenging EMA IDMP implementation deadline of July 1st, 2016, for almost a year now. Together with HighPoint Solutions, we have proposed that using a Master Data Management (MDM) system as a product data integration layer is the best way to ensure the product data required by IDMP is consolidated and delivered to the regulator. This message has been well received by almost all the pharmaceutical companies we have talked to.


During the past few months, support for using MDM as a key part of the IDMP solution stack has been growing. Supporters now include people who have been looking into the IDMP challenge and solutions for far longer than me: independent consultants, representatives of pharma companies with active projects and leading analysts have all expressed their support. At the IDMP Compliance Challenge and Regulatory Information Management conference held in Berlin last month, using MDM within the solution stack was a common theme, with a large percentage of presentations referencing the technology.

However, at this conference an objection to the use of MDM was circulating during the coffee break.  Namely:

Do we really have time to implement MDM, and achieve compliance before July 2016?

First, let’s revisit why MDM is a favoured approach for IDMP. The data required for compliance is typically dispersed across 10+ internal and external data silos, so building a data integration layer to prepare data for submission is the currently favoured approach. It has popular support from both industry insiders and independent consultants. The data integration layer is seen as a good technical approach to overcome a number of challenges which IDMP poses in its initial draft guidance, namely:

  • Organisational: It has to make it easy for data owners to contribute to the quality of data
  • Technical: It needs to integrate data from multiple systems, cleaning and resolving attributes using as much automation as possible
  • Co-ordination: The layer must ensure data submitted is consistent across regulations, and also within internal transactional systems
  • Timing: Projects must begin now, and pose low technical risk in order to meet the deadline.

MDM technology is an excellent fit to address these challenges (for a high level summary, see here).
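
To make the idea of a product data integration layer more concrete, here is a deliberately simplified sketch in Python. The source systems, attribute names and merge rule are hypothetical illustrations only; a real IDMP submission would use the ISO IDMP data elements and controlled vocabularies, and a real MDM hub would apply proper survivorship, validation and data-quality rules rather than a last-source-wins merge.

```python
from typing import Dict, List

# Hypothetical extracts from three internal silos; in practice 10+ sources contribute
regulatory_sys = {"PRD-001": {"product_name": "Examplinol 10 mg tablets",
                              "marketing_authorisation_holder": "Example Pharma Ltd"}}
clinical_sys = {"PRD-001": {"indication": "hypertension"}}
erp_sys = {"PRD-001": {"pack_size": "30 tablets", "strength": "10 mg"}}

def consolidate(product_id: str, sources: List[Dict]) -> Dict:
    """Merge the attributes held for one product across the silos.
    Later sources win on conflicts (a placeholder for real survivorship rules)."""
    golden: Dict = {"product_id": product_id}
    for source in sources:
        golden.update(source.get(product_id, {}))
    return golden

submission_record = consolidate("PRD-001", [regulatory_sys, clinical_sys, erp_sys])
print(submission_record)
```

Organisationally, the value of such a layer is that data owners keep maintaining their own systems while the integration layer assembles a single, consistent record for submission.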

So, back to the time objection: it seems a bit out of place if you follow the logic:

  • In order to comply with IDMP, you need to collect, cleanse, resolve and relate a diverse set of product data
  • A data integration layer is the best technical architecture
  • MDM is a proven fit for the data integration layer

So why would you not have time to implement MDM, if this is the best (and available) technical solution for the data collection and consolidation necessary to achieve compliance?

My feeling is the objection comes down to the definition of MDM, which is (correctly) seen as something more than technology.  Expert and analyst definitions variously include the words ‘discipline’, ‘method’, ‘process’ or ‘practice’.  Current consensus is that the underlying technology merely enables the processes which allow an organisation to collect and curate a single, trusted source of master data.  In this light, MDM implies the need for senior level sponsorship and organisational change.

The truth is that IDMP compliance needs senior level sponsorship and organisational change.  In all the discussions I have had, these points come out clearly.  Many pharma insiders who understand the challenge are grappling with how to get IDMP the executive attention it needs in order to achieve compliance.  July 1, 2016 is not only a deadline; it is the start of a new discipline in managing a broad range of pharma product data.  This new discipline will require organisational change in order to ensure that high quality data can be produced on a continuous basis.

So the definitions of MDM actually make the case for using MDM as part of the technology stack stronger.  MDM will not only provide technology for the data integration layer, but also a support structure for the new product data processes that will be required for sustained compliance.  Without new product data management processes as part of the IDMP submission process, there will be few guarantees around data quality or lineage.

I fear that many of those with the objection that they don’t have time for MDM are really saying they don’t have enough time to implement IDMP as a new discipline, process or practice.  This will expose them to the risk of non-compliance fines of up to 5% of revenue, and recurring fines of 2.5% of revenue.

In my mind, the challenge is not ‘do we have time to implement MDM?’, but rather ‘can we be successful both by and beyond July 2016, without implementing MDM?’  By MDM I am referring to the technology and the organisational aspects of creating and delivering complete and accurate data.


Building an Impactful Data Governance Program – One Step at a Time

Let’s face it, building a Data Governance program is no overnight task.  As one CDO puts it: “data governance is a marathon, not a sprint”.  Why? Because data governance is a complex business function that encompasses technology, people and process, all of which have to work together effectively to ensure the success of the initiative.  Because of the scope of the program, data governance often calls for participants from different business units within an organization, and it can be disruptive at first.

Why bother then, given that data governance is complex, disruptive and can introduce additional cost to a company?  Well, the drivers for data governance vary across organizations.  Let’s take a close look at some of the motivations behind a data governance program.

For companies in heavily regulated industries, establishing a formal data governance program is a mandate.  When a company is not compliant, consequences can be severe. Penalties could include hefty fines, brand damage, loss in revenue, and even potential jail time for the person held accountable for the noncompliance. To meet ongoing regulatory requirements and adhere to data security policies and standards, companies need to rely on clean, connected and trusted data that enables transparency and auditability in their reporting and answers critical questions from auditors.  Without a dedicated data governance program in place, the compliance initiative can become an ongoing nightmare for companies in regulated industries.

A data governance program can also be established to support a customer centricity initiative. To make effective cross-sells and up-sells to your customers and grow your business, you need clear visibility into customer purchasing behaviors across multiple shopping channels and touch points. Customers’ shopping behaviors and attributes are captured in the data; therefore, to gain a thorough understanding of your customers and boost your sales, a holistic data governance program is essential.

Other reasons for companies to start a data governance program include improving efficiency, reducing operational cost, supporting better analytics and driving more innovation. As long as the area is business critical, data is at the core of the process and the business case is sound, there is a compelling reason to launch a data governance program.

Now that we have identified the drivers for data governance, how do we start?  This rather loaded question really gets into the details of the implementation. A few critical elements come into consideration, including: identifying and establishing the various task forces such as a steering committee, a data governance team and business sponsors; identifying roles and responsibilities for the stakeholders involved in the program; and defining metrics for tracking results.  And soon you will find that, on top of everything, communications, communications and more communications is probably the most important tactic of all for driving the initial success of the program.

A rule of thumb?  Start small, take one step at a time and focus on producing something tangible.

Sounds easy, right? Well, let’s hear what real-world practitioners have to say. Join us at this Informatica webinar to hear Michael Wodzinski, Director of Information Architecture, Lisa Bemis, Director of Master Data, and Fabian Torres, Director of Project Management from Houghton Mifflin Harcourt, a global leader in publishing, as well as David Lyle, VP of Product Strategy at Informatica, discuss how to implement a successful data governance practice that brings business impact to an enterprise organization.

If you are currently kicking the tires on setting up a data governance practice in your organization, I’d like to invite you to visit a member-only website dedicated to data governance: http://governyourdata.com/. This site currently has over 1,000 members and is designed to foster open communication on everything data governance. There you will find conversations on best practices, methodologies, frameworks, tools and metrics.  I would also encourage you to take the data governance maturity assessment to see where you currently stand on the data governance maturity curve, and compare the result against industry benchmarks.  More than 200 members have taken the assessment to gain a better understanding of their current data governance program, so why not give it a shot?


Data governance is a journey, likely a never-ending one.  We wish you the best of luck on this effort and a joyful ride! We would love to hear your stories.


Asia-Pacific Ecommerce surpassed Europe and North America

With a total B2C e-commerce turnover of $567.3bn, Asia-Pacific was the strongest e-commerce region in the world in 2013, surpassing Europe ($482.3bn) and North America ($452.4bn). Online sales in Asia-Pacific are expected to have reached $799.2 billion in 2014, according to the latest report from the Ecommerce Foundation.

Revenue: China, followed by Japan and Australia
As a matter of fact, China was the second-largest e-commerce market in the world, only behind the US ($419.0 billion), and for 2014 it is estimated that China even surpassed the US ($537.0 billion vs. $456.0 billion). In terms of B2C e-commerce turnover, Japan ($136.7 billion) ranked second, followed by Australia ($35.7 billion), South Korea ($20.2 billion) and India ($10.7 billion).

On average, Asian-Pacific e-shoppers spent $1,268 online in 2013
Ecommerce Europe’s research reveals that 235.7 million consumers in Asia-Pacific purchased goods and services online in 2013. On average, APAC online consumers each spent $1,268 online in 2013, slightly lower than the global average of $1,304. At $2,167, Australian e-shoppers were the biggest spenders online, followed by the Japanese ($1,808) and the Chinese ($1,087).

Mobile: Japan and Australia lead the pack
In the frequency of mobile purchasing, Japan shows the highest adoption, followed by Australia. An interesting fact is that 50% of transactions are done at home, 20% at work and 10% on the go.


You can download the full report here. What does this mean for your business opportunity? Read more on the 2015 omnichannel trends, which are driving customer experience. Happy to discuss @benrund.


How Insurance Companies Benefit from Information Management


A single trusted source of master data fuels applications with clean, consistent and connected customer, policy and claims information

Insurance is a highly competitive business that hinges on providing the right products and industry-leading service to customers. Accurate data is the lifeblood of this business. To overcome their struggle with fragmented data across product lines, functions and channels, a leading US-based insurance company has built a world-class information management practice to enable the company to quickly collect and analyze data, whether it is financial, claims, policy or customer data.

“Our advances in technology and distribution channel diversification, along with increased brand awareness and innovative products and services, are moving us closer to our goal of becoming a top-five personal lines carrier,” said the president of the insurance company.


Delivering real-time access to the Total Customer Relationship across channels, functions and product lines

The insurance company’s data integration and data management initiative included:

  • an Enterprise Data Warehouse (EDW) from Teradata,
  • reporting from MicroStrategy, and
  • data integration and master data management (MDM) technology from Informatica to better manage customer and policy data.

This provides the information infrastructure required to propel the insurance company’s personal insurance business into the top tier of personal insurers in the country.

Business analysts in claims, marketing, policy and finance as well as agents in the field, sales people and claims adjusters now have access to clean, consistent and connected customer information for analytics, performance measurement and decision-making.

Within their business applications, the Information Management team has delivered real-time access to the total customer relationship across product lines, channels and functions.

How did they do it?

The team identified the data sources that contain valuable customer information. They access that customer information using data integration and bring it into a central location, the Informatica MDM hub, where it is managed, mastered and shared with downstream systems on an ongoing basis, again using data integration. The company now has a “golden record” for each customer, policy and claim, and the information is continuously updated.
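
As an illustration of the match-and-merge step described above, the sketch below consolidates duplicate customer rows from two source systems into a single golden record. The deterministic match key and the "newest value wins" survivorship rule are simplified placeholders for illustration; they are not Informatica MDM's actual matching or survivorship algorithms.

```python
from typing import Dict, List

def match_key(record: Dict) -> str:
    """Very simple deterministic match rule: last name + first initial + date of birth."""
    return "|".join([record["last_name"].strip().lower(),
                     record["first_name"].strip().lower()[:1],
                     record["dob"]])

def build_golden_records(source_systems: List[List[Dict]]) -> Dict[str, Dict]:
    """Group records that share a match key and merge their attributes.
    Naive survivorship: on conflicts the most recently updated record wins;
    attributes that only exist in older records are still kept."""
    golden: Dict[str, Dict] = {}
    for system in source_systems:
        for rec in system:
            merged = golden.setdefault(match_key(rec), {})
            if rec.get("updated", "") >= merged.get("updated", ""):
                merged.update(rec)                     # newer record overwrites
            else:
                for attr, value in rec.items():
                    merged.setdefault(attr, value)     # keep older values only if missing
    return golden

# The same person held in a policy administration system and a claims system
policy_admin = [{"first_name": "Ann", "last_name": "Smith", "dob": "1980-04-02",
                 "policy_id": "P-123", "updated": "2014-11-01"}]
claims_sys = [{"first_name": "Ann", "last_name": "SMITH", "dob": "1980-04-02",
               "claim_id": "C-987", "updated": "2014-12-15"}]

golden_customers = build_golden_records([policy_admin, claims_sys])
# One golden record now carries both the policy and the claim reference
```

Downstream systems, such as the customer website mentioned below, can then query that single record instead of stitching together fragments from every application.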

Who benefited?

  1. Claims: The customer information management initiative was instrumental in the successful implementation of a new system to streamline and optimize the company’s claims process. The data integration/data management team leveraged a strong relationship with the claims team to delve into business needs, design the system, and build it from the ground up. Better knowledge of the total customer relationship is accelerating the claims process, leading to increased customer satisfaction and employee performance.
  2. Sales & Customer Service: Now when a customer logs into the company’s website, the system makes a call to the Informatica MDM hub and immediately finds out every policy, every claim, and all other relevant data on the customer and displays it on the screen. Better knowledge of the total customer relationship is leading to better and more relevant insurance products and services, higher customer satisfaction and better sales, marketing and agent performance.

Please join us on Thursday, February 5th at 11am PT for a webinar. You will learn how OneAmerica®, which offers life insurance, retirement services and employee benefits, is shifting from a policy-centric to a customer-centric operation. Nicolas Lance, Vice President of Retirement Income Strategies and Head of Customer Data will explain how this shift will enable the company to improve customer experience and marketing, distribution partner and call and service center effectiveness.

Register here: http://infa.media/1xWlHop


Are You Ready to Compete on Customer Experience?

 

This blog post initially appeared on CMSwire.com and is reblogged here with their consent.


Friends of mine were remodeling their master bath. After searching for a claw foot tub in stores and online, they found the perfect one that fit their space. Because it was only available for purchase on the retailer’s e-commerce site, they bought it online.

When it arrived, the tub was too big. The dimensions online were incorrect. They went to return it to the closest store, but were told they couldn’t — because it was purchased online, they had to ship it back.

The retailer didn’t have a total customer relationship view or a single view of product information or inventory across channels and touch points. This left the customer representative working with a system that was a silo of limited information. She didn’t have access to a rich customer profile. She didn’t know that my friends had spent almost $10,000 with the brand in the last year. She couldn’t see the products they bought online and in stores. Without this information, she couldn’t deliver a great customer experience.

It was a terrible customer experience. My friends share it with everyone who asks about their remodel. They name the retailer when they tell the story. And, they don’t shop there anymore. This terrible customer experience is negatively impacting the retailer’s revenue and brand reputation.

Bad customer experiences happen a lot. Companies in the US lose an estimated $83 billion each year due to defections and abandoned purchases as a direct result of a poor experience, according to a Datamonitor/Ovum report.

Customer Experience is the New Marketing

Gartner believes that by 2016, companies will compete primarily on the customer experiences they deliver. So who should own customer experience?

Twenty-five percent of CMOs say that their CEOs expect them to lead customer experience. What’s their definition of customer experience? “The practice of centralizing customer data in an effort to provide customers with the best possible interactions with every part of the company, from marketing to sales and even finance.”

Mercedes Benz USA President and CEO, Steve Cannon said, “Customer experience is the new marketing.”

The Gap Between Customer Expectations + Your Ability to Deliver

My previous post, 3 Barriers to Delivering Omnichannel Experiences, explained how omnichannel is all about seeing your business through the eyes of your customer. Customers don’t think in terms of channels and touch points; they just expect a seamless, integrated and consistent customer experience. It’s one brand to the customer. But there’s a gap between customer expectations and what most businesses can deliver today.

Most companies who sell through multiple channels operate in silos. They are channel-centric rather than customer-centric. This business model doesn’t empower employees to deliver seamless, integrated and consistent customer experiences across channels and touch points. Different leaders manage each channel and are held accountable to their own P&L. In most cases, there’s no incentive for leaders to collaborate.

Old Navy’s CMO, Ivan Wicksteed, got it right when he said:

“Seventy percent of searches for Old Navy are on a mobile device. Consumers look at the product online and often want to touch it in the store. The end goal is not to get them to buy in the store. The end goal is to get them to buy.”

The end goal is what incentives should be based on.

Executives at most organizations I’ve spoken with admit they are at the very beginning stages of their journey to becoming omnichannel retailers. They recognize that empowering employees with a total customer relationship view and a single view of product information and inventory across channels are critical success factors.

Becoming an omnichannel business is not an easy transition. It forces executives to rethink their definition of customer-centricity and whether their business model supports it. “Now that we need to deliver seamless, integrated and consistent customer experiences across channels and touch points, we realized we’re not as customer-centric as we thought we were,” admitted an SVP of marketing at a financial services company.

You Have to Transform Your Business

“We’re going through a transformation to empower our employees to deliver great customer experiences at every stage of the customer journey,” said Chris Brogan, SVP of Strategy and Analytics at Hyatt Hotels & Resorts. “Our competitive differentiation comes from knowing our customers better than our competitors. We manage our customer data like a strategic asset so we can use that information to serve customers better and build loyalty for our brand.”

Hyatt uses data integration, data quality and master data management (MDM) technology to connect the numerous applications that contain fragmented customer data including sales, marketing, e-commerce, customer service and finance. It brings the core customer profiles together into a single, trusted location, where they are continually managed. Now its customer profiles are clean, de-duplicated, enriched and validated. Members of a household as well as the connections between corporate hierarchies are now visible. Business and analytics applications are fueled with this clean, consistent and connected information so customer-facing teams can do their jobs more effectively.

When he first joined Hyatt, Brogan did a search for his name in the central customer database and found 13 different versions of himself. This included the single Chris Brogan who lived across the street from Wrigley Field with his buddies in his 20s and the Chris Brogan who lives in the suburbs with his wife and two children. “I can guarantee those two guys want something very different from a hotel stay,” he joked. Those guest profiles have now been successfully consolidated.

According to Brogan,

“Successful marketing, sales and customer experience initiatives need to be built on a solid customer data foundation. It’s much harder to execute effectively and continually improve if your customer data is a mess.”

Improving How You Manage, Use and Analyze Data is More Important Than Ever

Some companies lack a single view of product information across channels and touch points. About 60 percent of retail managers believe that shoppers are better connected to product information than in-store associates. That’s a problem. The same challenges exist for product information as customer information. How many different systems contain valuable product information?

Harrods overcame this challenge. The retailer has a strategic initiative to transform from a single iconic store to an omnichannel business. In the past, Harrods’ merchants managed information for about 500,000 products for the store point of sale system and a few catalogs. Now they are using product information management technology (PIM) to effectively manage and merchandise 1.7 million products in the store and online.

Because they are managing product information centrally, they can fuel the ERP system and e-commerce platform with full, searchable multimedia product information. Harrods has also reduced the time it takes to introduce new products and generate revenue from them. In less than one hour, buyers complete the process from sourcing to market readiness.

It Ends with Satisfied Customers

By 2016, you will need to be ready to compete primarily on the customer experiences you deliver across channels and touch points. This means really knowing who your customers are so you can serve them better. Many businesses will transform from a channel-centric business model to a truly customer-centric business model. They will no longer tolerate messy data. They will recognize the importance of arming marketing, sales, e-commerce and customer service teams with the clean, consistent and connected customer, product and inventory information they need to deliver seamless, integrated and consistent experiences across touch points. And all of us will be more satisfied customers.


8 Information Quality Predictions for 2015 And Beyond


Andy Hayler of Information Difference wrote in October last year that it’s been 10 years since the master data management (MDM) industry emerged. Andy sees MDM technology maturing and project success rates rising. He concluded that MDM has moved past its infancy and has a promising future as it is approaching its teenage years.

The last few months have allowed me to see MDM, data quality and data governance from a completely different perspective. I sat with other leaders here at Informatica, analysts who focus on information quality, and spent time talking to our partners who work closely with customers on data management initiatives. As we collectively attempted to peer into the crystal ball and forecast what will be hot – and what will not – this year and beyond for MDM and data quality, here are a few top predictions that stood out.

1. MDM will become a single platform for all master entities
“The classical notion of boundaries that existed where we would say, this is MDM versus this is not MDM is going to get blurred,” says Dennis Moore – SVP, Information Quality Solutions (IQS), Informatica. “Today, we master a fairly small number of attributes in MDM. Rather than only mastering core attributes, we need to master business level entities, like customer, product, location, assets, things, etc., and combine all relevant attributes into a single platform which can be used to develop new “data fueled” applications. This platform will allow mastering of data, aggregate data from other sources, and also syndicate that data out into other systems.”

Traditionally MDM was an invisible hub that was connected to all the spokes. Instead, Dennis says – “MDM will become more visible and will act as an application development platform.”

2. PIM is becoming a more integrated environment that covers all information about products and related data in a single place
More and more customers want a single interface that allows them to manage all product information. Along with managing a product’s length, width, height, color, cost and so on, they probably want to see data about the history, credit rating, previous quality rating, sustainability scorecard, returns, credits and more. Dennis says – “All the product information in one place helps make better decisions with embedded analytics, giving answers to questions such as:

  • What were my sales last week?
  • Which promotions are performing well and poorly?
  • Which suppliers are not delivering on their SLAs?
  • Which stores aren’t selling according to plan?
  • How are the products performing in specific markets?”

Essentially, PIM will become the sovereign supplier of the product data that goes into your catalog and e-commerce system and that will be used by merchandisers, buyers, and product and category managers. It will become the buyer’s guide and a desktop for the person whose job is to figure out how to effectively promote products to meet sales targets.

3. MDM will become an integral part of big data analytics projects
“Big data analytics suffers from the same challenges as traditional data warehouses – bad data quality produces sub-optimal intelligence. MDM has traditionally enabled better analysis and reporting with high quality master data. Big data analytics will also immensely benefit from MDM’s most trustworthy information,” said Ravi Shankar, VP of Product Marketing, MDM, Informatica.

Naveen Sharma, who heads the Enterprise Data Management practice at Cognizant, reemphasized what I heard from Dennis. He says – “With big data and information quality coming together, some of the boundaries between a pure MDM system and a pure analytical system will start to soften”. Naveen explains – “MDM is now seen as an integral part of big data analytics projects and it’s a huge change from a couple of years ago. Two of the large retailers we work with are going down the path of trying to bring not only the customer dimension but the associated transactional data to derive meaning into an extended MDM platform. I see this trend continuing in 2015 and beyond with other verticals as well.”

4. Business requirements are leading to the creation of solutions
There are several business problems being solved by MDM, such as improving supplier spend management and collaboration with better supplier data. Supply chain, sourcing and procurement teams gain significant cost savings and a boost in productivity by mastering supplier, raw materials and product information and fueling their business and analytical applications with that clean, consistent and connected information. Jakki Geiger, Senior Director of IQS Solutions Marketing at Informatica says, “Business users want more than just the underlying infrastructure to manage business-critical data about suppliers, raw materials, and products. They want to access this information directly through a business-friendly user interface. They want a business process-driven workflow to manage the full supplier lifecycle, including: supplier registration, qualification, verification, onboarding and off-boarding. Instead of IT building these business-user focused solutions on top of an MDM foundation, vendors are starting to build ready-to-use MDM solutions like the Total Supplier Relationship solution.” Read more about Valspar’s raw materials spend management use case.

5. Increased adoption of matching and linking capabilities on Hadoop 
“Many of our customers have significantly increased the amount of data they want to master,” says Dennis Moore. The days when tens of millions of master records were a lot are long gone; having hundreds of millions of master records and billions of source records is becoming almost common. An increasing number of master data sources – internal and external to the organization – are contributing significantly to the rise in data volumes. To accommodate these increasing volumes, Dennis predicts that large enterprises will look at running complex matching and linking capabilities on Hadoop – a cost-effective and flexible way to analyze large amounts of data.
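
As a rough sketch of what matching and linking at this scale can look like, the PySpark example below blocks candidate records on a phonetic key and scores the surviving pairs with an edit-distance rule. The column names, blocking key and threshold are illustrative assumptions, not a description of Informatica's (or any vendor's) match engine.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("matching-sketch").getOrCreate()

# Hypothetical source records to be matched against each other
records = spark.createDataFrame(
    [("1", "Jon", "Smith"), ("2", "John", "Smith"), ("3", "Mary", "Jones")],
    ["record_id", "first_name", "last_name"],
)

# Blocking: only compare records that share a phonetic key on the last name,
# which keeps the number of pairwise comparisons manageable at large volumes.
blocked = records.withColumn("block_key", F.soundex("last_name"))

pairs = (blocked.alias("a")
         .join(blocked.alias("b"), on="block_key")
         .where(F.col("a.record_id") < F.col("b.record_id")))

# Scoring: a single edit-distance rule; a real match engine combines many
# weighted deterministic and probabilistic rules.
matches = (pairs
           .withColumn("name_distance",
                       F.levenshtein(F.col("a.first_name"), F.col("b.first_name")))
           .where(F.col("name_distance") <= 2))

matches.select("a.record_id", "b.record_id", "name_distance").show()
```

The blocking step is what makes this practical on a Hadoop cluster: the expensive pairwise scoring only runs within each block rather than across the full cross-product of records.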

6. Master insight management is going to be the next big step
“MDM will evolve into master insight management as organizations try to relate trusted data they created in MDM with transactional and social interaction data,” said Rob Karel – VP of Product Strategy and Product Marketing, IQS, Informatica. “The innovations in machine and deep learning techniques will help organizations such as healthcare prescribe next best treatment based on history of patients, retailers suggest best offers based on customer interest and behavior, public sector companies will see big steps in social services, etc.”

Rob sees MDM at the heart of this innovation bringing together relevant information about multiple master entities and acting as a core system for insight management innovations.

7. MDM and Data Governance
Aaron Zornes, chief research officer at the MDM Institute, predicts that in 2014-15, vendor MDM solutions will move from “passive-aggressive” mode to “proactive” data governance mode. According to Aaron, data governance for MDM will move beyond simple stewardship to a convergence of task management, workflow, policy management and enforcement.

8. The market will solidify for cloud-based MDM adoption
Aaron says – “Cloud-innate services for DQ and DG will be more prevalent; however, enterprise MDM will remain on premise with increasing integration to cloud applications in 2015.”

Naveen sees a lot of synergy around cloud-based MDM offerings and says – “The market is solidifying for MDM on cloud but the flood gates are yet to open”.  Naveen does not see any reason why the MDM market will not go to the cloud and gives the example of CRM, which was at a similar juncture before Salesforce came into play. Naveen sees a similar shift for MDM and says – “The fears companies have about their data security on cloud are eventually going to fade. If you look closely at any of the recent breaches, these all involved hacks into company networks and not into cloud provider networks. The fact that cloud service providers spend more dollars on data security than any one company can spend on their on-premise security layer will be a major factor affecting the transition”. Naveen sees the big players in MDM including cloud offerings as part of their toolkit in the coming years.

Ravi also predicts an increase in cloud adoption for MDM in the future, as concerns about placing master data in the cloud lessen given the security provided by cloud vendors.

So, what do you predict? I would love to hear your opinions and comments.

~Prash
@MDMGeek
www.mdmgeek.com


PIM is not Product MDM, Part 2

Part 1 of this blog touched on the differences between PIM and Product MDM.  Since both play a role in ensuring the availability of high quality product data, it is easy to see the temptation to extend the scope of either product to play a more complete part.  However, there are risks involved in customising software.  PIM and MDM are not exceptions, and any customisations will carry some risk.

In the specific case of looking to extend the role of PIM, the problems start if you just look at the data and think:  “oh, this is just a few more product attributes to add”.  This will not give you a clear picture of the effort or risk associated with customisations.  A complete picture requires looking beyond the attributes as data fields, and considering them in context:  which processes and people (roles) are supported by these attributes?

Recently we were asked to assess the risk of PIM customisation for a customer.  The situation was that data to be included in PIM was housed in three separate, home-grown and aging legacy systems.  One school of thought was to move all the data, and the associated management tasks, into PIM and retire the three systems.  That is, extending the role of PIM beyond a marketing application and into a Product MDM role.  In this case, we found three main risks of customising PIM for this purpose.  Here they are in more detail:

1. Decrease speed of PIM deployment

  • Inclusion of the functionality (not just the data) will require customisations in PIM, not just additional attributes in the data model.
    • Logic customisations are required for data validity checks, and some value calculations.
    • Additional screens, workflows, integrations and UI customisations will be required for non-marketing roles
    • PIM will become the source for some data, which is used in critical operational systems (e.g. SAP).  Reference checks & data validation cannot be taken lightly due to risks of poor data elsewhere.
  • Bottom line:  A non-standard deployment will drive up implementation cost, time and risk.

2.  Reduce marketing agility

  • In the case concerned, whilst the additional data was important to marketing, it was primarily supported by non-marketing users and processes, including Product Development, Sales and Manufacturing
  • These legacy systems are key to their workflows for creating and distributing technical details of new products to other systems, e.g. SAP for production
  • If the systems are retired and replaced with PIM, these non-marketing users will need to be equal partners in PIM:
    • Require access and customised roles
    • Influence over configuration
    • Equal vote in feature/function prioritisation
  • Bottom Line:  Marketing will no longer completely own the PIM system, and may have to sacrifice new functionality to prioritise supporting other roles.

3.  Risk of marketing abandoning the hybrid tool in the mid-term

  • An investment in PIM is usually an investment by Marketing to help them rapidly adapt to a dynamic external market.
    • System agility (point 2) is key to rapid adaptation, as is the ability to take advantage of new features within any packaged application.


  • As more customisations are made, the cost of upgrades can become prohibitive, driven by the cost to upgrade customisations.
    • Cost often driven by consulting fees to change what could be poorly documented code.
    • Risk of falling behind on upgrades, and hence sacrificing access to the newest PIM functionality
  • If upgrades are more expensive than new tools, PIM will be abandoned by Marketing, and they will invest in a new tool.
  • Bottom line:  In a worst case scenario, a customised PIM solution could be left supporting non-marketing functionality with Marketing investing in a new tool.

The first response to the last bullet point is normally “no they wouldn’t”.  Unfortunately this is a pattern both I and some of my colleagues have seen in the area of marketing & eCommerce applications.  The problem is that these areas are so fast moving, that nobody can afford to fall behind in terms of new functionality.  If upgrades are large projects which need lengthy approval and implementation cycles, marketing is unlikely to wait.  It is far easier to start again with a smaller budget under their direct control.  (Which is where PIM should be in the first place.)

In summary:

  • Making PIM look and behave like Product MDM could have some undesirable consequences – both in the short term (current deployment) and in the longer term (application abandonment).
  • A choice between customising PIM and enhancing your landscape with Product MDM should not be made on data attributes alone.
  • Your business and data processes should guide you in terms of risk assessment for customisation of your PIM solution.

Bottom Line:  If the risks seem too large, then consider enhancing your IT landscape with Product MDM.  Trading PIM cost & risk for measurable business value delivered by MDM will make a very attractive business case.

 


How Citrix is Using Great Data to Build Fortune Teller-Like Marketing

Citrix: You may not realize you know them, but chances are pretty good that you do.  And chances are also good that we marketers can learn something about achieving fortune teller-like marketing from them!

Citrix is the company that brought you GoToMeeting and a whole host of other mobile workspace solutions that provide virtualization, networking and cloud services.  Their goal is to give their 100 million users in 260,000 organizations across the globe “new ways to work better with seamless and secure access to the apps, files and services they need on any device, wherever they go.”

Citrix is a company that has been imagining and innovating for over 25 years, and over that time, has seen a complete transformation in their market – virtual solutions and cloud services didn’t even exist when they were founded. Now it’s the backbone of their business.  Their corporate video proudly states that the only constant in this world is change, and that they strive to embrace the “yet to be discovered.”

Having worked with them quite a bit over the past few years, we have seen first-hand how Citrix has demonstrated their ability to embrace change.

The Problem:

Back in 2011, it became clear to Citrix that they had a data problem, and that they would have to make some changes to stay ahead in this hyper competitive market.  Sales & Marketing had identified data as their #1 concern – their data was incomplete, inaccurate, and duplicated in their CRM system.  And with so many different applications in the organization, it was quite difficult to know which application or data source had the most accurate and up-to-date information.  They realized they needed a single source of the truth – one system of reference where all of their global data management practices could be centralized and consistent.

The Solution:

The marketing team realized that they needed to take control of the solution to their data concerns, as their success truly depended upon it.  They brought together their IT department and their systems integration partner, Cognizant, to determine a course of action.  Together they forged an overall data governance strategy which would empower the marketing team to manage data centrally – to be responsible for their own success.

As a key element of that data governance / management strategy, they determined that they needed a Master Data Management (MDM) solution to serve as their Single Trusted Source of Customer & Prospect Data.  They did a great deal of research into industry best practices and technology solutions, and decided to select Informatica as their MDM partner. As you can see, Citrix’s environment is not unlike that of most marketing organizations.  The difference is that they are now able to capture and distribute better customer and prospect data to and from these systems to achieve even better results.  They are leveraging internal data sources and systems like CRM (Salesforce) and marketing automation (Marketo).  Their systems live all over the enterprise, both on premises and in the cloud.  And they leverage analytical tools to analyze and dashboard their results.

The Results:

Citrix strategized and implemented their Single Trusted Source of Customer & Prospect solution in a phased approach throughout 2013 and 2014, and we believe that what they’ve been able to accomplish during that short period of time has been nothing short of phenomenal.  Here are the highlights:

Citrix Achieved Tremendous Results

  • Used Informatica MDM to provide clean, consistent and connected channel partner, customer and prospect data and the relationships between them for use in operational applications (SFDC, BI Reporting and Predictive Analytics)
  • Recognized 20% increase in lead-to-opportunity conversion rates
  • Realized 20% increase in marketing team’s operational efficiency
  • Achieved 50% increase in quality of data at the point of entry, and a 50% reduction in the rate of junk and duplicate data for prospects, existing accounts and contacts
  • Delivered a better channel partner and customer experience by renewing all of a customer’s user licenses across product lines at one time and making it easy to identify whitespace opportunities to up-sell more user licenses

That is huge!  Can you imagine the impact on your own marketing organization of a 20% increase in lead-to-opportunity conversion?  Can you imagine the impact of spending 20% less time questioning and manually massaging data to get the information you need?  That’s game changing!

Because Citrix now has great data and great resulting insight, they have been able to take the next step and embark on new fortune teller-like marketing strategies.   As Citrix’s Dagmar Garcia discussed during a recent webinar, “We monitor implicit and explicit behavior of transactional leads and accounts, and then we leverage these insights and previous behaviors to offer net new offers and campaigns to our customers and prospects…  And it’s all based on the quality of data we have within our database.”

I encourage you to take a few minutes to listen to Dagmar discuss Citrix’s project on a recent webinar.  In the webinar, she dives deeper into the project, its scope and timeline, and what she means by “fortune telling abilities”.  Also, take a look at the customer story section of the Informatica.com website for the PDF case study.  And, if you’re in the mood to learn more, you can download a complimentary copy of the 2014 Gartner Magic Quadrant for MDM of Customer Data Solutions.

Hats off to you, Citrix, and we look forward to working with you to continue to change the game even more in the coming months and years!


PIM is Not Product MDM – Product MDM is not PIM

Working for Informatica has many advantages.  One of them is that I clearly understand the difference between Product Information Management (PIM) and Master Data Management (MDM) for product data[i].  Since I have this clear in my own mind, it is easy to forget that this may not be as obvious to others.  As frequently happens, it takes a customer to help us articulate why PIM is not the same as Product MDM.  Now that this is fresh in my mind again, I thought I would share why the two are different, and when you should consider each one, or both.

In a lengthy discussion with our customer, many points were raised, discussed and classified.  In the end, all arguments essentially came down to each technology’s primary purpose.  A different primary purpose means that typical capabilities of the two products are geared towards different audiences and use cases.

PIM is a business application that centralizes and streamlines the creation and enhancement of consistent, but localised product content across channels.  (Figure 1)


Figure 1:  PIM Product Data Creation Flow

Product MDM is an infrastructure component that consolidates the core global product data that should be consistent across multiple and diverse systems and business processes, but typically isn’t. (Figure 2)


Figure 2:  MDM Product Data Consolidation Hub

The choice between the two technologies really comes down to the current challenge you are trying to solve.  If you cannot get clean and consistent data out through all your sales channels fast enough, then a PIM solution is the correct choice for you.  However, if your organisation is making poor decisions and seeing bloated costs (e.g. procurement or inventory costs) due to poor internal product data, then MDM technology is the right choice.

But if it is so simple, why am I even writing this down?  Why are the lines blurring now?

Here is my 3-part theory:

  1. A focus on good quality product data is a relatively recent trend.  Different industries started by addressing different challenges.
    1. PIM has primarily been used in retail B2C environments and distributor B2B or B2C environments.  That is, organisations which are primarily focused around the sale of a product, rather than the design and production of the product.
    2. Product MDM has been used predominately by manufacturers of goods, looking to standardise and support global processes, reporting and analytics across departments.
  2. Now, manufacturers are increasingly looking to take control of their product information outside their organisation.
    1. This trend is most notable in Consumer Goods (CG) companies.
    2. Increasingly, consistent, appealing and high quality data in the consumer realm is making the difference between a shopper choosing your product vs. a competitor’s.
    3. CG companies must ensure all channels – their own and their retail partners’ – are fed with high quality product data.
  3. So PIM is now entering organisations which should already have a Product MDM tool.  If they don’t, confusion arises.
    1. When Marketing buys PIM (and it normally is Marketing), quite frankly this shows up the poor product data management upstream of marketing.
    2. It becomes quite tempting to try to jam as much product data into a PIM system as possible, going beyond the original scope of PIM.

The follow-on question is clear:  why can’t we just make a few changes and use PIM as our MDM technology, or MDM as our PIM solution?  It is very tempting.  Both data models can be extended to add extra fields.  In Informatica’s case, both are supported by a common, feature-rich workflow tool.  However, there are inherent risks in using PIM where Product MDM is needed or Product MDM where PIM is needed.

After discussions with our customer, we identified 3 risks of modifying PIM when it is really Product MDM functionality that is needed:

  • Decrease speed of PIM deployment
  • Reduce marketing agility
  • Risk of marketing abandoning the hybrid tool in the mid-term

The last turned out to be the least understood, but that doesn’t make it any less real.  Since each of these risks deserves more explanation, I will discuss them in Part 2 of this Blog. (Still to be published)

In summary, PIM and Product MDM are designed to play different roles in the quest for the availability of high quality product data both internally and externally.  There are risks and costs associated with modifying one to take on the role of the other.  In many cases there is place for both PIM and MDM, but you will still need to choose a starting point.  Each journey to high quality product data will be different, but the goal is still the same – to turn product data into business value.

I (or one of my colleagues in a city near you) will be happy to help you understand what the best starting point is for your organisation.


[i] In case you were wondering, this is not the benefit that I joined Informatica for.
