Category Archives: Master Data Management

Will Social Search Replace Search Engines?

More than 60% of shopping journeys start with Google. That is what I wrote in one of my white papers on product information and its impact on omni-channel purchasing decisions. But how long will that remain true? We have all learned that digitally influenced shopping can change dramatically, every day.

MarketingSherpa reports this week on which channels e-commerce companies are investing in. E-mail marketing, social media, SEO and paid search are listed as the top four investments.

Channel investments according to MarketingSherpa

Did you notice the phenomenon of social product search? We had a BBQ with friends last Saturday, and my friend Marco told me about his new digital radio, which he can use to play music from his mobile devices. That made me think about looking for a Bluetooth- or Wifi-ready, stylish gadget for my living room. It should not be too big, but loud enough and cool. Wifi or Bluetooth is important because I don't want any visible cables.

This is what I did next: I posted my question to Facebook, not to a search engine.
As you know, the always-connected customer is always online, on his "informed purchase journey". Within minutes I had a series of recommendations from friends and colleagues. Some posted links to products they recommend. And since some friends are known for having much more knowledge of consumer electronics than me, and both confirmed the same brand names…

My snapshot of a “Social Search”.

What does this mean for easy access to product information and omni-channel commerce?

Keep Us Posted

The Internet Retailer Conference & Exhibition (IRCE) is around the corner. See something you absolutely love? Let us know! Keep us posted by using @InformaticaCorp #IRCE2014

Posted in Master Data Management, PiM, Product Information Management, Real-Time, Retail

Creating a Differentiated Retail Customer Experience

Michael Porter

In my marketing classes, I like to share the work in Michael Porter's Competitive Strategy, including his three generic business strategies. We discuss, for example, the difference between an "efficiency strategy" (think Walmart) and an "effectiveness strategy" (think Target or, even better, a high-end service-oriented retailer). I always make sure students include the impact of customer service in their thinking on differentiation.

One of these high-end service-oriented retailers is using technology to increase its customer intimacy as well as its holistic customer knowledge. Driving this for them involves understanding when customers use their full-price and off-price purchase channels. I was so fascinated by their question that I decided to ask the font of all wisdom, my wife. She said that her choice of channel is based on my current salary or her projected length of use of an item. So if she is buying a jacket that she wants to use for years, she will go to the full-price channel, but for a dress or pair of shoes for one-time use, like a wedding, she will go to the lower-priced channel. Clearly, there is more than one answer to these questions. This retailer wants to understand the answers by customer segment.

To create an understanding of each customer segment, this retailer wants to create a "high fidelity" view of data coming from customers, markets, and transactional interactions. This means that they need two new business capabilities: first, a single integrated view of their customers across channels; second, the ability to see the cause and effect of customer channel selection decisions. Do customers spend more time at the full-price channel when, for example, sale offerings are going on?

To solve these problems, the retailer has implemented two technology approaches: master data management to bring together its disparate views of the customer, and big data for quick hypothesis testing of customer data from structured and unstructured sources. With master data management, they get a single view of the customer across differing IT systems. For separate customer-specific analysis, they have created operational and analytic views on top of the MDM system. And while they have an enterprise data warehouse and multiple analytical data marts, they have also created a Hadoop cluster to test hypotheses about cross-channel customer segments. They are using the single view of the customer, regardless of channel, together with transaction history to understand when customers use which channel and which marketing or other campaigns pulled the customer in. With this, they are creating inferred attributes for customer market segments.
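As a rough sketch of the pattern described above (not the retailer's actual implementation; all system names, IDs and data are invented), a mastered customer ID lets spend from different channel systems be grouped per customer, and an inferred segment attribute derived from the channel mix:

```python
# Hypothetical sketch: resolving channel-local customer IDs through an
# MDM-style crosswalk, then inferring a cross-channel segment attribute.
# All names and data below are invented for illustration.

# Crosswalk: (source system, local ID) -> master customer ID
crosswalk = {
    ("pos_full_price", "FP-001"): "CUST-1",
    ("pos_off_price", "OP-903"): "CUST-1",
    ("pos_off_price", "OP-417"): "CUST-2",
}

transactions = [
    {"system": "pos_full_price", "local_id": "FP-001", "amount": 250.0},
    {"system": "pos_off_price", "local_id": "OP-903", "amount": 40.0},
    {"system": "pos_off_price", "local_id": "OP-417", "amount": 60.0},
]

def channel_mix(transactions, crosswalk):
    """Group spend by mastered customer and by channel system."""
    mix = {}
    for t in transactions:
        master = crosswalk[(t["system"], t["local_id"])]
        mix.setdefault(master, {}).setdefault(t["system"], 0.0)
        mix[master][t["system"]] += t["amount"]
    return mix

def infer_segment(channels):
    """Inferred attribute: does this customer shop across channels?"""
    return "cross_channel" if len(channels) > 1 else "single_channel"

mix = channel_mix(transactions, crosswalk)
segments = {cust: infer_segment(ch) for cust, ch in mix.items()}
print(segments)  # {'CUST-1': 'cross_channel', 'CUST-2': 'single_channel'}
```

Without the crosswalk, "FP-001" and "OP-903" would look like two different shoppers; the mastered ID is what makes the cross-channel inference possible at all.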

Clearly, the smarter the retailer gets, the more differentiated its services can be to customers. At the same time, the data lets the retailer optimize marketing between channels. This is using data to create service differentiation.

Posted in Big Data, Master Data Management, Retail

Health Plans, Create Competitive Differentiation with Risk Adjustment

Exploring Risk Adjustment as a Source of Competitive Differentiation

Risk adjustment is a hot topic in healthcare. Today, I interviewed my colleague, Noreen Hurley, to learn more. Noreen, tell us about your experience with risk adjustment.

Before I joined Informatica I worked for a health plan in Boston. I managed several programs, including the CMS Five Star Quality Rating System and Risk Adjustment Redesign. We recognized the need for a robust diagnostic profile of our members in support of risk adjustment. However, because the information resides in multiple sources, gathering and connecting the data presented many challenges. I see the opportunity for health plans to transform risk adjustment.

As risk adjustment becomes an integral component in healthcare, I encourage health plans to create a core competency around the development of diagnostic profiles. This should be the case for health plans and ACOs. This profile is the source of reimbursement for an individual. It is also the basis for clinical care management. Augmented with social and demographic data, the profile can create a roadmap for successfully engaging each member.

Why is risk adjustment important?

Risk Adjustment is increasingly entrenched in the healthcare ecosystem.  Originating in Medicare Advantage, it is now applicable to other areas.  Risk adjustment is mission critical to protect financial viability and identify a clinical baseline for  members.

What are a few examples of the increasing importance of risk adjustment?

1) The Centers for Medicare & Medicaid Services (CMS) continue to increase the focus on risk adjustment. They are evaluating the value provided to the Federal government and beneficiaries. CMS has questioned the efficacy of home assessments and challenged health plans to provide a value statement beyond the harvesting of diagnosis codes that results solely in revenue enhancement. Illustrating additional value has been a challenge. Integrating data across the health plan will help address this challenge and derive value.

2) Marketplace members will also require risk adjustment calculations. After the first three years, the three "R's" will dwindle down to one "R". When Reinsurance and Risk Corridors end, we will be left with Risk Adjustment. To succeed with this new population, health plans need a clear strategy to obtain, analyze and process data. CMS processing delays make risk adjustment even more difficult. A health plan's ability to manage this information will be critical to success.

3) Dual Eligibles, Medicaid members and ACOs also rely on risk adjustment for profitability and improved quality.

With an enhanced diagnostic profile — one that is accurate, complete and shared — I believe it is possible to enhance care, deliver appropriate reimbursements and provide coordinated care.

How can payers better enable risk adjustment?

  • Facilitate timely analysis of accurate data from a variety of sources, in any  format.
  • Integrate and reconcile data from initial receipt through adjudication and  submission.
  • Deliver clean and normalized data to business users.
  • Provide an aggregated view of master data about members, providers and the relationships between them to reveal insights and enable a differentiated level of service.
  • Apply natural language processing to capture insights otherwise trapped in text based notes.

With clean, safe and connected data,  health plans can profile members and identify undocumented diagnoses. With this data, health plans will also be able to create reports identifying providers who would benefit from additional training and support (about coding accuracy and completeness).
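As a toy illustration of that profiling idea (the drug classes, diagnosis codes, rule and member data below are all invented, not a real risk adjustment model), a simple rule can flag members whose claims imply a condition missing from their documented diagnosis profile:

```python
# Hypothetical sketch: flagging members whose pharmacy claims suggest a
# diagnosis that is absent from their documented profile. All codes,
# rules and members are invented for illustration.

diagnosis_profile = {
    "M1": {"E11.9"},   # type 2 diabetes already documented
    "M2": set(),       # no chronic diagnoses documented
}

# Toy rule: certain drug classes imply an expected diagnosis code
drug_to_expected_dx = {"insulin": "E11.9", "statin": "E78.5"}

pharmacy_claims = [
    {"member": "M1", "drug_class": "insulin"},
    {"member": "M2", "drug_class": "insulin"},
]

def undocumented_diagnoses(claims, profile, rules):
    """Return (member, expected diagnosis) pairs worth clinical review."""
    flags = []
    for c in claims:
        expected = rules.get(c["drug_class"])
        if expected and expected not in profile.get(c["member"], set()):
            flags.append((c["member"], expected))
    return flags

print(undocumented_diagnoses(pharmacy_claims, diagnosis_profile, drug_to_expected_dx))
# [('M2', 'E11.9')]
```

In practice such flags feed a clinical review and provider-education workflow rather than being submitted directly; the point is that clean, connected claims and profile data make the gap visible at all.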

What will clean, safe and connected data allow?

  • Allow risk adjustment to become a core competency and source of differentiation.  Revenue impacts are expanding to lines of business representing larger and increasingly complex populations.
  • Educate, motivate and engage providers with accurate reporting.  Obtaining and acting on diagnostic data is best done when the member/patient is meeting with the caregiver.  Clear and trusted feedback to physicians will contribute to a strong partnership.
  • Improve patient care, reduce medical cost, increase quality ratings and engage members.
Posted in B2B, B2B Data Exchange, Business Impact / Benefits, Business/IT Collaboration, CIO, Customer Acquisition & Retention, Data Governance, Data Integration, Enterprise Data Management, Healthcare, Master Data Management, Operational Efficiency

Who Has the Heart to Adopt this Orphan Oil Well?

As I browsed my BBC app a few weeks ago, I ran into an article about environmental contamination from oil wells in the UK that were left to their own devices. The article explains that a lack of data and proper data management is causing major issues for gas and oil companies. In fact, researchers found no data for more than 2,000 inactive wells, many of which have been abandoned or "orphaned" (sealed and covered up). I started to scratch my head imagining what this problem looks like in places like Brazil, Nigeria, Malaysia, Angola and the Middle East, where regulatory oversight is, on average, a bit less rigorous.

Like Oliver, this well needs a home!

On top of that – please excuse my cynicism here – an "orphan" well is just as ridiculous a concept as a "dry" well. A hole without liquid inside is not a well but – you guessed it – a hole. Also, every well has a "parent", meaning:

  • The person or company who drilled it
  • A landowner who will get paid from its production and allowed the operation (otherwise it would be illegal)
  • A financier who fronted the equipment and research cost
  • A regulator, who is charged with overseeing the reservoir’s exploration

Let the "hydrocarbon family court judge" decide whose problem this orphan is, with well-founded information – no pun intended. After all, this "domestic disturbance" is typically just as well documented as any police "house call" when you hear screams from next door. Similarly, one would expect that when (exploratory) wells are abandoned and improperly capped or completed, there is a long track record of financial or operational troubles at the involved parties. Apparently I was wrong. Nobody seems to have a record of where the well actually was on the surface, let alone subsurface, to determine perforation risks in the well itself or from an actively managed bore nearby.

This reminds me of a meeting with an Asian NOC's PMU IT staff, who vigorously disagreed with every other department on the reality on the ground versus at group level. The PMU folks insisted on having all wells' key attributes nailed down:

  1. Knowing how many wells and bores they had across the globe and all types of commercial models including joint ventures
  2. Where they were and are today
  3. What their technical characteristics were and currently are

The other departments, from finance to strategy, clearly indicated that the 10,000 wells across the globe currently being "mastered" with (at least initially) cheap internal band-aid fixes carry a margin of error of up to 10%. So much for long-term TCO. After reading this BBC article, this internal disagreement made even more sense.
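To make "mastering" a well concrete, here is a toy sketch of the kind of record matching such an effort involves: comparing normalized name similarity plus surface location across two systems. The well names, coordinates and thresholds are all invented; a real MDM match engine applies far richer rules.

```python
# Hypothetical sketch: matching well records from a finance system and an
# operations system by fuzzy name similarity and surface coordinates.
# All data and thresholds are invented for illustration.
from difflib import SequenceMatcher

finance_wells = [
    {"name": "Eagle Ford 12-H", "lat": 28.70, "lon": -97.95},
    {"name": "Permian A-3", "lat": 31.90, "lon": -102.10},
]
ops_wells = [
    {"name": "EAGLEFORD 12H", "lat": 28.7001, "lon": -97.9502},
    {"name": "Delaware B-7", "lat": 31.40, "lon": -103.50},
]

def normalize(name):
    """Strip spacing/punctuation noise so name comparison is meaningful."""
    return name.lower().replace("-", "").replace(" ", "")

def is_match(a, b, name_threshold=0.7, max_offset=0.01):
    name_sim = SequenceMatcher(None, normalize(a["name"]), normalize(b["name"])).ratio()
    nearby = (abs(a["lat"] - b["lat"]) < max_offset
              and abs(a["lon"] - b["lon"]) < max_offset)
    return name_sim >= name_threshold and nearby

matches = [(a["name"], b["name"])
           for a in finance_wells for b in ops_wells if is_match(a, b)]
print(matches)  # [('Eagle Ford 12-H', 'EAGLEFORD 12H')]
```

Even this crude rule shows why the departments disagree: whether "Eagle Ford 12-H" and "EAGLEFORD 12H" are one well or two depends entirely on whose matching rules you trust.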

If this chasm does not make a case for proper mastering of key operational entities, like wells, I don't know what does. It also raises the question of how any operation with potentially very negative long-term effects can have no legally culpable party captured in some sort of, dare I say, master register. Isn't having a land register, building permits, wills, etc. the sign of the "rule of law" governing an advanced nation?

I rest my case, your honor. May the garden fairies forgive us for spoiling their perfectly manicured lawn. With more fracking and public scrutiny on the horizon, maybe regulators need to establish their own "trusted" well master file, rather than rely on oil firms' data dumps. After all, the next downhole location may be just a foot away from perforating one of these "orphans", setting your kitchen sink faucet on fire.

Do you think another push for local governments to establish "well registries", like they did ten years ago for national IDs, is in order?

Disclaimer: Recommendations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks.  While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.

Posted in Big Data, Business Impact / Benefits, Master Data Management

A Data-Driven Healthcare Culture is Foundational to Delivering Personalized Medicine in Healthcare

According to a recent article in the LA Times, healthcare costs in the United States far exceed costs in other countries. For example, heart bypass surgery costs an average of $75,345 in the U.S., compared to $15,742 in the Netherlands and $16,492 in Argentina. In the U.S., healthcare accounts for 18% of GDP, and that share is increasing.

Michelle Blackmer is a healthcare industry expert at Informatica

Michelle Blackmer is a healthcare industry expert at Informatica. In this interview, she explains why business as usual isn't good enough anymore. Healthcare organizations are rethinking how they do business in an effort to improve outcomes, reduce costs, and comply with regulatory pressures such as the Affordable Care Act (ACA). Michelle believes a data-driven healthcare culture is foundational to personalized medicine and discusses the importance of clean, safe and connected data in executing a successful transformation.

Q. How is the healthcare industry responding to the rising costs of healthcare?
In response to the rising costs of healthcare, regulatory pressures (i.e., the Affordable Care Act (ACA)), and the need to improve patient outcomes at lower cost, the U.S. healthcare industry is transforming from a volume-based to a value-based model. In this new model, healthcare organizations need to invest in delivering personalized medicine.

To appreciate the potential of personalized medicine, think about your own healthcare experience. It’s typically reactive. You get sick, you go to the doctor, the doctor issues a prescription and you wait a couple of days to see if that drug works. If it doesn’t, you call the doctor and she tries another drug. This process is tedious, painful and costly.

Now imagine if you had a chronic disease like depression or cancer. On average, any given prescription drug only works for half of those who take it. Among cancer patients, the rate of ineffectiveness jumps to 75 percent. Anti-depressants are effective in only 62 percent of those who take them.

Organizations like MD Anderson and UPMC aim to put an end to cancer. They are combining scientific research with access to clean, safe and connected data (data of all types, including genomic data). The insights revealed will empower personalized chemotherapies. Personalized medicine offers customized treatments based on patient history and best practices, and it will transform healthcare delivery. Click on the links to watch videos about their transformational work.

Q. What role does data play in enabling personalized medicine?
Data is foundational to value-based care and personalized medicine. Not just any data will do. It needs to be clean, safe and connected data. It needs to be delivered rapidly across hallways and across networks.

As an industry, healthcare is at a stage where meaningful electronic data is being generated. Now you need to ensure that the data is accessible and trustworthy so that it can be rapidly analyzed. As data is aggregated across the ecosystem and married with financial and genomic data, data quality issues become more obvious. It's vital that you can define the data issues so that people can spend their time analyzing the data to gain insights instead of wading through and manually resolving data quality issues.
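As a minimal sketch of surfacing such issues up front (the field names, source systems and records are invented), a quality profile can report missing and conflicting values before any analyst touches the data:

```python
# Hypothetical sketch: profiling records for one data quality pass when
# the same patient arrives from several systems. Fields, sources and
# records are invented for illustration.

records = [
    {"patient_id": "P1", "dob": "1980-04-02", "source": "EHR"},
    {"patient_id": "P1", "dob": "1980-04-20", "source": "billing"},  # conflict
    {"patient_id": "P2", "dob": None, "source": "EHR"},              # missing
]

def profile_quality(records):
    """Report missing and conflicting date-of-birth values per patient."""
    issues = []
    dobs_by_patient = {}
    for r in records:
        if r["dob"] is None:
            issues.append((r["patient_id"], "missing dob", r["source"]))
        else:
            dobs_by_patient.setdefault(r["patient_id"], set()).add(r["dob"])
    for pid, dobs in dobs_by_patient.items():
        if len(dobs) > 1:
            issues.append((pid, "conflicting dob", sorted(dobs)))
    return issues

for issue in profile_quality(records):
    print(issue)
```

Running a pass like this on every load is what turns "wading through data quality issues" into a defined, reviewable report.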

The ability to trust data will differentiate leaders from the followers. Leaders will advance personalized medicine because they rely on clean, safe and connected data to:

1)      Practice analytics as a core competency
2)      Define evidence, deliver best practice care and personalize medicine
3)      Engage patients and collaborate to foster strong, actionable relationships

Take a look at this healthcare eBook for more on this topic: Potential Unlocked: Transforming Healthcare by Putting Information to Work.

Q. What is holding healthcare organizations back from managing their healthcare data like other mission-critical assets?
When you say other mission-critical assets, I think of facilities, equipment, etc. Each of these assets has people and money assigned to manage and maintain them. The healthcare organizations I talk to who are highly invested in personalized medicine recognize that data is mission-critical. They are investing in the people, processes and technology needed to ensure data is clean, safe and connected. The technology includes data integration, data quality and master data management (MDM).

What’s holding other healthcare organizations back is that while they realize they need data governance, they wrongly believe they need to hire big teams of “data stewards” to be successful. In reality, you don’t need to hire a big team. Use the people you already have doing data governance. You may not have made this a formal part of their job description and they might not have data governance technologies yet, but they do have the skillset and they are already doing the work of a data steward.

So while a technology investment is required and you need people who can use the technology, start by formalizing the data stewardship work people are doing already as part of their current job. This way you have people who understand the data, taking an active role in the management of the data and they even get excited about it because their work is being recognized. IT takes on the role of enabling these people instead of having responsibility for all things data.

Q. Can you share examples of how immature information governance is a serious impediment to healthcare payers and providers?
Sure. Without information governance, data is not harmonized across sources, so it is hard to make sense of it. This isn't a problem when you are one business unit or one department, but when you want a comprehensive view, or a view that incorporates external sources of information, this approach falls apart.

For example, let's say the cardiology department in a healthcare organization implements a dashboard. The dashboard looks impressive. Then a group of physicians sees the dashboard, points out errors and asks where the information (i.e., diagnosis or attending physician) came from. If you can't answer these questions, trace the data back to its sources, or resolve data inconsistencies, the dashboard loses credibility. This is how analytics fail to gain adoption and fail to foster innovation.

Q. Can you share examples of what data-driven healthcare organizations are doing differently?
Certainly, while many are just getting started on their journey to becoming data-driven, I’m seeing some inspiring  examples, including:

  • Implementing data governance for healthcare analytics. The program and data is owned by the business and enabled by IT and supported by technology such as data integration, data quality and MDM.
  • Connecting information from across the entire healthcare ecosystem including 3rd party sources like payers, state agencies, and reference data like credit information from Equifax, firmographics from Dun & Bradstreet or NPI numbers from the national provider registry.
  • Establishing consistent data definitions and parameters
  • Thinking about the internet of things (IoT) and how to incorporate device data into analysis
  • Engaging patients through non-traditional channels including loyalty programs and social media; tracking this information in a customer relationship management (CRM) system
  • Fostering collaboration by understanding the relationships between patients, providers and the rest of the ecosystem
  • Analyzing data to understand what is working and what is not working so  that they can drive out unwanted variations in care

Q. What advice can you give healthcare provider and payer employees who want access to high quality healthcare data?
As with other organizational assets that deliver value—like buildings and equipment—data requires a foundational investment in people and systems to maximize return. In other words, institutions and individuals must start managing their mission-critical data with the same rigor they manage other mission-critical enterprise assets.

Q. Anything else you want to add?
Yes, I wanted to thank our 14 visionary customer executives at data-driven healthcare organizations such as MD Anderson, UPMC, Quest Diagnostics, Sutter Health, St. Joseph Health, Dallas Children’s Medical Center and Navinet for taking time out of their busy schedules to share their journeys toward becoming data-driven at Informatica World 2014.  In our next post, I’ll share some highlights about how they are using data, how they are ensuring it is clean, safe and connected and a few data management best practices. InformaticaWorld attendees will be able to download presentations starting today! If you missed InformaticaWorld 2014, stay tuned for our upcoming webinars featuring many of these examples.


Posted in Business Impact / Benefits, Business/IT Collaboration, Customers, Data Governance, Data Integration, Data Quality, Enterprise Data Management, Healthcare, Informatica World 2014, Master Data Management, Vertical

Business Beware! Corporate IT Is “Fixing” YOUR Data

It is troublesome to me to repeatedly get into conversations with IT managers who want to fix data "for the sake of fixing it". While this is presumably increasingly rare, my department's role means we probably see it more often than the average software vendor employee does. Given that, please excuse the inflammatory title of this post.

Nevertheless, once the deal is done, we find increasingly fewer of these instances – yet still enough, as the average implementation consultant or developer cares about this aspect even less. A few months ago, a petrochemical firm's G&G IT team lead told me that he does not believe data quality improvements can or should be measured. He also said, "if we need another application, we buy it. End of story." Good for software vendors, I thought, but in most organizations $1M here or there does not lie around leisurely, plus decision makers want to see the – dare I say it – ROI.

This is not what a business – IT relationship should feel like

However, IT and business leaders should take note that misalignment due to a lack, or disregard, of communication is a critical risk factor. If the business does not get what it needs and wants, and that differs from what corporate IT is envisioning and working on – and this is what I am talking about here – any IT investment becomes a risky proposition.

Let me illustrate this with 4 recent examples I ran into:

1. Potential for flawed prioritization

A retail customer's IT department apparently knew that fixing and enriching customer loyalty records across the enterprise was a good and financially rewarding idea. They only wanted to understand what the less risky functional implementation choices were. They indicated that if they wanted to learn the actual financial impact of "fixing" certain records or attributes, they would just have to look into their enterprise data warehouse. This is where the logic falls apart, as the warehouse would be just as unreliable as the "compromised" applications (POS, marketing, ERP) feeding it.

Even if they massaged the data before it hit the next EDW load, there is nothing inherently real-time about this, as all OLTP systems keep running on incorrect (no bidirectional linkage) and stale (since the last load) data.

I would question whether the business is now completely aligned with what IT is continuously correcting. After all, IT may go for the "easy or obvious" fixes via a weekly or monthly recurring data scrub exercise without truly knowing which fix delivers the biggest bang for the buck, or which other affected business use cases they may not even be aware of yet. Imagine the productivity impact of all the round-tripping and delay in reporting this creates. This example also reminds me of a telco client I encountered during my tenure at another tech firm, which fed its customer master from its EDW and has now found out that this pattern is doomed to fail due to data staleness and performance.

2. Fix IT issues and business benefits will trickle down

Client number two is a large North American construction company. An architect built a business case for fixing a variety of data buckets in the organization (CRM, Brand Management, Partner Onboarding, Mobility Services, Quotation & Requisitions, BI & EPM).

Grand vision documents existed and were linked to the case, stating how the data would get better (like a sick patient), but there was no mention of hard facts about how each of the use cases would deliver on this. After I gave him some serious counseling on what to look out for and how to flesh it out – radio silence. Someone got scared of the math, I guess.

3. Now that we bought it, where do we start

The third culprit was a large petrochemical firm, which apparently sat on some excess funds and thought (rightfully so) that it was a good idea to fix its well attributes. More power to them. However, the IT team is now in the dreadful position of having to justify to their boss, and ultimately the E&P division head, why they prioritized this effort so highly and spent the money. Well, they had their hearts in the right place but are a tad late. Still, I consider this better late than never.

4. A senior moment

The last example comes from a South American communications provider. They seemingly did everything right, given the results they have achieved to date. This goes to show that misalignment of IT and business does not necessarily wreak havoc – at least initially.

However, they are now in phase 3 of their rollout and reality has caught up with them. A senior moment or lapse in judgment, maybe? Whatever it was, once they fixed their CRM, network and billing application data, they had to start talking to the business and financial analysts as complaints and questions started to trickle in. Once again, better late than never.

So what is the takeaway from these stories? Why wait until phase 3? Why be forced to cram in some justification after the purchase? You pick which one works best for you to fix this age-old issue. But please heed Sohaib's words of wisdom, recently broadcast on CNN Money: "IT is a mature sector post bubble… now it needs to deliver the goods". And here is an action item for you – check out the new way for business users to prepare their own data (30 minutes into the video!). Agreed?

Posted in Business Impact / Benefits, Business/IT Collaboration, CIO, Customer Acquisition & Retention, Customer Services, Data Aggregation, Data Governance, Data Integration, Data Quality, Data Warehousing, Enterprise Data Management, Master Data Management

Data-Powered Insights Fueled by the “Internet of Master Data”

Master data management (MDM) has come a long way in the past decade or so. When I was supporting my company's customer master implementation back in 2001, my management was thrilled to simply have a customer master that brought a bit of order to the chaos of sharing customer data between our CRM and ERP applications and downstream into our marketing data warehouse.

Fast forward to 2014, and mastering customer data alone is often table stakes for leadership trying to transform their business from a product- or account-centric to a customer-centric organization.

Here at Informatica, we’ve seen over 75% of our MDM customers in the past year purchase for multidomain use cases – meaning the scope of their initiative often spans mastering data such as Customers, Suppliers and Products as part of a coordinated effort.  These organizations have built compelling business cases to demonstrate that mastering multiple domains – and the relationships among those domains – is necessary.  Only a true 360 degree view of relationships among any data can provide the necessary insights to deliver on the desired operational efficiencies, optimized customer experiences, and growth objectives for their companies.
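As a toy illustration of why those cross-domain relationships matter (the entities and edges below are invented), even a question as simple as "which customers are exposed to a given supplier's products?" requires mastered links across all three domains:

```python
# Hypothetical sketch: multidomain master data as a tiny relationship
# graph, so a query can traverse customer -> product -> supplier.
# All entities and relationships are invented for illustration.

supplies = {"PROD-1": "SUP-A", "PROD-2": "SUP-B"}   # product -> supplier
purchases = [                                        # customer -> product
    ("CUST-1", "PROD-1"),
    ("CUST-2", "PROD-2"),
    ("CUST-3", "PROD-1"),
]

def customers_of_supplier(supplier):
    """360-style query: which customers bought this supplier's products?"""
    return sorted({cust for cust, prod in purchases
                   if supplies[prod] == supplier})

print(customers_of_supplier("SUP-A"))  # ['CUST-1', 'CUST-3']
```

If customers, suppliers and products were mastered in three disconnected silos, no single system could answer this traversal – which is the essence of the multidomain business case.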

The progress we've all made in multidomain MDM is impressive, but it's just scratching the surface of what's possible. What happens when MDM meets Cloud, Social, the Internet of Things and other master data enrichment sources such as D&B and Acxiom? Dennis Moore, Informatica's GM and SVP for MDM, envisions that a new "Internet of Master Data" will be formed, one that includes a massive new set of sensor and social data that can be leveraged to infer and recommend a new class of relationship insights. For example, in addition to sentiment and relationships from social networks, location data from mobile devices and sensors can now inform customer – and product – behaviors that extend beyond direct transactions and interactions within your traditional business applications.

Those of you who have invested in building a foundation of clean, consistent and connected data have a huge advantage, as the value of MDM grows with the exponential growth of data. You are well positioned to take advantage of the deeper insights and potential innovations now possible by adding Cloud, Social and Machine data to optimize analytics and operations.

This week at Informatica World 2014 in Las Vegas, we kicked off with our fantastic MDM Day pre-conference event with over 500 attendees.  During the event, we shared some early insights into our MDM 10 release planned for later this year which integrates the Informatica Vibe engine and incorporates other elements of the just unveiled Informatica Intelligent Data Platform vision to make it easier for customers to gain a 360 degree view of their most critical business entities, including customers, suppliers, products and assets.

We continue to be inspired by our awesome MDM customers and partners, and we’re excited to see what they can do to harness the power of the Internet of Master Data!

Posted in Informatica World 2014, Master Data Management, Vibe | Tagged , , , , | Leave a comment

MDM Day Advice: Connect MDM to a Tangible Business Outcome or You Will Fail

“Start your master data management (MDM) journey knowing how it will deliver a tangible business outcome. Will it help your business generate revenue or cut costs? Focus on the business value you plan to deliver with MDM and revisit it often,” advised Michael Delgado, Information Management Director at Citrix, during his presentation at MDM Day, the InformaticaWorld 2014 pre-conference program. MDM Day focused on driving value from business-critical information and attracted 500 people.

A record 500 people attended MDM Day in Las Vegas

In Ravi Shankar’s recent MDM Day preview blog, Part 2: All MDM, All Day at Pre-Conference Day at InformaticaWorld, he highlights the amazing lineup of master data management (MDM) and product information management (PIM) customer speakers, Informatica experts, and our talented partner sponsors.

Here are my MDM Day fun facts and key takeaways:

  • Did you know that every 2 seconds an aircraft with GE engine technology is taking off somewhere in the world?

    Ginny Walker, Chief Enterprise Architect at GE Aviation

    GE Aviation’s Chief Enterprise Architect, Ginny Walker, presented “Operationalizing Critical Business Processes: GE Aviation’s MDM Story.” GE Aviation is a $22 billion company and a leading provider of jet engines, systems and services.  Ginny shared the company’s multi-year journey to improve installed-base asset data management. She explained how the combination of data, analytics, and connectivity yields productivity improvements such as reducing the annual fuel bill by up to 2% and cutting delays. The keys to GE Aviation’s analytical MDM success were: 1) tying MDM to business metrics, 2) starting with a narrow scope, and 3) investing in data stewards. Ginny believes that MDM is an enabler for the Industrial Internet and Big Data because it empowers companies to get insights from multiple sources of data.

  •  Did you know that EMC has made a $17 billion investment in acquisitions and is integrating more than 70 technology companies?
    Barbara Latulippe, Senior Director, Enterprise Information Management at EMC

    EMC’s Barbara Latulippe, aka “The Data Diva,” is the Senior Director of Enterprise Information Management (EIM). EMC is a $21.7 billion company that has grown through acquisition and has 60,000 employees worldwide. In her presentation, “Formula for Success: EMC MDM Best Practices,” Barbara warned that if you don’t have a data governance program in place, you’re going to have a hard time getting an MDM initiative off the ground. She stressed the importance of building a data governance council and involving the business as early as possible to agree on key definitions such as “customer.” Barbara and her team focused on the financial impact of higher-quality data to build a business case for operational MDM. She asked her business counterparts, “Imagine if you could onboard a customer in 3 minutes instead of 15?”

  • Did you know that Citrix is enabling the mobile workforce by uniting apps, data and services on any device over any network and cloud?

    Michael Delgado, Information Management Director at Citrix

    Citrix’s Information Management Director, Michael Delgado, presented “Citrix MDM Case Study: From Partner 360 to Customer 360.” Citrix is a $2.9 billion cloud software company that embarked on a multi-domain MDM and data governance journey for channel partner, hierarchy and customer data. Because 90% of the company’s product bookings are fulfilled by channel partners, Citrix started its MDM journey to better understand the total channel partner relationship, make it easier to do business with Citrix, and boost revenue. Once they were successful with partner data, they turned to customer data. They wanted to improve customer experience by understanding the total customer relationship across product lines and regions. Armed with this information, Citrix employees can engage customers in a single renewal process covering all their products. MDM also helps Citrix’s sales team with white-space analysis to identify opportunities to sell more user licenses in existing customer accounts.

  •  Did you know Quintiles helped develop or commercialize all of the top 5 best-selling drugs on the market?

    John Poonnen, Director Infosario Data Factory at Quintiles

    Quintiles’ Director of the Infosario Data Factory, John Poonnen, presented “Using Multi-domain MDM to Gain Information Insights: How Quintiles Efficiently Manages Complex Clinical Trials.” Quintiles is the world’s largest provider of biopharmaceutical development and commercial outsourcing services, with more than 27,000 employees. John explained how the company leverages a tailored, multi-domain MDM platform to gain a holistic view of business-critical entities such as investigators, research facilities, clinical studies, study sites and subjects to cut costs, improve quality and productivity, and meet regulatory and patient needs. “Although information needs to flow throughout the process – it tends to get stuck in different silos and must be manually manipulated to get meaningful insights,” said John. He believes master data is foundational — combining it with other data, capabilities and expertise makes it transformational.
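The white-space analysis mentioned in the Citrix story above can be sketched as a simple set difference: for each account, compare the products it already licenses against the full catalog and flag the gaps as cross-sell candidates. The product and account names below are purely illustrative, not real Citrix data.

```python
# Hypothetical white-space analysis: flag catalog products each
# account does not yet own, as cross-sell candidates.
catalog = {"XenApp", "XenDesktop", "ShareFile", "NetScaler"}

accounts = {
    "Acme Corp": {"XenApp", "ShareFile"},
    "Globex":    {"XenDesktop"},
}

def white_space(owned, catalog):
    """Return the catalog products the account has not licensed."""
    return catalog - owned

for name, owned in accounts.items():
    gaps = white_space(owned, catalog)
    print(f"{name}: cross-sell candidates -> {sorted(gaps)}")
```

In practice the "catalog" and "owned" sets would come from mastered product and customer-account data, which is exactly why a reliable 360-degree view matters before running this kind of analysis.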

While I couldn’t attend the PIM customer presentations below, I heard they were excellent. I look forward to watching the videos:

  • Crestline/Geiger: Dale Denham, CIO, presented “How Product Information in eCommerce Improved Geiger’s Ability to Promote and Sell Promotional Products.”
  • Murdoch’s Ranch and Home Supply: Director of Marketing, Kitch Walker presented, “Driving Omnichannel Customer Engagement – PIM Best Practices.”

I also had the opportunity to speak with some of our knowledgeable and experienced MDM Day partner sponsors. Go to Twitter and search for #MDM and #DataQuality to see their advice on what it takes to successfully kick off and implement an MDM program.

There are more thought-provoking MDM and PIM customer presentations taking place this week at InformaticaWorld 2014. To join or follow the conversation, use #INFA14 #MDM or #INFA14 #PIM.

Posted in Business Impact / Benefits, Business/IT Collaboration, CIO, CMO, Customer Acquisition & Retention, Customers, Data Governance, Data Integration, Data Quality, Enterprise Data Management, Informatica World 2014, Master Data Management, Partners, PiM, Product Information Management, Uncategorized | Tagged , , , , , , , , , , , , , , , , , , , , | 1 Comment

Comparative Costs and Uses for Data Integration Platforms – A Study from Bloor Research

For years, companies have wrestled with the promise of data integration platforms. Along the way, businesses have asked many questions, including:

  • Does Data Integration technology truly provide a clear path toward unified data?
  • Can businesses truly harness the potential of their information?
  • Can companies take powerful action as a result?

Recently, Bloor Research set out to evaluate how things were actually playing out on the ground. In particular, they wanted to determine which data integration projects were taking place, at what scale, and with what results. The resulting study, “Comparative Costs and Uses for Data Integration Platforms,” authored by Philip Howard, research director at Bloor, examined data integration tool suitability across a range of scenarios, including:

  1. Data migration and consolidation projects
  2. Master data management (MDM) and associated solutions
  3. Application-to-application integration
  4. Data warehousing and business intelligence implementations
  5. Syncing data with SaaS applications
  6. B2B data exchange

To draw its conclusions, Bloor examined 292 responses from a range of companies. Respondents used a variety of data integration approaches, from commercial data integration tools to “hand-coding.”

Informatica is pleased to be able to offer you a copy of this research for your review. The research covers areas like:

  • Suitability
  • Productivity
  • Reusability
  • Total Cost of Ownership (TCO)

We welcome you to download a copy of “Comparative Costs and Uses for Data Integration Platforms” today. We hope these findings offer you insights as you implement and evaluate your data integration projects and options.

Posted in Data Integration, Data Integration Platform, Data Migration, Master Data Management | Tagged , , , | Leave a comment