Category Archives: Business/IT Collaboration
After I graduated from business school, I started reading Fortune Magazine. I became a regular reader, I suspect, because each issue largely consists of a set of mini business cases. Over the years, I have even come to enjoy the witty remarks from the managing editor, Andy Serwer. This issue’s comments, however, were even more evocative than usual.
Connectivity is perhaps the biggest opportunity of our time
Andy wrote, “Connectivity is perhaps the biggest opportunity of our time. As technology makes the world smaller, it is clear that the countries and companies that connect the best—either in terms of, say, traditional infrastructure or through digital networks—are in the driver’s seat.” Andy sees differentiated connectivity as involving two elements: access and content. This is important to note because Andy believes the biggest winners going forward will be the best connectors to each.
Enterprises need to evaluate how they collect, refine, and make data useful
But how do enterprises establish world-class connectivity to content? I would argue that, whether you are talking about big data or small data, it comes from improving an enterprise’s ability to collect, refine, and create useful data. Recent CFO research stressed the importance of enterprise data-gathering capabilities. CFOs said that their enterprises need to “get data right,” even as they confirmed that their enterprises do in fact have a data problem. The CFOs said they worry about the integrity of data from the source forward. And once they manually create clean data, they worry about making that data useful to their enterprises. Why does data matter so much to the CFO? Because as CFOs become more strategic, they are trying to make sure their firms drive synergies across their businesses.
Businesses need to make sense of data and get it to business users faster
One CFO put it this way: “Data is potentially the only competitive advantage left.” Another said, “Our businesses need to make better decisions from data. We need to make sense of data faster.” At the same time, leading-edge thinkers like Geoffrey Moore have been suggesting that businesses need to move from “systems of record” to “systems of engagement.” This notion suggests the importance of providing more digestible apps, but also of recognizing that the most important apps for business users will provide relevant information for decision making. Put another way, data is clearly becoming the fuel for enterprise decision making.
“Data Fueled Apps” will provide a connectivity advantage
For this reason, “data fueled” apps will be increasingly important to the business. Decision makers these days want to practice “management by walking around,” to quote Tom Peters’ book “In Search of Excellence,” and this means having critical, fresh data at their fingertips for each and every meeting. Clearly, organizations that provide this type of data connectivity will establish the connectivity advantage that Serwer suggested in his editor’s comments. This applies to consumer-facing apps as well. Serwer also comments on the impact of Apple and Facebook. Most consumers today are far better informed before they make a purchase. The customer-facing apps that have led the way, Amazon for example, have provided the relevant information to inform consumers on their purchase journey.
Delivering “Data Fueled Apps” to the Enterprise
But how do you create the enterprise-wide connectivity to power “data fueled apps”? It is clear from the CFOs’ comments that work is needed here. That work involves creating data that is systematically clean, safe, and connected. Why does this data need to be clean? The CFOs we talked to said that when the data is not clean, they have to manually massage it as it moves from system to system. That is not the kind of system of engagement envisioned by Geoffrey Moore. These CFOs want to move to a world where they can access the numbers easily, quickly, and accurately.
Data also needs to be safe. This means that only people with the proper access should be able to see data, whether we are talking about transactional or analytical data. This may sound obvious, but very few organizations isolate and secure data as it moves from system to system. And lastly, data needs to be connected. Yet another CFO said, “The integration of the right systems to provide the right information needs to be done so we have the right information to manage and make decisions at the right time.” He continued, “We really care about technology integration and getting it less manual. It means that we can inspect the books halfway through the cycle. And getting less manual means we can close the books even faster. However, if systems don’t talk (connect) to one another, it is a big issue.”
Finally, whether we are discussing big data or small data, we need to make sure the data collected is relevant and easy to consume. What is needed here is a data intelligence layer that provides easy ways to locate useful data and that recommends or guides ways to improve it. This way, analysts and leaders can spend less time searching for or preparing data and more time analyzing it to connect the business dots. This can involve mapping data relationships across all applications and drawing inferences from data to drive real-time responses.
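To make the idea of a data intelligence layer concrete, here is a minimal sketch of the locating half of the job: a tiny catalog that maps which applications expose which fields, so an analyst can find useful data and candidate join keys without hunting system by system. All dataset and field names here are hypothetical, and a real intelligence layer would do far more (profiling, recommendations, inference).

```python
# A toy metadata catalog: maps fields to the datasets that carry them,
# so analysts can locate data instead of searching application by application.
from collections import defaultdict

class DataCatalog:
    def __init__(self):
        # field name -> list of datasets (applications/tables) containing it
        self._field_index = defaultdict(list)

    def register(self, dataset, fields):
        """Record which fields a dataset exposes."""
        for field in fields:
            self._field_index[field].append(dataset)

    def locate(self, field):
        """Return every dataset that carries the given field."""
        return self._field_index.get(field, [])

    def shared_fields(self):
        """Fields appearing in more than one system: candidate join keys."""
        return {f: ds for f, ds in self._field_index.items() if len(ds) > 1}

catalog = DataCatalog()
catalog.register("crm.accounts", ["customer_id", "name", "region"])
catalog.register("erp.orders", ["order_id", "customer_id", "amount"])

print(catalog.locate("customer_id"))  # held by both systems
print(catalog.shared_fields())        # customer_id connects CRM and ERP
```

In practice this mapping is what lets the layer "connect the business dots": once shared fields are known, relationships across applications can be traversed automatically rather than rediscovered for every report.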
So in this new connected world, we first need to set up a data infrastructure that continuously makes data clean, safe, and connected, regardless of use case. The infrastructure may not be what collects the data, but it is what defines the connectivity (in the shape of access and content). We also need to make sure this infrastructure is reusable, so that the time from concept to new data fueled app is minimized. And then, to drive informational meaning, we need to layer the intelligence on top. With this, we can deliver “data fueled apps” that give business users the access and content to drive better business differentiation and decision making!
Before I joined Informatica, I worked for a health plan in Boston, where I managed several programs including the CMS Five Star Quality Rating System and Risk Adjustment Redesign. We recognized the need for a robust diagnostic profile of our members in support of risk adjustment. However, because the information resides in multiple sources, gathering and connecting the data presented many challenges. I see an opportunity for health plans to transform risk adjustment.
As risk adjustment becomes an integral component of healthcare, I encourage health plans and ACOs to create a core competency around the development of diagnostic profiles. This profile is the source of reimbursement for an individual. It is also the basis for clinical care management. Augmented with social and demographic data, the profile can create a roadmap for successfully engaging each member.
Why is risk adjustment important?
Risk Adjustment is increasingly entrenched in the healthcare ecosystem. Originating in Medicare Advantage, it is now applicable to other areas. Risk adjustment is mission critical to protect financial viability and identify a clinical baseline for members.
What are a few examples of the increasing importance of risk adjustment?
1) The Centers for Medicare and Medicaid Services (CMS) continues to increase its focus on risk adjustment, evaluating the value provided to the Federal government and beneficiaries. CMS has questioned the efficacy of home assessments and challenged health plans to provide a value statement beyond the harvesting of diagnosis codes that results solely in revenue enhancement. Illustrating additional value has been a challenge. Integrating data across the health plan will help address this challenge and demonstrate value.
2) Marketplace members will also require risk adjustment calculations. After the first three years, the three “R’s” will dwindle down to one “R”: when Reinsurance and Risk Corridors end, we will be left with Risk Adjustment. To succeed with this new population, health plans need a clear strategy to obtain, analyze, and process data. CMS processing delays make risk adjustment even more difficult, so a health plan’s ability to manage this information will be critical to success.
3) Dual Eligibles, Medicaid members, and ACOs also rely on risk adjustment for profitability and improved quality.
With an enhanced diagnostic profile — one that is accurate, complete and shared — I believe it is possible to enhance care, deliver appropriate reimbursements and provide coordinated care.
How can payers better enable risk adjustment?
- Facilitate timely analysis of accurate data from a variety of sources, in any format.
- Integrate and reconcile data from initial receipt through adjudication and submission.
- Deliver clean and normalized data to business users.
- Provide an aggregated view of master data about members, providers and the relationships between them to reveal insights and enable a differentiated level of service.
- Apply natural language processing to capture insights otherwise trapped in text based notes.
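The last bullet, capturing insights trapped in text-based notes, can be illustrated with a toy example. Production systems use full clinical NLP engines; this regex stand-in merely shows the shape of the task: surfacing ICD-10-style diagnosis codes that appear in free-text notes but may never reach a structured claim. The sample note is invented.

```python
# Toy illustration: pull ICD-10-style diagnosis codes out of free-text notes.
# Real clinical NLP handles negation, abbreviations, and unstated diagnoses;
# this sketch only finds explicitly written codes.
import re

# ICD-10 codes: a letter (U is reserved), two digits, optional decimal part.
ICD10_PATTERN = re.compile(r"\b[A-TV-Z]\d{2}(?:\.\d{1,4})?\b")

def extract_diagnosis_codes(note):
    """Return the ICD-10-style codes mentioned in a free-text note."""
    return ICD10_PATTERN.findall(note)

note = ("Member seen for follow-up. Assessment: E11.9 type 2 diabetes, "
        "I10 essential hypertension. Continue current medications.")

print(extract_diagnosis_codes(note))  # ['E11.9', 'I10']
```

Codes recovered this way can then be reconciled against submitted claims to find undocumented diagnoses, which is exactly the gap the diagnostic profile is meant to close.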
With clean, safe and connected data, health plans can profile members and identify undocumented diagnoses. With this data, health plans will also be able to create reports identifying providers who would benefit from additional training and support (about coding accuracy and completeness).
What will clean, safe and connected data allow?
- Allow risk adjustment to become a core competency and source of differentiation. Revenue impacts are expanding to lines of business representing larger and increasingly complex populations.
- Educate, motivate and engage providers with accurate reporting. Obtaining and acting on diagnostic data is best done when the member/patient is meeting with the caregiver. Clear and trusted feedback to physicians will contribute to a strong partnership.
- Improve patient care, reduce medical cost, increase quality ratings and engage members.
According to a recent article in the LA Times, healthcare costs in the United States far exceed those in other countries. For example, heart bypass surgery costs an average of $75,345 in the U.S., compared to $15,742 in the Netherlands and $16,492 in Argentina. Healthcare accounts for 18% of U.S. GDP, and that share is increasing.
Michelle Blackmer is a healthcare industry expert at Informatica. In this interview, she explains why business as usual isn’t good enough anymore. Healthcare organizations are rethinking how they do business in an effort to improve outcomes, reduce costs, and comply with regulatory pressures such as the Affordable Care Act (ACA). Michelle believes a data-driven healthcare culture is foundational to personalized medicine and discusses the importance of clean, safe and connected data in executing a successful transformation.
Q. How is the healthcare industry responding to the rising costs of healthcare?
In response to rising healthcare costs, regulatory pressures (i.e., the Affordable Care Act (ACA)), and the need to improve patient outcomes at lower cost, the U.S. healthcare industry is transforming from a volume-based to a value-based model. In this new model, healthcare organizations need to invest in delivering personalized medicine.
To appreciate the potential of personalized medicine, think about your own healthcare experience. It’s typically reactive. You get sick, you go to the doctor, the doctor issues a prescription and you wait a couple of days to see if that drug works. If it doesn’t, you call the doctor and she tries another drug. This process is tedious, painful and costly.
Now imagine if you had a chronic disease like depression or cancer. On average, any given prescription drug only works for half of those who take it. Among cancer patients, the rate of ineffectiveness jumps to 75 percent. Anti-depressants are effective in only 62 percent of those who take them.
Organizations like MD Anderson and UPMC aim to put an end to cancer. They are combining scientific research with access to clean, safe and connected data (data of all types including genomic data). The insights revealed will empower personalized chemotherapies. Personalized medicine offers customized treatments based on patient history and best practices. Personalized medicine will transform healthcare delivery. Click on the links to watch videos about their transformational work.
Q. What role does data play in enabling personalized medicine?
Data is foundational to value-based care and personalized medicine. Not just any data will do. It needs to be clean, safe and connected data. It needs to be delivered rapidly across hallways and across networks.
As an industry, healthcare is at a stage where meaningful electronic data is being generated. Now you need to ensure that the data is accessible and trustworthy so that it can be rapidly analyzed. As data is aggregated across the ecosystem and married with financial and genomic data, data quality issues become more obvious. It’s vital that you can identify the data issues so that people can spend their time analyzing the data to gain insights instead of wading through and manually resolving data quality issues.
The ability to trust data will differentiate leaders from the followers. Leaders will advance personalized medicine because they rely on clean, safe and connected data to:
1) Practice analytics as a core competency
2) Define evidence, deliver best practice care and personalize medicine
3) Engage patients and collaborate to foster strong, actionable relationships
Take a look at this Healthcare eBook for more on this topic: Potential Unlocked: Transforming Healthcare by Putting Information to Work.
Q. What is holding healthcare organizations back from managing their healthcare data like other mission-critical assets?
When you say other mission-critical assets, I think of facilities, equipment, etc. Each of these assets has people and money assigned to manage and maintain it. The healthcare organizations I talk to that are highly invested in personalized medicine recognize that data is mission-critical. They are investing in the people, processes and technology needed to ensure data is clean, safe and connected. The technology includes data integration, data quality and master data management (MDM).
What’s holding other healthcare organizations back is that while they realize they need data governance, they wrongly believe they need to hire big teams of “data stewards” to be successful. In reality, you don’t need to hire a big team. Use the people you already have doing data governance. You may not have made this a formal part of their job description and they might not have data governance technologies yet, but they do have the skillset and they are already doing the work of a data steward.
So while a technology investment is required and you need people who can use the technology, start by formalizing the data stewardship work people are already doing as part of their current jobs. This way, the people who understand the data take an active role in managing it, and they even get excited about it because their work is being recognized. IT takes on the role of enabling these people instead of bearing responsibility for all things data.
Q. Can you share examples of how immature information governance is a serious impediment to healthcare payers and providers?
Sure, without information governance, data is not harmonized across sources and so it is hard to make sense of it. This isn’t a problem when you are one business unit or one department, but when you want to get a comprehensive view or a view that incorporates external sources of information, this approach falls apart.
For example, let’s say the cardiology department in a healthcare organization implements a dashboard. The dashboard looks impressive. Then a group of physicians sees the dashboard, points out errors, and asks where the information (i.e., diagnosis or attending physician) came from. If you can’t answer these questions, can’t trace the data back to its sources, or have data inconsistencies, the dashboard loses credibility. This is how analytics fail to gain adoption and fail to foster innovation.
Q. Can you share examples of what data-driven healthcare organizations are doing differently?
Certainly, while many are just getting started on their journey to becoming data-driven, I’m seeing some inspiring examples, including:
- Implementing data governance for healthcare analytics. The program and data is owned by the business and enabled by IT and supported by technology such as data integration, data quality and MDM.
- Connecting information from across the entire healthcare ecosystem including 3rd party sources like payers, state agencies, and reference data like credit information from Equifax, firmographics from Dun & Bradstreet or NPI numbers from the national provider registry.
- Establishing consistent data definitions and parameters
- Thinking about the internet of things (IoT) and how to incorporate device data into analysis
- Engaging patients through non-traditional channels including loyalty programs and social media; tracking this information in a customer relationship management (CRM) system
- Fostering collaboration by understanding the relationships between patients, providers and the rest of the ecosystem
- Analyzing data to understand what is working and what is not working so that they can drive out unwanted variations in care
Q. What advice can you give healthcare provider and payer employees who want access to high quality healthcare data?
As with other organizational assets that deliver value—like buildings and equipment—data requires a foundational investment in people and systems to maximize return. In other words, institutions and individuals must start managing their mission-critical data with the same rigor they manage other mission-critical enterprise assets.
Q. Anything else you want to add?
Yes, I wanted to thank our 14 visionary customer executives at data-driven healthcare organizations such as MD Anderson, UPMC, Quest Diagnostics, Sutter Health, St. Joseph Health, Dallas Children’s Medical Center and Navinet for taking time out of their busy schedules to share their journeys toward becoming data-driven at Informatica World 2014. In our next post, I’ll share some highlights about how they are using data, how they are ensuring it is clean, safe and connected and a few data management best practices. InformaticaWorld attendees will be able to download presentations starting today! If you missed InformaticaWorld 2014, stay tuned for our upcoming webinars featuring many of these examples.
A Data Integration Developer, a Data Analyst and a Business Analyst go into a bar… Heard that one? You probably didn’t. They never made it to the bar. They are still back at the office, going back and forth for the umpteenth time on the data requirements for the latest report…
Sound familiar? If so, you are not alone. Many IT departments are struggling to meet the data needs of their business counterparts. Spreadsheets, emails and cocktail napkins have not proven to be effective tools for relaying the business’s data requirements. The process takes too long and leaves both sides frustrated and dissatisfied with the outcome. IT does not have the bandwidth to meet the ever-increasing and rapidly changing data needs of the business.
The old-fashioned “waterfall” approach to data integration simply won’t cut it anymore in the fast-paced data-driven world. There has to be a better way. Here at Informatica, we believe that an end-to-end Agile Data Integration process can greatly increase business agility.
We start with a highly collaborative process, whereby IT and the analyst work closely through an iterative process to define data integration requirements. IT empowers the analyst with self-service tools that enable rapid prototyping and data profiling. Once the analyst is happy with the data they have accessed and combined, they can seamlessly share the output with IT for final deployment. This approach greatly reduces the time-to-data, and not just any data: the right data!
The ability to rapidly generate reports and deliver new critical data for decision-making is foundational to business agility. Another important aspect of business agility is the ability to scale your system as your needs grow to support more data, data types, users and projects. We accomplish that through advanced scaling capabilities, such as grid support and high availability, leading to zero downtime, as well as improved data insights through metadata management, lineage, impact analysis and business glossary.
Finally, we need to continue to ensure agility once our system is in production. Data validation should be performed to eliminate data defects. Manually validating data is like looking for a needle in a haystack, very slowly… Automating your data validation process is fast and reliable, ensuring that the business gets accurate data all the time.
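The automated validation described above can be sketched in a few lines. This is not Informatica’s implementation, just a minimal illustration of the core checks a validation pass makes after a load: matching row counts, key coverage, and per-row field equality between source and target. The sample rows and the `validate_load` helper are invented for the example.

```python
# A minimal sketch of automated data validation between a source and a
# target after a data integration load. Real tools do this at scale and
# in bulk; the checks themselves are the same idea.
def validate_load(source_rows, target_rows, key):
    """Compare two lists of dict rows; return human-readable defects."""
    defects = []
    src = {row[key]: row for row in source_rows}
    tgt = {row[key]: row for row in target_rows}

    if len(src) != len(tgt):
        defects.append(f"row count mismatch: source={len(src)} target={len(tgt)}")
    for k in src.keys() - tgt.keys():           # rows dropped during the load
        defects.append(f"missing in target: {key}={k}")
    for k in src.keys() & tgt.keys():           # rows corrupted during the load
        if src[k] != tgt[k]:
            defects.append(f"field mismatch for {key}={k}")
    return defects

source = [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 250.0}]
target = [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 205.0}]

print(validate_load(source, target, key="id"))  # flags the transposed amount
```

Run after every load, a check like this catches defects (here, a transposed amount) before business users ever see them, which is the "proactive rather than reactive" posture the next paragraph argues for.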
It is just as important to become more proactive and less reactive when it comes to your data in production. Early detection of data process and workflow problems through proactive monitoring is key to prevention.
Would you like to see a 5X increase in the speed of delivering data integration projects?
Would you like to provide the system reliability you need as your business grows, and ensure that your business continues to get the critical data it requires without defects and without interruption?
To learn more about how Agile Data Integration can enable business agility, please check out the demonstration of the newly-released PowerCenter 9.6, featuring David Lyle, VP Product Strategy at Informatica and the Informatica Product Desk experts. This demo webinar is available on demand.
It troubles me to repeatedly get into conversations with IT managers who want to fix data “for the sake of fixing it”. While this is presumably increasingly rare, my department’s role means we probably see a higher occurrence of it than the average software vendor employee. Given that, please excuse the inflammatory title of this post.
Nevertheless, once the deal is done, we find fewer and fewer of these instances, yet still enough, since the average implementation consultant or developer cares about this aspect even less. A few months ago, a petrochemical firm’s G&G IT team lead told me that he does not believe data quality improvements can or should be measured. He also said, “If we need another application, we buy it. End of story.” Good for software vendors, I thought, but in most organizations $1M here or there does not lie around leisurely, and decision makers want to see the – dare I say it – ROI.
However, IT and business leaders should take note that misalignment, caused by a lack or disregard of communication, is a critical risk factor. If the business does not get what it needs and wants, and what it wants differs from what corporate IT is envisioning and working on – and this is what I am talking about here – it makes any IT investment a risky proposition.
Let me illustrate this with 4 recent examples I ran into:
1. Potential for flawed prioritization
A retail customer’s IT department apparently knew that fixing and enriching customer loyalty records across the enterprise was a good and financially rewarding idea. They only wanted to understand what the less risky functional implementation choices were. They indicated that if they wanted to learn the actual financial impact of “fixing” certain records or attributes, they would just look into their enterprise data warehouse. This is where the logic falls apart, as the warehouse would be just as unreliable as the “compromised” applications (POS, marketing, ERP) feeding it.
Even if they massaged the data before the next EDW load, there is nothing inherently real-time about this, as all the OLTP systems keep running on incorrect (no bidirectional linkage) and stale (since the last load) data.
I would question whether the business is truly aligned with what IT is continuously correcting. After all, IT may go for the “easy or obvious” fixes via a weekly or monthly recurring data scrub exercise without truly knowing which fix is the “biggest bang for the buck,” or what other affected business use cases exist that they may not even be aware of yet. Imagine the productivity impact of all the round-tripping and reporting delay this creates. This example also reminds me of a telco client I encountered during my tenure at another tech firm, which fed its customer master from its EDW and has now found out that this pattern is doomed to fail due to data staleness and performance.
2. Fix IT issues and business benefits will trickle down
Client number two is a large North American construction company. An architect built a business case for fixing a variety of data buckets in the organization (CRM, Brand Management, Partner Onboarding, Mobility Services, Quotation & Requisitions, BI & EPM).
Grand vision documents existed and were linked to the case, stating how the data would get better (like a sick patient), but there was no mention of hard facts about how each of the use cases would deliver on this. After I gave him some serious counseling on what to look out for and how to flesh it out – radio silence. Someone got scared of the math, I guess.
3. Now that we bought it, where do we start
The third culprit was a large petrochemical firm, which apparently sat on some excess funds and thought (rightfully so) it was a good idea to fix their well attributes. More power to them. However, the IT team is now in a dreadful position having to justify to their boss and ultimately the E&P division head why they prioritized this effort so highly and spent the money. Well, they had their heart in the right place but are a tad late. Still, I consider this better late than never.
4. A senior moment
The last example comes from a South American communications provider. They seemingly did everything right, given the results they achieved to date. This goes to show that misalignment of IT and business does not necessarily wreak havoc – at least not initially.
However, they are now in phase 3 of their rollout, and reality has caught up with them. A senior moment or a lapse in judgment, maybe? Whatever it was, once they fixed their CRM, network and billing application data, they had to start talking to the business and financial analysts as complaints and questions started to trickle in. Once again, better late than never.
So what is the takeaway from these stories? Why wait until phase 3? Why be forced to cram in a justification after the purchase? You pick which approach works best for you to fix this age-old issue. But please heed Sohaib’s words of wisdom, recently broadcast on CNN Money: “IT is a mature sector post bubble… now it needs to deliver the goods”. And here is an action item for you: check out the new way for business users to prepare their own data (30 minutes into the video!). Agreed?
Within every corporation there are lines of businesses, like Finance, Sales, Logistics and Marketing. And within those lines of businesses are business users who are either non-technical or choose to be non-technical.
These business users are increasingly using Next-Generation Business Intelligence tools like Tableau, Qliktech, MicroStrategy Visual Insight, Spotfire or even Excel. A unique capability of these Next-Generation Business Intelligence tools is that they allow a non-technical business user to prepare data themselves prior to ingesting it into these tools for subsequent analysis.
Initially, the types of activities involved in preparing this data are quite simple: perhaps joining two Excel files on a common field. Over time, however, the operations a non-technical user wishes to perform on the data become more complex. They want to join two files of differing grain, validate or complete addresses, or even enrich company or customer profile data. When a non-technical user reaches this point, they require either coding skills or advanced tooling, neither of which they have access to. So they pick up the phone, call their brethren in IT, and ask nicely for help with combining, enhancing the quality of, and enriching the data. Oftentimes they need the resulting dataset back in a tight timeframe, perhaps a couple of hours. IT will initially be very happy to oblige, returning the dataset in the timeframe requested and at the quality levels expected. No issues.
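The simple end of that spectrum, joining two files on a common field, looks like this in pandas. This is an illustration, not a prescription of any particular tool: the file names, columns and values are made up, and in practice the frames would come from something like `pd.read_excel("customers.xlsx")` rather than being built inline.

```python
# Joining two tables on a common field: the "simple" data prep task a
# business user starts with before requests get more complex.
import pandas as pd

# Stand-ins for two Excel files a business user might combine.
customers = pd.DataFrame({"customer_id": [1, 2, 3],
                          "region": ["West", "East", "East"]})
orders = pd.DataFrame({"order_id": [10, 11, 12],
                       "customer_id": [1, 1, 3],
                       "amount": [250.0, 90.0, 140.0]})

# Inner join on the shared field keeps only orders with a matched customer.
combined = orders.merge(customers, on="customer_id", how="inner")
print(combined)
```

Joins of differing grain, address validation, or third-party enrichment are where this self-service approach runs out of road and the calls to IT begin, which is the escalation the paragraph above describes.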
However, as the number of non-technical business users using Next-Generation Business Intelligence tools increases, the number of requests to IT for datasets also increases. And so, while IT was initially able to meet the “quick hit dataset” requests from the business, over time, and despite their best efforts, IT increasingly becomes unable to do so.
The reality is that, over time, the business will see a gradual decrease in the quality of the datasets returned, as well as an increase in the timeframe required for IT to provide the data. At some point the business reaches a decision point: to meet their business commitments, they will have to find other means to put together their “quick hit datasets.” It is precisely at this point that the business may do things like hire an IT contractor to sit next to them and do nothing but assemble these “quick hit” datasets. It is also when IT begins to feel marginalized and will likely begin to see a drop in funding.
This dynamic is one that has been around for decades and has continued to worsen due to the increase in the pace of data driven business decision making. I feel that we at Informatica have a truly unique opportunity to innovate a technology solution that focuses on two related constituents, specifically, the Non-Technical Business User and the IT Data Provisioner.
The specific value this technology will provide to the non-technical business user is the ability to rapidly put together datasets for subsequent analysis in their Next-Generation BI tool of choice. Without such a tool, they might spend a week or two assembling a dataset, or wait for someone else to put it together. I feel we can invert this division of labor: business users spend 15 minutes putting the dataset together themselves and then 1-2 weeks performing meaningful analysis. In doing so, we allow non-technical business users to dramatically decrease their decision-making time.
The specific value this technology will provide the IT data provisioner is the ability to effectively scale data provisioning as the number of requests for “quick hit datasets” rapidly increases. Most importantly, they will be able to scale proactively. With that, the business and IT relationship can become a match made in heaven.
“Start your master data management (MDM) journey knowing how it will deliver a tangible business outcome. Will it help your business generate revenue or cut costs? Focus on the business value you plan to deliver with MDM and revisit it often,” advises Michael Delgado, Information Management Director at Citrix during his presentation at MDM Day, the InformaticaWorld 2014 pre-conference program. MDM Day focused on driving value from business-critical information and attracted 500 people.
In Ravi Shankar’s recent MDM Day preview blog, Part 2: All MDM, All Day at Pre-Conference Day at InformaticaWorld, he highlights the amazing line up of master data management (MDM) and product information management (PIM) customers speakers, Informatica experts as well as our talented partner sponsors.
Here are my MDM Day fun facts and key takeaways:
- Did you know that every 2 seconds an aircraft with GE engine technology is taking off somewhere in the world?
GE Aviation’s Chief Enterprise Architect, Ginny Walker, presented “Operationalizing Critical Business Processes: GE Aviation’s MDM Story.” GE Aviation is a $22 billion company and a leading provider of jet engines, systems and services. Ginny shared the company’s multi-year journey to improve installed-base asset data management. She explained how the combination of data, analytics, and connectivity results in productivity improvements such as reducing up to 2% of the annual fuel bill and reducing delays. The keys to GE Aviation’s analytical MDM success were: 1) tying MDM to business metrics, 2) starting with a narrow scope, and 3) data stewards. Ginny believes that MDM is an enabler for the Industrial Internet and Big Data because it empowers companies to get insights from multiple sources of data.
- Did you know that EMC has made a $17 billion investment in acquisitions and is integrating more than 70 technology companies?
EMC’s Barbara Latulippe, aka “The Data Diva,” is the Senior Director of Enterprise Information Management (EIM). EMC is a $21.7 billion company that has grown through acquisition and has 60,000 employees worldwide. In her presentation, “Formula for Success: EMC MDM Best Practices,” Barbara warned that if you don’t have a data governance program in place, you’re going to have a hard time getting an MDM initiative off the ground. She stressed the importance of building a data governance council and involving the business as early as possible to agree on key definitions such as “customer.” Barbara and her team focused on the financial impact of higher-quality data to build a business case for operational MDM. She asked her business counterparts, “Imagine if you could onboard a customer in 3 minutes instead of 15 minutes?”
- Did you know that Citrix is enabling the mobile workforce by uniting apps, data and services on any device over any network and cloud?
Citrix’s Information Management Director, Michael Delgado, presented “Citrix MDM Case Study: From Partner 360 to Customer 360.” Citrix is a $2.9 billion Cloud software company that embarked on a multi-domain MDM and data governance journey for channel partner, hierarchy and customer data. Because 90% of the company’s product bookings are fulfilled by channel partners, Citrix started their MDM journey to better understand their total channel partner relationships, make it easier to do business with Citrix, and boost revenue. Once they were successful with partner data, they turned to customer data. They wanted to boost customer experience by understanding the total customer relationship across product lines and regions. Armed with this information, Citrix employees can engage customers in one product renewal process for all products. MDM also helps Citrix’s sales team with white space analysis to identify opportunities to sell more user licenses in existing customer accounts.
- Did you know Quintiles helped develop or commercialize all of the top 5 best-selling drugs on the market?
Quintiles’ Director of the Infosario Data Factory, John Poonnen, presented “Using Multi-domain MDM to Gain Information Insights: How Quintiles Efficiently Manages Complex Clinical Trials.” Quintiles is the world’s largest provider of biopharmaceutical development and commercial outsourcing services, with more than 27,000 employees. John explained how the company leverages a tailored, multi-domain MDM platform to gain a holistic view of business-critical entities such as investigators, research facilities, clinical studies, study sites and subjects to cut costs, improve quality and productivity, and meet regulatory and patient needs. “Although information needs to flow throughout the process – it tends to get stuck in different silos and must be manually manipulated to get meaningful insights,” said John. He believes master data is foundational: combining it with other data, capabilities and expertise makes it transformational.
While I couldn’t attend the PIM customer presentations below, I heard they were excellent. I look forward to watching the videos:
- Crestline/Geiger: Dale Denham, CIO, presented “How Product Information in eCommerce improved Geiger’s Ability to Promote and Sell Promotional Products.”
- Murdoch’s Ranch and Home Supply: Director of Marketing Kitch Walker presented “Driving Omnichannel Customer Engagement – PIM Best Practices.”
I also had the opportunity to speak with some of our knowledgeable and experienced MDM Day partner sponsors. Go to Twitter and search for #MDM and #DataQuality to see their advice on what it takes to successfully kick-off and implement an MDM program.
There are more thought-provoking MDM and PIM customer presentations taking place this week at InformaticaWorld 2014. To join or follow the conversation, use #INFA14 #MDM or #INFA14 #PIM.