Tag Archives: Master Data Management
You know that old saying, “What’s the best way to eat an elephant?” “One bite at a time.”
Much like the daunting task of eating an elephant, implementing an MDM solution can seem staggering. Many times, the implementation team gets mired in the details, wanting to create a solution that is the answer to everyone's problems, and as a result never gets started due to the overwhelming nature of the undertaking. After many successful MDM implementations, I'd like to recommend the following approach to eating your elephant:
- Clearly establish the business problem you are trying to solve
According to the Gartner Group, “MDM is a business-driven, technology-enabled environment and program.” It is really important to remember that the reason you are working on an MDM project is to satisfy a specific business need aligned with the overall business strategy. Once aligned – it is easier to show value to the organization when it is successfully implemented.
For instance – if you want to reduce readmission costs for a specific segment of your member population, this could have an impact on several potential business goals in the payer market, including improving customer satisfaction and lowering the cost of care for specific populations. Improving customer satisfaction is a pretty big elephant – but focusing on reducing readmission costs is much more manageable. And it will impact two of your business goals! Sutter Health has taken this approach and is releasing new use cases every 90 days.
- Understand and document your current state
Now that you know what problem you are attempting to solve – you will need to understand the maturity of your current approach and the resources it will take to implement your solution. You will need to consider your available resources, understand the amount of time and money it may take to implement a solution, and obtain visibility into where your organization currently stands with respect to the problem you are trying to solve.
Using the readmission example – you are going to need to identify where readmission data is available. How often is this data updated? What is the quality of the data? How difficult is it to get access to the data? Who are the resources that are responsible for readmission data? Are they distracted by many competing requests? What are some of the data sources that could impact readmission rates that are not currently part of your data sources? For instance – understanding the living arrangement of the member prior to discharge can have an impact on readmission rates (members living alone have a much higher readmission rate than those living with others). If you can identify a household through social media data, you can better predict who will be going home alone.
- Clearly define success
There is a business adage (much like eating an elephant!) that says "you can't manage what you can't measure." Before you implement your MDM solution – you need to identify which metrics are most important and demonstrate success. These should align with the goals you set in #1 (which should align with the goals of the business). You should also identify specific business outcomes that the metrics can apply to.
What would be a reasonable measure of success for reduction of readmission rates?
- Moving from a 15% readmission rate to an 8% readmission rate?
- Moving from a 15% readmission rate to a 10% readmission rate? In what amount of time?
- Perhaps moving from a 15% readmission rate to a 13% readmission rate in a year after implementation?
- Build a governance hierarchy
According to Gartner Group, "Governance is about decision making and ensuring that you have an authority framework that takes decisions and is able to measure the execution of those decisions." An effective governance program requires a well-defined hierarchy, headed by a sponsor – someone in a position of authority who carries the necessary weight and cross-departmental authority to make MDM governance a reality. How can you find out how mature your organization is from a data governance perspective? There are several vendors and websites that can help with that, including www.governyourdata.com.
You need to be able to make assessments about the quality of the source data and the data lineage (where and how the data feeding your MDM solution has been modified) for compliance reporting requirements, as well as establish a process for supporting your data quality. There are 10 data governance insights from UPMC here.
Now – what is the smallest bite you can define for your organization? By laying out your business priorities, identifying your current state, creating a measurable goal and a method for governing – you have put clear boundaries on what you are trying to accomplish. Once you’ve established your credibility on your ability to finish your first bite successfully, the next bite will be easier!
If you've been contemplating how to transform your business, making it a priority to embrace digital transformation, this year's Informatica World 2015 in Las Vegas has a series of B2B and B2C sessions for you. Here are some I'd like to recommend:
B2B Commerce Ready Data
PartsSource Improves Customer Experience with Product Information in Medical Parts
Brian Thomas, Director of Application, PartsSource
A leading provider of medical replacement parts solutions, PartsSource will discuss how it uses Informatica PIM as the foundation for its omnichannel strategy and to support its industry-first one-stop shop online catalog for medical parts. PartsSource will explain how Informatica helped reduce the time needed to launch and update products from 120 minutes to 2 minutes using fewer employees, making it easier than ever to connect over 3,300 hospitals to thousands of OEMs and suppliers.
B2C Commerce Ready Data
Elkjop and Monsanto: PIM and MDM to Manage Product and Customer Data
Thomas Thykjer, Master Data Architect, Elkjop Nordics AS
Jim Stellern, US Commercial Data Management and BI IT Lead, Monsanto
Elkjop, the largest consumer electronics retailer in the Nordic countries, increased both its product range and the quality of its product information across its entire portfolio by leveraging the strong embedded Data Quality tools found in Informatica PIM. Elkjop will also discuss how it reduced its yearly development and operating costs.
Monsanto developed a strategy to address customer data quality issues, clearly articulated the business value of implementing the strategy, and successfully implemented the solution leveraging Informatica MDM and a new data governance program.
Omnichannel Ready: What’s New in PIM 8?
Latest Features, Demo and Roadmap
Stefan Reinhardt, Senior Product Manager PIM, R&D DEU PIM, Informatica
Ben Rund, Sr. Director Product Marketing, Information Quality Solutions, Informatica
Are you commerce ready? Can you say the same about your data? Are you focused on the betterment of your supply chain, marketing, ecommerce, merchandising, and category management? This session is aimed at those wanting to sell more products faster.
Learn what’s new with PIM 8:
- Kits and bundles for superior cross-selling
- High data volumes architecture
- Role- and task-based web interfaces for increased efficiency on product content collaboration
- Business user dashboard
- 1WorldSync data pool syndication for compliant CPG data
- Product Data as a Service (DaaS) for price intelligence and content benchmarking.
We will cover all of these new features during the session in a live demo.
We will also cover the PIM roadmap, showcasing how PIM is evolving and gaining MDM-like features by fueling product data apps for different use cases and industries, leveraging the Intelligent Data Platform.
User Group Session for Omnichannel
Are you looking to talk to the experts? Don't miss out on our user group sessions, where you will be able to discuss and work directly with the R&D and product management experts. They will be there to answer your questions as well as hear your thoughts and feedback. It's your opportunity to be heard and get answers to your questions!
37 Sessions to Master the Digital Transformation
A total of 37 sessions, focused on Master Data Management and Information Governance – key disciplines and the foundation of successfully mastering digital transformation – will be held on May 12th and 13th.
The Good, The Bad, and The Ugly! Experiencing Great (and Not So Great) Data-Driven Marketing in Day-to-Day Life
I spend my professional life helping marketers get the most out of their data. So it really hits me when I experience real life case studies in my day-to-day personal life. When I do run across these great (and not so great) experiences, it broadens my perspective and helps me figure out new ways that I can help our customers use clean, safe, and connected data to revolutionize their marketing efforts.
I feel compelled to blog about these great customer experiences in hopes that other marketers can learn from these encounters like I did. So this is the first installment of an ongoing series of blog posts about where I’ve experienced great (and not so great) data-driven marketing experiences.
I love my insurance company for many reasons, but the last experience I had with them was truly exceptional. I had been on their website getting an insurance quote for a new home. I still had a few questions, so I initiated an online chat. The online representative quickly and efficiently answered my questions, and I left the experience with a thorough understanding of the potential policy. About a week later, seemingly unrelated, I put in a claim through a local vendor for a chipped windshield repair. Then two weeks later, I called back, and the phone system recognized my mobile number and asked me through the automated system if I was calling about the homeowner's policy quote I had received a few weeks earlier. I pressed 1 for yes, and here's where the impressive part begins…
A representative quickly answered the phone. He was able to view all of the information I had put into the online quote tool. He then referenced the question I had asked the representative in the online chat and asked if I had any further questions about it. Within minutes I had my new policy set up, but it didn't stop there. He asked if the windshield chip from the previous week had been repaired to my satisfaction. Then he noticed that although I have a credit card through them, I hadn't used it in some time, and offered me a 0% balance transfer for 36 months with no transfer fee on that card. Heck, we just bought a new house and I know there will be plenty of expenses, so sign me up! Finally, he noticed that because I was about to move to a new zip code, my car insurance rates would be going down slightly, and he offered to send me to the automobile insurance team to make the change to my policy.
The system clearly tied all of my recent and past activities from various channels together, analyzed them, and leveraged some sort of recommendation engine to guide the customer support representative to provide truly customized service. You can be assured I love my insurance company even more, and I will be dusting off a stagnant account that I hadn’t used in years. A+.
Oh how I wish my bank would embrace Total Customer Relationship. My husband, my children, and I have far too many accounts at our bank. Each child has a savings account, we have a joint checking account, we have a savings account, he has a personal savings account, I have a personal savings account, and then there's the cash reserve line, the credit card, and a few CDs. We recently sold a home, and the now-paid-off mortgage was through them. Plus we've been customers for almost 20 years. So suffice it to say, we have been loyal customers.
Well, the other day I had to get a cashier's check for a school activity for one of my children. I know, it's pretty strange that they needed a cashier's check, but I digress. I went to pull half out of my son's minor savings account and half out of my individual checking account. Neither of those accounts has much money in it, nor has either been open for very long. Call me spoiled, but I'm used to getting these types of service fees waived because of my long tenure and deep relationship with our bank. But because the accounts I pulled the money from didn't have that kind of history, they weren't willing to waive the fees. I was in a hurry because I was running late from shopping at a shoe store (see "The Ugly" below), so I didn't have the time or energy to fight it. I paid the darn $5 fee, but I was irritated. Clearly they couldn't easily see the total customer relationship I have with their institution. They aren't leveraging their data to tell a complete story, and they missed an opportunity to show a loyal customer a great experience.
And The Ugly
A few days ago, I was at a shoe store picking up some new soccer cleats for one of my children. I had gotten an email offer for 30% off, so I pulled up the email and prepared to use it at the cash register. For whatever reason, the email had a blank where a code was supposed to be, and the woman at the register, despite her best efforts, couldn't use it. So, trying to be helpful, she looked at my loyalty account and, lo and behold, I had $40 worth of rewards points that I didn't even know existed. But I first had to download an app to issue a coupon using those points. I downloaded the app, put in my loyalty number, and no points were available.
Turns out, I had two accounts, but they weren't linked despite having the same phone number and name. One had an address that was misspelled in the system, so it apparently wouldn't merge with the other account – oh, data quality and address validation, how I missed you at that moment! She corrected the address and informed me that it would now merge the two accounts, and told me to try to log in again. Of course, I knew there was no way this was a real-time, or even near real-time, process, but she was insistent. So I tried again. Nothing.
The woman couldn’t have been nicer, but poor data quality processes and long batch windows had her hands tied. I was advised to call the customer support line, but of course it was a Sunday afternoon and nobody was there to pick up. So 45 minutes later, I left the store, irritated and very late, and without the shoes I was going to purchase out of principle. In the future, I’ll be going down the street and shopping at another shoe store – it’s my own personal strike against antiquated, inaccurate, incomplete, and painfully slow data processes which result in bad customer experiences!
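The failed merge in the story above is a textbook record-linkage problem: two records that clearly describe the same person, blocked from matching by a single misspelled field. A minimal sketch of how normalizing fields before comparing them avoids exactly this failure (all field names and records here are made up for illustration, not taken from any real loyalty system):

```python
import re

def normalize(record):
    """Normalize fields so trivial typos and formatting don't block a match."""
    return {
        "name": re.sub(r"\s+", " ", record["name"]).strip().lower(),
        "phone": re.sub(r"\D", "", record["phone"]),  # keep digits only
        "address": re.sub(r"[^a-z0-9 ]", "", record["address"].lower()),
    }

def match_score(a, b):
    """Score two records 0-100; exact phone and name outweigh the address."""
    a, b = normalize(a), normalize(b)
    score = 40 if a["phone"] == b["phone"] else 0
    score += 40 if a["name"] == b["name"] else 0
    # Token overlap (Jaccard) tolerates one misspelled address word.
    ta, tb = set(a["address"].split()), set(b["address"].split())
    score += int(20 * len(ta & tb) / max(len(ta | tb), 1))
    return score

acct1 = {"name": "Jane Doe", "phone": "555-123-4567", "address": "12 Oak Street"}
acct2 = {"name": "Jane Doe", "phone": "(555) 123 4567", "address": "12 Oak Stret"}
print(match_score(acct1, acct2))  # → 90: strong match despite the address typo
```

With a threshold-based rule like this, the misspelled "Stret" only dents the score instead of vetoing the merge, which is the behavior the store's system lacked.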
In The End…
In the greater scheme of things, these varying degrees of customer experience “misses” aren’t exactly a crisis. It’s not curing cancer or solving world hunger, but to consumers, having a great customer experience is really important. Wouldn’t you rather have your customers raving about a great experience, than grumbling about a bad one, or losing a customer due to an ugly one?
We marketers can make the difference! We own that end-to-end omnichannel experience. We need to make sure that our data is clean, safe, and connected so we can provide our customers what they expect and frankly deserve from us.
Informatica’s Total Customer Relationship Solution empowers organizations with confidence, knowing that they have access to the kind of great customer data that allows them to surpass customer acquisition and retention goals by providing consistent, integrated, and seamless customer experiences across channels. The end result? Great experiences that customers are inspired to share with their family and friends at dinner parties and on blog posts like this one.
Want to learn more? Check out these webinars to see how Informatica and our customers and partners are revolutionizing the customer experience.
Seven steps to clean, trusted and aligned well data.
Is your data ready to help you make the big decisions? "Data and analytics inputs" (37%) just nudged out "my own experience and intuition" (33%) when energy executives were asked, as part of PWC's 2014 Big Decisions survey, what they relied on most for the last big decision they made.
But, what about the little decisions? The ones that add up and either create value or contribute to waste, primarily in the area of operational efficiency? It’s important to think about this aspect of decision making too.
In a recent article by Forbes magazine, Tom Morgan, Analyst and Corporate Counsel at Drillinginfo, credits the present stamina of the unconventional plays to efficiency gains. These gains have been as much as 25% and are altering the profitability thresholds for productive wells, changing many of the beliefs on which today’s headlines are focused.
A well’s profitability is driven by multiple variables, including well location, lease terms, number of wells drilled, uptime, partners, resources, and so on. These attributes are monitored and combined to ensure a well reaches its full productive potential. But how do you know if the information you’re depending on is ready to provide the answers you need when you need them?
Given that an unconventional well’s productivity declines after the first 2 years, every decision made in this short timeframe puts the well’s profitability at risk. Approaching bad or incomplete data as a ‘cost of doing business’ can mean millions of dollars in unnecessary waste – per well.
The best strategic and operational decisions stem from clean, trusted, and aligned data. Kudos to the upstream, unconventional teams that are leading the way to exploiting how data is used in the oil & gas industry. The most effective way to ensure your data is decision ready is to adopt these seven steps:
- Integrate well information. Bring together fragmented data you expect to be consistent across systems but isn’t.
- Evaluate the quality of data. Make sure the information is accurate and complete. If it isn’t, you can see what needs to be fixed.
- De-duplicate it. Automatically identify duplicate records and reconcile them into a single well profile.
- Enrich it. Enhance well profiles with 3rd party data such as state/local and IHS data.
- Validate it. Ensure you can identify the correct well when you need to. (It’s harder than it sounds.)
- Relate it. See relationships between wells, assets, associates, suppliers and partners, and the projects they’re associated with.
- Deliver it. Fuel business and analytical applications with clean, consistent and connected well information.
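To make steps 1 and 3 concrete, here is a minimal sketch of integrating well records from two systems and reconciling duplicates into single profiles. The source systems, field names, and API numbers are hypothetical, and a real MDM match rule would be far richer than keying on one identifier:

```python
# Step 1 (integrate): hypothetical well records pulled from two source systems.
source_a = [
    {"api": "42-501-20180", "name": "Smith 1H", "county": "Midland"},
    {"api": "42-501-20199", "name": "Jones 2H", "county": "Martin"},
]
source_b = [
    # Same physical well as the first record above, entered differently.
    {"api": "42-501-20180", "name": "SMITH #1H", "county": "Midland"},
]

def dedupe_by_api(records):
    """Step 3 (de-duplicate): reconcile records into one profile per API number.
    The first occurrence of each field wins; later records only fill gaps."""
    profiles = {}
    for rec in records:
        key = rec["api"].replace("-", "")       # normalize the identifier
        profile = profiles.setdefault(key, {})
        for field, value in rec.items():
            profile.setdefault(field, value)    # keep first non-missing value
    return profiles

master = dedupe_by_api(source_a + source_b)
print(len(master))  # → 2 unique wells reconciled from 3 source records
```

The `master` dictionary is then what step 7 would deliver to downstream business and analytical applications.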
As O&G companies look to save on IT costs, they need to look to areas that add value or decrease other costs over the long term. Chris Niven, IDC analyst, sums it up best: “Oil and gas companies need to spend their investments wisely to help them become agile and operationally efficient.”
On Tuesday, May 12, during InformaticaWorld 2015, Devon Energy, a Fortune 500 company, will join other industry innovators during MDM Day. Devon will share how they’re using data virtualization to streamline their data processes and gain efficiencies and insights in the process. Be sure to register here.
The Oil and Gas (O&G) industry is an important backbone of every economy. It is also an industry that has weathered the storm of constantly changing economic trends, regulations, and technological innovations. O&G companies by nature have complex, data-intensive processes. To function profitably under changing trends, policies, and guidelines, O&G companies need to manage these data processes really well.
The industry is subject to pricing volatility based on microeconomic patterns of supply and demand, affected by geopolitical developments, economic downturns, and public scrutiny. Competition from other sources such as cheap natural gas, combined with low margins, is adding fuel to the fire, making it hard for O&G companies to achieve sustainable, predictable outcomes.
A recent PWC survey of oil and gas CEOs similarly concluded that “energy CEOs can’t control market factors such as the world’s economic health or global oil supply, but can change how they respond to market conditions, such as getting the most out of technology investments, more effective use of partnerships and diversity strategies.” The survey also revealed that nearly 80% of respondents agreed that digital technologies are creating value for their companies when it comes to data analysis and operational efficiency.
O&G firms run three distinct business operations: upstream (exploration & production, or E&P), midstream (storage & transportation), and downstream (refining & distribution). All of these operations need a few core data domains standardized for every major business process. However, a key challenge faced by O&G companies is that this critical core information is often spread across multiple disparate systems, making it hard to make timely decisions. To ensure effective operations and to grow their asset base, it is vital for these companies to capture and manage critical data related to these domains.
E&P's core data domains include wellhead/materials (asset), geo-spatial location data, and engineers/technicians (associate). Midstream includes trading partners and distribution, and downstream includes commercial and residential customers. Classic distribution use cases emerge around shipping locations, from large-scale clients like airlines and other logistics providers buying millions of gallons of fuel and industrial lube products down to gas station customers. The industry also relies heavily on reference data and charts of accounts for financial cost and revenue roll-ups.
The main E&P asset, the well, goes through its life cycle and changes characteristics (location, ID, name, physical characterization, depth, crew, ownership, etc.), all of which are master data aspects to consider for this baseline entity. If we master this data and create a consistent representation across the organization, it can then be linked to transaction and interaction data so that O&G companies can drive investment decisions and split cost and revenue through reporting and real-time processes around:
- Crew allocation
- Royalty payments
- Safety and environmental inspections
- Maintenance and overall production planning
E&P firms need a solution that allows them to:
- Have a flexible multidomain platform that permits easier management of different entities under one solution
- Create a single, cross-enterprise instance of a wellhead master
- Capture and master the relationships between the well, equipment, associates, land and location
- Govern end-to-end management of assets, facilities, equipment and sites throughout their life cycle
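As a thought experiment, the kind of cross-enterprise wellhead master described above might be modeled roughly like this. Every class, field, and value here is hypothetical, a simplification of what a real multidomain MDM model would hold:

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class Associate:
    """A person related to the well, e.g. a crew engineer or technician."""
    name: str
    role: str

@dataclass
class WellMaster:
    """Hypothetical single, cross-enterprise instance of a wellhead master,
    capturing relationships to equipment, crew, and location."""
    well_id: str
    name: str
    location: Tuple[float, float]          # geo-spatial (lat, lon)
    depth_ft: float
    equipment: List[str] = field(default_factory=list)
    crew: List[Associate] = field(default_factory=list)
    status: str = "permitted"              # life-cycle stage

well = WellMaster("42-501-20180", "Smith 1H", (31.9, -102.1), 9500.0)
well.equipment.append("pumpjack")
well.crew.append(Associate("A. Engineer", "crew engineer"))
well.status = "producing"                  # life-cycle change tracked on the master entity
print(well.status, len(well.crew))         # → producing 1
```

The point of the sketch is that the well's changing characteristics and its relationships all live on one governed entity, which transaction and interaction data can then reference.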
The upstream O&G industry is uniquely positioned to take advantage of the vast amounts of data from its operations. Thousands of sensors at the wellhead, millions of parts in the supply chain, global capital projects, and many highly trained staff create a data-rich environment. A well-implemented MDM solution provides a strong foundation for this data-driven industry, delivering clean, consistent, and connected information about core master data so these companies can cut material costs and IT maintenance while increasing margins.
To learn more about how you can achieve upstream operational excellence with Informatica Master Data Management, check out this recorded webinar with @OilAndGasData.
I recently met with a very enlightened insurance company that was actively turning its SWOTT analysis (the second T being trends) into concrete action. They shared with me that they view their go-forward "right to win" as determined by the quality of customer experience they deliver through their traditional channels and, increasingly, through digital channels. One marketing leader joked early on that "it's no longer about the money; it is about the experience." The marketing and business leaders I met with made it extremely clear that they have a sense of urgency to respond to what they saw as significant market changes on the horizon. What this company wanted to achieve was a single view of customer across each of its distribution channels as well as its agent population. Typical of many businesses today, they had determined that they needed an automated, holistic view into things like customer history. Smartly, this business wanted to bring together its existing customer data with its customer leads.
Using Data to Increase the Percentage of Customers that are Cross-Sold
Taking this step was seen as allowing them to understand when an existing customer is also a lead for another product. With this knowledge, they wanted to provide special offers to accelerate the conversion from lead to customer with more than one product. What they wanted to do here reminded me of the capabilities of 58.com, eBay, and other Internet pure plays. The reason for doing this well was described recently by Gartner, which suggests that business success is increasingly determined by what it calls "business moments." Without a first-rate experience that builds upon what it already knows about its customers, this insurance company worries it could be increasingly at risk from Internet pure plays. Just as important, the degree of cross-sell is for many businesses a major determinant of whether a customer is profitable or not.
Getting Customer Data Right is Key to Developing a Winning Digital Experience
To drive a first rate digital experience, this insurance company wanted to apply advanced analytics to a single view of customer and prospect data. This would allow them to do things like conduct nearest neighbor predictive analysis and modeling. In this form of analysis, “the goal is to predict whether a new customer will respond to an offer based on how other similar customers have responded” (Data Science for Business, Foster Provost, O’Reilly, 2013, page 147).
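The nearest-neighbor idea quoted above can be sketched in a few lines: score a new customer by the offer responses of the most similar past customers. The customer features, history, and `k` value below are entirely invented for illustration, not a real model:

```python
import math

# Hypothetical customer history:
# (age, policies held, years as customer) -> responded to a cross-sell offer?
history = [
    ((25, 1, 2), False),
    ((47, 3, 15), True),
    ((52, 2, 20), True),
    ((31, 1, 4), False),
    ((44, 3, 12), True),
]

def knn_predict(features, k=3):
    """Predict offer response by majority vote of the k most similar customers."""
    neighbors = sorted(history, key=lambda item: math.dist(item[0], features))[:k]
    votes = sum(1 for _, responded in neighbors if responded)
    return votes > k // 2

print(knn_predict((45, 3, 14)))  # → True: resembles past responders
```

A production version would normalize feature scales and use far more signals (the single view of customer is what supplies them), but the prediction logic is exactly this: "will a new customer respond as similar customers did?"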
What has been limiting this business, like so many others, is that its customer data is scattered across many enterprise systems. For just one division, they have more than one Salesforce instance. Yet this company's marketing team knew that to keep its customers, it needed to be able to serve them omnichannel and establish a single, unified customer experience. To make this happen, they needed, for the first time, to share holistic customer information across their ecosystems. At the same time, they knew they would need to protect their customers' privacy – i.e., only certain people would be able to see certain information. They wanted the ability to selectively mask data by role, protecting consumers in particular by allowing only certain users – in defense parlance, those with a need to know – to see a subset of the holistic set of information collected. When asked about the need for a single view of customer, the digital marketing folks openly shared that they perceived the potential for external digital market entrants – à la Porter's five forces of competition. This firm saw these entrants either taking market share from them or effectively disintermediating them from their customers over time, as more and more customers move their insurance purchasing to the Web. Given the risk, their competitive advantage needed to shift to knowing their customers better and responding to them better on the Web – which clearly includes winning new customers, in the language of Theodore Levitt.
Competing on Customer Experience
In sum, this insurance company smartly felt that it needed to compete on customer experience – a new phrase for me – and this requires superior knowledge of existing and new customers. This means they need as complete and correct a view of customers as possible, including addresses, connection preferences, and, increasingly, social media responses. It also means competitively responding to those that have honed their skills in web design, social presence, and advanced analytics. To do this, they will create predictive capabilities that make use of their superior customer data. With this prescience of thinking, this moment need not be like the strategic collision of Starbucks and fast food vendors, where the desire to grow forced competition between the existing player and new entrants wanting to claim a portion of the existing player's business.
Let's face it: building a Data Governance program is no overnight task. As one CDO puts it, "data governance is a marathon, not a sprint." Why? Because data governance is a complex business function that encompasses technology, people, and process, all of which have to work together effectively to ensure the success of the initiative. Because of its scope, Data Governance often calls for participants from different business units within an organization, and it can be disruptive at first.
Why bother, then, given that data governance is complex, disruptive, and could potentially introduce additional cost to a company? Well, the drivers for data governance vary across organizations. Let's take a close look at some of the motivations behind a data governance program.
For companies in heavily regulated industries, establishing a formal data governance program is a mandate. When a company is not compliant, the consequences can be severe. Penalties can include hefty fines, brand damage, loss of revenue, and even potential jail time for the person held accountable for the noncompliance. To meet ongoing regulatory requirements and adhere to data security policies and standards, companies need to rely on clean, connected, and trusted data that enables transparency and auditability in their reporting, so they can meet mandatory requirements and answer critical questions from auditors. Without a dedicated data governance program in place, the compliance initiative can become an ongoing nightmare for companies in regulated industries.
A data governance program can also be established to support a customer centricity initiative. To make effective cross-sells and up-sells to your customers and grow your business, you need clear visibility into customer purchasing behaviors across multiple shopping channels and touch points. Customers' shopping behaviors and attributes are captured in the data; therefore, to gain a thorough understanding of your customers and boost your sales, a holistic Data Governance program is essential.
Other reasons for companies to start a data governance program include improving efficiency, reducing operational cost, supporting better analytics, and driving more innovation. As long as the area is business critical, data is at the core of the process, and the business case is loud and clear, there is a compelling reason to launch a data governance program.
Now that we have identified the drivers for data governance, how do we start? This rather loaded question gets into the details of the implementation. A few critical elements come into consideration, including: identifying and establishing task forces such as a steering committee, a data governance team, and business sponsors; defining roles and responsibilities for the stakeholders involved in the program; and defining metrics for tracking results. And soon you will find that, on top of everything, communication, communication, and more communication is probably the most important tactic of all for driving the initial success of the program.
A rule of thumb? Start small, take one-step at a time and focus on producing something tangible.
Sounds easy, right? Well, let's hear what real-world practitioners have to say. Join us at this Informatica webinar to hear Michael Wodzinski, Director of Information Architecture; Lisa Bemis, Director of Master Data; and Fabian Torres, Director of Project Management at Houghton Mifflin Harcourt, a global leader in publishing, along with David Lyle, VP of Product Strategy at Informatica, discuss how to implement a successful data governance practice that brings business impact to an enterprise organization.
If you are currently kicking the tires on setting up a data governance practice in your organization, I’d like to invite you to visit a member-only website dedicated to data governance: http://governyourdata.com/. This site currently has over 1,000 members and is designed to foster open communication on everything data governance. There you will find conversations on best practices, methodologies, frameworks, tools and metrics. I would also encourage you to take a data governance maturity assessment to see where you currently stand on the data governance maturity curve, and compare the result against industry benchmarks. More than 200 members have taken the assessment to gain a better understanding of their current data governance programs, so why not give it a shot?
Data governance is a journey, and likely a never-ending one. We wish you the best of luck on this effort and a joyful ride! We’d love to hear your stories.
Achieving and maintaining a single, semantically consistent version of master data is crucial for every organization. As many companies move from an account- or product-centric approach to a customer-centric model, master data management is becoming an important part of their enterprise data management strategy. MDM provides the clean, consistent and connected information your organization needs to –
- Empower customer facing teams to capitalize on cross-sell and up-sell opportunities
- Create trusted information to improve employee productivity
- Be agile with data management so you can make confident decisions in a fast changing business landscape
- Improve information governance and be compliant with regulations
But there are challenges ahead for organizations. As Andrew White of Gartner very aptly wrote in a blog post, we are only half pregnant with Master Data Management. In his post, Andrew talked about the increasing number of inquiries he gets from organizations that are making some pretty simple mistakes in their approach to MDM without realizing the long-run impact of those decisions.
Over the last 10 years, I have seen many organizations struggle to implement MDM the right way. Some MDM implementations have failed, and many have taken more time and incurred more cost than planned before showing value.
So, what is the secret sauce?
A key factor in a successful MDM implementation lies in mapping your business objectives to the features and functionality offered by the product you are selecting. This is the phase where you ask the right questions and get them answered. There are a few great ways for organizations to get this done, and talking to analysts is one of them. Another option is to attend MDM-focused events that allow you to talk to experts, learn from other customers’ experiences and hear about best practices.
We at Informatica have been working hard to deliver a flexible MDM platform that provides complete capabilities out of the box. But the MDM journey is about more than just technology and product features, as we have learned over the years. To ensure our customers’ success, we are sharing the knowledge and best practices we have gained from hundreds of successful MDM and PIM implementations. The Informatica MDM Day is a great opportunity for organizations, where we will –
- Share best practices and demonstrate our latest features and functionality
- Show product capabilities that will address your current and future master data challenges
- Provide you the opportunity to learn from other customers’ MDM and PIM journeys
- Share knowledge about MDM powered applications that can help you realize early benefits
- Share our product roadmap and our vision
- Provide you an opportunity to network with other like-minded MDM and PIM experts and practitioners
So, join us by registering today for our MDM Day event in New York on February 24. We are excited to see you there and to walk with you toward MDM nirvana.
That’s right, Valentine’s Day is upon us, the day that symbolizes the power of love and has the ability to strengthen relationships between people. I’ve personally experienced 53 Valentine’s Days so I believe I speak with no small measure of authority on the topic of how to make the best of it. Here are my top five suggestions for having a great day:
- Know everything you can about the people you have relationships with
- Quality matters
- ALL your relationships matter
- Uncover your hidden or anonymous relationships
- Treat your relationships with respect all year long
OK, I admit, this is not the most romantic list ever and might get you in more trouble with your significant other than actually forgetting Valentine’s Day altogether! But, what did you expect? I work for a software company, not eHarmony!
Right. Software. Let’s put this list into the context of government agencies.
- Know everything – If your agency’s mission involves delivering services to citizens, you likely have multiple “systems of record”, each with a supposedly accurate record of all the people being tracked by that system. In reality, though, it’s rare that the data about individuals is consistently accurate and complete from system to system. The ability to centralize all the data about individuals into a single, authoritative “record” is key to improving service delivery. Such a record will enable you to ensure the citizens you serve can take full advantage of all the services available to them. Further, having a single record for each citizen has the added benefit of reducing fraud, waste and abuse.
- Quality matters – Few things hinder the delivery of services more than bad data, data with errors, inconsistencies and gaps in completeness. It is difficult, at best, to make sound business decisions with bad data. At the individual level and at the macro level, agency decision makers need complete and accurate data to ensure each citizen is fully served.
- All relationships matter – In this context, going beyond having single records to represent people, it’s also important to have single, authoritative views of other entities – programs, services, providers, deliverables, places, etc.
- Uncover hidden relationships – Too often, in the complex ecosystem of government programs and services, the inability to easily recognize relationships between people and the other entities mentioned above creates inefficiencies in the “system”. For example, it can go unnoticed that a single parent is not enrolled in a special program designed for their unique life circumstances. On the flip side, not having a full view of hidden relationships also opens the door for the less scrupulous in society, giving them the ability to hide their fraudulent activities in plain sight.
- Treat relationships respectfully all year – Data hygiene is not a one-time endeavor. Having the right mindset, processes and tools to implement and automate the process of “mastering” data as an on-going process will better ensure the relationship between your agency and those it serves will remain positive and productive.
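To make the “mastering” idea above concrete, here is a minimal sketch of what matching records from multiple systems and merging them into a single, authoritative “golden record” can look like. This is an illustrative simplification only: the field names and the exact-match rule are hypothetical, and production MDM platforms use far more sophisticated probabilistic matching and survivorship rules.

```python
from collections import defaultdict

def match_key(record):
    """Naive match rule: normalized name plus date of birth.
    Real MDM engines use fuzzy/probabilistic matching instead."""
    name = record["name"].strip().lower().replace(".", "")
    return (name, record["dob"])

def build_golden_records(records):
    """Group records that refer to the same person, then merge each
    group into one 'golden' record. Survivorship here is simply
    'first non-empty value wins' for each attribute."""
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec)

    golden = []
    for recs in groups.values():
        merged = {}
        for rec in recs:
            for field, value in rec.items():
                if value and not merged.get(field):
                    merged[field] = value
        merged["source_count"] = len(recs)  # how many systems contributed
        golden.append(merged)
    return golden

# The same citizen as seen by two hypothetical agency systems
records = [
    {"name": "Jane Q. Public", "dob": "1970-02-14", "phone": "", "address": "12 Main St"},
    {"name": "jane q public", "dob": "1970-02-14", "phone": "555-0100", "address": ""},
]
for g in build_golden_records(records):
    print(g)
```

Note how the merged record ends up more complete than either source record alone: the phone number from one system fills the gap in the other, which is exactly the “quality matters” point above.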
I may not win the “Cupid of the Year” award, but, I hope my light-hearted Valentine’s Day message has given you a thing or two to think about. Maybe Lennon and McCartney are right, between people, “Love is All You Need”. But, we at Informatica believe for Government-Citizen relationships, a little of the right software can go a long way.