If you’ve spent time studying and practicing data governance, you’ll agree that it is a challenging yet rewarding endeavor. Across industries, a growing number of organizations have put data governance programs in place so they can more effectively manage their data to drive business value. But the reality is that data governance is a complex process, and most companies practicing it today are still in the early phase of a very long journey. In fact, according to the results of more than 240 completed data governance assessments on http://governyourdata.com/, a community website dedicated to everything data governance, the average data governance maturity score is only 1.6 out of 5.

It’s no surprise that data governance was a hot topic at last week’s Informatica World 2015. Over a dozen presentations and panel discussions on data governance were delivered; practitioners across various industries shared real-world stories on topics ranging from how to kick-start a data governance program, how to build business cases for data governance, and frameworks and stewardship management, to the choice of technologies. For me, the key takeaways are:
- Old but still true – To do data governance the right way, you must start small and focus on achieving tangible results. Leverage the small victories to advance to the next phase.
- Be prepared to fail more than once while building a data governance program. But don’t quit, because your data will not.
- One size doesn’t fit all when it comes to building a data governance framework, which is a challenge for organizations, as there is no magic formula that companies can immediately adopt. Should you build a centralized or federated data governance operation? Well, that really depends on what works within your existing environment.
In fact, when asked “what’s the most challenging area for your data governance effort” in our recent survey conducted at Informatica World 2015, “Identify roles and responsibilities” got the most mentions. The basic principle? Choose a framework that blends well with your company’s culture.
- Let’s face it: data governance is not an IT project, nor is it about fixing data problems. It is a business function that calls for people, process, and technology working together to obtain the most value from your data. Our seasoned practitioners recommend a systematic approach. Your first priority should be gathering people: identifying the right people with the right skills and, most importantly, those who have a passion for data. Next is figuring out the process. Things to consider include: What are the requirements for data quality? What metrics and measurements should be used for examining the data? How should exceptions be handled and data issues remediated? How can security measures be quickly identified and applied to the various data sets? The third priority is selecting the right technologies to implement and facilitate those processes, transforming the data so it can be used to help meet business goals.
- “Engage your business early on” is another important tip from our customers who have achieved early success with their data governance program. A data governance program will not be sustainable without participation from the business. The reason is simple – the business owns the data, they are the consumers of the data and have specific requirements for the data they want to use. IT needs to work collaboratively with business to meet those requirements so the data is fit for use, and provides good value for the business.
- Scalability, flexibility and interoperability should be the key considerations when it comes to selecting data governance technologies. Your technology platform should be able to easily adapt to the new requirements arising from the changes in your data environment. A Big Data project, for example, introduces new data types, increased data speed and volume. Your data management solution should be agile enough to address those new challenges with minimum disruption to your workflow.
Data governance is HOT! The well-attended sessions at Informatica World, as well as some of our previously hosted webinars, are a testament to the enthusiasm among our customers, partners, and our own employees on this topic. It’s an exciting time for us at Informatica because we are in a great position to help companies build an effective data governance program. In fact, many of our customers have been relying on our industry-leading data management tools to support their data governance programs, and have achieved results in many business areas, such as meeting compliance requirements, improving customer centricity, and enabling advanced analytics projects. To continue the dialogue and facilitate further learning, I’d like to invite you to an upcoming webinar on May 28 to hear some insightful, pragmatic tips and tricks for building a holistic data governance program from industry expert David Loshin, Principal at Knowledge Integrity, Inc., and Informatica’s own data governance guru, Rob Karel.
“Better data is everyone’s job” – well said by Terri Mikol, director of Data Governance at University of Pittsburgh Medical Center. For companies striving to leverage data to deliver business value, everyone within the company should treat data as a strategic asset and take on responsibilities for delivering clean, connected and safe data. Only then can your organization be considered truly “Data Ready”.
What is a Data Ready Enterprise?
One that is able to treat data as a strategic asset throughout the organization. One that invests in its enterprise architecture to build a foundation of data ‘readiness’. One that incorporates data into its culture and uses it to drive high performance teams.
Data Drives Profit
In a recent research study, ‘Data Drives Profit in a Data-Ready Enterprise’, more than half of respondents agreed that ‘an effective data strategy can be a competitive advantage for companies’. Yet fewer than 25% of those same respondents stated ‘our data management is good enough to satisfy our current needs’.
So we know that data is important – yet are we challenged in creating a data-ready culture? Are we really only mediocre at best when it comes to managing data? With digital transformation initiatives in full swing (or behind us, for the movers and shakers) and our key business processes automated, how do we pivot from focusing on the application to focusing on the data? It starts with defining business strategies at the top and building the capabilities necessary throughout the organization to become data ready.
What are the unique business strategies and capabilities required to be data-ready?
In most enterprises, there is a function that focuses specifically on corporate strategy, typically as part of the C-suite. In order to transform into the data-ready enterprise, this function will need to define policies, corporate goals, and initiatives that focus the rest of the organization on adopting data management best practices and driving the necessary change. This may even include creating a dedicated organization for the data management function and capabilities, with a leader, such as a Chief Data Officer, who may not necessarily report to the CIO.
In order to make key business decisions based on a data-ready framework, organizations should leverage their enterprise architecture teams to identify data assets that are required to support each business function’s needs. Once those assets have been identified, key capabilities – such as data quality, data integration, mastering data, and data governance – need to be developed and matured. Rather than boiling the ocean, start with a key business initiative that would benefit the most from a focus on the data itself.
How do you get started?
If your organization is just starting on its transformational journey to becoming a data ready enterprise, focus on one or two key business initiatives that will significantly benefit from a data ready framework. For example, most Informatica customers start with one business initiative, such as ‘Total Customer Experience’ or ‘Next Generation Analytics’, to build their data ready organizational capabilities. By leveraging the enterprise architecture team, identify common requirements and technologies that can be leveraged across multiple initiatives, and focus on quick wins. Here is where an Intelligent Data Platform approach can offer significant value over point solutions or projects (deploy once, leverage everywhere).
If you are at the very beginning just trying to get support for why this transformation is critical for your business, download the ‘Data drives profit in the data ready enterprise’ ebook. If your executive management team agrees and is looking for how to get started, download the ‘How to organize the data ready enterprise’ ebook.
It’s time to get ready – data-ready!
The rapid advancements we are seeing in social, mobile, and other digital technologies have transformed the way we live. My commute to the airport for business travel takes three taps on a smartphone app, and a leading retailer I shop at frequently knows what I like, which products I have put in my cart using my iPad, and which products I “liked” on their social channels, so they can make real-time recommendations while I am in their physical store. Their employees are now armed with information that is integrated in ways that empower them to be more customer-centric than ever before.
All these rapid changes over just a decade are brought to us by companies that pioneered the digital transformation, and data is at the center of this revolution. These organizations gained competitive advantage from social, mobile, analytics, cloud, and Internet of Things technologies. In a world filled with data of different variety, volume, and velocity, it’s more important than ever for organizations to become data-ready. While the potential for insight in big data is massive, we need a new generation of Master Data Management to realize all of that potential.
Against the backdrop of this rapid, data-fueled change, a growing number of companies are realizing that they now have a massive opportunity in front of them. At the center of this digital transformation is master data, which provides an opportunity for organizations to:
- Better understand customers, their household and relationships so they can do effective cross-sell and up-sell
- Identify customers interacting with the company via different channels so they can push relevant offers to these customers in real time
- Offer better product recommendations to customers based on their purchasing and browsing behavior
- Optimize the way they manage their inventory leading to significant cost savings
- Manage supplier relationships more effectively so they can negotiate better rates
- Provide superior patient care and cure harmful diseases at an early stage by creating patient-centric solutions that connect health information from more and more sources
- Master wellhead and other upstream exploration and production assets so they can do better crew allocation and production planning
- Comply with complex and ever-changing government regulations, avoiding significant fines and penalties
We will talk about all this and more at Informatica World 2015 which is happening next week in Las Vegas. Join us for the MDM Day on May 12 followed by Information Quality and Governance track sessions on May 13 and 14. Register now.
We have 37 sessions that cover Master Data Management, Omnichannel Commerce, Data Quality, Data as a Service and Big Data Relationship Management. You get a chance to learn from Informatica’s customers about their experience, best practices from our partners and our vision and roadmap straight from our product management team. We will also talk about master data fueled Total Customer Relationship and Total Supplier Relationship applications that leverage our industry leading multidomain MDM platform.
Here is your guide to the sessions that will be covered. I will see you there. If you want to say hello in person, reach out to me at @MDMGeek and follow the @InfaMDM Twitter handle for all the latest news. The hashtag for this event is #INFA15.
The big news out of my hometown is its battle over the past few weeks to be named Kraft’s Hockeyville — a competition to find the biggest ice hockey town in the United States. Johnstown, a former steel city in Western Pennsylvania famous for the movie Slap Shot, has made it to the final round. The community has been encouraging each other to cast their votes through every means necessary. They have been using social media posts, email blasts, and are even calling local hockey enthusiasts, which is how I heard about this competition — by mistake.
Earlier this week I received a call to my mobile phone asking me to vote in the competition. However, the person on the other end was looking for someone named “Tyler” – apparently a frequent shopper at a local sporting goods store not far from where my parents live.
When I moved away years ago, like so many others, I never changed my mobile phone number. I still have a Pennsylvania number even though I live in North Carolina now. The local sporting goods store called to reach “Tyler,” someone they knew was interested in hockey based on his purchasing behavior. When they couldn’t reach “Tyler” with the inaccurate phone number that they had in their system, they changed the final digit by one and contacted me instead. While this isn’t a great way to source a better phone number for your customers, it made me think. What if this call was more critical? What if it was urgent that this company reach “Tyler”? What if “Tyler” was waiting for this phone call? Talk about a negative customer experience!
A Missed Call, a Missed Customer Opportunity
This brings me full circle to our newest product offering, an update to our Phone Validation product that verifies phone numbers throughout the world. For telephone numbers in over 240 countries and territories, the Phone Validation offering from Informatica Data as a Service will ensure accuracy and identify the phone type (mobile, landline, toll-free, VoIP) as well as the phone carrier (such as Verizon, AT&T, or Vodafone). The service will also identify geographic data, such as locality information including time zone, so that you don’t call “Tyler” in the middle of the night to encourage him to vote.
The real power of the Phone Validation service comes when it’s combined with our Email Verification and Address Verification services to validate all of the most critical components of a customer’s contact record. This Contact Record Verification service can be easily implemented anywhere that contact data, such as phone numbers, email addresses, and postal addresses, are acquired or stored. This will ensure that when you try to reach a contact in an urgent situation that you can reach the right person, the first time – without fail.
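To make the idea concrete, here is a minimal sketch of the kinds of per-field checks such a verification flow performs. This is an illustration only, not the Informatica API: a real service validates against live carrier, mailbox, and postal reference data, while these regex checks test only basic syntactic plausibility.

```python
import re

# Simplified illustration of contact-record checks. A real verification
# service confirms numbers and addresses against live reference data;
# these patterns only test basic syntactic plausibility.

E164_PHONE = re.compile(r"^\+[1-9]\d{6,14}$")            # E.164: +, then up to 15 digits
BASIC_EMAIL = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")  # minimal syntactic email check

def verify_contact(record: dict) -> dict:
    """Return a per-field pass/fail report for one contact record."""
    return {
        "phone_ok": bool(E164_PHONE.match(record.get("phone", ""))),
        "email_ok": bool(BASIC_EMAIL.match(record.get("email", ""))),
        "address_ok": bool(record.get("postal_address", "").strip()),
    }

# Hypothetical record for "Tyler" -- the values are made up.
report = verify_contact({
    "phone": "+18145550123",
    "email": "tyler@example.com",
    "postal_address": "123 Main St, Johnstown, PA 15901",
})
```

A record that fails any of these cheap syntactic checks can be rejected at capture time, before it ever reaches a paid verification call.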
Does Your Company Reach the Right Customer, the First Time?
To learn more about our Phone Validation and Contact Record Verification services, check out this informative webinar recording.
Oh – and good luck to Johnstown. You have my vote, and I’ll be interested to see this weekend if you bring home the prize to Flood City!
You know that old saying, “What’s the best way to eat an elephant?” “One bite at a time.”
Much like the daunting task of eating an elephant, implementing an MDM solution can seem staggering. Many times, the implementation team gets mired in the details, wanting to create a solution that is the answer to everyone’s problems and as a result never gets started due to the overwhelming nature of the undertaking. After many successful MDM implementations – I’d like to recommend the following approach to eating your elephant:
- Clearly establish the business problem you are trying to solve
According to the Gartner Group, “MDM is a business-driven, technology-enabled environment and program.” It is really important to remember that the reason you are working on an MDM project is to satisfy a specific business need aligned with the overall business strategy. Once aligned – it is easier to show value to the organization when it is successfully implemented.
For instance – if you want to reduce readmission costs for a specific segment of your member population, this could have an impact on several potential business goals in the payer market, including improving customer satisfaction and lowering the cost of care for specific populations. Improving customer satisfaction is a pretty big elephant – but focusing on reducing readmission costs is much more manageable. And it will impact two of your business goals! Sutter Health has taken this approach and is releasing new use cases every 90 days.
- Understand and document your current state
Now that you know what problem you are attempting to solve – you will need to understand the current level of maturity of your current approach and the resources it will take to implement your solution. You will need to consider your available resources, understand the amount of time and money it may take to implement a solution, and obtain visibility into where your organization currently stands.
Using the readmission example – you are going to need to identify where readmission data is available. How often is this data updated? What is the quality of the data? How difficult is it to get access to the data? Who are the resources that are responsible for readmission data? Are they distracted by many competing requests? What are some of the data sources that could impact readmission rates that are not currently part of your data sources? For instance – understanding the living arrangement of the member prior to discharge can have an impact on readmission rates (members living alone have a much higher readmission rate than those living with others). If you can identify a household through social media data, you can better predict who will be going home alone.
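Many of these current-state questions can be answered with lightweight profiling before any MDM work begins. A minimal sketch in Python, where the field names, freshness threshold, and sample rows are illustrative assumptions rather than any real system’s schema:

```python
from datetime import date

# Illustrative profiling of a readmission data set: completeness
# ("what is the quality of the data?") and freshness ("how often is
# this data updated?") answered with simple counts.

def profile_readmissions(rows, today, max_age_days=30):
    total = len(rows)
    # A row is "complete" if the key identifying fields are populated.
    complete = sum(1 for r in rows
                   if r.get("member_id") and r.get("discharge_date"))
    # A row is "fresh" if its discharge date is recent enough to act on.
    fresh = sum(1 for r in rows
                if r.get("discharge_date")
                and (today - r["discharge_date"]).days <= max_age_days)
    return {
        "completeness": complete / total if total else 0.0,
        "freshness": fresh / total if total else 0.0,
    }

rows = [
    {"member_id": "M1", "discharge_date": date(2015, 5, 1)},
    {"member_id": "M2", "discharge_date": date(2015, 2, 1)},  # stale
    {"member_id": None, "discharge_date": date(2015, 5, 3)},  # incomplete
]
stats = profile_readmissions(rows, today=date(2015, 5, 15))
```

Even crude numbers like these give the implementation team a defensible baseline for the “current state” conversation with data owners.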
- Clearly define success
There is a business adage (much like eating an elephant!) that says “you can’t manage what you can’t measure.” Before you implement your MDM solution, you need to identify which metrics are most important and demonstrate success. These should align with the goals you set in #1 (which should align with the goals of the business). You should also identify specific business outcomes that the metrics can apply to.
What would be a reasonable measure of success for reduction of readmission rates?
- Moving from a 15% readmission rate to an 8% readmission rate?
- Moving from a 15% readmission rate to a 10% readmission rate? In what amount of time?
- Perhaps moving from a 15% readmission rate to a 13% readmission rate in a year after implementation?
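When weighing these candidate targets, it helps to state both the absolute (percentage-point) drop and the relative reduction, since “15% to 8%” is a 7-point drop but nearly a 47% relative improvement. A tiny helper makes the distinction explicit:

```python
def reduction(baseline: float, target: float):
    """Absolute (percentage-point) and relative reduction for a rate goal."""
    absolute = baseline - target
    relative = absolute / baseline
    return absolute, relative

# Candidate target from the list above: 15% readmissions down to 8%.
points, rel = reduction(0.15, 0.08)
```

Reporting both figures avoids a common success-metric ambiguity: “we cut readmissions by 7%” could mean either number.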
- Build a governance hierarchy
According to Gartner Group, “Governance is about decision making and ensuring that you have an authority framework that takes decisions and is able to measure the execution of those decisions.” An effective governance program requires a well-defined hierarchy, headed by a sponsor — someone in a position of authority who carries the necessary weight and cross-departmental authority to make MDM governance a reality. How can you find out how mature your organization is from a data governance perspective? There are several vendors and websites that can help with that, including www.governyourdata.com.
You need to be able to make assessments about the quality of the source data and the data lineage (where and how the data feeding your MDM solution has been modified) for compliance reporting requirements, as well as establish a process for supporting your data quality. There are 10 data governance insights from UPMC here.
Now – what is the smallest bite you can define for your organization? By laying out your business priorities, identifying your current state, creating a measurable goal and a method for governing – you have put clear boundaries on what you are trying to accomplish. Once you’ve established your credibility on your ability to finish your first bite successfully, the next bite will be easier!
If you’ve been contemplating transforming your business and making it a priority to embrace digital transformation, this year’s Informatica World 2015 in Las Vegas has a series of B2B and B2C sessions for you. Here are some I’d like to recommend:
B2B Commerce Ready Data
PartsSource Improves Customer Experience with Product Information in Medical Parts
Brian Thomas, Director of Application, PartsSource
A leading provider of medical replacement parts solutions, PartsSource will discuss how it uses Informatica PIM as the foundation for its omnichannel strategy and to support its industry-first one-stop shop online catalog for medical parts. PartsSource will explain how Informatica helped reduce the time needed to launch and update products from 120 minutes to 2 minutes using fewer employees, making it easier than ever to connect over 3,300 hospitals to thousands of OEMs and suppliers.
B2C Commerce Ready Data
Elkjop and Monsanto: PIM and MDM to Manage Product and Customer Data
Thomas Thykjer, Master Data Architect, Elkjop Nordics AS
Jim Stellern, US Commercial Data Management and BI IT Lead, Monsanto
Elkjop, the largest consumer electronics retailer in the Nordic countries, increased both its product range offering and the quality of its product information across its entire portfolio by leveraging the strong embedded Data Quality tools found in the Informatica PIM. Elkjop will also discuss how they reduced their yearly development costs and operating costs.
Monsanto developed a strategy to address customer data quality issues, clearly articulated the business value of implementing the strategy, and successfully implemented the solution leveraging Informatica MDM and a new data governance program.
Omnichannel Ready: What’s New in PIM 8?
Latest Features, Demo and Roadmap
Stefan Reinhardt, Senior Product Manager PIM, R&D DEU PIM, Informatica
Ben Rund, Sr. Director Product Marketing, Information Quality Solutions, Informatica
Are you commerce ready? Can you say the same about your data? Are you focused on the betterment of your supply chain, marketing, ecommerce, merchandising, and category management? This session is aimed at those wanting to sell more products, faster.
Learn what’s new with PIM 8:
- Kits and bundles for superior cross-selling
- Architecture for high data volumes
- Role- and task-based web interfaces for increased efficiency on product content collaboration
- Business user dashboard
- 1WorldSync data pool syndication for compliant CPG data
- Product Data as a Service (DaaS) for price intelligence and content benchmarking.
We will cover all these new features during the session in a live demo.
Furthermore, we will cover the PIM roadmap, showcasing how PIM is evolving and gaining MDM-like features by fueling product data apps for different use cases and industries, leveraging the Intelligent Data Platform.
User Group Session for Omnichannel
Are you looking to talk to the experts? Don’t miss out on our user group sessions where you will be able to discuss and work directly with the R&D and product management experts. They will be there to answer your questions as well as hear your thoughts and feedback. It’s your opportunity to be heard and get an answer to your question, don’t miss out!
37 Sessions to Master the Digital Transformation
A total of 37 sessions, focused on Master Data Management and Information Governance, the key disciplines and foundation of successfully mastering digital transformation, will be held on May 12 and 13.
The Good, The Bad, and The Ugly! Experiencing Great (and Not So Great) Data-Driven Marketing in Day-to-Day Life
I spend my professional life helping marketers get the most out of their data. So it really hits me when I experience real life case studies in my day-to-day personal life. When I do run across these great (and not so great) experiences, it broadens my perspective and helps me figure out new ways that I can help our customers use clean, safe, and connected data to revolutionize their marketing efforts.
I feel compelled to blog about these great customer experiences in hopes that other marketers can learn from these encounters like I did. So this is the first installment of an ongoing series of blog posts about where I’ve experienced great (and not so great) data-driven marketing experiences.
The Good

I love my insurance company for many reasons, but the last experience I had with them was truly exceptional. I had been on their website getting an insurance quote for a new home. I still had a few questions, so I initiated an online chat. The online representative quickly and efficiently answered my questions, and I left the experience with a thorough understanding of the potential policy. About a week later, seemingly unrelated, I put in a claim through a local vendor for a chipped windshield repair. Then two weeks later, I called back, and the phone system recognized my mobile number and, through the automated system, asked me if I was calling about the homeowner’s policy quote I had received a few weeks ago. I pressed 1 for yes, and here’s where the impressive part begins…
A representative quickly answered the phone. He was able to view all of the information I had put into the online quote tool. He then referenced the question I had asked the representative in the online chat, and asked if I had any further questions about it. Within minutes I had my new policy set up, but it didn’t stop there. He asked if the windshield chip I had the previous week had been repaired to my satisfaction. Then he noticed that although I do have a credit card through them, I haven’t used it in some time, and offered me a 0% balance transfer offer for 36 months with no transfer fee on that card. Heck, we just bought a new house and I know there will be plenty of expenses, so sign me up! Finally, he noticed that because I was about to move to a new zip code, my car insurance rates would be going down slightly and offered to send me to the automobile insurance team to make the change to my policy.
The system clearly tied all of my recent and past activities from various channels together, analyzed them, and leveraged some sort of recommendation engine to guide the customer support representative to provide truly customized service. You can be assured I love my insurance company even more, and I will be dusting off a stagnant account that I hadn’t used in years. A+.
The Bad

Oh how I wish my bank would embrace Total Customer Relationship. My husband, my children, and I have far too many accounts at our bank. Each child has a savings account, we have a joint checking account, we have a savings account, he has a personal savings account, I have a personal savings account, and then there’s the cash reserve line, the credit card, and a few CDs. We recently sold a home, and the now-paid-off mortgage was through them. Plus we’ve been customers for almost 20 years. So suffice it to say, we have been loyal customers.
Well, the other day, I had to get a cashier’s check for a school activity for one of my children. I know, it’s pretty strange that they needed a cashier’s check, but I digress. I went to pull half out of my son’s minor savings account and half out of my individual checking account. Neither of those accounts has much money in it, nor has either been open for very long. Call me spoiled, but I’m used to getting these types of service fees waived for my long tenure and deep relationship with our bank. But because the accounts I pulled the money from didn’t have that kind of history, they weren’t willing to waive the fees. I was in a hurry because I was running late after shopping at a shoe store (see “The Ugly” below), so I didn’t have the time or energy to fight it. I paid the darn $5 fee, but I was irritated. Clearly, they couldn’t easily see the total customer relationship I have with their institution. They aren’t leveraging their data to tell a complete story, and they missed an opportunity to show a loyal customer a great experience.
And The Ugly
A few days ago, I was at a shoe store picking up some new soccer cleats for one of my children. I had gotten an email offer for 30% off, so I pulled up the email and prepared to use it at the cash register. For whatever reason, the email had a blank where a code was supposed to be, and the woman at the register, despite her best efforts, couldn’t use it. So, trying to be helpful, she looked at my loyalty account and, lo and behold, I had $40 worth of rewards points that I didn’t even know existed. But I first had to download an app to issue a coupon using those points. I downloaded the app, put in my loyalty number, and no points were available.
Turns out, I had two accounts, but they weren’t linked despite having the same phone number and name. One had an address that was misspelled in the system, so it apparently wouldn’t merge with the other account. Oh, data quality and address validation, how I missed you at that moment! She corrected the address, informed me that it would now merge the two accounts, and told me to try to log in again. Of course, I knew there was no way this was a real-time, or even near-real-time, process, but she was insistent. So I tried again: nothing.
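What blocked the merge here is a classic record-matching problem: two accounts that agree on name and phone but differ on a misspelled address. A toy sketch of how address normalization lets a match rule fire; real MDM engines use weighted, probabilistic fuzzy matching rather than this exact comparison, and the fields and sample records are illustrative:

```python
import re

# Toy match rule: two loyalty accounts match if name and phone agree
# and the *normalized* addresses agree. This shows why a raw-string
# comparison on a misspelled address blocks the merge.

def normalize_address(addr: str) -> str:
    addr = addr.lower().strip()
    addr = re.sub(r"\bstreet\b", "st", addr)   # canonicalize a common suffix
    addr = re.sub(r"[^a-z0-9 ]", "", addr)     # drop punctuation
    return re.sub(r"\s+", " ", addr)           # collapse whitespace

def is_match(a: dict, b: dict) -> bool:
    return (a["name"].lower() == b["name"].lower()
            and a["phone"] == b["phone"]
            and normalize_address(a["address"]) == normalize_address(b["address"]))

# Hypothetical duplicate pair: formatting differs, the address agrees.
acct1 = {"name": "Jane Doe", "phone": "8145550123", "address": "123 Main Street"}
acct2 = {"name": "Jane Doe", "phone": "8145550123", "address": "123  main st."}
```

Normalization handles formatting noise, but a genuine misspelling (“Mian” for “Main”) still defeats exact comparison, which is why production matching adds edit-distance or phonetic scoring on top.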
The woman couldn’t have been nicer, but poor data quality processes and long batch windows had her hands tied. I was advised to call the customer support line, but of course it was a Sunday afternoon and nobody was there to pick up. So 45 minutes later, I left the store, irritated and very late, and without the shoes I was going to purchase out of principle. In the future, I’ll be going down the street and shopping at another shoe store – it’s my own personal strike against antiquated, inaccurate, incomplete, and painfully slow data processes which result in bad customer experiences!
In The End…
In the greater scheme of things, these varying degrees of customer experience “misses” aren’t exactly a crisis. It’s not curing cancer or solving world hunger, but to consumers, having a great customer experience is really important. Wouldn’t you rather have your customers raving about a great experience, than grumbling about a bad one, or losing a customer due to an ugly one?
We marketers can make the difference! We own that end-to-end omnichannel experience. We need to make sure that our data is clean, safe, and connected so we can provide our customers what they expect and, frankly, deserve from us.
Informatica’s Total Customer Relationship Solution empowers organizations with confidence, knowing that they have access to the kind of great customer data that allows them to surpass customer acquisition and retention goals by providing consistent, integrated, and seamless customer experiences across channels. The end result? Great experiences that customers are inspired to share with their family and friends at dinner parties and on blog posts like this one.
Want to learn more? Check out these webinars to see how Informatica and our customers and partners are revolutionizing the customer experience.
At the recent Bosch Connected World conference in Berlin, Stefan Bungart, Software Leader Europe at GE, presented a very interesting keynote, “How Data Eats the World”, which I assume refers to Marc Andreessen’s statement that “software eats the world”. One of the key points he addressed was that generating actionable insight from Big Data — securely, in real time, at every level from local to global, and at industrial scale — will be the key to survival. Companies that do not invest in DATA now will eventually end up like the consumer companies that missed the Internet: it will be too late.
As software and the value of data become a larger part of the business value chain, the lines between different industries become blurrier, or as GE’s Chairman and CEO Jeff Immelt once stated: “If you went to bed last night as an industrial company, you’re going to wake up today as a software and analytics company.” This is not only true for industrial companies, but for many companies that produce “things”: cars, jet engines, boats, trains, lawn mowers, toothbrushes, nut-runners, computers, network equipment, and so on. GE, Bosch, Technicolor, and Cisco are just a few of the industrial companies that offer an Internet of Things (IoT) platform. By offering an IoT platform, they enter the domains of companies such as Amazon (AWS) and Google. And as Google and Apple move into new areas such as manufacturing cars and watches and offering insurance, industry lines are becoming blurred and service becomes the key differentiator. The best service offerings will be contingent upon the best analytics, and the best analytics require a complete and reliable data platform. Only companies that can leverage data will be able to compete and thrive in the future.
The idea of this “servitization” is that instead of selling assets, companies offer service that utilizes those assets. For example, Siemens offers a service for body-scans to hospitals instead of selling the MRI scanner, Philips sells lightning services to cities and large companies, not the light bulbs. These business models enable suppliers to minimize disruption and repairs as this will cost them money. Also, it is more attractive to have as much functionality of devices in software so that upgrades or adjustments can be done without replacing physical components. This is made possible by the fact that all devices are connected, generate data and can be monitored and managed from another location. The data is used to analyse functionality, power consumption, usage , but also can be utilised to predict malfunction, proactive maintenance planning, etc.
So what impact does this have on data and on IT? First of all, the volumes are immense. Whereas the total global volume of for example Twitter messages is around 150GB, ONE gas-turbine with around 200 sensors generates close to 600GB per day! But according to IDC only 3% of potentially useful data is tagged and less than 1% is currently analysed. Secondly, the structure of the data is now always straightforward and even a similar device is producing different content (messages) as it can be on a different software level. This has impact on the backend processing and reliability of the analysis of the data.
Also the data often needs to put into context with other master data from thea, locations or customers for real-time decision making. This is a non-trivial task. Next, Governance is an aspect that needs top-level support. Questions like: Who owns the data? Who may see/use the data? What data needs to be kept or archived and for how long? What needs to be answered and governed in IoT projects with the same priorities as the data in the more traditional applications.
To summarize, managing data and mastering data governance is becoming one of the most important pillars of companies that lead the digital age. Companies that fail to do so will be at risk for becoming a new Blockbuster or Kodak: companies that didn’t adopt quickly enough. In order to avoid this, companies need to evaluate a data platform can support a comprehensive data strategy which encapsulates scalability, quality, governance, security, ease of use and flexibility, and that enables them to choose the most appropriate data processing infrastructure, whether that is on premise or in the cloud, or most likely a hybrid combination of these.
The Oil and Gas (O&G) industry is an important backbone of every economy. It is also an industry that has weathered the storm of constantly changing economic trends, regulations and technological innovations. O&G companies by nature have complex and intensive data processes. For a profitable functioning under changing trends, policies and guidelines, O&G’s need to manage these data processes really well.
The industry is subject to pricing volatility based on microeconomic pattern of supply and demand affected by geopolitical developments, economic meltdown and public scrutiny. The competition from other sources such as cheap natural gas and low margins are adding fuel to the burning fire making it hard for O&G’s to have a sustainable and predictable outcome.
A recent PWC survey of oil and gas CEOs similarly concluded that “energy CEOs can’t control market factors such as the world’s economic health or global oil supply, but can change how they respond to market conditions, such as getting the most out of technology investments, more effective use of partnerships and diversity strategies.” The survey also revealed that nearly 80% of respondents agreed that digital technologies are creating value for their companies when it comes to data analysis and operational efficiency.
O&G firms run three distinct business operations; upstream exploration & production (E&P’s), midstream (storage & transportation) and downstream (refining & distribution). All of these operations need a few core data domains being standardized for every major business process. However, a key challenge faced by O&G companies is that this critical core information is often spread across multiple disparate systems making it hard to take timely decisions. To ensure effective operations and to grow their asset base, it is vital for these companies to capture and manage critical data related to these domains.
E&P’s core data domains include wellhead/materials (asset), geo-spatial location data and engineer/technicians (associate). Midstream includes trading partners and distribution and downstream includes commercial and residential customers. Classic distribution use cases emerge around shipping locations, large-scale clients like airlines and other logistics providers buying millions of gallons of fuel and industrial lube products down to gas station customers. The industry also relies heavily on reference data and chart of accounts for financial cost and revenue roll-ups.
The main E&P asset, the well, goes through its life cycle and changes characteristics (location, ID, name, physical characterization, depth, crew, ownership, etc.), which are all master data aspects to consider for this baseline entity. If we master this data and create a consistent representation across the organization, it can then be linked to transaction and interaction data so that O&G’s can drive their investment decisions, split cost and revenue through reporting and real-time processes around
- Crew allocation
- Royalty payments
- Safety and environmental inspections
- Maintenance and overall production planning
E&P firms need a solution that allows them to:
- Have a flexible multidomain platform that permits easier management of different entities under one solution
- Create a single, cross-enterprise instance of a wellhead master
- Capture and master the relationships between the well, equipment, associates, land and location
- Govern end-to-end management of assets, facilities, equipment and sites throughout their life cycle
The upstream O&G industry is uniquely positioned to take advantage of vast amount of data from its operations. Thousands of sensors at the well head, millions of parts in the supply chain, global capital projects and many highly trained staff create a data-rich environment. A well implemented MDM brings a strong foundation for this data driven industry providing clean, consistent and connected information about the core master data so these companies can cut the material cost, IT maintenance and increase margins.
To know more on how you can achieve upstream operational excellence with Informatica Master Data Management, check out this recorded webinar with @OilAndGasData
Let’s face it, building a Data Governance program is no overnight task. As one CDO puts it: ”data governance is a marathon, not a sprint”. Why? Because data governance is a complex business function that encompasses technology, people and process, all of which have to work together effectively to ensure the success of the initiative. Because of the scope of the program, Data Governance often calls for participants from different business units within an organization, and it can be disruptive at first.
Why bother then? Given that data governance is complex, disruptive, and could potentially introduce additional cost to a company? Well, the drivers for data governance can vary for different organizations. Let’s take a close look at some of the motivations behind data governance program.
For companies in heavily regulated industries, establishing a formal data governance program is a mandate. When a company is not compliant, consequences can be severe. Penalties could include hefty fines, brand damage, loss in revenue, and even potential jail time for the person who is held accountable for being noncompliance. In order to meet the on-going regulatory requirements, adhere to data security policies and standards, companies need to rely on clean, connected and trusted data to enable transparency, auditability in their reporting to meet mandatory requirements and answer critical questions from auditors. Without a dedicated data governance program in place, the compliance initiative could become an on-going nightmare for companies in the regulated industry.
A data governance program can also be established to support customer centricity initiative. To make effective cross-sells and ups-sells to your customers and grow your business, you need clear visibility into customer purchasing behaviors across multiple shopping channels and touch points. Customer’s shopping behaviors and their attributes are captured by the data, therefore, to gain thorough understanding of your customers and boost your sales, a holistic Data Governance program is essential.
Other reasons for companies to start a data governance program include improving efficiency and reducing operational cost, supporting better analytics and driving more innovations. As long as it’s a business critical area and data is at the core of the process, and the business case is loud and sound, then there is a compelling reason for launching a data governance program.
Now that we have identified the drivers for data governance, how do we start? This rather loaded question really gets into the details of the implementation. A few critical elements come to consideration including: identifying and establishing various task forces such as steering committee, data governance team and business sponsors; identifying roles and responsibilities for the stakeholders involved in the program; defining metrics for tracking the results. And soon you will find that on top of everything, communications, communications and more communications is probably the most important tactic of all for driving the initial success of the program.
A rule of thumb? Start small, take one-step at a time and focus on producing something tangible.
Sounds easy, right? Well, let’s hear what the real-world practitioners have to say. Join us at this Informatica webinar to hear Michael Wodzinski, Director of Information Architecture, Lisa Bemis, Director of Master Data, Fabian Torres, Director of Project Management from Houghton Mifflin Harcourt, global leader in publishing, as well as David Lyle, VP of product strategy from Informatica to discuss how to implement a successful data governance practice that brings business impact to an enterprise organization.
If you are currently kicking the tires on setting up data governance practice in your organization, I’d like to invite you to visit a member-only website dedicated to Data Governance: http://governyourdata.com/. This site currently has over 1,000 members and is designed to foster open communications on everything data governance. There you will find conversations on best practices, methodologies, frame works, tools and metrics. I would also encourage you to take a data governance maturity assessment to see where you currently stand on the data governance maturity curve, and compare the result against industry benchmark. More than 200 members have taken the assessment to gain better understanding of their current data governance program, so why not give it a shot?
Data Governance is a journey, likely a never-ending one. We wish you best of the luck on this effort and a joyful ride! We love to hear your stories.