Tag Archives: Data Governance

Business Asking IT to Install MDM: Will IDMP make it happen?

Will IDMP Increase MDM Adoption?

For years, MDM has been a technology struggling for acceptance – not for any technical reason, or indeed any sound business reason.  Quite simply, in many cases business people cannot attribute value delivery directly to MDM, so MDM projects get rated as ‘low priority’.  Although the tide is changing, many business people still need help drawing a direct correlation between Master Data Management – as a concept and as a tool – and measurable business value.  In my experience, business people actively asking for an MDM project is a rare occurrence.  This should change as the value of MDM becomes clearer; it is certainly gaining acceptance in principle that MDM will deliver value.  Perhaps this change is not too far off – the introduction of the Identification of Medicinal Products (IDMP) regulation in Europe may be a turning point.

At the DIA conference in Berlin this month, Frits Stulp of Mesa Arch Consulting suggested that IDMP could get the business asking for MDM.  After looking at the requirements for IDMP compliance for approximately a year, his conclusion from a business point of view is that MDM has a key role to play in IDMP compliance.  A recent press release by Andrew Marr, an IDMP and XEVMPD expert and  specialist consultant, also shows support for MDM being ‘an advantageous thing to do’  for IDMP compliance.  A previous blog outlined my thoughts on why MDM can turn regulatory compliance into an opportunity, instead of a cost.  It seems that others are now seeing this opportunity too.

So why will IDMP enable the business (primarily regulatory affairs) to come to the conclusion that they need MDM?  At its heart, IDMP is a pharmacovigilance initiative which has a goal to uniquely identify all medicines globally, and have rapid access to the details of the medicine’s attributes.  If implemented in its ideal state, IDMP will deliver a single, accurate and trusted version of a medicinal product which can be used for multiple analytical and procedural purposes.  This is exactly what MDM is designed to do.

Here is a summary of the key reasons why an MDM-based approach to IDMP is such a good fit.

1.  IDMP is a data consolidation effort; MDM enables data discovery & consolidation

  • IDMP will probably need to populate between 150 and 300 attributes per medicine.
  • These attributes will be held in 10 to 13 systems per product.
  • MDM (especially with close coupling to Data Integration) can easily discover and collect this data.

2.  IDMP requires cross-referencing; MDM has cross-referencing and cleansing as key process steps.

  • Consolidating data from multiple systems normally means dealing with multiple identifiers per product.
  • Different entities must be linked to each other to build relationships within the IDMP model.
  • MDM allows for complex models catering for multiple identifiers and relationships between entities.
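To make the cross-referencing idea concrete, here is a minimal Python sketch of an MDM-style cross-reference (xref) table; all system names and identifiers are hypothetical:

```python
# Minimal sketch: linking one medicinal product's records across systems.
# System names and identifiers are hypothetical, for illustration only.

# Each source system knows the product under its own local identifier.
local_ids = {
    "regulatory_db": "REG-00417",
    "erp": "MAT-88210",
    "labelling": "LBL-3302",
}

# An MDM hub assigns one golden identifier and keeps a cross-reference
# (xref) table mapping every local identifier back to it.
xref = {(system, local_id): "MDM-PROD-0001" for system, local_id in local_ids.items()}

def resolve(system, local_id):
    """Return the golden identifier for a record from any source system."""
    return xref.get((system, local_id))

print(resolve("erp", "MAT-88210"))  # MDM-PROD-0001
```

In a real MDM hub the xref table is built by matching and merging records, but the principle is the same: many local identifiers, one golden record.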

3.  IDMP submissions must ensure the correct value of an attribute is submitted; MDM has strong capabilities to resolve different attribute values.

  • Many attributes will exist in more than one of the 10 to 13 source systems
  • Without strong data governance, these values can (and probably will) be different.
  • MDM can set rules for determining the ‘golden source’ for each attribute, and then track the history of these values used for submission.
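A minimal sketch of such a ‘golden source’ rule, assuming a simple per-attribute ranking of trusted systems (the system names, priorities and values are hypothetical):

```python
# Minimal sketch: resolving one attribute from conflicting source values.
# Source system names, priorities, and values are hypothetical.

# The same attribute ("shelf_life_months") reported by three systems:
source_values = {
    "regulatory_db": 36,
    "erp": 24,
    "labelling": 36,
}

# A simple survivorship rule: a per-attribute ranking of trusted sources.
source_priority = ["regulatory_db", "labelling", "erp"]

def golden_value(values, priority):
    """Pick the value from the highest-priority source that supplied one."""
    for system in priority:
        if system in values:
            return values[system], system
    return None, None

value, chosen_source = golden_value(source_values, source_priority)
print(value, chosen_source)  # 36 regulatory_db
```

Real MDM tools support far richer rules (most recent, most complete, per-attribute overrides), but a ranked source list is the simplest useful case.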

4.  IDMP is a translation effort; MDM is designed to translate

  • Submission will need to be within a defined vocabulary or set of reference data
  • Different regulators may opt for different vocabularies, in addition to the internal set of reference data.
  • MDM can hold multiple values/vocabularies for entities, depending on context.
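As a rough illustration, context-dependent vocabularies can be thought of as a lookup keyed by both code and context; all codes below are made up for illustration:

```python
# Minimal sketch: translating an internal code into the controlled
# vocabulary a given regulator expects. All codes are hypothetical.

vocabularies = {
    "internal": {"DF-001": "TABLET"},
    "ema": {"DF-001": "10219000"},
    "other_regulator": {"DF-001": "TAB"},
}

def translate(internal_code, context):
    """Return the representation of a code for the requested context."""
    return vocabularies[context].get(internal_code)

print(translate("DF-001", "ema"))  # 10219000
```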

5.  IDMP is a large co-ordination effort; MDM enables governance and is generally associated with higher data consistency and quality throughout an organisation.

  • The IDMP scope is broad, so attributes required by IDMP may also be required for compliance with other regulations.
  • Accurate compliance needs tracking and distribution of attribute values.  Attribute values submitted for IDMP, for other regulations, and in support of internal business should be the same.
  • Not only is MDM designed to collect and cleanse data; it is equally suited to dispersing and co-ordinating values across systems.

Once business users assess the data management requirements and consider the breadth of the IDMP scope, it is no surprise that some of them could be asking for an MDM solution.  Even if they do not use the acronym ‘MDM’, they could actually be asking for MDM by capability rather than by name.

Given the good technical fit of an MDM approach to IDMP compliance, I would like to put forward three arguments as to why the approach makes sense.  There may be others, but these are the ones I find most compelling:

1.  Better chance to meet tight submission time

There are slightly over 18 months left before the EMA requires IDMP compliance.  Waiting for final guidance will not leave enough time for compliance.  With MDM you have a tool to begin the most time-consuming tasks: data discovery, collection and consolidation.  Required XEVMPD data and the draft guidance can serve as a guide to where to focus your efforts.

2.  Reduced risk of non-compliance

With fines of up to 5% of revenue at stake in Europe, risking non-compliance could be expensive.  Not only will MDM increase your chance of compliance on July 1, 2016, but it will also give you a tool to manage your data and ensure ongoing compliance in terms of meeting deadlines for delivering new data and data changes.

3.  Your company will have a ready source of clean, multi-purpose product data

Unlike some Regulatory Information Management tools, MDM is not a single-purpose tool.  It is specifically designed to provide consolidated, high-quality master data to multiple systems and business processes.  This data source could be used to deliver high-quality data to multiple other initiatives, in particular compliance with other regulations, and projects addressing topics such as traceability, health economics & outcomes, continuous process verification and inventory reduction.

So back to the original question – will the introduction of IDMP regulation in Europe result in the business asking IT to implement MDM?  Perhaps they will, but not by name.  It is still possible that they won’t.  However, if you have been struggling to get buy-in to MDM within your organisation and you need to comply with IDMP, then you may be able to find some more allies (potentially with an approved budget) to support you in your MDM efforts.

Posted in Data Governance, Healthcare, Master Data Management

Time to Celebrate! Informatica is Once Again Positioned as a Leader in Gartner’s Magic Quadrant for Data Quality Tools!

It’s holiday season once again at Informatica and this one feels particularly special because we just received an early present from Gartner: Informatica has just been positioned as a leader in Gartner’s Magic Quadrant for Data Quality Tools report for 2014! Click here to download the full report.

Gartner’s Magic Quadrant for Data Quality Tools, 2014

And as it turns out, this is a gift that keeps on giving.  For eight years in a row, Informatica has been ranked as a leader in Gartner’s Magic Quadrant for Data Quality Tools.  In fact, for the past two years running, Informatica has been positioned highest for both ability to execute and completeness of vision, the two dimensions Gartner measures in its report.  These results once again validate our operational excellence as well as our prescience with our data quality product offerings.  Yes folks, some days it’s hard to be humble.

Consistency and leadership are becoming hallmarks for Informatica in these and other analyst reports, and it’s hardly an accident. Those milestones are the result of our deep understanding of the market, continued innovation in product design, seamless execution on sales and marketing, and relentless dedication to customer success. Our customer loyalty has never been stronger with those essential elements in place. However, while celebrating our achievements, we are equally excited about the success our customers have achieved using our data quality products.

Managing and producing quality data is indispensable in today’s data-centric world. Gaining access to clean, trusted information should be one of a company’s most important tasks, and has previously been shown to be directly linked to growth and continued innovation.

We are truly living in a digital world – a world revolving around the Internet, gadgets and apps – all of which generate data, and lots of it.  Should your organization take advantage of its increasing masses of data? You bet. But remember: only clean, trusted data has real value.  Informatica’s mission is to help you excel by turning your data into valuable information assets that you can put to good use.

To see for yourself what the industry-leading data quality tool can do, click here.

And from all of our team at Informatica, Happy holidays to you and yours.

Happy Holidays!

Posted in Data Governance, Data Quality

Are The Banks Going to Make Retailers Pay for Their Poor Governance?

 

Retail and Data Governance

A couple of months ago, I reached out to a set of CIOs on the importance of good governance and security. All of them agreed that both were incredibly important. However, one CIO made a very pointed remark, saying that “the IT leadership at these breached companies wasn’t stupid.” He continued by saying that when selling to the rest of the C-Suite, the discussion needs to be about business outcomes and business benefits.  For this reason, he said, CIOs have struggled to sell the value of investments in governance and security. Now, I have suggested previously that security pays because of its impact on “brand promise”.  And I still believe this.

However, this week the ante was raised even higher. A district judge ruled that a group of banks can proceed to sue a retailer for negligence in their data governance and security. The decision could clearly lead to significant changes in the way the cost of fraud is distributed among parties within the credit card ecosystem. Where once banks and merchant acquirers would have shouldered the burden of fraud, this decision paves the way for more card-issuing banks to sue merchants for not adequately protecting their POS systems.

Accidents waste priceless time

The judge’s ruling said that “although the third-party hackers’ activities caused harm, merchant played a key role in allowing the harm to occur.” The judge also determined that the bank suit against merchants was valid because the plaintiffs adequately showed that the retailer failed “to disclose that its data security systems were deficient.” This is interesting because it says that security systems should be sufficient and if not, retailers need to inform potentially affected stakeholders of their deficient systems. And while taking this step could avoid a lawsuit, it would likely increase the cost of interchange for more risky merchants. This would effectively create a risk premium for retailers that do not adequately govern and protect their IT environments.

There are broad implications for all companies who end up harming customers, partners, or other stakeholders by not keeping their security systems up to snuff. The question is: will this make good governance deliver enough of a business outcome and benefit that businesses will actually want to pay it forward — i.e. invest in good governance and security? What do you think? I would love to hear from you.

Related links

Solutions:

Enterprise Level Data Security

Hacking: How Ready Is Your Enterprise?

Gambling With Your Customer’s Financial Data

The State of Data Centric Security

Twitter: @MylesSuer

 

Posted in Banking & Capital Markets, CIO, Data Governance, Data Security, Financial Services

Informatica Rev: Data Democracy At Last – Part 2

This is a continuation from Part 1 of the Blog which you can read here.

Now, if you are in IT, reading about how Informatica Rev enables the everyday business users in your company to participate in the Data Democracy might feel like treachery. You are likely thinking that Informatica is letting the bull loose in your own fine china shop. You likely feel, first, that Informatica is supporting the systemic bypass of all the data governance that IT has worked hard to put in place, and second, that Informatica is supporting the alienation of the very IT people who have approved of and invested in Informatica for decades.

While I can understand this thought process, I am here to, proudly, inform you that your thoughts could not be further from the truth! In fact, in the not-too-distant future, Informatica will be in a very strong position to create a unique technology solution that ensures you can better govern all the data in your enterprise, and do it in a way that allows you to proactively deliver the right data to the business – yes, before the masses of everyday business users have started to knock your door down to even ask for it. Informatica’s unique solution will ensure the IT and Business divide that has existed in your company for decades actually becomes a match made in heaven. And you in IT get the credit for leading this transformation of your company to a Data Democracy. Listen to this webinar to hear Justin Glatz, Executive Director of Information Technology at Conde Nast, speak about how he will be leading Conde Nast’s transformation to Data Democracy.

Data Democracy At Last

“How?” you might ask. Well, first let’s face it: today you do not have any visibility into how the business is procuring and using most data, and therefore you are not governing most of it. Without a change in your tooling, your ability to gain this visibility is diminishing greatly, especially since the business does not have to come to you to procure and use its cloud-based applications.  By having all of your everyday business users use Informatica Rev, you will, for the first time, have the potential to gain a truly complete picture of how data is being used in your company – even the data they do not come to you to procure.

In the not-too-distant future, you will gain this visibility through an IT companion application to Informatica Rev. You will then gain the ability to easily operationalize your business users’ exact transformation logic – or Recipe, as we call it in Informatica Rev – into your existing repositories, be they your enterprise data warehouse, data mart or master data management repository, for example. And, by the way, you are likely already using Informatica PowerCenter, Informatica Cloud or Informatica MDM to manage these repositories, so you already have the infrastructure we will be integrating Informatica Rev with. And if you are not using Informatica to manage these repositories, the draw of becoming proactive with your business and leading the transformation of your company to a Data Democracy will be enough to make you want to go get Informatica.

Just as these professionals have found success by participating in the Data Democracy, with Informatica Rev you finally can do so, too. You can try Informatica Rev for free by clicking here.

Posted in B2B, B2B Data Exchange, Data First, Data Governance

A New Dimension on a Data-Fueled World

A Data-Fueled World – Informatica’s new view on data in the enterprise.  I think we can all agree that technology innovation has changed how we live and view everyday life.  But I want to speak about a new aspect of the data-fueled world, one that is evident now and will be shockingly present in the few years to come.  I want to address the topic of “information workers”.

Information workers deal with information, or in other words, data.  They use that data to do their jobs.  They make decisions in business with that data.  They impact the lives of their clients.

Many years ago, I was part of a formative working group researching information worker productivity.  The idea was to create an index, like the Labor Productivity indexes, aimed at information worker productivity.  By this I mean the analysts, accountants, actuaries, underwriters and statisticians.  These are business information workers.  How productive are they?  How do you measure their output?  How do you calculate the economic cost of more or less productive employees?  How do you quantify the “soft” costs of passing work on to information workers?  The effort stalled in academia, but I learned a few key things.  These points underline the nature of an information worker and the impacts on their productivity.

  1. Information workers need data…and lots of it
  2. Information workers use applications to view and manipulate data to get the job done
  3. Degradation, latency or poor ease of use in any of items 1 and 2 have a direct impact on productivity
  4. Items 1 and 2 have a direct correlation to training cost, output and (wait for it) employee health and retention

It’s time to make a super bold statement.  It’s time to maximize your investment in DATA. And past time to de-emphasize investments in applications!  Stated another way, applications come and go, but data lives forever.

My five-year old son is addicted to his iPad.  He’s had one since he was one-year old.  At about the age of three he had pretty much left off playing Angry Birds.  He started reading Wikipedia.  He started downloading apps from the App Store.  He wanted to learn about string theory, astrophysics and plate tectonics.  Now, he scares me a little with his knowledge.  I call him my little Sheldon Cooper.  The apps that he uses for research are so cool.  The way that they present the data, the speed and depth are amazing.  As soon as he’s mastered one, he’s on to the next one.  It won’t be long before he’s going to want to program his own apps.  When that day comes, I’ll do whatever it takes to make him successful.

And he’s not alone.  The world of the “selfie-generation” is one of rapid speed.  It is one of application proliferation and flat out application “coolness”.  High school students are learning iOS programming.  They are using cloud infrastructure to play games and run experiments.  Anyone under the age of 27 has been raised in a mélange of amazing data-fueled computing and mobility.

This is your new workforce.  And on the first day of their new career at an insurance company or large bank, they are handed an aging, recycled workstation.  An old operating system follows, along with mainframe terminal sessions.  Then come rich-client and web apps circa 2002.  And lastly (heaven forbid), a Blackberry.  Now do you wonder whether that employee will feel empowered and productive?  I’ll tell you now: they won’t.  All that passion they have for viewing and interacting with information will disappear.  It will not be enabled in their new work day.  An outright information worker revolution would not surprise me.

And that is exactly why I say that it’s time to focus on data and not on applications.  Because data lives on as applications come and go.  I am going to coin a new phrase.  I call this the Empowered Selfie Formula.  The Empowered Selfie Formula is a way in which the focus on data liberates information workers.  They become free to be more productive in today’s technology ecosystem.

Enable a BYO* Culture

Many organizations have been experimenting with Bring Your Own Device (BYOD) programs – corporate stipends that allow employees to buy the computing hardware of their choice.  But let’s take that one step further.  How about a Bring Your Own Application program?  How about a Bring Your Own Codebase program?  The idea is not so far-fetched.  There are so many great applications for working with information.  Today’s generation is learning to code applications at a rapid pace.  They are keen to implement their own processes and tools to “get the job done”.  It’s time to embrace that change.  Allow your information workers to be productive with their chosen devices and applications.

Empower Social Sharing

Your information workers are now empowered with their own flavors of device and application productivity.  Let them share it.  The ability to share success, great insights and great apps is engrained into the mindset of today’s technology users.  Companies like Tableau have become successful based on the democratization of business intelligence.  Through enabling social sharing, users can celebrate their successes and cool apps with colleagues.  This raises the overall levels of productivity as a grassroots movement.  Communities of best practices begin to emerge creating innovation where not previously seen.

Measure Productivity

As an organization it is important to measure success.  Find ways to capture key metrics in productivity of this new world of data-fueled information work.  Each information worker will typically be able to track trends in their output.  When they show improvement, celebrate that success.

Invest in “Cool”

With a new BYO* culture, make investments in cool new things.  Allow users to spend a few dollars here and there on training, online or in person.  There they can learn new things that will make them more productive.  It will also help with employee retention.  With a small investment, a larger ROI can be realized in employee health and productivity.

Foster Healthy Competition

Throughout history, civilizations that fostered healthy competition have innovated faster.  The enterprise can foster healthy competition on metrics.  Other competition can be focused on new ways to look at information, valuable insights, and homegrown applications.  It isn’t about a “best one wins” competition.  It is a continuing round of innovation winners with lessons learned and continued growth.  These can also be centered on the social sharing and community aspects.  In the end it leads to a more productive team of information workers.

Revitalize Your Veterans

Naturally, those information workers who are a little “longer in the tooth” may feel threatened.  But this doesn’t need to be the case.  Find ways to integrate them into the new culture through peer training, knowledge transfer, and the data practices listed below.  In the best of cases, they too will crave this new era of innovation.  They will bring a lot of value to the ecosystem.

There is a catch.  In order to realize success in the formula above, you need to overinvest in data and data infrastructure.  Perhaps that means doing things with data that only received lip service in the past.  It is imperative to create a competency or center of excellence for all things data.  Trusting your data centers of excellence activates your Empowered Selfie Formula.

Data Governance

You are going to have users using and building new apps, processing data and information in new and developing ways.  This means you need to trust your data, so your data governance becomes more important.  Everything from metadata, data definitions, standards and policies to glossaries needs to be developed, so that the data being looked at can be trusted.  Chief Data Officers should put in place a data governance competency center.  All data feeding and coming from new applications should be inspected regularly for adherence to corporate standards.  Remember, it’s not about the application.  It’s about what feeds any application and what data is generated.

Data Quality

Very much a part of data governance is the quality of data in the organization, which should also adhere to corporate standards.  These standards should dictate cleanliness, completeness, fuzzy-matching logic and standardization.  Nothing frustrates an information worker more than building the coolest app that does nothing due to poor-quality data.

Data Availability

Data needs to be in the right place at the right time.  Any enterprise data takes a journey from many places and to many places.  Movement of data that is governed and has met quality standards needs to happen quickly.  We are in a world of fast computing and massive storage.  There is no excuse for not having data readily available for a multitude of uses.

Data Security

And finally, make sure to secure your data.  Regardless of the application consuming your information, there may be people who shouldn’t see the data.  Access control, data masking and network security need to be in place.  Each application, from Microsoft Excel to Informatica Springbok to Tableau to an iOS-developed application, should only interact with the information it is allowed to see.
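As a simple illustration of the data masking idea (this is a generic sketch, not any particular product’s implementation; the field names and roles are hypothetical):

```python
# Minimal sketch of data masking: show an attribute fully only to roles
# that may see it. Field names and roles are hypothetical.

def mask(value, keep_last=4):
    """Replace all but the last few characters with '*'."""
    return "*" * max(0, len(value) - keep_last) + value[-keep_last:]

def view(record, role):
    """Return a copy of the record, masking sensitive fields for non-admins."""
    sensitive = {"card_number"}
    return {
        k: (mask(v) if k in sensitive and role != "admin" else v)
        for k, v in record.items()
    }

rec = {"supplier": "Acme", "card_number": "4111111111111111"}
print(view(rec, "analyst"))  # {'supplier': 'Acme', 'card_number': '************1111'}
```

Production masking is policy-driven and enforced close to the data, but the core idea is the same: the consuming application never receives values the role isn’t entitled to.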

The changing role of an IT group will follow close behind.  IT will essentially become the data-fueled enablers using the principles above.  IT will provide the infrastructure necessary to enable the Empowered Selfie Formula.  IT will no longer be in the application business, aside from a few core corporation applications as a necessary evil.

Once you achieve competency in the items above, you no longer need to worry about the success of the Empowered Selfie Formula.  What you will have is a truly data-fueled enterprise, with a new class of information workers enabled by a data-fueled competency.  Informatica is thrilled to be an integral part of the role that data can play in your journey.  We are energized to see the pervasive use of data by increasing numbers of information workers.  They are creating new and better ways to do business.  Come and join a data-fueled world with Informatica.

Posted in Business Impact / Benefits, Data First, Data Governance, Data Quality, Enterprise Data Management

Optimizing Supply Chain with Data Governance

Last week I met with a customer who recently completed a few data-fueled supply chain projects. Informatica data integration and master data management are part of the solution architecture. I’m not going to name the client at this early stage, but I want to share highlights from our Q&A.

Q: What was the driver for this project?

A: The initiative fell out of a procure-to-pay (P2P) initiative.  We engaged a consulting firm to help centralize Accounts Payable operations.  One required deliverable was an executive P2P dashboard. This dashboard would provide enterprise insights by relying on the enterprise data warehousing and business intelligence platform.

Q: What did the dashboard illustrate?

A: The dashboard integrated data from many sources to provide a single view of information about all of our suppliers. By visualizing this information in one place, we were able to rapidly gain operational insights. There are approximately 30,000 suppliers in the supplier master who manufacture, distribute, or both, over 150,000 unique products.

Q: From which sources is Informatica consuming data to power the P2P dashboard?

A: There are 8 sources of data:

3 ERP Systems:

  1. Lawson
  2. HBOC STAR
  3. Meditech

5 Enrichment Sources:

  1. Dun & Bradstreet – for associating suppliers together from disparate sources.
  2. GDSN – Global Data Pool for helping to cleanse healthcare products.
  3. McKesson Pharmacy Spend – spend file from a third-party pharmaceutical distributor.  Helps capture detailed pharmacy spend, which we procure from this third party.
  4. Office Depot Spend – spend file from a third-party office supply distributor.  Helps capture detailed office supply spend.
  5. MedAssets – third party group purchasing organization (GPO) who provides detailed contract pricing.

Q: Did you tackle clinical scenarios first?

A: No. While we certainly have many clinical scenarios we want to explore, like cost per procedure per patient, we knew that we should establish a few quick operational wins to gain traction and credibility.

Q: Great idea – capturing quick wins is certainly the way we are seeing customers have the most success in these transformative projects. Where did you start?

A: We started with supply chain cost containment; increasing pressures on healthcare organizations to reduce cost made this low hanging fruit the right place to start. There may be as much as 20% waste to be eliminated through strategic and actionable analytics.

Q: What did you discover?

A: Through the P2P dashboard, insights were gained into days to pay on invoices as well as early payment discounts and late payment penalties. With the visualization we quickly saw that we were paying a large amount of late fees. With this awareness, we dug into why the late fees were so high. What was discovered is that, with one large supplier, the original payment terms were net 30 but that in later negotiations terms were changed to 20 days. Late fees were accruing after 20 days. Through this complete view we were able to rapidly hone in on the issue and change operations — avoiding costly late fees.
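The arithmetic behind that discovery is simple; a small sketch with hypothetical amounts and penalty rates shows how a change from net-30 to net-20 terms starts accruing fees on the same payment behavior:

```python
# Minimal sketch of the payment-terms issue described above: invoices
# paid on a net-30 assumption accrue late fees once terms change to
# net-20. All amounts and rates are hypothetical.

def late_fee(invoice_amount, days_to_pay, terms_days, daily_penalty=0.001):
    """Fee owed when payment lands after the agreed terms."""
    days_late = max(0, days_to_pay - terms_days)
    return invoice_amount * daily_penalty * days_late

# Paying on day 28 is fine under net-30 terms...
print(late_fee(10_000, 28, 30))  # 0.0
# ...but accrues 8 days of penalties once terms are net-20.
print(late_fee(10_000, 28, 20))  # 80.0
```

Spread over thousands of invoices, the gap between assumed and negotiated terms is exactly the kind of cost an integrated dashboard surfaces quickly.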

Q: That’s a great example of straightforward analytics powered by an integrated view of data, thank you. What’s a more complex use case you plan to tackle?

A: Now that we have the systems in place along with data stewardship, we will start to focus on clinical supply chain scenarios like cost per procedure per patient. We have all of the data in one data warehouse to answer questions like – which procedures are costing the most, do procedure costs vary by clinician? By location? By supply? – and what is the outcome of each of these procedures? We always want to take the right and best action for the patient.

We were also able to identify where negotiated payment discounts were not being taken advantage of or where there were opportunities to negotiate discounts.

These insights were revealed through the dashboard and immediate value was realized the first day.

Fueling knowledge with data is helping procurement negotiate the right discounts, i.e. they can seek discounts on the most-used supplies vs. discounts on supplies rarely used. Think of it this way: you don’t want a discount on OJ if you are buying milk.

Q: Excellent example and metaphor. Let’s talk more about stewardship. You have a data governance organization within IT that is governing supply chain?

A: No, we have a data governance team within supply chain. Supply chain staff who used to be called “content managers” are now “data stewards”. They were already doing the stewardship work of defining data, its use, its source and its quality, but it wasn’t a formally recognized part of their jobs; now it is. Armed with Informatica Data Director, they are managing the quality of supply chain data across four domains: suppliers/vendors, locations, contracts and items. Data from each of these domains resides in our EMR, our ERP applications and our ambulatory EMR/practice management application, creating redundancy and manual reconciliation effort.

By adding Master Data Management (MDM) to the architecture, we were able to centralize management of master data about suppliers/vendors, items, contracts and locations, augment this data with enrichment data like that from D&B, reduce redundancy and reduce manual effort.

MDM shares this complete and accurate information with the enterprise data warehouse, where we can run analytics against it. Having a complete, consistent view of master data allows us to trust the analytical insights revealed through the P2P dashboard.
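Conceptually, the consolidation step looks something like this naive sketch: records for the same vendor arriving from different source systems resolve to one match key, and simple survivorship rules build a single golden record. The vendor names, fields, and first-non-null survivorship rule are invented for illustration; real MDM hubs use far more sophisticated, configurable matching and per-source trust rules.

```python
# Hypothetical vendor records from three source systems
records = [
    {"source": "ERP", "name": "Acme Medical Supply", "phone": "412-555-0100", "duns": None},
    {"source": "EMR", "name": "ACME MEDICAL SUPPLY", "phone": None, "duns": "123456789"},
    {"source": "Ambulatory", "name": "Acme Medical Supply, Inc.", "phone": "412-555-0100", "duns": None},
]

def match_key(name):
    # Naive normalization: uppercase, strip punctuation and legal suffixes
    cleaned = "".join(ch for ch in name.upper() if ch.isalnum() or ch.isspace())
    return " ".join(w for w in cleaned.split() if w not in {"INC", "LLC", "CORP"})

def golden_record(recs):
    # Survivorship sketch: first non-null value wins per attribute
    merged = {}
    for rec in recs:
        for field, value in rec.items():
            if field != "source" and merged.get(field) is None:
                merged[field] = value
    return merged

keys = {match_key(r["name"]) for r in records}
assert len(keys) == 1  # all three source records resolve to one vendor
master = golden_record(records)
```

The payoff is the one the interviewee describes: three redundant records collapse into one complete master (name, phone, and DUNS number all populated) that downstream analytics can rely on.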

Q: What lessons learned would you offer?

A: Having recognized operational value, I’d encourage health systems to focus on data driven supply chain because there are savings opportunities through easier identification of unmanaged spend.

I really enjoyed learning more about this project with valuable, tangible and nearly immediate results. I will keep you posted as the customer moves onto the next phase. If you have comments or questions, leave them here.

Posted in B2B, Data First, Data Governance, Retail

10 Insights From The Road To Data Governance

I routinely have the pleasure of working with Terri Mikol, Director of Data Governance, UPMC. Terri has been spearheading data governance for three years at UPMC. As a result, she has a wealth of insights to offer on this hot topic. Enjoy her top 10 lessons learned from UPMC’s data governance journey:

1. You already have data stewards.

Commonly, health systems think they can’t staff data governance the way UPMC has because of a lack of funding. In reality, people are already doing data governance everywhere across your organization! You don’t have to secure headcount; you locate these people within the business, formalize data governance as part of their jobs, and provide them tools to improve and manage their efforts.

2. Multiple types of data stewards ensure all governance needs are being met.

Three types of data stewards were identified and tasked across the enterprise:

I. Data Steward. Create and maintain data/business definitions. Assist with defining data and mappings along with rule definition and data integrity improvement.

II. Application Steward. One steward is named per application sourcing enterprise analytics. Populate and maintain inventory, assist with data definition and prioritize data integrity issues.

III. Analytics Steward. Named for each team providing analytics. Populate and maintain inventory, reduce duplication and define rules and self-service guidelines.

3. Establish IT as an enabler.

IT, instead of taking action on data governance or being the data governor, has become an enabler of data governance by investing in and administering tools that support metadata definition and master data management.

4. Form a governance council.

UPMC formed a governance council of 29 executives—yes, that’s a big number, but UPMC is a big organization. The council is clinically led. It is co-chaired by two CMIOs and includes Marketing, Strategic Planning, Finance, Human Resources, the Health Plan, and Research. The council signs off on and prioritizes policies. Decision-making authority has to come from somewhere.

5. Avoid slowing progress with process.

In these still-early days, only 15 minutes of monthly council meetings are spent on policy and guidelines; discussion and direction take priority. For example, a recent agenda item was “Length of Stay.” The council agreed a single owner would coordinate across Finance, Quality and Care Management to define and document an enterprise definition for “Length of Stay.”

6. Use examples.

Struggling to get buy-in from the business about the importance of data governance? An example everyone can relate to is “Test Patient.” For years, in her business intelligence role, Terri worked with “Test Patient.” Investigation revealed that these fake patients end up in places they should not. There was no standard for the creation or removal of test patients, which meant that test patients and their costs, outcomes, etc., were included in analysis and reporting that drove decisions inside and outside of UPMC. The governance program created a policy for testing in production should the need arise.

7. Make governance personal through marketing.

Terri holds monthly round tables with business and clinical constituents. These have been a game changer: Once a month, for two hours, ten business invitees meet and talk about the program. Each attendee shares a data challenge, and Terri educates them on the program and illustrates how the program will address each challenge.

8. Deliver self-service.

Providing self-service empowers your users with access to and control of the data they need to improve their processes. The only way to deliver self-service business intelligence is to make metadata, master data, and data quality transparent and accessible across the enterprise.

9. IT can’t do it alone.

Initially, IT was resistant to giving up control, but now the team understands that it doesn’t have the knowledge or the time to effectively do data governance alone.

10. Don’t quit!

Governance can be complicated, and it may seem like little progress is being made. Terri keeps spirits high by reminding folks that the only failure is quitting.

Getting started? Assess the data governance maturity of your organization here: http://governyourdata.com/

Posted in Data First, Data Governance, Data Integration, Data Security

To Determine The Business Value of Data, Don’t Talk About Data

The title of this article may seem counterintuitive, but the reality is that the business doesn’t care about data.  They care about their business processes and outcomes that generate real value for the organization. All IT professionals know there is huge value in quality data and in having it integrated and consistent across the enterprise.  The challenge is how to prove the business value of data if the business doesn’t care about it. (more…)

Posted in Application Retirement, Business Impact / Benefits, CIO, Data Governance, Enterprise Data Management, Integration Competency Centers

IT Is All About Data!

Is this how you think about IT? Or do you think of IT in terms of the technology it deploys instead? I recently interviewed a CIO at a Fortune 50 company about the changing role of CIOs. When I asked him which technology issues were most important, this CIO said something that surprised me.

He said, “IT is all about data. Think about it. What we do in IT is all about the intake of data, the processing of data, the store of data, and the analyzing of data. And we need, from data, to increasingly provide the intelligence to make better decisions”.

How many view the function of the IT organization with such clarity?

This was the question I had after hearing this CIO. And how many IT organizations view IT as really a data system? It must not be very many. Jeanne Ross from MIT CISR contends in her book that company data, “one of its most important assets, is patchy, error-prone, and not up-to-date” (Enterprise Architecture as Strategy, Jeanne Ross, page 7). Jeanne contends as well that companies with a data-centric view “have higher profitability, experience faster time to market, and get more value from their IT investments” (Enterprise Architecture as Strategy, Jeanne Ross, page 2).

What then do you need to do to get your data house in order?

What then should IT organizations do to move their data from something that is “patchy, error-prone, and not up-to-date” to something that is trustworthy and timely? I would contend our CIO friend had it right. We need to manage all four elements of our data process better.

1.     Input Data Correctly

You need to start by making sure that data is entered consistently and correctly. I liken the need here to a problem I had with my electronic bill pay a few years ago. When my bank changed bill payment service providers, it started sending my payments to a set of out-of-date payee addresses. This caused me to incur late fees and my credit score to actually go down. The same kind of thing can happen to a business when there are duplicate customers or customer addresses are entered incorrectly. So much of marketing today is about increasing customer intimacy. It is hard to improve customer intimacy when you bug the same customer too much, or never connect with a customer because you had a bad address for them.
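The duplicate-customer and bad-address problem can be illustrated with a toy normalization pass. The abbreviation table and customer records below are invented, and real data quality tools use much richer address standardization and fuzzy matching.

```python
def normalize_address(addr):
    # Crude normalization: uppercase, strip punctuation, expand abbreviations
    abbrev = {"ST": "STREET", "AVE": "AVENUE", "RD": "ROAD"}
    words = addr.upper().replace(".", "").replace(",", "").split()
    return " ".join(abbrev.get(w, w) for w in words)

# Hypothetical customer records: (id, name, address)
customers = [
    ("C001", "Jane Doe", "12 Oak St."),
    ("C002", "Jane Doe", "12 Oak Street"),
    ("C003", "John Roe", "9 Elm Ave"),
]

# Flag records that collide on (name, normalized address)
seen, duplicates = {}, []
for cid, name, addr in customers:
    key = (name.upper(), normalize_address(addr))
    if key in seen:
        duplicates.append((seen[key], cid))
    else:
        seen[key] = cid
```

Here C001 and C002 are the same person entered twice with trivially different addresses, which is exactly the kind of duplicate that gets a customer "bugged too much."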

2.     Process Data to Produce Meaningful Results

You need to collect and manipulate data to derive meaningful information. This is largely about processing data so that its results support meaningful analysis. To do this well, you need to remove the data quality issues from the data that is produced. We want, in this step, to make data “trustworthy” to business users.

With this, data can be consolidated into a single view of customer, financial account, etc. A CFO explained the importance of this step by saying the following:

“We often have redundancies in each system, and within the chart of accounts the names and numbers can differ from system to system. And as you establish a bigger and bigger set of systems, you need, in accounting parlance, to roll up the chart of accounts”.
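The CFO’s roll-up problem can be sketched as a mapping from each system’s local account codes to a master chart of accounts, with balances summed at the master level. The system names, account codes, and amounts are all hypothetical.

```python
from collections import defaultdict

# Hypothetical mapping from (system, local account code) to the master chart
master_chart = {
    ("ERP-A", "4000"): "Revenue",
    ("ERP-B", "REV-01"): "Revenue",
    ("ERP-A", "6100"): "Travel Expense",
    ("ERP-B", "EXP-TRV"): "Travel Expense",
}

# Hypothetical balances: (system, local account code, amount)
balances = [
    ("ERP-A", "4000", 250_000.0),
    ("ERP-B", "REV-01", 120_000.0),
    ("ERP-A", "6100", 8_000.0),
    ("ERP-B", "EXP-TRV", 5_500.0),
]

# Roll up: translate every local code to its master account and sum
rollup = defaultdict(float)
for system, code, amount in balances:
    rollup[master_chart[(system, code)]] += amount
```

The mapping table itself is master data: once it exists in one governed place, every downstream report rolls up the same way.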

Once data is consistently put together, then you need to consolidate it so that it can be used by business users. This means that aggregates need to be created for business analysis. These should support dimensional analysis so that business users can truly answer why something happened. For finance organizations, timely aggregated data with supporting dimensional analysis enables them to establish themselves as “a business person versus a bean counting historically oriented CPA”. Having this data answers questions like the following:

  • Why are sales targets not being achieved? Which regions or products are falling short?
  • Why is the projected income statement not in line with plan? Which expense categories should we cut to bring the income statement in line with business expectations?

3.     Store Data Where it is Most Appropriate

Data storage today can occur in many ways: in applications, a data warehouse, or even a Hadoop cluster. You need an overriding data architecture that considers the entire lifecycle of data. A key element of doing this well involves archiving data as it becomes inactive and protecting data across its entire lifecycle. The former can also involve disposing of information, and the latter requires the ability to audit, block, and dynamically mask sensitive production data to prevent unauthorized access.
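Dynamic masking can be sketched as a view layer that checks authorization per request: authorized users see real values, everyone else sees a deterministic token, so joins and counts still work. The record fields and the hash-based tokenization are illustrative assumptions, not how any particular masking product works.

```python
import hashlib

def mask_record(record, sensitive, authorized):
    # Authorized callers get the record as-is
    if authorized:
        return dict(record)
    # Everyone else gets sensitive fields replaced by a stable token
    masked = {}
    for field, value in record.items():
        if field in sensitive:
            masked[field] = hashlib.sha256(str(value).encode()).hexdigest()[:8]
        else:
            masked[field] = value
    return masked

patient = {"id": "P123", "name": "Jane Doe", "ssn": "123-45-6789"}
analyst_view = mask_record(patient, sensitive={"name", "ssn"}, authorized=False)
```

Because the token is deterministic, an analyst can still count distinct patients or join records without ever seeing a real name or SSN.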

4.     Enable Analysis by Discovering, Testing, and Combining Data

Analysis today is not just about the analysis tools. It is about enabling users to discover, test, and put data together. CFOs we have talked to say they want analysis to expose potential business problems earlier. They want, for example, to know about metrics like average selling price and gross margin by account or by product. They also want to see when they have seasonality effects.

Increasingly, CFOs need to use this information to help predict what the future of their business will look like. At the same time, CFOs say they want to help their businesses make better decisions from data. What limits them today is an enterprise hodgepodge of disparate systems that cannot talk to one another. And yet to report, CFOs need to traverse from front office to back office systems.

One CIO said to us that the end goal of any analysis layer should be the ability to trust data and make dependable business decisions. And once dependable data exists, business users say they want “access to data when they need it. They want to get data when and where they need it”. One CIO likened what is needed here to orchestration when he said:

“Users want to be able to self-serve. They want to be able to assemble data and put it together, and do it from different sources at different times. I want them to be able to have no preconceived process. I want them to be able to discover data across all sources”.

Parting thoughts

So, as we said at the beginning of this post, IT is all about the data. And with mobile systems of engagement, IT’s customers increasingly want their data at their fingertips. This means that business users need to be able to trust that the data they use for business analysis is timely and accurate. This demands that IT organizations get better at managing their core function: data.

Solution Brief: The Intelligent Data Platform
Great Data by Design
The Power of a Data Centric Mindset
Data is it your competitive advantage?

Related Blogs

Competing on Analytics
The CFO Viewpoint of Data
Is Big Data Destined To Become Small And Vertical?
Big Data Why?
The Business Case for Better Data Connectivity
What is big data and why should your business care?
Twitter: @MylesSuer

Posted in Data Governance