Tag Archives: Data Governance
You know that old saying, “What’s the best way to eat an elephant?” “One bite at a time.”
Much like the daunting task of eating an elephant, implementing an MDM solution can seem staggering. All too often, the implementation team gets mired in the details, trying to build a solution that answers everyone's problems, and never gets started because the undertaking is so overwhelming. After many successful MDM implementations, I'd like to recommend the following approach to eating your elephant:
- Clearly establish the business problem you are trying to solve
According to the Gartner Group, “MDM is a business-driven, technology-enabled environment and program.” It is really important to remember that the reason you are working on an MDM project is to satisfy a specific business need aligned with the overall business strategy. Once aligned – it is easier to show value to the organization when it is successfully implemented.
For instance, if you want to reduce readmission costs for a specific segment of your member population, this could advance several potential business goals in the payer market, including improving customer satisfaction and lowering the cost of care for specific populations. Improving customer satisfaction is a pretty big elephant, but focusing on reducing readmission costs is much more manageable, and it will impact two of your business goals! Sutter Health has taken this approach and is releasing new use cases every 90 days.
- Understand and document your current state
Now that you know what problem you are attempting to solve, you will need to understand the maturity of your current approach and what it will take to implement your solution. Consider your available resources, understand the amount of time and money a solution may require, and obtain visibility into where your organization currently stands.
Using the readmission example, you are going to need to identify where readmission data is available. How often is this data updated? What is the quality of the data? How difficult is it to get access to it? Who is responsible for readmission data, and are they distracted by many competing requests? Which data sources that could impact readmission rates are not currently part of your inventory? For instance, understanding the living arrangement of the member prior to discharge can have an impact on readmission rates (members living alone have a much higher readmission rate than those living with others). If you can identify a household through social media data, you can better predict who will be going home alone.
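A lightweight way to start answering the freshness and quality questions above is a quick profiling pass over each candidate source. This is only a sketch; the source names, thresholds and figures below are invented for illustration.

```python
from datetime import date

# Hypothetical profiling stats per candidate source; in practice these
# would come from a profiling tool or queries against the source.
sources = {
    "claims_feed":   {"last_updated": date(2015, 1, 30), "null_pct": 0.02},
    "discharge_log": {"last_updated": date(2014, 11, 2), "null_pct": 0.18},
}
TODAY = date(2015, 2, 10)

def assess(stats):
    """Flag a source that is stale (>30 days old) or sparse (>10% nulls)."""
    age_days = (TODAY - stats["last_updated"]).days
    ok = age_days <= 30 and stats["null_pct"] <= 0.10
    return "OK" if ok else "REVIEW"

for name, stats in sources.items():
    print(name, assess(stats))
```

Even a crude pass like this surfaces which sources need remediation work before they can feed the MDM solution.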
- Clearly define success
There is a business adage (much like eating an elephant!) that says "you can't manage what you can't measure." Before you implement your MDM solution, you need to identify which metrics matter most and will demonstrate success. These should align with the goals you set in #1 (which should in turn align with the goals of the business). You should also identify the specific business outcomes those metrics apply to.
What would be a reasonable measure of success for reduction of readmission rates?
- Moving from a 15% readmission rate to an 8% readmission rate?
- Moving from a 15% readmission rate to a 10% readmission rate? In what amount of time?
- Perhaps moving from a 15% readmission rate to a 13% readmission rate in a year after implementation?
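Whichever target you choose, it reduces to a simple rate comparison that can be tracked continuously. A minimal sketch, with hypothetical discharge records and the 13%-in-a-year goal above:

```python
# Hypothetical discharge records: True means the member was readmitted
# within 30 days. The data and thresholds are illustrative only.
discharges = [True] * 2 + [False] * 18

def readmission_rate(flags):
    """Share of discharges followed by a readmission."""
    return sum(flags) / len(flags)

baseline = 0.15   # 15% rate before the MDM initiative
target = 0.13     # e.g. 13% one year after implementation
current = readmission_rate(discharges)
goal_met = current <= target
print(f"current rate: {current:.0%}, goal met: {goal_met}")
```

The point is less the arithmetic than agreeing, up front, on the exact numerator, denominator and time window behind the metric.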
- Build a governance hierarchy
According to Gartner Group, “Governance is about decision making and ensuring that you have an authority framework that takes decisions and is able to measure the execution of those decisions.” An effective governance program requires a well-defined hierarchy, headed by a sponsor: someone in a position of authority who carries the necessary weight and cross-departmental influence to make MDM governance a reality. How can you find out how mature your organization is from a data governance perspective? There are several vendors and websites that can help with that, including www.governyourdata.com.
You need to be able to make assessments about the quality of the source data, the data lineage (where and how the data that is feeding your MDM solution has been modified) for compliance reporting requirements, as well as establishing a process for support of your data quality. There are 10 data governance insights from UPMC here.
Now – what is the smallest bite you can define for your organization? By laying out your business priorities, identifying your current state, creating a measurable goal and a method for governing – you have put clear boundaries on what you are trying to accomplish. Once you’ve established your credibility on your ability to finish your first bite successfully, the next bite will be easier!
Data and Information becoming a key corporate asset
According to Barbara Wixom at MIT CISR, “In a digital economy, data and the information it produces is one of a company’s most important assets” (“Recognizing data as an enterprise asset”, Barbara Wixom, MIT CISR, 3 March 2015). Barbara goes on to suggest that businesses increasingly “need to take an enterprise view of data. They should understand and govern data as a corporate asset, even when data management remains distributed”.
CIOs are not the enterprise data steward
Given that data is a corporate asset, you might expect this would be an area for the CIO’s leadership. However, I heard differently when I recently met with two different groups of CIOs. Regardless of whether the CIOs were public sector or private sector, they told me that they did not want to be the owner of enterprise data. One CIO succinctly put it this way, “we are not data stewards. Governance has to be done by the business—IT is merely the custodians of their data”. These CIOs claim that the business must own business data, and must determine how that data is managed, because only the business understands the business context around the data.
Given this, the CIOs that I talked to said that IT should not manage data but “should make sure that what the business needs done gets done with data”. CIOs, therefore, own the processes and technology for ensuring data is secured and available when and where the business needs it. Debbie Lew from ISACA put it this way, “IT does not own the data. IT facilitates data”.
So if the management of data is distributed what is the role of the CIO in being a good data custodian?
COBIT 5 provides some concrete suggestions that are worth taking a look at. According to COBIT, IT should make sure information and data owners are established and that they are able to make decisions about data definition, data classification, data security and control, and data integrity. Additionally, IT needs to ensure that the information system provides the “knowledge required to support all staff in their work activities.”
IT must create facilities so knowledge can be used
This means IT organizations need to create facilities so that knowledge can be used, shared and updated. Doing this well involves ensuring the reliable availability of useful information and keeping the ratio of erroneous or unavailable information to a minimum. Measuring performance here requires looking at the percentage of reports that are not delivered on time and the percentage of reports containing inaccuracies, both of which need to be kept to a minimum. This function is enabled by backups of systems, applications, data and documentation, run on a defined schedule that meets business requirements.
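The two measures just mentioned, late reports and inaccurate reports, are simple ratios. A toy sketch with invented data (the report log and its layout are assumptions, not COBIT artifacts):

```python
# Illustrative report log: each tuple is (delivered_on_time, contains_inaccuracy).
reports = [
    (True, False), (True, False), (False, False),
    (True, True),  (True, False), (True, False),
]

# Percentage of reports not delivered on time, and percentage with inaccuracies.
late_pct = sum(not on_time for on_time, _ in reports) / len(reports)
inaccurate_pct = sum(bad for _, bad in reports) / len(reports)
print(f"late: {late_pct:.1%}, inaccurate: {inaccurate_pct:.1%}")
```

Tracking these two ratios over time gives the IT organization a concrete, trendable view of information reliability.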
Establishing a level of data accuracy that is acceptable to business users starts with building and maintaining an enterprise data dictionary that includes details about the data definition, data ownership, appropriate data security, and data retention and destruction requirements. This involves identifying the data outputs from each source and mapping data storage, location, retrieval and recoverability. From a design perspective, appropriate redundancy, recovery and backup must be built into the enterprise data architecture.
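The dictionary attributes listed above can be pictured as a simple record type. The field names and the sample entry below are illustrative assumptions, not a prescribed schema:

```python
from dataclasses import dataclass

# One entry in a minimal enterprise data dictionary.
@dataclass
class DictionaryEntry:
    name: str               # canonical element name
    definition: str         # agreed business definition
    owner: str              # accountable business owner, not IT
    security_class: str     # e.g. "public", "internal", "restricted"
    retention_years: int    # retention period before destruction
    source_system: str      # where the data originates

entry = DictionaryEntry(
    name="readmission_flag",
    definition="True if the member was readmitted within 30 days",
    owner="Care Management",
    security_class="restricted",
    retention_years=7,
    source_system="claims_warehouse",
)
print(entry.name, "owned by", entry.owner)
```

The value is in forcing every element to have an answer for each field; a blank owner or retention period is itself a governance finding.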
IT must enable compliance and security
COBIT 5 stresses the importance of data and information compliance and security. Information needs to be “properly secured, stored, transmitted or destroyed.” This starts with effective security and controls over information systems. To do this, procedures need to be defined and implemented to ensure the integrity and consistency of information stored in databases, data warehouses and data archives. All users need to be uniquely identifiable and have access rights in accordance with their business role. And for business compliance, all business transactions need to be retained for governance and compliance reasons. According to COBIT 5, IT organizations are chartered with ensuring that the following four elements are established:
- Clear information ownership
- Timely, correct information
- Clear enterprise architecture and efficiency
- Compliance and security
There needs to be a common set of information requirements
But how are these objectives achieved? Effective information governance requires that the business and IT have a strong working relationship. It also requires that information requirements be established. Getting timely and correct information often starts by improving how data is managed. Instead of manually moving data or creating layer upon layer of spaghetti-code integration, enterprises need to standardize on a data architecture that creates a single integration layer among all data sources.
This integration layer increasingly needs to support new sources of data, and to do so at the speed of business. Business users want trustworthy data. One data integration expert maintains that "at least 20 percent of all raw data is incorrect. Inaccurate data leads data users to question the information their systems provide." The data system needs to automatically and proactively fix data issues such as bad addresses, missing data and data format problems. Once this has been accomplished, it needs to go after redundancies in customers and transactions. With multiple IT-managed transaction systems, it is easy to misstate both customers and customer transactions, and to miss potential business opportunities. All of these steps are required to get accurate data.
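The cleanse-then-deduplicate sequence described above can be sketched in a few lines. Everything here is an illustrative assumption: the record layout, the cleanup rules, and the matching rule (treating a shared e-mail address as the same customer).

```python
# Toy customer records with the kinds of defects mentioned above:
# inconsistent casing, a malformed ZIP, and a missing value.
raw = [
    {"name": "Ann Lee", "zip": "02139", "email": "ANN@EXAMPLE.COM"},
    {"name": "ann lee", "zip": "2139",  "email": "ann@example.com"},
    {"name": "Bo Chan", "zip": None,    "email": "bo@example.com"},
]

def clean(rec):
    """Normalize formats and mark missing data explicitly."""
    rec = dict(rec)
    rec["name"] = rec["name"].title()
    rec["email"] = rec["email"].lower()
    rec["zip"] = rec["zip"].zfill(5) if rec["zip"] else "UNKNOWN"
    return rec

def dedupe(records):
    """Collapse records sharing an e-mail address; keep the first seen."""
    seen, unique = set(), []
    for rec in map(clean, records):
        if rec["email"] not in seen:
            seen.add(rec["email"])
            unique.append(rec)
    return unique

print(len(dedupe(raw)), "unique customers")
```

Real match rules are far richer (fuzzy names, addresses, survivorship), but the order matters everywhere: standardize first, or duplicates slip past the matcher.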
Data needs to be systematically protected
Additionally, data needs to be systematically protected. This means that user access to data needs to be managed systematically across all IT-managed systems. Typical data integrations move data between applications without preserving the source data systems’ rules. A data security issue at any point in the IT system can expose all data. At the same time, enterprises need to control exactly what data is moved into test and production environments. Enterprises must also ensure that a common set of security governance rules is established and maintained across the entire enterprise, including data being exchanged with partners, employees and contractors using data outside of the enterprise firewall.
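One common way to control what sensitive data reaches test environments is deterministic masking of selected fields before data leaves production. This is only a sketch of the idea, with an invented salt, field list and record; real deployments use purpose-built masking tools rather than hand-rolled hashing.

```python
import hashlib

SALT = "demo-salt"              # illustrative; would be a managed secret
SENSITIVE = {"ssn", "email"}    # fields to mask before leaving production

def mask(value):
    """Deterministic, irreversible token: same input -> same token."""
    digest = hashlib.sha256((SALT + value).encode()).hexdigest()
    return digest[:12]

def mask_record(rec):
    return {k: (mask(v) if k in SENSITIVE else v) for k, v in rec.items()}

prod = {"name": "Ann Lee", "ssn": "123-45-6789", "email": "ann@example.com"}
test_copy = mask_record(prod)
print(test_copy["name"], "ssn masked:", test_copy["ssn"] != prod["ssn"])
```

Determinism is the useful property here: masked values still join consistently across tables, so test data stays referentially intact while a breach of the test environment exposes no real identifiers.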
Clearly, COBIT 5 suggests that CIOs cannot completely divorce themselves from data governance. Yes, CIOs are data custodians but there are clear and specific tasks that the CIO and their staff must uniquely take on. Otherwise, a good foundation for data governance cannot be established.
Data Governance, the art of being Regulation Ready, is about a lot of things, but one thing is clear: it’s NOT just about the technology. Have you ever been in one of those meetings, probably more than a few, where committees and virtual teams discuss the latest corporate initiatives? You know, the ones where you want to dip your face in lava and run into the ocean? Because at the end of the meeting, everyone goes back to their day jobs and nothing changes.
Now comes a new law or regulation from the governing body du jour. There are common threads to each and every regulation related to data. Laws like HIPAA even had entire sections dedicated to the types of filing cabinets required in the office to protect healthcare data. The same is true of regulations like BCBS 239, CCAR reporting and Solvency II. The laws ask: what are you reporting, how did you get that data, where has it been, what does this data mean and who has touched it? Virtually all of the regulations dealing with data contain those elements.
So it behooves an organization to be Regulation Ready. This means those committees and virtual teams need to be driving cultural and process change. It’s not just about the technology; it’s as much about people and processes. Every role in the organization, from the developer to the business executive, should embed the concepts of data governance in their daily work. From the time a developer or architect builds a new system, they need to document and define everything and every piece of data. It reminds me of my days writing code and remembering to comment each code block. The business executive, likewise, shares business rules and definitions from the top so they can be integrated into the systems that eventually have to report on them.
Finally, the processes that support a data governance program are augmented by the technology. It may seem sufficient to document systems in spreadsheets and documents, but those are increasingly error-prone and, in the end, not reliable in an audit.
Informatica is the market leader in data management infrastructure for being Regulation Ready. This means everything from data movement and quality to definitions and security. Because at the end of the day, once you have the people culturally integrated and the processes supporting the data workload, a centralized, high-performance and feature-rich technology needs to be in place to complete the trifecta. Informatica is pleased to offer the industry this leading technology as part of a comprehensive data governance foundation.
Informatica will be sharing this vision at the upcoming Annual FIMA 2015 Conference in Boston from March 30 to April 1. Come and visit Informatica at FIMA 2015 in Booth #3.
I recently got to talk to several senior IT leaders about their views on information governance and analytics. Participating were a telecom company, a government transportation entity, a consulting company, and a major retailer. Each shared openly in what was a free flow of ideas.
The CEO and Corporate Culture is critical to driving a fact based culture
I started this discussion by sharing the COBIT Information Life Cycle. Everyone agreed that the starting point for information governance needs to be business strategy and business processes. However, this prompted an extremely interesting discussion about enterprise analytics readiness. Most said that they are in the midst of leading the proverbial horse to water; in this case the horse is the business. The CIO in the group said that he personally is all about the data and making factual decisions, but his business is not really there yet. I asked everyone at this point about the importance of culture and the CEO. Everyone agreed that the CEO is incredibly important in driving a fact-based culture. Apparently, leaders like the new CEO of Target are in the vanguard, not yet the mainstream.
KPIs need to be business drivers
The above CIO said that too many of his managers are operationally, day-to-day focused and don’t understand the value of analytics or of predictive analytics. This CIO said that he needs to teach the business to think analytically, to understand how analytics can help drive the business, and to use Key Performance Indicators (KPIs). The enterprise architect in the group shared at this point that he had previously worked for a major healthcare organization. When the organization was asked to determine a list of KPIs, they came back with 168. Obviously, this could not work, so he explained to the business that an effective KPI must be a “driver of performance”. He stressed to the healthcare organization’s leadership the importance of having fewer KPIs, built around business capabilities and performance drivers.
IT increasingly needs to understand its customers’ business models
I shared at this point that I had visited a major Italian bank a few years ago, where the key leadership had high-definition displays that would cycle to a new analytic every five minutes. Everyone laughed at the absurdity of having so many KPIs. With that said, everyone felt they needed business buy-in, because only the business can derive value from acting upon the data. According to this group of IT leaders, this is pushing them more and more to understand their customers’ business models.
Others said that they were trying to create an omni-channel view of customers. The retailer wanted to get more predictive. Theodore Levitt said the job of marketing is to create and keep a customer; this retailer is focused on keeping customers and bringing them back more often. They want to give customers offers that use customer data to increase sales, much like what I described recently was happening at 58.com, eBay, and Facebook.
Most say they have limited governance maturity
We talked about where people are in their governance maturity. Even though I wanted to gloss over this topic, the group wanted to spend time here and compare notes. Most said that they were at stage 2 or 3 in a five-stage governance maturity process. One CIO quipped, does anyone ever get to level 5? Like analytics, governance was being pushed forward by IT rather than the business. Nevertheless, everyone said that they are working to get data stewards defined for each business function. At this point, I asked about the elements that COBIT 5 suggests go into good governance. I shared that it should include the following four elements: 1) clear information ownership; 2) timely, correct information; 3) clear enterprise architecture and efficiency; and 4) compliance and security. Everyone felt the definition was fine but wanted specifics for each element. I referred them, and you, to my recent article in COBIT Focus.
CIOs say they are the custodians of data only
At this point, one of the CIOs said something incredibly insightful: we are not data stewards. This has to be done by the business; IT is the custodian of the data. More specifically, IT should not manage data but should make sure that what the business needs done gets done with data. Everyone agreed with this point and even reused the term "data custodians" several times during the next few minutes. Debbie Lew of ISACA said the same thing just last week. According to her, “IT does not own the data. They facilitate the data”. From here, the discussion moved to security and data privacy. The retailer in the group was extremely concerned about privacy and felt that they needed masking and other data-level technologies to ensure a breach minimally impacts their customers. At this point, another IT leader in the group said that it is the job of IT leadership to make sure the business does the right things in security and compliance. I shared here that one of my CIO friends had said that “the CIOs at the retailers with breaches weren’t stupid—it is just hard to sell the business impact”. The CIO in the group said we need to do risk assessments, also a big thing for COBIT 5, that get the business to say we have to invest to protect. “It is IT’s job to adequately explain the business risk”.
Is mobility a driver of better governance and analytics?
Several shared towards the end of the evening that mobility is an increasing impetus for better information governance and analytics. Mobility is driving business users and business customers to demand better information and, thereby, better governance of information. Many said that a starting point for providing better information is data mastering. These attendees also felt that data governance involves helping the business determine its relevant business capabilities and business processes. It seems these should come naturally, but once again, IT for these organizations seems to be pushing the business across the finish line.
Let’s face it, building a data governance program is no overnight task. As one CDO puts it: ”data governance is a marathon, not a sprint”. Why? Because data governance is a complex business function encompassing technology, people and process, all of which have to work together effectively for the initiative to succeed. Because of its scope, a data governance program often calls for participants from different business units within an organization, and it can be disruptive at first.
Why bother, then, given that data governance is complex, disruptive, and could introduce additional cost to a company? Well, the drivers for data governance vary across organizations. Let’s take a close look at some of the motivations behind a data governance program.
For companies in heavily regulated industries, establishing a formal data governance program is a mandate. When a company is not compliant, the consequences can be severe: hefty fines, brand damage, lost revenue, and even potential jail time for the person held accountable for the noncompliance. To meet ongoing regulatory requirements and adhere to data security policies and standards, companies need clean, connected and trusted data that enables transparency and auditability in their reporting, satisfies mandatory requirements and answers critical questions from auditors. Without a dedicated data governance program in place, the compliance initiative can become an ongoing nightmare for companies in regulated industries.
A data governance program can also be established to support a customer-centricity initiative. To make effective cross-sells and up-sells to your customers and grow your business, you need clear visibility into customer purchasing behaviors across multiple shopping channels and touch points. Customers’ shopping behaviors and attributes are captured in the data; therefore, to gain a thorough understanding of your customers and boost your sales, a holistic data governance program is essential.
Other reasons for companies to start a data governance program include improving efficiency, reducing operational cost, supporting better analytics and driving more innovation. As long as the area is business-critical, data is at the core of the process, and the business case is sound, there is a compelling reason to launch a data governance program.
Now that we have identified the drivers for data governance, how do we start? This rather loaded question gets into the details of implementation. A few critical elements come into play, including identifying and establishing task forces such as the steering committee, data governance team and business sponsors; defining roles and responsibilities for the stakeholders involved in the program; and defining metrics for tracking results. And soon you will find that, on top of everything, communication, communication and more communication is probably the most important tactic of all for driving the initial success of the program.
A rule of thumb? Start small, take one step at a time and focus on producing something tangible.
Sounds easy, right? Well, let’s hear what the real-world practitioners have to say. Join us at this Informatica webinar to hear Michael Wodzinski, Director of Information Architecture, Lisa Bemis, Director of Master Data, Fabian Torres, Director of Project Management from Houghton Mifflin Harcourt, global leader in publishing, as well as David Lyle, VP of product strategy from Informatica to discuss how to implement a successful data governance practice that brings business impact to an enterprise organization.
If you are currently kicking the tires on setting up a data governance practice in your organization, I’d like to invite you to visit a member-only website dedicated to Data Governance: http://governyourdata.com/. This site currently has over 1,000 members and is designed to foster open communication on everything data governance. There you will find conversations on best practices, methodologies, frameworks, tools and metrics. I would also encourage you to take a data governance maturity assessment to see where you currently stand on the data governance maturity curve and compare the result against industry benchmarks. More than 200 members have taken the assessment to better understand their current data governance programs, so why not give it a shot?
Data governance is a journey, likely a never-ending one. We wish you the best of luck on this effort and a joyful ride! We’d love to hear your stories.
2014 was a pivotal year for Informatica as our investments in Hadoop and efforts to innovate in big data gathered momentum and became a core part of Informatica’s business. Our Hadoop-related big data revenue growth was in the ballpark of the leading Hadoop startups’, more than doubling over 2013.
In 2014, Informatica reached about 100 enterprise customers of our big data products, with an increasing number going into production with Informatica together with Hadoop and other big data technologies. Informatica’s big data Hadoop customers include companies in financial services, insurance, telecommunications, technology, energy, life sciences, healthcare and business services. These innovative companies are leveraging Informatica to accelerate their time to production and drive greater value from their big data investments.
These customers are in-production or implementing a wide range of use cases leveraging Informatica’s great data pipeline capabilities to better put the scale, efficiency and flexibility of Hadoop to work. Many Hadoop customers start by optimizing their data warehouse environments by moving data storage, profiling, integration and cleansing to Hadoop in order to free up capacity in their traditional analytics data warehousing systems. Customers that are further along in their big data journeys have expanded to use Informatica on Hadoop for exploratory analytics of new data types, 360 degree customer analytics, fraud detection, predictive maintenance, and analysis of massive amounts of Internet of Things machine data for optimization of energy exploration, manufacturing processes, network data, security and other large scale systems initiatives.
2014 was not just a year of market momentum for Informatica, but also one of new product development innovations. We shipped enhanced functionality for entity matching and relationship building at Hadoop scale (a key part of Master Data Management), end-to-end data lineage through Hadoop, as well as high performance real-time streaming of data into Hadoop. We also launched connectors to NoSQL and analytics databases including Datastax Cassandra, MongoDB and Amazon Redshift. Informatica advanced our capabilities to curate great data for self-serve analytics with a connector to output Tableau’s data format and launched our self-service data preparation solution, Informatica Rev.
Customers can now quickly try out Informatica on Hadoop by downloading the free trials for the Big Data Edition and Vibe Data Stream that we launched in 2014. Now that Informatica supports all five of the leading Hadoop distributions, customers can build their data pipelines on Informatica with confidence that no matter how the underlying Hadoop technologies evolve, their Informatica mappings will run. Informatica provides highly scalable data processing engines that run natively in Hadoop and leverage the best of open source innovations such as YARN, MapReduce, and more. Abstracting data pipeline mappings from the underlying Hadoop technologies combined with visual tools enabling team collaboration empowers large organizations to put Hadoop into production with confidence.
As we look ahead into 2015, we have ambitious plans to continue to expand and evolve our product capabilities with enhanced productivity to help customers rapidly get more value from their data in Hadoop. Stay tuned for announcements throughout the year.
Try some of Informatica’s products for Hadoop on the Informatica Marketplace here.
Achieving and maintaining a single, semantically consistent version of master data is crucial for every organization. As many companies move from an account- or product-centric approach to a customer-centric model, master data management is becoming an important part of their enterprise data management strategy. MDM provides the clean, consistent and connected information your organization needs to:
- Empower customer facing teams to capitalize on cross-sell and up-sell opportunities
- Create trusted information to improve employee productivity
- Be agile with data management so you can make confident decisions in a fast changing business landscape
- Improve information governance and be compliant with regulations
But there are challenges ahead for organizations. As Andrew White of Gartner very aptly wrote in a blog post, we are only half pregnant with Master Data Management. In his post, Andrew talked about the increasing number of inquiries he gets from organizations that are making some pretty simple mistakes in their approach to MDM without realizing the long-run impact of those decisions.
Over the last 10 years, I have seen many organizations struggle to implement MDM the right way. A few MDM implementations have failed, and many have taken more time and incurred more cost before showing value.
So, what is the secret sauce?
A key factor in a successful MDM implementation lies in mapping your business objectives to the features and functionality offered by the product you are selecting. It is a phase where you ask the right questions and get them answered. There are a few great ways for organizations to get this done, and talking to analysts is one of them. Another is to attend MDM-focused events that allow you to talk to experts, learn from other customers’ experiences and hear about best practices.
We at Informatica have been working hard to deliver a flexible MDM platform that provides complete capabilities out of the box. But the MDM journey is about more than technology and product features, as we have learnt over the years. To ensure our customers’ success, we are sharing the knowledge and best practices we have gained from hundreds of successful MDM and PIM implementations. Informatica MDM Day is a great opportunity for organizations, where we will –
- Share best practices and demonstrate our latest features and functionality
- Show product capabilities that will address your current and future master data challenges
- Provide you the opportunity to learn from other customers’ MDM and PIM journeys
- Share knowledge about MDM powered applications that can help you realize early benefits
- Share our product roadmap and our vision
- Provide you an opportunity to network with other like-minded MDM and PIM experts and practitioners
So, join us by registering today for our MDM Day event in New York on February 24th. We are excited to see you there and to walk with you toward MDM Nirvana.
Informatica users leveraging HDP are now able to see a complete end-to-end visual data lineage map of everything done through the Informatica platform. In this blog post, Scott Hedrick, director Big Data Partnerships at Informatica, tells us more about end-to-end visual data lineage.
Hadoop adoption continues to accelerate within mainstream enterprise IT and, as always, organizations need the ability to govern their end-to-end data pipelines for compliance and visibility purposes. Working with Hortonworks, Informatica has extended the metadata management capabilities in Informatica Big Data Governance Edition to include data lineage visibility of data movement, transformation and cleansing beyond traditional systems to cover Apache Hadoop.
Informatica users are now able to see a complete end-to-end visual data lineage map of everything done through Informatica. This includes sources outside the Hortonworks Data Platform (HDP) being loaded into HDP; all data integration, parsing and data quality transformations running on Hortonworks; and the loading of curated data sets into data warehouses, analytics tools and operational systems outside Hadoop.
Regulated industries such as banking, insurance and healthcare are required to have detailed histories of data management for audit purposes. Without tools to provide data lineage, compliance with regulations and gathering the required information for audits can prove challenging.
With Informatica, the data scientist and analyst can now visualize data lineage and detailed history of data transformations providing unprecedented transparency into their data analysis. They can be more confident in their findings based on this visibility into the origins and quality of the data they are working with to create valuable insights for their organizations. Web-based access to visual data lineage for analysts also facilitates team collaboration on challenging and evolving data analytics and operational system projects.
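To make the idea of an end-to-end lineage map concrete, here is a minimal sketch that models lineage as a directed graph of data-movement steps and walks it backwards to find every origin feeding a curated data set. The dataset names and graph structure are hypothetical illustrations, not Informatica's actual metadata model or API.

```python
# Minimal sketch: model an end-to-end lineage map as a directed graph of
# (upstream, downstream) edges and trace a curated data set back to its
# origins. Dataset names here are hypothetical examples.
from collections import defaultdict

edges = [
    ("crm_db.customers", "hdp.raw_customers"),             # load into HDP
    ("hdp.raw_customers", "hdp.cleansed_customers"),       # data quality step on Hadoop
    ("hdp.cleansed_customers", "warehouse.dim_customer"),  # curated load to the warehouse
]

def upstream_sources(target, edges):
    """Walk the lineage graph backwards to find every origin feeding `target`."""
    parents = defaultdict(list)
    for up, down in edges:
        parents[down].append(up)
    sources, stack, seen = set(), [target], set()
    while stack:
        node = stack.pop()
        for up in parents.get(node, []):
            if up in seen:
                continue
            seen.add(up)
            stack.append(up)
            if up not in parents:  # nothing feeds it, so it is an origin
                sources.add(up)
    return sources

print(upstream_sources("warehouse.dim_customer", edges))
# -> {'crm_db.customers'}
```

A governance tool would of course persist this metadata and render it visually; the point is only that lineage questions like "where did this table come from?" reduce to graph traversal over recorded data-movement steps.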
The Informatica and Hortonworks partnership brings together leading enterprise data governance tools with open source Hadoop leadership to extend governance to this new platform. Deploying Informatica for data integration, parsing, data quality and data lineage on Hortonworks reduces risk to deployment schedules.
A demo of Informatica’s end-to-end metadata management capabilities on Hadoop and beyond is available here:
- A free trial of Informatica Big Data Edition in the Hortonworks Sandbox is available here.
As Valentine’s Day approaches and retailers and restaurants prepare to sell millions of cards, teddy bears and bottles of champagne, and for the lucky few, some expensive jewels, I started to think about my love affair with data and the many ups and downs we have had over the years!
Our first date together was arranged by a third party, and everything I was told was from their perspective. I had many questions: could I trust data, was I getting the complete picture from the third party, would we be compatible and ultimately “fit for purpose,” or would data break my heart?
As we shared information, we were both apprehensive. Not everything was fitting together, there were gaps in data’s story, and I just could not make an informed decision. This led to mistrust between the two of us. I started to ask other friends and associates for their information and tried to reconcile it with my view of data. I wanted it to work, but what could I do?
A close friend, Stewart, recommended I get some professional advice to help with my issues with data and pointed me towards Doctor Rob, one of the leading authorities on data, specialising in data governance.
The first bit of advice Doctor Rob gave me was this: it should never have been about data alone. The dream must be about your long-term goals together, your commitment to get it right, and your interactions with others in your circle of friends and dependents.
The second piece of advice was to decide what roles and responsibilities each of us would take on in the relationship, and to evaluate whether we have the right skills or need external support or training to succeed.
While we are still on our journey together, data and I are now in a long-term committed relationship and look forward to many years on Cloud 9.
Now all I have to decide is will I go to Tiffany’s or Claire’s for that piece of jewellery!
The signs that healthcare is becoming a more consumer-driven (think patients, members and providers) industry are evident all around us. I see provider and payer organizations clamoring for more data, specifically data that is actionable, relatable and has integrity. Armed with this data, healthcare organizations are able to differentiate around a member/patient-centric view.
These consumer-centric views convey the total value of the relationships healthcare organizations have with consumers. Understanding the total value creates a more comprehensive understanding of consumers because it delivers a complete picture of an individual’s critical relationships, including patient to primary care provider, member to household, provider to network and even member to legacy plan. This is the type of knowledge that informs new treatments, targets preventative care programs and improves outcomes.
Payer organizations are collecting and analyzing data to identify opportunities for more informed care management and segmentation to reach new, high-value customers in individual markets. By segmenting and targeting messaging to specific populations, health plans generate increased member satisfaction and cost-effectively expand and manage provider networks.
How will they accomplish this? Enabling members to interact in health and wellness forums, analyzing member behavior and trends, and informing care management programs with a 360-degree view of members, to name a few. Payers will also drive new member programs, member retention, and member engagement marketing and sales programs by investigating complete views of member households and market segments.
In the provider space, this relationship building can be a little more challenging because consumers, as patients, often do not interact with their doctor unless they are sick, creating gaps in data. When provider organizations have a better understanding of their patients and providers, they can increase patient satisfaction and proactively offer preventative care to the sickest (and most likely to engage) patients before an episode occurs. These activities result in increased market share and improved outcomes.
Where can providers start? By creating a 360-degree view of the patient, organizations can improve care coordination, open new patient service centers and develop patient engagement programs.
Analyzing populations of patients, fostering patient engagement based on Meaningful Use or Accountable Care requirements, building out referral networks and developing physician relationships are essential ingredients in consumer engagement. Knowing your patients and providing a better patient experience than your competition will differentiate provider organizations.
You may say, “This all sounds great, but how does it work?” An essential ingredient is clean, safe and connected data, which requires an investment in data as an asset. Just as you invest in real estate and human capital, you must invest in the accessibility and quality of your data. To be successful, arm your team with tools to govern data: tools that ensure the ongoing integrity and quality of data, remove duplicate records and dynamically incorporate data validation and quality rules. These tools, which include master data management, data quality and metadata management, focus on information quality and support a total customer relationship view of members, patients and providers.
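To give a flavor of one data-quality task mentioned above, here is a minimal sketch of flagging likely duplicate member records with fuzzy name matching. The field names, the ZIP-code blocking key and the similarity threshold are assumptions for the example; a real MDM matching engine uses far more sophisticated, configurable match rules.

```python
# Illustrative sketch of duplicate-record detection: compare records that
# share a blocking key (ZIP code) and score name similarity. Thresholds
# and field names are assumptions, not a real MDM product's match rules.
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "Jonathan Smith", "zip": "10001"},
    {"id": 2, "name": "Jon Smith",      "zip": "10001"},
    {"id": 3, "name": "Maria Garcia",   "zip": "94105"},
]

def likely_duplicates(records, threshold=0.75):
    """Pair up records in the same ZIP whose names are similar enough."""
    pairs = []
    for i, a in enumerate(records):
        for b in records[i + 1:]:
            if a["zip"] != b["zip"]:
                continue  # blocking key: skip cross-ZIP comparisons
            score = SequenceMatcher(None, a["name"].lower(),
                                    b["name"].lower()).ratio()
            if score >= threshold:
                pairs.append((a["id"], b["id"], round(score, 2)))
    return pairs

print(likely_duplicates(records))
# -> [(1, 2, 0.78)]
```

Records flagged this way would then feed a steward review or survivorship step, which is where the governance tooling described above earns its keep.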