Category Archives: Healthcare
It is probable that all of the information on a member is stored in several different systems, so getting the complete picture can be difficult. In addition, controlling access to this information is an important part of any organization’s overall strategy. And finally, data assets become more valuable the more you use them. If three divisions of an organization all share information about their interactions with a customer, the organization as a whole is better able to serve the customer, at lower cost and with higher customer satisfaction.
Data governance is used by organizations to exercise control over processes and methods used by their data stewards and data custodians in order to improve data quality. Data governance is a control that ensures that the data entry by an operations team member or by an automated process meets precise standards, such as a business rule, a data definition and data integrity constraints in the data model. A data governor uses data quality monitoring against production data to communicate errors in data back to operational team members, or to the technical support team, for corrective action.
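The monitoring loop described above can be sketched in a few lines. This is an illustrative stand-alone example, not any vendor’s product: the field names (`member_id`, `zip_code`, `dob`) and the rules themselves are hypothetical business rules.

```python
# Hypothetical data quality monitoring sketch: each record is checked
# against business rules (data definitions and integrity constraints),
# and failures are collected for corrective action by the operations or
# technical support team. Field names and rules are illustrative only.
import re

RULES = {
    "member_id": lambda v: bool(re.fullmatch(r"M\d{7}", v or "")),
    "zip_code":  lambda v: bool(re.fullmatch(r"\d{5}(-\d{4})?", v or "")),
    "dob":       lambda v: bool(re.fullmatch(r"\d{4}-\d{2}-\d{2}", v or "")),
}

def audit(records):
    """Return a list of (record index, field, offending value) violations."""
    violations = []
    for i, rec in enumerate(records):
        for field, rule in RULES.items():
            if not rule(rec.get(field)):
                violations.append((i, field, rec.get(field)))
    return violations

records = [
    {"member_id": "M1234567", "zip_code": "04849", "dob": "1970-01-31"},
    {"member_id": "12345",    "zip_code": "4849",  "dob": "1970-01-31"},
]
print(audit(records))  # two violations, both in the second record
```

A data governor would run a check like this against production data on a schedule and route the violation report back to the owning team.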
How far along in the Data Governance journey is your organization?
- Is your organization currently unaware of Data Governance?
There is minimal focus on data quality or security, data isn’t prioritized in any meaningful or actionable way, there is no measurement around data governance and it isn’t managed.
- Is your organization in the initial phases of Data Governance?
Data Governance is primarily grassroots, driven by a few passionate individuals. Rules are implemented in an ad hoc fashion, and policies or standards are part of the functional requirements of an IT project, which is only considered successful if the IT release is successful.
- Is Data Governance at your organization repeatable?
For these organizations, data governance is still grassroots, but it is gaining attention at the IT management level. There are documented IT governance policies and standards driving metadata reuse and improved collaboration across IT projects. Success is measured primarily on improved IT efficiencies. This is typically managed through a pilot project.
- Defined Data Governance
This is led primarily by senior IT through the adoption of competency centers and centers of excellence. Project leadership comes primarily from IT, but there is business involvement. Success is measured on operational metrics at a project level.
- Data Governance that is Managed
The Data Governance program is sponsored by business leaders and initiated as part of a broader strategic enterprise information management program. Data Governance lives through multi-phase, multi-year efforts and is measured based on the success of the program.
- Optimized Data Governance
There is top-level executive sponsorship and support. Data governance is embraced as a self-sustaining core business function managing data as a corporate asset. Success is measured on the total impact to the business, not just confined to specific programs or strategies.
There is a fantastic site (http://governyourdata.com/ ) that is an open peer-to-peer community of data governance practitioners, evangelists, thought leaders, bloggers, analysts and vendors. The goal of the governyourdata community is to share best practices, methodologies, frameworks, education, and other tools to help data governance leaders succeed in their efforts.
One of our customers, UPMC has a great blog post on their implementation of a Data Governance council and the challenges they faced making it a priority in their organization.
To figure out where your organization stands on the continuum of data governance maturity, there is a Data Governance Maturity Assessment Tool on the governyourdata.com site. A maturity assessment level-sets current gaps and strengths and paves the way for defining a successful strategy. The process of assessing an organization’s maturity should include interviews of relevant business and IT staff, business risk surveys, business analyst time and activity analysis, and other techniques. Once your assessment is completed, you can identify the steps you need to plan in order to develop an Optimized Data Governance approach for your organization. Where does your organization stand?
If you follow me on LinkedIn then you already know that there is no place I would rather be than in front of a client – virtually or in person. There is simply nothing that energizes me more than gathering insights from client advocates. With this said, it will be no surprise that Informatica World makes me giddy, like a kid in a candy store – over 1500 clients telling their stories and sharing valuable lessons learned.
For healthcare alone, over a dozen payer and provider organizations have volunteered to share their use cases, their stories and their lessons learned. The array of brands represented is second to none, including Kaiser, UPMC, Cleveland Clinic and Humana.
Beyond sessions, clients ask for more opportunities to network with peers and get hands on with the next releases of products and we listen!
- Healthcare cocktail reception Tuesday evening
- Healthcare Industry breakfast Thursday morning
- Hands on Labs with industry specific content
- Partner technology showcase
A complete list of healthcare sessions, plus a few hot topic sessions, is below. I look forward to seeing you in Las Vegas next week!
- PII – Personally Identifiable Information – any data that could potentially identify a specific individual. Any information that can be used to distinguish one person from another and can be used for de-anonymizing anonymous data can be considered PII
- GSA’s Rules of Behavior for Handling Personally Identifiable Information – This directive provides GSA’s policy on how to properly handle PII and the consequences and corrective actions that will be taken if a breach occurs
- PHI – Protected Health Information – any information about health status, provision of health care, or payment for health care that can be linked to a specific individual
- HIPAA Privacy Rule – The HIPAA Privacy Rule establishes national standards to protect individuals’ medical records and other personal health information and applies to health plans, health care clearinghouses, and those health care providers that conduct certain health care transactions electronically. The Rule requires appropriate safeguards to protect the privacy of personal health information, and sets limits and conditions on the uses and disclosures that may be made of such information without patient authorization. The Rule also gives patients rights over their health information, including rights to examine and obtain a copy of their health records, and to request corrections.
- Encryption – a method of protecting data by scrambling it into an unreadable form. It is a systematic encoding process which is only reversible with the right key.
- Tokenization – a method of replacing sensitive data with non-sensitive placeholder tokens. These tokens are swapped with data stored in relational databases and files.
- Data masking – a process that scrambles data, either an entire database or a subset. Unlike encryption, masking is not reversible; unlike tokenization, masked data is useful for limited purposes. There are several types of data masking:
- Static data masking (SDM) masks data in advance of using it; non-production databases are masked ahead of time, not in real time
- Dynamic data masking (DDM) masks production data in real time
- Data Redaction – masks unstructured content (PDF, Word, Excel)
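To make the reversibility differences concrete, here is a toy sketch contrasting tokenization and masking. Encryption, which is reversible only with the right key, would use a vetted library such as `cryptography` rather than hand-rolled code, so it is left as a comment. The token vault and the format-preserving masking function below are illustrative stand-ins, not production techniques.

```python
# Toy contrast of tokenization vs masking on one SSN-like value.
# (Encryption would be: ciphertext = encrypt(key, value); reversible
# only by holders of the key. Use a real crypto library for that.)
import hashlib
import secrets

_vault = {}  # token vault: token -> original value

def tokenize(value):
    """Tokenization: replace the value with a random placeholder token.
    Reversible, but only via a lookup in the protected vault."""
    token = "TKN-" + secrets.token_hex(4)
    _vault[token] = value
    return token

def detokenize(token):
    return _vault[token]

def mask(value):
    """Masking: format-preserving, deterministic, and NOT reversible.
    Each digit is replaced using a one-way hash of the whole value."""
    digest = hashlib.sha256(value.encode()).hexdigest()
    return "".join(
        str(int(digest[i % len(digest)], 16) % 10) if ch.isdigit() else ch
        for i, ch in enumerate(value)
    )

ssn = "123-45-6789"
token = tokenize(ssn)
assert detokenize(token) == ssn        # tokenization: recoverable via the vault
masked = mask(ssn)
assert len(masked) == len(ssn)         # masking: format and length preserved
assert masked[3] == "-" and masked[6] == "-"
```

The masked value looks like a real SSN and can flow into test systems, while the token is only useful to systems allowed to consult the vault.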
Each of the three methods for protecting data (encryption, tokenization and data masking) has different benefits and solves different security issues. We’ll address them in a bit. For a visual representation of the three methods, please see the table below:
For protecting PHI data – encryption is superior to tokenization. You encrypt different portions of personal healthcare data under different encryption keys. Only those with the requisite keys can see the data. This form of encryption requires advanced application support to manage the different data sets to be viewed or updated by different audiences. The key management service must be very scalable to handle even a modest community of users. Record management is particularly complicated. Encryption works better than tokenization for PHI – but it does not scale well.
Properly deployed, encryption is a perfectly suitable tool for protecting PII. It can be set up to protect archived data or data residing on file systems without modification to business processes.
- You must install encryption and key management services to protect the data – this only protects the data from access that circumvents applications
- You can add application layer encryption to protect data in use
- This requires changing applications and databases to support the additional protection
- You will pay the cost of modification and the performance of the application will be impacted
For tokenization of PHI – there are many pieces of data which must be bundled up in different ways for many different audiences. Using the tokenized data requires it to be de-tokenized (which usually includes a decryption process). This introduces an overhead to the process. A person’s medical history is a combination of medical attributes, doctor visits, outsourced visits. It is an entangled set of personal, financial, and medical data. Different groups need access to different subsets. Each audience needs a different slice of the data – but must not see the rest of it. You need to issue a different token for each and every audience. You will need a very sophisticated token management and tracking system to divide up the data, issuing and tracking different tokens for each audience.
Masking can scramble individual data columns in different ways so that the masked data looks like the original (retaining its format and data type) but is no longer sensitive. Masking is effective for maintaining aggregate values across an entire database, enabling preservation of sum and average values within a data set while changing all the individual data elements. Masking plus encryption provides a powerful combination for the distribution and sharing of medical information.
Traditionally, data masking has been viewed as a technique for solving a test data problem. The December 2014 Gartner Magic Quadrant Report on Data Masking Technology extends the scope of data masking to more broadly include data de-identification in production, non-production, and analytic use cases. The challenge is to do this while retaining business value in the information for consumption and use.
Masked data should be realistic and quasi-real. It should satisfy the same business rules as real data. It is very common to use masked data in test and development environments as the data looks like “real” data, but doesn’t contain any sensitive information.
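One simple way to see how masking can preserve aggregates is column shuffling: each record receives someone else’s value, so no individual row is real, yet the column’s sum and average are untouched. The claims data below is made up purely for illustration.

```python
# Toy aggregate-preserving masking via column shuffling: individual
# values move between records, but sum and average are unchanged.
import random

def shuffle_mask(rows, column, seed=42):
    """Return a copy of rows with `column` values randomly permuted."""
    values = [r[column] for r in rows]
    rng = random.Random(seed)  # fixed seed only for repeatable demos
    rng.shuffle(values)
    return [dict(r, **{column: v}) for r, v in zip(rows, values)]

claims = [
    {"member": "A", "charge": 120.0},
    {"member": "B", "charge": 450.0},
    {"member": "C", "charge": 80.0},
]
masked = shuffle_mask(claims, "charge")
# Aggregates survive masking: totals and averages match the original.
assert sum(r["charge"] for r in masked) == sum(r["charge"] for r in claims)
```

Real masking products combine several such transformations (shuffling, substitution, format-preserving scrambling) per column, but the aggregate-preservation idea is the same.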
The U.S. Department of Health and Human Services declares that approximately 75 percent of the country’s eligible professionals and more than 91 percent of hospitals are on electronic health records certified for Stage 1 meaningful use. These applications along with many others are creating valuable electronic data that – when integrated, shared and analyzed – can propel transformational initiatives like accountable care, population health and risk based contracting.
Success should not be limited by technology given other industries have demonstrated that near real time sharing and analysis of sensitive data is quite possible. What, then, could hold healthcare back from success?
Informatica recently released the findings of a survey targeted at understanding the answer to this very question. Respondents revealed that while 85% of respondents are effective at putting financial data to use to inform decision making, far fewer are confident about putting data to use to inform patient engagement initiatives requiring access to external data and big data, which they note to be more challenging. Complex, cross-organizational transformational business processes require information sharing between applications, yet over 65% of respondents say data integration and data quality are significantly challenging. So healthcare organizations are collecting data, but many have yet to integrate these silos of application data to realize its full potential.
Similarly, the International Institute of Analytics, a little over a year ago, offered a view of the healthcare analytics maturity landscape based upon deep quantitative assessments of more than 20 healthcare provider organizations. This research uncovered two important facts:
- Hospitals and other healthcare provider organizations have gone to great effort to implement core components of an EMR, giving them access to large amounts of data on patients, processes and costs.
- Those data assets have not yet been put to their highest and best use.
This is not uncommon: across industries and firms of all shapes and sizes, many have confronted the question of how well they are leveraging data and analytics to transform their organizations. Those organizations that have made the most progress in revealing meaningful and differentiated insights are those that have intentionally built and funded enterprise information management or data management programs in support of analytics. These programs accelerate stakeholder access to trusted information when and where they need it.
Informatica worked with International Institute of Analytics to publish a new whitepaper that explores this issue, with detailed definitions for how IIA measures analytics maturity within healthcare as an independent third-party. This report looks at how provider organizations are approaching their investments in analytics, with a focus on essential attributes like data quality and management, leadership support, the culture of data, analytics talent, and an enterprise-wide approach to analytics. If you haven’t read it yet, I urge you to read the report.
If you’d like to talk about these ideas, visit Informatica at HIMSS15 in Booth #3056.
I live in a small town in Maine. Between my town and the surrounding three towns, there are seven Main Streets and three Annis Roads or Lanes (and don’t get me started on the number of Moose Trails). If your insurance company wants to market to or communicate with someone in my town or one of the surrounding towns, how can you ensure that the address that you are sending material to is correct? What is the cost if material is sent to an incorrect or outdated address? What is the cost to your insurance company if a provider sends the bill out to the wrong address?
How much is poor address quality costing your business? It doesn’t just impact marketing, where inaccurate address data translates into missed opportunity – it also means significant waste in materials, labor, time and postage. Bills may be delivered late or returned with sender unknown, meaning additional handling time, possible repackaging, additional postage costs (Address Correction Penalties) and the risk of customer service issues. When mail or packages don’t arrive, pressure on your customer support team increases and your company’s reputation can be negatively impacted. Bills and payments may arrive late or not at all, directly impacting your cash flow. Bad address data causes inefficiencies and raises costs across your entire organization.
The best method for handling address correction is through a validation and correction process:
When trying to standardize member or provider information, one of the first places to look is address data. If you can determine that the John Q Smith who lives at 134 Main St in Northport, Maine 04843 is the same John Q Smith who lives at 134 Maine Street in Lincolnville, Maine 04849, you have provided a link between two members that are probably considered distinct in your systems. Once you validate that there is no 134 Main St in Northport according to the postal service, and that 04849 is a valid ZIP code for Lincolnville, you can standardize your address format to something along the lines of: 134 MAIN ST LINCOLNVILLE,ME 04849. Now you have a consistent layout for all of your addresses that follows postal service standards. Each member now has a consistent address, which makes the next step of creating a golden record for each member that much simpler.
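The validate-and-standardize step described above might be sketched as follows. A real implementation would sit on certified postal reference data (USPS ZIP+4, CASS-certified tooling); the suffix table and the ZIP-to-city lookup here are tiny illustrative stubs, and fuzzy matching of spellings like “Main” vs “Maine” is a separate step not shown.

```python
# Minimal address standardization sketch: uppercase, abbreviate street
# suffixes per postal convention, and trust the ZIP code to resolve the
# city. Reference tables below are illustrative stubs, not real data.
SUFFIXES = {"STREET": "ST", "ROAD": "RD", "AVENUE": "AVE", "LANE": "LN"}
ZIP_TO_CITY = {"04849": "LINCOLNVILLE", "04843": "CAMDEN"}  # stub lookup

def standardize(street, city, state, zip_code):
    parts = [SUFFIXES.get(word, word) for word in street.upper().split()]
    # Prefer the city implied by the ZIP code over what was keyed in.
    city = ZIP_TO_CITY.get(zip_code, city.upper())
    return "{} {},{} {}".format(" ".join(parts), city, state.upper(), zip_code)

print(standardize("134 Main Street", "Northport", "ME", "04849"))
# 134 MAIN ST LINCOLNVILLE,ME 04849
```

Note how the keyed-in city “Northport” is corrected to the city the ZIP code actually belongs to, producing the consistent layout described above.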
Think about your current method of managing addresses. Likely, there are several different systems that capture addresses with different standards for what data is allowed into each field – and quite possibly these independent applications are not checking or validating against country postal standards. By improving the quality of address data, you are one step closer to creating high quality data that can provide the up-to-the minute accurate reporting your organization needs to succeed.
With the European Medicines Agency (EMA) date for compliance to IDMP (Identification of Medicinal Products) looming, Q1 2015 has seen a significant increase in IDMP activity. Both Informatica and HighPoint Solutions’ IDMP Round Table in January, and a February Marcus Evans conference in Berlin, provided excellent forums for sharing progress, thoughts and strategies. Additional confidential conversations with pharmaceutical companies show an increase in the number of approved and active projects, although some are still seeking full funding. The following paragraphs sum up the activity and trends that I have witnessed in the first three months of the year.
I’ll start with my favourite quote, which is from Dr. Jörg Stüben of Boehringer Ingelheim, who asked:
“Isn’t part of compliance being in control of your data?”
I like it because to me it is just the right balance of stating the obvious, and questioning the way the majority of pharmaceutical companies approach compliance: A report that has to be created and submitted. If a company is in control of their data, regulatory compliance would be easier and come at a lower cost. More importantly, the company itself would benefit from easy access to high quality data.
Dr. Stüben’s question was raised during his excellent presentation at the Marcus Evans conference. Not only did he question the status quo, but proposed an alternate way for IDMP compliance: Let Boehringer benefit from their investment in IDMP compliance. His approach can be summarised as follows:
- Embrace a holistic approach to being in control of data, i.e. adopt data governance practices.
- This is not about just compliance. Include optional attributes that will deliver value to the organisation if correctly managed.
- Get started by creating simple, clear work packages.
Although Dr Stüben did not outline his technical solution, it would include data quality tools and a product data hub.
At the same conference, Stefan Fischer Rivera and Stefan Brügger of Bayer and Guido Claes from Janssen Pharmaceuticals all came out strongly in favour of using a Master Data Management (MDM) approach to achieving compliance. Both companies have MDM technology and processes within their organisations, and realise the value an MDM approach can bring to achieving compliance in terms of data management and governance. Having Mr Claes express how well Informatica’s MDM and Data Quality solutions support his existing substance data management program made his presentation even more enjoyable to me.
Whilst the exact approaches of Bayer and Janssen differed, there were some common themes:
- Consider both the short term (compliance) and the long term (data governance) in the strategy
- Centralised MDM is ideal, but a federated approach is practical for July 2016
- High quality data should be available to a wide audience outside of IDMP compliance
The first and third bullet points map very closely to Dr. Stüben’s key points, and in fact show a clear trend in 2015:
IDMP Compliance is an opportunity to invest in your data management solutions and processes for the benefit of the entire organisation.
Although the EMA was not represented at the conference, Andrew Marr presented their approach to IDMP, and master data in general. The EMA is undergoing a system re-organisation to focus on managing Substance, Product, Organisation and Reference data centrally, rather than within each regulation or program as it is today. MDM will play a key role in managing this data, setting a high standard of data control and management for regulatory purposes. It appears that the EMA is also using IDMP to introduce better data management practice.
Depending on the size of the company, and the skills and tools available, other non-MDM approaches have been presented or discussed during the first part of 2015. These include using XML and SharePoint to manage product data. However, I share with others in the industry a primary concern about this approach: how well can you manage and control change using these tools? Some pharmaceutical companies have openly stated that data contributors often spend more time looking for data than doing their own jobs. An XML/SharePoint approach will do little to ease this burden, but an MDM approach will.
Despite the other approaches and solutions being discussed, there is another clear trend in Q1 2015:
MDM is becoming a favoured approach for IDMP compliance due to its strong governance, centralised attribute-level data management and ability to track changes.
Interestingly, the opportunity to invest in data management, and the rise of MDM as a favoured approach, has been backed up with research by Gens Associates. Messrs Gens and Brolund found a rapid increase during 2014 in investment in what they term Information Architecture, in which MDM plays a key role. IDMP is seen as a major driver for this investment. They go on to state that investment in master data management programs will allow a much easier and more cost-effective approach to data exchange (internally and externally), resulting in substantial benefits. Unfortunately they do not elaborate on these benefits, but I have placed a summary of the benefits of using MDM for IDMP compliance here.
In terms of active projects, the common compliance activities I have seen in the first quarter of 2015 are as follows:
- Most companies are in the discovery phase: identifying the effort for compliance
- Some are starting to make technology choices, and have submitted RFPs/RFQs
- Those furthest along in technology already have MDM programs or initiatives underway
- Despite getting a start, some are still lacking enough funding for achieving compliance
- Output from the discovery phase will in some cases be used to request full funding
- A significant number of projects have a goal to implement better data management practice throughout the company, with IDMP as the first release.
A final trend I have noticed in 2015 is regarding the magnitude of the compliance task ahead:
Those who have made the most progress are those who are most concerned about achieving compliance on time.
The implication is that the companies who are starting late do not yet realise the magnitude of the task ahead. It is not yet too late to comply and achieve long term benefits through better data management, despite only 15 months before the initial EMA deadline. Informatica has customers who have implemented MDM within 6 months. 15 months is achievable provided the project (or program) gets the focus and resources required.
IDMP compliance is a common challenge to all those in the pharmaceutical industry. Learning from others will help avoid common mistakes and provide tips on important topics. For example, how to secure funding and support from senior management is a common concern among those tasked with compliance. In order to encourage learning and networking, Informatica and HighPoint Solutions will be hosting our third IDMP roundtable in London on May 13th. Please do join us to share your experiences, and learn from the experiences of others.
In our house when we paint a room, my husband does the big rolling of the walls or ceiling, I do the cut-in work. I am good at prepping the room, taping all the trim and deliberately painting the corners. However, I am thrifty and constantly concerned that we won’t have enough paint to finish a room. My husband isn’t afraid to use enough paint and is extremely efficient at painting a wall in a single even coat. As a result, I don’t do the big rolling and he doesn’t do the cutting in. It took us awhile to figure this out, and a few rooms had to be repainted while we were figuring it out. Now we know what we are good at, and what we need help with.
Payers’ roles are changing. Payers were previously focused on risk assessment, setting and collecting premiums, analyzing claims and making payments – all while optimizing revenues. Payers are pretty good at selling to employers, figuring out the cost/benefit ratio from an employer’s perspective and ensuring a good, profitable product. With the advent of the Affordable Care Act, along with a much more transient insured population, payers now must focus more on the individual insured and be able to communicate with individuals in a more nimble manner than in the past.
Individual members will shop for insurance based on consumer feedback and price. They are interested in ease of enrollment and the ability to submit and substantiate claims quickly and intuitively. Payers are discovering that they need to help manage population health at an individual member level. And population health management requires less of a business-data analytics approach and more social media and gaming-style logic to understand patients. In this way, payers can help develop interventions to sustain behavioral changes for better health.
When designing such analytics, payers should consider the following key design steps:
- Extend data warehouses to an analytics appliance
- Invest in a big data platform to absorb patients’ social data
- Build predictive analytics for patient behavior
- Bridge collaborative and behavioral analytics with claims to build revenue and profitability
Due to payers’ mature predictive analytics competencies, they will have a much easier time in the next generation of population behavior analytics than their provider counterparts. Because clinical content is often unstructured compared to claims data, payers need to pay extra attention to context and semantics when deciphering clinical content submitted by providers. Payers can enlist vendors that help them understand unstructured data about individual members, and then use that data to create powerful predictive analytic solutions.
Healthcare and data have the makings of an epic love affair, but like most relationships, it’s not all roses. Data is playing a powerful role in finding a cure for cancer, informing cost reduction, targeting preventative treatments and engaging healthcare consumers in their own care. The downside? Data is needy. It requires investments in connectedness, cleanliness and safety to maximize its potential.
- Data is ubiquitous…connect it.
4400 times the amount of information held at the Library of Congress – that’s how much data Kaiser Permanente alone has generated from its electronic medical record. Kaiser successfully makes every piece of information about each patient available to clinicians, including patient health history, diagnosis by other providers, lab results and prescriptions. As a result, Kaiser has seen marked improvements in outcomes: 26% reduction in office visits per member and a 57% reduction in medication errors.
Ongoing value, however, requires continuous investment in data. Investments in data integration and data quality ensure that information from the EMR is integrated with other sources (think claims, social, billing, supply chain) so that clinicians and decision makers have access in the format they need. Without this, self-service intelligence can be inhibited by duplicate data, poor quality data or application silos.
- Data is popular…ensure it is clean.
Healthcare leaders can finally rely on electronic data to make strategic decisions. Here is a CHRISTUS Health anecdote you might relate to: in a weekly meeting, each executive reviews a strategic dashboard; these dashboards drive strategic decision making about CPOE (computerized physician order entry) adoption, emergency room wait times and price per procedure. Powered by enterprise information management, these dashboards paint a reliable and consistent view across the system’s 60 hospitals. Prior to the implementation of an enterprise data platform, each executive relied on their own set of data.
In the pre-data investment era, seemingly common data elements from different sources did not mean the same thing. For example, “Admit Date” in one report reflected the emergency department admission date whereas “Admit Date” in another report referred to the inpatient admission date.
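A lightweight remedy for this kind of collision is a canonical data dictionary that renames each source’s ambiguous column before it reaches a shared dashboard. The source and field names below are hypothetical, chosen only to mirror the “Admit Date” example.

```python
# Illustrative fix for the "Admit Date" collision: map each source's
# ambiguous column name to an unambiguous canonical data element.
# Source system names and field names are hypothetical.
CANONICAL = {
    ("ed_system", "admit_date"): "ed_admission_date",
    ("inpatient_system", "admit_date"): "inpatient_admission_date",
}

def to_canonical(source, record):
    """Rename known ambiguous fields; pass everything else through."""
    return {CANONICAL.get((source, k), k): v for k, v in record.items()}

ed = to_canonical("ed_system", {"admit_date": "2015-03-01", "mrn": "42"})
ip = to_canonical("inpatient_system", {"admit_date": "2015-03-02", "mrn": "42"})
assert "ed_admission_date" in ed and "inpatient_admission_date" in ip
```

Once both feeds use distinct canonical names, two reports can no longer silently mean different things by the same label.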
- Sharing data is necessary…make it safe.
To cure cancer, reduce costs and engage patients, care providers need access to data, and not just the data they generate; it has to be shared for coordination of care through transitions of care and across settings, i.e. home care, long term care and behavioral health. Fortunately, consumers and clinicians agree on this: PwC reports that 56% of consumers and 30% of physicians are comfortable with data sharing for care coordination. Further progress is demonstrated by healthcare organizations willingly adopting cloud-based applications – as of 2013, 40% of healthcare organizations were already storing protected health information (PHI) in the cloud.
Increased data access, however, carries risk, leaving health data exposed. The threat of data breach or hacking is multiplied by the presence (in many cases necessary) of PHI on employee laptops and the fact that providers are given increased access to PHI. The Ponemon Institute, a security research firm, estimates that data breaches cost the industry $5.6 billion each year. Investments in data-centric security are necessary to assuage fear, protect personal health data and make secure data sharing a reality.
Early improvements in patient outcomes indicate that the relationship between data and healthcare is a valuable investment. The International Institute of Analytics supports this, reporting that although analytics and data maturity across healthcare lags other industries, the opportunity to positively impact clinical and operational outcomes is significant.
The signs that healthcare is becoming a more consumer (think patients, members, providers) driven industry are evident all around us. I see provider and payer organizations clamoring for more data, specifically data that is actionable, relatable and has integrity. Armed with this data, healthcare organizations are able to differentiate around a member/patient-centric view.
These consumer-centric views convey the total value of the relationships healthcare organizations have with consumers. Understanding the total value creates a more comprehensive understanding of consumers because these views deliver a complete picture of an individual’s critical relationships, including patient to primary care provider, member to household, provider to network and even member to legacy plans. This is the type of knowledge that informs new treatments, targets preventative care programs and improves outcomes.
Payer organizations are collecting and analyzing data to identify opportunities for more informed care management and segmentation to reach new, high-value customers in individual markets. By segmenting and targeting messaging to specific populations, health plans generate increased member satisfaction and cost-effectively expand and manage provider networks.
How will they accomplish this? By enabling members to interact in health and wellness forums, analyzing member behavior and trends, and informing care management programs with a 360-degree view of members, to name a few. Payers will also drive new member programs, member retention and member engagement marketing and sales programs by investigating complete views of member households and market segments.
In the provider space, this relationship building can be a little more challenging because often consumers as patients do not interact with their doctor unless they are sick, creating gaps in data. When provider organizations have a better understanding of their patients and providers, they can increase patient satisfaction and proactively offer preventative care to the sickest (and most likely to engage) of patients before an episode occurs. These activities result in increased market share and improved outcomes.
Where can providers start? By creating a 360 view of the patient, organizations can now improve care coordination, open new patient service centers and develop patient engagement programs.
Analyzing populations of patients, and fostering patient engagement based on Meaningful Use requirements or Accountable Care requirements, building out referral networks and developing physician relationships are essential ingredients in consumer engagement. Knowing your patients and providing a better patient experience than your competition will differentiate provider organizations.
You may say, “This all sounds great, but how does it work?” An essential ingredient is clean, safe and connected data, which requires an investment in data as an asset. Just as you invest in real estate and human capital, you must invest in the accessibility and quality of your data. To be successful, arm your team with tools to govern data – ensuring ongoing integrity and quality of data, removing duplicate records and dynamically incorporating data validation/quality rules. These tools include master data management, data quality and metadata management, and are focused on information quality. Tools focused on information quality support a total customer relationship view of members, patients and providers.