Category Archives: Healthcare
I live in a small town in Maine. Between my town and the surrounding three towns, there are seven Main Streets and three Annis Roads or Lanes (and don’t get me started on the number of Moose Trails). If your insurance company wants to market to or communicate with someone in my town or one of the surrounding towns, how can you ensure that the address you are sending material to is correct? What is the cost if material is sent to an incorrect or outdated address? What is the cost to your insurance company if a provider sends the bill out to the wrong address?
How much is poor address quality costing your business? It doesn’t just impact marketing, where inaccurate address data translates into missed opportunity – it also means significant waste in materials, labor, time and postage. Bills may be delivered late or returned with sender unknown, meaning additional handling time, possible repackaging, additional postage costs (Address Correction Penalties) and the risk of customer service issues. When mail or packages don’t arrive, pressure on your customer support team can increase and your company’s reputation can be negatively impacted. Bills and payments may arrive late or not at all, directly impacting your cash flow. Bad address data causes inefficiencies and raises costs across your entire organization.
The best method for handling address correction is a validation and correction process.
When trying to standardize member or provider information, one of the first places to look is address data. If you can determine that the John Q Smith who lives at 134 Main St in Northport, Maine 04843 is the same John Q Smith who lives at 134 Maine Street in Lincolnville, Maine 04849, you have provided a link between two members that are probably considered distinct in your systems. Once you can verify that there is no 134 Main St in Northport according to the postal service, and that 04849 is a valid ZIP code for Lincolnville, you can standardize your address format to something along the lines of: 134 MAIN ST LINCOLNVILLE, ME 04849. Now you have a consistent layout for all of your addresses that follows postal service standards. Each member now has a consistent address, which will make the next step – creating a golden record for each member – that much simpler.
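The standardization step described above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration using a hand-rolled suffix table; a real implementation would validate street, city and ZIP against certified postal reference data (which is what catches errors like “Maine Street” when the street is really “Main St”):

```python
# A tiny subset of USPS-style street suffix abbreviations.
SUFFIXES = {"STREET": "ST", "ROAD": "RD", "LANE": "LN", "AVENUE": "AVE"}

def standardize(street: str, city: str, state: str, zip_code: str) -> str:
    """Format an address as a single postal-style line in upper case."""
    words = [SUFFIXES.get(w, w) for w in street.upper().split()]
    return f"{' '.join(words)} {city.upper()}, {state.upper()} {zip_code}"

print(standardize("134 Main Street", "Lincolnville", "me", "04849"))
# 134 MAIN ST LINCOLNVILLE, ME 04849
```

Once every system writes addresses through a single routine like this (backed by real postal validation), duplicate members such as the two John Q Smiths become far easier to link.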
Think about your current method of managing addresses. Likely, there are several different systems that capture addresses with different standards for what data is allowed into each field – and quite possibly these independent applications are not checking or validating against country postal standards. By improving the quality of address data, you are one step closer to creating high quality data that can provide the up-to-the minute accurate reporting your organization needs to succeed.
With the European Medicines Agency (EMA) compliance date for IDMP (Identification of Medicinal Products) looming, Q1 2015 has seen a significant increase in IDMP activity. Both the Informatica and HighPoint Solutions IDMP round table in January and a February Marcus Evans conference in Berlin provided excellent forums for sharing progress, thoughts and strategies. Additional confidential conversations with pharmaceutical companies show an increase in the number of approved and active projects, although some are still seeking full funding. The following paragraphs sum up the activity and trends that I have witnessed in the first three months of the year.
I’ll start with my favourite quote, which is from Dr. Jörg Stüben of Boehringer Ingelheim, who asked:
“Isn’t part of compliance being in control of your data?”
I like it because to me it is just the right balance of stating the obvious, and questioning the way the majority of pharmaceutical companies approach compliance: A report that has to be created and submitted. If a company is in control of their data, regulatory compliance would be easier and come at a lower cost. More importantly, the company itself would benefit from easy access to high quality data.
Dr. Stüben’s question was raised during his excellent presentation at the Marcus Evans conference. Not only did he question the status quo, but proposed an alternate way for IDMP compliance: Let Boehringer benefit from their investment in IDMP compliance. His approach can be summarised as follows:
- Embrace a holistic approach to being in control of data, i.e. adopt data governance practices.
- This is not about just compliance. Include optional attributes that will deliver value to the organisation if correctly managed.
- Get started by creating simple, clear work packages.
Although Dr Stüben did not outline his technical solution, it would include data quality tools and a product data hub.
At the same conference, Stefan Fischer Rivera and Stefan Brügger of Bayer and Guido Claes of Janssen Pharmaceuticals all came out strongly in favour of using a Master Data Management (MDM) approach to achieving compliance. Both companies have MDM technology and processes within their organisations, and realise the value an MDM approach can bring to compliance in terms of data management and governance. Hearing Mr Claes describe how well Informatica’s MDM and Data Quality solutions support his existing substance data management programme made his presentation even more enjoyable for me.
Whilst the exact approaches of Bayer and Janssen differed, there were some common themes:
- Consider both the short term (compliance) and the long term (data governance) in the strategy
- Centralised MDM is ideal, but a federated approach is practical for July 2016
- High quality data should be available to a wide audience outside of IDMP compliance
The first and third bullet points map very closely to Dr. Stüben’s key points, and in fact show a clear trend in 2015:
IDMP Compliance is an opportunity to invest in your data management solutions and processes for the benefit of the entire organisation.
Although the EMA was not represented at the conference, Andrew Marr presented their approach to IDMP, and master data in general. The EMA is undergoing a system re-organisation to focus on managing Substance, Product, Organisation and Reference data centrally, rather than within each regulation or program as it is today. MDM will play a key role in managing this data, setting a high standard of data control and management for regulatory purposes. It appears that the EMA is also using IDMP to introduce better data management practice.
Depending on the size of the company and the skills and tools available, other non-MDM approaches have been presented or discussed during the first part of 2015. These include using XML and SharePoint to manage product data. However, I share a concern voiced by others in the industry about this approach: how well can you manage and control change using these tools? Some pharmaceutical companies have openly stated that data contributors often spend more time looking for data than doing their own jobs. An XML/SharePoint approach will do little to ease this burden, but an MDM approach will.
Despite these other approaches and solutions being discussed, there is another clear trend in Q1 2015:
MDM is becoming a favoured approach for IDMP compliance due to its strong governance, centralised attribute-level data management and ability to track changes.
Interestingly, the opportunity to invest in data management, and the rise of MDM as a favoured approach, has been backed up by research from Gens Associates. Messrs Gens and Brolund found a rapid increase during 2014 in investment in what they term Information Architecture, in which MDM plays a key role. IDMP is seen as a major driver for this investment. They go on to state that investment in master data management programs will allow a much easier and more cost-effective approach to data exchange (internally and externally), resulting in substantial benefits. Unfortunately they do not elaborate on these benefits, but I have placed a summary of the benefits of using MDM for IDMP compliance here.
In terms of active projects, the common compliance activities I have seen in the first quarter of 2015 are as follows:
- Most companies are in the discovery phase: identifying the effort for compliance
- Some are starting to make technology choices, and have submitted RFPs/RFQs
- Those furthest along in technology already have MDM programs or initiatives underway
- Despite getting a start, some are still lacking enough funding for achieving compliance
- Output from the discovery phase will in some cases be used to request full funding
- A significant number of projects have a goal to implement better data management practice throughout the company, with IDMP as the first release.
A final trend I have noticed in 2015 is regarding the magnitude of the compliance task ahead:
Those who have made the most progress are those who are most concerned about achieving compliance on time.
The implication is that the companies who are starting late do not yet realise the magnitude of the task ahead. It is not yet too late to comply and achieve long term benefits through better data management, despite only 15 months before the initial EMA deadline. Informatica has customers who have implemented MDM within 6 months. 15 months is achievable provided the project (or program) gets the focus and resources required.
IDMP compliance is a common challenge to all those in the pharmaceutical industry. Learning from others will help avoid common mistakes and provide tips on important topics. For example, how to secure funding and support from senior management is a common concern among those tasked with compliance. In order to encourage learning and networking, Informatica and HighPoint Solutions will be hosting our third IDMP roundtable in London on May 13th. Please do join us to share your experiences, and learn from the experiences of others.
In our house when we paint a room, my husband does the big rolling of the walls or ceiling, and I do the cut-in work. I am good at prepping the room, taping all the trim and deliberately painting the corners. However, I am thrifty and constantly concerned that we won’t have enough paint to finish a room. My husband isn’t afraid to use enough paint and is extremely efficient at painting a wall in a single even coat. As a result, I don’t do the big rolling and he doesn’t do the cutting in. It took us a while to figure this out, and a few rooms had to be repainted while we did. Now we know what we are good at, and what we need help with.
Payers’ roles are changing. Payers were previously focused on risk assessment, setting and collecting premiums, analyzing claims and making payments – all while optimizing revenues. Payers are pretty good at selling to employers, figuring out the cost/benefit ratio from an employer’s perspective and ensuring a good, profitable product. With the advent of the Affordable Care Act, along with a much more transient insured population, payers now must focus more on the individual insured and be able to communicate with individuals in a more nimble manner than in the past.
Individual members will shop for insurance based on consumer feedback and price. They are interested in ease of enrollment and the ability to submit and substantiate claims quickly and intuitively. Payers are discovering that they need to help manage population health at an individual member level. And population health management requires less of a business-data analytics approach and more social media and gaming-style logic to understand patients. In this way, payers can help develop interventions to sustain behavioral changes for better health.
When designing such analytics, payers should consider the following key design steps:
- Extend data warehouses to an analytics appliance
- Invest in a big data platform to absorb patients’ social data
- Build predictive analytics for patient behavior
- Bridge collaborative and behavioral analytics with claims to build revenue and profitability
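As a deliberately simplified illustration of the third step above, here is a toy Python scoring sketch. The feature names, weights and data are hypothetical; a production model would be trained on historical claims, social and engagement data rather than hand-weighted:

```python
def engagement_score(member: dict) -> float:
    """Toy score: higher means a better candidate for a care intervention."""
    score = 0.0
    score += 0.4 * member.get("er_visits_last_year", 0)        # heavy ER use
    score += 0.3 * (1 if member.get("chronic_condition") else 0)
    score -= 0.2 * member.get("wellness_logins_per_month", 0)  # already engaged
    return score

members = [
    {"id": 1, "er_visits_last_year": 3, "chronic_condition": True,
     "wellness_logins_per_month": 0},
    {"id": 2, "er_visits_last_year": 0, "chronic_condition": False,
     "wellness_logins_per_month": 5},
]

# Rank members so care managers can prioritize outreach.
ranked = sorted(members, key=engagement_score, reverse=True)
print([m["id"] for m in ranked])
# [1, 2]
```

The interesting work, of course, is in bridging these behavioral signals with claims data so that interventions can be tied back to revenue and profitability.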
Due to payers’ mature predictive analytics competencies, they will have a much easier time with the next generation of population behavior analytics than their provider counterparts. Because clinical content is often unstructured compared to claims data, payers need to pay extra attention to context and semantics when deciphering clinical content submitted by providers. Payers can get help from vendors that specialize in making sense of unstructured data about individual members. They can then use that data to create powerful predictive analytic solutions.
Healthcare and data have the makings of an epic love affair, but like most relationships, it’s not all roses. Data is playing a powerful role in finding a cure for cancer, informing cost reduction, targeting preventative treatments and engaging healthcare consumers in their own care. The downside? Data is needy. It requires investments in connectedness, cleanliness and safety to maximize its potential.
- Data is ubiquitous…connect it.
4,400 times the amount of information held at the Library of Congress – that’s how much data Kaiser Permanente alone has generated from its electronic medical record. Kaiser successfully makes every piece of information about each patient available to clinicians, including patient health history, diagnoses by other providers, lab results and prescriptions. As a result, Kaiser has seen marked improvements in outcomes: a 26% reduction in office visits per member and a 57% reduction in medication errors.
Ongoing value, however, requires continuous investment in data. Investments in data integration and data quality ensure that information from the EMR is integrated with other sources (think claims, social, billing, supply chain) so that clinicians and decision makers have access in the format they need. Without this, self-service intelligence can be inhibited by duplicate data, poor quality data or application silos.
- Data is popular…ensure it is clean.
Healthcare leaders can finally rely on electronic data to make strategic decisions. Here is a CHRISTUS Health anecdote you might relate to: in a weekly meeting, each executive reviews their strategic dashboard; these dashboards drive strategic decision making about CPOE (computerized physician order entry) adoption, emergency room wait times and price per procedure. Powered by enterprise information management, these dashboards paint a reliable and consistent view across the system’s 60 hospitals. Before the implementation of an enterprise data platform, each executive relied on their own set of data.
In the pre-data investment era, seemingly common data elements from different sources did not mean the same thing. For example, “Admit Date” in one report reflected the emergency department admission date whereas “Admit Date” in another report referred to the inpatient admission date.
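One lightweight way to prevent that kind of ambiguity is to map each source system’s field names to unambiguous canonical data elements before the data reaches a dashboard. A minimal sketch, with hypothetical source and field names:

```python
# Map (source system, source field) pairs to unambiguous canonical names,
# so "Admit Date" from the ED feed and the inpatient feed stay distinct.
CANONICAL_MAP = {
    ("ed_system", "Admit Date"): "ed_admission_date",
    ("inpatient_system", "Admit Date"): "inpatient_admission_date",
}

def canonicalize(source: str, record: dict) -> dict:
    """Rename source-specific fields; pass unknown fields through unchanged."""
    return {CANONICAL_MAP.get((source, k), k): v for k, v in record.items()}

print(canonicalize("ed_system", {"Admit Date": "2015-02-17", "MRN": "12345"}))
# {'ed_admission_date': '2015-02-17', 'MRN': '12345'}
```

In an enterprise data platform this mapping lives in the metadata layer, so every report downstream agrees on what each element means.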
- Sharing data is necessary…make it safe.
To cure cancer, reduce costs and engage patients, care providers need access to data – and not just the data they generate; it has to be shared for coordination of care through transitions of care and across settings, i.e. home care, long-term care and behavioral health. Fortunately, consumers and clinicians agree on this: PwC reports that 56% of consumers and 30% of physicians are comfortable with data sharing for care coordination. Further progress is demonstrated by healthcare organizations willingly adopting cloud-based applications – as of 2013, 40% of healthcare organizations were already storing protected health information (PHI) in the cloud.
Increased data access, however, carries the risk of leaving health data exposed. The threat of data breach or hacking is multiplied by the presence (in many cases necessary) of PHI on employee laptops and the fact that providers are being granted increased access to PHI. The Ponemon Institute, a security research firm, estimates that data breaches cost the industry $5.6 billion each year. Investments in data-centric security are necessary to assuage fear, protect personal health data and make secure data sharing a reality.
Early improvements in patient outcomes indicate that the relationship between data and healthcare is a valuable investment. The International Institute of Analytics supports this, reporting that although analytics and data maturity across healthcare lags other industries, the opportunity to positively impact clinical and operational outcomes is significant.
The signs that healthcare is becoming a more consumer (think patients, members, providers) driven industry are evident all around us. I see provider and payer organizations clamoring for more data, specifically data that is actionable, relatable and has integrity. Armed with this data, healthcare organizations are able to differentiate around a member/patient-centric view.
These consumer-centric views convey the total value of the relationships healthcare organizations have with consumers. Understanding that total value creates a more comprehensive understanding of consumers, because these views deliver a complete picture of an individual’s critical relationships, including patient to primary care provider, member to household, provider to network and even member to legacy plan. This is the type of knowledge that informs new treatments, targets preventative care programs and improves outcomes.
Payer organizations are collecting and analyzing data to identify opportunities for more informed care management and segmentation to reach new, high-value customers in individual markets. By segmenting and targeting messaging to specific populations, health plans generate increased member satisfaction and cost-effectively expand and manage provider networks.
How will they accomplish this? Enabling members to interact in health and wellness forums, analyzing member behavior and trends, and informing care management programs with a 360-degree view of members, to name a few. Payers will also drive new member programs, member retention and member engagement marketing and sales programs by investigating complete views of member households and market segments.
In the provider space, this relationship building can be a little more challenging because often consumers as patients do not interact with their doctor unless they are sick, creating gaps in data. When provider organizations have a better understanding of their patients and providers, they can increase patient satisfaction and proactively offer preventative care to the sickest (and most likely to engage) of patients before an episode occurs. These activities result in increased market share and improved outcomes.
Where can providers start? By creating a 360 view of the patient, organizations can now improve care coordination, open new patient service centers and develop patient engagement programs.
Analyzing populations of patients, and fostering patient engagement based on Meaningful Use requirements or Accountable Care requirements, building out referral networks and developing physician relationships are essential ingredients in consumer engagement. Knowing your patients and providing a better patient experience than your competition will differentiate provider organizations.
You may say, “This all sounds great, but how does it work?” An essential ingredient is clean, safe and connected data. Clean, safe and connected data requires an investment in data as an asset: just as you invest in real estate and human capital, you must invest in the accessibility and quality of your data. To be successful, arm your team with tools to govern data – tools that ensure ongoing integrity and quality, remove duplicate records and dynamically incorporate data validation/quality rules. These tools include master data management, data quality and metadata management, and are focused on information quality. Tools focused on information quality support a total customer relationship view of members, patients and providers.
I have two kids. In school. They generate a remarkable amount of paper. From math worksheets, permission slips, book reports (now called reading responses) to newsletters from the school. That’s a lot of paper. All of it is presented in different forms with different results – the math worksheets tell me how my child is doing in math, the permission slips tell me when my kids will be leaving school property and the book reports tell me what kind of books my child is interested in reading. I need to put the math worksheet information into a storage space so I can figure out how to prop up my kid if needed on the basic geometry constructs. The dates that permission slips are covering need to go into the calendar. The book reports can be used at the library to choose the next book.
We are facing a similar problem (albeit on a MUCH larger scale) in the insurance market. We are getting data from clinicians. Many of you are developing and deploying mobile applications to help patients manage their care, locate providers and improve their health. You may capture licensing data to assist pharmaceutical companies identify patients for inclusion in clinical trials. You have advanced analytics systems for fraud detection and to check the accuracy and consistency of claims. Possibly you are at the point of near real-time claim authorization.
The amount of data generated in our world is expected to increase significantly in the coming years. There are an estimated 500 petabytes of data in the healthcare realm, predicted to grow by a factor of 50 to 25,000 petabytes by 2020. Healthcare payers already store and analyze some of this data. However, in order to capture, integrate and interrogate large information sets, the scope of payer information will have to increase significantly to include provider data, social data, government data, data from pharmaceutical and medical product manufacturers, and data from information aggregators.
Right now – you probably depend on a traditional data warehouse model and structured data analytics to access some of your data. This has worked adequately for you up to now, but with the amount of data that will be generated in the future, you need the processing capability to load and query multi-terabyte datasets in a timely fashion. You need the ability to manage both semi-structured and unstructured data.
Fortunately, a set of emerging technologies (called “Big Data”) may provide the technical foundation of a solution. Big Data usually involves data sets with sizes beyond the ability of commonly used software tools to capture, curate, manage and process within a tolerable amount of time. While some existing technology may prove inadequate to future tasks, many of the information management methods of the past will prove to be as valuable as ever. Assembling successful Big Data solutions will require a fusion of new technology and old-school disciplines.
Which of these technologies do you have? Which of these technologies can integrate with on-premise AND cloud based solutions? On which of these technologies does your organization have knowledgeable resources that can utilize the capabilities to take advantage of Big Data?
This week, another reputable organization, Anthem, Inc., reported it was ‘the target of a very sophisticated external cyber attack’. But rather than be upset at Anthem, I respect their responsible data breach reporting.
In this post, Joseph R. Swedish, President and CEO of Anthem, Inc., does something that I believe all CEOs should do in this situation. He is straight up about what happened, what information was breached, the actions they took to plug the security hole, and the services available to those impacted.
When it comes to a data breach, the worst thing you can do is ignore it or hope it will go away. This was not the case with Anthem. Mr Swedish did the right thing and I appreciate it.
You only have one corporate reputation – and it is typically aligned with the CEO’s reputation. When the CEO talks about the details of a data breach and empathizes with those impacted, he establishes a dialogue based on transparency and accountability.
Research tells us that 44% of healthcare and pharmaceutical organizations experienced a breach in 2014. And we know that personal information, when combined with health information, is worth more on the black market because the data can be used for insurance fraud. I expect more healthcare providers will be on the defensive this year, and I only hope that they follow Mr Swedish’s example when facing the music.
Patient experience is key to growth and success for all health delivery organizations. Gartner has stated that the patient experience needs to be one of the highest priorities for organizations. The quality of your data is critical to achieving that goal. My recent experience with my physician’s office demonstrates how easy it is for the quality of data to influence the patient experience and undermine a patient’s trust in their physician and the organization with which they are interacting.
I have a great relationship with my doctor and have always been impressed by the efficiency of the office. I never wait beyond my appointment time, the care is excellent and the staff is friendly and professional. There is an online tool that allows me to see my records, send messages to my doctor, request an appointment and get test results. The organization enjoys the highest reputation for clinical quality. Pretty much perfect from my perspective – until now.
I needed to change a scheduled appointment due to a business conflict. Since I expected some negotiation, I decided to make the phone call rather than request it online… there are still transactions for which human-to-human is optimal! I had all my information at hand and made the call. The phone was pleasantly answered and the request given. The receptionist asked for my name and date of birth, but then stated that I did not have a future appointment. I was looking at the online tool, which clearly stated that I was scheduled for February 17 at 8:30 AM. The pleasant young woman confirmed my name, date of birth and address, and then told me again that I did not have an appointment scheduled. I am reasonably savvy about these things and figured out the core problem: my last name is hyphenated. Armed with that information, my other record was found and a new appointment scheduled. The transaction was completed.
But now I am worried. My name has been like this for many years and none of my other key data has changed. Are there parts of my clinical history missing from the record that my doctor is using? Will that have a negative impact on the quality of my care? If I were unable to respond clearly, might that older record be accessed, with my current medications and history unavailable? The receptionist did not address the duplicate issue by telling me that she would attend to merging the records, so I have no reason to believe that she will. My confidence is now shaken and I am less trustful of the system and how well it will serve me going forward. I have resolved my issue, but not everyone would be able to push back to ensure that their records are accurate.
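The hyphenated-surname scenario above is a classic record-linkage problem. Here is a toy sketch of the kind of matching rule an MDM or data quality tool might apply – exact match on date of birth plus a fuzzy match on surname. The field names and threshold are illustrative; real matching engines use far richer, tunable rules:

```python
from difflib import SequenceMatcher

def likely_same_patient(rec_a: dict, rec_b: dict, threshold: float = 0.8) -> bool:
    """Flag probable duplicates: same date of birth plus similar surnames,
    which catches cases like 'Smith' vs the hyphenated 'Smith-Jones'."""
    if rec_a["dob"] != rec_b["dob"]:
        return False
    a, b = rec_a["last_name"].upper(), rec_b["last_name"].upper()
    # A hyphenated surname often contains the unhyphenated one outright.
    if a in b or b in a:
        return True
    return SequenceMatcher(None, a, b).ratio() >= threshold

print(likely_same_patient(
    {"last_name": "Smith", "dob": "1970-01-01"},
    {"last_name": "Smith-Jones", "dob": "1970-01-01"},
))
# True
```

Pairs flagged this way still need review (or a survivorship rule) before being merged into a single golden record – merging the wrong records is worse than leaving duplicates.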
Many millions of dollars are being spent on electronic health records. Many more millions are being spent to redesign workflows to accommodate the new EHRs. Physicians and other clinicians are learning new ways to access data and treat their patients. The foundation for all of this is accurate data. Nicely displayed but inaccurate data will not result in improved care or an enhanced member experience. As healthcare organizations move forward with the razzle-dazzle of new systems, they need to remember the basics of good quality data and ensure that it is available to these new applications.
In a previous life, I was a pastry chef in a now-defunct restaurant. One of the things I noticed while working there (and frankly while cooking at home) is that the better the ingredients, the better the final result. If we used poor quality apples in the apple tart, we ended up with a soupy, flavorless mess with a chewy crust.
The same analogy can be applied to data analytics. With poor quality data, you get poor results from your analytics projects. The companies that can implement analytic solutions providing near real-time access to consumer trends are the same companies that can run successful, up-to-the-minute targeted marketing campaigns. The Data Warehousing Institute estimates that data quality problems cost U.S. businesses more than $600 billion a year.
The business impact of poor data quality cannot be overstated. If not identified and corrected early on, defective data can contaminate all downstream systems and information assets, jacking up costs, jeopardizing customer relationships, and causing imprecise forecasts and poor decisions.
- To help you quantify: let’s say your company receives 2 million claims per month with 377 data elements per claim. Even at an error rate of 0.001, the claims data contains more than 754,000 errors per month and more than 9.04 million errors per year! If you determine that 10 percent of the data elements are critical to your business decisions and processes, you still must fix almost 1 million errors each year!
- What is your exposure to these errors? Let’s estimate the risk at $10 per error (including staff time required to fix the error downstream after a customer discovers it, the loss of customer trust and loyalty, and erroneous payouts). Your company’s risk exposure to poor quality claims data is roughly $10 million a year.
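The arithmetic behind those two bullets, laid out step by step:

```python
claims_per_month = 2_000_000
elements_per_claim = 377
error_rate = 0.001        # one bad value per thousand elements
critical_share = 0.10     # 10% of elements drive decisions
cost_per_error = 10       # dollars: rework, lost trust, erroneous payouts

errors_per_month = claims_per_month * elements_per_claim * error_rate
errors_per_year = errors_per_month * 12
critical_errors_per_year = errors_per_year * critical_share
exposure = critical_errors_per_year * cost_per_error

print(f"{errors_per_month:,.0f} errors per month")            # 754,000
print(f"{errors_per_year:,.0f} errors per year")              # 9,048,000
print(f"{critical_errors_per_year:,.0f} critical per year")   # 904,800
print(f"${exposure:,.0f} annual exposure")                    # $9,048,000
```

The exact figure comes to about $9 million, which rounds to the $10 million cited above – and that is before counting errors in the non-critical elements.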
Once your company values quality data as a critical resource – it is much easier to perform high-value analytics that have an impact on your bottom line. Start with creation of a Data Quality program. Data is a critical asset in the information economy, and the quality of a company’s data is a good predictor of its future success.
Every year, I get a replacement desk calendar to help keep all of our activities straight – and for a family of four, that is no easy task. I start by taking all of the little appointment cards the dentist, orthodontist, pediatrician and GP give us for appointments that occur beyond the current calendar dates. I transcribe them all. Then I go through last year’s calendar to transfer any information that is relevant to this year’s calendar. And finally, I put the calendar down in the basement next to previous years’ calendars so I can refer back to them if I need to. Last year’s calendar contains a lot of useful information, but it can no longer solve my need to organize schedules for this year.
In a very loose way – this is very similar to application retirement. Many larger health plans have existing systems that were created several years (sometimes even several decades) ago. These legacy systems have been customized to reflect the health plan’s very specific business processes. They may be hosted on costly hardware, developed in antiquated software languages and rely on a few developers that are very close to retirement. The cost of supporting these (most likely) antiquated systems can be diverting valuable dollars away from innovation.
The process that I use to move appointment and contact data from one calendar to the next works for me – but is relatively small in scale. Imagine if I was trying to do this for an entire organization without losing context, detail or accuracy!
There are several methodologies for determining the best strategy for your organization to approach software modernization, including:
- Architecture Driven Modernization (ADM) is the initiative to standardize views of the existing systems in order to enable common modernization activities like code analysis and comprehension, and software transformation.
- SABA (Bennett et al., 1999) is a high-level framework for planning the evolution and migration of legacy systems, taking into account both organizational and technical issues.
- SRRT (Economic Model of Software Rewriting and Replacement Times), from Chan et al. (1996), is a formal model for determining optimal software rewrite and replacement timings based on versatile metrics data.
- And if all else fails: Model Driven Engineering (MDE) is being investigated as an approach for reverse engineering and then forward engineering software code.
My calendar migration process evolved over time; your method for software modernization should be well planned prior to the go-live date for the new software system.