Category Archives: Healthcare

Introduction to the Healthcare Analytics Adoption Model

I have a teenaged daughter. She is interested in her appearance, but she is not a risk taker. As a result, she waits until a fashion trend has been established by her peers before adopting the new look. This works out well for me because I’m not spending on the bleeding edge of fashion, chasing a trend that may fizzle out before it is more widely adopted. It works out well for my daughter because she is able to blend in with her peers. She has developed her own adoption model for fashion.

Healthcare analytics is similar to fashion in this respect. It can be confusing and even overwhelming without a systematic framework to guide an approach or priorities. Fortunately – a group of very smart people have been working on this and have created the Healthcare Analytics Adoption Model. The Healthcare Analytics Adoption Model is a framework for measuring the adoption and meaningful use of data warehouses and analytics in healthcare – similar to the HIMSS Analytics EMRAM model. It should be considered a guide for classifying groups of analytics capabilities and a methodology for healthcare organizations to adopt analytics.

The Healthcare Analytics Adoption Model proposes that there are three phases of data analysis:

  1. Data collection – systems designed specifically to support transaction-based workflows and data collection. The adoption of the electronic medical record (EMR) is a great example of this phase.
  2. Data sharing – sharing data among the members of the workflow team, similar to the capabilities of a health information exchange.
  3. Data analytics – organizations realize that they can start to analyze this collected and shared data by investigating patterns in the aggregated data.

Healthcare Analytics Adoption Model

Once your organization has moved into Phase Three, you are ready to start work on the Healthcare Analytics Adoption Model itself, which comprises eight levels.

Each level of adoption includes progressive expansion of analytic capabilities in four dimensions:

  1. New data sources – data content expands as new sources of data are added to your organization.
  2. Complexity – analytic algorithms and data binding become progressively more complex.
  3. Data literacy – data literacy increases among employees, leading to an increasing ability to exploit data as an asset.
  4. Data timeliness – the timeliness of data content increases (data latency decreases), leading to a reduction in decision cycles and mean time to improvement.

We’ll spend some time in future weeks talking through the different levels. In the meantime – where do you see your organization?


Payers – What They Are Good At, And What They Need Help With


In our house when we paint a room, my husband does the big rolling of the walls or ceiling, and I do the cut-in work. I am good at prepping the room, taping all the trim and deliberately painting the corners. However, I am thrifty and constantly concerned that we won’t have enough paint to finish a room. My husband isn’t afraid to use enough paint and is extremely efficient at painting a wall in a single even coat. As a result, I don’t do the big rolling and he doesn’t do the cutting in. It took us a while to figure this out, and a few rooms had to be repainted while we were figuring it out. Now we know what we are good at, and what we need help with.

Payers’ roles are changing. Payers were previously focused on risk assessment, setting and collecting premiums, analyzing claims and making payments – all while optimizing revenues. Payers are pretty good at selling to employers, figuring out the cost/benefit ratio from an employer’s perspective and ensuring a good, profitable product. With the advent of the Affordable Care Act, along with a much more transient insured population, payers now must focus more on the individual insured and be able to communicate with individuals in a more nimble manner than in the past.

Individual members will shop for insurance based on consumer feedback and price. They are interested in ease of enrollment and the ability to submit and substantiate claims quickly and intuitively. Payers are discovering that they need to help manage population health at an individual member level. And population health management requires less of a business-data analytics approach and more of a social media and gaming-style logic to understand patients. In this way, payers can help develop interventions that sustain behavioral changes for better health.

When designing such analytics, payers should consider several key design steps.

Due to payers’ mature predictive analytics competencies, they will have a much easier time with this next generation of population behavior analytics than their provider counterparts. Because clinical content is often unstructured compared to claims data, payers need to pay extra attention to context and semantics when deciphering clinical content submitted by providers. Payers can use help from vendors that can help them understand unstructured data and individual members. They can then use that data to create fantastic predictive analytic solutions.


Healthcare’s Love-Hate Relationship with Data


Healthcare and data have the makings of an epic love affair, but like most relationships, it’s not all roses. Data is playing a powerful role in finding a cure for cancer, informing cost reduction, targeting preventative treatments and engaging healthcare consumers in their own care. The downside? Data is needy. It requires investments in connectedness, cleanliness and safety to maximize its potential.

  • Data is ubiquitous…connect it.

4400 times the amount of information held at the Library of Congress – that’s how much data Kaiser Permanente alone has generated from its electronic medical record. Kaiser successfully makes every piece of information about each patient available to clinicians, including patient health history, diagnosis by other providers, lab results and prescriptions. As a result, Kaiser has seen marked improvements in outcomes: 26% reduction in office visits per member and a 57% reduction in medication errors.

Ongoing value, however, requires continuous investment in data. Investments in data integration and data quality ensure that information from the EMR is integrated with other sources (think claims, social, billing, supply chain) so that clinicians and decision makers have access in the format they need. Without this, self-service intelligence can be inhibited by duplicate data, poor quality data or application silos.

  • Data is popular…ensure it is clean.

Healthcare leaders can finally rely on electronic data to make strategic decisions. Here is a CHRISTUS Health anecdote you might relate to: in a weekly meeting, each executive reviews their strategic dashboard; these dashboards drive strategic decision making about CPOE (computerized physician order entry) adoption, emergency room wait times and price per procedure. Powered by enterprise information management, these dashboards paint a reliable and consistent view across the system’s 60 hospitals. Prior to the implementation of an enterprise data platform, each executive relied on their own set of data.

In the pre-data investment era, seemingly common data elements from different sources did not mean the same thing. For example, “Admit Date” in one report reflected the emergency department admission date whereas “Admit Date” in another report referred to the inpatient admission date.
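
A minimal sketch of how integration code can resolve that kind of ambiguity, mapping each source’s “admit date” onto its own explicitly named canonical field. The source feeds and field names are hypothetical illustrations, not any particular vendor’s schema.

```python
# A minimal sketch of disambiguating "Admit Date" during integration.
# Source feeds and field names are hypothetical illustrations.

from datetime import date

def normalize_ed_record(rec: dict) -> dict:
    """Map an emergency-department feed onto an explicit canonical field."""
    return {
        "patient_id": rec["patient_id"],
        "ed_admit_date": date.fromisoformat(rec["admit_date"]),  # ED arrival
    }

def normalize_inpatient_record(rec: dict) -> dict:
    """Map an inpatient ADT feed onto a different canonical field."""
    return {
        "patient_id": rec["patient_id"],
        "inpatient_admit_date": date.fromisoformat(rec["admit_date"]),  # bed admission
    }

# The same source label ("admit_date") lands in two clearly named columns,
# so a downstream report can no longer mix the two meanings.
ed = normalize_ed_record({"patient_id": "P001", "admit_date": "2015-02-17"})
ip = normalize_inpatient_record({"patient_id": "P001", "admit_date": "2015-02-18"})
print(ed, ip)
```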

  •  Sharing data is necessary…make it safe.

To cure cancer, reduce costs and engage patients, care providers need access to data – and not just the data they generate; it has to be shared for coordination of care through transitions of care and across settings, i.e. home care, long-term care and behavioral health. Fortunately, consumers and clinicians agree on this: PwC reports that 56% of consumers and 30% of physicians are comfortable with data sharing for care coordination. Further progress is demonstrated by healthcare organizations willingly adopting cloud-based applications – as of 2013, 40% of healthcare organizations were already storing protected health information (PHI) in the cloud.

Increased data access, however, carries the risk of leaving health data exposed. The threat of data breach or hacking is multiplied by the (in many cases necessary) presence of PHI on employee laptops and by the fact that providers are given increased access to PHI. The Ponemon Institute, a security research firm, estimates that data breaches cost the industry $5.6 billion each year. Investments in data-centric security are necessary to assuage fear, protect personal health data and make secure data sharing a reality.

Early improvements in patient outcomes indicate that the relationship between data and healthcare is a valuable investment. The International Institute of Analytics supports this, reporting that although analytics and data maturity across healthcare lags other industries, the opportunity to positively impact clinical and operational outcomes is significant.


Healthcare Consumer Engagement

The signs that healthcare is becoming a more consumer (think patients, members, providers) driven industry are evident all around us. I see provider and payer organizations clamoring for more data, specifically data that is actionable, relatable and has integrity. Armed with this data, healthcare organizations are able to differentiate around a member/patient-centric view.

These consumer-centric views convey the total value of the relationships healthcare organizations have with consumers. Understanding that total value creates a more comprehensive understanding of consumers because these views deliver a complete picture of an individual’s critical relationships, including: patient to primary care provider, member to household, provider to network and even member to legacy plan. This is the type of knowledge that informs new treatments, targets preventative care programs and improves outcomes.

Payer organizations are collecting and analyzing data to identify opportunities for more informed care management and segmentation to reach new, high-value customers in individual markets. By segmenting and targeting messaging to specific populations, health plans generate increased member satisfaction and cost-effectively expand and manage provider networks.

How will they accomplish this? Enabling members to interact in health and wellness forums, analyzing member behavior and trends, and informing care management programs with a 360 view of members… to name a few. Payers will also drive new member programs, member retention and member engagement marketing and sales programs by investigating complete views of member households and market segments.

In the provider space, this relationship building can be a little more challenging because often consumers as patients do not interact with their doctor unless they are sick, creating gaps in data. When provider organizations have a better understanding of their patients and providers, they can increase patient satisfaction and proactively offer preventative care to the sickest (and most likely to engage) of patients before an episode occurs. These activities result in increased market share and improved outcomes.

Where can providers start? By creating a 360 view of the patient, organizations can now improve care coordination, open new patient service centers and develop patient engagement programs.

Analyzing populations of patients, and fostering patient engagement based on Meaningful Use requirements or Accountable Care requirements, building out referral networks and developing physician relationships are essential ingredients in consumer engagement. Knowing your patients and providing a better patient experience than your competition will differentiate provider organizations.

You may say “This all sounds great, but how does it work?” An essential ingredient is clean, safe and connected data. Clean, safe and connected data requires an investment in data as an asset… just as you invest in real estate and human capital, you must invest in the accessibility and quality of your data. To be successful, arm your team with tools to govern data – ensuring the ongoing integrity and quality of data, removing duplicate records and dynamically incorporating data validation/quality rules. These tools, which include master data management, data quality and metadata management, are focused on information quality and support a total customer relationship view of members, patients and providers.
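
To make that concrete, here is a minimal sketch of the kind of validation rule a data quality tool applies as records flow in. The member fields and the rules themselves are hypothetical illustrations, not a specific product’s API.

```python
# A minimal sketch of declarative data validation rules.
# Field names and rules are hypothetical illustrations.

import re

RULES = {
    "member_id": lambda v: bool(v),                                       # required
    "dob":       lambda v: bool(re.fullmatch(r"\d{4}-\d{2}-\d{2}", v or "")),
    "zip":       lambda v: bool(re.fullmatch(r"\d{5}(-\d{4})?", v or "")),
    "email":     lambda v: v is None or "@" in v,                         # optional
}

def validate(record: dict) -> list:
    """Return the names of the fields that fail their quality rule."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

member = {"member_id": "M123", "dob": "1970-02-30x", "zip": "0421", "email": None}
print(validate(member))   # ['dob', 'zip'] -- flagged for stewardship review
```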


Managing a Vast Amount of Data Successfully

I have two kids. In school. They generate a remarkable amount of paper. From math worksheets, permission slips, book reports (now called reading responses) to newsletters from the school. That’s a lot of paper. All of it is presented in different forms with different results – the math worksheets tell me how my child is doing in math, the permission slips tell me when my kids will be leaving school property and the book reports tell me what kind of books my child is interested in reading. I need to put the math worksheet information into a storage space so I can figure out how to prop up my kid if needed on the basic geometry constructs. The dates that permission slips are covering need to go into the calendar. The book reports can be used at the library to choose the next book.

We are facing a similar problem (albeit on a MUCH larger scale) in the insurance market. We are getting data from clinicians. Many of you are developing and deploying mobile applications to help patients manage their care, locate providers and improve their health. You may capture licensing data to help pharmaceutical companies identify patients for inclusion in clinical trials. You have advanced analytics systems for fraud detection and to check the accuracy and consistency of claims. Possibly you are at the point of near real-time claim authorization.

The amount of data generated in our world is expected to increase significantly in the coming years. There are an estimated 500 petabytes of data in the healthcare realm, which is predicted to grow by a factor of 50 to 25,000 petabytes by 2020. Healthcare payers already store and analyze some of this data. However, in order to capture, integrate and interrogate large information sets, the scope of payer information will have to increase significantly to include provider data, social data, government data, pharmaceutical and medical product manufacturers’ data, and information aggregator data.
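
For a sense of scale, the arithmetic behind that projection is easy to check. The petabyte figures come from the paragraph above; the roughly seven-year span is my assumption for illustration.

```python
# Quick arithmetic behind the growth projection above.
# The ~7-year span (roughly 2013 to 2020) is an assumption for illustration.

current_pb, projected_pb, years = 500, 25_000, 7

growth_factor = projected_pb / current_pb          # 50x overall
annual_rate = growth_factor ** (1 / years) - 1     # compound annual growth

print(f"growth factor: {growth_factor:.0f}x")      # 50x
print(f"implied annual growth: {annual_rate:.0%}") # roughly 75% per year
```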

Right now – you probably depend on a traditional data warehouse model and structured data analytics to access some of your data. This has worked adequately for you up to now, but with the amount of data that will be generated in the future, you need the processing capability to load and query multi-terabyte datasets in a timely fashion. You need the ability to manage both semi-structured and unstructured data.

Fortunately, a set of emerging technologies (called “Big Data”) may provide the technical foundation of a solution. Big Data usually involves data sets with sizes beyond the ability of commonly used software tools to capture, curate, manage and process within a tolerable amount of time. While some existing technology may prove inadequate to future tasks, many of the information management methods of the past will prove to be as valuable as ever. Assembling successful Big Data solutions will require a fusion of new technology and old-school disciplines.

Which of these technologies do you have? Which of these technologies can integrate with on-premise AND cloud based solutions? On which of these technologies does your organization have knowledgeable resources that can utilize the capabilities to take advantage of Big Data?


Responsible Data Breach Reporting

This week, another reputable organization, Anthem Inc., reported it was ‘the target of a very sophisticated external cyber attack’. But rather than be upset at Anthem, I respect their responsible data breach reporting.

In this post, Joseph R. Swedish, President and CEO of Anthem, Inc., does something that I believe all CEOs should do in this situation. He is straight up about what happened, what information was breached, the actions they took to plug the security hole, and the services available to those impacted.

When it comes to a data breach, the worst thing you can do is ignore it or hope it will go away. This was not the case with Anthem.  Mr Swedish did the right thing and I appreciate it.

You only have one corporate reputation – and it is typically aligned with the CEO’s reputation.  When the CEO talks about the details of a data breach and empathizes with those impacted, he establishes a dialogue based on transparency and accountability.

Research tells us that 44% of healthcare and pharmaceutical organizations experienced a breach in 2014. And we know that personal information, when combined with health information, is worth more on the black market because the data can be used for insurance fraud. I expect more healthcare providers will be on the defensive this year and only hope that they follow Mr. Swedish’s example when facing the music.


Patient Experience-The Quality of Your Data is Important!


Patient Experience-Viable Data is Important!

Patient experience is key to growth and success for all health delivery organizations.  Gartner has stated that the patient experience needs to be one of the highest priorities for organizations. The quality of your data is critical to achieving that goal.  My recent experience with my physician’s office demonstrates how easy it is for the quality of data to influence the patient experience and undermine a patient’s trust in their physician and the organization with which they are interacting.

I have a great relationship with my doctor and have always been impressed by the efficiency of the office.  I never wait beyond my appointment time, the care is excellent and the staff is friendly and professional.  There is an online tool that allows me to see my records, send messages to my doctor, request an appointment and get test results. The organization enjoys the highest reputation for clinical quality.  Pretty much perfect from my perspective – until now.

I needed to change a scheduled appointment due to a business conflict. Since I expected some negotiation, I decided to make the phone call rather than request it online… there are still transactions for which human-to-human is optimal! I had all my information at hand and made the call. The phone was pleasantly answered and the request given. The receptionist requested my name and date of birth, but then stated that I did not have a future appointment. I am looking at the online tool, which clearly states that I am scheduled for February 17 at 8:30 AM. The pleasant young woman confirms my name, date of birth and address and then tells me that I do not have an appointment scheduled. I am reasonably savvy about these things and figured out the core problem, which is that my last name is hyphenated. Armed with that information, my other record is found and a new appointment scheduled. The transaction is completed.

But now I am worried. My name has been like this for many years and none of my other key data has changed. Are there parts of my clinical history missing from the record that my doctor is using? Will that have a negative impact on the quality of my care? If I were unable to clearly respond, might that older record be accessed, without my current medications and history available? The receptionist did not address the duplicate issue clearly by telling me that she would attend to merging the records, so I have no reason to believe that she will. My confidence is now shaken and I am less trustful of the system and how well it will serve me going forward. I have resolved my issue, but not everyone would be able to push back to ensure that their records are now accurate.
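
For the technically inclined: a match rule as simple as the sketch below, comparing a normalized surname plus date of birth, would have flagged my two records as a likely duplicate. The records and normalization choices are hypothetical illustrations, not my physician’s actual system.

```python
# A minimal sketch of a duplicate-detection rule that tolerates hyphenated surnames.
# Records and normalization choices are hypothetical illustrations.

def normalize_surname(name: str) -> str:
    """Lowercase and drop hyphens/spaces so 'Smith-Jones' matches 'Smith Jones'."""
    return "".join(ch for ch in name.lower() if ch.isalpha())

def likely_same_patient(a: dict, b: dict) -> bool:
    return (normalize_surname(a["last_name"]) == normalize_surname(b["last_name"])
            and a["dob"] == b["dob"])

rec_scheduling = {"last_name": "Smith-Jones", "dob": "1970-05-01"}
rec_portal     = {"last_name": "Smith Jones", "dob": "1970-05-01"}

print(likely_same_patient(rec_scheduling, rec_portal))  # True -- candidate for a merge review
```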

Many millions of dollars are being spent on electronic health records. Many more millions are being spent to redesign workflow to accommodate the new EHRs. Physicians and other clinicians are learning new ways to access data and treat their patients. The foundation for all of this is accurate data. Nicely displayed but inaccurate data will not result in improved care or an enhanced member experience. As healthcare organizations move forward with the razzle-dazzle of new systems, they need to remember the basics of good quality data and ensure that it is available to these new applications.


The Quality of the Ingredients Makes the Dish – It Applies to Data Quality


Data Quality Leads to Other Integrated Benefits

In a previous life, I was a pastry chef in a now-defunct restaurant. One of the things I noticed while working there (and frankly while cooking at home) is that the better the ingredients, the better the final result. If we used poor quality apples in the apple tart, we ended up with a soupy, flavorless mess with a chewy crust.

The same analogy can be applied to data analytics. With poor quality data, you get poor results from your analytics projects. We all know that the companies that can implement fantastic analytic solutions providing near real-time access to consumer trends are the same companies that can run successful, up-to-the-minute targeted marketing campaigns. The Data Warehousing Institute estimates that data quality problems cost U.S. businesses more than $600 billion a year.

The business impact of poor data quality cannot be overstated. If not identified and corrected early on, defective data can contaminate all downstream systems and information assets, jacking up costs, jeopardizing customer relationships, and causing imprecise forecasts and poor decisions.

  • To help you quantify: let’s say your company receives 2 million claims per month with 377 data elements per claim. Even at an error rate of .001, the claims data contains more than 754,000 errors per month and more than 9 million errors per year! If you determine that 10 percent of the data elements are critical to your business decisions and processes, you still must fix almost 1 million errors each year!
  • What is your exposure to these errors? Let’s estimate the risk at $10 per error (including staff time required to fix the error downstream after a customer discovers it, the loss of customer trust and loyalty, and erroneous payouts). Your company’s risk exposure to poor quality claims data is roughly $10 million a year. The quick calculation below walks through the math.
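
Here is the same arithmetic worked through in a short script, using only the figures from the bullets above.

```python
# The arithmetic behind the bullets above, using the same assumptions.

claims_per_month   = 2_000_000
elements_per_claim = 377
error_rate         = 0.001
critical_fraction  = 0.10
cost_per_error     = 10        # dollars per error, rough estimate

errors_per_month = claims_per_month * elements_per_claim * error_rate
errors_per_year  = errors_per_month * 12
critical_errors  = errors_per_year * critical_fraction
exposure         = critical_errors * cost_per_error

print(f"{errors_per_month:,.0f} errors per month")     # 754,000
print(f"{errors_per_year:,.0f} errors per year")       # 9,048,000
print(f"{critical_errors:,.0f} critical errors/year")  # ~905,000 (almost 1 million)
print(f"${exposure:,.0f} annual exposure")             # ~$9 million, i.e. roughly $10M rounded
```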

Once your company values quality data as a critical resource – it is much easier to perform high-value analytics that have an impact on your bottom line. Start with creation of a Data Quality program. Data is a critical asset in the information economy, and the quality of a company’s data is a good predictor of its future success.


Software Modernization Strategies


A Level-Up – Software Modernization

Every year, I get a replacement desk calendar to help keep all of our activities straight – and for a family of four, that is no easy task. I start by taking all of the little appointment cards the dentist, orthodontist, pediatrician and GP give us for appointments that occur beyond the current calendar dates. I transcribe them all. Then I go through last year’s calendar to transfer any information that is relevant to this year’s calendar. And finally, I put the calendar down in the basement next to the previous years’ calendars so I can refer back to them if I need to. Last year’s calendar contains a lot of useful information, but it no longer solves my need to organize schedules for this year.

In a very loose way – this is very similar to application retirement. Many larger health plans have existing systems that were created several years (sometimes even several decades) ago. These legacy systems have been customized to reflect the health plan’s very specific business processes. They may be hosted on costly hardware, developed in antiquated software languages and reliant on a few developers who are very close to retirement. The cost of supporting these (most likely) antiquated systems can divert valuable dollars away from innovation.

The process that I use to move appointment and contact data from one calendar to the next works for me – but is relatively small in scale. Imagine if I was trying to do this for an entire organization without losing context, detail or accuracy!

There are several methodologies for determining the best strategy for your organization to approach software modernization, including:

  • Architecture Driven Modernization (ADM) is an initiative to standardize views of existing systems in order to enable common modernization activities like code analysis and comprehension, and software transformation.
  • SABA (Bennett et al., 1999) is a high-level framework for planning the evolution and migration of legacy systems, taking into account both organizational and technical issues.
  • SRRT (Economic Model to Software Rewriting and Replacement Times) (Chan et al., 1996) is a formal model for determining optimal software rewrite and replacement timings based on versatile metrics data.
  • And if all else fails: Model Driven Engineering (MDE) is being investigated as an approach for reverse engineering and then forward engineering software code.

My calendar migration process evolved over time; your method for software modernization should be well planned prior to the go-live date for the new software system.


How Protected is your PHI?

I live in a very small town in Maine. I don’t spend a lot of time thinking about my privacy. Some would say that by living in a small town, you give up your right to privacy because everyone knows what everyone else is doing. Living here is a choice – for me to improve my family’s quality of life. Sharing all of the details of my life – not so much.

When I go to my doctor (who also happens to be a parent from my daughter’s school), I fully expect that any sort of information that I share with him, or that he obtains as a result of lab tests or interviews, or care that he provides is not available for anyone to view. On the flip side, I want researchers to be able to take my lab information combined with my health history in order to do research on the effectiveness of certain medications or treatment plans.

As a result of this dichotomy, Congress (in 1996) started to address governance regarding the transmission of this type of data. The Health Insurance Portability and Accountability Act of 1996 (HIPAA) is a Federal law that sets national standards for how health care plans, health care clearinghouses, and most health care providers protect the privacy of a patient’s health information. With certain exceptions, the Privacy Rule protects a subset of individually identifiable health information, known as protected health information or PHI, that is held or maintained by covered entities or their business associates acting for the covered entity. PHI is any information held by a covered entity which concerns health status, provision of health care, or payment for health care that can be linked to an individual.

Many payers have this type of data in their systems (perhaps in a Claims Administration system), and have the need to share data between organizational entities. Do you know if PHI data is being shared outside of the originating system? Do you know if PHI is available to resources that have no necessity to access this information? Do you know if PHI data is being shared outside your organization?

If you can answer yes to each of these questions – fantastic. You are well ahead of the curve. If not, you need to start considering solutions that can address these gaps.
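
As one illustration of such a capability, the sketch below masks direct identifiers before a record leaves the originating system while leaving the analytically useful fields intact. The field names and masking choices are hypothetical; real data masking tools offer far richer, policy-driven options.

```python
# A minimal sketch of masking PHI fields before a record is shared.
# Field names and masking choices are hypothetical illustrations.

import hashlib

PHI_FIELDS = {"name", "ssn", "address", "phone"}

def mask_record(record: dict, secret: str = "rotate-me") -> dict:
    """Replace direct identifiers with salted hashes; keep analytic fields intact."""
    masked = {}
    for field, value in record.items():
        if field in PHI_FIELDS:
            digest = hashlib.sha256(f"{secret}:{value}".encode()).hexdigest()
            masked[field] = digest[:12]
        else:
            masked[field] = value
    return masked

claim = {"name": "Jane Q. Member", "ssn": "123-45-6789",
         "diagnosis_code": "E11.9", "paid_amount": 182.50}
print(mask_record(claim))   # identifiers hashed, clinical/financial fields untouched
```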

I want researchers to have access to medically relevant data so they can find the cures to some horrific diseases. I want to feel comfortable sharing health information with my doctor. I want to feel comfortable that my health insurance company is respecting my privacy. Now to get my kids to stop oversharing.
