Category Archives: Healthcare
Here it is day four of National Health IT Week, and there have been lots of interesting things to talk about and be excited about so far: EHRs capturing lots of great data; analytics and predictive modeling uncovering insights we’ve never been able to get at before; and where the hype of big data meets reality. But having access to data and being able to do cool things with it to deliver higher-value care is only half the story – the other half is how our reimbursement system is also changing to align financial incentives with quality and value.
We’re seeing the early beginnings of value-based reimbursement changing behavior, with one of the more intriguing examples being the Centers for Medicare &amp; Medicaid Services (CMS) Hospital Readmissions Reduction Program (HRRP). Under this program, CMS calculates a risk-adjusted ‘expected readmission rate’ for each hospital in the country for congestive heart failure (CHF), acute myocardial infarction (AMI) and pneumonia. If a hospital’s actual readmission rate exceeds the expected rate, its Medicare reimbursements are penalized for the following year by up to 1% in 2013, 2% in 2014, and 3% in 2015. Given that hospitals typically receive 50% or more of their revenue from CMS, these penalties rapidly add up to real money and are prompting meaningful action on the part of hospitals to understand the causes of readmissions and address them proactively. But what’s intriguing is to get beyond thinking of the HRRP as a penalty-imposing program, and instead think of it as an outcomes-based reimbursement program.
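To make the stakes concrete, here is a hypothetical back-of-the-envelope sketch of how such a penalty might play out. The hospital figures, the rates, and the way the penalty scales with excess readmissions are all illustrative assumptions – the actual HRRP methodology computes excess readmission ratios across all measured conditions – but the shape of the incentive is the same.

```python
def hrrp_penalty(medicare_revenue, actual_rate, expected_rate, max_penalty):
    """Return a dollar penalty under a simplified HRRP-style model.

    A hospital is penalized only when its actual readmission rate
    exceeds its risk-adjusted expected rate; the penalty is capped
    at max_penalty (1% in 2013, 2% in 2014, 3% in 2015).
    """
    if actual_rate <= expected_rate:
        return 0.0
    # Simplified assumption: scale the capped penalty by how far the
    # actual rate exceeds the expected rate.
    excess = (actual_rate - expected_rate) / expected_rate
    penalty_rate = min(max_penalty, max_penalty * excess)
    return medicare_revenue * penalty_rate

# A hypothetical hospital with $200M in annual Medicare revenue, a 24%
# actual CHF readmission rate against a 21% expected rate, in 2015:
penalty = hrrp_penalty(200_000_000, 0.24, 0.21, 0.03)
print(f"${penalty:,.0f}")
```

Even under this toy model the penalty runs well into six figures – which is why a program framed as a penalty is better understood as paying hospitals to keep patients out of the hospital.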
What CMS has really done is effectively say the payment for a CHF admission, for example, is not just for the individual admission, but rather a payment to keep the patient out of the hospital for the next 30 days. And not just keep them out of the hospital for CHF, but for any reason since ‘readmission’ is all-inclusive of most any reason for the readmission and not limited to just the original reason for discharge. This requires some radical rethinking of how hospitals and health systems measure quality and value, since it brings to bear things they have traditionally had little control over such as patient compliance with discharge instructions and adherence to best-practices by physicians in the community. Even though it may not be directly under the hospital’s control, hospitals for the first time have a meaningful financial incentive to encourage patients and others to ensure appropriate follow-up care and activities are accomplished.
Tomorrow will be a quick wrap-up and a pat on the back for all the tireless effort of an entire industry in the midst of unprecedented change.
So it’s ‘hump day’ of National Health IT Week and we’ve already talked about how EHRs are capturing a treasure trove of rich data, and the burgeoning enthusiasm for analytics and predictive modeling that is the natural consequence of having this data.
But what about the whole “big data” thing? I’ve been on the fence, less than enthusiastic about all the hand waving and bell ringing surrounding the big data movement in healthcare, simply based upon our inability as an industry to really do anything particularly useful with the “little data” we already have. We need look no further than all of the angst and heartfelt bickering over the very modest data requirements of Meaningful Use Stage I to validate this assessment of our industry’s readiness for “big data”. This isn’t to say that there have not been pockets of incredibly talented individuals, applying extraordinary effort, doing cool things with data and yielding some glimmers of hope for “big data”. But as an industry, we have not really done much with the data we already have to understand what works, and what doesn’t, and use that insight to change our behavior. And when it comes to data, I don’t think you can run before you can walk, and in my mind big data is Olympic-caliber running. It’s also important to consider my opinion only in the context of strongly structured discrete data like lab results and coded clinical data entered into EHRs. This distinction is relevant since, when it comes to things like digital imaging, PACS, and advanced visualization algorithms, the broader healthcare market has arguably been dealing with “big data” for more than a decade.
But I’m beginning to change my mind on big data in healthcare. And rather than walking before you run, big data may present an opportunity to leap ahead in deriving value from data in very focused areas, even before full competence in ‘small data’ analytics has been achieved. The reason for my change in thinking is really related to (a) an evolving understanding of how big data technologies can be applied to existing data problems, and (b) compelling new sources of data, and potential solutions, that have never before been possible.
Big data isn’t just about doing analysis of Twitter feeds and Facebook posts (which encompass all three V’s of big data – volume, variety and velocity); it can also be about having the cost-effective processing horsepower to do much more sophisticated analytics on the clinical or billing data we already have, or the data that we’re going to have from our EHRs. Rather than testing clinical or financial models against a month’s worth of data, or a quarter’s worth, now we can test those same models against a year, or five years, or a decade’s worth of data. This same processing horsepower means analysis that might have taken days or weeks can now be done in minutes or hours, which makes the results that much more valuable in impacting clinical care and changing frontline staff behaviors. And to the extent these big data technologies can be adopted and applied to today’s data analytics needs by mere mortals, all the better. For example, Hadoop has been a centerpiece in the whole big data hype circus, and historically has been a complex beast that could only be mastered by the most advanced and sophisticated IT shops. And who in their right mind, with all the other challenges facing healthcare IT (ICD-10 conversion, EHR implementation, HIPAA privacy audits, flat budgets, health information exchanges, etc.), wants to try something like that? But what we’re seeing is a maturing of the Hadoop technology stack and a vendor ecosystem developing around the platform, with solutions that make it much more practical for healthcare organizations to try out some of these big data solutions. For example, Informatica has recently announced support for Hadoop such that any transformations or data quality rules created in Informatica can be run on Hadoop unchanged – taking advantage of the lower cost and higher performance of the Hadoop platform while avoiding much of the complexity and specialized skills that have traditionally been a huge barrier to adoption.
One potential area where this sort of approach could be applied is crowdsourcing medical decisions – a topic I have previously written about, and one that I think has very real implications for providers aspiring to become “learning healthcare organizations”.
There was also a good report from InformationWeek titled 2013 Healthcare IT Priorities that observed a growing proliferation of data coming from mobile devices and personal medical monitors, which in my mind will inevitably hit all three V’s (volume, velocity and variety) that traditionally define big data. They further state that we are rapidly heading towards a future where acquiring data ceases to be the problem, but figuring out what to do with it becomes the real challenge.
In this same vein, I also believe that some of the more consumer-oriented “big data things” have potential promise in healthcare. The creation of an individual insurance market is going to drive a tectonic shift in the perspective of payors as they reorient their sales and marketing from selling coverage in large chunks to employers, and instead need to understand and target the far more finicky individual consumer, who has a very different perspective on value and customer service than their traditional employer buyer. In this situation, all those social media feeds and the sentiment analysis they can reveal become very compelling, with a demonstrable ROI, as does old-fashioned analytics on big data such as web click-stream analysis to optimize the consumer experience on payor websites.
There is also clear potential in combining consumer data with clinical data to provide a more complete 360-degree view of the patient for providers – surfacing key information such as what over-the-counter drugs a patient may have bought over the last 30, 60 or 90 days that may have a marked adverse interaction with a prescription they are taking, or simply be clinically undesirable (such as someone with hypertension taking an OTC antihistamine, for example). Providing this insight to the provider at the time of the patient’s next visit – or better still, applying rules in near-real-time as soon as the data becomes available and alerting the patient’s care team even when no visit is scheduled – can really shift the role of physicians toward orchestrating a patient’s health and wellness rather than simply treating symptoms and disease during an office visit or hospital admission.
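As a rough sketch of what such a near-real-time rule might look like, consider the hypothetical example below. The rule table, drug classes, patient record shape, and IDs are all invented for illustration; a production system would draw on a curated drug–condition knowledge base and real pharmacy purchase feeds rather than a hard-coded set.

```python
# Hypothetical (OTC drug class, patient condition) pairs considered
# clinically undesirable; illustrative only, not clinical guidance.
OTC_CONDITION_RULES = {
    ("antihistamine", "hypertension"),
    ("nsaid", "chronic_kidney_disease"),
}

def check_otc_purchase(patient, purchase):
    """Return alert messages if a new OTC purchase conflicts with the
    patient's documented conditions; an empty list means no alert."""
    alerts = []
    for condition in patient["conditions"]:
        if (purchase["drug_class"], condition) in OTC_CONDITION_RULES:
            alerts.append(
                f"Patient {patient['id']}: OTC {purchase['drug_class']} "
                f"purchase may be undesirable given {condition}; "
                "notify care team."
            )
    return alerts

patient = {"id": "P001", "conditions": ["hypertension"]}
purchase = {"drug_class": "antihistamine", "source": "retail pharmacy feed"}
for alert in check_otc_purchase(patient, purchase):
    print(alert)
```

The interesting design point is that the rule fires on data arrival, not on a scheduled visit – which is precisely what moves the physician from episodic treatment toward ongoing orchestration.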
Tomorrow we will move on to what is motivating healthcare providers to finally take a genuine interest in analytics and business intelligence – namely, aligned financial incentives.
Yesterday I wrote about the proliferation of rich clinical data being captured by electronic health records and how unlocking the potential in this data will be the fuel of healthcare innovation. We’re seeing a true blossoming of the market for analytics and predictive modeling in healthcare, and no amount of delay or potential backtracking on the Affordable Care Act is going to un-ring this bell. This focus on analytics is a natural consequence of simply having the data available to gain insight into business and clinical processes and outcomes, much as analytics was a natural follow-on in the years after the implementation of SAP and other new ERP applications motivated by the Y2K programming flaw. And the groundswell is only just beginning as organizations stick their collective toes in the water of analytics and find that sharks, and alligators, and other predators are not nearly so common as they thought. I spend a significant amount of my time meeting with senior leaders in large healthcare organizations looking to get started in a big way with business intelligence, or analytics, or enterprise data management. By any name, there’s a common perception dating back a decade or more that an enterprise data warehouse was something that cost millions of dollars, took years to implement, ultimately failed to meet business expectations, and often resulted in the CIO having the opportunity to apply their skills in a new organization. The good news is that the patterns and behaviors that can lead to failure in an enterprise data warehousing initiative are pretty well understood, having been learned through hard knocks in industries that have figured out how to do enterprise data warehouses – retail, manufacturing, and financial services, to name a few.
Equally encouraging for healthcare’s prospects with enterprise data warehousing is that our historical reluctance as an industry to look outside healthcare for best practices seems to be taking a backseat to the practical reality that there’s a lot to be learned from folks who have already been successful in data warehousing – even if it’s not specifically in healthcare. This only makes sense, since the architecture and discipline of making data useful at enterprise scale is a real talent, and an art form, and folks who have done it successfully are a rare breed. Would it be nice if they had specific healthcare expertise? Sure. But the reality is that a modern, successful enterprise information management strategy hinges on the ability of the business, clinical experts, and IT to collaborate in a highly dynamic, iterative fashion to really derive value from data. With this team approach, the fact that the IT folks may be just data and process experts is very much mitigated by the other team members who are the business and clinical experts.
Tomorrow we will jump on the hype wagon of “Big Data” and see what that has to offer the healthcare industry.
Another national Health IT Week is upon us, and with no immediately apparent need to send flowers, chocolates or baked goods to some worthy recipient, I am left to simply reflect on the rather remarkable state we find ourselves in as an industry and as healthcare IT professionals. Starting today and for the next four days I’ll take a moment to opine on the state of the industry and why I think there’s never been a more promising time for health IT to transform our healthcare system, or a more exciting time to be a health IT professional.
Despite lots of grousing and some inevitable bumps along the way, HITECH is largely a success in serving as a catalyst for the widespread adoption of electronic health records among providers. Health and Human Services recently announced that more than 50% of physicians and 80% of hospitals will be using an EHR by the end of 2013 — up from just 17% and 9%, respectively, in 2008. So after decades of being relegated to making do with claims data as the principal fodder for analytics, we are beginning to experience the dawning of a new age of healthcare analytics and predictive modeling, with rich, clinically relevant data as the raw material to feed our analytics appetite.
Meaningful Use Stage I has been a masterful first increment in beginning the culture shift necessary for healthcare providers to begin to appreciate both the value of reliable, trustworthy data, as well as understand that useful, high-quality data is the result of an end-to-end process that begins with data entry in the application and ends with useful analytics, reports, and other data products. This is not to say that technical missteps along the way don’t have a role in adversely impacting data quality, but by far the most pervasive challenges are in how the data is captured at the point of care or data entry. This shift in awareness is a critical step in moving beyond simple compliance with Meaningful Use data requirements, and instead looking at data as an enterprise asset where the business and clinical sides of the business are held accountable for the quality of the asset, rather than IT.
Tomorrow we’ll look at how analytics and predictive modeling is poised to unlock the potential of all this data to transform healthcare.
As we head into National Health IT Week … like any good writer faced with a blank sheet, I was battling writer’s block by perusing Facebook. Coincidentally, I came across this HBR article. Healthcare — on the front page of the Harvard Business Review; so mainstream!
I implemented a Radiology Information System in 2000 and an electronic Medication Administration Record (MAR) in 2002. Back in the day, healthcare IT was the underdog; only the geekiest of geeks were up all night comparing paper MARs to electronic MARs, working side by side with the nurses and HIM to iron out bugs, and taking delivery of new code into the wee hours of the morning.
Then I thought about previous National Health IT Week events. I remember gathering in DC with a bunch of other healthcare IT professionals, discussing the importance of health IT. Many may not realize the type of advocacy and awareness-building that occurs during this week – it’s pretty impactful. We had the unique experience of walking to the office of Senator Dick Durbin, meeting with him and requesting his assistance in making healthcare IT top of mind.
We’ve come a long way. But. We have a long way to go.
In the recent past healthcare has invested heavily in applications and infrastructure: EMR adoption is up, people are commonly using the words “healthcare analytics” and “data” is everyone’s favorite four-letter word. As data surfaces to the top of minds, gaining access to it, improving its quality and making sure that everyone trusts it has to be the next step for healthcare providers and payers. Hand-coding interactions between systems is time-intensive and error-prone, and information in aggregate magnifies data inconsistencies and data quality errors – for example, it’s always surprising to learn how many different ways a single enterprise can document marital status.
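As a small illustration of that marital status problem, here is a sketch of the kind of normalization step a data quality tool performs when consolidating values from multiple source systems. The variant spellings and the target codes are assumptions made up for the example.

```python
# Hypothetical variants of "marital status" as documented across
# different source systems in one enterprise; illustrative only.
MARITAL_STATUS_MAP = {
    "m": "Married", "married": "Married", "mar": "Married",
    "s": "Single", "single": "Single", "sgl": "Single",
    "d": "Divorced", "div": "Divorced", "divorced": "Divorced",
    "w": "Widowed", "wid": "Widowed", "widowed": "Widowed",
}

def normalize_marital_status(raw):
    """Map a free-text marital status value onto a standard code,
    falling back to 'Unknown' rather than passing bad data through."""
    key = raw.strip().lower().rstrip(".")
    return MARITAL_STATUS_MAP.get(key, "Unknown")

# Values as they might arrive from three different source systems:
print([normalize_marital_status(v) for v in ["M", "married", "Div.", "??"]])
```

The point is less the mapping table than the fallback: aggregated data is only trustworthy if inconsistencies are surfaced as "Unknown" and fixed at the source, rather than silently propagated downstream.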
The reason to drill into this data is that locked within it are the keys to value-driven healthcare. To derive value from data, a commensurate investment in data is necessary. I hope that this year’s National Health IT Week includes a focus on and discussion of the data itself – making it accessible and trustworthy – and the types of tools required to do this. Becoming data-driven is the only way to succeed in the value-based model we are moving to. The three pillars of data-driven healthcare are 1) Accessing and Using Data as an Asset, 2) Having Knowledge of All Participants and Actors and 3) Taking Action on What You Know.
ROI = every executive’s favorite acronym and one that is often challenging to demonstrate.
In our interactions with provider clients and prospects we are hearing that they’ve migrated to new EMRs but aren’t receiving the ROI they had budgeted or anticipated. In many cases, they are using the new EMR for documentation but still paying to maintain the legacy EMR for access to historical data for billing and care delivery. If health systems can retire these applications and still maintain operational access to the data, they will be able to realize the expected ROI and serve patients proactively.
My colleague, Julie Lockner, wrote a blog post about how Informatica Application Retirement for Healthcare is helping healthcare organizations to retire legacy applications and realize ROI.
Healthcare organizations are currently engaged in major transformative initiatives. The American Recovery and Reinvestment Act of 2009 (ARRA) provided the healthcare industry incentives for the adoption and modernization of point-of-care computing solutions, including electronic medical and health records (EMRs/EHRs). Funds have been allocated, and these projects are well on their way. In fact, many of the largest hospitals in the US are engaged in implementing Epic, a software platform that is essentially the ERP for healthcare.
These Cadillac systems are being deployed from scratch, with very little data being ported from the old systems into the new. The result is a glut of legacy applications running in aging hospital data centers, consuming every last penny of HIS budgets. Because the data still resides on those systems, hospital staff continue to use them, making them difficult to shut down or retire.
Most of these legacy systems are not running on modern technology platforms – they run on the likes of HP TurboIMAGE, InterSystems Caché (MUMPS), and embedded proprietary databases. Finding people who know how to manage and maintain these systems is costly and risky – risky because data residing in those applications may be subject to data retention requirements (patient records, etc.), and as those skills disappear, the data becomes inaccessible.
A different challenge for CFOs of these hospitals is the ROI on these Epic implementations. Because these projects are multi-phased and multi-year, boards of directors are asking about the value realized from these investments. Many are coming up short because they are maintaining both applications in parallel. Relief will come when legacy systems can be retired – but getting hospital staff and regulators to approve a retirement project requires evidence that they can still access the data while adhering to compliance requirements.
Many providers have overcome these hurdles by successfully implementing an application retirement strategy based on the Informatica Data Archive platform. Several of the largest children’s hospitals in the US are either already saving or expecting to save $2 million or more annually from retiring legacy applications. The savings come from:
- Eliminating software maintenance and license costs
- Eliminating hardware dependencies and costs
- Reducing storage requirements by 95% (archived data is stored in a highly compressed, accessible format)
- Improving IT efficiency by eliminating specialized processes or skills associated with legacy systems
- Freeing IT resources – teams can spend more of their time working on innovations and new projects
Informatica Application Retirement Solutions for Healthcare provide hospitals with the ability to completely retire legacy applications while maintaining hospital staff access to the archived data. And with built-in security and retention management, records managers and legal teams can satisfy compliance requirements. Contact your Informatica Healthcare team for more information on how you can get that Epic ROI the board of directors is asking for.
Interesting that I found this one. Informatica announced that two Informatica customers were named Leaders in the Ventana Research 2013 Leadership Awards, which honor the leaders and pioneers who have contributed to their organizations’ successes. While many of you may think that I’m shilling for our host, these stories are actually hard to come by, and when I find them I love to make hay.
This is not a lack of interest; it’s just the fact that those successful with data integration projects are typically the unsung heroes of enterprise IT. There are almost never awards. However, those who count on enterprise IT to provide optimal data flow in support of the business processes should understand that the data integration got them there. In this case, one of the more interesting stories was around UMASS Memorial Health Care: Leadership Award For CIO George Brenckle
“The Ventana Research Leadership Awards recognize organizations and supporting technology vendors that have effectively achieved superior results through using people, processes, information and technology while applying best practices within specific business and technology categories.” Those receiving these awards leverage the Informatica Platform, thus why the award is promoted. However, I find the approaches and the way technology is leveraged most interesting.
Just a bit of background: UMASS Memorial Health Care undertook the Cornerstone initiative to transform the way data is used across its medical center, four community hospitals, more than 60 outpatient clinics, and the University of Massachusetts Medical School. The geographical distribution of these entities and the different ways they store data are always the challenge.
When approaching these problems you need two things: first, a well-defined plan for how you will approach the problem, including the consumption of information from the source, the processing of that information, and the production of that information to the target; and second, technology capable of executing that plan. Cornerstone implements common patient, clinical and financial systems and drives information across these systems to optimize healthcare delivery and improve patient outcomes, grow the patient population and increase efficiency.
UMASS Memorial Health Care used Informatica to establish a data integration and data quality initiative to incorporate data from its clinical, financial and administrative sources and targets. Using the Informatica technology, they are able to place volatility into the domain of the integration technology, in this case, Informatica. This allows the integration administrator to add or delete systems as needed, and brings the concept of agility to a rapidly growing hospital system.
“The success of Cornerstone has resulted in primary patient panel analytics, online diabetes care, a command center for the ICU, and compliance with Medicare programs.” Indeed, the business results are apparent around the use of data integration approaches and technology, including the ability to trace this project back to an immediate and definable business benefit.
A lesson learned from other industries, like retail and financial services, is that while analytics and data warehouses are critical components of delivering big results from data, neither is easy. Gartner reported that 80% of data warehousing initiatives fail to meet expectations, often running over budget and failing to deliver an ROI.
- Executives are often frustrated because responses to their requests for new reports and edited reports take too long
- Misunderstood requirements and costly rework are the result of a lack of collaboration between stakeholders and IT
- BI consumers lose confidence in data; they don’t trust it because they lack transparency into its lineage and don’t understand why it appears differently after being aggregated with data from other applications
Expecting value from data without making a commensurate investment in data results in unmet expectations. Accessing data is hard, each request requires new effort, establishing enterprise standards for data quality is an enormous effort, and transforming data to fit into a heterogeneous intelligence environment is complicated and time-consuming.
Introducing multiple sources of data across organizational boundaries creates a need for an environment that supports effective collaboration between stakeholders and the information technology team implementing solutions to manage data. To be genuinely useful, data must be verifiable and trustworthy since only then will stakeholders have the confidence to make data-driven decisions. To realize the value of data, from Epic and beyond, IT leaders must implement business intelligence and data warehousing best practices that:
- bring data together across applications including clinical and financial data
- foster collaboration between clinicians, IT and business stakeholders
- establish trust and confidence in business intelligence and decision making.
EMR vendors have long suggested that their EMR and built-in business intelligence capabilities negate the need for a plan to integrate data or implement a separate data warehousing and business intelligence solution. This philosophy raises the question: how can one transactional clinical application support the intelligence needs of an entire enterprise? Consider customer relationship management data feeding customer-driven marketing initiatives, time-tracking data full of valuable employee utilization stats, payer claims data, and newly acquired practices running an EMR independent of Epic… just to name a few.
With the recognition that an EMR accounts for only a fraction of the data needed for reliable and comprehensive business intelligence comes requirements to reconcile terminology and data quality standards across an increasingly large set of trading partners and stakeholders, to access data from other sources (like payroll, CRM and claims) and to migrate clinical data from legacy applications.
In fact, business intelligence and analytics are dependent on data from across the enterprise. Most clinical and financial decisions are dependent on data; great potential lies within data – making it a valuable asset. This is not a new idea. What is a newer concept is what it means to really elevate data to the status of an asset. Unlocking the potential of data as an asset requires that healthcare organizations begin to think about and invest in data in new ways; making investments beyond traditional infrastructure like databases and data storage. Healthcare organizations must make investments in the ongoing management and improvement of the data itself as they do with any other asset, like talent, buildings or their EMR – for example understanding its quality and allocating people and systems to managing it. Moving faster in this competitive climate and delivering differentiated results requires it.
Check back next week for Part II which explores treating data as an asset further.