Category Archives: Healthcare
Saeed, what does Decision Point do?
We are a healthcare engagement analytics company. Essentially, we help clients that are "at risk" organizations improve performance, including STAR ratings. We do this by providing data-driven insights to engage members and providers more effectively.
What type of data do you use to make these recommendations?
Well, taking better care of members is about emotionally involving them in their care. The information to help do this resides in data that plans already have available: utilization patterns, distance to doctors, whether they are compliant with evidence-based guidelines, whether they call into the call center. We also seek to include information about their behavior as consumers, such as their lifestyles, their access to technology, and so forth.
Claims data makes sense, everyone has that but the other data you mentioned, that can be harder to capture. Why does non-claims oriented data matter?
We develop predictive models that are unique to each client, based specifically on the demographics and variables of their population. Variables like exercise and technology access matter because, for example, exercise habits influence mood, and access to technology gives us a way to contact members or invite them to participate in online communities with other members like themselves.
The predictive models then determine which members are at most risk?
Yes, they do, but they can also determine a member's barriers to desired behavior and their likelihood of responding to and acting on health plan communications. For example, if we identified a diabetic member as being at high risk of non-compliance, found that their primary barrier to compliance was health literacy, and determined that the member would likely respond positively to a combination of health coaching and mobile health initiatives, we would recommend outreach that directly addresses these findings.
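To make the diabetic-member example concrete, here is a minimal sketch of the kind of scoring-and-recommendation logic described above. This is not Decision Point's actual model; every feature name, weight, threshold, and outreach mapping below is invented for illustration.

```python
# Hypothetical sketch of a member risk-and-outreach model.
# All feature names, weights, and thresholds are illustrative only.
from typing import Optional

WEIGHTS = {
    "missed_refills": 0.35,        # utilization pattern from claims
    "distance_to_pcp_miles": 0.02, # access barrier
    "low_health_literacy": 0.40,   # from surveys / consumer data
    "no_internet_access": 0.15,    # technology access
}

OUTREACH_BY_BARRIER = {
    "low_health_literacy": "health coaching + mobile health program",
    "no_internet_access": "telephonic outreach via call center",
}

def risk_score(member: dict) -> float:
    """Weighted sum of member variables, clipped to [0, 1]."""
    score = sum(WEIGHTS[k] * float(member.get(k, 0)) for k in WEIGHTS)
    return min(score, 1.0)

def recommend(member: dict, threshold: float = 0.5) -> Optional[str]:
    """Return an outreach recommendation for high-risk members only."""
    if risk_score(member) < threshold:
        return None
    barriers = [k for k in OUTREACH_BY_BARRIER if member.get(k)]
    return OUTREACH_BY_BARRIER[barriers[0]] if barriers else "standard mailing"

# The diabetic member from the text: non-compliant, low health literacy
member = {"missed_refills": 1, "low_health_literacy": 1, "distance_to_pcp_miles": 4}
print(recommend(member))
```

A real model would learn its weights from each client's population rather than hard-coding them, which is the point Saeed makes about models being unique to each client.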
Noreen, when you were working on the payer side of the house, how were you going about determining which members were in your at risk population?
We had teams of people mining claims data, and we were asking members to complete surveys. This generated more data, but the sheer volume made it complex to accurately review and assess which members were at highest risk. It was very challenging to take into consideration all of the variables that impact each member. Taking data from so many disparate sources and bringing it together is a big challenge.
What made it (and continues to make it) so challenging, specifically for STARS?
So much of the data is collected as surveys or in other non-standard formats. Members inherently are unique which creates a lot of variability and it is often difficult to interpret the relationships that exist between members and primary care physicians, specialists, facilities and the rest of their care team. Relationships are important because they can provide insights into utilization patterns, potential overlaps or gaps in care and how we can most effectively engage those members in their care.
What are Informatica and Decision Point doing together?
To optimize the predictive models, as Saeed described, it's imperative to feed them as much data, and as accurate data, as possible. Without data, insights will be missed, and insights are the path to discovery and to improving CMS STARS ratings. Informatica is the data integration company: we ensure that data is reliable, connected (from any source to any target) and safe (avoiding data breaches or HIPAA violations). Informatica is delivering data to Decision Point efficiently and effectively so that clients have access to the best data possible to derive insights and improve outcomes. Our technology also provided the Star team with a member profile, which brings together that disparate data and organizes it into a 360-degree view of the member. In addition to fueling Decision Point's powerful algorithms, this is a tool that can be used for ongoing insights into the members.
Excellent, how can readers learn more?
In my last blog, I talked about the dreadful experience of cleaning raw data by hand as a former analyst a few years back. Well, the truth is, I was not alone. At a recent data mining Meetup event in the San Francisco Bay Area, I asked a few analysts: "How much time do you spend cleaning your data at work?" "More than 80% of my time" and "most of my days," the analysts said, "and it is not fun."
But check this out: there are over a dozen Meetup groups focused on data science and data mining here in the Bay Area where I live. Those groups put on events multiple times a month, with topics often around hot, emerging technologies such as machine learning, graph analysis, real-time analytics, new algorithms for analyzing social media data, and of course, anything Big Data. Cool BI tools and new programming models and algorithms for better analysis are a big draw for data practitioners these days.
That got me thinking… if what the analysts told me is true, i.e., they spend 80% of their time prepping data and only the remaining 20% analyzing it and visualizing the results (which, BTW, "is actually fun," to quote one data analyst), then why are they drawn to events focused on tools that only help with that 20% of their time? Why wouldn't they want to explore technologies that can help address the dreadful 80%, the data scrubbing they complain about?
Having been there myself, I thought perhaps a little self-reflection would help answer the question.
As a student of math, I love data and am fascinated by the good stories I can discover in it. My two-year math program in graduate school was primarily focused on learning how to build fabulous math models to simulate real events, and to use those formulas to predict the future or look for meaningful patterns.
I used BI and statistical analysis tools while at school, and continued to use them at work after I graduated. That software was great: it helped me get to results, see what was in my data, and develop conclusions and recommendations based on those insights for my clients. Without BI and visualization tools, I would not have delivered any results.
That was the fun and glamorous part of my job as an analyst. But when I was not creating nice charts and presentations to tell the stories in my data, I was spending time, a great amount of time, sometimes up to the wee hours, cleaning and verifying my data. I was convinced that was part of my job and I just had to suck it up.
It was only a few months ago that I stumbled upon data quality software – it happened when I joined Informatica. At first I thought they were talking to the wrong person when they started pitching me data quality solutions.
Turns out, the concept of data quality automation is a highly relevant and extremely intuitive subject for me, and for anyone who deals with data on a regular basis. Data quality software automates the data cleansing process; it is much faster and delivers more accurate results than a manual process. To put that in math context: if a data quality tool can reduce the data cleansing effort from 80% to 40% of an analyst's time (BTW, this is hardly a random number; some of our customers have reported much better results), analysts can free up 40% of their time from scrubbing data and use it to do the things they like: playing with data in BI tools, building new models, running more scenarios, producing different views of the data, and discovering things they could not before, all with clean, trusted data. No more bored-to-death experience. What they are left with is improved productivity, more accurate and consistent results, compelling stories about data, and, most important, the freedom to focus on the work they enjoy. Not too shabby, right?
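To show the flavor of what gets automated, here is a toy sketch of rule-based standardization and de-duplication, the bread and butter of data cleansing. This is not Informatica's engine; the record fields and rules are hypothetical, and a real tool applies far richer matching logic.

```python
# Toy data-cleansing pass: normalize records, then de-duplicate.
# Field names and rules are illustrative only.
import re

def standardize(record: dict) -> dict:
    """Trim whitespace, normalize name casing, canonicalize phone digits."""
    clean = {k: v.strip() if isinstance(v, str) else v for k, v in record.items()}
    clean["name"] = clean["name"].title()
    digits = re.sub(r"\D", "", clean.get("phone", ""))
    clean["phone"] = digits[-10:] if len(digits) >= 10 else None
    return clean

def dedupe(records: list) -> list:
    """Keep one record per (name, phone) key; first occurrence wins."""
    seen, out = set(), []
    for r in map(standardize, records):
        key = (r["name"], r["phone"])
        if key not in seen:
            seen.add(key)
            out.append(r)
    return out

raw = [
    {"name": "  jane DOE ", "phone": "(617) 555-0142"},
    {"name": "Jane Doe", "phone": "617.555.0142"},  # same member, different format
    {"name": "John Smith", "phone": "617-555-0199"},
]
print(dedupe(raw))  # two unique members remain
```

The manual version of exactly this, reformatting phone numbers and hunting duplicates by eye, is where those wee-hours sessions went.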
I am excited about trying out the data quality tools we have here at Informatica. My fellow analysts, you should start looking into them too. And I will check back in soon with more stories to share.
Regardless of the industry, new regulatory compliance requirements are more often than not treated like the introduction of a new tax. A few may be supportive, some will see the benefits, but most will focus on the negatives – the cost, the effort, the intrusion into private matters. There will more than likely be a lot of grumbling.
Across many industries there is currently a lot of grumbling, as new regulation seems to be springing up all over the place. Pharmaceutical companies have to deal with IDMP in Europe and UDI in the USA. This is hot on the heels of the US Sunshine Act, which is being followed in Europe by Aggregate Spend requirements. Consumer Goods companies in Europe are looking at the consequences of beefed-up 1169 requirements. Financial institutions are mulling over compliance with BCBS 239. Behind the grumbling, most organisations across all verticals appear to have a similar approach to regulatory compliance. The pattern seems to go like this:
- Delay (The requirements may change)
- Scramble (They want it when? Why didn’t we get more time?)
- Code to Spec (Provide exactly what they want, and only what they want)
No wonder these requirements are seen as purely a cost and an annoyance. But it doesn’t have to be that way, and in fact, it should not. Just like I have seen a pattern in response to compliance, I see a pattern in the requirements themselves:
- The regulators want data
- Their requirements will change
- When they do change, regulators will be wanting even more data!
Now read the last 3 bullet points again, but use ‘executives’ or ‘management’ or ‘the business people’ instead of ‘regulators’. The pattern still holds true. The irony is that execs will quickly sign off on budget to meet regulatory requirements, but find it hard to see the value in “infrastructure” projects. Projects that will deliver this same data to their internal teams.
This is where the opportunity comes in. PwC's 2013 State of Compliance Report[i] shows that over 42% of central compliance budgets are in excess of $1m, a significant figure. Efforts outside of the compliance team imply a higher actual cost. Large budgets are not surprising in multi-national companies, who often have to satisfy multiple regulators in a number of countries. As an alternative to multiple overlapping compliance projects, what if this significant budget were repurposed to create a flexible data management platform? This approach could deliver compliance, and provide even more value internally.
Almost all internal teams are currently clamouring for additional data to drive their newest application. Pharma and CG sales & marketing teams would love ready access to detailed product information. So would consumer and patient support staff, as well as down-stream partners. Trading desks and client managers within financial institutions should really have real-time access to their risk profiles, guiding daily decision making. These data needs will not be going away. Why should regulators be prioritised over the people who drive your bottom line and who are guardians of your brand?
A flexible data management platform will serve everyone equally. Foundational tools for a flexible data management platform exist today, including Data Quality, MDM, PIM and VIBE, Informatica's Virtual Data Machine. Each of them plays a significant role in easing regulatory compliance, and as a bonus they deliver measurable business value in their own right. Implemented correctly, you get enhanced data agility and visibility across the entire organisation as part of your compliance efforts. Sounds like 'Buy One, Get One Free', or BOGOF in retail terms.
Unlike taxes, BOGOF opportunities are normally embraced with open arms. Regulatory compliance should receive a similar welcome – an opportunity to build the foundations for universal delivery of data which is safe, clean and connected. A 2011 study by The Economist found that effective regulatory compliance benefits businesses across a wide range of performance metrics[ii].
Is it time to get your free performance boost?
Before I joined Informatica I worked for a health plan in Boston. I managed several programs, including the CMS Five Star Quality Rating System and Risk Adjustment Redesign. We recognized the need for a robust diagnostic profile of our members in support of risk adjustment. However, because the information resides in multiple sources, gathering and connecting the data presented many challenges. I see the opportunity for health plans to transform risk adjustment.
As risk adjustment becomes an integral component of healthcare, I encourage health plans and ACOs to create a core competency around the development of diagnostic profiles. This profile is the source of reimbursement for an individual and the basis for clinical care management. Augmented with social and demographic data, the profile can create a roadmap for successfully engaging each member.
Why is risk adjustment important?
Risk Adjustment is increasingly entrenched in the healthcare ecosystem. Originating in Medicare Advantage, it is now applicable to other areas. Risk adjustment is mission critical to protect financial viability and identify a clinical baseline for members.
What are a few examples of the increasing importance of risk adjustment?
1) The Centers for Medicare and Medicaid Services (CMS) continues to increase the focus on Risk Adjustment. They are evaluating the value provided to the Federal government and beneficiaries. CMS has questioned the efficacy of home assessments and challenged health plans to provide a value statement beyond the harvesting of diagnosis codes that results solely in revenue enhancement. Illustrating additional value has been a challenge. Integrating data across the health plan will help address this challenge and derive value.
2) Marketplace members will also require risk adjustment calculations. After the first three years, the three "R's" will dwindle down to one "R": when Reinsurance and Risk Corridors end, we will be left with Risk Adjustment. To succeed with this new population, health plans need a clear strategy to obtain, analyze and process data. CMS processing delays make risk adjustment even more difficult. A health plan's ability to manage this information will be critical to success.
3) Dual Eligibles, Medicaid members and ACOs also rely on risk adjustment for profitability and improved quality.
With an enhanced diagnostic profile — one that is accurate, complete and shared — I believe it is possible to enhance care, deliver appropriate reimbursements and provide coordinated care.
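To ground the discussion, here is a simplified sketch of how an HCC-style risk score is assembled from a member's diagnostic profile: a demographic factor plus the sum of condition coefficients. The coefficients below are made up for illustration; the real CMS-HCC relative factors are published by CMS and updated annually.

```python
# Simplified HCC-style risk score. All coefficient values are
# illustrative placeholders, not actual CMS-HCC factors.

DEMOGRAPHIC_FACTORS = {
    ("F", "70-74"): 0.395,
    ("M", "70-74"): 0.379,
}

HCC_COEFFICIENTS = {
    "HCC18": 0.318,  # diabetes with chronic complications (illustrative)
    "HCC85": 0.323,  # congestive heart failure (illustrative)
}

def risk_score(sex: str, age_band: str, hccs: set) -> float:
    """Demographic factor plus the sum of documented condition coefficients."""
    score = DEMOGRAPHIC_FACTORS[(sex, age_band)]
    score += sum(HCC_COEFFICIENTS[h] for h in hccs)
    return round(score, 3)

# A 72-year-old woman with documented diabetes and CHF:
print(risk_score("F", "70-74", {"HCC18", "HCC85"}))
```

The arithmetic is trivial; the hard part, as described above, is assembling an accurate, complete set of documented diagnoses from disparate sources so that the inputs to this calculation can be trusted.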
How can payers better enable risk adjustment?
- Facilitate timely analysis of accurate data from a variety of sources, in any format.
- Integrate and reconcile data from initial receipt through adjudication and submission.
- Deliver clean and normalized data to business users.
- Provide an aggregated view of master data about members, providers and the relationships between them to reveal insights and enable a differentiated level of service.
- Apply natural language processing to capture insights otherwise trapped in text based notes.
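As a toy illustration of the last bullet, here is a minimal pattern-matching pass over a free-text note. A production clinical NLP system would use coded terminologies and handle negation ("no evidence of…"); this regex sketch, with invented patterns, does neither and is only meant to show the idea of surfacing conditions trapped in text.

```python
# Toy extraction of condition mentions from a clinical note.
# Patterns are illustrative; real clinical NLP handles negation,
# abbreviations, and maps hits to coded terminologies.
import re

CONDITION_PATTERNS = {
    "diabetes": r"\bdiabet(?:es|ic)\b",
    "hypertension": r"\b(?:hypertension|high blood pressure)\b",
    "chf": r"\b(?:congestive heart failure|chf)\b",
}

def extract_conditions(note: str) -> set:
    """Return condition keys whose pattern appears anywhere in the note."""
    text = note.lower()
    return {name for name, pat in CONDITION_PATTERNS.items()
            if re.search(pat, text)}

note = ("Pt is a 67 y/o diabetic male. BP elevated; "
        "history of high blood pressure. No evidence of CHF noted.")
print(sorted(extract_conditions(note)))
```

Note that the toy version wrongly flags CHF despite the "No evidence of" phrasing, which is exactly why real systems invest heavily in negation detection.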
With clean, safe and connected data, health plans can profile members and identify undocumented diagnoses. With this data, health plans will also be able to create reports identifying providers who would benefit from additional training and support (about coding accuracy and completeness).
What will clean, safe and connected data allow?
- Allow risk adjustment to become a core competency and source of differentiation. Revenue impacts are expanding to lines of business representing larger and increasingly complex populations.
- Educate, motivate and engage providers with accurate reporting. Obtaining and acting on diagnostic data is best done when the member/patient is meeting with the caregiver. Clear and trusted feedback to physicians will contribute to a strong partnership.
- Improve patient care, reduce medical cost, increase quality ratings and engage members.
According to a recent article in the LA Times, healthcare costs in the United States far exceed costs in other countries. For example, heart bypass surgery costs an average of $75,345 in the U.S. compared to $15,742 in the Netherlands and $16,492 in Argentina. In the U.S. healthcare accounts for 18% of the U.S. GDP and is increasing.
Michelle Blackmer is a healthcare industry expert at Informatica. In this interview, she explains why business as usual isn't good enough anymore. Healthcare organizations are rethinking how they do business in an effort to improve outcomes, reduce costs, and comply with regulatory pressures such as the Affordable Care Act (ACA). Michelle believes a data-driven healthcare culture is foundational to personalized medicine and discusses the importance of clean, safe and connected data in executing a successful transformation.
Q. How is the healthcare industry responding to the rising costs of healthcare?
In response to the rising costs of healthcare, regulatory pressures (i.e. the Affordable Care Act (ACA)), and the need to improve patient outcomes at lower costs, the U.S. healthcare industry is transforming from a volume-based to a value-based model. In this new model, healthcare organizations need to invest in delivering personalized medicine.
To appreciate the potential of personalized medicine, think about your own healthcare experience. It’s typically reactive. You get sick, you go to the doctor, the doctor issues a prescription and you wait a couple of days to see if that drug works. If it doesn’t, you call the doctor and she tries another drug. This process is tedious, painful and costly.
Now imagine if you had a chronic disease like depression or cancer. On average, any given prescription drug only works for half of those who take it. Among cancer patients, the rate of ineffectiveness jumps to 75 percent. Anti-depressants are effective in only 62 percent of those who take them.
Organizations like MD Anderson and UPMC aim to put an end to cancer. They are combining scientific research with access to clean, safe and connected data (data of all types including genomic data). The insights revealed will empower personalized chemotherapies. Personalized medicine offers customized treatments based on patient history and best practices. Personalized medicine will transform healthcare delivery. Click on the links to watch videos about their transformational work.
Q. What role does data play in enabling personalized medicine?
Data is foundational to value-based care and personalized medicine. Not just any data will do. It needs to be clean, safe and connected data. It needs to be delivered rapidly across hallways and across networks.
As an industry, healthcare is at a stage where meaningful electronic data is being generated. Now you need to ensure that the data is accessible and trustworthy so that it can be rapidly analyzed. As data is aggregated across the ecosystem and married with financial and genomic data, data quality issues become more obvious. It's vital that you can resolve the data issues so that people can spend their time analyzing the data to gain insights instead of wading through and manually resolving data quality issues.
The ability to trust data will differentiate leaders from the followers. Leaders will advance personalized medicine because they rely on clean, safe and connected data to:
1) Practice analytics as a core competency
2) Define evidence, deliver best practice care and personalize medicine
3) Engage patients and collaborate to foster strong, actionable relationships
Take a look at this Healthcare eBook for more on this topic: Potential Unlocked: Transforming Healthcare by Putting Information to Work.
Q. What is holding healthcare organizations back from managing their healthcare data like other mission-critical assets?
When you say other mission-critical assets, I think of facilities, equipment, etc. Each of these assets has people and money assigned to manage and maintain them. The healthcare organizations I talk to who are highly invested in personalized medicine recognize that data is mission-critical. They are investing in the people, processes and technology needed to ensure data is clean, safe and connected. The technology includes data integration, data quality and master data management (MDM).
What’s holding other healthcare organizations back is that while they realize they need data governance, they wrongly believe they need to hire big teams of “data stewards” to be successful. In reality, you don’t need to hire a big team. Use the people you already have doing data governance. You may not have made this a formal part of their job description and they might not have data governance technologies yet, but they do have the skillset and they are already doing the work of a data steward.
So while a technology investment is required and you need people who can use the technology, start by formalizing the data stewardship work people are doing already as part of their current job. This way you have people who understand the data, taking an active role in the management of the data and they even get excited about it because their work is being recognized. IT takes on the role of enabling these people instead of having responsibility for all things data.
Q. Can you share examples of how immature information governance is a serious impediment to healthcare payers and providers?
Sure, without information governance, data is not harmonized across sources and so it is hard to make sense of it. This isn’t a problem when you are one business unit or one department, but when you want to get a comprehensive view or a view that incorporates external sources of information, this approach falls apart.
For example, let's say the cardiology department in a healthcare organization implements a dashboard. The dashboard looks impressive. Then a group of physicians sees the dashboard, points out errors and asks where the information (i.e. diagnosis or attending physician) came from. If you can't answer these questions, trace the data back to its sources, or if you have data inconsistencies, the dashboard loses credibility. This is an example of how analytics fail to gain adoption and fail to foster innovation.
Q. Can you share examples of what data-driven healthcare organizations are doing differently?
Certainly, while many are just getting started on their journey to becoming data-driven, I’m seeing some inspiring examples, including:
- Implementing data governance for healthcare analytics. The program and data is owned by the business and enabled by IT and supported by technology such as data integration, data quality and MDM.
- Connecting information from across the entire healthcare ecosystem including 3rd party sources like payers, state agencies, and reference data like credit information from Equifax, firmographics from Dun & Bradstreet or NPI numbers from the national provider registry.
- Establishing consistent data definitions and parameters
- Thinking about the internet of things (IoT) and how to incorporate device data into analysis
- Engaging patients through non-traditional channels including loyalty programs and social media; tracking this information in a customer relationship management (CRM) system
- Fostering collaboration by understanding the relationships between patients, providers and the rest of the ecosystem
- Analyzing data to understand what is working and what is not working so that they can drive out unwanted variations in care
Q. What advice can you give healthcare provider and payer employees who want access to high quality healthcare data?
As with other organizational assets that deliver value—like buildings and equipment—data requires a foundational investment in people and systems to maximize return. In other words, institutions and individuals must start managing their mission-critical data with the same rigor they manage other mission-critical enterprise assets.
Q. Anything else you want to add?
Yes, I wanted to thank our 14 visionary customer executives at data-driven healthcare organizations such as MD Anderson, UPMC, Quest Diagnostics, Sutter Health, St. Joseph Health, Dallas Children’s Medical Center and Navinet for taking time out of their busy schedules to share their journeys toward becoming data-driven at Informatica World 2014. In our next post, I’ll share some highlights about how they are using data, how they are ensuring it is clean, safe and connected and a few data management best practices. InformaticaWorld attendees will be able to download presentations starting today! If you missed InformaticaWorld 2014, stay tuned for our upcoming webinars featuring many of these examples.
This year, over one dozen healthcare leaders will share their knowledge on data-driven insights at Informatica World 2014. These will be included in six tracks and over 100 breakout sessions during the conference. We are only five weeks away and I am excited that the healthcare track has grown 220% from 2013!
Join us for these healthcare sessions:
- Moving From Vision to Reality at UPMC : Structuring a Data Integration and Analytics Program: University of Pittsburgh Medical Center (UPMC) partnered with Informatica IPS to establish enterprise analytics as a core organizational competency through an Integration Competency Center engagement. Join IPS and UPMC to learn more.
- HIPAA Validation for Eligibility and Claims Status in Real Time: Healthcare reform requires healthcare payers to exchange and process HIPAA messages in less time with greater accuracy. Learn how HealthNet tackled this challenge.
- Application Retirement for Healthcare ROI : Dallas Children’s Hospital needed to retire outdated operating systems, hardware, and applications while retaining access to their legacy data for compliance purposes. Learn why application retirement is critical to the healthcare industry, how Dallas Children’s selected which applications to retire and the healthcare specific functionality that Informatica is delivering.
- UPMC’s story of implementing a Multi-Domain MDM healthcare solution in support of Data Governance : This presentation will unfold the UPMC story of implementing a Multi-Domain MDM healthcare solution as part of an overall enterprise analytics / data warehousing effort. MDM is a vital part of the overall architecture needed to support UPMC’s efforts to improve the quality of patient care and help create methods for personalized medicine. Today, the leading MDM solution developer will discuss how the team put together the roadmap, worked with domain specific workgroups, created the trust matrix and share his lessons learned. He will also share what they have planned for their consolidated and trusted Patient, Provider and Facility master data in this changing healthcare industry. This will also explain how the MDM program fits into the ICC (Integration Competency Center) currently implemented at UPMC.
- Enterprise Codeset Repositories for Healthcare: Controlling the Chaos: Learn the benefit of a centralized storage point to govern and manage codes (ICD-9/10, CPT, HCPCS, DRG, SNOMED, Revenue, TOS, POS, Service Category, etc.), mappings and artifacts that reference codes.
- Christus Health Roadmap to Data Driven Healthcare : To organize information and effectively deliver services in a hypercompetitive market, healthcare organizations must deliver data in an accurate, timely, efficient way while ensuring its clarity. Learn how CHRISTUS Health is developing and pursuing its vision for data management, including lessons adopted from other industries and the business case used to fund data management as a strategic initiative.
- Business Value of Data Quality : This customer panel will address why data quality is a business imperative which significantly affects business success.
- MD Anderson – Foster Business and IT Collaboration to Reveal Data Insights with Informatica: Is your integration team intimidated by the new Informatica 9.6 tools? Do your analysts and business users require faster access to data and answers about where data comes from? If so, this session is a must-attend.
- The Many Faces of the Healthcare Customer : In the healthcare industry, the customer paying for services (individuals, insurers, employers, the government) is not necessarily the decision-influencer (physicians) or even the patient — and the provider comes in just as many varieties. Learn how Quest, the world’s leading provider of diagnostic information, leverages master data management to resolve the chaos of serving 130M+ patients, 1200+ payers, and almost half of all US physicians and hospitals.
- Lessons in Healthcare Enterprise Information Management from St. Joseph Health and Sutter Health : St. Joseph Health created a business case for enterprise information management, then built a future-proofed strategy and architecture to unlock, share, and use data. Sutter Health engaged the business, established a governance structure, and freed data from silos for better organizational performance and efficiency. Come hear these leading health systems share their best practices and lessons learned in making data-driven care a reality.
- Navinet, Inc and Informatica – Delivering Network Intelligence, The Value to the Payer, Provider and Patient: Today, healthcare payers and providers must share information in unprecedented ways to reduce redundancy, cut costs, coordinate care, and drive positive outcomes. Learn how NaviNet’s vision of a “smart” communications network combines Big Data and network intelligence to share proactive real-time information between insurers and providers.
- Providence Health Services takes a progressive approach to automating ETL development and documentation: A newly organized team of BI generalists, most of whom had no ETL experience and even fewer of whom had Informatica skills, was tasked with Informatica development when Providence migrated from Microsoft SSIS to Informatica. Learn how the team relied on Informatica to alleviate the burden of low-value tasks.
- Using IDE for Data On-boarding Framework at HMS : HMS’s core business is to onboard large amounts of external data that arrive in different formats. HMS developed a framework using IDE to standardize the on-boarding process. This tool can be used by non-IT analysts and provides standard profiling reports and reusable mapping “templates,” which have improved the hand-off to IT and significantly reduced misinterpretations and errors.
Additionally, this year’s attendees are invited to:
- Over 100 breakout sessions: Customers from other industries, including financial services, insurance, retail, manufacturing, oil and gas will share their data driven stories.
- Healthcare networking reception on Wednesday, May 14th: Join your healthcare peers and Informatica’s healthcare team on Wednesday from 6-7:30pm in the Vesper bar of the Cosmopolitan Resort for a private Healthcare networking reception. Come and hear firsthand how others are achieving a competitive advantage by maximizing return on data while enjoying hors d’oeuvres and cocktails.
- Data Driven Healthcare Roundtable Breakfast on Wednesday, May 14th. Customer led roundtable discussion.
- Personal meetings: Since most of the Informatica team will be in attendance, this is a great opportunity to meet face to face with Informatica’s product, services and solution teams.
- Informatica Pavilion and Partner Expo: Interact with the latest offerings from Informatica and our partners.
- An expanded “Hands-on-Lab”: Learn from real-life case studies and talk to experts about your unique environment.
The Healthcare industry is facing extraordinary changes and uncertainty — both from a business and a technology perspective. Join us to learn about key drivers for change and innovative uses of data technology solutions to discover sources for operational and process improvement. There is still time to Register now!
According to Health IT Portal, "Having an integrated health IT infrastructure allows a healthcare organization and its providers to streamline the flow of data from one department to the next. Not all health settings, however, find themselves in this situation. Either through business agreements or vendor selection processes, many a healthcare organization has to spend considerable time and resources getting their disparate health IT systems to talk to each other."
In other words, you can't leverage Health Information Exchanges (HIEs) without a sound data integration strategy. This is something I've ranted about for years. The foundation of any entity-to-entity exchange, whether in health, finance, or elsewhere, is that all relevant systems freely communicate, and are thus able to consume and produce the information that any information exchange requires.
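As a minimal illustration of what "freely communicate" demands, here is a sketch of normalizing records from two disparate systems into one canonical form before anything can be exchanged. The system names, field names, and records are all invented for illustration; real exchanges would target a standard such as HL7/FHIR rather than an ad-hoc dictionary:

```python
# Sketch: map records from two hypothetical source systems into one
# canonical patient record, the precondition for any exchange.

def from_hospital_system(rec):
    """Hospital EHR extract with its own (hypothetical) field names."""
    return {
        "patient_id": rec["PAT_ID"],
        "last_name": rec["LNAME"].title(),
        "birth_date": rec["DOB"],  # assume already ISO 8601
    }

def from_practice_system(rec):
    """Physician-office system with different (hypothetical) names."""
    return {
        "patient_id": rec["id"],
        "last_name": rec["surname"].title(),
        "birth_date": rec["birthDate"],
    }

hospital_rec = {"PAT_ID": "H-100", "LNAME": "SMITH", "DOB": "1954-03-07"}
practice_rec = {"id": "H-100", "surname": "smith", "birthDate": "1954-03-07"}

# Same patient, two source formats, one canonical form.
assert from_hospital_system(hospital_rec) == from_practice_system(practice_rec)
```

The point of the sketch is that the mapping work happens once, at the system boundary; without that shared target schema, every new exchange partner means another one-off integration.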
The article cites the case of Memorial Healthcare, a community health care system in Owosso, MI. Memorial Healthcare runs Meditech on the hospital side and Allscripts in its physician offices. Frank Fear, the CIO of Memorial Healthcare, has spent the last few years working on solutions to enable data integration between the two vendors' offerings, as well as within the Allscripts system itself, which comprises both an EHR and a practice management solution.
Those in the world of healthcare are moving headlong into these exchanges. Most have no clue as to what must change within internal IT to get ahead of the need for the free flow of information. Moreover, there needs to be a good data governance strategy in place, along with security and a focus on compliance issues.
The reality is that, for the most part, data integration in the world of healthcare is largely ad hoc and tactical in nature. This has left no standardized method for systems to talk to one another, and certainly no standard way for data to flow out through exchanges. Think of plumbing that was built haphazardly over the years, with whatever was quick and easy. Now you've finally turned on the water, and there are many, many leaks.
In terms of data integration, healthcare has been underfunded for far too long. Now clear regulatory changes require better information management and security approaches. Unfortunately, healthcare IT is way behind, in terms of leveraging proper data integration approaches, as well as leveraging the right data integration technology.
As things change in the world of healthcare, including the move to HIEs, I suspect that data integration will finally get a hard look from those who manage IT in healthcare organizations. However, they need to do this with some sound planning, which should include an understanding of what the future holds in terms of information management, and how to create a common infrastructure that supports most of the existing and future use cases. Healthcare, you’re about 10 years behind, so let’s get moving this year.
The transition to value-based care is well underway. From healthcare delivery organizations to clinicians, payers, and patients, everyone feels the impact. Each has a role to play. Moving to a value-driven model demands agility from people, processes, and technology. Organizations that succeed in this transformation will be those in which:
- Collaboration is commonplace
- Clinicians and business leaders wear new hats
- Data is recognized as an enterprise asset
The ability to leverage data will differentiate the leaders from the followers. Successful healthcare organizations will:
1) Establish analytics as a core competency
2) Rely on data to deliver best practice care
3) Engage patients and collaborate across the ecosystem to foster strong, actionable relationships
Trustworthy data is required to power the analytics that reveal the right answers, to define best practice guidelines and to identify and understand relationships across the ecosystem. In order to advance, data integration must also be agile. The right answers do not live in a single application. Instead, the right answers are revealed by integrating data from across the entire ecosystem. For example, in order to deliver personalized medicine, you must analyze an integrated view of data from numerous sources. These sources could include multiple EMRs, genomic data, data marts, reference data and billing data.
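The "integrated view" described above can be sketched as a simple join across sources keyed on a shared patient identifier. The source names, fields, and values below are invented for illustration; in practice each source would be a system extract rather than an in-memory dictionary:

```python
# Sketch: build an integrated patient view by joining hypothetical
# extracts from an EMR, a billing system, and a genomics source,
# all keyed on a shared patient identifier.

emr = {"p1": {"a1c": 7.9, "dx": ["E11.9"]}}          # clinical data
billing = {"p1": {"claims_ytd": 4}}                   # billing data
genomics = {"p1": {"variant": "rs7903146"}}           # genomic data

def integrated_view(patient_id, *sources):
    """Merge the fields each source holds for one patient."""
    view = {"patient_id": patient_id}
    for source in sources:
        view.update(source.get(patient_id, {}))
    return view

record = integrated_view("p1", emr, billing, genomics)
assert record["a1c"] == 7.9 and record["claims_ytd"] == 4
```

Note the hidden assumption that makes this trivial sketch work: every source already agrees on the patient identifier. Establishing that shared, trustworthy identifier is exactly the master data and data quality work the paragraph above calls for.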
A recent PwC survey showed that 62% of executives believe data integration will become a competitive advantage. However, a July 2013 InformationWeek survey reported that 40% of healthcare executives gave their organization only a grade of D or F on preparedness to manage the data deluge.
What grade would you give your organization?
You can improve your organization’s grade, but it will require collaboration between business and IT. If you are in IT, you’ll need to collaborate with business users who understand the data. You must empower them with self-service tools for improving data quality and connecting data. If you are a business leader, you need to understand and take an active role with the data.
To take the next step, download our new eBook, “Potential Unlocked: Transforming healthcare by putting information to work.” In it, you’ll learn:
- How to put your information to work
- New ways to govern your data
- What other healthcare organizations are doing
- How to overcome common barriers
So go ahead, download it now and let me know what you think. I look forward to hearing your questions and comments… oh, and your grade!