Category Archives: Healthcare
Data integration has always been a core technology in the world of healthcare. The latest trend is to treat data integration as a key strategic advantage within healthcare systems. The objective is to improve care through the use of existing and new information, providing the ability to be more proactive, saving money and saving lives. (more…)
Expert Advice for ACO Success: Gain Consensus on Metrics and Use High Quality Healthcare Data for Better Patient Care
It’s amazing how often Accountable Care Organizations (ACOs) are in the news. This Fierce Healthcare article, Maps of 2012 Medicare Accountable Care Organizations, states that ACOs are “the biggest change to Medicare in decades.” This Forbes article, Obamacare’s Accountable Care Approach Reaches 1 in 10 in US, reveals that “more than 2.4 million Medicare beneficiaries will be receiving care from more than 150 ACOs that have signed up to participate.”
But what do ACOs need to deliver on their promise of giving coordinated high quality care to Medicare patients at the right time, avoiding unnecessary services and preventing medical errors?
Healthcare industry experts agree on two basics required for ACO success. At a minimum, physicians, hospitals and healthcare providers need:
1) consensus on ACO success metrics to measure performance, and
2) access to timely, high quality healthcare data to better manage patient care and track performance.
Without an agreed upon set of ACO success metrics and better healthcare data management, how can ACOs better manage patients and identify and monitor at-risk patients? Without these basics, how can they improve patient experience, outcomes and healthcare costs?
Bridging the gap and reaching consensus on ACO metrics is one of the greatest challenges for ACOs. The good news is that there are experts who can help. Please join us on Wednesday, January 23rd for a live Let’s Talk Healthcare Webinar: Bridging the Gap: Reaching Consensus on ACO Metrics. Three healthcare experts will join us:
- Senthil Balasubramanian, Manager of Decision Support at University of Pennsylvania Health System
- Maury DePalo, Director of Edgewater’s Healthcare Practices
- Richard Cramer, Informatica’s Healthcare Strategist
They will discuss gaining consensus on ACO metrics, monitoring those metrics, making adjustments as needed, and leveraging high quality healthcare data to better manage patient care, healthcare costs and performance.
I had the privilege of being invited to testify to the Health I.T. Policy Committee workgroup on the topic of data quality back in November. I’ve been an advocate for the work of the committee for years and am constantly impressed with the considerable insight and genuine passion they bring to their work. Testifying, however, was my first chance to actually participate in the policy-making process, and it was both a learning experience and an opportunity to share my thoughts on the important topic of data quality. (more…)
Earlier this week I met with security leaders at some of the largest organizations in the San Francisco Bay Area. They highlighted several disturbing trends: in addition to an increased incidence of breaches, they are seeing increases in:
- The number of customers who want to do security audits of their company
- The number of RFPs that require information about data security
- Litigation from data security breaches, including class action lawsuits, which now drives concerns more than regulatory fines do
So much attention has been placed on defending the perimeter that many organizations feel they are in an arms race. Part of the problem is that it’s not clear how effective the firewalls are. While firewalls may be part of the solution, organizations are increasingly looking at how to make their applications bulletproof and how to centralize controls. One of the high-risk areas is systems where people have more access than they need.
For example, many organizations have created copies of production environments for test, development and training purposes. As a result, this data can be completely exposed, and its confidential elements are at risk of being leaked, intentionally or unintentionally. I spoke with a customer a couple of weeks ago who had tried to change the email addresses in their test database, but they missed a few, and during a test run they sent emails to real customers. Their customers called back and asked what was going on. That was when we started talking to them about a masking solution that would permanently mask the data in these environments. In this way they would have realistic data to test with and all sensitive details obliterated.
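To make that concrete, here is a minimal sketch of the kind of persistent (static) masking this customer needed. The SQLite file, the customers table and its email column are illustrative assumptions, not their actual schema; a commercial masking tool would also preserve referential integrity across related systems, which this sketch does not attempt.

```python
# Persistent masking sketch for a test-database copy (assumed SQLite file and
# "customers" table). Hashing makes the fake address deterministic, so the
# same real address always masks to the same value and joins still line up.
import hashlib
import sqlite3

def mask_email(email: str) -> str:
    """Replace a real address with a clearly fake but consistent one."""
    digest = hashlib.sha256(email.lower().encode()).hexdigest()[:12]
    return f"user_{digest}@example.invalid"

conn = sqlite3.connect("test_copy.db")
cur = conn.cursor()
rows = cur.execute("SELECT id, email FROM customers").fetchall()
for row_id, email in rows:
    cur.execute("UPDATE customers SET email = ? WHERE id = ?",
                (mask_email(email), row_id))
conn.commit()
conn.close()
```

Because the masking is applied to the copy itself, no test run can ever email a real customer, no matter which rows the testers touch.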
Another high-risk area involves certain users, for example cloud administrators, who have access to all data in the clear. As a result, these administrators can see account numbers and Social Security numbers that they don’t need in order to do their jobs. Dynamically masking these values would still let them see the data they need to do their jobs, while preventing a breach of the other confidential data.
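As a rough illustration of dynamic masking, the sketch below shows how what a user sees can depend on their role while the stored value never changes. The role names and field list are assumptions for the example, not any particular product’s policy model.

```python
# Dynamic masking sketch: the record is stored in the clear, but what a user
# sees is decided at access time based on their role.
SENSITIVE_FIELDS = {"ssn", "account_number"}
ROLES_WITH_CLEAR_ACCESS = {"fraud_analyst"}  # illustrative role with a business need

def mask_value(value: str) -> str:
    """Show only the last four characters, e.g. ***-**-6789."""
    return "*" * max(len(value) - 4, 0) + value[-4:]

def view_record(record: dict, role: str) -> dict:
    if role in ROLES_WITH_CLEAR_ACCESS:
        return dict(record)
    return {k: mask_value(v) if k in SENSITIVE_FIELDS else v
            for k, v in record.items()}

record = {"name": "Pat Doe", "ssn": "123-45-6789", "account_number": "9876543210"}
print(view_record(record, "cloud_admin"))     # sees masked values
print(view_record(record, "fraud_analyst"))   # sees the originals
```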
Going back to the concerns the security leaders raised, how do you prove to your customers that you have data security, especially if it’s difficult to prove the effectiveness of a firewall? This is where reports on what data was masked, and what it was masked to, come in. Yes, you can pay for cyber insurance to cover your losses when you have a breach. But wouldn’t it be better to prevent the breaches in the first place and show how you’ve done it? Try looking at the problem from the inside out.
The widespread adoption of electronic health records (EHRs) is a key objective of the Health Information Technology for Economic and Clinical Health (HITECH) Act, enacted as part of the American Recovery and Reinvestment Act of 2009. With the pervasive use of EHRs, an enormous volume of clinical data will be readily accessible that has previously been locked away in paper charts. The potential value of this data to yield insights into what works in healthcare, and what doesn’t work, dwarfs the benefits of simply replacing a paper chart with an electronic system. There’s appropriate enthusiasm that this data is going to be a veritable goldmine for enterprise data warehousing, business intelligence, and comparative effectiveness research. However, there are other, equally valuable, uses for this data to enhance clinical decision-making and improve the value of healthcare spending. Simply having instant access to large volumes of data that span thousands or tens of thousands of physicians, hundreds of thousands of patients and millions of encounters offers an unparalleled opportunity to increase the quality and lower the cost of healthcare. (more…)
In a May 2012 survey by the Ponemon Institute, 66 percent of respondents said they are not confident their organization would be able to detect the loss or theft of sensitive personal information contained in systems operated by third parties, including cloud providers. In addition, the majority are not confident that their organization would be able to detect the loss or theft of sensitive personal information in their company’s production environment.
Which aspect of data security for your cloud solution is most important?
1. Is it to protect the data in copies of production/cloud applications used for test or training purposes? For example, do you need to secure data in your Salesforce.com Sandbox?
2. Is it to protect the data so that a user will see data based on her/his role, privileges, location and data privacy rules?
3. Is it to protect the data before it gets to the cloud?
Compliance continues to drive people to action, and compliance with contractual agreements, especially for cloud infrastructure, continues to drive investment. In addition, many organizations are supporting Salesforce.com as well as packaged solutions such as Oracle E-Business Suite, PeopleSoft, SAP, and Siebel.
Of the available data protection solutions, tokenization is well established and well known for supporting PCI data while preserving the format and width of a table column. But because many tokenization solutions today require creating database views or changing application source code, it has been difficult for organizations to use them with packaged applications that don’t allow such changes. In addition, databases and applications take a measurable performance hit to process tokens.
What might work better is to tokenize data dynamically before it gets to the cloud: a transparent layer between the cloud and on-premises data integration would replace the sensitive data with tokens, so no additional application code would be required.
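A minimal sketch of the idea follows, assuming a simple on-premises token vault and a hypothetical SSN field. The real value stays behind the firewall and only a format-preserving token travels to the cloud application; commercial gateways do this transparently in the integration layer rather than in application code.

```python
# Tokenization-before-the-cloud sketch: the vault keeps the real value
# on-premises and hands out a token with the same format and width.
import secrets

class TokenVault:
    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        if value in self._value_to_token:
            return self._value_to_token[value]
        # Preserve format and width: digits stay digits, punctuation stays put.
        token = "".join(secrets.choice("0123456789") if c.isdigit() else c
                        for c in value)
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._token_to_value[token]

vault = TokenVault()
outbound = {"name": "Pat Doe", "ssn": vault.tokenize("123-45-6789")}
# `outbound` is what the cloud application stores; the real SSN never leaves.
```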
In the Ponemon survey, most respondents said the best control is to dynamically mask sensitive information based on the user’s privilege level. Encrypting all sensitive information contained in the record was ranked as the next best option.
The strange thing is that people recognize there is a problem but are not spending accordingly. In the same survey from Ponemon, 69% of organizations find it difficult to restrict user access to sensitive information in IT and business environments. However, only 33% say they have adequate budgets to invest in the necessary solutions to reduce the insider threat.
Is this an opportunity for you?
Hear Larry Ponemon discuss the survey results in more detail during a CSOonline.com/Computerworld webinar, Data Privacy Challenges and Solutions: Research Findings with Ponemon Institute, on Wednesday, June 13.
By John Wollman, Executive Vice President, HighPoint Solutions, www.highpoint-solutions.com
Over the next two years, leading up to the ICD-10 “go live” date of October 1, 2014, there will be many procurement cycles for ICD-10 mapping and crosswalk tools. At present, we are seeing an evolution in the philosophy relating to the management of codes and mappings that will influence buyer decisions on the types of tools to incorporate into an ICD-10 program.
In considering mapping and crosswalk tools, leading companies are viewing the problem from an ongoing operational perspective, not simply through a transition/conversion lens. The notion is that the complexity of ICD-10 is not limited to a one-time transition or conversion; rather, the complexity will persist well after October 1, 2014. The ongoing operational requirements call more for Master Data Management (MDM) approaches to managing codesets, mappings and the enterprise artifacts built from those codes. (more…)
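One way to picture the difference is to treat the crosswalk itself as mastered, effective-dated data rather than a one-time conversion script. The sketch below assumes a simple in-memory structure; the 250.00 to E11.9 row (type 2 diabetes without complications) is an illustrative GEM-style mapping.

```python
# Crosswalk-as-master-data sketch: each mapping is versioned and
# effective-dated, so it can be corrected and re-released after go-live.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass(frozen=True)
class CodeMapping:
    icd9: str
    icd10: str
    effective: date
    approximate: bool   # ICD-9 and ICD-10 codes rarely line up one-to-one

crosswalk = [
    CodeMapping("250.00", "E11.9", date(2014, 10, 1), approximate=True),
]

def map_code(icd9: str, as_of: date) -> Optional[CodeMapping]:
    """Return the mapping in force on a given date, if one has been mastered."""
    candidates = [m for m in crosswalk if m.icd9 == icd9 and m.effective <= as_of]
    return max(candidates, key=lambda m: m.effective) if candidates else None

print(map_code("250.00", date(2015, 1, 1)))
```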
Having spoken with a number of healthcare providers at recent events, including the Gartner MDM Summit, Informatica’s own MDM workshops and a variety of other meetings, I’ve noticed that their use of MDM has been evolving of late. It’s clear that within the provider community, MDM is quickly maturing beyond a single virtual view of the patient using an Enterprise Master Patient Index (EMPI). (more…)
I’ve been advocating for years that replacing the paper chart with an electronic system is not the value of the EHR, but rather collecting data that can be used to understand and improve care. So I was very pleased to see Dr. John Showalter’s blog address this very issue – making a compelling case with real-world examples where wisdom derived from data has made demonstrable improvements in healthcare quality and corresponding reductions in cost. (more…)