Tag Archives: Trusted Data
Expert Advice for ACO Success: Gain Consensus on Metrics and Use High Quality Healthcare Data for Better Patient Care
It’s amazing how often Accountable Care Organizations (ACOs) are in the news. This Fierce Healthcare article, Maps of 2012 Medicare Accountable Care Organizations, states that ACOs are “the biggest change to Medicare in decades.” This Forbes article, Obamacare’s Accountable Care Approach Reaches 1 in 10 in US, reveals that “more than 2.4 million Medicare beneficiaries will be receiving care from more than 150 ACOs that have signed up to participate.”
But what do ACOs need to deliver on their promise of giving coordinated high quality care to Medicare patients at the right time, avoiding unnecessary services and preventing medical errors?
Healthcare industry experts agree on two basics required for ACO success. At a minimum, physicians, hospitals and healthcare providers need:
1) consensus on ACO success metrics to measure performance, and
2) access to timely, high quality healthcare data to better manage patient care and track performance.
Without an agreed-upon set of ACO success metrics and better healthcare data management, how can ACOs better manage patients and identify and monitor at-risk patients? Without these basics, how can they improve patient experience, outcomes and healthcare costs?
Bridging the gap and reaching consensus on ACO metrics is one of the greatest challenges for ACOs. The good news is there are experts who can help. Please join us on Wednesday January 23rd for a live Let’s Talk Healthcare Webinar: Bridging the Gap: Reaching Consensus on ACO Metrics. Three healthcare experts:
- Senthil Balasubramanian, Manager of Decision Support at University of Pennsylvania Health System
- Maury DePalo, Director of Edgewater’s Healthcare Practices
- Richard Cramer, Informatica’s Healthcare Strategist
will talk about gaining consensus on ACO metrics, monitoring metrics, making adjustments as needed, and leveraging high quality healthcare data to better manage patient care, healthcare costs and performance.
Lockton is the world’s largest private insurance broker. Its goal is to achieve 95% client retention. The company, which operates in 60 countries, successfully adopted Salesforce to empower 4,450 associates to continually improve cross-selling and up-selling to existing clients.
To succeed, the director of operations at Lockton knew that associates needed to know who their customers and prospects are and which products and services they already have. When he investigated, he found several customer information gaps in Salesforce.
Below are five customer information gaps in Salesforce CRM that can impact sales:
Gap #1: Which customer record can I trust?
Before reaching out to a customer (let’s use fictitious client Mark Niles), the sales rep needs to access Mark’s contact information in Salesforce. Chances are Mark Niles’ customer information is spread across multiple duplicate lead, account and opportunity records with inaccuracies, inconsistencies and incomplete information. For example, four records could exist in Salesforce for one customer: 1) Mark Niles, 2) Marc Niles, 3) M. Niles, 4) Mark. The customer information gap becomes worse when a company has multiple Salesforce orgs. Sales dilemma: Which Salesforce customer record can I trust and update?
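At its core, spotting records like “Mark Niles,” “Marc Niles” and “M. Niles” as the same customer is a fuzzy-matching problem. As a minimal, hypothetical sketch (not a Salesforce or Informatica feature), here is how duplicate candidates could be flagged in Python using the standard library’s difflib; the sample names and the 0.7 threshold are illustrative assumptions:

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0-1 similarity ratio between two normalized names."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

# Illustrative records, echoing the duplicates described above.
records = ["Mark Niles", "Marc Niles", "M. Niles", "Mark", "Jane Doe"]

# Flag every pair of records that looks like a likely duplicate.
THRESHOLD = 0.7  # assumed cutoff; real matching tools tune this per field
duplicates = [
    (a, b)
    for i, a in enumerate(records)
    for b in records[i + 1:]
    if similarity(a, b) >= THRESHOLD
]
print(duplicates)
```

A real master data management tool would match on many fields (address, email, phone) and survivorship rules, not name strings alone, but the same scoring-and-threshold idea underlies it.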
Gap #2: Which products and services does my customer already have?
Before a sales rep can identify which product or service to offer Mark Niles, she needs to know which ones Mark already has and if he has any outstanding issues. Chances are Mark Niles’ product information is stored in enterprise systems such as SAP, Oracle or JD Edwards ERP, customer support systems and maybe cloud applications such as NetSuite and Eloqua that are not integrated with Salesforce. Sales dilemma: Why can’t I access all relevant customer information from my Salesforce customer record?
Gap #3: What is impacting my customer right now?
Sales reps want to be up-to-date before reaching out to customers. They may need to go outside of Salesforce to get information such as credit scores and news announcements that may impact the timing of customer contact and the conversation. Sales dilemma: Why can’t relevant third-party data be included in my Salesforce customer record?
You probably know that you can and must validate the integrity of your Data Model using the Metadata Manager Tool workbench (MET), but you can also do much more. In this post, I focus on three additional things you can do with the MET.
Once you have used MET to obtain a valid Data Model, you can continue using MET to:
Migrate your project from development to test, and on to production, via change lists.
I had the opportunity to review and comment on the draft of a new Hadoop technical guide. It’s great to see the published paper: Technical Guide: Unleashing the Power of Hadoop with Informatica. This guide outlines the following five steps to get started with Hadoop from a data integration perspective.
(1) Select the Right Projects for Hadoop Implementation
Choose projects that fit Hadoop’s strengths and minimize its weaknesses. Enterprises use Hadoop in data-science applications for log analysis, data mining, machine learning and image processing involving unstructured or raw data. Hadoop’s lack of a fixed schema works particularly well for answering ad hoc queries and exploratory “what if” scenarios. The Hadoop Distributed File System (HDFS) and MapReduce address the growth in enterprise data volumes from terabytes to petabytes and beyond, as well as the increasing variety of complex, multi-dimensional data from disparate sources.
Gartner recently released their 2011 Magic Quadrant for Data Quality Tools and I’m happy to announce that Informatica is positioned in the Leaders’ quadrant. We believe our position is a testament to the fact that customers like Station Casinos and U.S. Xpress continue to turn to Informatica to solve their most critical data quality challenges.
The publication of the Magic Quadrant is often a great opportunity to reflect on the state of the data quality market. It should come as no surprise that data quality as a business imperative isn’t going away any time soon. We continue to see customers looking for help and expertise in solving a wide range of data quality problems, largely associated with data governance initiatives, master data management (MDM), business intelligence and application modernization. And data quality’s association with these areas is only getting stronger.
Master data management is a hot topic in Asia Pacific. That was shown by the keen interest in MDM during Gartner’s BI Summit in Sydney, Australia, in late February, and in a recent survey that Gartner conducted of the APAC community.
In that survey, master data management was ranked the #1 data-related technology under consideration for deployment in APAC. Forty-two percent of Gartner’s respondents put MDM at the top of their lists, ahead of dashboards/scorecards, predictive analytics, performance management, and 14 other technologies. Given that the survey found only 20 percent of APAC organizations are using MDM today, there’s clearly a lot of room for growth.