Tag Archives: B2B

Informatica Rev: Data Democracy At Last – Part 2

This is a continuation of Part 1 of this blog, which you can read here.

Now, if you are in IT, reading about how Informatica Rev enables the everyday business users in your company to participate in the Data Democracy might feel like treachery. You are likely thinking that Informatica is letting a bull loose in your own fine china shop. You may feel, first, that Informatica is supporting a systematic bypass of all the data governance that IT has worked hard to put in place and, second, that Informatica is alienating the very IT people who have approved of and invested in Informatica for decades.

While I can understand this thought process, I am here to, proudly, inform you that your thoughts could not be further from the truth! In fact, in the not-too-distant future, Informatica will be in a strong position to deliver a unique technology solution that helps you better govern all the data in your enterprise, and to do it in a way that lets you proactively deliver the right data to the business, yes, before the masses of everyday business users have started knocking your door down to ask for it. Informatica’s solution will turn the IT-and-business divide that has existed in your company for decades into a match made in heaven, and you in IT get the credit for leading this transformation of your company to a Data Democracy. Listen to this webinar to hear Justin Glatz, Executive Director of Information Technology at Conde Nast, speak about how he will be leading Conde Nast’s transformation to Data Democracy.

Data Democracy At Last

“How?” you might ask. Well, first, let’s face it: today you do not have any visibility into how the business is procuring and using most data, and therefore you are not governing most of it. Without a change in your tooling, your ability to gain this visibility is diminishing greatly, especially since the business does not have to come to you to procure and use its cloud-based applications. By having all of your everyday business users use Informatica Rev, you will, for the first time, have the potential to gain a truly complete picture of how data is being used in your company, even the data they do not come to you to procure.

In the not-too-distant future, you will gain this visibility through an IT companion application to Informatica Rev. You will then be able to easily operationalize your business users’ exact transformation logic, or Recipe as we call it in Informatica Rev, into your existing repositories, be they your enterprise data warehouse, data mart or master data management repository, for example. And, by the way, you are likely already using Informatica PowerCenter, Informatica Cloud or Informatica MDM to manage these repositories, so you already have the infrastructure we will be integrating Informatica Rev with. And if you are not using Informatica to manage these repositories, the draw of becoming proactive with your business and leading your company’s transformation to a Data Democracy should be enough to make you want to go get Informatica.
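
To make the Recipe idea concrete: Informatica has not published Rev’s actual Recipe format, so the following Python sketch is purely conceptual, a recipe imagined as an ordered list of transformation steps that IT could replay against a governed repository. All names and step types here are hypothetical.

```python
# Conceptual sketch only: a "recipe" as ordered transformation steps that
# can be re-run verbatim against a governed repository. Not Rev's format.
recipe = [
    ("rename", {"cust_nm": "customer_name"}),
    ("filter", lambda row: row["country"] == "US"),
    ("derive", ("full_price", lambda row: row["qty"] * row["unit_price"])),
]

def apply_recipe(rows, steps):
    """Replay each recorded step, in order, over a list of row dicts."""
    for op, arg in steps:
        if op == "rename":
            rows = [{arg.get(k, k): v for k, v in r.items()} for r in rows]
        elif op == "filter":
            rows = [r for r in rows if arg(r)]
        elif op == "derive":
            name, fn = arg
            rows = [{**r, name: fn(r)} for r in rows]
    return rows

rows = [{"cust_nm": "Acme", "country": "US", "qty": 3, "unit_price": 9.5}]
print(apply_recipe(rows, recipe))
```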

Just as these professionals have found success by participating in the Data Democracy, with Informatica Rev you finally can, too. You can try Informatica Rev for free by clicking here.

Posted in B2B, B2B Data Exchange, Data First, Data Governance

Thoughts on the CMS Medicare Quality “Five-Star” Rating System

CMS Medicare Quality “Five-Star” Rating System

This past October, the Centers for Medicare & Medicaid Services (CMS) announced two Medicare Quality Improvement Initiatives. I spent part of Thanksgiving weekend analyzing the most recent Star Quality bonus scores and trying to figure out where this program is going and what value we will get from it as an industry. Much of the work I am doing these days is focused on data, and I look at it through the prism of health plan operations, where I spent a number of years.

CMS points out the overall improvement in quality, which they position as the result of focusing on, and incenting, quality. I agree that putting funding behind a quality program was a valuable strategy for motivating the industry. This has not always been the case; in fact, a former colleague related a common dialog from before this program:

  1. He would present a quality initiative to executive management
  2. They would nod politely and say, “Yes, of course we are interested in quality!”
  3. The conversation would continue until the cost of the program was disclosed.

The faces would change, and the response was, “Well, yes, quality is important, but funding is tight right now. We need to focus on programs with a clear ROI.”

Thankfully, the Star program has given quality initiatives a clear ROI – for which we are all grateful!

The other positive dynamic is that Medicare Advantage has provided a testing ground for new programs, largely as a result of the ACA. Programs very similar to the Star program are part of the ACO program and the marketplace membership, and Risk Adjustment is being fitted to these programs as well. Private insurance will likely borrow similar structures to ensure quality and fair compensation in its various risk-sharing arrangements. MA is a significant subset of the population and is providing an excellent sandbox for these initiatives while improving the quality of care that our senior population receives.

My concerns are around the cultures and mission of those plans that are struggling to reach the magic four-star level at which they will receive the bonus dollars.

Working in a health plan for almost nine years, and continuing to interact with my current customers, has shown me the dedication of the staff who work in these plans. One of my most rewarding experiences was leading the call center for the Medicare population; I was humbled each day by the caring and patience the reps on the phones showed to the senior population. I have also seen the dedication of clinical staff to ensuring that care for members is carefully coordinated and that their dignity and wishes are always respected. I sincerely hope that plans with a clear mission find the right tools and support to improve their ratings to the point where they receive the additional funding to maintain their viability and continue to serve their members and the medical community. I am sure there are poor-quality plans out there, and I agree that they should be eliminated. But I am also rooting for the plans with a mission that are striving to be a bit better.

Posted in B2B, General, Healthcare, Life Sciences

Securing Sensitive Data in Test and Development Environments

Securing Test and Development Environments

Do you use copies of production data in test and development environments? This is common practice in IT organizations. For this reason, test environments have become the number one target for outside intruders. That being said, most data breaches occur when non-malicious insiders accidentally expose sensitive data in the course of their daily work. Insider data breaches can be more serious and harder to detect than intruder events.

If you use production data in test and development environments, or are looking for alternative approaches, register for the first webinar in a three-part series on data security gaps and remediation. On December 9, Adrian Lane, Security Analyst at Securosis, will join me to discuss security for test environments.

This webinar will focus on how data-centric security can be used to shore up vulnerabilities in one of the key focus areas: test and development environments. It is common practice for non-production database environments to be created by making copies of production data, which potentially exposes sensitive and confidential production data to developers, testers, and contractors alike. Commonly, 6-10 copies of production databases are created for each application environment, and they are regularly provisioned to support development, testing and training efforts. Since the security controls deployed for the source database are not replicated in the test environments, this is a glaring hole in data security and a target for external or internal exploits.

In this webinar, we will cover:

  • Key trends in enterprise data security
  • Vulnerabilities in non-production application environments (test and development)
  • Alternatives to consider when protecting test and development environments
  • Priorities for enterprises in reducing attack surface for their organization
  • Compliance and internal audit cost reduction
  • Data masking and synthetic data use cases (see the sketch after this list)
  • Informatica Secure Testing capabilities
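
As a conceptual taste of the data masking topic above, here is a minimal Python sketch of deterministic masking. The key, field names and approach are illustrative assumptions on my part, not how Informatica Secure Testing actually works:

```python
import hashlib
import hmac

# Hypothetical key for illustration only; a real deployment would pull this
# from a key vault rather than hard-coding it.
MASKING_KEY = b"replace-with-a-managed-secret"

def mask_value(value: str, keep_chars: int = 0) -> str:
    """Deterministic pseudonymization: the same input always produces the
    same mask, so joins across test tables still line up."""
    digest = hmac.new(MASKING_KEY, value.encode("utf-8"), hashlib.sha256)
    masked = digest.hexdigest()[:12]
    # Optionally keep a trailing fragment (e.g., last 4 digits) so testers
    # can recognize records without seeing the full real value.
    return (masked + value[-keep_chars:]) if keep_chars else masked

# Masking a production row before it is copied into a test database.
row = {"name": "Jane Smith", "ssn": "123-45-6789", "email": "jane@example.com"}
masked_row = {
    "name": mask_value(row["name"]),
    "ssn": mask_value(row["ssn"], keep_chars=4),
    "email": mask_value(row["email"]) + "@test.invalid",
}
print(masked_row)
```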

Register for the webinar today at http://infa.media/1pohKov. If you cannot attend the live event, be sure to watch the webinar on-demand.

Posted in Application ILM, Business Impact / Benefits, Data Security, Data Services

Data Integration Webinar Follow-Up: By Our First Strange and Fatal Interview

How to Maximize Value of Data Management Investments

This is a guest author post by Philip Howard, Research Director, Bloor Research.

I recently posted a blog about an interview-style webcast I was doing with Informatica on the uses and costs associated with data integration tools.

I’m not sure that the poet John Donne was right when he said that it was strange, let alone fatal. Somewhat surprisingly, I have had a significant amount of feedback following this webinar. I say “surprisingly” because the truth is that I very rarely get direct feedback. Most of it, I assume, goes to the vendor. So, when a number of people commented to me that the research we conducted was both unique and valuable, it was a bit of a thrill. (Yes, I know, I’m easily pleased).

There were a number of questions that arose as a result of our discussions. Probably the most interesting was whether moving data into Hadoop (or some other NoSQL database) should be treated as a separate use case. We certainly didn’t include it as such in our original research. In hindsight, I’m not sure that the answer I gave at the time was fully correct. I acknowledged that you certainly need some different functionality to integrate with a Hadoop environment, that some vendors have more comprehensive capabilities than others when it comes to Hadoop, and that the same also applies (but with different suppliers) when it comes to integrating with, say, MongoDB or Cassandra or graph databases. However, as I pointed out in my previous blog, functionality is ephemeral, and just because a particular capability isn’t supported today doesn’t mean it won’t be supported tomorrow. So that doesn’t really affect use cases.

However, where my reply was inadequate was that I only referenced Hadoop as a platform for data warehousing, stating that moving data into Hadoop was not essentially different from moving it into Oracle Exadata or Teradata or HP Vertica. And that’s true. What I forgot was the use of Hadoop as an archiving platform. As it happens, we didn’t have an archiving use case in our survey either. Why not? Because archiving is essentially a form of data migration – there are information lifecycle management, access and security issues that are relevant to archiving once it is in place, but those come after the fact: the process of discovering and moving the data is exactly the same as with data migration. So: my bad.

Aside from that little caveat, I quite enjoyed the whole event. Somebody or other (there’s always one!) didn’t quite get how quantifying the number of end points in a data integration scenario was a surrogate measure for complexity (something we took into account), so I had to explain that. Of course, it’s not perfect as a metric, but the only alternative is asking eye-of-the-beholder type questions, which aren’t very satisfactory.

Anyway, if you want to listen to the whole thing you can find it HERE.

Posted in B2B, B2B Data Exchange, Data Integration Platform, Data Quality

Take Action – Reduce Your Risk of Identity Theft This Holiday Season

Reduce Your Risk of Identity Theft This Holiday Season

What is our personal information worth? 

With the 2014 holiday season rolling into full swing, Americans will spend more than $600 billion, a 4.1% increase from last year. According to a Credit Union National Association poll, 45% of credit and debit card users will think twice about how they shop and pay, given the tens of millions of shoppers impacted by breaches. Stealing identities is a lucrative pastime for those with ulterior motives: the black market pays between $10 and $12 per stolen record, and when a record is enriched with health data its value is as high as $50 because it can be used for insurance fraud.

Are the thieves getting smarter or are we getting sloppy?  

With ubiquitous global access to technology, general acceptance of online shopping, and the digitization of health records, there is more data online, and more opportunity to steal it, than ever before. Unfortunately for shoppers, 2013 was known as ‘the year of the retailer breach,’ according to Verizon’s 2014 data breach report. Unfortunately for patients, healthcare providers accounted for the highest percentage of lost protected healthcare data.

So what can we do to be a smarter and safer consumer?

No one wants to bankroll the thieves’ illegal habits. One way to avoid it would be to regress 20 years: drive to the mall and make our purchases cash in hand, or go back to completely paper-based healthcare. Alternatively, here are a few suggestions to avoid being on the next list of victims:

1. Avoid irresponsible vendors and providers by being an educated consumer

Sites like the Identity Theft Resource Center and the US Department of Health and Human Services expose the latest breaches in retail and healthcare, respectively. Look up who you are buying from and receiving care from, and make sure they are doing everything they can to protect your data. If they didn’t respond to a breach in a timely fashion, tried to hide it, or didn’t implement new controls to protect your data, avoid them. Or take your chances.

2. Expect to be hacked, plan for it

Most organizations you trust with your personal information have already experienced a breach. In fact, according to a recent Ponemon Group survey sponsored by Informatica, 72% of organizations polled experienced a breach within the past 12 months, and more than 20% had two or more breaches in the same timeframe. When setting passwords, avoid words or phrases that you publicly share on Facebook. When answering security questions, most security professionals suggest that you lie!
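
In that spirit, here is a tiny sketch of the “lie on security questions” advice: treat the answer as a second random password, generate it, and store it in a password manager. The length parameter is arbitrary:

```python
import secrets

# A security answer nobody can look up on your Facebook profile: a random
# string stored in a password manager, not a real fact about you.
def random_security_answer(n_bytes: int = 12) -> str:
    return secrets.token_urlsafe(n_bytes)

print(random_security_answer())  # different every run
```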

3. If it really bothers you, be vocal and engage

Many states are enacting legislation to hold organizations accountable for notifying individuals when a breach occurs. For example, Florida enacted FIPA, the Florida Information Protection Act, on July 1, 2014; it stipulates that all breaches, large or small, are subject to notification. For every day that a breach goes undocumented, FIPA imposes a $1,000 penalty, up to an annual limit of $500,000.
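
To make that arithmetic concrete, here is a tiny sketch of the figures quoted above (an illustration of the quoted numbers only, not legal advice):

```python
# FIPA figures as quoted above: $1,000 per undocumented day,
# capped at $500,000 per year.
DAILY_PENALTY = 1_000
ANNUAL_CAP = 500_000

def fipa_penalty(days_undocumented: int) -> int:
    return min(days_undocumented * DAILY_PENALTY, ANNUAL_CAP)

print(fipa_penalty(30))   # 30000  -- a month of silence
print(fipa_penalty(600))  # 500000 -- the cap is reached after 500 days
```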

In conclusion, as the holiday shopping season approaches, now is the perfect time for you to ensure that you’re making the best – and most informed – purchasing decisions. You have the ability to take matters into your own hands; keep your data secure this year and every year.

To learn more about Informatica Data Security products, visit our Data Privacy solutions website.

Posted in Data masking, Data Privacy, Healthcare

When was Your Last B2B Integration Health Check?

If you haven’t updated your B2B integration capabilities in the past five years, are you at risk of being left behind? This is the age of superior customer experience and rapid time-to-value, so speedy customer on-boarding and support for specialized integration services mean the difference between winning and losing business. A health check starts with asking some simple questions: (more…)

Posted in B2B, B2B Data Exchange

Manual Processes Got You Down? 4 Steps to Small Partner Enablement

In recent Aberdeen research, 95% of respondents (out of 122 responses) said they relied on some level of manual processing to integrate external data sources. Manual processing to integrate external data is time-consuming, expensive and error-prone, so why do so many do it? Well, they often have little choice. If you look deeper, most of the time these data exchanges are with small partners, and small partner enablement is a significant challenge for most organizations. For the most part, (more…)

Posted in B2B Data Exchange

Avnet Story: How to Integrate 100% of B2B Data Sources

The other week we had Ayman Taha, Director of IT for Enterprise Solutions Integration at Avnet, in to talk about how he automated the processing of unstructured, non-traditional B2B exchanges with external partners. Avnet is a large distributor of electronic components with a diverse ecosystem of customers and suppliers. Their B2B infrastructure reflects the complexity of their environment, but despite a sophisticated and mature EDI infrastructure, they were still manually re-keying invoices and product updates from hundreds of spreadsheets and PDFs received from partners. This was because many of their smaller customers and partners could not send or receive EDI messages. (more…)
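
To illustrate the kind of automation involved, here is a minimal, hypothetical Python sketch that normalizes a small partner’s CSV-exported invoice spreadsheet into structured records. The column names and layout are invented for illustration; Avnet’s actual formats and the Informatica tooling they used are not shown:

```python
import csv

# Hypothetical column layout for a partner invoice spreadsheet exported to
# CSV; real partner files vary wildly, which is what the automation absorbs.
def load_partner_invoices(path: str) -> list[dict]:
    invoices = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            invoices.append({
                "invoice_no": row["Invoice #"].strip(),
                "part_no": row["Part Number"].strip().upper(),
                "quantity": int(row["Qty"]),
                "unit_price": float(row["Unit Price"].lstrip("$")),
            })
    return invoices

# Each normalized record can then flow into the existing EDI/B2B pipeline
# instead of being re-keyed by hand.
```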

Posted in B2B Data Exchange

The Importance of User Experience to Cloud Integration Adoption

The Informatica Winter 2013 announcement included the following customer quote:

“The Winter 2013 release will accelerate the time it takes to access, integrate and deliver valuable data in order to meet our business imperatives.”

It was also noted that "the new Informatica Cloud user interface will make the cloud integration solution even more user friendly." There are a number of user experience enhancements in this upgrade, so I sat down with Joshua Vaughn, Principal User Experience Designer for Informatica Cloud, to learn more about the impetus behind the new design and features, what’s on the horizon for future releases, and why user interface (UI) design is so important for cloud applications.

(more…)

Posted in Cloud Computing, Data Synchronization

Hierarchical Data – More Than Just XML

In a recent Aberdeen Group Analyst Insight paper, 50% of survey respondents reported that they were currently integrating hierarchical data sources, with another 13% planning to implement this capability in the next 12 months. The changing trend is that, of the organisations currently integrating XML data, nearly a third are integrating or planning to integrate other hierarchical sources, with JSON leading the way and COBOL records and Google Protocol Buffers close behind. Apache Avro has seen little integration so far but shows the biggest growth in planned integration and in number of projects. (more…)
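
To see why “hierarchical” is more than a file-format detail, here is a small Python sketch (my own illustration, with invented field names) that flattens a nested JSON order into relational-style rows for a tabular target; the same nested shape could just as easily arrive as XML, Avro, a Protocol Buffers message or a parsed COBOL record:

```python
import json

# A hierarchical order record as it might arrive in JSON.
order = json.loads("""
{
  "order_id": "A-1001",
  "customer": {"id": "C-77", "name": "Acme Ltd"},
  "lines": [
    {"sku": "X-1", "qty": 2},
    {"sku": "X-9", "qty": 5}
  ]
}
""")

# Flatten the nested line items into one row per line, repeating the
# parent-level keys, which is what a tabular warehouse target expects.
rows = [
    {
        "order_id": order["order_id"],
        "customer_id": order["customer"]["id"],
        "sku": line["sku"],
        "qty": line["qty"],
    }
    for line in order["lines"]
]
print(rows)
```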

Posted in B2B, Big Data, Data Transformation, Uncategorized