Tag Archives: security

Informatica Responsible Disclosure Policy

Notifying Informatica of Security Issues

Our team of security experts strives to quickly address security issues involving our products and services. For guidance, please see below:

Problem or issue | Informatica contact or resource
How do I report a security problem in an Informatica application, online service, or website? | Email security@informatica.com
How do I provide feedback on an Informatica product or service? | Go to Informatica Support or email support@informatica.com
How do I report an email, website or pop-up window that falsely claims to represent Informatica? | Contact security@informatica.com
Does Informatica have a Bug Bounty Program? | No, but we do participate in a Responsible Disclosure Program.

Informatica and the Shellshock Security Vulnerability

The security of information systems is a complex, shared responsibility between infrastructure, system and application providers. Informatica doesn’t take lightly the responsibility our customers have entrusted to us in this complex risk equation.

As Informatica’s Chief Information Security Officer, I’d like to share three important security updates with our customers:

  1. What you need to know about Informatica products and services relative to the latest industry-wide security concern,
  2. What you need to do to secure Informatica products against the ShellShock vulnerability, and
  3. How to contact Informatica if you have questions about Informatica product security.

1 – What you need to know

On September 24, 2014, a serious new cluster of vulnerabilities affecting Linux/Unix distributions was announced (CVE-2014-6271, CVE-2014-7169, CVE-2014-7186, CVE-2014-7187, CVE-2014-6277 and CVE-2014-6278), also known as "Shellshock" or "Bashdoor". What makes Shellshock so impactful is that it requires relatively little effort or expertise to exploit and gain privileged access to vulnerable systems.

Informatica’s cloud-hosted products, including Informatica Cloud Services (ICS) and our recently-launched Springbok beta, have already been patched to address this issue. We continue to monitor for relevant updates to both vulnerabilities and available patches.

Because this vulnerability is a function of the underlying Operating System, we encourage administrators of potentially vulnerable systems to assess their risk levels and apply patches and/or other appropriate countermeasures.
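To help with that assessment, the widely circulated test for the original CVE-2014-6271 issue can be scripted. The sketch below is illustrative only; it assumes a Unix-like host with bash on the PATH and is not an Informatica-supplied tool.

```python
import os
import subprocess

# Minimal sketch of the well-known CVE-2014-6271 ("Shellshock") check.
# A vulnerable bash parses the exported "function" in the environment and
# executes the trailing command, printing "vulnerable" as it starts up.
payload_env = {**os.environ, "x": "() { :;}; echo vulnerable"}

result = subprocess.run(
    ["bash", "-c", "echo this is a test"],
    env=payload_env,
    capture_output=True,
    text=True,
)

if "vulnerable" in result.stdout:
    print("bash appears vulnerable to CVE-2014-6271; apply your OS vendor's patch")
else:
    print("bash does not appear vulnerable to CVE-2014-6271")
```

A patched bash prints only "this is a test"; note that this check covers the original CVE, not every later variant in the cluster.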

Informatica’s Information Security team coordinated an internal response with product developers to assess the vulnerability and make recommendations necessary for our on-premise products. Specific products and actions are listed below.

2 – What you need to do

Informatica products themselves are not directly impacted and require no patches to address the Shellshock vulnerability. However, Informatica strongly recommends that you apply your OS vendors' patches as they become available, since some applications allow customers to use shell scripts in their pre- and post-processing steps. Specific Informatica products and remediations are listed below:

Cloud Service | Version | Patch / Remediation
Springbok | Beta | No action necessary. The Springbok infrastructure has been patched by Informatica Cloud Operations.
ActiveVOS/Cloud | All | No action necessary. The ActiveVOS/Cloud infrastructure has been patched by Informatica Cloud Operations.
Cloud/ICS | All | Customers should apply OS patches to all of their machines running a Cloud agent. Relevant Cloud/ICS hosted infrastructure has already been patched by Informatica Cloud Operations.

Product | Version | Patch / Remediation
PowerCenter | All | No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
IDQ | All | No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
MM, BG, IDE | All | No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
PC Express | All | No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
Data Services / Mercury stack | All | No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
PWX mainframe & CDC | All | No direct impact. Recommend customers apply OS patches to all machines with INFA products installed.
UM, VDS | All | No direct impact. Recommend customers apply OS patches to all machines with INFA products installed.
IDR, IFC | All | No direct impact. Recommend customers apply OS patches to all machines with INFA products installed.
B2B DT, UDT, hparser, Atlantic | All | No direct impact. Recommend customers apply OS patches to all machines with INFA products installed.
Data Archive | All | No direct impact. Recommend customers apply OS patches to all machines with INFA products installed.
Dynamic Data Masking | All | No direct impact. Recommend customers apply OS patches to all machines with INFA products installed.
IDV | All | No direct impact. Recommend customers apply OS patches to all machines with INFA products installed.
SAP Nearline | | No direct impact. Recommend customers apply OS patches to all machines with INFA products installed.
TDM | | No direct impact. Recommend customers apply OS patches to all machines with INFA products installed.
MDM | All | No direct impact. Recommend customers apply OS patches to all machines with INFA products installed.
IR / name3 | | No direct impact. Recommend customers apply OS patches to all machines with INFA products installed.
B2B DX / DIH | All | DX & DIH customers on Red Hat should apply OS patches. Customers on other operating systems are still recommended to apply OS patches.
PIM | All | PIM core and Procurement are not directly impacted. Recommend Media Manager customers apply OS patches to all machines with INFA products installed.
ActiveVOS | All | No direct impact for the on-premise ActiveVOS product. Cloud-realtime has already been patched.
Address Doctor | All | No direct impact for AD services, which run on Windows. The Procurement service has already been patched by Informatica Cloud Operations.
StrikeIron | All | No direct impact.

3 – How to contact Informatica about security

Informatica takes the security of our customers' data very seriously. Please consult the Informatica Knowledge Base (article ID 301574) or contact our Global Customer Support team if you have any questions or concerns. The Informatica support portal is always available at http://mysupport.informatica.com.

If you are a security researcher and have identified a potential vulnerability in an Informatica product or service, please follow our Responsible Disclosure Program.

Thank you,

Bill Burns, VP & Chief Information Security Officer


Master Data and Data Security …It’s Not Complicated

The statement on master data and data security was well intended. I can certainly understand the angst around data security; especially after Target's data breach, it is top of mind for all IT and, now, business executives. But the root of the statement was flawed. And it got me thinking about master data and data security.

“If I use master data technology to create a 360-degree view of my client and I have a data breach, then someone could steal all the information about my client.”

Um, wait, what? Insurance companies take personally identifiable information very seriously. The flaw in the statement lies in the relationship it assumes between client master data and securing your client data. Let's dissect the statement and see what master data and data security really mean for insurers. We'll start by level-setting a few concepts.

What is your Master Client Record?

Your master client record is your 360-degree view of your client.  It represents everything about your client.  It uses Master Data Management technology to virtually integrate and syndicate all of that data into a single view.  It leverages identifiers to ensure integrity in the view of the client record.  And finally it makes an effort through identifiers to correlate client records for a network effect.
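To make the concept concrete, here is a minimal, hypothetical sketch (not Informatica MDM code; the record fields and identifiers are invented) of how records from several systems, already correlated by a shared client identifier, might be consolidated into a single view:

```python
from collections import defaultdict

# Hypothetical source records for the same client, already correlated by client_id.
source_records = [
    {"client_id": "C-1001", "source": "policy", "name": "Jane Doe", "phone": None},
    {"client_id": "C-1001", "source": "claims", "name": "J. Doe", "phone": "555-0100"},
    {"client_id": "C-1001", "source": "crm", "name": "Jane Doe", "email": "jane@example.com"},
]

def build_golden_records(records):
    """Merge records that share a client identifier into one consolidated view."""
    golden = defaultdict(dict)
    for rec in records:
        for field, value in rec.items():
            if field in ("client_id", "source") or value is None:
                continue
            # Naive survivorship rule for illustration: first non-null value wins.
            golden[rec["client_id"]].setdefault(field, value)
    return dict(golden)

print(build_golden_records(source_records))
# {'C-1001': {'name': 'Jane Doe', 'phone': '555-0100', 'email': 'jane@example.com'}}
```

Real MDM implementations apply far richer matching and survivorship rules, but the principle of correlating records through identifiers is the same.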

There are benefits to understanding everything about your client.  The shape and view of each client is specific to your business.  As an insurer looks at its policyholders, the view of "client" is based on the relationships and context that the client has with the insurer.  These are policies, claims, family relationships, history of activities and relationships with agency channels.

And what about security?

Naturally, there is private data in a client record.  But there is nothing about the consolidated client record that contains any more or less personally identifiable information than the source systems already do.  In fact, most of the data a malicious party would be searching for can likely be found in just a handful of database locations: policy numbers, credit card numbers, social security numbers and birth dates can typically be found in fewer than five database tables, and they can be found without a whole lot of intelligence or analysis.  Additionally, breaches often happen "on the wire".

That data should be secured.  That means the data should be encrypted or masked so that it stays protected even if a breach occurs.  Informatica's data masking technology allows this data to be secured wherever it resides.  It provides access control so that only the right people and applications can see the data in an unsecured format.  You could even go so far as to secure ALL of your client record data fields; that's a business and application choice.  Do not confuse field- or database-level security with a decision to NOT assemble your golden policyholder record.
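As a simple illustration of the idea (a generic sketch, not Informatica's data masking products), field-level masking can be as straightforward as replacing all but the last few characters of a sensitive value before it is exposed to an application or user:

```python
import re

def mask_ssn(ssn: str) -> str:
    """Mask all but the last four digits of a US SSN, preserving the format."""
    digits = re.sub(r"\D", "", ssn)
    return "***-**-" + digits[-4:]

def mask_card(card_number: str) -> str:
    """Mask a payment card number, keeping only the last four digits."""
    digits = re.sub(r"\D", "", card_number)
    return "*" * (len(digits) - 4) + digits[-4:]

record = {"name": "Jane Doe", "ssn": "123-45-6789", "card": "4111 1111 1111 1111"}
masked = {**record, "ssn": mask_ssn(record["ssn"]), "card": mask_card(record["card"])}
print(masked)
# {'name': 'Jane Doe', 'ssn': '***-**-6789', 'card': '************1111'}
```

Production masking tools add policy-driven access control on top, so the same field can appear masked or clear depending on who is asking.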

What to worry about?  And what not to worry about?

Do not succumb to fear of mastering your policyholder data.  Master Data Management technology can provide a 360-degree view, but it is only meaningful within your enterprise and applications.  The view of "client" is very contextual and coupled with your business practices, products and workflows.  Even if someone breaches your defenses and grabs data, they're looking for the simple PII and financial data; they grab it and get out.  If the attackers could see your 360-degree view of a client, they wouldn't understand it.  So don't overcomplicate the security of your golden policyholder record.  As long as you have secured the necessary data elements, you're good to go.  The business opportunity cost of NOT mastering your policyholder data far outweighs any imagined risk of a PII breach.

So what does your Master Policyholder Data allow you to do?

Imagine knowing more about your policyholders.  Let that soak in for a bit.  It feels good to think that you can make it happen.  And you can do it.  For an insurer, Master Data Management provides powerful opportunities across everything from sales, marketing and product development to claims and agency engagement.  Each channel and activity has discrete ROI.  It also has a direct impact on revenue, policyholder satisfaction and market share.  Let's look at just a few very real examples that insurers are attempting to tackle today.

  1. For a policyholder of a certain demographic with an auto and home policy, what is the next product my agent should discuss?
  2. How many people live in a certain policyholder’s household?  Are there any upcoming teenage drivers?
  3. Does this personal lines policyholder own a small business?  Are they a candidate for a business packaged policy?
  4. What is your policyholder claims history?  What about prior carriers and network of suppliers?
  5. How many touch points have your agents had with your policyholders?  Were they meaningful?
  6. How can you connect with your policyholders in social media settings and make an impact?
  7. What is your policyholders' mobile usage, and what are they doing online that might interest your Marketing team?

These are just some of the examples of very streamlined connections that you can make with your policyholders once you have your 360-degree view. Imagine the heavy lifting required to do these things without a Master Policyholder record.

Fear is the enemy of innovation.  In mastering policyholder data it is important to have two distinct work streams.  First, secure the necessary data elements using data masking technology.  Once that data is secured, gain understanding by mastering your policyholder record.  Only then will you truly be able to take your clients' experience to the next level.  When that happens, watch your revenue grow by leaps and bounds.


Takeaways from the Gartner Security and Risk Management Summit (2014)

Last week I had the opportunity to attend the Gartner Security and Risk Management Summit. At this event, Gartner analysts and security industry experts meet to discuss the latest trends, advances, best practices and research in the space. At the event, I had the privilege of connecting with customers, peers and partners. I was also excited to learn about changes that are shaping the data security landscape.

Here are some of the things I learned at the event:

  • Security continues to be a top CIO priority in 2014. Security is well-aligned with other trends such as big data, IoT, mobile, cloud, and collaboration. According to Gartner, the top CIO priority area is BI/analytics. Given our growing appetite for all things data and our increasing ability to mine data to increase top-line growth, this top billing makes perfect sense. The challenge is to protect the data assets that drive value for the company and ensure appropriate privacy controls.
  • Mobile and data security are the top focus for 2014 spending in North America according to Gartner’s pre-conference survey. Cloud rounds out the list when considering worldwide spending results.
  • Rise of the DRO (Digital Risk Officer). Fortunately, those same market trends are leading to an evolution of the CISO role into a Digital Security Officer and, longer term, a Digital Risk Officer. The DRO role will include determining the risks and security of digital connectivity. Digital/information security risk is increasingly being reported to the board as a business impact.
  • Information management and information security are blending. Gartner assumes that 40% of global enterprises will have aligned governance of the two programs by 2017. This is not surprising given the overlap of common objectives such as inventories, classification, usage policies, and accountability/protection.
  • Security methodology is moving from a reactive approach to compliance-driven and proactive (risk-based) methodologies. There is simply too much data and too many events for analysts to monitor. Organizations need to understand their assets and their criticality. Big data analytics and context-aware security are then needed to reduce the noise and false-positive rates to a manageable level. According to Gartner analyst Avivah Litan, "By 2018, of all breaches that are detected within an enterprise, 70% will be found because they used context-aware security, up from 10% today."

I want to close by sharing the Top Digital Security Trends identified for 2014:

  • Software-defined security
  • Big data security analytics
  • Intelligent/Context-aware security controls
  • Application isolation
  • Endpoint threat detection and response
  • Website protection
  • Adaptive access
  • Securing the Internet of Things

Columnar Deduplication and Column Tokenization: Improving Database Performance, Security and Interoperability

For some time now, a special technique called columnar deduplication has been implemented by a number of commercially available relational database management systems. In today’s blog post, I discuss the nature and benefits of this technique, which I will refer to as column tokenization for reasons that will become evident.

Column tokenization is a process in which a unique identifier (called a Token ID) is assigned to each unique value in a column and then employed to represent that value anywhere it appears in the column. Using this approach, data size reductions of up to 50% can be achieved, depending on the number of unique values in the column (that is, on the column's cardinality). Some RDBMSs use this technique simply as a way of compressing data; the column tokenization process is integrated into the buffer and I/O subsystems, and when a query is executed, each row must be materialized and the Token IDs replaced by their corresponding values. At Informatica, column tokenization is the core of the File Archive Service (FAS) technology, part of the Information Lifecycle Management product family: the tokenized structure is actually used during query execution, with row materialization occurring only when the final result set is returned. We also use special compression algorithms to achieve further size reduction, typically on the order of 95%.
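A minimal sketch of the idea (illustrative only, not the FAS implementation) looks like this: each unique column value is stored once in a dictionary and every row holds only a small Token ID.

```python
def tokenize_column(values):
    """Assign a Token ID to each unique value and encode the column as IDs."""
    dictionary = {}      # unique value -> Token ID
    encoded = []
    for v in values:
        if v not in dictionary:
            dictionary[v] = len(dictionary)
        encoded.append(dictionary[v])
    return dictionary, encoded

column = ["CA", "NY", "CA", "TX", "NY", "CA"]
dictionary, encoded = tokenize_column(column)
print(dictionary)  # {'CA': 0, 'NY': 1, 'TX': 2}
print(encoded)     # [0, 1, 0, 2, 1, 0]
# The lower the column's cardinality, the greater the size reduction:
# each repeated value is stored once, plus one small Token ID per row.
```

Queries that filter or join on such a column can, in principle, operate on the Token IDs directly and defer replacing IDs with values until the final result set is assembled, which is the behavior described above for FAS.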

(more…)


Enterprise Data Archiving Is White Hot

Informatica recently hosted a webinar on Enterprise Data Archiving Best Practices with guest speakers Tony Baer from Ovum and Murali Rathnam from Symantec IT.  With over 600 registrations, I would say that enterprise data archiving is not just hot, it is white hot.  At least for Informatica.  With Big Data entering the data center, organizations are looking for ways to make room, either in the budget or in the data center itself.  Archiving is a proven approach that achieves both.  Given the complexities and interconnections of enterprise applications, Enterprise Data Archive solutions based on market-leading technologies such as Informatica Data Archive can deliver on the value proposition while meeting tough requirements.  (more…)


Ten Facets of Data Governance

In this video, Rob Karel, vice president of product strategy, Informatica, outlines the Informatica Data Governance Framework, highlighting the 10 facets that organizations need to focus on for an effective data governance initiative:

  • Vision and Business Case to deliver business value
  • People
  • Tools and Architecture to support architectural scope of data governance
  • Policies that make up the data governance function (security, archiving, etc.)
  • Measurement: measuring the level of influence of a data governance initiative and measuring its effectiveness (business value metrics, ROI metrics, such as increasing revenue, improving operational efficiency, reducing risk, reducing cost or improving customer satisfaction)
  • Change Management: incentives to workforce, partners and customers to get better quality data in and potential repercussions if data is not of good quality
  • Organizational Alignment: how the organization will work together across silos
  • Dependent Processes: identifying data lifecycles (capturing, reporting, purchasing and updating data into your environment), all processes consuming the data and processes to store and manage the data
  • Program Management: effective program management skills to build out communication strategy, measurement strategy and a focal point to escalate issues to senior management when necessary
  • Define Processes that make up the data governance function (discovery, definition, application and measuring and monitoring).

For more information from Rob Karel on the Informatica Data Governance Framework, visit his Perspectives blogs.


The Individual in the European Data Tug of War

LinkedIn’s security breach this summer exposed 6.5 million user passwords and was yet another reminder of the widespread lack of protection over consumer data.  The constant deluge of reports of personal data leakages has left 70% of EU citizens worried about the misuse of their personal data, according to the European Commission. That’s why the EU stepped in to look at strengthening the right to access, change or delete personal data. (more…)


Informatica at EMC World – Where Big Data Needs Integration

This week the EMC World 2012 conference is taking place in Las Vegas.  Informatica is participating as a partner, continuing its commitment to the EMC Select Partnership for the Informatica ILM and MDM solutions.  Informatica has also expanded the partnership to include support for EMC's Greenplum Hadoop distribution, mostly to support organizations' needs for big data integration while making big data manageable and secure. (more…)


People, Processes and Technology – Don’t Forget the TECHNOLOGY!

As a routine matter of delivering care, billing for services and operating their hospitals and physician practices, healthcare providers deal with patients' protected health information all day, every day. Dealing with the data becomes routine, and it's easy for sometimes onerous security and privacy policies and procedures to be overlooked. While we'd all like that not to be the case, delivering healthcare (and getting paid for it) is a hugely complex undertaking, and focusing exclusively on human processes, calling for constant vigilance and attention to detail, can only go so far. (more…)
