Category Archives: Data Security

IDC Life Sciences and Ponemon Research Highlights Need for New Security Measures

The Healthcare and Life Sciences industry has demonstrated its ability to take advantage of data to fuel research, explore new ways to cure life-threatening diseases, and save lives.  As it adopts technology innovations, especially in the mobile segment, the industry will need to strike a balance between investment and risk.

ModernMedicine.com published an article in May 2014 describing analysts' worry that a wide-scale security breach could occur in the healthcare and pharmaceutical industry this year.  The piece calls out that this industry category ranked lowest in an S&P 500 cyber-health study because of its high volume of incidents and slow response rates.

In the Ponemon Institute’s research, The State of Data Centric Security, respondents from the Healthcare and Life Sciences stated the data they considered most at risk was customer, consumer and patient record data.  Intellectual Property, Business Intelligence and Classified Data responses ranked a close second.


In an Informatica webinar with Alan Louie, Research Analyst at IDC Health Insights (@IDCPharmaGuru), we discussed his research on ‘Changing Times in the Life Sciences – Enabled and Empowered by Tech Innovation’.  The megatrends of cloud, mobile, social networks and Big Data analytics are all moving in a positive direction, in various phases of adoption.  Mobile technology tops the list of IT priorities – likely because of the productivity gains that mobile devices and applications can deliver. Security/risk management technologies ranked as the second-highest priority.

When the Ponemon survey asked security professionals in Life Sciences, ‘What keeps you up at night?’, the top answer was ‘migrating to new mobile platforms’.  I call this out because every other industry category ranked ‘not knowing where sensitive data resides’ as the biggest concern. Why is Life Sciences different from other industries?


One reason could be that the intense scrutiny over intellectual property protection and HIPAA compliance has already shone a light on where sensitive data resides. Mobile makes a potential breach difficult to track and contain, given that cell phones are the number one item left behind in taxi cabs.

With the threat of a major breach on the horizon, and the push to leverage technologies such as mobile and cloud, it is evident that investments in security and risk management need to focus on the data itself, rather than being tied to a specific technology or platform.

Enter Data-Centric Security.  The call to action is to consider a new approach to the information security paradigm, one that emphasizes the security of the data itself rather than the security of networks or applications.  Informatica recently published an eBook, ‘Data-Centric Security: New Imperatives for a New Age of Data’.  Download it and read it. In an industry with so much at stake, we highlight the need for new security measures such as these. Do you agree?

I encourage your comments and invite you to open the dialogue!

Posted in Application ILM, Data Governance, Data masking, Data Privacy, Data Security

Just In Time For the Holidays: How The FTC Defines Reasonable Security


Recently the International Association of Privacy Professionals (IAPP, www.privacyassociation.org) published a white paper analyzing the Federal Trade Commission's (FTC) data security and breach enforcement actions. These enforcement actions involve organizations in the finance, retail, technology and healthcare industries within the United States.

From this analysis in “What’s Reasonable Security? A Moving Target,” IAPP extrapolated the best practices from the FTC’s enforcement actions.

While the white paper and article indicate that “reasonable security” is a moving target, they do provide recommendations that will help organizations assess and baseline their current data security efforts.  Interesting is the focus on data-centric security, from overall enterprise assessment to the careful control of access by employees and third parties.  Here are some of the recommendations derived from the FTC's enforcement actions that call for data-centric security:

  • Perform assessments to identify reasonably foreseeable risks to the security, integrity, and confidentiality of personal information collected and stored on the network, online or in paper files.
  • Limited access policies curb unnecessary security risks and minimize the number and type of network access points that an information security team must monitor for potential violations.
  • Limit employee access to (and copying of) personal information, based on the employee's role.
  • Implement and monitor compliance with policies and procedures for rendering information unreadable or otherwise secure in the course of disposal. Securely disposed information must not practicably be read or reconstructed.
  • Restrict third party access to personal information based on business need, for example, by restricting access based on IP address, granting temporary access privileges, or similar procedures.

How does Data Centric Security help organizations achieve this inferred baseline? 

  1. Data Security Intelligence (Secure@Source coming Q2 2015), provides the ability to “…identify reasonably foreseeable risks.”
  2. Data Masking (Dynamic and Persistent Data Masking)  provides the controls to limit access of information to employees and 3rd parties.
  3. Data Archiving provides the means for the secure disposal of information.

Other data-centric security controls include encryption for data at rest and in motion, and tokenization for securing payment card data.  All of these controls help organizations secure their data, whether a threat originates internally or externally.  And based on this year's never-ending news of data breaches and attacks, it is a matter of when, not if, your organization will be significantly breached.

For 2015, “Reasonable Security” will require ongoing analysis of sensitive data and the deployment of reciprocal data-centric security controls to ensure that organizations keep pace with this “Moving Target.”

Posted in Data Integration, Data masking, Data Privacy, Data Security

10 Insights From The Road To Data Governance


I routinely have the pleasure of working with Terri Mikol, Director of Data Governance, UPMC. Terri has been spearheading data governance for three years at UPMC. As a result, she has a wealth of insights to offer on this hot topic. Enjoy her top 10 lessons learned from UPMC’s data governance journey:

1. You already have data stewards.

Commonly, health systems think they can't staff data governance as UPMC has because of a lack of funding. In reality, people are already doing data governance everywhere across your organization! You don't have to secure headcount; you locate these people within the business, formalize data governance as part of their jobs, and provide them tools to improve and manage their efforts.

2. Multiple types of data stewards ensure all governance needs are being met.

Three types of data stewards were identified and tasked across the enterprise:

I. Data Steward. Create and maintain data/business definitions. Assist with defining data and mappings along with rule definition and data integrity improvement.

II. Application Steward. One steward is named per application sourcing enterprise analytics. Populate and maintain inventory, assist with data definition and prioritize data integrity issues.

III. Analytics Steward. Named for each team providing analytics. Populate and maintain inventory, reduce duplication and define rules and self-service guidelines.

3. Establish IT as an enabler.

IT, instead of taking action on data governance or being the data governor, has become an enabler of data governance by investing in and administering tools that support metadata definition and master data management.

4. Form a governance council.

UPMC formed a governance council of 29 executives—yes, that's a big number, but UPMC is a big organization. The council is clinically led, co-chaired by two CMIOs, and includes Marketing, Strategic Planning, Finance, Human Resources, the Health Plan, and Research. The council signs off on and prioritizes policies. Decision-making authority has to come from somewhere.

5. Avoid slowing progress with process.

In these still-early days, only 15 minutes of monthly council meetings are spent on policy and guidelines; discussion and direction take priority. For example, a recent agenda item was “Length of Stay.” The council agreed a single owner would coordinate across Finance, Quality and Care Management to define and document an enterprise definition for “Length of Stay.”

6. Use examples.

Struggling to get buy-in from the business about the importance of data governance? An example everyone can relate to is “Test Patient.” For years, in her business intelligence role, Terri worked with “Test Patient.” Investigation revealed that these fake patients end up in places they should not. There was no standard for the creation or removal of test patients, which meant that test patients and their costs, outcomes, etc., were included in analysis and reporting that drove decisions inside and outside UPMC. The governance program created a policy for testing in production, should the need arise.

7. Make governance personal through marketing.

Terri holds monthly round tables with business and clinical constituents. These have been a game changer: Once a month, for two hours, ten business invitees meet and talk about the program. Each attendee shares a data challenge, and Terri educates them on the program and illustrates how the program will address each challenge.

8. Deliver self-service.

Providing self-service empowers your users to gain access to and control over the data they need to improve their processes. The only way to deliver self-service business intelligence is to make metadata, master data, and data quality transparent and accessible across the enterprise.

9. IT can’t do it alone.

Initially, IT was resistant to giving up control, but now the team understands that it doesn’t have the knowledge or the time to effectively do data governance alone.

10. Don’t quit!

Governance can be complicated, and it may seem like little progress is being made. Terri keeps spirits high by reminding folks that the only failure is quitting.

Getting started? Assess the data governance maturity of your organization here: http://governyourdata.com/

Posted in Data First, Data Governance, Data Integration, Data Security

Considering Data Integration? Also Consider Data Security Best Practices


It seems you can't go a week without hearing about some major data breach, many of which make front-page news.  The most recent was from the State of California, which reported a large number of data breaches in that state alone.  “The number of personal records compromised by data breaches in California surged to 18.5 million in 2013, up more than six times from the year before, according to a report published [late October 2014] by the state's Attorney General.”

California reported a total of 167 data breaches in 2013, up 28 percent from 2012.  Two major data breaches caused most of this uptick: the Target attack reported in December 2013 and the LivingSocial attack in April 2013.  This year, you can add the Home Depot data breach to that list, as well as the recent breach at the US Postal Service.

So, what the heck is going on?  And how does this news impact data integration?  Should we be concerned as we place more and more data on public clouds, or within big data systems?

Almost all of these breaches were made possible by traditional systems with security technology and security operations that fell far enough behind that outside attackers found a way in.  You can count on many more of these attacks, as enterprises and governments don't look at security for what it is: an ongoing activity that may require massive and systemic changes to make sure the data is properly protected.

As enterprises and government agencies stand up cloud-based systems, and new big data systems, either inside (private) or outside (public) of the enterprise, there are some emerging best practices around security that those who deploy data integration should understand.  Here are a few that should be on the top of your list:

First, start with Identity and Access Management (IAM) and work your way backward.  These days, most cloud and non-cloud systems are complex distributed systems.  That makes IAM clearly the best security model and best practice to follow with the emerging use of cloud computing.

The concept is simple: provide a security approach and technology that enables the right individuals to access the right resources, at the right times, for the right reasons.  The concept follows the principle that everything and everyone gets an identity.  This includes humans, servers, APIs, applications, data, etc.  Once identities are assigned, it's just a matter of defining which identities can access which other identities, and creating policies that define the limits of that relationship.
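A minimal sketch of that identity-first idea, with all names hypothetical and not tied to any particular IAM product: every subject and resource gets an identity, and explicit policies define which identity may perform which action on which other identity.

```python
# Toy illustration of identity-based access control: every actor and
# resource is an identity, and a policy is a (subject, action, resource)
# triple. Anything not explicitly granted is denied.

POLICIES = {
    ("etl-service", "read", "customer-db"),
    ("etl-service", "write", "staging-db"),
    ("analyst-jane", "read", "staging-db"),
}

def is_allowed(subject: str, action: str, resource: str) -> bool:
    """Return True only if an explicit policy grants the access."""
    return (subject, action, resource) in POLICIES

print(is_allowed("etl-service", "read", "customer-db"))   # True
print(is_allowed("analyst-jane", "write", "customer-db")) # False
```

Real IAM systems add authentication, roles, and policy languages on top, but the default-deny relationship between identities is the core idea.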

Second, work with your data integration provider to identify solutions that work best with their technology.  Most data integration solutions address security in one way, shape, or form.  Understanding those solutions is important to secure data at rest and in flight.

Finally, splurge on monitoring and governance.  Many of the issues behind this growing number of breaches stem from system managers' inability to spot and stop attacks.  Creative approaches to monitoring system and network utilization, as well as data access, will allow IT to spot most attacks and correct the issues before they ‘go nuclear.’  Typically, an increasing number of breach attempts leads up to the complete breach.
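As a toy illustration of that monitoring idea (the thresholds, user names, and baseline figures are invented for the example), here is the kind of simple signal that can surface probing attempts before they become a complete breach: flag identities whose data-access volume far exceeds their historical baseline.

```python
from collections import Counter

def flag_anomalies(access_log, baseline, factor=3):
    """Return identities whose access count exceeds factor x their baseline."""
    counts = Counter(user for user, _table in access_log)
    return sorted(u for u, n in counts.items() if n > factor * baseline.get(u, 0))

# A reporting service doing its usual work, and a contractor suddenly
# pulling the customer table far more often than its history suggests.
log = [("svc-report", "orders")] * 4 + [("temp-contractor", "customers")] * 40
baseline = {"svc-report": 5, "temp-contractor": 2}

print(flag_anomalies(log, baseline))  # ['temp-contractor']
```

Production monitoring uses far richer signals (time of day, network paths, data volumes), but even a crude baseline comparison catches the loud outliers.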

The issue and burden of security won't go away.  Systems will continue to move to public and private clouds, and data will continue to migrate to distributed big data environments.  And that means the need for data integration and data security will continue to explode.

Posted in Data Integration, Data Privacy, Data Security

Data First: Five Tips To Reduce the Risk of A Breach


This article was originally published on www.federaltimes.com

November – that time of the year. This year, November 1 was the start of Election Day weekend and the associated endless barrage of political ads. It also marked the end of Daylight Saving Time. But, perhaps most prominently, it marked the beginning of the holiday shopping season. Winter holiday decorations erupted in stores even before Halloween decorations were taken down. There were commercials and ads, free shipping on this, sales on that, singing, and even the first appearance of Santa Claus.

However, it’s not all joy and jingle bells. The kickoff to this holiday shopping season may also remind many of the countless credit card breaches at retailers that plagued last year’s shopping season and beyond. The breaches at Target, where almost 100 million credit cards were compromised, Neiman Marcus, Home Depot and Michael’s exemplify the urgent need for retailers to aggressively protect customer information.

In addition to the holiday shopping season, November also marks the next round of open enrollment for the ACA healthcare exchanges. Therefore, to avoid falling victim to the next data breach, government organizations, as much as retailers, need to keep data security top of mind.

According to the New York Times (Sept. 4, 2014), “for months, cyber security professionals have been warning that the healthcare site was a ripe target for hackers eager to gain access to personal data that could be sold on the black market. A week before federal officials discovered the breach at HealthCare.gov, a hospital operator in Tennessee said that Chinese hackers had stolen personal data for 4.5 million patients.”

Acknowledging the inevitability of further attacks, companies and organizations are taking action. For example, the National Retail Federation created the NRF IT Council, which is made up of 130 technology-security experts focused on safeguarding personal and company data.

Is government doing enough to protect personal, financial and health data in light of these increasing and persistent threats? The quick answer: no. The federal government as a whole is not meeting the data privacy and security challenge. Reports of cyber attacks and breaches are becoming commonplace, and warnings of new privacy concerns in many federal agencies and programs are being discussed in Congress, Inspector General reports and the media. According to a recent Government Accountability Office report, 18 out of 24 major federal agencies in the United States reported inadequate information security controls. Further, FISMA and HIPAA are falling short, and established security protocols, such as encryption, are not keeping up with the sophistication of attacks. Government must follow the lead of industry and look for new and advanced data protection technologies, such as dynamic data masking and continuous data monitoring, to prevent and thwart potential attacks.

These five principles can be implemented by any agency to curb the likelihood of a breach:

1. Expand the appointment and authority of CSOs and CISOs at the agency level.

2. Centralize the agency’s data privacy policy definition and implement on an enterprise level.

3. Protect all environments from development to production, including backups and archives.

4. Data and application security must be prioritized at the same level as network and perimeter security.

5. Data security should follow data through downstream systems and reporting.

So, as the season of voting, rollbacks, on-line shopping events, free shipping, Black Friday, Cyber Monday and healthcare enrollment begins, so does the time for protecting personal identifiable information, financial information, credit cards and health information. Individuals, retailers, industry and government need to think about data first and stay vigilant and focused.


Posted in B2B, B2B Data Exchange, Data First, Data Security, Data Services

Informatica’s Response To the POODLE SSL v3 Vulnerability

The information security industry is an “arms race,” with attacks always getting better. To that end, it's important that security controls and implementations be designed with flexibility and agility in mind. The SSL protocol was originally developed by Netscape Communications, the company that helped fuel the Internet generation. Over the past 20 years, we've seen advances in computing power and novel attacks against SSL implementations. As new vulnerabilities are discovered, enhancements are proposed; TLS was developed to replace SSL. The cycle continues: we're now at TLS 1.2, and with the “POODLE” vulnerability it's finally time to say goodbye forever to the SSL foundations that brought us here.

The recent announcement of POODLE should be another wake-up call to security practitioners and implementors that patches and updates are now the “New Normal” for infrastructure that handles sensitive data. Even foundational protocols like SSL require care and feeding, regular maintenance and updates. For Informatica customers, this note is our response to help you keep your software patched via important updates provided by our Global Customer Support organization.

1 – What you need to know

On October 14, 2014, Google security researchers released details of a vulnerability in the design of the SSL version 3 protocol.  An attacker in a privileged position on a network could intercept encrypted traffic and methodically decrypt the messages to reveal sensitive information such as credentials.  This is an industry-wide issue, affecting nearly every system that implements or supports SSL.  TLS is its replacement, but not every product is guaranteed compatible with an SSL-to-TLS upgrade, so patches need to be applied and tested carefully.

Informatica’s cloud-hosted products including Informatica Cloud Services (ICS) and our recently-launched Project Springbok beta, have already been patched to address this issue and will only support TLS going forward. We continue to monitor for relevant updates to both vulnerabilities and available patches.

Because this vulnerability affects connectivity to other systems, it is important that our customers carefully assess (a) the level of risk they are actually subject to, and (b) the impact disabling SSLv3 will have against other clients, servers, and API endpoints in their ecosystem. It may be acceptable, for example, to leave SSLv3 enabled if sufficient compensating controls are enforced and the risk of change is too great.
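For teams assessing their own client configurations, a TLS-only policy can be sketched in a few lines. This example uses Python's standard ssl module; modern OpenSSL builds already refuse SSLv3, so pinning a minimum version mainly makes the policy explicit and auditable.

```python
import ssl

# Sketch of a TLS-only client context. PROTOCOL_TLS_CLIENT negotiates
# the best mutually supported TLS version and never SSLv3; setting
# minimum_version documents and enforces the floor explicitly.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse anything older

# Certificate checking stays on by default with PROTOCOL_TLS_CLIENT
# (check_hostname=True, verify_mode=CERT_REQUIRED).
assert ctx.check_hostname
```

A context like this would then be passed to the connection layer (for example, `ctx.wrap_socket(...)` or an HTTP client that accepts an `ssl_context`), so the version floor applies to every outbound connection it creates.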

2 – What you need to do

Informatica’s Information Security team coordinated an internal response with developers to assess the vulnerability within our products and cloud services.

Some Informatica products require patches or configuration changes to support this SSL parameter tuning. Please contact Informatica Global Customer Support or your account executive for more information or technical questions.

Informatica cloud-based services were patched by our operations team. Please check that your connectivity to these services is operating normally, especially any SSL or TLS settings.

  • Springbok (Beta): No action necessary. The Springbok infrastructure has been patched by Informatica Cloud Operations.
  • ActiveVOS/Cloud (all versions): No action necessary. The ActiveVOS/Cloud infrastructure has been patched by Informatica Cloud Operations.
  • Cloud/ICS (all versions): No action necessary. The ICS infrastructure has been patched by Informatica Cloud Operations.

Informatica takes the security of our customers' data very seriously. Please refer to Informatica's Knowledge Base article, or contact our Global Customer Support team if you have any questions or concerns about our products' SSL configurations in your environment. The Informatica support portal is always available at http://mysupport.informatica.com.

3 – How to contact Informatica about security

If you are a security researcher and have identified a potential vulnerability in an Informatica product or service, please follow our Responsible Disclosure Program.

Posted in Cloud Computing, Data Security

Which Method of Controls Should You Use to Protect Sensitive Data in Databases and Enterprise Applications? Part II


To determine the appropriate method for protecting sensitive data, you should first answer the following questions regarding your requirements:

  • Do you need to protect data at rest (in storage), during transmission, and/or when accessed?
  • Do some privileged users still need the ability to view the original sensitive data or does sensitive data need to be obfuscated at all levels?
  • What is the granularity of controls that you need?
    • Datafile level
    • Table level
    • Row level
    • Field / column level
    • Cell level
  • Do you need to be able to control viewing vs. modification of sensitive data?
  • Do you need to maintain the original characteristics / format of the data (e.g. for testing, demo, or development purposes)?
  • Is response time latency / performance of high importance for the application?  This can be the case for mission-critical production applications that need to maintain response times on the order of seconds or sub-seconds.

In order to help you determine which method of control is appropriate for your requirements, the following table provides a comparison of the different methods and their characteristics.

[Table: comparison of sensitive data protection methods and their characteristics]

A combination of protection methods may be appropriate based on your requirements.  For example, to protect data in non-production environments, you may want to use persistent data masking to ensure that no one has access to the original production data, since no one needs it there.  This is especially true if your development and testing are outsourced to third parties.  In addition, persistent data masking allows you to maintain the original characteristics of the data to ensure test data quality.

In production environments, you may want to use a combination of encryption and dynamic data masking.  This is the case if you would like to ensure that all data at rest is protected against unauthorized users, yet you need to protect sensitive fields only for certain sets of authorized or privileged users, but the rest of your users should be able to view the data in the clear.

The best method or combination of methods will depend on each scenario and set of requirements for your environment and organization.  As with any technology and solution, there is no one size fits all.

Posted in Data Integration, Data masking, Data Security, Data Services, Enterprise Data Management

Which Method of Controls Should You Use to Protect Sensitive Data in Databases and Enterprise Applications? Part I


I’m often asked to share my thoughts about protecting sensitive data. The questions that typically come up include:

  • Which types of data should be protected?
  • Which data should be classified as “sensitive?”
  • Where is this sensitive data located?
  • Which groups of users should have access to this data?

Because these questions come up frequently, it seems ideal to share a few guidelines on this topic.

When protecting the confidentiality and integrity of data, the first line of defense is authentication and access control. However, data with higher levels of sensitivity or confidentiality may require additional levels of protection, beyond regular authentication and authorization methods.

There are a number of control methods for securing sensitive data available in the market today, including:

  • Encryption
  • Persistent (Static) Data Masking
  • Dynamic Data Masking
  • Tokenization
  • Retention management and purging

Encryption is a cryptographic method of encoding data.  There are generally two methods of encryption: symmetric (using a single secret key) and asymmetric (using public and private keys).  Although there are methods of deciphering encrypted information without possessing the key, a good encryption algorithm makes it very difficult to decode the encrypted data without knowledge of the key.  Key management is usually the chief concern with this method of control.  Encryption is ideal for mass protection of data (e.g. an entire data file, table, partition, etc.) against unauthorized users.
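To illustrate only the symmetric principle, that one shared secret key both encrypts and decrypts, here is a toy XOR one-time pad in Python. This is emphatically not a substitute for a vetted algorithm such as AES; use an established cryptographic library in practice.

```python
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the corresponding key byte."""
    return bytes(d ^ k for d, k in zip(data, key))

message = b"patient record 4711"
key = secrets.token_bytes(len(message))   # the single shared secret key

ciphertext = xor_bytes(message, key)      # encrypt
recovered = xor_bytes(ciphertext, key)    # decrypt with the same key

assert recovered == message
```

Asymmetric encryption differs in exactly this respect: the key that encrypts (the public key) cannot decrypt, so it can be distributed freely while the private key stays secret.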

Persistent or static data masking obfuscates data at rest in storage.  There is usually no way to retrieve the original data; the data is permanently masked.  There are multiple techniques for masking data, including shuffling, substitution, aging, encryption, domain-specific masking (e.g. email address, IP address, credit card, etc.), dictionary lookup, and randomization.  Depending on the technique, there may be ways to perform reverse masking; this should be used sparingly.  Persistent masking is ideal for cases where no user should see the original sensitive data (e.g. test / development environments) and field-level data protection is required.
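Two of the techniques above can be sketched in a few lines of Python (the field formats and substitute names are hypothetical): domain-specific masking that preserves the shape of an email address, and deterministic substitution, which maps the same input to the same fake value so joins across masked tables still line up.

```python
import hashlib

def mask_email(email: str) -> str:
    """Domain-specific masking: keep the first character and the domain."""
    user, _, domain = email.partition("@")
    return user[0] + "*" * (len(user) - 1) + "@" + domain

def substitute_name(name: str, choices=("Avery", "Blake", "Casey", "Drew")) -> str:
    """Deterministic substitution: the same input always yields the same fake."""
    digest = int(hashlib.sha256(name.encode()).hexdigest(), 16)
    return choices[digest % len(choices)]

print(mask_email("jsmith@example.com"))   # j*****@example.com
assert substitute_name("Alice") == substitute_name("Alice")
```

Determinism is what keeps referential integrity intact in a masked test database; a purely random substitution would break joins between tables that share the masked column.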

Dynamic data masking de-identifies data when it is accessed.  The original data is still stored in the database.  Dynamic data masking (DDM) acts as a proxy between the application and database and rewrites the user / application request against the database depending on whether the user has the privilege to view the data or not.  If the requested data is not sensitive or the user is a privileged user who has the permission to access the sensitive data, then the DDM proxy passes the request to the database without modification, and the result set is returned to the user in the clear.  If the data is sensitive and the user does not have the privilege to view the data, then the DDM proxy rewrites the request to include a masking function and passes the request to the database to execute.  The result is returned to the user with the sensitive data masked.  Dynamic data masking is ideal for protecting sensitive fields in production systems where application changes are difficult or disruptive to implement and performance / response time is of high importance.
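The decision the DDM proxy makes can be reduced to a small sketch. The roles, field names, and masking format below are invented for illustration, and a real proxy rewrites the SQL request itself rather than filtering values in application code.

```python
# Minimal sketch of the dynamic-masking decision: the original value
# stays intact in the store; only what is returned to an unprivileged
# caller is rewritten.

PRIVILEGED_ROLES = {"dba", "fraud_analyst"}

def mask_ssn(ssn: str) -> str:
    return "***-**-" + ssn[-4:]

def fetch_ssn(row: dict, role: str) -> str:
    value = row["ssn"]              # value as stored in the database
    if role in PRIVILEGED_ROLES:
        return value                # privileged users see it in the clear
    return mask_ssn(value)          # everyone else gets the masked form

row = {"ssn": "123-45-6789"}
print(fetch_ssn(row, "dba"))        # 123-45-6789
print(fetch_ssn(row, "support"))    # ***-**-6789
```

Because the rewrite happens in the proxy rather than the application, existing production applications need no code changes, which is exactly why DDM suits systems where changes are difficult or disruptive.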

Tokenization substitutes a sensitive data element with a non-sensitive data element, or token.  First-generation tokenization systems require a token server and a database to store the original sensitive data.  The mapping from clear text to token makes it very difficult to reverse the token back to the original data without the token system.  However, the token server and mapping database that hold the original sensitive data become a potential security vulnerability, a scalability bottleneck, and a single point of failure. Next-generation tokenization systems have addressed these weaknesses.  Tokenization does require changes to the application layer to tokenize and detokenize sensitive data when it is accessed.  Tokenization can be used in production systems to protect sensitive data at rest in the database store, when changes to the application layer can be made relatively easily to perform the tokenization / detokenization operations.
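A first-generation token vault of the kind described above can be sketched as follows (the class and token format are hypothetical). Note how the random token has no mathematical relationship to the original value, so the vault itself is the only way back, which is both the strength and the single point of failure discussed above.

```python
import secrets

class TokenVault:
    """Toy first-generation token vault: token -> original value."""

    def __init__(self):
        self._vault = {}

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)  # random, no link to value
        self._vault[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]              # only the vault can reverse it

vault = TokenVault()
token = vault.tokenize("4111 1111 1111 1111")

assert token != "4111 1111 1111 1111"
assert vault.detokenize(token) == "4111 1111 1111 1111"
```

Downstream systems can store, display, and join on the token freely; only the few application paths that truly need the card number call `detokenize`, which is where the application-layer changes mentioned above come in.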

Retention management and purging is more of a data management method to ensure that data is retained only as long as necessary.  The best way to reduce data privacy risk is to eliminate the sensitive data.  Therefore, appropriate retention, archiving, and purging policies should be applied to reduce the privacy and legal risks of holding on to sensitive data for too long.  Retention management and purging is a data management best practice that should always be put to use.
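A retention policy of this sort reduces to a simple purge rule; the sketch below uses an invented seven-year window, and real implementations would archive rather than silently drop records.

```python
from datetime import date, timedelta

RETENTION = timedelta(days=7 * 365)   # hypothetical seven-year retention rule

def purge_expired(records, today):
    """Keep only records still inside the retention window."""
    cutoff = today - RETENTION
    return [r for r in records if r["created"] >= cutoff]

records = [
    {"id": 1, "created": date(2005, 1, 15)},   # past retention -> purged
    {"id": 2, "created": date(2013, 6, 1)},    # still retained
]
kept = purge_expired(records, today=date(2014, 12, 1))
print([r["id"] for r in kept])   # [2]
```

Running a rule like this on a schedule is what turns a written retention policy into an enforced one: data that no longer exists cannot be breached.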

Posted in Data Integration, Data masking, Data Security, Data Services, Enterprise Data Management

Informatica Responsible Disclosure Policy

Notifying Informatica of Security Issues

Our team of security experts strives to quickly address security issues involving our products and services. For guidance, please see below:

Problem or issue Informatica contact or resource
How do I report a security problem in an Informatica application, online service, or website? Email Responsible_Disclosure@informatica.com
How do I provide feedback on an Informatica product or service? Go to Informatica Support orEmail support@informatica.com
How do I report an email, website or pop-up window that falsely claims to represent Informatica? Contact security@informatica.com
Does Informatica have a Bug Bounty Program No, but we do participate in a Responsible Disclosure Program.


Posted in Data Security

Informatica and the Shellshock Security Vulnerability

The security of information systems is a complex responsibility shared between infrastructure, system, and application providers. Informatica does not take lightly the responsibility our customers have entrusted to us in this complex risk equation.

As Informatica’s Chief Information Security Officer, I’d like to share three important security updates with our customers:

  1. What you need to know about Informatica products and services relative to the latest industry-wide security concern,
  2. What you need to do to secure Informatica products against the Shellshock vulnerability, and
  3. How to contact Informatica if you have questions about Informatica product security.

1 – What you need to know

On September 24, 2014, a serious cluster of vulnerabilities in the Bash shell shipped with most Linux/Unix distributions was announced, tracked as CVE-2014-6271, CVE-2014-7169, CVE-2014-7186, CVE-2014-7187, CVE-2014-6277 and CVE-2014-6278 and known as “Shellshock” or “Bashdoor”. What makes Shellshock so impactful is that it requires relatively little effort or expertise to exploit and gain privileged access to vulnerable systems.

Informatica’s cloud-hosted products, including Informatica Cloud Services (ICS) and our recently-launched Springbok beta, have already been patched to address this issue. We continue to monitor for relevant updates to both vulnerabilities and available patches.

Because this vulnerability is a function of the underlying Operating System, we encourage administrators of potentially vulnerable systems to assess their risk levels and apply patches and/or other appropriate countermeasures.
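As part of that risk assessment, administrators can check whether a given bash binary is patched using the widely published test string for CVE-2014-6271. A minimal sketch, assuming bash is on the PATH:

```shell
#!/bin/sh
# Widely published check for CVE-2014-6271: export a crafted function
# definition through the environment and see whether the code after the
# function body is executed.  A patched bash never echoes "vulnerable".
result=$(env x='() { :;}; echo vulnerable' bash -c 'true' 2>/dev/null)
if [ "$result" = "vulnerable" ]; then
    status="VULNERABLE"
else
    status="patched"
fi
echo "bash appears $status for CVE-2014-6271"
```

This tests only the first CVE in the cluster; the later CVEs have their own published test strings, so applying the vendor patch remains the only reliable remediation.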

Informatica’s Information Security team coordinated an internal response with product developers to assess the vulnerability and make recommendations necessary for our on-premise products. Specific products and actions are listed below.

2 – What you need to do

Informatica products themselves require no patches to address the Shellshock vulnerability; they are not directly impacted. However, Informatica strongly recommends that you apply your OS vendors’ patches as they become available, since some applications allow customers to run shell scripts in their pre- and post-processing steps. Specific Informatica products and remediations are listed below:

Cloud Service Version Patch / Remediation
Springbok Beta No action necessary. The Springbok infrastructure has been patched by Informatica Cloud Operations.
ActiveVOS/Cloud All No action necessary. The ActiveVOS/Cloud infrastructure has been patched by Informatica Cloud Operations.
Cloud/ICS All Customers should apply OS patches to all of their machines running a Cloud agent. Relevant Cloud/ICS hosted infrastructure has already been patched by Informatica Cloud Operations.


Product Version Patch / Remediation
PowerCenter All No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
IDQ All No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
MM, BG, IDE All No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
PC Express All No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
Data Services / Mercury stack All No direct impact. Customers who use shell scripts within their pre- / post-processing steps should apply OS patches to mitigate this vulnerability.
PWX mainframe & CDC All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
UM, VDS All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
IDR, IFC All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
B2B DT, UDT, hparser, Atlantic All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
Data Archive All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
Dynamic data masking All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
IDV All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
SAP Nearline No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
TDM No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
MDM All No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
IR / name3 No direct impact.  Recommend customers apply OS patch to all machines with INFA product installed.
B2B DX / DIH All DX and DIH customers on Red Hat should apply OS patches.  Customers on other operating systems are also advised to apply OS patches.
PIM All PIM core and Procurement are not directly impacted. Recommend Media Manager customers apply OS patch to all machines with INFA product installed.
ActiveVOS All No direct impact for on-premise ActiveVOS product.  Cloud-realtime has already been patched.
Address Doctor All No direct impact for AD services run on Windows.  Procurement service has already been patched by Informatica Cloud Operations.
StrikeIron All No direct impact.

3 – How to contact Informatica about security

Informatica takes the security of our customers’ data very seriously. Please consult the Informatica Knowledge Base (article ID 301574) or contact our Global Customer Support team if you have any questions or concerns. The Informatica support portal is always available at http://mysupport.informatica.com.

If you are a security researcher and have identified a potential vulnerability in an Informatica product or service, please follow our Responsible Disclosure Program.

Thank you,

Bill Burns, VP & Chief Information Security Officer

Posted in Cloud, Data Security, IaaS