Tag Archives: Data Privacy

Streamlining and Securing Application Test and Development Processes

Informatica recently hosted a webinar with Cognizant, who shared how they streamline test data management processes internally with Informatica Test Data Management and pass the benefits on to their customers. Proclaimed the world’s largest Quality Engineering and Assurance (QE&A) service provider, Cognizant has over 400 customers and thousands of testers and is considered a thought leader in the testing practice.

We polled over 100 attendees on their top test data management challenges, given the complexity of their data and systems and the need to protect their clients’ sensitive data. Here are the results from that poll:

It was not surprising to see that generating test data sets and securing sensitive data in non-production environments were tied as the two biggest challenges. Data integrity/synchronization was a very close third.

Together with Informatica, Cognizant has been evolving its test data management offering to focus not only on securing sensitive data but also on improving testing efficiency by identifying, provisioning and resetting test data – tasks that consume as much as 40% of testing cycle times. Key components of that next-generation test data management platform include:

Sensitive Data Discovery – an integrated and automated process that searches data sets for exposed sensitive data. Sensitive data often resides in test copies unbeknownst to auditors. Once located, the data can be masked in non-production copies.
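
To make the component concrete, here is a minimal, hypothetical sketch of pattern-based discovery in Python. It is not Informatica’s discovery engine, and the database, table and column names are assumptions; it simply scans the text columns of a small SQLite database and flags columns whose sampled values look like SSNs or email addresses.

```python
import re
import sqlite3

# Hypothetical patterns for two kinds of sensitive data.
PATTERNS = {
    "SSN": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "EMAIL": re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$"),
}

def discover_sensitive_columns(db_path):
    """Return {(table, column): {labels}} for columns that look sensitive."""
    findings = {}
    conn = sqlite3.connect(db_path)
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    for table in tables:
        columns = [r[1] for r in conn.execute(f"PRAGMA table_info({table})")]
        for col in columns:
            # Sample a handful of values instead of scanning the whole table.
            rows = conn.execute(
                f"SELECT {col} FROM {table} WHERE {col} IS NOT NULL LIMIT 100")
            for (value,) in rows:
                for label, pattern in PATTERNS.items():
                    if isinstance(value, str) and pattern.match(value):
                        findings.setdefault((table, col), set()).add(label)
    conn.close()
    return findings   # e.g. {("customers", "ssn"): {"SSN"}}
```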

Persistent Data Masking – masks sensitive data in flight while cloning data from production, or in place on a gold copy. Data formats are preserved while the original values are completely protected.
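
As an illustration only (this is not Informatica’s masking algorithm), the sketch below shows one way to mask deterministically while preserving format: each digit or letter is replaced using a keyed hash of the whole value, so lengths and separators survive and the same input always masks to the same output, which keeps joins between masked tables intact.

```python
import hashlib
import string

SECRET_KEY = b"rotate-me"   # hypothetical masking key, kept out of non-production

def mask_value(value: str) -> str:
    """Replace digits and letters while preserving the original format."""
    digest = hashlib.sha256(SECRET_KEY + value.encode("utf-8")).digest()
    out, i = [], 0
    for ch in value:
        if ch.isdigit():
            out.append(str(digest[i % len(digest)] % 10))
            i += 1
        elif ch.isalpha():
            letters = string.ascii_uppercase if ch.isupper() else string.ascii_lowercase
            out.append(letters[digest[i % len(digest)] % 26])
            i += 1
        else:
            out.append(ch)   # keep separators such as '-' or '@'
    return "".join(out)

print(mask_value("987-65-4329"))   # same shape as an SSN, different digits
print(mask_value("987-65-4329"))   # identical output on every run
```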

Data Privacy Compliance Validation – auditors want to know that data has in fact been protected, so the ability to validate and report on data privacy compliance becomes critical.
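
A minimal sketch of that validation step, using hypothetical database files and schema (this is not Informatica’s compliance reporting): compare a column in the masked, non-production copy against production and report any value that survived masking unchanged.

```python
import sqlite3

def validate_masking(prod_db, test_db, table, column):
    """Report values in the test copy that still match production."""
    with sqlite3.connect(prod_db) as prod, sqlite3.connect(test_db) as test:
        prod_values = {r[0] for r in prod.execute(f"SELECT {column} FROM {table}")}
        leaked = [r[0] for r in test.execute(f"SELECT {column} FROM {table}")
                  if r[0] in prod_values]
    print(f"{table}.{column}: {len(leaked)} value(s) still match production")
    return leaked

# Usage (hypothetical files and schema):
# validate_masking("prod.db", "test_copy.db", "customers", "ssn")
```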

Test Data Management – in addition to creating test data subsets, clients require the ability to synthetically generate test data sets so that each test case runs against data optimized for it, eliminating defects earlier. In many cases, multiple testers also work in the same environment and may clobber each other’s test data sets, so the ability to reset test data becomes a key requirement for improving efficiency.
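
The sketch below illustrates both ideas with hypothetical names and schema (it is a toy illustration, not Informatica’s generation or reset feature): synthetic records are generated to suit a specific test case, a gold copy is saved, and the environment can be reset between destructive test runs.

```python
import random
import shutil
import sqlite3

def generate_customers(db_path, count, state="CA"):
    """Create synthetic customer rows aligned to a 'CA customers' test case."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS customers "
                 "(id INTEGER, name TEXT, state TEXT, balance REAL)")
    rows = [(i, f"Test Customer {i}", state, round(random.uniform(0, 10000), 2))
            for i in range(count)]
    conn.executemany("INSERT INTO customers VALUES (?, ?, ?, ?)", rows)
    conn.commit()
    conn.close()

def snapshot(db_path):
    shutil.copy(db_path, db_path + ".gold")    # gold copy before testers touch it

def reset(db_path):
    shutil.copy(db_path + ".gold", db_path)    # restore between test runs

generate_customers("testdata.db", 50)
snapshot("testdata.db")
# ... destructive test run ...
reset("testdata.db")
```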

Figure 2: Next Generation Test Data Management

When asked which tools or services they had deployed, 78% of respondents said in-house developed scripts and utilities – an incredibly time-consuming approach with limited repeatability. Data masking had been deployed by almost half of the respondents.

Figure 3: Tools and services deployed for test data management (poll results)

Informatica and Cognizant are leading the way in establishing a new standard for test data management by incorporating test data generation, data masking, and the ability to refresh or reset test data sets. For more information, check out Cognizant’s offering based on Informatica, TDMaxim, and the white paper Transforming Test Data Management for Increased Business Value.

 


Comparing Encryption and Tokenization with Dynamic Data Masking

In recent conversations about which data privacy solutions to implement, our Dynamic Data Masking team put together the following comparison to highlight the differences between encryption/tokenization and Dynamic Data Masking (DDM). Best practices dictate that both should be implemented in an enterprise for the most comprehensive and complete data security strategy. For the purposes of this blog, here are a few definitions:

Dynamic Data Masking (DDM) protects sensitive data when it is retrieved, based on policy, without requiring the data to be altered where it is stored persistently. Authorized users see the true data; unauthorized users see masked values in the application. No coding is required in the source application.
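
A minimal conceptual sketch of that behavior follows, with an assumed schema and assumed role names. Real DDM products, including Informatica’s, intercept SQL at the network or driver level so even this small amount of application code is unnecessary; the point here is only that the stored data never changes and masking is decided per request from the caller’s role.

```python
import sqlite3

AUTHORIZED_ROLES = {"fraud_analyst", "compliance_officer"}   # hypothetical policy

def mask_ssn(ssn: str) -> str:
    return "***-**-" + ssn[-4:]               # expose only the last four digits

def get_customer(conn: sqlite3.Connection, role: str, ssn: str):
    row = conn.execute(
        "SELECT name, ssn, address FROM customers WHERE ssn = ?", (ssn,)
    ).fetchone()
    if row is None or role in AUTHORIZED_ROLES:
        return row                            # authorized users see true values
    name, real_ssn, _address = row
    # Unauthorized users get masked values; nothing in the database changes.
    return (name, mask_ssn(real_ssn), "*** masked ***")
```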

Encryption/tokenization protects sensitive data by altering its values when stored persistently, while being able to decrypt and present the original values when requested by authorized users. The user is validated by a separate service, which then provides a decryption key. Unauthorized users see only the encrypted or tokenized values. In many cases, applications need to be altered, requiring development work.
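
For comparison, here is a minimal sketch of vault-style tokenization (hypothetical, not any particular vendor’s product): only tokens are stored persistently, and an authorization check gates access to the original values.

```python
import secrets

class TokenVault:
    """Toy token vault: the database stores only tokens; originals stay here."""

    def __init__(self):
        self._by_token = {}   # token -> original value
        self._by_value = {}   # original value -> token (keeps tokenizing idempotent)

    def tokenize(self, value: str) -> str:
        if value in self._by_value:
            return self._by_value[value]
        token = "TKN-" + secrets.token_hex(8)
        self._by_token[token] = value
        self._by_value[value] = token
        return token

    def detokenize(self, token: str, user_is_authorized: bool) -> str:
        if not user_is_authorized:
            return token      # unauthorized users only ever see the token
        return self._by_token[token]

vault = TokenVault()
stored = vault.tokenize("987-65-4329")                      # what lands in the database
print(vault.detokenize(stored, user_is_authorized=True))    # the original SSN
print(vault.detokenize(stored, user_is_authorized=False))   # still just the token
```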

Use case: Business users access PII
Tokenization: Business users work with actual SSNs and personal values in the clear (not with tokenized values). Because the data is tokenized in the database, it must be de-tokenized every time users access it, which is done by changing the application source code (imposing cost and risk) and causes a performance penalty. For example, if a user needs to retrieve information on a client with SSN = ‘987-65-4329’, the application must de-tokenize the entire tokenized SSN column to identify the correct client record – a costly operation. This is why implementation scope is limited.
DDM: Because DDM does not change the data in the database, but only masks it when it is accessed by unauthorized users, authorized users experience no performance hit and no application source-code changes are required. For example, if an authorized user retrieves information on a client with SSN = ‘987-65-4329’, the request is untouched by DDM, and since the SSN stored in the database is unchanged there is no performance penalty. If an unauthorized user retrieves the same SSN, DDM rewrites the SQL request so that the sensitive results (e.g., name, address, credit card and age) are masked, hidden or completely blocked.

Use case: Privileged infrastructure DBAs have access to the database server files
Tokenization: Personally Identifiable Information (PII) stored in the database files is tokenized, ensuring that the few administrators who have uncontrolled access to the database servers cannot see it.
DDM: PII stored in the database files remains in the clear. The few administrators who have uncontrolled access to the database servers can potentially access it.

Use case: Production support, application developers, DBAs, consultants, outsourced and offshore teams
Tokenization: These groups of users have application super-user privileges, are seen by the tokenization solution as authorized, and as such access PII in the clear.
DDM: These users are identified by DDM as unauthorized, so the PII they request is masked, hidden or blocked.

Use case: Data warehouse protection
Tokenization: Implementing tokenization on data warehouses requires tedious database changes and causes a performance penalty: (1) loading or reporting on millions of PII records requires tokenizing or de-tokenizing each record; (2) running a report with a condition on a tokenized value (e.g., SSN LIKE ‘%333’) forces de-tokenization of the entire column. Massive database configuration changes are also required to use the tokenization API, creating and maintaining hundreds of views.
DDM: No performance penalty, and no need to change reports or databases or to create views.

 

Combining both DDM and encryption/tokenization presents an opportunity to deliver complete data privacy without the need to alter the application or write any code.

Informatica works with its encryption and tokenization partners to deliver comprehensive data privacy protection in packaged applications, data warehouses and Big Data platforms such as Hadoop.


Informatica Wins Award for Best Security Software

Last night Informatica was given the Silver award for Best Security Software by Info Security. The Best Security Software was one of the most competitive categories—with 8 finalists offering technologies ranging from mobile to cloud security.
Informatica won the award for its new Cloud Data Masking solution. Since June of last year, Informatica has steadily released a series of new Cloud solutions for data security, and it is the first to offer a comprehensive, data governance-based solution for cloud data privacy. This solution addresses the full lifecycle of data privacy, including:

  •   Defining and classifying sensitive data
  •   Discovering where sensitive data lives
  •   Applying consistent data masking rules
  •   Measuring and monitoring to prove compliance

Cloud Data Masking adds to Informatica’s leading cloud integration solution for salesforce.com, which includes data synchronization, data replication, data quality, and master data management.

Why is Cloud Data Masking important?

Sensitive data is at risk of being exposed during application development and testing, where it is important to use real production data to rigorously test applications. As reported by the Ponemon Institute, a data breach costs organizations $5.5 million on average.

What does Cloud Data Masking do?

Based on Informatica’s market-leading data masking technology, Informatica’s new Cloud Data Masking enables cloud customers to secure sensitive information during the testing phase by directly masking production data used within cloud sandboxes, creating realistic-looking but de-identified data. Customers are therefore able to protect sensitive information from unintended exposure during development, test and training activities; streamline cloud projects by reducing the time it takes to mask test, training and development environments; and ensure compliance with mounting privacy regulations.

What do people do today?

Many organizations today hand the masking effort over to IT, which inevitably lengthens development cycles and delays releases. One of Informatica’s longtime customers and current partners, David Cheung of Cloud Sherpas, stated: “Many customers wait days for IT to change the sensitive or confidential data, delaying releases. For example, I was at a customer last week where the customer was waiting five days for IT to mask the sensitive data.”
Others use scripting or manual methods to mask the data. One prospect I spoke to recently said he manually altered the data but missed a few email addresses, so during a test run the company accidentally sent emails to customers, who called back demanding to know what was going on. Do you want that to happen to you?

Visit Informatica Cloud Data Masking for more information.


Ready for Your Data Security Audit?

In a recent survey of Informatica customers:
• Over 60% of companies had a security audit in the last year
• 35% of the companies had an internal security audit
• 16% of the companies had both an internal security audit and one performed by an external auditor
• In addition, many of these organizations saw another company in the same industry suffer a data breach

These results are reinforced by the discussions I had with audit and compliance IT owners from various industries. Audits are on the rise, as more customers require these audits before purchase. Compliance IT requires reports at a database or system level showing that the data has been protected, and they want to see these reports on a regular basis because data, including test data pulled from production environments, changes frequently.

Driving these audits and Informatica projects to protect data were the following top regulatory drivers (as reported by customers):
• SOX
• PCI
• PII
• PHI

These results are reinforced by the increasing use of Informatica’s regulatory and industry packs (containing pre-built rules and metadata), including those for PCI, PHI and PII. In addition to these areas, organizations I’ve spoken to are implementing projects to protect non-public information, or confidential company information. For example, last week I spoke to a company about the detailed financial information it shares as part of the data it sends to an outsourced partner. That financial information could easily be used to estimate the company’s revenues and profits for any given quarter before the information is released to the street, if it is released at all.

In this same survey, the top benefits customers said that Informatica’s solution addressed included:
• Increasing productivity by leveraging pre-built masking techniques, accelerators and purpose-built tools
• Reducing the time it took to identify and capture optimal test cases, therefore reducing overall testing time
• Reducing the risk of data breach

Are you ready for your data security audit?

For more information on Informatica’s data security solutions for non-production environments, please join us for an upcoming webinar:

http://bit.ly/W5IciG

For more information on Informatica’s data security solutions in general, please see:

http://bit.ly/PGcJkq

 


Data Privacy Still the Talk of the Town

Recently, the UK’s Parliament and the Internet conference brought together leading figures from Government, Parliament, academia and the industry to discuss and debate the most pressing policy issues facing the Internet.

As expected, data privacy and security was top of the agenda for much of the day, with a number of discussions highlighting the extent to which consumer data is being exposed to security risks and the need for the right legislation and protection to keep it safe.


Informatica Recognized By Gartner as a Leader in Data Masking and by Infosecurity for Best Security Software

Informatica was named as a leader in the 2012 Gartner Magic Quadrant for Data Masking. A couple of weeks ago, Infosecurity named Informatica as a finalist for Best Security Software for 2013.
Both the Gartner Magic Quadrant for Data Masking and Infosecurity Products Guide recognized Informatica for continued innovation:

  •  Gartner states, “The data masking portfolio has been broadening. In addition to SDM technology… the market is beginning to offer dynamic data masking (DDM)…”

Data Security from the Inside—Out

Earlier this week I met with security leaders at some of the largest organizations in the San Francisco Bay Area. They highlighted disturbing trends: in addition to an increased incidence of breaches, they are seeing increases in:

- The number of customers who want to audit their company’s security
- The number of RFPs that require information about data security
- Litigation and class action lawsuits stemming from data security breaches, rather than regulatory fines, driving their concerns

So much attention has been placed on defending the perimeter that many organizations feel they are in an arms race. Part of the problem is that it’s not clear how effective the firewalls are. While firewalls may be part of the solution, organizations are increasingly looking at how to make their applications bulletproof and centralize controls. One of the high-risk areas is systems where people have more access than they need.

For example, many organizations have created copies of production environments for test, development and training purposes. As a result, this data can be completely exposed, and its confidential aspects are at risk of being leaked intentionally or unintentionally. I spoke to a customer a couple of weeks ago who had tried to change the email addresses in their test database but missed a few. As a result, during a test run they sent emails to their customers, who called back and asked what was going on. That was when we started talking to them about a masking solution that would permanently mask the data in these environments, so they would have realistic data to test with while all sensitive details were obliterated.
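
A minimal sketch of one way to close exactly that gap, with a hypothetical schema (this is not Informatica’s persistent masking): rewrite every email address onto an undeliverable test domain in a single automated pass, so nothing depends on spotting individual records by hand and the data still looks realistic to testers.

```python
import sqlite3

def mask_emails(db_path, table="customers", column="email"):
    """Rewrite all addresses to a reserved, undeliverable domain."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(f"SELECT rowid, {column} FROM {table}").fetchall()
    for rowid, email in rows:
        if email and "@" in email:
            local = email.split("@", 1)[0]
            masked = f"{local}.{rowid}@example.invalid"   # unique and never routable
            conn.execute(f"UPDATE {table} SET {column} = ? WHERE rowid = ?",
                         (masked, rowid))
    conn.commit()
    conn.close()
```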

Another high-risk area involves certain users, for example cloud administrators, who have access to all data in the clear. As a result, these administrators can see account numbers and social security numbers that they don’t need in order to do their jobs. Masking those values would still let them see what they need to do their jobs, but it would prevent the breach of the other confidential data.

Going back to the concerns the security leaders raised: how do you prove to your customers that you have data security, especially if it’s difficult to prove the effectiveness of a firewall? This is where reports on what data was masked, and what it was masked to, come in. Yes, you can pay for cyberinsurance to cover your losses when you have a breach. But wouldn’t it be better to prevent the breaches in the first place and show how you’ve done it? Try looking at the problem from the inside out.


Informatica Educates Oracle OpenWorld 2012 Attendees On How To Balance The Big Data Balancing Act

Thousands of Oracle OpenWorld 2012 attendees visited the Informatica booth to learn how to leverage their combined investments in Oracle and Informatica technology. Informatica delivered over 40 presentations on topics ranging from cloud to data security to smart partitioning. Key Informatica executives and experts from product engineering and product management spoke with hundreds of users and answered questions on how Informatica can help them improve Oracle application performance, lower risk and costs, and reduce project timelines.


How Do You Make the Business Case for Data Privacy?

 

Given the reputation risk and the cost of security breaches, organizations know they should be implementing data privacy in all their environments, whether in production or in test and development.
But the question I often get is: I know we need better data security, but how do I prioritize these projects above other projects?
As our customers have shared the time and cost savings they achieved using our software, we have created a business value assessment that uses those benchmarks to calculate the benefits an organization would achieve by implementing a solution like Informatica Data Privacy. This business value assessment is based on best practices for managing the data privacy lifecycle and covers each phase of that lifecycle.

 

For each of these phases, we have collected how our customers have benefited and used those figures as the basis for calculating the benefits any organization would see from the Informatica solution. We’ve also used industry benchmarks to calculate risk mitigation and hardware cost savings. The following are the benefits our customers have realized, mapped to the data privacy lifecycle:

Accelerate Sensitive Data Discovery – Rapidly identify sensitive data across all legacy and packaged applications
Increase Development Productivity – Develop global masking rules more efficiently
Increase Testing Productivity – Reduce the time it takes to capture optimal test case data
Increase Quality – Use realistic data in QA and development to reduce later rework and fixes
Risk Mitigation – Avoid breaches, reducing victim notification costs and fines
Hardware Reduction – Subset and create smaller copies of production for test purposes, reducing storage costs (see the sketch after this list)
Increase Compliance Reporting Productivity – Prove compliance through automated reports on masked data
Outsourcing Savings – Because data is masked, companies can then outsource application development or support.
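
To illustrate the subsetting idea mentioned under Hardware Reduction, here is a minimal sketch with a hypothetical customers/orders schema (this is not Informatica’s subset engine): copy a small slice of the driving table plus only the child rows that reference it, so the smaller copy stays referentially consistent.

```python
import sqlite3

def build_subset(source_db, subset_db, customer_limit=1000):
    """Create a small, referentially consistent copy for test purposes."""
    conn = sqlite3.connect(source_db)
    conn.execute("ATTACH DATABASE ? AS subset", (subset_db,))
    # Driving table: a small slice of customers.
    conn.execute("CREATE TABLE subset.customers AS "
                 f"SELECT * FROM main.customers LIMIT {int(customer_limit)}")
    # Child table: only the orders that belong to the selected customers,
    # so foreign keys in the subset still resolve.
    conn.execute("CREATE TABLE subset.orders AS "
                 "SELECT o.* FROM main.orders o "
                 "JOIN subset.customers c ON o.customer_id = c.id")
    conn.commit()
    conn.close()

# Usage (hypothetical files and schema):
# build_subset("production.db", "test_subset.db", customer_limit=1000)
```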

As a result of using this type of assessment early in their project cycle, our customers have successfully made the case to prioritize these data privacy projects.
Let us know if you want to talk about the Business Value Assessment for Data Privacy, so you too can say, “I know we need to mitigate risk, and here’s how we can minimize the costs of doing so.”


Where Are the Data Security Weaknesses in Your Cloud Solution?

In a May 2012 survey by the Ponemon Institute, 66 percent of respondents said they are not confident their organization would be able to detect the loss or theft of sensitive personal information contained in systems operated by third parties, including cloud providers. In addition, the majority are not confident that their organization would be able to detect the loss or theft of sensitive personal information in their company’s production environment.

 Which aspect of data security for your cloud solution is most important?

 1. Is it to protect the data in copies of production/cloud applications used for test or training purposes?  For example, do you need to secure data in your Salesforce.com Sandbox?

2.  Is it to protect the data so that a user will see data based on her/his role, privileges, location and data privacy rules? 

3.  Is it to protect the data before it gets to the cloud?

As compliance continues to drive people to action, compliance with contractual agreements, especially for cloud infrastructure, continues to drive investment. In addition, many organizations are supporting Salesforce.com as well as packaged solutions such as Oracle E-Business Suite, PeopleSoft, SAP, and Siebel.

Of the available data protection solutions, tokenization has been widely used and is well known for supporting PCI data while preserving the format and width of a table column. But because many tokenization solutions today require creating database views or changing application source code, it has been difficult for organizations to support packaged applications that don’t allow these changes. In addition, databases and applications take a measurable performance hit when processing tokens.

What might work better is to tokenize data dynamically before it gets to the cloud: a transparent layer between the cloud and on-premise data integration would replace the sensitive data with tokens, so no additional application code would be required.
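
A minimal sketch of that transparent layer, with all names hypothetical (this is not Informatica’s or any partner’s API): an on-premise integration step swaps sensitive fields for tokens and keeps the token-to-value mapping on premise, so the record handed to the cloud application never contains the real values and the application itself needs no code changes.

```python
import secrets

SENSITIVE_FIELDS = {"ssn", "credit_card"}   # hypothetical field-level policy
_token_vault = {}                           # on-premise token -> real value store

def tokenize_record(record: dict) -> dict:
    """Return a copy of the record with sensitive fields replaced by tokens."""
    safe = dict(record)
    for field in SENSITIVE_FIELDS & record.keys():
        token = "TKN-" + secrets.token_hex(8)
        _token_vault[token] = record[field]
        safe[field] = token
    return safe

outbound = {"name": "Pat Doe", "ssn": "987-65-4329", "plan": "gold"}
payload = tokenize_record(outbound)         # this is what the cloud side receives
# e.g. cloud_client.upsert("Contact", payload)   # hypothetical cloud API call
```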

In the Ponemon survey, most respondents said the best control is to dynamically mask sensitive information based on the user’s privilege level. After dynamic masking, respondents ranked encrypting all sensitive information contained in the record as the next best option.

The strange thing is that people recognize there is a problem but are not spending accordingly. In the same Ponemon survey, 69% of organizations said they find it difficult to restrict user access to sensitive information in IT and business environments, yet only 33% said they have adequate budgets to invest in the necessary solutions to reduce the insider threat.

Is this an opportunity for you? 

Hear Larry Ponemon discuss the survey results in more detail during a CSOonline.com/Computerworld webinar, Data Privacy Challenges and Solutions: Research Findings with Ponemon Institute, on Wednesday, June 13.
