Tag Archives: test data management
Informatica recently hosted a webinar with Cognizant, who shared how they streamline test data management internally with Informatica Test Data Management and pass the benefits on to their customers. Proclaimed the world’s largest Quality Engineering and Assurance (QE&A) service provider, Cognizant has over 400 customers, employs thousands of testers, and is considered a thought leader in the testing practice.
We polled over 100 attendees on their top test data management challenges, given the complexity of their data and systems and the need to protect their clients’ sensitive data. Here are the results of that poll:
It was not surprising to see that generating test data sets and securing sensitive data in non-production environments were tied as the top two challenges, with data integrity/synchronization a very close third.
Cognizant, together with Informatica, has been evolving its test data management offering to focus not only on securing sensitive data but also on improving testing efficiency by identifying, provisioning, and resetting test data, tasks that consume as much as 40% of testing cycle time. Key components of this next-generation test data management platform include:
Sensitive Data Discovery – an integrated, automated process that searches data sets for exposed sensitive data. Sensitive data often resides in test copies unbeknownst to auditors. Once located, the data can be masked in non-production copies.
Persistent Data Masking – masks sensitive data in-flight while cloning data from production or in-place on a gold copy. Data formats are preserved while original values are completely protected.
Data Privacy Compliance Validation – auditors want proof that data has in fact been protected, so the ability to validate and report on data privacy compliance becomes critical.
Test Data Management – in addition to creating test data subsets, clients need to synthetically generate test data sets so that data is tailored to each test case, reducing data-related defects. Also, multiple testers often work in the same environment and may clobber each other’s test data sets, so the ability to reset test data becomes a key requirement for improving efficiency.
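The discovery step described above can be sketched in a few lines. This is a toy illustration, not Informatica’s implementation: the two patterns and the row format are hypothetical, and a real discovery tool ships far richer rule dictionaries and profiling logic.

```python
import re

# Hypothetical patterns; real discovery tools use much larger dictionaries.
PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def scan_rows(rows):
    """Return (row_index, column, category) hits for exposed sensitive values."""
    hits = []
    for i, row in enumerate(rows):
        for col, value in row.items():
            for category, pattern in PATTERNS.items():
                if pattern.search(str(value)):
                    hits.append((i, col, category))
    return hits

rows = [{"name": "Ann", "note": "SSN 123-45-6789 on file"},
        {"name": "Bo", "note": "contact bo@example.com"}]
print(scan_rows(rows))  # [(0, 'note', 'ssn'), (1, 'note', 'email')]
```

Because discovery emits (row, column, category) hits, that same metadata can drive which columns get masked in the non-production copies.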
Figure 2 Next Generation Test Data Management
When asked what tools or services they had deployed, 78% said in-house developed scripts and utilities, an incredibly time-consuming approach with limited repeatability. Almost half of respondents had deployed data masking.
Informatica and Cognizant are leading the way in establishing a new standard for test data management by incorporating test data generation, data masking, and the ability to refresh or reset test data sets. For more information, check out Cognizant’s offering based on Informatica, TDMaxim, and the white paper Transforming Test Data Management for Increased Business Value.
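As a rough sketch of the kind of format-preserving masking described earlier (a toy stand-in, not Informatica’s algorithm): hashing the original value with a secret and mapping the digest onto the same character layout keeps formats intact while destroying the original values, and because the mapping is deterministic, the same input always masks to the same output, so referential integrity across tables survives.

```python
import hashlib

def mask_digits(value, secret="demo-secret"):
    """Deterministically replace each digit in `value`, preserving its
    format (dashes, spacing, length). `secret` is a hypothetical key."""
    digest = hashlib.sha256((secret + value).encode()).hexdigest()
    out, i = [], 0
    for ch in value:
        if ch.isdigit():
            # Map successive hex digits of the digest onto decimal digits.
            out.append(str(int(digest[i % len(digest)], 16) % 10))
            i += 1
        else:
            out.append(ch)  # keep separators so the format is preserved
    return "".join(out)

masked = mask_digits("123-45-6789")
print(masked)                                  # same XXX-XX-XXXX shape
print(masked == mask_digits("123-45-6789"))   # deterministic: True
```

Note this sketch is not reversible and not collision-audited; production masking engines add rule management, consistency across databases, and validation.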
I was at an IT conference a few years ago. The speaker was talking about application testing. At the beginning of his talk, he asked the audience:
“Please raise your hand if you flew here from out of town.”
Most of the audience raised their hands. The speaker then said:
“OK, now if you knew that the airplane you flew on had been tested the same way your company tests its applications, would you have still flown on that plane?”
After some uneasy chuckling, every hand went down. Not a great affirmation of the state of application testing in most IT shops.
Informatica Recognized By Gartner as a Leader in Data Masking and by Infosecurity for Best Security Software
Informatica was named as a leader in the 2012 Gartner Magic Quadrant for Data Masking. A couple of weeks ago, Infosecurity named Informatica as a finalist for Best Security Software for 2013.
Both the Gartner Magic Quadrant for Data Masking and Infosecurity Products Guide recognized Informatica for continued innovation:
- Gartner states, “The data masking portfolio has been broadening. In addition to SDM technology… the market is beginning to offer dynamic data masking (DDM)…”
Informatica was listed as a leader in the industry’s first Gartner Magic Quadrant for Data Masking Technology. Finally, the data masking market gets a main stage role in one of the fastest growing enterprise software markets – data security. With the incredible explosion of data and the resulting number of places our personal information exists in the cybersphere, this confirmation is desperately needed as we enter 2013.
Personally identifiable information is under attack like never before. Two prominent institutions were recently in the news after being attacked. What happened:
- A data breach at Nationwide, a major U.S. insurance company, exposed over a million policyholders to identity fraud. The stolen data included personally identifiable information such as names, Social Security numbers, driver’s license numbers, and birth dates. In addition to Nationwide paying for million-dollar identity fraud protection for policyholders, the breach is creating fears that class action lawsuits will follow.
Given the reputation risk and the cost of security breaches, organizations know they should be implementing data privacy in all their environments—whether it’s in production or test and development.
But the question I often get is: I know we need better data security, but how do I prioritize these projects above other projects?
As our customers have shared with us the time and cost savings they achieved using our software, we have created a business value assessment that uses those benchmarks to calculate the benefits organizations would achieve by implementing a solution like Informatica Data Privacy. This business value assessment is based on the best practices for managing the data privacy lifecycle and includes the following phases seen below.
For each of these phases, we have collected how our customers have benefited and used those figures as the basis for calculating benefits for any organization using the Informatica solution. We have also used industry benchmarks to calculate risk mitigation and hardware cost savings. The benefits our customers have realized, mapped to the data privacy lifecycle, are:
Accelerate Sensitive Data Discovery – Rapidly identify sensitive data across all legacy and packaged applications
Increase Development Productivity – Develop global masking rules more efficiently
Increase Testing Productivity – Reduce the time it takes to capture optimal test case data
Increase Quality – Use realistic data in QA and development to reduce later rework and fixes
Risk Mitigation – Avoid breaches and reduce victim notification costs and fines
Hardware Reduction – Subset and create smaller copies of production for test purposes, reducing storage costs
Increase Compliance Reporting Productivity – Prove compliance through automated reports on masked data
Outsourcing Savings – Because data is masked, companies can then outsource application development or support.
As a result of using this type of assessment early in their project cycle, our customers have successfully made the case to prioritize these data privacy projects.
Let us know if you want to talk about the Business Value Assessment for Data Privacy— so you too can say, “I know we need to mitigate risk—and here’s how we can minimize the costs of doing so.”
In a May 2012 survey by the Ponemon Institute, 66 percent said they are not confident their organization would be able to detect the loss or theft of sensitive personal information contained in systems operated by third parties, including cloud providers. In addition, the majority are not confident that their organization would be able to detect the loss or theft of sensitive personal information in their company’s production environment.
Which aspect of data security for your cloud solution is most important?
1. Is it to protect the data in copies of production/cloud applications used for test or training purposes? For example, do you need to secure data in your Salesforce.com Sandbox?
2. Is it to protect the data so that a user will see data based on her/his role, privileges, location and data privacy rules?
3. Is it to protect the data before it gets to the cloud?
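The second aspect above, showing each user only what their role permits, can be illustrated with a toy dynamic masking sketch. The roles and rules here are hypothetical placeholders, not a real DDM policy language: the stored value never changes; only what each caller sees does.

```python
# Hypothetical role-based masking rules: the value returned depends on the
# caller's role, not on what is stored in the database.
RULES = {
    "support": lambda ssn: "***-**-" + ssn[-4:],  # partial mask: last 4 visible
    "analyst": lambda ssn: "***-**-****",         # full mask
    "auditor": lambda ssn: ssn,                   # cleartext for audit review
}

def read_ssn(stored_ssn, role):
    """Apply the masking rule for `role`; unknown roles default to full mask."""
    rule = RULES.get(role, lambda _: "***-**-****")
    return rule(stored_ssn)

print(read_ssn("123-45-6789", "support"))  # ***-**-6789
print(read_ssn("123-45-6789", "analyst"))  # ***-**-****
```

In a real dynamic data masking product this decision is made in a proxy or database layer, so no application code changes are needed.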
Compliance continues to drive action; in particular, compliance with contractual agreements for cloud infrastructure continues to drive investment. In addition, many organizations support Salesforce.com as well as packaged solutions such as Oracle E-Business Suite, PeopleSoft, SAP, and Siebel.
Of the available data protection solutions, tokenization is well known for supporting PCI data while preserving the format and width of a table column. But because many tokenization solutions require creating database views or changing application source code, organizations have found it difficult to use them with packaged applications that do not allow such changes. In addition, databases and applications take a measurable performance hit processing tokens.
What might work better is to tokenize data dynamically before it reaches the cloud: a transparent layer between the cloud and on-premise data integration would replace sensitive data with tokens, so no additional application code would be required.
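A minimal sketch of that transparent tokenization layer, assuming a simple on-premise vault (the class and the SSN-shaped token format are illustrative; a production system must also handle token collisions, persistence, and access control):

```python
import secrets

class TokenVault:
    """Toy on-premise vault: sensitive values are swapped for random,
    format-preserving tokens before data leaves for the cloud; the
    originals are recoverable only by asking the vault locally."""

    def __init__(self):
        self._to_token = {}
        self._to_value = {}

    def tokenize(self, value):
        if value in self._to_token:       # reuse the token for a repeat value
            return self._to_token[value]
        # Random digits matching a hypothetical SSN layout (collisions are
        # ignored in this sketch).
        token = "-".join(str(secrets.randbelow(10**n)).zfill(n)
                         for n in (3, 2, 4))
        self._to_token[value] = token
        self._to_value[token] = value
        return token

    def detokenize(self, token):
        return self._to_value[token]

vault = TokenVault()
t = vault.tokenize("123-45-6789")   # this token, not the SSN, goes to the cloud
print(vault.detokenize(t))          # original recovered on-premise
```

Because the token has the same shape and width as the original column, the cloud application stores and displays it without any schema or code changes.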
In the Ponemon survey, most respondents said the best control is to dynamically mask sensitive information based on the user’s privilege level, followed by encrypting all sensitive information contained in the record.
The strange thing is that people recognize there is a problem but are not spending accordingly. In the same survey from Ponemon, 69% of organizations find it difficult to restrict user access to sensitive information in IT and business environments. However, only 33% say they have adequate budgets to invest in the necessary solutions to reduce the insider threat.
Is this an opportunity for you?
Hear Larry Ponemon discuss the survey results in more detail during a CSOonline.com/Computerworld webinar, Data Privacy Challenges and Solutions: Research Findings with Ponemon Institute, on Wednesday, June 13.
ROI Tool To Help Make The Business Case For Database Archiving, Application Retirement, Test Data Management, And Data Masking
Though the benefits of containing the size of your databases by archiving seem obvious in terms of cost savings and improved performance, quantifying those benefits in dollars requires more thought. The same is true for the costs that can be eliminated by retiring redundant legacy applications. Some of the savings come from hard dollar costs such as:
– Backup devices
– Maintenance contracts
– Software licenses
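The arithmetic behind such a hard-dollar estimate is simple; the figures below are made-up placeholders for illustration, not benchmarks from the ROI tool.

```python
# Illustrative only: every figure here is a hypothetical placeholder.
annual_costs = {
    "backup_devices": 40_000,
    "maintenance_contracts": 120_000,
    "software_licenses": 250_000,
}

# Assume some fraction of cost remains after retirement/archiving
# (e.g., read-only access to retained data).
retained_fraction = 0.10

savings = sum(annual_costs.values()) * (1 - retained_fraction)
print(f"Estimated annual savings: ${savings:,.0f}")  # $369,000
```

An actual business case would replace the placeholders with your own contract and storage figures, then add the softer savings (testing productivity, risk mitigation) on top.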
Recently at Informatica World 2010 in Washington, DC, Richard Clarke was a featured speaker during one of the general sessions. The former counterterrorism czar served multiple presidents in the White House, worked for the Pentagon and the intelligence community, and is currently chairman of Good Harbor Consulting Services, LLC, a 360° security risk management firm. There was no one better suited to discuss corporate information security and risk management at an event whose theme was Beyond Boundaries.
I recently visited a client running multiple SAP applications with three non-production copies per environment: separate copies for test, development, and training. When asked what data they used for these copies, they said they preferred production data because it guaranteed the latest information, which should eliminate any testing issues associated with the data itself.