Tag Archives: test data management
I was at an IT conference a few years ago. The speaker was talking about application testing. At the beginning of his talk, he asked the audience:
“Please raise your hand if you flew here from out of town.”
Most of the audience raised their hands. The speaker then said:
“OK, now if you knew that the airplane you flew on had been tested the same way your company tests its applications, would you have still flown on that plane?”
After some uneasy chuckling, every hand went down. Not a great affirmation of the state of application testing in most IT shops.
Informatica Recognized By Gartner as a Leader in Data Masking and by Infosecurity for Best Security Software
Informatica was named as a leader in the 2012 Gartner Magic Quadrant for Data Masking. A couple of weeks ago, Infosecurity named Informatica as a finalist for Best Security Software for 2013.
Both the Gartner Magic Quadrant for Data Masking and Infosecurity Products Guide recognized Informatica for continued innovation:
- Gartner states, “The data masking portfolio has been broadening. In addition to SDM technology… the market is beginning to offer dynamic data masking (DDM)… ”
Informatica was listed as a leader in the industry’s first Gartner Magic Quadrant for Data Masking Technology. Finally, the data masking market gets a main stage role in one of the fastest growing enterprise software markets: data security. With the incredible explosion of data and the resulting number of places our personal information exists in the cybersphere, this confirmation is sorely needed as we enter 2013.
Personally Identifiable Information is under attack like never before. Two prominent institutions were recently attacked and made the news. What happened:
- A data breach at Nationwide, a major U.S. insurance company, exposed over a million policyholders to identity fraud. The stolen data included Personally Identifiable Information such as names, Social Security numbers, driver’s license numbers and birth dates. In addition to Nationwide paying for identity fraud protection for affected policyholders, the breach has raised fears that class action lawsuits will follow.
Given the reputation risk and the cost of security breaches, organizations know they should be implementing data privacy in all their environments—whether it’s in production or test and development.
But the question I often get is: I know we need better data security, but how do I prioritize these projects above other projects?
As our customers have shared the time and cost savings they achieved using our software, we have built a business value assessment that uses those benchmarks to calculate the benefits an organization would achieve by implementing a solution like Informatica Data Privacy. The assessment is based on best practices for managing the data privacy lifecycle and is organized around the phases of that lifecycle.
For each phase, we have collected figures on how our customers have benefited and used them as the basis for calculating benefits for any organization using the Informatica solution. We have also used industry benchmarks to calculate risk mitigation and hardware cost savings. The following are the benefits our customers have realized, mapped to the data privacy lifecycle:
- Accelerate Sensitive Data Discovery – Rapidly identify sensitive data across all legacy and packaged applications
- Increase Development Productivity – Develop global masking rules more efficiently
- Increase Testing Productivity – Reduce the time it takes to capture optimal test case data
- Increase Quality – Use realistic data in QA and development to reduce later rework and fixes
- Risk Mitigation – Avoid breaches, reducing victim notification costs and fines
- Hardware Reduction – Subset and create smaller copies of production for test purposes, reducing storage costs
- Increase Compliance Reporting Productivity – Prove compliance through automated reports on masked data
- Outsourcing Savings – Because data is masked, companies can outsource application development or support
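A rough sense of how an assessment like this turns benefit categories into dollar figures can be sketched in a few lines of Python. All rates and baseline figures below are invented for illustration; they are not Informatica benchmarks, and the sketch covers only two of the categories above.

```python
# Hypothetical back-of-the-envelope model for two of the benefit categories
# described above. Every rate and dollar figure here is an illustrative
# assumption, not a vendor benchmark.

def estimate_annual_benefit(
    test_cycle_cost,        # annual spend on building/refreshing test data
    storage_cost_per_tb,    # annual cost of 1 TB of non-production storage
    production_tb,          # size of the production data set in TB
    n_copies,               # number of full non-production copies today
    subset_ratio=0.2,       # assumed fraction of production kept after subsetting
    productivity_gain=0.3,  # assumed reduction in test-data preparation effort
):
    """Return a dict of estimated annual savings per benefit category."""
    storage_today = production_tb * n_copies * storage_cost_per_tb
    storage_after = production_tb * subset_ratio * n_copies * storage_cost_per_tb
    return {
        "testing_productivity": test_cycle_cost * productivity_gain,
        "hardware_reduction": storage_today - storage_after,
    }

savings = estimate_annual_benefit(
    test_cycle_cost=500_000,
    storage_cost_per_tb=3_000,
    production_tb=10,
    n_copies=4,
)
print(savings)  # {'testing_productivity': 150000.0, 'hardware_reduction': 96000.0}
```

Plugging an organization's own figures into a model like this is what lets a project team state the benefit in dollars rather than in general terms.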
As a result of using this type of assessment early in their project cycle, our customers have successfully made the case to prioritize these data privacy projects.
Let us know if you want to talk about the Business Value Assessment for Data Privacy— so you too can say, “I know we need to mitigate risk—and here’s how we can minimize the costs of doing so.”
In a May 2012 survey by the Ponemon Institute, 66 percent of respondents said they are not confident their organization would be able to detect the loss or theft of sensitive personal information contained in systems operated by third parties, including cloud providers. In addition, the majority are not confident that their organization would be able to detect the loss or theft of sensitive personal information in their company’s production environment.
Which aspect of data security for your cloud solution is most important?
1. Is it to protect the data in copies of production/cloud applications used for test or training purposes? For example, do you need to secure data in your Salesforce.com Sandbox?
2. Is it to protect the data so that a user will see data based on her/his role, privileges, location and data privacy rules?
3. Is it to protect the data before it gets to the cloud?
Compliance continues to drive people to action, and compliance with contractual agreements, especially for cloud infrastructure, continues to drive investment. In addition, many organizations are supporting Salesforce.com as well as packaged solutions such as Oracle E-Business Suite, PeopleSoft, SAP, and Siebel.
Of the available data protection solutions, tokenization has been used and is well known for supporting PCI data and preserving the format and width of a table column. But because many tokenization solutions today require creating database views or changing application source code, it has been difficult for organizations to support packaged applications that don’t allow these changes. In addition, databases and applications take a measurable performance hit to process tokens.
What might work better is to dynamically tokenize data before it gets to the cloud: a transparent layer between the cloud and on-premise data integration would replace the sensitive data with tokens. In this way, no additional application code would be required.
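As a sketch of what such a transparent tokenization layer might do, the snippet below keeps an on-premise vault and substitutes format-preserving tokens before any record leaves for the cloud. The class, its methods, and the character-class rules are illustrative assumptions, not any particular product's API.

```python
# Minimal sketch of on-premise tokenization: sensitive values are swapped for
# format-preserving tokens, and only tokens ever travel to the cloud. A real
# deployment would persist the vault securely and handle token collisions.
import secrets
import string

class TokenVault:
    """Illustrative in-memory token vault (names are hypothetical)."""
    def __init__(self):
        self._token_to_value = {}
        self._value_to_token = {}

    def tokenize(self, value: str) -> str:
        # Reuse the same token for repeated values so joins in the cloud
        # still line up across tables.
        if value in self._value_to_token:
            return self._value_to_token[value]
        # Preserve width and per-character class: digits stay digits,
        # letters stay letters, punctuation passes through unchanged.
        token = "".join(
            secrets.choice(string.digits) if ch.isdigit()
            else secrets.choice(string.ascii_uppercase) if ch.isalpha()
            else ch
            for ch in value
        )
        self._token_to_value[token] = value
        self._value_to_token[value] = token
        return token

    def detokenize(self, token: str) -> str:
        # Only callable on-premise, where the vault lives.
        return self._token_to_value[token]

vault = TokenVault()
ssn = "123-45-6789"
token = vault.tokenize(ssn)          # e.g. "804-31-5527"; same width, same dashes
original = vault.detokenize(token)   # round-trips back to "123-45-6789"
```

Because the token has the same width and format as the original column value, packaged applications that validate field formats keep working without source-code changes, which is exactly the constraint the paragraph above describes.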
In the Ponemon survey, most respondents said the best control is to dynamically mask sensitive information based on the user’s privilege level; encrypting all sensitive information contained in the record was ranked next.
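Dynamic masking of this kind can be illustrated with a small sketch: the stored record is never changed, and fields are masked at read time according to the requester's role. The role names and masking rules here are assumptions for illustration only.

```python
# Illustrative dynamic data masking: the database row is untouched; fields
# are masked on the way out based on the requesting user's role. Roles and
# rules below are invented for this example.

MASKED_FIELDS_BY_ROLE = {
    "dba": {"ssn", "birth_date"},   # administrators do not need to see PII
    "support": {"ssn"},             # support staff see partial records
    "privacy_officer": set(),       # full access, nothing masked
}

def mask_record(record: dict, role: str) -> dict:
    """Return a copy of the record with fields masked per the user's role."""
    # Unknown roles get everything masked (fail closed).
    hidden = MASKED_FIELDS_BY_ROLE.get(role, set(record))
    out = {}
    for field, value in record.items():
        if field in hidden:
            text = str(value)
            # Keep the last 4 characters so the value stays recognizable;
            # values shorter than 4 characters pass through in this sketch.
            out[field] = "*" * max(len(text) - 4, 0) + text[-4:]
        else:
            out[field] = value
    return out

record = {"name": "Jane Doe", "ssn": "123-45-6789", "birth_date": "1970-01-01"}
print(mask_record(record, "support")["ssn"])  # *******6789
```

The appeal the survey respondents point to is visible here: one read-time rule set enforces privilege levels everywhere, without rewriting the applications that query the data.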
The strange thing is that people recognize there is a problem but are not spending accordingly. In the same survey from Ponemon, 69% of organizations find it difficult to restrict user access to sensitive information in IT and business environments. However, only 33% say they have adequate budgets to invest in the necessary solutions to reduce the insider threat.
Is this an opportunity for you?
Hear Larry Ponemon discuss the survey results in more detail during a CSOonline.com/Computerworld webinar, Data Privacy Challenges and Solutions: Research Findings with Ponemon Institute, on Wednesday, June 13.
ROI Tool To Help Make The Business Case For Database Archiving, Application Retirement, Test Data Management, And Data Masking
Though the benefits of containing the size of your databases by archiving seem obvious in terms of saving costs and improving performance, quantifying those benefits in terms of dollar savings requires more thought. The same is true of the costs that can be eliminated by retiring redundant legacy applications. Some of the savings may come from hard dollar costs such as:
- Backup devices
- Maintenance contracts
- Software licenses
Recently at Informatica World 2010 in Washington, DC, Richard Clarke was a featured speaker during one of the general sessions. He is the former counterterrorism czar, having served multiple presidents in the White House and worked for the Pentagon and the Intelligence Community, and he is currently Chairman of Good Harbor Consulting Services, LLC, a 360° security risk management firm. There was no one better suited to discuss corporate information security and risk management at an event whose entire theme was Beyond Boundaries.
I recently visited a client running multiple SAP applications with three non-production copies per environment: a separate copy each for test, development, and training. When asked what data they were using for the non-production copies, they said they preferred to use data from production because they were guaranteed to have the latest, up-to-date information, which should eliminate any testing issues associated with the data itself.
Test data sets need to be created to validate or confirm specific use cases during the testing and development phases of packaged or custom database applications. Most companies use full copies of production data to seed test data sets. Using live, up-to-date data is preferred by Quality Assurance teams because it increases confidence in the testing results. But using full live data sets raises two key issues: increased cost and increased security risk.
Full Copies of Production Data Sets Increase Cost
As data volumes grow, so does each copy of the data used in each test environment, increasing the cost of the infrastructure required to store it, making it harder to maintain performance, and lengthening testing cycles. According to the Enterprise Strategy Group, the number of secondary copies of production data sets required for development, testing and training is four at a minimum. Multiply the size of the production data set by the number of copies to estimate the total cost of ownership. With larger data sets, queries and reports take longer to complete. Many times, functional tests only require a small segment of data to validate a test, so subsets of test data would be adequate for most testing scenarios.
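The subsetting idea can be sketched as follows: choose a driving slice of parent rows, then keep only the child rows that reference them, so the subset stays referentially intact. The table names and selection criterion below are invented for illustration; commercial test data management tools drive this from database metadata rather than hand-written code.

```python
# Minimal sketch of referentially intact subsetting: select a slice of parent
# rows, then pull only the child rows that reference them. Tables and columns
# are invented for this example.

customers = [
    {"customer_id": 1, "region": "EMEA"},
    {"customer_id": 2, "region": "AMER"},
    {"customer_id": 3, "region": "EMEA"},
]
orders = [
    {"order_id": 10, "customer_id": 1},
    {"order_id": 11, "customer_id": 2},
    {"order_id": 12, "customer_id": 3},
]

# Driving criterion: one region is often enough to exercise a functional test.
subset_customers = [c for c in customers if c["region"] == "EMEA"]
keep_ids = {c["customer_id"] for c in subset_customers}

# Preserve referential integrity: keep only orders whose parent survived.
subset_orders = [o for o in orders if o["customer_id"] in keep_ids]

print(len(subset_customers), len(subset_orders))  # 2 2
```

Even in this toy form, the subset is a fraction of the full copy yet every foreign key still resolves, which is why subsets can replace full production copies for most functional tests.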