Tag Archives: data masking
In a May 2012 survey by the Ponemon Institute, 66 percent of respondents said they are not confident their organization would be able to detect the loss or theft of sensitive personal information contained in systems operated by third parties, including cloud providers. In addition, the majority are not confident that their organization would be able to detect the loss or theft of sensitive personal information in their company’s production environment.
Which aspect of data security for your cloud solution is most important?
1. Is it to protect the data in copies of production/cloud applications used for test or training purposes? For example, do you need to secure data in your Salesforce.com Sandbox?
2. Is it to protect the data so that a user will see data based on her/his role, privileges, location and data privacy rules?
3. Is it to protect the data before it gets to the cloud?
Compliance continues to drive people to action, and compliance with contractual agreements, especially those covering cloud infrastructure, continues to drive investment. In addition, many organizations are supporting Salesforce.com as well as packaged solutions such as Oracle E-Business Suite, PeopleSoft, SAP, and Siebel.
Of the available data protection solutions, tokenization has been used and is well known for supporting PCI data and preserving the format and width of a table column. But because many tokenization solutions today require creating database views or changing application source code, it has been difficult for organizations to support packaged applications that don’t allow these changes. In addition, databases and applications take a measurable performance hit to process tokens.
What might work better is to tokenize data dynamically before it reaches the cloud: a transparent layer would sit between the cloud and on-premise data integration and replace the sensitive data with tokens. That way, no additional application code would be required.
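To make the idea concrete, here is a minimal sketch of format-preserving tokenization with an in-memory token vault. The `TokenVault` class and its methods are illustrative assumptions, not any vendor's API; real tokenization products use a secured, persistent vault (or vaultless cryptographic schemes), not a Python dict.

```python
import secrets
import string

class TokenVault:
    """Toy in-memory vault mapping sensitive values to tokens that
    preserve format and width, so table columns and application
    screens need no changes."""

    def __init__(self):
        self._to_token = {}
        self._to_value = {}

    def _make_token(self, value):
        # Preserve format: digits stay digits, letters stay letters,
        # separators (dashes, spaces) pass through unchanged.
        out = []
        for ch in value:
            if ch.isdigit():
                out.append(secrets.choice(string.digits))
            elif ch.isalpha():
                out.append(secrets.choice(string.ascii_uppercase))
            else:
                out.append(ch)
        return "".join(out)

    def tokenize(self, value):
        # Return the same token for repeated values so joins and
        # lookups on the tokenized column still work.
        if value in self._to_token:
            return self._to_token[value]
        token = self._make_token(value)
        while token in self._to_value:  # avoid collisions
            token = self._make_token(value)
        self._to_token[value] = token
        self._to_value[token] = value
        return token

    def detokenize(self, token):
        # Only the trusted, on-premise side ever calls this.
        return self._to_value[token]

vault = TokenVault()
card = "4111-1111-1111-1234"
token = vault.tokenize(card)
assert len(token) == len(card)          # same width
assert vault.detokenize(token) == card  # reversible on-premise
```

The point of the transparent layer is that only tokens ever leave for the cloud, while `detokenize` stays behind the firewall.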
In the Ponemon survey, most respondents said the best control is to dynamically mask sensitive information based on the user’s privilege level. The next most popular option was encrypting all sensitive information contained in the record.
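Dynamic masking applies rules on read, based on who is asking, without ever changing the stored data. The sketch below is a simplified illustration of that idea; the role names, columns, and `MASKING_RULES` mapping are hypothetical, and real products intercept queries at the database or network layer rather than in application code.

```python
def mask_ssn(value):
    """Show only the last four digits, e.g. XXX-XX-6789."""
    return "XXX-XX-" + value[-4:]

# Hypothetical role-to-rule mapping: each role gets per-column
# masking functions; columns not listed are returned unmasked.
MASKING_RULES = {
    "dba":        {"ssn": mask_ssn, "salary": lambda v: "*****"},
    "analyst":    {"ssn": lambda v: "XXX-XX-XXXX"},
    "hr_manager": {},  # full access
}

def apply_masking(record, role):
    """Return a masked copy of the record for the given role.
    The underlying record is never modified -- masking happens
    dynamically at read time."""
    rules = MASKING_RULES.get(role, {})
    return {col: rules[col](val) if col in rules else val
            for col, val in record.items()}

employee = {"name": "Pat Lee", "ssn": "123-45-6789", "salary": 98000}
print(apply_masking(employee, "analyst"))
# {'name': 'Pat Lee', 'ssn': 'XXX-XX-XXXX', 'salary': 98000}
```

Note that the DBA role sees partially masked data (enough to administer, not enough to steal), which is exactly the privileged-insider scenario the survey highlights.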
The strange thing is that people recognize there is a problem but are not spending accordingly. In the same survey from Ponemon, 69% of organizations find it difficult to restrict user access to sensitive information in IT and business environments. However, only 33% say they have adequate budgets to invest in the necessary solutions to reduce the insider threat.
Is this an opportunity for you?
Hear Larry Ponemon discuss the survey results in more detail during a CSOonline.com/Computerworld webinar, Data Privacy Challenges and Solutions: Research Findings with Ponemon Institute, on Wednesday, June 13.
In a May 2012 report just released by the Ponemon Institute, 69 percent of organizations find it difficult to restrict user access to sensitive information in IT and business environments. On top of that, 66 percent say their organizations find it difficult to comply with privacy and data protection regulations. So organizations are finding it hard to keep up with new regulations at the same time they are unable to secure data from internal users. It’s no wonder that in this same report, 50 percent say that data has been compromised or stolen by malicious insiders such as privileged users.
It’s hard to miss data privacy in the headlines these days. Banks and insurance companies have not only had their customer information compromised, but they also need to keep up with changing privacy regulations (PCI DSS, GLBA, the EU Data Protection Directive, US privacy laws)—or be fined. The impact is staggering—and costly. For example, last year Citigroup had the information of more than 200,000 bank cardholders compromised, and HSBC faced $5M in fines for inadequate data security.
But personal information is not the only type of data that needs to be protected. We’ve spoken to our customers about the need to protect sensitive information such as a client’s financial details, revenues, and purchasing and pricing information. In addition, I’ve spoken to organizations that are looking to protect sensitive information across business units, so that one business unit has restricted access to another business unit’s data.
It seems like every day a new data breach splashes across the news. As consumers, patients, customers and social networkers, many of us have a plethora of information stored in various databases well outside our control. Data security officers, DBAs and other security specialists continue to do their best to educate, protect and anticipate both internal and external threats. But the breaches continue, and so do their associated costs. There are many technologies available, from encryption to tokenization to database activity monitoring (DAM) to data loss prevention (DLP).
Informatica just released a new option to the mix: dynamic data masking. The technology came into the company through the acquisition of ActiveBase. Since then I’ve had a number of people ask me if Informatica Dynamic Data Masking will complement or replace an organization’s existing data security technologies.
Everyone is worried about data security and privacy, as they should be; for data to be trusted, users and management need confidence not just that the data is correct, but also that it is secure and that access is permitted only in controlled situations. There is no shortage of security disaster stories, but I’m not worried about production data, since it is at the heart of application management disciplines which, while still not perfect, have had 50 years to mature. This perspective was stated succinctly by Ronald Reagan when he spoke about the economy and said, “I am not worried about the deficit. It is big enough to take care of itself.”
ROI Tool To Help Make The Business Case For Database Archiving, Application Retirement, Test Data Management, And Data Masking
Though the benefits of containing the size of your databases by archiving seem obvious in terms of saving costs and improving performance, quantifying those benefits in dollar savings requires more thought. The same is true for the costs that can be eliminated by retiring redundant legacy applications. Some of the savings may come from hard-dollar costs such as:
- Backup devices
- Maintenance contracts
- Software licenses
Recently at Informatica World 2010 in Washington, DC, Richard Clarke was a featured speaker during one of the general sessions. He is the former Counterterrorism Czar, having served under multiple presidents in the White House, the Pentagon, and the Intelligence Community, and is currently the Chairman of Good Harbor Consulting Services, LLC, a 360° security risk management firm. There was no one better suited to discuss corporate information security and risk management at an event whose entire theme was Beyond Boundaries.
I recently visited a client running multiple SAP applications with three non-production copies per environment: separate copies for test, development, and training. When asked what data they were using for the non-production copies, they said they preferred to use data from production because they were guaranteed to have the latest information, which should eliminate any testing issues associated with the data itself.
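The trouble with that approach is that every tester and trainer now sees real customer data. A common alternative is static (persistent) masking applied while copying production data to non-production environments. Below is a minimal sketch of the idea, assuming deterministic pseudonymization so that the same production value always maps to the same fake value (preserving referential integrity across tables); the field names and salt are illustrative, not a real client's schema.

```python
import hashlib

def pseudonymize(value, salt="test-env-1"):
    """Deterministic pseudonym: the same input plus salt always
    yields the same fake value, so foreign-key relationships in
    the test copy still line up across tables."""
    digest = hashlib.sha256((salt + value).encode()).hexdigest()[:8]
    return f"user_{digest}"

production_rows = [
    {"customer_id": 1, "email": "alice@example.com"},
    {"customer_id": 2, "email": "bob@example.com"},
]

# Build the non-production copy: structure and volume match
# production, but the sensitive column is replaced on the way in.
test_rows = [
    {**row, "email": pseudonymize(row["email"]) + "@test.invalid"}
    for row in production_rows
]
```

The test copy keeps the shape and volume characteristics of production data, which is what the client actually wanted, without exposing the real values.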