Tag Archives: data security

What types of data need protecting?

In my first article on the topic of citizens’ digital health and safety, we looked at the states’ desire to keep their citizens healthy and safe, and at the various laws and regulations they have in place around data breaches and losses. The size and scale of the problem, together with some ideas for effective risk mitigation, are covered in this whitepaper.

The cost and frequency of data breaches continue to rise

Let’s now start delving a little deeper into the situation states are faced with. It’s pretty obvious that citizen data that enables an individual to be identified (PII) needs to be protected. We immediately think of the production data: data that is used in integrated eligibility systems; in health insurance exchanges; in data warehouses and so on. In some ways the production data is the least of our problems; our research shows that the average state has around 10 to 12 full copies of data for non-production (development, test, user acceptance and so on) purposes. This data tends to be much more vulnerable because it is widespread and used by a wide variety of people – often subcontractors or outsourcers, and often the content of the data is not well understood.

Obviously production systems need access to real production data (I’ll cover how best to protect that in the next issue); non-production systems of every sort, on the other hand, do not. Non-production systems most often need realistic, but not real, data and realistic, but not real, data volumes (except perhaps the performance/stress/throughput testing system). What needs to be done? A three-point risk remediation plan is a good place to start.

1. Understand the non-production data. Sophisticated data and schema profiling, combined with NLP (Natural Language Processing) techniques, helps identify previously unrealized PII that needs protecting.
2. Permanently mask the PII so that it is no longer the real data, yet realistic enough for non-production uses, and make sure the same masking is applied to an attribute’s values wherever they appear across multiple tables/files.
3. Subset the data to reduce data volumes. This limits the size of the risk and also has positive effects on performance, run-times, backups and so on.
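A minimal sketch of steps 2 and 3 above, assuming a hypothetical customer table (the function names, key, and masking scheme are illustrative, not any vendor’s implementation): deterministic masking keeps an attribute’s masked value consistent wherever it appears, so joins across tables still line up, and subsetting shrinks the copy handed to test teams.

```python
import hashlib

def mask_ssn(ssn: str, secret: str = "demo-key") -> str:
    """Deterministically map an SSN to a fake but realistic one.
    The same input always yields the same output, so the masked
    column still joins consistently across tables."""
    digest = hashlib.sha256((secret + ssn).encode()).hexdigest()
    digits = str(int(digest, 16))[-9:]
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"

def subset(rows, fraction=0.1):
    """Keep a deterministic slice of rows to shrink non-production volumes."""
    step = max(1, round(1 / fraction))
    return rows[::step]

# A hypothetical customer table: mask, then subset, before handing to testers
customers = [{"id": i, "ssn": f"{100000000 + i:09d}"} for i in range(100)]
test_copy = [{**r, "ssn": mask_ssn(r["ssn"])} for r in subset(customers)]
```

Because the masking is keyed and deterministic rather than random, re-running it on a refreshed copy produces the same fake values, which keeps test cases stable.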

Gartner has just published its 2013 Magic Quadrant for Data Masking, which covers both what Gartner calls static (i.e. permanent or persistent) masking and dynamic masking (more on dynamic masking in the next issue). As usual, the MQ gives a good overview of the issues behind the technology as well as a review of the position, strengths and weaknesses of the leading vendors.

It is (or at least should be) imperative that, from the top down, state governments realize the importance and vulnerability of their citizens’ data and put in place a non-partisan plan to prevent future breaches. As the reader might imagine, for any such plan to succeed it needs a combination of cultural and organizational change (getting people to care) and the right technology; together these will greatly reduce the risk. In the next and final issue on this topic we will look at the vulnerabilities of production data, and what can be done to dramatically increase its privacy and security.

Posted in Application ILM, Data Archiving, Data Governance, Data masking, Data Privacy, Public Sector

No, Cloud Won’t Chase the Data Analytics Gremlins Away

Hosting Big Data applications in the cloud has compelling advantages. Scale doesn’t become as overwhelming an issue as it is within on-premise systems. IT will no longer feel compelled to throw more disks at burgeoning storage requirements, and performance becomes the contractual obligation of someone else outside the organization.

Cloud may help clear up some of the costlier and thornier problems of attempting to manage Big Data environments, but it also creates some new issues. As Ron Exler of Saugatuck Technology recently pointed out in a new report, cloud-based solutions “can be quickly configured to address some big data business needs, enabling outsourcing and potentially faster implementations.” However, he adds, employing the cloud also brings some risks as well.

Data security is one major risk area, and I could write many posts on this. But management issues present other challenges as well. Too many organizations see cloud as a cure-all for their application and data management ills, but broken processes are never fixed when new technology is applied to them. There are also plenty of risks with the misappropriation of big data, and the cloud won’t make these risks go away. Exler lists some of the risks that stem from over-reliance on cloud technology, from the late delivery of business reports to the delivery of incorrect business information, resulting in decisions based on incorrect source data. Sound familiar? The gremlins that have haunted data analytics and management for years simply won’t disappear behind a cloud.

Exler makes three recommendations for moving big data into cloud environments – note that the solutions he proposes have nothing to do with technology, and everything to do with management:

1) Analyze the growth trajectory of your data and your business. Typically, organizations will have a lot of different moving parts and interfaces. And, as the business grows and changes, it will be constantly adding new data sources.  As Exler notes, “processing integration or hand off points in such piecemeal approaches represent high risk to data in the chain of possession – from collection points to raw data to data edits to data combination to data warehouse to analytics engine to viewing applications on multiple platforms.” Business growth and future requirements should be analyzed and modeled to make sure cloud engagements will be able “to provide adequate system performance, availability, and scalability to account for the projected business expansion,” he states.

2) Address data quality issues as close to the source as possible.  Because both cloud and big data environments have so many moving parts,  “finding the source of a data problem can be a significant challenge,” Exler warns. “Finding problems upstream in the data flow prevent time-consuming and expensive reprocessing that could be needed should errors be discovered downstream.” Such quality issues have a substantial business cost as well. When data errors are found, it becomes “an expensive company-wide fire drill to correct the data,” he says.

3) Build your project management, teamwork and communication skills. Big data and cloud projects involve many people and components from across the enterprise, requiring coordination and interaction among various specialists, subject matter experts, vendors, and outsourcing partners. “This coordination is not simple,” Exler warns. “Each group involved likely has different sets of terminology, work habits, communications methods, and documentation standards. Each group also has different priorities; oftentimes such new projects are delegated to lower priority for supporting groups.” Project managers must be leaders and understand the value of open and regular communications.
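Exler’s second recommendation, catching data problems at the point of collection rather than downstream, can be sketched as a small ingest-time validation gate. The field names and rules below are hypothetical illustrations, not taken from his report:

```python
import re
from datetime import datetime

def _is_iso(v):
    """True if v parses as an ISO-8601 timestamp."""
    try:
        datetime.fromisoformat(v)
        return True
    except (TypeError, ValueError):
        return False

# Per-field validation rules applied at the point of collection
RULES = {
    "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "") is not None,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
    "ts": _is_iso,
}

def validate(record):
    """Return the failing field names; reject at the source if non-empty."""
    return [field for field, ok in RULES.items() if not ok(record.get(field))]

good = {"email": "a@b.com", "amount": 10.5, "ts": "2013-05-01T12:00:00"}
bad = {"email": "not-an-email", "amount": -3, "ts": "yesterday"}
```

Rejecting `bad` at ingest is cheap; discovering the same errors after they have flowed into a warehouse and analytics reports triggers exactly the “company-wide fire drill” Exler describes.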

Posted in Big Data, Cloud Computing, Data Quality

Informatica Wins Award for Best Security Software

Last night Informatica was given the Silver award for Best Security Software by Info Security. Best Security Software was one of the most competitive categories, with 8 finalists offering technologies ranging from mobile to cloud security.
Informatica won the award for its new Cloud Data Masking solution. Since June of last year, Informatica has steadily released a series of new Cloud solutions for data security. Informatica is the first to offer a comprehensive, data-governance-based solution for cloud data privacy. This solution addresses the full lifecycle of data privacy, including:

  •   Defining and classifying sensitive data
  •   Discovering where sensitive data lives
  •   Applying consistent data masking rules
  •   Measuring and monitoring to prove compliance
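The “discovering where sensitive data lives” step can be approximated with pattern-based column profiling. A real product combines this with schema metadata and NLP; the toy version below (patterns, column sample and threshold are all illustrative assumptions) shows the idea:

```python
import re

# Toy PII patterns; a production profiler uses far richer rules plus NLP
PATTERNS = {
    "ssn": re.compile(r"^\d{3}-\d{2}-\d{4}$"),
    "email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "phone": re.compile(r"^\(?\d{3}\)?[ -]?\d{3}-\d{4}$"),
}

def profile_column(values, threshold=0.8):
    """Classify a sampled column: which PII patterns match often enough?"""
    values = [v for v in values if v]  # ignore empties in the sample
    hits = []
    for name, pattern in PATTERNS.items():
        if values and sum(bool(pattern.match(v)) for v in values) / len(values) >= threshold:
            hits.append(name)
    return hits

sampled = ["123-45-6789", "987-65-4321", "", "111-22-3333"]
```

Running the profiler over a sample of every column is what surfaces sensitive data hiding in columns with unhelpful names, which is where “previously unrealized PII” tends to live.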

Cloud Data Masking adds to Informatica’s leading cloud integration solution for salesforce.com, which includes data synchronization, data replication, data quality, and master data management.

Why is Cloud Data Masking important?

Sensitive data is at risk of being exposed during application development and testing, where real production data is commonly used to test applications rigorously. As reported by the Ponemon Institute, a data breach costs organizations on average $5.5 million.

What does Cloud Data Masking do?

Based on Informatica’s market-leading Data Masking technology, the new Cloud Data Masking enables cloud customers to secure sensitive information during the testing phase by directly masking production data used within cloud sandboxes, creating realistic-looking but de-identified data. Customers are therefore able to protect sensitive information from unintended exposure during development, test and training activities; streamline cloud projects by reducing the time it takes to mask test/training/development environments; and ensure compliance with mounting privacy regulations.

What do people do today?

Many organizations today hand masking efforts over to IT. This inevitably lengthens development cycles and delays releases. One of Informatica’s longtime customers and current partners, David Cheung of Cloud Sherpas, stated: “Many customers wait days for IT to change the sensitive or confidential data, delaying releases. For example, I was at a customer last week where the customer was waiting 5 days for IT to mask the sensitive data.”
Others use scripting or manual methods to mask the data. One prospect I spoke to recently said he manually altered the data but missed a few email addresses. So during a test run, the company accidentally sent emails to customers. These customers called back to demand to know what was going on. Do you want that to happen to you?
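The missed-addresses story illustrates why a systematic masking rule beats hand-editing rows: rewrite every email in the test copy to a deterministic token at a non-routable domain, so no row can slip through. A hypothetical sketch (the domain and token scheme are my assumptions, not Informatica’s method):

```python
import hashlib

SAFE_DOMAIN = "example.invalid"  # reserved TLD: mail here can never be delivered

def mask_email(email: str) -> str:
    """Replace an address with a deterministic token at a non-routable domain."""
    token = hashlib.sha256(email.lower().encode()).hexdigest()[:12]
    return f"user_{token}@{SAFE_DOMAIN}"

def mask_table(rows, field="email"):
    """Mask the email field on every row, so no address can be missed."""
    return [{**row, field: mask_email(row[field])} for row in rows]

test_rows = mask_table([{"id": 1, "email": "jane@customer.com"},
                        {"id": 2, "email": "joe@customer.com"}])
```

Because the rule is applied to the whole column rather than to rows someone remembered to edit, a test run can at worst bounce mail off a reserved domain, never reach a real customer.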

Visit Informatica Cloud Data Masking for more information.

Posted in Application ILM, Data masking, Data Privacy

Ready for Your Data Security Audit?

In a recent survey of Informatica customers:
• Over 60% of companies had a security audit in the last year
• 35% of the companies had an internal security audit
• 16% of the companies had both an internal security audit and one performed by an external auditor
• In addition, many of these organizations saw that another company in their same industry suffered a data breach.

These results are reinforced by the discussions I had with Audit and Compliance IT owners from various industries. Audits are on the rise as more customers require these audits before purchase. Compliance IT requires reports at a database or system level showing that the data has been protected. And they want to see these reports on a regular basis as data, including test data pulled from production environments, changes frequently.

Driving these audits and Informatica projects to protect data were the following top regulatory drivers (as reported by customers):
• SOX
• PCI
• PII
• PHI

These results are reinforced by the increasing use of Informatica’s regulatory and industry packs (containing pre-built rules and metadata), including PCI, PHI and PII. In addition to these areas, organizations I’ve spoken to are implementing projects to protect non-public or confidential company information as well. For example, last week I spoke to a company about the detailed financial information they share as part of the data they send to an outsourced partner. This financial information could easily be used to estimate the company’s revenues and profits for any given quarter, before that information is released to the street, if at all.

In this same survey, the top benefits customers said that Informatica’s solution addressed included:
• Increasing productivity by leveraging pre-built masking techniques, accelerators and purpose-built tools
• Reducing the time it took to identify and capture optimal test cases, therefore reducing overall testing time
• Reducing the risk of data breach

Are you ready for your data security audit?

For more information on Informatica’s data security solutions for non-production environments, please join us for an upcoming webinar:

http://bit.ly/W5IciG

For more information on Informatica’s data security solutions in general, please see:

http://bit.ly/PGcJkq

Posted in Application ILM, Data masking, Data Privacy, Uncategorized

Informatica Recognized By Gartner as a Leader in Data Masking and by Infosecurity for Best Security Software

Informatica was named as a leader in the 2012 Gartner Magic Quadrant for Data Masking. A couple of weeks ago, Infosecurity named Informatica as a finalist for Best Security Software for 2013.
Both the Gartner Magic Quadrant for Data Masking and Infosecurity Products Guide recognized Informatica for continued innovation:

  •  Gartner states, “The data masking portfolio has been broadening. In addition to SDM technology… the market is beginning to offer dynamic data masking (DDM)… ” (more…)
Posted in Application ILM, Big Data, Data masking, Data Privacy, Uncategorized

The Next Frontier of Data Security: Minimizing the Threat of an Internal Data Breach

Adam Wilson, General Manager of ILM at Informatica, talks about the next frontier of data security. The more data that is passed around internally, the greater your company’s risk of a data breach. Find out why auditors are taking a closer look at the number of internal data copies that are floating around and what it means for your company’s risk of a data leak.

Posted in Application ILM, Data masking

Is Your Customer and Employee Information Safe?

Personally Identifiable Information is under attack like never before. In the news recently, two prominent institutions were attacked. What happened:

  • A data breach at a major U.S. insurance company exposed over a million of its policyholders to identity fraud. The data stolen included Personally Identifiable Information such as names, Social Security numbers, driver’s license numbers and birth dates. In addition to Nationwide paying for million-dollar identity fraud protection for policyholders, this breach is creating fears that class action lawsuits will follow. (more…)
Posted in Application ILM, Big Data, Data masking, Financial Services, Public Sector

Data Security from the Inside—Out

Earlier this week I met with security leaders at some of the largest organizations in the San Francisco Bay Area. They highlighted disturbing trends: in addition to the increased incidence of breaches, they see increases in:

- The number of customers who want to perform security audits of their company

- The number of RFPs in which information about data security is required

- Litigation from data security breaches, with class action lawsuits, rather than regulatory fines, driving concerns

So much attention has been placed on defending the perimeter that many organizations feel they are in an arms race. Part of the problem is that it’s not clear how effective the firewalls are. While firewalls may be part of the solution, organizations are increasingly looking at how to make their applications bulletproof and centralize controls. One of the high-risk areas is systems where people have more access than they need.

For example, many organizations have created copies of production environments for test, development and training purposes. As a result, this data can be completely exposed, and the confidential aspects are at risk of being leaked intentionally or unintentionally. I spoke to a customer a couple of weeks ago who had tried to change the email addresses in their test database. But they missed a few. As a result, during a test run, they sent their customers emails. Their customers called back and asked what was going on. That was when we started talking to them about a masking solution that would permanently mask the data in these environments. In this way they would have the best data to test with, and all sensitive details obliterated.

Another high-risk area is certain users, for example cloud administrators, who have access to all data in the clear. As a result, administrators can see account numbers and Social Security numbers that they don’t need in order to do their jobs. Masking these values would still let them see the data they need to do their jobs, while preventing exposure of the other confidential data.
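One common way to give administrators what they need while hiding the rest is partial, format-preserving masking: reveal only the trailing digits of an account or Social Security number and keep the separators, so the value is still recognizable for support work. A hypothetical sketch:

```python
def mask_last4(value: str, visible: int = 4, fill: str = "*") -> str:
    """Hide all but the trailing `visible` digits, preserving separators."""
    total = sum(c.isdigit() for c in value)
    seen = 0
    out = []
    for c in value:
        if c.isdigit():
            seen += 1
            out.append(c if seen > total - visible else fill)
        else:
            out.append(c)  # keep dashes/spaces so the format stays familiar
    return "".join(out)
```

An administrator can still confirm “the card ending in 1234” with a customer, but a leaked screen or log no longer exposes the full number.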

Going back to the security leaders’ concerns: how do you prove to your customers that you have data security, especially if it’s difficult to prove the effectiveness of a firewall? This is where reports on what data was masked, and what it was masked to, come in. Yes, you can pay for cyberinsurance to cover your losses when you have a breach. But wouldn’t it be better to prevent the breaches in the first place, and to show how you’ve done it? Try looking at the problem from the inside out.

Posted in Application ILM, Big Data, Data masking, Data Privacy, Financial Services, Governance, Risk and Compliance, Healthcare, SaaS

Informatica Educates Oracle OpenWorld 2012 Attendees On How To Balance The Big Data Balancing Act

Thousands of Oracle OpenWorld 2012 attendees visited the Informatica booth to learn how to leverage their combined investments in Oracle and Informatica technology. Informatica delivered over 40 presentations on topics ranging from cloud to data security to smart partitioning. Key Informatica executives and experts from product engineering and product management spoke with hundreds of users and answered questions on how Informatica can help them improve Oracle application performance, lower risk and costs, and reduce project timelines. (more…)

Posted in Application ILM, Big Data, Cloud Computing, Data Migration, Data Privacy, Data Quality, data replication, Master Data Management, Partners

Data Mining and Analytics in the Presidential Race – Sinister or Just Sensible?

A lot of media reports have been surfacing lately about “secretive” data mining activities taking place within the presidential campaign. Many articles paint the efforts with a sinister cast, implying that underhanded invasions of privacy are taking place.

But to any seasoned data professional, data mining is a discovery tool that pulls nuggets of insight out of mountains of data. For any business that wants to get ahead in today’s hyper-competitive global economy, advanced data mining and analysis is not a luxury, it is a necessity. As USA Today’s Jack Gillum describes the Romney campaign’s data analytics: (more…)

Posted in Big Data, Data Privacy