Category Archives: Data Security

Data Is Precious: Secure it Accordingly

Security professionals are in dire need of a solution that provides visibility into where sensitive and confidential data resides, as well as visibility into the data’s risk. [1] This knowledge would allow those responsible to take an effective, proactive approach to combating cybercrime. By focusing on the data, Informatica and our customers, partners and market ecosystem are collaborating to make data-centric security with Data Security Intelligence the next line of defense.

Security technologies that focus on securing the network and perimeter require additional safeguards when sensitive and confidential data traverse beyond these protective controls. Data proliferates to cloud-based applications and mobile devices. Application security and identity access management tools may lack visibility and granular control when data is replicated to Big Data and advanced analytics platforms.

Informatica is filling this need with its data-centric security portfolio, which now includes Secure@Source.  Informatica Secure@Source is the industry’s first data security intelligence solution that delivers insight into where sensitive and confidential data reside, as well as the data’s risk profile.

To hear more from Informatica and an esteemed panel of security experts about where the future of security is going and why Informatica Secure@Source is an ideal solution for the data challenges around security, please register here. Panelists include:

The opportunity for Data Security Intelligence is extensive. In a recently published report, Neuralytix defined Data-Centric Security as “an approach to security that focuses on the data itself; to cover the gaps of traditional network, host and application security solutions.” A critical element for successful data security is collecting intelligence required to prioritize where to focus security controls and efforts that mitigate risk. This is precisely what Informatica Secure@Source was designed to achieve.

Having emerged from a predominantly manual practice, the data security intelligence software market is expected to reach $800M by 2018, with a CAGR of 27.8%. We are excited about this opportunity! As a leader in data management software, we are uniquely qualified to take an active role in shaping this emerging market category.

Informatica Secure@Source addresses the need to get smarter about where sensitive and private data reside and who is accessing it, to prioritize which controls to implement, and to work harmoniously with existing security architectures, policies and procedures. Our customers are asking us for data security intelligence, and the industry deserves it. With more than 60% of security professionals stating that their biggest challenge is not knowing where their sensitive and confidential data reside, the need for Data Security Intelligence has never been greater.

Neuralytix says “data security is about protecting individual data objects that traverse across networks, in and out of a public or private cloud, from source applications to targets such as partner systems, to back office SaaS applications to data warehouses and analytics platforms”. We couldn’t agree more. We believe that the best way to incorporate a data-centric security approach is to begin with data security intelligence.

[1] “The State of Data Centric Security,” Ponemon Institute, sponsored by Informatica, June 2014


SailPoint Partners with Informatica Secure@Source

As part of the Informatica Secure@Source launch, Data Security Group Director of Business Development Christophe Hassaine interviewed our partner SailPoint’s Vice President of Product Management, Paul Trulove. They discuss the importance of data security intelligence in ensuring effective identity and access management.

Christophe: Tell us a little more about what SailPoint does?

Paul: SailPoint is a leader in Identity and Access Management. Our products, IdentityIQ and IdentityNow, help customers get the right access to the right users at the right time. This helps keep users productive while minimizing the risk of inappropriate or non-compliant access to sensitive resources or data.

Christophe: What are the challenges you are seeing in the market?

Paul: One of the most significant challenges we’re seeing in the market today is the sheer amount of data being generated and stored in the enterprise. This makes it difficult for IT security teams to restrict access to only those users with a valid business reason.

Christophe: Specifically what gaps do you see in customers’ data security posture?

Paul: There are two important gaps that we see in the approaches being used today: one is a general lack of visibility into where sensitive data is within the enterprise; the second is how access to it is managed, as customers generally think about managing access at a higher level than the data. These issues are compounded by the fact that in most organizations the data management teams and technology don’t link tightly with the IAM teams and systems. This can create blind spots and slow reaction time when a security event is detected.

Christophe: Why is Data Security Intelligence important to your customers?

Paul: Data security intelligence is important because you can’t manage everything. You have to prioritize security controls based on risk or you don’t have a chance.

Christophe: What are your integration plans with Informatica Secure@Source?

Paul: We are working on several innovative integration options with Secure@Source. One of the main focus areas is around providing identity context for data events. Since SailPoint knows who has access to what across every system in the enterprise, we can tell Secure@Source who it should be looking at when a security event is detected.

We are also automating risk responses with Informatica. For example, when Secure@Source identifies and locates sensitive and confidential data, SailPoint IdentityIQ ensures only authorized users have appropriate levels of access, no matter where the data proliferates – on-premises or in the cloud.

Christophe: How will the joint offering benefit your customers?

Paul: By combining our industry-leading approach to identity and access management with Informatica’s innovative Data Security Intelligence, our joint customers can proactively gain control of risk and improve their security posture by managing and securing all end users and tying them to the data they create.

If you are not able to view the video, click here.

For more information, check out our product website at https://www.informatica.com/products/data-security/secure-at-source.html


Vormetric Partners with Informatica Secure@Source

Informatica recently launched the industry’s first data security intelligence offering, Secure@Source. Informatica’s Data Security Group Director of Business Development, Christophe Hassaine, interviewed our partner Vormetric’s Vice President of Product Management, Derek Tumulak, to get his take on how our complementary solutions address the need for more data-centric security.

Christophe:  Derek, tell us a little more about how Vormetric customers benefit from your offerings.

Derek: Vormetric provides data security solutions. We help organizations protect sensitive information assets and we enable them to achieve regulatory compliance and security requirements. We also help them protect against data breaches. Our solution benefits customers by protecting information in database and file servers, big data, and cloud environments.

Christophe: What shifts do you see in the industry, and what new challenges do they create?

Derek: The biggest challenge we see in the market today is that data breaches are occurring more frequently. The largest gaps stem from the fact that historically organizations have focused on anti-virus and anti-malware solutions. Even today many organizations continue to focus on network/perimeter and host-based solutions when they need to be more focused on data-centric security solutions that bring the controls closer to the data itself. Organizations need to be implementing encryption, tokenization, access control and comprehensive auditing solutions in order to better protect their sensitive data in any environment.

Christophe: Why is data security intelligence so important to your customers?

Derek: Data security intelligence is important for our customers since they not only need to understand and classify the data they have, but also need to understand potentially anomalous or suspicious access patterns and even failed attempts to access sensitive information by various users and applications. Based on this type of threat intelligence and analytics, organizations can be proactive about adapting their access policies, particularly in situations where an organization may be under attack.

Christophe: How will the integration between Vormetric and Secure@Source benefit your customers?

Derek: We are integrating with Informatica Secure@Source in two distinct areas. The first allows customers to implement encryption, tokenization, and sophisticated access controls in environments that Informatica identifies as having sensitive information and potentially inadequate data security controls. The second integration is around providing rich data access audit information to Secure@Source for increased threat intelligence and analytics. This benefits our common customers by giving them an end-to-end solution and a comprehensive view around the data security lifecycle. Customers can discover, protect, and continuously monitor sensitive data.

If you are not able to view the video, click here.

For more information, check out our product website at https://www.informatica.com/products/data-security/secure-at-source.html


Healthcare Data Masking: A Primer

When trying to protect your data from the nefarious souls who would like access to it, there are several options available, each suited to specific use cases. Before we talk about the different solutions, it is important to define the terms:

  • PII – Personally Identifiable Information – any data that could potentially identify a specific individual. Any information that can be used to distinguish one person from another and can be used for de-anonymizing anonymous data can be considered PII
  • GSA’s Rules of Behavior for Handling Personally Identifiable Information – This directive provides GSA’s policy on how to properly handle PII and the consequences and corrective actions that will be taken if a breach occurs
  • PHI – Protected Health Information – any information about health status, provision of health care, or payment for health care that can be linked to a specific individual
  • HIPAA Privacy Rule – The HIPAA Privacy Rule establishes national standards to protect individuals’ medical records and other personal health information and applies to health plans, health care clearinghouses, and those health care providers that conduct certain health care transactions electronically.  The Rule requires appropriate safeguards to protect the privacy of personal health information, and sets limits and conditions on the uses and disclosures that may be made of such information without patient authorization. The Rule also gives patients rights over their health information, including rights to examine and obtain a copy of their health records, and to request corrections.
  • Encryption – a method of protecting data by scrambling it into an unreadable form. It is a systematic encoding process which is only reversible with the right key.
  • Tokenization – a method of replacing sensitive data with non-sensitive placeholder tokens. These tokens are swapped with data stored in relational databases and files.
  • Data masking – a process that scrambles data, either an entire database or a subset. Unlike encryption, masking is not reversible; unlike tokenization, masked data is useful for limited purposes. There are several types of data masking:
    • Static data masking (SDM) masks data in advance of its use; non-production databases are masked, not in real time
    • Dynamic data masking (DDM) masks production data in real time
    • Data Redaction – masks unstructured content (PDF, Word, Excel)

Each of the three methods for protecting data (encryption, tokenization and data masking) has different benefits and solves different security issues. We’ll address them in a bit. For a visual comparison of the three methods, please see the table below:

Field | Original Value | Encrypted | Tokenized | Masked
Last Name | johnson | 8UY%45Sj | wjehneo | simpson
First Name | margaret | 3%ERT22##$ | owhksoes | marge
SSN | 585-88-9874 | Mh9&o03ms)) | 93nmvhf93na | 345-79-4444

Encryption

For protecting PHI data – encryption is superior to tokenization. You encrypt different portions of personal healthcare data under different encryption keys. Only those with the requisite keys can see the data. This form of encryption requires advanced application support to manage the different data sets to be viewed or updated by different audiences. The key management service must be very scalable to handle even a modest community of users. Record management is particularly complicated. Encryption works better than tokenization for PHI – but it does not scale well.

Properly deployed, encryption is a perfectly suitable tool for protecting PII. It can be set up to protect archived data or data residing on file systems without modification to business processes.

  • To protect data at rest, you install encryption and key management services – this only protects the data from access that circumvents applications
  • You can add application layer encryption to protect data in use
    • This requires changing applications and databases to support the additional protection
    • You will pay the cost of modification and the performance of the application will be impacted
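To make the application-layer option above concrete, here is a minimal sketch of field-level encryption in Python using the cryptography library’s Fernet recipe. The field names, the single record, and the idea of one key per audience are illustrative assumptions, not a description of any particular product’s implementation.

```python
from cryptography.fernet import Fernet

# Illustrative only: in practice keys live in a key management service,
# often with separate keys per data set or audience (e.g., billing vs. clinical).
billing_key = Fernet.generate_key()
billing_cipher = Fernet(billing_key)

record = {"last_name": "johnson", "ssn": "585-88-9874"}

# Encrypt the sensitive field before it is written to storage.
encrypted_ssn = billing_cipher.encrypt(record["ssn"].encode())

# Only callers holding the billing key can recover the original value.
print(billing_cipher.decrypt(encrypted_ssn).decode())  # 585-88-9874
```

Note that every application touching the field now needs the key service and the decrypt step, which is exactly the modification cost and performance impact described above.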

Tokenization

For tokenization of PHI – there are many pieces of data which must be bundled up in different ways for many different audiences. Using the tokenized data requires it to be de-tokenized (which usually includes a decryption process). This introduces an overhead to the process. A person’s medical history is a combination of medical attributes, doctor visits, outsourced visits. It is an entangled set of personal, financial, and medical data. Different groups need access to different subsets. Each audience needs a different slice of the data – but must not see the rest of it. You need to issue a different token for each and every audience. You will need a very sophisticated token management and tracking system to divide up the data, issuing and tracking different tokens for each audience.
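For illustration, the sketch below shows the core of a token vault in plain Python: sensitive values are swapped for random placeholders, and only callers with access to the vault can map them back. It deliberately ignores the per-audience token management described above; a production system would persist the vault in a hardened, access-controlled store and issue distinct tokens per audience.

```python
import secrets

# Toy token vault: maps opaque tokens back to the original values.
vault = {}

def tokenize(value: str) -> str:
    token = secrets.token_hex(8)   # random placeholder with no mathematical link to the value
    vault[token] = value
    return token

def detokenize(token: str) -> str:
    return vault[token]            # the de-tokenization step that adds overhead

ssn_token = tokenize("585-88-9874")
print(ssn_token)                   # safe to hand to downstream systems
print(detokenize(ssn_token))       # 585-88-9874, only for callers allowed to reach the vault
```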

Data Masking

Masking can scramble individual data columns in different ways so that the masked data looks like the original (retaining its format and data type) but is no longer sensitive. Masking is effective for maintaining aggregate values across an entire database, enabling preservation of sums and averages within a data set while changing all the individual data elements. Masking plus encryption provides a powerful combination for distributing and sharing medical information.

Traditionally, data masking has been viewed as a technique for solving a test data problem. The December 2014 Gartner Magic Quadrant Report on Data Masking Technology extends the scope of data masking to more broadly include data de-identification in production, non-production, and analytic use cases. The challenge is to do this while retaining business value in the information for consumption and use.

Masked data should be realistic without being real. It should satisfy the same business rules as real data. It is very common to use masked data in test and development environments, as the data looks like “real” data but doesn’t contain any sensitive information.
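As a rough illustration of static masking (plain Python, not any vendor’s implementation), the sketch below replaces values with realistic but fictitious ones while preserving format, in the spirit of the “simpson / marge / 345-79-4444” column in the table above.

```python
import random

FIRST_NAMES = ["marge", "homer", "lisa", "carl"]   # substitution list keeps values realistic

def mask_ssn(ssn: str) -> str:
    """Replace each digit with a random digit, preserving the 3-2-4 format."""
    return "".join(str(random.randint(0, 9)) if ch.isdigit() else ch for ch in ssn)

def mask_record(record: dict) -> dict:
    """Masking is one-way: nothing here can be reversed to recover the original."""
    return {
        "first_name": random.choice(FIRST_NAMES),
        "ssn": mask_ssn(record["ssn"]),
    }

print(mask_record({"first_name": "margaret", "ssn": "585-88-9874"}))
# e.g. {'first_name': 'lisa', 'ssn': '302-61-8847'} -- looks real, satisfies format rules, isn't sensitive
```

A real masking tool would also keep referential integrity across tables and preserve aggregate properties, which this toy example does not attempt.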

Share
Posted in 5 Sales Plays, Application ILM, Data Governance, Data masking, Data Privacy, Data Security, Governance, Risk and Compliance, Healthcare | Tagged , , , , , , , , | 1 Comment

Informatica World 2015 – Are you Ready for the Data Security and Privacy Track?

For the first time ever, Informatica will have a Data Security and Privacy track at our annual Informatica World event. This highlights the fact that Informatica is investing in a new and exciting area called Data Centric Security. Data Centric Security comprises two inter-related components: Data Security Intelligence (understanding where sensitive data resides and analyzing your risk) and Data Security Controls (traditional methods of protecting sensitive data through masking, archiving and retention/disposal).

The Data Security and Privacy (DS&P) track is packed with informative sessions to help you engage with Informatica colleagues, customers and partners around your topics of interest. A session guide for all of DS&P track sessions is published below as a ‘cheat sheet’ to help you quickly find the sessions of interest.

Here are some quick highlights of the Data Security and Privacy track-

  • Over 20 content-filled breakout sessions
  • Hear from industry experts – IAPP, Neuralytix and Securosis
  • Customer Stories – JP Morgan Chase, Cisco, Agrium and Comcast share their successes
  • Latest roadmap and vision for Data Archive, Test Data Management (TDM) and Data Masking
  • Get a first look at the award-winning Secure@Source application (Data Security Intelligence)
  • Hands-on Lab Sessions – sign up for live demos of Secure@Source, Data Masking, TDM and Data Archive
  • Meet the Experts sessions where you can talk one-on-one with Informatica experts to get their guidance
  • Pavilion – Walk-up sessions where you can meet with Data Archive, Data Masking and Data Security experts

I hope you take advantage of these great resources and have a great Informatica World 2015.

Download the ‘Data Security and Privacy Sessions’ guide here, or browse the sessions listed below:

Breakout Sessions, Tuesday, May 12th

Session Time Location
The New Security Perimeter: Data (DS&P Track Keynote, Amit Walia, SVP and GM Data Security) 10:45am – 11:15am Gracia 5
Data Centric Security for a Data Centric World (Jeff Northrop, IAPP, Robert Shields, Informatica) 11:30am – 12:15pm Gracia 5
First Look at  Secure@Source V1:  The CISO Perspective (Bill Burns, CISO, Informatica, Gary Patterson, Informatica) 1:30pm – 2:30pm Gracia 5
Informatica Big Data Ready Summit: Keynote Address (Anil Chakravarthy, EVP and Chief Product Officer) 1:40 – 2:25 Castellana 1
Big Data Keynote: Tom Davenport, Distinguished Professor in Management and Information Technology, Babson College 2:30 – 3:15 Castellana 1
Governing Data While Minimizing Risk with Data Security Intelligence  (Deloitte) 2:40 – 3:25 Gracia 5
Shore Up your Most Vulnerable Environments – Data Security & Privacy in Test Environments (Tom Petrocelli, Neuralytix, Research and Panel Discussion) 3:35pm – 4:20pm Gracia 5
The Big Data Journey: Traditional BI to Next Gen Analytics (Johnson & Johnson, Transamerica, Devon Energy, KPN) 4:15 – 4:30 Castellana 1
Data Security Vision and What’s New in Informatica Data Archive (Claudia Chandra, VP, Data Security Group, Informatica) 4:30pm – 5:30pm Gracia 5
Accelerate Big Data Projects with Informatica (Jeff Rydz, Informatica) 4:35 – 5:20 Castellana 1
Big Data Topic, Michael J. Franklin, Professor of Computer Science, UC Berkeley 5:20 -5:30 Castellana 1
  • Informatica World Pavilion 5:45 PM – 8:00 PM

Breakout Sessions, Wednesday, May 13th

Session Time Location
How JP Morgan Chase Created Testing Efficiencies and Secured Test Environments (JPMC) 10:45am – 11:45am Gracia 5
Risky Business: How Data Archiving Can Save the [Compliance] Day (Brenda Kononen, Agrium) 2:00pm – 2:45pm Gracia 5
Securing Private Data:  Test Data Management and Data Masking Developer Tricks (Informatica) 2:55pm – 3:55pm Gracia 5
Application Consolidation & Migration Best Practices: Customer Panel (Discount Tire, Cisco, Verizon) 2:55pm – 3:55pm Gracia 2
Key Considerations for Next Generation Test Data Management Solutions (TCS & Comcast) 4:05pm – 4:50pm Gracia 5
Data Masking on Premise and in the Cloud:  What’s New and What’s Next (Informatica) 5:00pm – 5:45pm Gracia 5

 Meet the Experts Sessions, Wednesday, May 13th

Session Time Location
Data Security – Data Security and Privacy for Sensitive Data (Robert Shields, Gary Patterson) 12:00pm – 12:50pm, 1:00pm – 1:50pm and 2:55pm – 3:55pm Castellana 2

Informatica World Pavilion 11:45 AM – 2:00 PM

 Breakout Sessions, Thursday, May 14th

Session Time Location
Security Controls: Mind the Gaps! (Adrian Lane, Securosis) 9:00am – 10:00am Gracia 5
Optimally Provision, Protect, Augment, and Maintain Test Data Sets with the Informatica Secure Testing Suite (Informatica) 10:10am – 11:10am Gracia 5
Retire Legacy Applications – Improve your Bottom-line while Managing Compliance (Cisco) 11:20am – 12:20pm Gracia 4
Data Security & Privacy:  Meet the Experts in Data Security (Informatica) 2:30pm – 3:30pm Gracia 5
  • Informatica World Pavilion 12:30 PM – 2:30 PM

Hands-On Labs

Session Time Location

Data Security & Privacy

Data Archive (Global Customer Support)
Mon: 1:00, 3:00
Tue: 7:30, 11:45, 2:40, 4:25
Wed: 10:45, 12:45, 2:55, 5:00, 7:00
Thu: 9:00, 11:20, 1:15
Fri: 7:30, 9:20, 11:15
Table 06a

Test Data Management (Global Customer Support)
Mon: 2:00, 4:00
Tue: 10:45, 1:45, 3:35
Wed: 7:30, 11:45, 2:00, 4:05, 6:00
Thu: 7:30, 10:10, 12:15, 2:15
Fri: 8:25, 10:15
Table 06b

Retire Legacy Applications and Optimize Application Performance with Informatica Data Archive
Mon: 1:00, 2:00, 3:00, 4:00
Tue: 7:30, 10:45, 11:45, 1:45, 2:40, 3:35, 4:25
Wed: 7:30, 10:45, 11:45, 12:45, 2:00, 2:55, 4:05, 5:00, 6:00, 7:00
Thu: 7:30, 9:00, 10:10, 11:20, 12:15, 1:15, 2:15
Fri: 7:30, 8:25, 9:20, 10:15, 11:15
Table 23

Protect Salesforce Sandboxes with Cloud Data Masking
Mon: 1:00, 3:00
Tue: 7:30, 11:45, 2:40, 4:25
Wed: 10:45, 12:45
Thu: 1:15
Fri: 7:30, 11:15
Table 24a

Optimally Provision Test Data Sets with Test Data Management
Mon: 2:00, 4:00
Tue: 10:45, 1:45, 3:35
Wed: 7:30, 11:45, 2:00, 2:55, 4:05, 5:00, 6:00, 7:00
Thu: 7:30, 9:00, 10:10, 11:20, 12:15, 2:15
Fri: 8:25, 9:20, 10:15
Table 24b

Data Security Intelligence with Secure@Source
Mon: 1:00, 2:00, 3:00
Tue: 10:45, 11:45, 2:40, 3:35
Wed: 10:45, 11:45, 2:00, 5:00, 6:00, 7:00
Thu: 7:30, 9:00, 10:10, 12:15, 1:15
Fri: 7:30, 8:25, 10:15, 11:15
Table 36a

Data Centric Security with Data Masking
Mon: 4:00
Tue: 7:30, 1:45, 4:25
Wed: 7:30, 12:45, 2:55, 4:05
Thu: 11:20, 2:15
Fri: 9:20
Table 36b

The Cost versus Risk of a Security Breach Conversation

The Chief Information Security Officer (CISO) and the Chief Risk Officer (CRO) generally speak in different languages. One speaks about how to secure an organization and its assets. The other speaks about the potential of losing something of value. One area where they find common ground is in the shared conversation of the Cost versus Risk of a data breach.

A data breach costs an organization in the US on average $201 per stolen record.[1] The risk of a data breach can be expressed as a score from 1 to 10 that indicates how exposed your organization is.[2] The cost of implementing security measures and controls varies with the level of risk an organization is willing to accept.
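As a back-of-the-envelope illustration of how these two numbers feed one conversation, the snippet below multiplies record count, per-record cost and an assumed breach likelihood into an expected annual loss. Apart from the $201 per-record figure cited above, every number is made up for the example.

```python
records_at_risk = 500_000      # sensitive records in scope (assumed)
cost_per_record = 201          # average US cost per stolen record (Ponemon)
breach_likelihood = 0.15       # assumed annual probability of a breach

expected_annual_loss = records_at_risk * cost_per_record * breach_likelihood
print(f"Expected annual loss: ${expected_annual_loss:,.0f}")   # $15,075,000
```

That expected-loss figure is what gets weighed against the cost of the controls that would reduce it.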

This is the conversation that needs to be mastered in order to communicate the need for more resources to Chief Financial Officers and the rest of the C-Suite.

As organizations conduct vulnerability assessments of their IT landscape, they get a sense of how exposed their environments and systems are to a breach. Yet in many cases these assessments have significant blind spots when users replicate data to applications and systems that are out of reach of the assessment tools. This requires the addition of a data-centric approach to classifying, categorizing and measuring the value of data and its potential risk.

In the Informatica Secure@Source launch event, Larry Ponemon of the Ponemon Institute describes during a panel session how great it would be if there were a tool that could tell you ‘here is the risk of the data’ and ‘here is the cost of that risk to the organization’. That is exactly what Secure@Source was designed to accomplish.

If you are unable to view the video, click here.

Not surprisingly, security teams are consistently under-resourced. Teams are constantly responding to alerts and intelligence feeds, which fuels the call for more resources. Yet if these teams had a view into where the data is most at risk, and could focus their energy on the prioritized assets that, if secured at the source, would eliminate downstream risk, they might find their world less overwhelming.

[1] http://www.ponemon.org

[2] http://breachlevelindex.com


Build Security-Mindedness in Your Organization

Throughout the RSA conference this week, there was a steady drumbeat calling out the need to build a security mindset in an organization. Many breaches are caused by people making mistakes in the workplace. How can you stop breaches caused by the human factor? It is all about increasing awareness and actively making an effort to build security-mindedness into everything we do.

During one RSA breakout session entitled, How One Smart Phone Picture Can Take Down Your Company, Dr. Larry Ponemon, Founder of the Ponemon Institute, describes how a hacker really only needs one piece of valuable information to unlock a large-scale data breach, which can be achieved by taking a snapshot of log-in credentials on a screen and other low-tech means.  In his research report, Visual Hacking Experimental Study, he cites how ‘certain situations are more risky. Documents on vacant desks and data visible on computer screens are most likely to be hacked.’ This research report was sponsored by 3M – which makes sense since they sell privacy screens for computers, iPads and iPhones.

What is really needed is to make teams aware of the risk and vulnerabilities through education and training, through policy definitions and enforcement, and through constant reminders from leadership.

One startup company, Apozy, took a novel approach using gamification to incentivize employees to incorporate best practices in their day to day routines. Informatica’s own CISO, Bill Burns, is using an internal competition between departments to motivate management to incorporate best practices.

While we continue to invest in technology to automate the implementation and enforcement of policies through controls, we also need to look at who we are hiring and incorporating the security conversation into the on-boarding process.

When recruiting, look to colleges and universities that offer courses and degrees in cybersecurity. (Check out the Ponemon Institute 2014 Best Schools for Cybersecurity.) Arnold Federbaum, Adjunct Professor of Cyber Security at NYU School of Engineering, discusses Data Security Culture and Higher Education in a panel video recorded during the Informatica Secure@Source product launch.

If you are unable to view the video, click here.

Even the IRS has great training videos and podcasts to build awareness on potential risks of identity theft.

As we continue to see more data breach related news, it will be important to emphasize security-mindedness in an organization’s culture, build policies that make sense and have the appropriate level of enforcement, and, if it is critical to your business, prioritize hiring those with a formal education and background in cybersecurity.


Data Privacy Needs Data Security Intelligence and Controls

In an RSA Conference session entitled IAPP: Engineering Privacy: Why Security Isn’t Enough, Sagi Leizerov, E&Y’s Privacy Practice leader began with a plea:

‘We need effective ways to bring together privacy and security controls in an automated way.’

Privacy professionals, according to Sagi, essentially need help in determining the use of information – which is a foundational definition of data privacy. Security tools and controls can provide the information necessary to support the kind of investigation privacy officers conduct. Yet as data proliferates, are the existing security tools truly up to the task?

In other sessions, such as A Privacy Primer for Security Officers, many speakers claimed that Data Security projects get prioritized as a result of the need to comply with Data Privacy policies and legislation.

We are in an age where data proliferation is one of the major sources of pain for both Chief Information Security Officers and Chief Privacy and Risk Officers (CPO/CRO). Business systems that were designed to automate key business processes store sensitive and private information and are primary sources of data for business analytics. As more business users want to access data to understand the state of their businesses, data naturally proliferates. It proliferates to spreadsheets and presentations, is emailed in and out of the corporate network, and is potentially stored in a public cloud storage offering.

Even though the original intention for using this information was likely all above board, one security violation could open up a can of worms, allowing nefarious characters to take advantage of this data with malicious intent. Jeff Northrop, the CTO of the International Association of Privacy Professionals (IAPP), suggests we need to close the gap between security and privacy in a panel discussion with Larry Ponemon, founder of the Ponemon Institute.

Sagi concluded his session by stating ‘Be a voice of change in your organization. Pilot products, be courageous, give new ideas a chance.’ In the recent launch of Informatica Secure@Source,  we discuss the need for more alignment between security and privacy teams and the industry seems to agree. Congratulations to the Informatica Secure@Source development team for their recent announcement of winning Gold Medal in the New Product and Service Category at the Info Security Products Guide 2015 Global Excellence Awards!

For more on the importance of Data Security Intelligence in Privacy, watch Larry Ponemon, Founder of the Ponemon Institute and Jeff Northrop, CTO IAPP discuss this topic with Arnold Federbaum, former CISO and Adjunct Professor, NYU, and Linda Hewlett, Sr Enterprise Security Architect, Santander Holdings USA.

If unable to view the video, click here.


Striking Gold at RSA

Each year, many RSA exhibitors vie for the prestigious Info Security Products Guide security awards. Informatica has fared well in past years with our data-centric security solutions, earning gold, silver, bronze and the Most Innovative Security Product of the year. These awards were for our dynamic and cloud masking products.

On the heels of our April 8 Secure@Source announcement, we were cautiously optimistic that the 50 judges who rank entries would find this solution innovative, valuable and contemporary to the needs of information security.  We were honored to receive the Gold award for the “New Product and Services” category.

While these awards represent a significant achievement for Informatica, they more importantly highlight the growing recognition of data-centric security.  They echo what our customers, partners and advisors have told us; improving information security requires focus on the data itself.

While encryption, APT protection and network security analytics dominate the RSA show floor, data-centric security is creeping into the pitches of many exhibitors and is cited in many of the session presentations. By adopting Informatica’s data-centric security solutions, our customers have been early adopters and innovators in data-centric security, understanding its importance and benefits before the masses recognized the need.

So what does Secure@Source offer for our clients? To summarize, it analyzes sensitive data and determines its risk. There is much to this story, though. First, understanding sensitive data risk requires a solution that can classify and discover sensitive data in its many combinations and permutations. There are solutions that have been in the market for sensitive data discovery. However, the complex data structures, relationships and proliferation mandate the need for high-precision, rules-based discovery. Informatica’s 20 years of understanding, integrating and managing data from disparate sources provides a distinct advantage for this capability.

Second, for accurate risk scoring, sensitive data protection, value, size and proliferation must be precisely determined so that sensitive data risk scores have meaning and relevance to the organization. With locations and risk scores, organizations can identify their data security priorities and align network and host security accordingly. Informatica brings to bear its heritage in all things data to provide powerful analytics and scoring, delivering abstracted views for decision makers and detailed views for practitioners.
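Purely to illustrate the idea (this is not Informatica’s scoring model), a risk score along these lines can be sketched as a weighted combination of the factors named above (protection status, data value, volume and proliferation):

```python
def risk_score(protected: bool, value_weight: float, record_count: int, copies: int) -> float:
    """Toy 0-10 score: higher when data is unprotected, valuable, voluminous and widely copied."""
    protection_factor = 0.2 if protected else 1.0
    volume_factor = min(record_count / 1_000_000, 1.0)     # normalize to [0, 1]
    proliferation_factor = min(copies / 10, 1.0)
    return round(10 * protection_factor * (0.5 * value_weight
                                           + 0.3 * volume_factor
                                           + 0.2 * proliferation_factor), 1)

# Unprotected PHI, two million records, copied to six downstream systems
print(risk_score(protected=False, value_weight=0.9, record_count=2_000_000, copies=6))  # 8.7
```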

Finally, Secure@Source provides a repeatable and automated solution for sensitive data risk, replacing time consuming and manual audits and surveys, and replacing disparate tools.  Risk assessment is timely, repeatable, accurate and auditable.

The need for data security intelligence and for data protection beyond encryption is emerging in organizations that understand that protecting the network perimeter and hosts is important, but that the primary focus should be on the target of breaches: the data. Understand the data, prioritize by risk, and implement the controls that reduce that risk. The combination of data-centric security with traditional security will significantly improve what organizations urgently need: risk reduction and breach resiliency.


Tips From Bill Burns For the First Time CISO

This week at the RSA conference, Informatica’s CISO Bill Burns presented to a packed room of security professionals, coaching new and aspiring CISOs on which battles to fight by changing their frame of reference. This was, in my opinion, one of the most useful sessions of the day. Bill’s practical advice and insights made a lot of sense. Here are the top ideas I took away from the presentation.

The role of the CISO, at the end of the day, is to raise the bar on an organization’s security posture and leave it in a better place than it was when they arrived. With that as context, he reviewed the frames of reference a CISO should adopt when fighting for budget, resources, and mindshare.

Risk vs Threat

Focus on what you can control. You don’t know when the next zero day will hit. You can’t predict when an attack will happen, but you can prepare and reduce its impact. Conduct vulnerability assessments and steer the conversation toward the things you can do.

Data vs Opinion

Use a data-driven approach to drive fact-based conversations. Use the scientific method: propose a hypothesis, experiment, conduct A/B tests, measure results, and prove or disprove your hypothesis. Make decisions to improve security based on the data, and repeat. For example, test which message resonates with your end users. Send two emails with a security message – one that focuses on compliance and another that focuses on best practices that are simply the right thing to do. See which email users respond to and use that message.

Relationships vs Transactions

Build relationships with your peers inside and outside the organization, take them out to lunch and ask them about their business. Remove subjectivity and opinions in your dialogue by leveraging 3rd party data and information from peers. For example, leverage your relationships and knowledge bases outside your organization to collect input on salaries, budgets, product reviews, successful training programs, feedback and your own sanity. Use that as part of your dialogue with your internal constituents to increase your relevance to their world while avoiding being viewed as transactional.

Business Impact vs Disruption

Speak to the business impact. Security can be a competitive advantage and it is a ‘must do’. Talk about the potential threat by looking at what happened to competitors and ask, what if that happened here? How would it disrupt our business? And have an answer at the ready, ‘My analysis shows that we could improve here versus there’. Connect the dots for the business.

Systems and Programs vs Tasks

Looking at all of the tasks that need to be completed can be daunting. Rather than focusing on the list of patches that need to be applied (you have to do that anyway), focus on the configuration management process and measure process improvements. Measure things like time to closure, not just the number of tasks.

For more information on Bill Burn’s recommendations and presentation, visit his session link.

To hear more about the changing role of the CISO, watch Larry Ponemon, Founder of the Ponemon Institute, and Jeff Northrop, CTO of the IAPP, discuss this topic with Arnold Federbaum, former CISO and Adjunct Professor, NYU, and Linda Hewlett, Sr Enterprise Security Architect, Santander Holdings USA.

If unable to view the video, click here.
