Category Archives: Data Privacy
Last week I had the opportunity to attend the Gartner Security and Risk Management Summit, where Gartner analysts and security industry experts meet to discuss the latest trends, advances, best practices and research in the space. I had the privilege of connecting with customers, peers and partners, and I was excited to learn about the changes that are shaping the data security landscape.
Here are some of the things I learned at the event:
- Security continues to be a top CIO priority in 2014. Security is well-aligned with other trends such as big data, IoT, mobile, cloud, and collaboration. According to Gartner, the top CIO priority area is BI/analytics. Given our growing appetite for all things data and our increasing ability to mine data to increase top-line growth, this top billing makes perfect sense. The challenge is to protect the data assets that drive value for the company and ensure appropriate privacy controls.
- Mobile and data security are the top focus for 2014 spending in North America according to Gartner’s pre-conference survey. Cloud rounds out the list when considering worldwide spending results.
- Rise of the DRO (Digital Risk Officer). Fortunately, those same market trends are leading to an evolution of the CISO role to a Digital Security Officer and, longer term, a Digital Risk Officer. The DRO role will include determination of the risks and security of digital connectivity. Digital/Information Security risk is increasingly being reported as a business impact to the board.
- Information management and information security are blending. Gartner assumes that 40% of global enterprises will have aligned governance of the two programs by 2017. This is not surprising given the overlap of common objectives such as inventories, classification, usage policies, and accountability/protection.
- Security methodology is moving from a reactive approach to compliance-driven and proactive (risk-based) methodologies. There is simply too much data and too many events for analysts to monitor. Organizations need to understand their assets and their criticality. Big data analytics and context-aware security are then needed to reduce the noise and false positive rates to a manageable level. According to Gartner analyst Avivah Litan, “By 2018, of all breaches that are detected within an enterprise, 70% will be found because they used context-aware security, up from 10% today.”
I want to close by sharing the identified Top Digital Security Trends for 2014:
- Software-defined security
- Big data security analytics
- Intelligent/Context-aware security controls
- Application isolation
- Endpoint threat detection and response
- Website protection
- Adaptive access
- Securing the Internet of Things
In response to this data growth, organizations seek new ways to unlock the value of their data. Traditionally, data has been analyzed for two key reasons: first, to identify ways to improve operational efficiency; second, to identify opportunities to increase revenue.
As data expands, companies have found new uses for these growing data sets. Of late, organizations have started providing data to partners, who then sell the ‘intelligence’ they glean from within the data. Consider a coffee shop owner whose store doesn’t open until 8 AM. This owner would be interested in learning how many target customers (perhaps people aged 25 to 45) walk past the closed shop between 6 AM and 8 AM. If this number is high enough, it may make sense to open the store earlier.
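A back-of-the-envelope version of that analysis can be sketched in a few lines of Python. The records, field layout, and thresholds here are hypothetical, purely for illustration:

```python
from datetime import time

# Hypothetical foot-traffic records: (observation time, age of passerby)
observations = [
    (time(6, 15), 31),
    (time(6, 40), 52),
    (time(7, 5), 28),
    (time(7, 30), 44),
    (time(7, 50), 19),
    (time(8, 10), 33),  # after opening time; outside the window of interest
]

def target_customers(obs, start=time(6, 0), end=time(8, 0),
                     min_age=25, max_age=45):
    """Count passersby in the target age band during the closed window."""
    return sum(1 for t, age in obs
               if start <= t < end and min_age <= age <= max_age)

print(target_customers(observations))  # -> 3 (the 31-, 28- and 44-year-olds)
```

If the count across many mornings stays high, the data supports opening earlier; if not, the owner has avoided an unprofitable change.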
As much as organizations prioritize the value of data, customers prioritize the privacy of data. If an organization loses a customer’s data, it results in several costs to the organization. These costs include:
- Damage to the company’s reputation
- A reduction of customer trust
- Financial costs associated with the investigation of the loss
- Possible governmental fines
- Possible restitution costs
To guard against these risks, data that organizations provide to their partners must be obfuscated. This protects customer privacy. However, data that has been obfuscated is often of a lower value to the partner. For example, if the date of birth of those passing the coffee shop has been obfuscated, the store owner may not be able to determine if those passing by are potential customers. When data is obfuscated without consideration of the analysis that needs to be done, analysis results may not be correct.
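To make the trade-off concrete, here is a minimal Python sketch (the field formats and helper names are hypothetical) contrasting full obfuscation of a date of birth with a generalization that preserves exactly the age band the analysis needs:

```python
import re
from datetime import date

def redact_dob(dob):
    """Full obfuscation: replaces every digit, destroying analytic value."""
    return re.sub(r"\d", "*", dob)

def generalize_to_age_band(dob, today=date(2014, 6, 1), width=10):
    """Generalization: keeps just enough information (an age band) for the
    partner's analysis without exposing the actual birth date."""
    birth = date.fromisoformat(dob)
    age = today.year - birth.year - ((today.month, today.day) < (birth.month, birth.day))
    low = (age // width) * width
    return f"{low}-{low + width - 1}"

dob = "1980-03-17"
print(redact_dob(dob))              # '****-**-**'  -- useless for age analysis
print(generalize_to_age_band(dob))  # '30-39'       -- still answers "aged 25-45?"
```

The second form is what a masking generalization agreed with the partner might produce: the coffee shop owner can still identify the target demographic, but no individual's birth date leaves the enterprise.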
There is a way to provide data privacy for the customer while simultaneously monetizing enterprise data. To do so, organizations must allow trusted partners to define masking generalizations. With sufficient data masking governance, it is indeed possible for data obfuscation and data value to coexist.
Currently, there is a great deal of research around ensuring that obfuscated data is both protected and useful. Techniques and algorithms like ‘k-Anonymity’ and ‘l-Diversity’ ensure that sensitive data is safe and secure. However, these techniques have not yet become mainstream. Once they do, the value of big data will be unlocked.
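As a rough illustration of the idea behind k-anonymity (the records and quasi-identifier choices below are hypothetical), a dataset is k-anonymous when every combination of quasi-identifier values is shared by at least k records, so no individual is uniquely linkable:

```python
from collections import Counter

def is_k_anonymous(rows, quasi_identifiers, k):
    """True if every combination of quasi-identifier values occurs
    at least k times in the dataset."""
    groups = Counter(tuple(row[q] for q in quasi_identifiers) for row in rows)
    return all(count >= k for count in groups.values())

# Hypothetical records, already generalized (age band + truncated ZIP)
records = [
    {"age_band": "30-39", "zip": "941**", "diagnosis": "flu"},
    {"age_band": "30-39", "zip": "941**", "diagnosis": "asthma"},
    {"age_band": "40-49", "zip": "100**", "diagnosis": "flu"},
    {"age_band": "40-49", "zip": "100**", "diagnosis": "diabetes"},
]

print(is_k_anonymous(records, ["age_band", "zip"], k=2))  # True
print(is_k_anonymous(records, ["age_band", "zip"], k=3))  # False
```

l-Diversity goes a step further: each of those groups must also contain at least l distinct sensitive values (here, each group has two different diagnoses), so that membership in a group does not itself reveal the sensitive attribute.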
Data can also be compared to solar power. Like solar power, data is abundant. In addition, it’s getting cheaper and more efficient to harness. This image captures the current sentiment around data’s potential to improve our lives in many ways. For this to happen, however, corporations and data custodians must effectively balance the power of data with security and privacy concerns.
Many people have a preconception of security as an obstacle to productivity. Actually, good security practitioners understand that the purpose of security is to support the goals of the company by allowing the business to innovate and operate more quickly and effectively. Think back to the early days of online transactions; many people were not comfortable banking online or making web purchases for fear of fraud and theft. Similar fears slowed early adoption of mobile phone banking and purchasing applications. But security ecosystems evolved, concerns were addressed, and Gartner estimates that worldwide mobile payment transaction values surpassed $235B in 2013. An astute security executive once pointed out why cars have brakes: not to slow us down, but to allow us to drive faster, safely.
The pace of digital change and the proliferation of data are not simple linear functions; they are growing exponentially, and they are not going to slow down. I believe this is generally a good thing. Our ability to harness data is how we will better understand our world. It’s how we will address challenges with critical resources such as energy and water. And it’s how we will innovate in research areas such as medicine and healthcare. And so, as a relatively new Informatica employee coming from a security background, I’m now at a crossroads of sorts. While Informatica’s goal of “Putting potential to work” resonates with my views and helps customers deliver on the promise of this data growth, I know we need to have proper controls in place. I’m proud to be part of a team building a new intelligent, context-aware approach to data security (Secure@Source™).
We recently announced Secure@Source™ during Informatica World 2014. One thing that impressed me was how quickly attendees (many of whom have little security background) understood how they could leverage data context to improve security controls, privacy, and data governance for their organizations. You can find a great introductory summary of Secure@Source™ here.
I will be sharing more on Secure@Source™ and data security in general, and would love to get your feedback. If you are an Informatica customer and would like to help shape the product direction, we are recruiting a select group of charter customers to drive and provide feedback for the first release. Customers who are interested in being a charter customer should register and send email to SecureCustomers@informatica.com.
About 15 or so years ago, some friends of mine called me to share great news. Their dating relationship had become serious and they were headed toward marriage. After a romantic proposal and a beautiful ring, it was time to plan the wedding and invite the guests.
This exciting time was confounded by a significant challenge. Though they were very much in love, one of them had an incredibly tough time making wise financial choices. During the wedding planning process, the financially astute fiancée grew concerned about the problems the challenged partner could bring. Even though the financially illiterate fiancée had every other admirable quality, the finance issue nearly created enough doubt to end the engagement. Fortunately, my friends moved forward with the ceremony, were married and immediately went to work on learning new healthy financial habits as a couple.
Let’s segue into how this relates to telecommunications and data, specifically to your average communications operator. Just like a concerned fiancée, you’d think twice about making a commitment to an organization that didn’t have a strong foundation.
Like the financially challenged fiancée, the average operator has a number of excellent qualities: functioning business model, great branding, international roaming, creative ads, long-term prospects, smart people at the helm and all the data and IT assets you can imagine. Unfortunately, despite the externally visible bells and whistles, over time they tend to lose operational soundness around the basics. Specifically, their lack of data quality causes them to forfeit an ever increasing amount of billing revenue. Their poor data costs them millions each year.
A recent set of engagements highlighted this phenomenon. A small carrier (3-6 million subscribers) that implements a more consistent, unified way to manage core subscriber profile and product data could recover underbilling of $6.9 million annually. A larger carrier (10-20 million subscribers) could recover $28.1 million every year by fixing billing errors. (This doesn’t even cover the large Indian and Chinese carriers, which have over 100 million customers!)
Typically, a billing error starts with an incorrect setup of a service line item’s base price and its 30+ related discount line variances. Next, the wrong service discount item is applied at contract start. If that does not happen (or in addition to it), an error occurs when the customer calls in during, or right before the end of, the first contract period (12-24 months) to complain about service quality, bill shock, etc. Here, the call center rep will break an existing triple-play bundle by deleting an item and setting up a separate non-bundle service line item at a lower price (higher discount). The head of billing actually told us, “Our reps just give a residential subscriber a discount of $2 for calling us.” It’s even higher for commercial clients.
To make matters worse, this change will trigger misaligned (incorrect) activation dates or even bill duplication, all of which will have to be fixed later by multiple staff on the BSS and OSS side or may even trigger an investigation project by the revenue assurance department. Worst case, the deletion of the item from the bundle (especially for B2B clients) will not terminate the wholesale cost the carrier still owes a national carrier for a broadband line, which often is 1/3 of the retail price for a business customer.
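As a back-of-the-envelope model of the leakage described above, where every number is an assumption rather than a figure from the engagements, recurring ad-hoc discounts alone can land in the same ballpark as the recovery figures cited:

```python
# Hypothetical model: all inputs below are illustrative assumptions.
subscribers = 5_000_000            # small carrier, mid-range of 3-6M
share_receiving_discount = 0.05    # 5% call in and get an ad-hoc credit
monthly_discount = 2.00            # the $2 residential discount quoted above
months_on_account = 12             # a recurring credit that is never removed

annual_leakage = (subscribers * share_receiving_discount
                  * monthly_discount * months_on_account)
print(f"${annual_leakage / 1e6:.1f}M per year")  # -> $6.0M per year
```

A modest per-subscriber credit, applied to a small fraction of the base and left on the account, already approaches the $6.9 million annual recovery estimated for a small carrier, before counting duplicate bills, wrong activation dates, or unterminated wholesale costs.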
To come full circle to my initial “accounting challenged” example: would you marry (invest in) this organization? Do you think this can or should be solved in a big bang approach or incrementally? Where would you start: product management, the service center, residential or commercial customers?
Observations and illustrations contained in this post are estimates only and are based entirely upon information provided by the prospective customer and on our observations and benchmarks. While we believe our recommendations and estimates to be sound, the degree of success achieved by the prospective customer is dependent upon a variety of factors, many of which are not under Informatica’s control and nothing in this post shall be relied upon as representative of the degree of success that may, in fact, be realized and no warranty or representation of success, either express or implied, is made.
“What really matters about big data is what it does. Aside from how we define big data as a technological phenomenon, the wide variety of potential uses for big data analytics raises crucial questions about whether our legal, ethical, and social norms are sufficient to protect privacy and other values in a big data world.”
These crucial questions, raised in a recent White House report on the implications of big data, frame a growing debate taking place across both society and the business world on how far organizations can push the limits with data collection and analysis. The report, issued by a presidential commission tasked with assessing big data’s privacy implications, explains how big data is a double-edged sword. While big data analytics pave the way to unexpected discoveries, innovations, and advancements in our quality of life, it also has the potential for abuse. As the report puts it, big data’s capabilities, “most of which are not visible or available to the average consumer, also create an asymmetry of power between those who hold the data and those who intentionally or inadvertently supply it.”
The report’s authors acknowledge that big data analytics is an engine of economic growth and a competitive tool for companies across all industries, as well as a tool for quality of life. “Used well, big data analysis can boost economic productivity, drive improved consumer and government services, thwart terrorists, and save lives,” the report states. In addition, there will likely be a profound impact as data analytics gets applied to the Internet of Things, which “have made it possible to merge the industrial and information economies.” In another example, healthcare providers and payers can employ predictive analytics to detect fraud and abuse in real time.
The report’s main thrust is personal privacy implications, and many of these issues will inevitably shape the practices and policies of enterprises as they expand their businesses into the big data realm. The managers and professionals charged with identifying, collecting and analyzing information assets will increasingly be under pressure – as their organizations feel pressure – to understand the boundaries between insight, targeted engagement, and overreach.
For example, a still relatively unexplored area of big data is its ownership. Does data belong to those who collect it, or those who contribute to it? “Big data may be viewed as property, as a public resource, or as an expression of individual identity,” the report states.
Another challenge is the fact that many organizations will opt to assemble massive databases as they move forward with big data analysis. “Big data technologies can derive value from large data sets in ways that were previously impossible — indeed, big data can generate insights that researchers didn’t even think to seek.” For example, new tools and technologies provide for analysis across entire data sets, versus extracting a small representative subset of the data and extrapolating any results against a larger universe. However, with so much data, analysis may potentially be erroneous as well. “Correlation still doesn’t equal causation,” the report’s authors state. “Finding a correlation with big data techniques may not be an appropriate basis for predicting outcomes or behavior, or rendering judgments on individuals. In big data, as with all data, interpretation is always important.”
Another issue is the permanence of data – which also is a privacy issue. At the same time, this may also create headaches for corporate data managers as well. “In the past, retaining physical control over one’s personal information was often sufficient to ensure privacy,” the report states. “Documents could be destroyed, conversations forgotten, and records expunged. But in the digital world, information can be captured, copied, shared, and transferred at high fidelity and retained indefinitely. Volumes of data that were once unthinkably expensive to preserve are now easy and affordable to store on a chip the size of a grain of rice. As a consequence, data, once created, is in many cases effectively permanent. Furthermore, digital data often concerns multiple people, making personal control impractical.”
The report’s authors state that organizations need to take steps to address privacy issues, and suggest de-identification and encryption as technical solutions that are available at this time. However, in the long run, de-identification is still a weak approach to the problem. “Many technologists are of the view that de-identification of data as a means of protecting individual privacy is, at best, a limited proposition. In practice, data collected and de-identified is protected in this form by companies’ commitments to not re-identify the data and by security measures put in place to ensure those protections.”
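The report's skepticism about de-identification is easy to demonstrate. In this minimal sketch (the identifier values are hypothetical), a naively pseudonymized field is re-identified with a simple dictionary attack, which is one reason de-identification alone is considered a limited protection:

```python
import hashlib

def pseudonymize(value):
    """Naive de-identification: replace a value with its SHA-256 hash."""
    return hashlib.sha256(value.encode()).hexdigest()

# The "de-identified" record still carries a stable token...
token = pseudonymize("555-12-3456")

# ...so anyone holding a list of candidate values (here, guessed SSNs)
# can re-identify it by hashing each candidate and comparing tokens.
candidates = ["555-98-7654", "555-12-3456"]
reidentified = next((c for c in candidates if pseudonymize(c) == token), None)
print(reidentified)  # '555-12-3456' -- the de-identification is defeated
```

Defenses such as salting, tokenization with access-controlled vaults, or randomized masking make this attack harder, which is why the report pairs de-identification with commitments not to re-identify and with security controls around the data.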
Ultimately, the best methods to ensure the ethical use of data need to come through inspired and forward-thinking management. It takes judicious management, a commitment to training and education, and a focus on what nuggets of information matter the most to the business. Big data opens up many new vistas for enterprises, and those that take the high road will reap its rewards.
- A loss of customer trust
- Revenue shortfalls
- A plummeting stock price
- C-level executives losing their jobs
As a result, data security and privacy have become a key topic of discussion, not just in IT meetings, but in the media and the boardroom.
Preventing access to sensitive data has become more complex than ever before. There are new potential entry points that IT never previously considered. These new entry points go beyond typical BYOD user devices like smartphones and tablets. Today’s entry points can be much smaller: things like HVAC controllers, office conference phones and temperature control systems.
So what can organizations do to combat this increasing complexity? Traditional data security practices focus on securing both the perimeter and the endpoints. However, these practices are clearly no longer working and no longer manageable. Not only is the number and type of devices expanding, but the perimeter itself is no longer present. As companies increasingly outsource, off-shore and move operations to the cloud, it is no longer possible to fence the perimeter and keep intruders out. Because third parties often require some form of access, even trusted user credentials may fall into the hands of malicious intruders.
Data security requires a new approach. It must use policies to follow the data and to protect it, regardless of where it is located and where it moves. Informatica is responding to this need. We are leveraging our market leadership and domain expertise in data management and security. We are defining a new data security offering and category. This week, we unveiled our entry into the Data Security market at our Informatica World conference. Our new security offering, Secure@Source™ will allow enterprises to discover, detect and protect sensitive data.
The first step towards protecting sensitive data is to locate and identify it. So Secure@Source™ first allows you to discover where all the sensitive data in the enterprise is located and to classify it. As part of the discovery, Secure@Source™ also analyzes where sensitive data is being proliferated, who has access to the data, who is actually accessing it, and whether the data is protected or unprotected when accessed. Secure@Source™ leverages Informatica’s PowerCenter repository and lineage technology to perform a quick first-pass discovery, with more in-depth analysis and profiling over time. The solution allows you to determine the privacy risk index of your enterprise and to slice and dice the analysis by region, department, organization hierarchy, and data classification.
The longer-term vision of Secure@Source™ is to let you detect suspicious usage patterns and orchestrate the appropriate data protection method, such as: alerting, blocking, archiving and purging, dynamically masking, persistently masking, encrypting, and/or tokenizing the data. The data protection method will depend on whether the data store is a production or non-production system, and whether you would like to de-identify sensitive data for all users or only for some. All can be deployed based on policies. Secure@Source™ is intended to be an open framework for aggregating data security analytics and will integrate with key partners to provide comprehensive visibility into, and assessment of, an enterprise’s data privacy risk.
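As a general illustration of the discovery-and-classification technique, and emphatically not of how Secure@Source™ itself is implemented, a minimal pattern-based classifier scans text fields and reports which sensitive-data classes they contain:

```python
import re

# Illustrative classifiers only; real discovery tools combine patterns with
# metadata, lineage, and validation logic far beyond simple regexes.
CLASSIFIERS = {
    "SSN":   re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "CARD":  re.compile(r"\b(?:\d[ -]?){15}\d\b"),
}

def classify(text):
    """Return the set of sensitive-data classes detected in a text field."""
    return {label for label, pattern in CLASSIFIERS.items() if pattern.search(text)}

row = "Contact jane.doe@example.com, SSN 123-45-6789"
print(sorted(classify(row)))  # -> ['EMAIL', 'SSN']
```

Running such classifiers across every column of every data store, and then joining the hits with lineage and access information, is what turns raw discovery into the risk analysis described above.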
Secure@Source™ is targeted for beta at the end of 2014 and general availability in early 2015. Informatica is recruiting a select group of charter customers to drive and provide feedback for the first release. Customers who are interested in being a charter customer should register and send email to SecureCustomers@informatica.com.
At the Informatica World 2014 pre-conference, the “ILM Day” sessions were packed, with over 100 people in attendance. This attendance reflects the strong interest in data archive, test data management and data security. Customers were the focus of the panel sessions today, taking center stage to share their experiences, best practices and lessons learned from successful deployments.
Both the test management and data archive panels had strong audience interest and interaction. For Test Data Management, the panel topic was “Agile Development by Streamlining Test Data Management”; for data archive, the session tackled “Managing Data Growth in the Era of Application Consolidation and Modernization”. The panels provided practical tactics and strategies to address the challenges and issues in managing data growth, and how to efficiently and safely provision test data. Thank you to the customers, partners and analysts who served on the panels; participating was EMC, Visteon, Comcast, Lowes, Tata Consultancy Services and Neuralytix.
The day concluded with a most excellent presentation from the ILM General Manager, Amit Walia, and the CTO of the International Association of Privacy Professionals, Jeff Northrop. Amit provided an executive preview of Tuesday’s Secure@Source™ announcement, while Jeff provided a thought-provoking market backdrop on the issues and challenges of data privacy and security, and on how the focus of information security needs to shift to a ‘data-centric’ approach.
A very successful event for all involved!
Data security breaches continue to escalate. Privacy legislation and enforcement are tightening, and analysts have begun making dire predictions about cyber security’s effectiveness. But there is more: trusted insiders continue to be the major threat, and most executives cannot identify the information they are trying to protect.
Data security is a senior management concern, not exclusive to IT. With this in mind, what is the next step CxOs must take to counter these breaches?
A new approach to Data Security
It is clear that a new approach is needed. This approach should focus on answering fundamental, but difficult and precise, questions about your data:
- What data should I be concerned about?
- Can I create re-usable rules for identifying and locating sensitive data in my organization?
- Can I do so both logically and physically?
- What is the source of the sensitive data and where is it consumed?
- What are the sensitive data relationships and proliferation?
- How is it protected? How should it be protected?
- How can I integrate data protection with my existing cyber security infrastructure?
The answers to these questions will help guide precise data security measures in order to protect the most valuable data. The answers need to be presented in an intuitive fashion, leveraging simple, yet revealing graphics and visualizations of your sensitive data risks and vulnerabilities.
At Informatica World 2014, Informatica will unveil its vision to help organizations address these concerns. This vision will assist in the development of precise security measures designed to counter the growing sophistication and frequency of cyber-attacks, and the ever present danger of rogue insiders.
Stay tuned, more to come from Informatica World 2014.
- The RSA conference took place in San Francisco from February 24-28, 2014
- The IAPP Global Privacy Summit took place in Washington, DC from March 5-7, 2014
Data Privacy at the 2014 RSA Conference
The RSA conference was busy as expected, with over 30,000 attendees. Informatica co-sponsored an after-hours event with one of our partners, Imperva, at the Dark Circus. The event was standing room only and provided a great escape from the torrential rain. One highlight of RSA, for Informatica, was that we were honored with two of the 2014 Security Products Guide Awards:
- Informatica Dynamic Data Masking won the Gold Award for Database Security, Data Leakage Prevention/Extrusion Prevention
- Informatica Cloud Test Data Management and Security won the Bronze Award for New Products
Of particular interest to us was the growing recognition of data-centric security and privacy at RSA. I briefly met Bob Rudis, co-author of “Data Driven Security,” which was featured at the onsite bookstore. In the book, Rudis presents a great case for focusing on data as the center point of security, through data analysis and visualization. From Informatica’s perspective, we also believe that a deep understanding of data and its relationships will escalate as a key driver of security policies and measures.
Data Privacy at the IAPP Global Privacy Summit
The IAPP Global Privacy Summit was an amazing event: small (2,500 attendees), but completely sold out and overflowing its current venue. We exhibited and had the opportunity to meet CPOs, privacy, risk/compliance and security professionals from around the world, and had hundreds of conversations about the role of data discovery and masking in privacy. From the privacy perspective, it is all about finding, de-identifying and protecting PII, PCI and PHI. These privacy professionals have extensive legal and/or data security backgrounds and understand the need to safeguard privacy by using data masking. Several notable themes were present at IAPP:
- De-identification is a key topic area
- Concerns about outsourcing and contractors in application development and testing have driven test data management adoption
- No national US privacy regulations expected in the short-term
- Europe has active but uneven privacy enforcement (France: “name and shame”; UK: heavy fines; Spain: most active)
If you want to learn more about data privacy and security, you will find no better place than Informatica World 2014. There, you’ll learn about the latest data security trends, see updates to Informatica’s data privacy and security offerings, and find out how Informatica protects sensitive information in real time without requiring costly, time-consuming changes to applications and databases. Register TODAY!