Category Archives: Data Security
Throughout the RSA conference this week, there was a steady drumbeat calling out the need to build a security mindset in an organization. Many breaches are caused by people making mistakes in the workplace. How can you stop breaches caused by the human factor? It is all about increasing awareness and actively making an effort to build security mindedness into everything we do.
During one RSA breakout session entitled How One Smart Phone Picture Can Take Down Your Company, Dr. Larry Ponemon, Founder of the Ponemon Institute, described how a hacker really needs only one piece of valuable information to unlock a large-scale data breach, which can be obtained by snapping a picture of log-in credentials on a screen or by other low-tech means. In his research report, Visual Hacking Experimental Study, he cites how ‘certain situations are more risky. Documents on vacant desks and data visible on computer screens are most likely to be hacked.’ This research report was sponsored by 3M, which makes sense since they sell privacy screens for computers, iPads and iPhones.
What is really needed is to make teams aware of the risk and vulnerabilities through education and training, through policy definitions and enforcement, and through constant reminders from leadership.
One startup company, Apozy, took a novel approach, using gamification to incentivize employees to incorporate best practices into their day-to-day routines. Informatica’s own CISO, Bill Burns, is using an internal competition between departments to motivate management to incorporate best practices.
While we continue to invest in technology to automate the implementation and enforcement of policies through controls, we also need to look at who we are hiring and incorporating the security conversation into the on-boarding process.
When recruiting, look to colleges and universities that offer courses and degrees in cybersecurity. (Check out the Ponemon Institute 2014 Best Schools for Cybersecurity.) Arnold Federbaum, Adjunct Professor of Cyber Security at the NYU School of Engineering, discusses Data Security Culture and Higher Education in a panel video recorded during the Informatica Secure@Source product launch.
If you are unable to view the video, click here.
Even the IRS has great training videos and podcasts to build awareness on potential risks of identity theft.
As we continue to see more data breach related news, it will be important to emphasize security mindedness in an organization’s culture, to build policies that make sense and have the appropriate level of enforcement, and, if it is critical to your business, to prioritize hiring those with a formal education and background in cybersecurity.
In an RSA Conference session entitled IAPP: Engineering Privacy: Why Security Isn’t Enough, Sagi Leizerov, E&Y’s Privacy Practice leader began with a plea:
‘We need effective ways to bring together privacy and security controls in an automated way.’
Privacy professionals, according to Sagi, essentially need help in determining the use of information – which is a foundational definition of data privacy. Security tools and controls can provide the information necessary to perform that type of investigation conducted by privacy officers. Yet as data proliferates, are the existing security tools truly up for the task?
In other sessions, such as A Privacy Primer for Security Officers, many speakers claimed that Data Security projects get prioritized as a result of the need to comply with Data Privacy policies and legislation.
We are in an age where data proliferation is one of the major sources of pain for both Chief Information Security Officers and Chief Privacy and Risk Officers (CPO/CRO). Business systems that were designed to automate key business processes store sensitive and private information and are primary sources of data for business analytics. As more business users want to access data to understand the state of their businesses, data naturally proliferates: it is copied into spreadsheets and presentations, emailed in and out of the corporate network, and potentially stored in a public cloud storage offering.
Even though the original intention for using this information was likely all above board, one security violation could potentially open up a can of worms, allowing nefarious characters to take advantage of this data with malicious intent. Jeff Northrop, the CTO of the International Association of Privacy Professionals (IAPP), suggests we need to close the gap between security and privacy in a panel discussion with Larry Ponemon, founder of the Ponemon Institute.
Sagi concluded his session by stating ‘Be a voice of change in your organization. Pilot products, be courageous, give new ideas a chance.’ In the recent launch of Informatica Secure@Source, we discuss the need for more alignment between security and privacy teams and the industry seems to agree. Congratulations to the Informatica Secure@Source development team for their recent announcement of winning Gold Medal in the New Product and Service Category at the Info Security Products Guide 2015 Global Excellence Awards!
For more on the importance of Data Security Intelligence in Privacy, watch Larry Ponemon, Founder of the Ponemon Institute and Jeff Northrop, CTO IAPP discuss this topic with Arnold Federbaum, former CISO and Adjunct Professor, NYU, and Linda Hewlett, Sr Enterprise Security Architect, Santander Holdings USA.
If you are unable to view the video, click here.
Each year, many RSA exhibitors vie for the prestigious Info Security Products Guide’s security awards. Informatica has fared well in past years with our data-centric security solutions, earning gold, silver and bronze awards as well as Most Innovative Security Product of the Year. These awards were for our dynamic and cloud masking products.
On the heels of our April 8 Secure@Source announcement, we were cautiously optimistic that the 50 judges who rank entries would find this solution innovative, valuable and contemporary to the needs of information security. We were honored to receive the Gold award for the “New Product and Services” category.
While these awards represent a significant achievement for Informatica, they more importantly highlight the growing recognition of data-centric security. They echo what our customers, partners and advisors have told us; improving information security requires focus on the data itself.
While encryption, APT protection and network security analytics dominate the RSA show floor, data-centric security is creeping into the pitches of many exhibitors and is cited in many of the session presentations. With their utilization of Informatica’s data-centric security solution, our customers have certainly been early adopters and innovators in data-centric security, understanding its importance and benefits before the masses have recognized the need.
So what does Secure@Source offer our clients? To summarize, it analyzes sensitive data and determines its risk. There is much more to the story, though. First, understanding sensitive data risk requires a solution that can classify and discover sensitive data in its many combinations and permutations. Solutions for sensitive data discovery have been on the market for some time. However, complex data structures, relationships and proliferation mandate high-precision, rules-based discovery. Informatica’s 20 years of understanding, integrating and managing data from disparate sources provide a distinct advantage for this capability.
Second, for accurate risk scoring, sensitive data protection, value, size and proliferation must be precisely determined so that sensitive data risk scores have meaning and relevance to the organization. With locations and risk scores, organizations can identify their data security priorities and align network and host security accordingly. Informatica brings to bear its heritage in all things data to provide powerful analytics and scoring, delivering abstracted views for decision makers and detailed views for practitioners.
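The scoring idea can be illustrated with a toy calculation. This is a hypothetical sketch, not Secure@Source's actual model: it simply assumes a score that grows with record count, value and proliferation, and is discounted when the data is protected.

```python
# Hypothetical sensitive-data risk score: NOT Secure@Source's actual model.
# Assumes risk grows with value, volume, and proliferation, and is
# reduced when protection (e.g., masking or encryption) is in place.
import math

def risk_score(records, value_per_record, copies, protected):
    exposure = records * value_per_record          # raw value at stake
    spread = 1 + math.log10(max(copies, 1))        # proliferation multiplier
    protection_factor = 0.1 if protected else 1.0  # protected data scores lower
    return exposure * spread * protection_factor

# Two data stores holding the same million records:
unprotected = risk_score(1_000_000, 0.04, copies=10, protected=False)
masked = risk_score(1_000_000, 0.04, copies=10, protected=True)
assert unprotected > masked  # protection lowers the priority score
```

A real scoring model would weigh many more signals (access patterns, privileged accounts, regulatory classification), but even a toy like this shows how scores let decision makers rank data stores rather than guess.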
Finally, Secure@Source provides a repeatable and automated solution for sensitive data risk, replacing time consuming and manual audits and surveys, and replacing disparate tools. Risk assessment is timely, repeatable, accurate and auditable.
The need for data security intelligence and for data protection beyond encryption is emerging in organizations that understand that, while protecting the network perimeter and host security are important, the focus must be on the target of breaches: the data. Understand it, prioritize by risk and implement the controls that reduce risk. The combination of data-centric security with traditional security will significantly improve what organizations urgently need: risk reduction and breach resiliency.
This week at the RSA conference, Informatica’s CISO Bill Burns presented to a packed room of security professionals, coaching new and aspiring CISOs on which battles to fight by changing their frame of reference. This was, in my opinion, one of the most useful sessions of the day. Bill’s practical advice and insights made a lot of sense. Here are the top ideas I took away from the presentation.
The role of the CISO, at the end of the day, is to raise the bar of an organization’s security posture and leave it in a better place than when they arrived. With this as the context of his advice, he continued to review frames of reference a CISO should have when fighting for budget, resources, and mindshare.
Risk vs Threat
Focus on what you can control. You don’t know when the next zero day will be. You can’t predict when an attack will happen, but you can prepare and reduce the impact in the event of an attack. Conduct vulnerability assessments and change the conversation to things you can do.
Data vs Opinion
Use a data-driven approach to drive fact-based conversations. Use the scientific method: propose a hypothesis, experiment, conduct A/B tests, measure results, and prove or disprove your hypothesis. Make decisions to improve security based on the data, and repeat. For example, test which message works with your end users. Send two emails with a security message, one that focuses on compliance and another that focuses on best practices that are the right thing to do. See which email users respond to and use that message.
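That email A/B test can be evaluated with a standard two-proportion significance test. The click counts below are invented purely for illustration:

```python
# Two-proportion z-test for an email A/B test (illustrative numbers).
# Email A: compliance-focused message; Email B: best-practice message.
from math import sqrt

def two_proportion_z(clicks_a, sent_a, clicks_b, sent_b):
    p_a, p_b = clicks_a / sent_a, clicks_b / sent_b
    p_pool = (clicks_a + clicks_b) / (sent_a + sent_b)   # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    return (p_b - p_a) / se                              # z-statistic

z = two_proportion_z(clicks_a=40, sent_a=500, clicks_b=70, sent_b=500)
# |z| > 1.96 rejects "no difference" at the 5% level
print(f"z = {z:.2f}")
```

With these made-up numbers the z-statistic is about 3.0, so the best-practice message would win decisively; with a smaller gap you would keep testing before standardizing on a message.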
Relationships vs Transactions
Build relationships with your peers inside and outside the organization, take them out to lunch and ask them about their business. Remove subjectivity and opinions in your dialogue by leveraging 3rd party data and information from peers. For example, leverage your relationships and knowledge bases outside your organization to collect input on salaries, budgets, product reviews, successful training programs, feedback and your own sanity. Use that as part of your dialogue with your internal constituents to increase your relevance to their world while avoiding being viewed as transactional.
Business Impact vs Disruption
Speak to the business impact. Security can be a competitive advantage and it is a ‘must do’. Talk about the potential threat by looking at what happened to competitors and ask, what if that happened here? How would it disrupt our business? And have an answer at the ready, ‘My analysis shows that we could improve here versus there’. Connect the dots for the business.
Systems and Programs vs Tasks
Looking at all of the tasks that need to be completed can be daunting. Rather than focusing on the list of patches that need to be applied (you have to do that anyway), focus on the configuration management process and measure process improvements. Measure things like time to closure, not so much the number of tasks.
For more information on Bill Burns’ recommendations and presentation, visit his session link.
To hear more about the changing role of the CISO, watch Larry Ponemon, Founder of the Ponemon Institute and Jeff Northrop, CTO IAPP discuss this topic with Arnold Federbaum, former CISO and Adjunct Professor, NYU, and Linda Hewlett, Sr Enterprise Security Architect, Santander Holdings USA.
If you are unable to view the video, click here.
Informatica announced Secure@Source last week, unveiling the industry’s first data security intelligence offering. At a time when Not Knowing Where Sensitive and Confidential Data Reside has been the number one thing keeping security professionals up at night for two years in a row, according to the Ponemon Institute, the timing seems right for a capability, such as Data Security Intelligence, that gives line of sight to obscured threats.
Neuralytix conducted market research, entitled The Future State of Data Security Intelligence, in which they define data security intelligence (DSI) as a framework for understanding the risk of sensitive or confidential data and recommending the optimal set of controls to mitigate that risk. DSI comprises technology that provides the definition, classification, discovery, and assessment phases of a data-centric security approach. They state:
By deploying data security intelligence in combination with data security controls, enterprises can gain active insight into where risks exist and proactively set controls to mitigate the impact in the event of a data breach.
The Enterprise Strategy Group further commented in the report, “Data‐centric Security: A New Information Security Perimeter”, authored by industry expert, Jon Oltsik:
To address modern threats and IT mobility, CISOs must adopt two new security perimeters around identity attributes and data-centric security. In this regard, sensitive data must be continuously monitored for situational awareness and risk management.
This launch precedes the security industry’s equivalent of the NFL’s Super Bowl: the RSA Conference, where the world talks security. Informatica will be there, debuting its first Data Security Intelligence offering, Secure@Source. The team should be so proud; this is by far one of the coolest products I have had the opportunity to be a part of. Here is a brief blurb on what Secure@Source is and does:
Secure@Source discovers, analyzes and visualizes data relationships, proliferation and sensitivity, detailing data risks and vulnerabilities so that data protection and monitoring can be focused on securing data from external breaches and insider abuse. Secure@Source leverages proven data integration and quality capabilities to provide integrated views of data, independent of platform, from legacy, cloud, big data and mobile environments.
Secure@Source provides granular detail on which data has value, where the data resides, how it traverses the enterprise, and how it should be protected. Informatica leverages market-leading technology for data discovery and profiling, protection and retirement, and innovative analysis and visualizations for monitoring data security in real time.
At a conference where the world talks security, I’m looking forward to engaging in conversations with you about getting smarter about Data Security Intelligence and eliminating blind spots. See you at the venue, April 20-24, South Hall, Booth No. 2626.
Original article is posted at techcrunch.com
It’s probably no surprise to the security professional community that once again, identity theft is among the IRS’s Dirty Dozen tax scams. Criminals use stolen Social Security numbers and other personally identifiable information to file tax claims illegally, deposit the tax refunds to rechargeable debit cards, and vanish before the average citizen gets around to filing.
Since the IRS began publishing its “Dirty Dozen” list to alert filers to the worst tax scams, identity theft has topped the list every year since 2011. In 2012, the IRS implemented a preventive measure to catch fraud before actually issuing refunds, and issued more than 2,400 enforcement actions against identity thieves. With an aggressive campaign to fight identity theft, the IRS saved over $1.4 billion in 2011 and over $63 billion since October 2014.
That’s great progress, but consider that of the 117 million taxpayers who filed electronically in 2014, 80 million received an average of $2,851 deposited directly into their bank accounts: roughly $228 billion changing hands electronically. The pessimist in me has to believe that cyber criminals are already plotting how to nab more Social Security numbers and e-filing logins to tap into that big pot of gold.
So where are criminals getting the data to begin with? Any organization that has employees and a human resources department collects and possibly stores Social Security numbers, birthdays, addresses and income either on-premises or in a cloud HR application. This information is everything a criminal would need to fraudulently file taxes. Any time a common business process is digitally transformed, or moved to the cloud, the potential risk of exposure increases.
As the healthcare industry transforms to electronic health and patient records, another abundant source of Social Security numbers and personally identifiable information increases the surface area of opportunity. When you look at the abundance of Social Security numbers stolen in major data breaches, as in the case of Anthem, you start to connect the dots.
One of my favorite dynamic infographics comes from the website Information is Beautiful, entitled ‘World’s Biggest Data Breaches.’ When you filter the data based on number of records versus sensitivity, the size of the bubbles indicates the severity. Even though the sensitivity score appears to be somewhat arbitrary, it does provide one way to assess severity based on the type of information that was breached:
- Just email address/online information: 1
- Credit card information: 300
- Email password/health records: 4,000
- Full bank account details: 50,000
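With those weights, a breach's severity can be approximated as records multiplied by the sensitivity weight, which is roughly what the bubble sizing reflects. A quick sketch, with the breach record counts invented for illustration:

```python
# Severity as records x sensitivity weight. The weights come from the
# infographic above; the breach sizes below are made up for illustration.
SENSITIVITY = {
    "email/online info": 1,
    "credit card": 300,
    "email password/health records": 4_000,
    "full bank account": 50_000,
}

def severity(records, category):
    return records * SENSITIVITY[category]

# A small high-sensitivity breach can outrank a huge low-sensitivity one:
big_email_leak = severity(150_000_000, "email/online info")
small_bank_leak = severity(10_000, "full bank account")
assert small_bank_leak > big_email_leak
```

This is why record count alone is a poor headline metric: 10,000 full bank account records would outweigh 150 million leaked email addresses under this weighting.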
What would be an interesting addition is how many records were sold on the black market that resulted in tax or insurance fraud.
Cyber-security expert Brian Krebs, who was personally impacted by a criminal tax return filing last year, says we will likely see “more phony tax refund claims than last year.” With credentials for TurboTax and H&R Block marketed on black market websites for about 4 cents per identity, it is hard to disagree.
The Ponemon Institute published a survey last year, entitled The State of Data Centric Security. One research finding sticks out: when security professionals were asked what keeps them up at night, more than 50 percent said “not knowing where sensitive and confidential data reside.” As tax season enters full swing, what should security professionals be thinking about?
Data Security Intelligence promises to be the next big thing that provides a more automated and data-centric view into sensitive data discovery, classification and risk assessment. If you don’t know where the data is or its risk, how can you protect it? Maybe with a little more insight, we can at least reduce the surface area of exposed sensitive data.
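As a trivial illustration of what rules-based sensitive-data discovery involves, the sketch below scans text for strings shaped like US Social Security numbers. Real discovery tools combine patterns like this with dictionaries, checksums, and data-relationship context; this is only an assumption-laden toy:

```python
# Toy sensitive-data discovery: flag strings shaped like US SSNs (AAA-GG-SSSS).
# Real classifiers add far more validation and contextual analysis; this only
# excludes the obviously invalid 000/666/9xx area, 00 group, and 0000 serial.
import re

SSN_PATTERN = re.compile(r"\b(?!000|666|9\d{2})\d{3}-(?!00)\d{2}-(?!0000)\d{4}\b")

def find_ssn_candidates(text):
    return SSN_PATTERN.findall(text)

sample = "Employee 123-45-6789 (not 000-12-3456) filed on 2014-04-08."
print(find_ssn_candidates(sample))  # candidates only; still needs review
```

Even this toy shows why precision matters: a date like 2014-04-08 must not be flagged, and pattern matching alone still produces candidates, not confirmed sensitive data.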
What does it take to be an effective Chief Information Security Officer (CISO) in today’s era of massive data breaches? Besides skin as thick as armor and proven experience in security, an effective CISO needs the following qualities:
- A strong grasp of their security program’s capabilities and of their adversaries
- The business acumen to frame security challenges as business opportunities
- An ability to effectively partner and communicate with stakeholders outside of the IT department
- An insatiable appetite to make data-driven decisions and to take smart risks
In order to be successful, a CISO needs data-driven insights. The business needs this too. Informatica recently launched the industry’s first Data Security Intelligence solution, Secure@Source. At the launch event, we shared how CISOs can leverage new insights, gathered and presented by Secure@Source. These insights better equip their security and compliance teams to defend against misconfigurations, cyber-attacks and malicious insider threats.
Data-driven organizations are more profitable, more efficient, and more competitive. An effective CISO ensures the business has the data it needs without introducing undue risk. In my RSA Conference Security Leadership Development session I will share several other characteristics of effective CISOs.
Despite best efforts at threat modeling and security automation, security controls will never be perfect. Modern businesses require data agility, as attack surface areas and risks change quickly. As business users proliferate data beyond the firewall, ensuring that sensitive and confidential data is safe from exposure or breach becomes an enormous task.
Data at rest isn’t valuable if the business can’t use it in a timely manner. Encrypted data may be safe from theft, but needs to be decrypted at some point to be useful for those using the data for predictive analytics. Data’s relative risk of breach goes up as the number of connections, applications, and accounts that have access to the data also increases.
If you have two databases, each with the same millions of sensitive records in them, the system with more applications linked to it and privileged administrative accounts managing it is the one you should be focusing your security investments on. But you need a way to measure and manage your risk with accurate, timely intel.
As Informatica’s CISO, my responsibility is to ensure that our brand is protected, that our customers, stakeholders, and employees trust Informatica — that we are trustworthy custodians of our customers’ most important data assets.
In order to do that, I need to have conviction about where our sensitive assets are, what threats and risks are relevant to them, and have a plan to keep them compliant and safe no matter where the data travels.
Modern security guidance such as the SANS Critical Security Controls and the NIST Cybersecurity Framework both start with “know your assets”: building an inventory and identifying what is most critical to your business. Next, they advise you to form a strategy to monitor, protect, and re-assess relevant risks as the business evolves. In the age of Agile development and security automation, continuous monitoring is replacing batch-mode assessments. Businesses move too fast to measure risk annually or once a quarter.
As Informatica has shifted to a cloud-first enterprise, and as our marketing organization makes data-driven decisions for its customer experience initiatives, my teams ensure we are making data available to those who need it while adhering to international data privacy laws. This task has become more challenging as the volume of data increases, as data is shared between targets, and as requirements become more stringent. Informatica’s Data Security Intelligence solution, Secure@Source, was designed to help manage these activities while making it easier to collaborate with other stakeholders.
The role of the CISO has transformed over time into that of a trusted advisor to the business, which relies on the CISO’s guidance to take smart risks. The CISO provides a lens in business discussions that focuses on technical threats, regulatory constraints, and business risks while ensuring that the business earns and maintains trust with customers. In order to be an effective CISO, it all comes down to the data.
At the recent Bosch Connected World conference in Berlin, Stefan Bungart, Software Leader Europe at GE, presented a very interesting keynote, “How Data Eats the World”, presumably a nod to Marc Andreessen’s statement that “software is eating the world”. One of the key points he addressed was that generating actionable insight from Big Data, securely, in real time, at every level from local to global, and at industrial scale will be the key to survival. Companies that do not invest in data now will eventually end up like the consumer companies that missed the Internet: it will be too late.
As software and the value of data become a larger part of the business value chain, the lines between different industries blur, or as GE’s Chairman and CEO Jeff Immelt once stated: “If you went to bed last night as an industrial company, you’re going to wake up today as a software and analytics company.” This is not only true for industrial companies, but for many companies that produce “things”: cars, jet engines, boats, trains, lawn mowers, toothbrushes, nut runners, computers, network equipment, etc. GE, Bosch, Technicolor and Cisco are just a few of the industrial companies that offer an Internet of Things (IoT) platform, and by doing so they enter the domains of companies such as Amazon (AWS) and Google. As Google and Apple move into new areas such as manufacturing cars and watches and offering insurance, the industry lines are becoming blurred and service becomes the key differentiator. The best service offerings will be contingent upon the best analytics, and the best analytics require a complete and reliable data platform. Only companies that can leverage data will be able to compete and thrive in the future.
The idea of this “servitization” is that instead of selling assets, companies offer services that utilize those assets. For example, Siemens offers hospitals a body-scanning service instead of selling them the MRI scanner, and Philips sells lighting services to cities and large companies, not light bulbs. These business models give suppliers an incentive to minimize disruption and repairs, as these now cost them money. It is also more attractive to put as much device functionality as possible in software, so that upgrades or adjustments can be made without replacing physical components. This is made possible by the fact that all devices are connected, generate data, and can be monitored and managed from another location. The data is used to analyze functionality, power consumption and usage, but can also be used to predict malfunctions, plan proactive maintenance, and more.
So what impact does this have on data and on IT? First of all, the volumes are immense. Whereas the total global volume of, for example, Twitter messages is around 150GB, ONE gas turbine with around 200 sensors generates close to 600GB per day! Yet according to IDC, only 3% of potentially useful data is tagged and less than 1% is currently analyzed. Secondly, the structure of the data is not always straightforward, and even similar devices produce different content (messages) because they can be on different software levels. This has an impact on the backend processing and the reliability of the analysis of the data.
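To put the turbine figure in perspective, a back-of-the-envelope calculation from the numbers quoted above:

```python
# Back-of-the-envelope rates for one gas turbine, using the figures quoted
# above: ~200 sensors producing close to 600GB per day in total.
sensors = 200
gb_per_day = 600

gb_per_sensor_per_day = gb_per_day / sensors      # per-sensor daily volume
mb_per_second = gb_per_day * 1024 / (24 * 3600)   # sustained ingest rate

print(f"{gb_per_sensor_per_day} GB per sensor per day")
print(f"{mb_per_second:.1f} MB/s sustained")
```

That works out to about 3GB per sensor per day and a sustained ingest rate of roughly 7MB/s, from a single machine, which is why backend pipelines have to be designed for continuous high-volume streams rather than occasional batch loads.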
The data also often needs to be put into context with other master data, such as locations or customers, for real-time decision making. This is a non-trivial task. Next, governance is an aspect that needs top-level support. Questions like: Who owns the data? Who may see or use the data? What data needs to be kept or archived, and for how long? These need to be answered and governed in IoT projects with the same priority as the data in more traditional applications.
To summarize, managing data and mastering data governance is becoming one of the most important pillars of companies that lead the digital age. Companies that fail to do so risk becoming the next Blockbuster or Kodak: companies that didn’t adapt quickly enough. To avoid this, companies need to evaluate a data platform that can support a comprehensive data strategy encompassing scalability, quality, governance, security, ease of use and flexibility, and that enables them to choose the most appropriate data processing infrastructure, whether on premises, in the cloud, or, most likely, a hybrid combination of the two.
Security professionals are in dire need of a solution that provides visibility into where sensitive and confidential data resides, as well as visibility into the data’s risk. This knowledge would allow those responsible to take an effective, proactive approach to combating cybercrime. By focusing on the data, Informatica and our customers, partners and market ecosystem are collaborating to make data-centric security with Data Security Intelligence the next line of defense.
Security technologies that focus on securing the network and perimeter require additional safeguards when sensitive and confidential data traverse beyond these protective controls. Data proliferates to cloud-based applications and mobile devices. Application security and identity access management tools may lack visibility and granular control when data is replicated to Big Data and advanced analytics platforms.
Informatica is filling this need with its data-centric security portfolio, which now includes Secure@Source. Informatica Secure@Source is the industry’s first data security intelligence solution that delivers insight into where sensitive and confidential data reside, as well as the data’s risk profile.
Join us at our online launch event on April 8th where we will showcase Secure@Source and share reactions from an amazing panel including:
- Security industry leaders Anil Chakravarthy, CPO and EVP, Informatica, and myself, Amit Walia, GM and SVP, Informatica
- Luminaries Larry Ponemon, Founder Ponemon Institute and Jeff Northrop, CTO IAPP
- CISOs Bill Burns, Informatica, and Arnold Federbaum, former CISO and CyberSecurity Professor, NYU
- Enterprise Security Architect, Linda Hewlett, Santander Holdings USA.
The opportunity for Data Security Intelligence is extensive. In a recently published report, Neuralytix defined Data-Centric Security as “an approach to security that focuses on the data itself; to cover the gaps of traditional network, host and application security solutions.” A critical element for successful data security is collecting intelligence required to prioritize where to focus security controls and efforts that mitigate risk. This is precisely what Informatica Secure@Source was designed to achieve.
From what was a predominantly manual practice, the data security intelligence software market is expected to reach $800M by 2018, growing at a CAGR of 27.8%. We are excited about this opportunity! As a leader in data management software, we are uniquely qualified to take an active role in shaping this emerging market category.
Informatica Secure@Source addresses the need to get smarter about where our sensitive and private data reside and who is accessing it, to prioritize which controls to implement, and to work harmoniously with existing security architectures, policies and procedures. Our customers are asking us for data security intelligence; the industry deserves it. With more than 60% of security professionals stating their biggest challenge is not knowing where their sensitive and confidential data reside, the need for Data Security Intelligence has never been greater.
Neuralytix says “data security is about protecting individual data objects that traverse across networks, in and out of a public or private cloud, from source applications to targets such as partner systems, to back office SaaS applications to data warehouses and analytics platforms”. We couldn’t agree more. We believe that the best way to incorporate a data-centric security approach is to begin with data security intelligence.
JOIN US at the online launch event on April 8th for the security industry’s most exciting new Data Security Intelligence solution, Informatica Secure@Source.
 “The State of Data Centric Security,” Ponemon Institute, sponsored by Informatica, June 2014
In case you haven’t noticed, data integration is all the rage right now. Why? There are three major reasons for this trend that we’ll explore below, but a recent USA Today story focused on corporate data as a much more valuable asset than it was just a few years ago. Moreover, the sheer volume of data is exploding.
For instance, in a report published by the research company IDC, analysts estimated that the total amount of data created or replicated worldwide in 2012 would add up to 2.8 zettabytes (ZB). By 2020, IDC expects the annual data-creation total to reach 40 ZB, which would amount to a 50-fold increase from where things stood at the start of 2010.
But the growth of data is only a part of the story. Indeed, I see three things happening that drive interest in data integration.
First, the growth of cloud computing. The growth of data integration around the growth of cloud computing is logical, considering that we are relocating data to public clouds, where it must be synced with systems that remain on-premises.
The data integration providers, such as Informatica, have stepped up. They provide data integration technology that spans enterprises, managed service providers, and clouds while dealing with the special needs of cloud-based systems. At the same time, data integration improves the way we do data governance and data quality.
Second, the growth of big data. A recent IDC forecast shows that the big data technology and services market will grow at a 26.4% compound annual growth rate to $41.5 billion through 2018, or, about six times the growth rate of the overall information technology market. Additionally, by 2020, IDC believes that line of business buyers will help drive analytics beyond its historical sweet spot of relational to the double-digit growth rates of real-time intelligence and exploration/discovery of the unstructured worlds.
The world of big data revolves around data integration. The more that enterprises rely on big data, and the more that data needs to move from place to place, the more a core data integration strategy and technology is needed. That means you can’t talk about big data without talking about big data integration.
Data integration technology providers have responded with technology that keeps up with the volume of data that moves from place to place. As linked to the growth of cloud computing above, providers also create technology with the understanding that data now moves within enterprises, between enterprises and clouds, and even from cloud to cloud. Finally, data integration providers know how to deal with both structured and unstructured data these days.
Third, better understanding around the value of information. Enterprise managers always knew their data was valuable, but perhaps they did not understand the true value that it can bring.
With the growth of big data, we now have access to information that helps us drive our business in the right directions. Predictive analytics, for instance, allows us to take years of historical data and determine patterns that allow us to predict the future. Mashing up our business data with external data sources makes our data even more valuable.
Of course, data integration drives much of this growth. Thus the refocus on data integration approaches and tech. There are years and years of evolution still ahead of us, and much to be learned from the data we maintain.