With the 2014 holiday season rolling into full swing, Americans will spend more than $600 billion, a 4.1% increase over last year. According to a Credit Union National Association poll, 45% of credit and debit card users will think twice about how they shop and pay, given the tens of millions of shoppers impacted by breaches. Stealing identities is a lucrative pastime for those with ulterior motives: the black market pays between $10 and $12 per stolen record, and when a record is enriched with health data its value climbs as high as $50, because it can be used for insurance fraud.
Are the thieves getting smarter or are we getting sloppy?
With ubiquitous access to technology globally, the general acceptance of online shopping, and the digitization of health records, there is more data online – and more opportunity to steal it – than ever before. Unfortunately for shoppers, 2013 was known as ‘the year of the retailer breach,’ according to Verizon’s 2014 Data Breach Investigations Report. Unfortunately for patients, healthcare providers accounted for the highest percentage of lost protected health data.
So what can we do to be a smarter and safer consumer?
No one wants to bankroll the thieves’ illegal habits. One option would be to regress 20 years: drive to the mall and make our purchases cash in hand, or go back to completely paper-based healthcare. Alternatively, here are a few suggestions to avoid landing on the next list of victims:
1. Avoid irresponsible vendors and providers by being an educated consumer
Sites like the Identity Theft Resource Center and the US Department of Health and Human Services expose the latest breaches in retail and healthcare, respectively. Look up who you are buying from and receiving care from, and make sure they are doing everything they can to protect your data. If an organization didn’t respond to a breach in a timely fashion, tried to hide it, or didn’t implement new controls to protect your data afterward, avoid it. Or take your chances.
2. Expect to be hacked, plan for it
Most organizations you trust with your personal information have already experienced a breach. In fact, according to a recent survey conducted by the Ponemon Institute and sponsored by Informatica, 72% of organizations polled experienced a breach within the past 12 months; more than 20% had two or more breaches in the same timeframe. When setting passwords, avoid words or phrases that you share publicly on Facebook. And when answering security questions, most security professionals suggest that you lie!
3. If it really bothers you, be vocal and engage
Many states are enacting legislation to hold organizations accountable for notifying individuals when a breach occurs. For example, Florida enacted FIPA – the Florida Information Protection Act – on July 1, 2014, stipulating that all breaches, large or small, are subject to notification. For every day a breach goes unreported, FIPA imposes a $1,000 penalty, up to an annual limit of $500,000.
In conclusion, with the holiday shopping season upon us, now is the perfect time to make sure you’re making the best – and most informed – purchasing decisions. You have the ability to take matters into your own hands; keep your data secure this year and every year.
To learn more about Informatica Data Security products, visit our Data Privacy solutions website.
This Magic Quadrant focuses on what Gartner calls structured data archiving. Data archiving is used to index, migrate, preserve and protect application data in secondary databases or flat files, typically located on lower-cost storage, for policy-based retention. It keeps data available in the context of the originating business process or application, which is especially useful in the event of litigation or an audit.
The Magic Quadrant calls out two use cases: “live archiving of production applications” and “application retirement of legacy systems.” Informatica refers to the two together as “Enterprise Data Archiving,” which we consider a foundational component of a comprehensive Information Lifecycle Management strategy.
The application landscape is constantly evolving. For this reason, data archiving is a strategic component of a data growth management strategy. Application owners need a plan to manage data as applications are upgraded, replaced, consolidated, moved to the cloud and/or retired.
Without a plan for production systems, data accumulates in the business application: degrading performance frustrates the business, and data bloat burdens IT operations. Without a plan for legacy systems, applications accumulate in the data center, and the growing budgets frustrate the CFO.
A data growth management plan must include the following:
- How to cycle through applications and retire them
- How to smartly store the application data
- How to ultimately dispose of data while staying compliant
Structured data archiving and application retirement technologies help automate and streamline these tasks.
Informatica Data Archive delivers unparalleled connectivity, scalability, a broad range of innovative options (e.g., Smart Partitioning, Live Archiving, and retiring aging and legacy data to the Informatica Data Vault), and comprehensive retention management, data reporting and visualization. We believe our strengths in this space are the key ingredients for deploying a successful enterprise data archive.
For more information, read the Gartner Magic Quadrant for Structured Data Archiving and Application Retirement.
Oracle DBAs are challenged with keeping mission-critical databases up and running with predictable performance as data volumes grow. Our customers are changing how they proactively manage Oracle performance, while simplifying IT, by leveraging our innovative Data Archive Smart Partitioning features. Smart Partitioning builds on Oracle Database Partitioning, simplifying the deployment and management of partitioning strategies, and DBAs have been able to respond to requests to improve business process performance without writing any custom code or SQL scripts.
With Smart Partitioning, DBAs have a new dialogue with business analysts. Rather than wading into the technology weeds, they ask how many months, quarters or years of data are required to get the job done, and show – within a few clicks – how users can self-select how much data gets processed when they run queries, reports or programs. In essence, users control their own performance by controlling the volume of data they pull from the database.
Smart Partitioning is configured using easily understood business dimensions such as time, company, or business unit. These dimensions make it easy to ‘slice’ data to fit the job at hand, so performance becomes manageable and under business control. Non-production environments benefit too: creating smaller, fully functional subset databases now fits easily into your cloning operations.
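To make the ‘slicing’ idea concrete, here is a minimal in-memory sketch of partitioning rows by a time dimension and scanning only the partitions a query asks for. This is an illustration of the general partition-pruning concept, not Informatica’s or Oracle’s implementation; the table and column names are invented.

```python
from collections import defaultdict
from datetime import date

def quarter_key(d: date) -> tuple:
    """Map a date to its (year, quarter) business dimension."""
    return (d.year, (d.month - 1) // 3 + 1)

class PartitionedTable:
    """Toy table whose rows are grouped into time-based partitions."""

    def __init__(self):
        self.partitions = defaultdict(list)  # (year, quarter) -> rows

    def insert(self, row):
        self.partitions[quarter_key(row["order_date"])].append(row)

    def query(self, keys):
        """Scan only the requested partitions ("pruning"): work done
        scales with the slice the user asked for, not the whole table."""
        scanned, result = 0, []
        for k in keys:
            rows = self.partitions.get(k, [])
            scanned += len(rows)
            result.extend(rows)
        return result, scanned

t = PartitionedTable()
t.insert({"order_date": date(2013, 2, 1), "amount": 10})
t.insert({"order_date": date(2013, 11, 5), "amount": 20})
t.insert({"order_date": date(2014, 1, 15), "amount": 30})

# The business asks for one quarter; older partitions are never touched.
current, scanned = t.query([(2014, 1)])
```

A real deployment would map these slices onto Oracle range partitions, but the payoff is the same: users who request one quarter of data pay only for one quarter of I/O.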
Finally, Informatica has been working closely with the Oracle Enterprise Solutions Group to align Informatica Data Archive Smart Partitioning with the Oracle ZS3 Appliance to maximize performance and savings while minimizing the complexity of implementing an Information Lifecycle Management strategy.
I recently met with a longtime colleague from the Oracle E-Business Suite implementation ecosystem, now VP of IT for a global technology provider. This individual has successfully implemented data archiving and data masking technologies to eliminate duplicate applications and control the costs of data growth, saving tens of millions of dollars. He has freed up resources that were redeployed to new, innovative projects such as Big Data, earning him a reputation as a thought leader. In addition, he has avoided exposing sensitive data in application development activities by securing it with data masking technology – thus securing his reputation as well.
When I asked him about those projects and their impact on his career, he responded, ‘Data archiving and data security are table stakes in the Oracle Applications IT game. However, if I want to be a part of anything important, it has to involve Cloud and Big Data.’ He explained how the savings achieved with Informatica Data Archive let him improve employee retention, because he could fund an exciting Hadoop project that key resources wanted to work on. And by retiring legacy applications and moving from physical infrastructure to virtual servers, he accomplished the first step of his ‘journey to the cloud’ – something that would not have been possible if his data had required technology unsupported in the cloud. Had he not secured sensitive data and then experienced a breach, he would be looking for a new job in a new industry.
Not long after, I attended a CIO summit where the theme was ‘Breakthrough Innovation’. Of course, Cloud and Big Data were main-stage topics – not just the technology itself, but how it is used to solve business challenges and serve the new generation of ‘entitled’ consumers: those who expect to have everything at their fingertips. They want to be empowered to share, or not share, their information. They expect that if you save their personal information, it will not be abused. They may even expect to try a product or service for free before committing to buy.
To measure up to these expectations, application owners, like my longtime colleague, need to incorporate Data Archive and Data Masking into their standard SDLC processes. Without Data Archive, IT budgets may be consumed by supporting old applications and mountains of data, leaving nothing for new, innovative projects. Without Data Masking, a public breach will drive many consumers elsewhere.
Are you interested in Oracle data migration best practices? Are you upgrading, consolidating or migrating to or from an Oracle application? Moving to the cloud or a hosted service? Research and experience confirm that the tasks associated with migrating application data during these initiatives have the biggest impact on whether a project is considered a success or a failure. So how do your peers ensure data migration success?
Informatica will be offering a full-day Oracle Migrations Best Practices workshop at the Oracle Applications Users Group’s annual conference, Collaborate 14, on April 7th in Las Vegas, NV. During this workshop, peers and experts will share best practices for avoiding the pitfalls and ensuring successful projects, lowering migration cost and risk. Our packed agenda includes:
- Free use and trials of data migration tools and software
- Full training sessions on how to integrate cloud-based applications
- How to provision test data using different data masking techniques
- How to ensure consistent application performance during and after a migration
- A review of Oracle Migration Best Practices and case studies
Case Study: EMC
One of the key case studies to be highlighted is EMC’s Oracle migration journey. EMC Corporation migrated to Oracle E-Business Suite, acquired more than 40 companies in 4 years, consolidated and retired environments, and is now on a path to migrating to SAP. Not only did they migrate applications, they also migrated their entire technology platform from physical to virtual on their journey to the cloud. Along the way, they needed to control the impact of data growth and manage the size of their test environments while reducing the risk of exposing sensitive data to unauthorized users during development cycles. With best practices and help from Informatica, they estimate they have saved approximately $45M in IT costs throughout their migrations. Now they are deploying a new analytics platform based on Hadoop, leveraging existing skill sets and Informatica tools to ensure data is loaded into Hadoop without missing a beat.
Case Study: Verizon
Verizon is the second case study we will be discussing. They recently migrated to Salesforce.com and needed to ensure that more than 100 data objects were integrated with on-premises back-end applications. In addition, they needed to ensure that data was synchronized and kept secure in non-production environments in the cloud. They were able to leverage a cloud-based integration solution from Informatica to simplify their complex IT application architecture and maintain data availability and security – all while migrating a major business application to the cloud.
Case Study: OEM Heavy Equipment Manufacturer
The third case study we will review involves a well-known heavy equipment manufacturer facing two challenges: first, the need to separate data in an Oracle E-Business Suite application as a result of a divestiture; second, the need to control the impact of data growth on production application environments undergoing various upgrades. Using an innovative approach based on Smart Partitioning, this enterprise estimates it will save $23M over a 5-year period while achieving 40% performance improvements across the board.
To learn more about what Informatica will be sharing at Collaborate 14, watch this video. If you are planning to attend Collaborate 14 this year and you are interested in joining us, you can register for the Oracle Migrations Best Practices Workshop here.
Informatica announced that, once again, it is positioned as a Leader – this time in the industry’s second Gartner Magic Quadrant for Data Masking Technology. With data security continuing to be one of the fastest-growing segments of the enterprise software market, technologies such as data masking are becoming the solution of choice for data-centric security.
Increased fear of cyber-attacks and internal data breaches has prompted predictions that 2014 will be the year of preventative and tactical measures to ensure corporate data assets are safe. Data masking should be among those measures. According to Gartner,
“Security program managers need to take a strategic approach with tactical best-practice technology configurations in order to properly address the most common advanced targeted attack scenarios to increase both detection and prevention capabilities.”
Without these measures, the cost of an attack or breach is growing every year. The Ponemon Institute posted in a recent study:
“The 2013 Cost of Cyber Crime Study states that the average annualized cost of cybercrime incurred by a benchmark sample of US organizations was $11.56 million, nearly 78% more than the cost estimated in the first analysis conducted 4 years ago.”
Informatica believes the best preventative measures include a layered approach to data security that neither sacrifices agility nor adds unnecessary cost. Data masking delivers data-centric security with improved productivity and reduced overall costs.
Data masking prevents internal data theft and abuse of sensitive data by hiding it from users. Masking techniques include replacing fields with similar-looking characters, masking characters (for example, with “x”), substituting real last names with fictional ones, and shuffling data within columns – to name a few. Other terms for data masking include data obfuscation, sanitization, scrambling, de-identification and anonymization. Call it what you like, but without it, organizations may continue to expose sensitive data to those with malicious intentions.
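The techniques named above can be sketched in a few lines. These are illustrative toy implementations only – not Informatica’s product logic – and the function names, field choices and masking rules are invented for the example:

```python
import hashlib
import random

def mask_characters(value: str, keep_last: int = 4, mask_char: str = "x") -> str:
    """Character masking: hide all but the last few characters,
    e.g. a card number becomes xxxxxxxxxxxx1111."""
    if len(value) <= keep_last:
        return mask_char * len(value)
    return mask_char * (len(value) - keep_last) + value[-keep_last:]

def substitute_last_name(name: str, fictional_names, seed: int = 0) -> str:
    """Substitution: deterministically swap a real last name for a
    fictional one, so the same input always maps to the same output."""
    digest = hashlib.sha256(f"{name}:{seed}".encode()).hexdigest()
    return fictional_names[int(digest, 16) % len(fictional_names)]

def shuffle_column(values, seed: int = 0):
    """Shuffling: permute values within a column so they no longer
    line up with their original rows, while totals stay intact."""
    rng = random.Random(seed)
    shuffled = list(values)
    rng.shuffle(shuffled)
    return shuffled
```

For example, `mask_characters("4111111111111111")` yields `"xxxxxxxxxxxx1111"` – enough to test card-number formatting without exposing a real number. Production masking tools add constraints these sketches ignore, such as preserving referential integrity across tables.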
To learn more, download the Gartner Magic Quadrant for Data Masking report now, and visit the Informatica website for data masking product information.
About the Magic Quadrant
Gartner does not endorse any vendor, product or service depicted in its research publications, and does not advise technology users to select only those vendors with the highest ratings. Gartner research publications consist of the opinions of Gartner’s research organization and should not be construed as statements of fact. Gartner disclaims all warranties, expressed or implied, with respect to this research, including any warranties of merchantability or fitness for a particular purpose.
A data integration hub is a proven vehicle for providing a self-service model in which data is published and subscribed to by a variety of users. Those who deploy these environments for regulated and sensitive data need to think about data privacy and data governance during the design phase of the project.
In a data integration hub architecture, sensitive data arrives from different locations, from a variety of technology platforms, and from systems managed by teams with a wide range of data security skills. How can you ensure data will be protected across such a heterogeneous environment – especially when it traverses national boundaries?
Then think about testing connectivity. If data needs to be validated in a data quality rules engine, truly testing that connectivity requires valid data. However, testers should not have access or visibility into the actual data itself if it is classified as sensitive or confidential.
With a hub-and-spoke model, rules are difficult to enforce when data is requested in one country and received in another, and the opportunity for human error and data leakage increases exponentially. Rather than reading about your breach in the headlines, it makes sense to build in preventative measures – and to spend the time and money to do the right thing – from the onset of the project.
Technologies exist in the market that are easy to implement and designed to prevent this very type of exposure. That technology is data masking, which encompasses data obfuscation, encryption and tokenization. Informatica’s Data Privacy solution, based on persistent and dynamic data masking options, can be deployed quickly and easily without the need to develop code or modify the source or target application.
When developing your reference architecture for a data integration hub, incorporate sound data governance policies and build data privacy into the application upfront. Don’t wait for the headlines to include your company and someone’s personal data.
Informatica announced yesterday that the Informatica ILM Nearline product is SAP-certified. ILM Nearline helps IT organizations reduce the cost of managing data growth in existing implementations of SAP NetWeaver Business Warehouse (SAP NetWeaver BW) and SAP HANA. Customers can then redirect the freed budget and resources into application landscape and data center modernization initiatives. Informatica ILM Nearline v6.1A for use with SAP NetWeaver BW and SAP HANA, available today, is purpose-built for SAP environments and leverages native SAP interfaces.
Data volumes are growing fastest in data warehouse and reporting applications, yet a significant amount of that data is rarely or infrequently accessed. In SAP NetWeaver BW deployments, standard SAP archiving can reduce the size of a production data warehouse database to help preserve its performance, but if users ever want to query or manipulate the archived data, it must be loaded back into the production system, disrupting data analytics processes and extending time to insight. The same holds true for SAP HANA.
To address this, ILM Nearline enables IT to migrate large volumes of largely inactive SAP NetWeaver BW or SAP HANA data from the production database or in memory store to online, secure, highly compressed, immutable files in a near-line system while maintaining end-user access. The result is a controlled environment running SAP NetWeaver BW or SAP HANA with predictable, ongoing hardware, software and maintenance costs. This helps ensure service-level agreements (SLAs) can be met while freeing up ongoing budget and resources so IT can focus on innovation.
Informatica ILM Nearline for use with SAP NetWeaver BW and SAP HANA has been certified with the following interfaces:
- NW-BW-NLS Nearline Storage SAP NetWeaver BW 7.30 on SAP HANA for Informatica Data Archive 6.1A
- NW-BW-NLS 7.30 – Nearline Storage – SAP NetWeaver BW 7.30 for Informatica Data Archive 6.1A
- BC-HCS 6.20 – HTTP Content Server 6.20 interface for Informatica Data Archive 6.1
“Informatica ILM Nearline for use with SAP NetWeaver BW and SAP HANA is all about reducing the costs of data while keeping the data easily accessible and thus valuable,” said Adam Wilson, general manager, ILM, Informatica. “As data volumes continue to soar, the solution is especially game-changing for organizations implementing SAP HANA as they can use the Informatica-enabled savings to help offset and control the costs of their SAP HANA licenses without disrupting the current SAP NetWeaver BW users’ access to the data.”
Specific advantages of Informatica ILM Nearline include:
- Industry-leading compression rates – Informatica ILM Nearline’s compression rates exceed standard database compression rates by a sizable margin. Customers typically achieve rates in excess of 90 percent, and some have reported rates as high as 98 percent.
- Easy administration and data access – No database administration is required for data archived by Informatica ILM Nearline. Data is accessible from the user’s standard SAP application screen without any IT interventions and is efficiently stored to simplify backup, restore and data replication processes.
- Limitless capacity – Highly scalable, the solution is designed to store limitless amounts of data without affecting data access performance.
- Easy storage tiering – As data is stored in a highly compressed format, the nearline archive can be easily migrated from one storage location to another in support of a tiered storage strategy.
Available now, Informatica ILM Nearline for use with SAP NetWeaver BW and SAP HANA is based on intellectual property acquired from Sand Technology in Q4 2011 and enhanced by Informatica.
The Oracle Applications Users Group (OAUG) Archive and Purge Special Interest Group (SIG) held its semi-annual session first thing in the morning on Sunday, September 22, 2013 – 8:00 a.m. The SIG’s chairman, Brian Bent, must have drawn the short straw in the contest for session times. Regardless, attendance was incredibly strong, and the topic, ‘Cleaning Up Your Oracle E-Business Suite Mess’, was well received.
An initial audience survey showed that most attendees have made the jump to Oracle E-Business Suite R12, yet very few have implemented an Information Lifecycle Management (ILM) strategy. As organizations migrate to the latest version, the rate of data growth increases significantly: performance takes a plunge, infrastructure and storage costs spike, and DBAs are squeezed trying to make do.
The bulk of the discussion covered the Concurrent Programs Oracle offers for purging. The focus was on system tables, not functional archive and purge routines such as General Ledger or Accounts Receivable – those will be topics for another SIG day.
For starters, Oracle provides Concurrent Programs to purge administrative data. Look for ‘big tables’ owned by APPLSYS to find more candidates, and search for the biggest tables and indexes. Search for ‘PURGE’ on MyOracleSupport (MOS), and do your homework to decide whether the purge programs apply to you. If you are concerned about deleting data, you can create an archive table, add an ‘on delete’ trigger to the original table, run the purge, and automatically save the data in the archive table (guess what? This is a CUSTOMIZATION).
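The archive-before-purge pattern described above is easy to picture in miniature. The sketch below emulates it in memory, purging rows older than a cutoff while copying each purged row to an archive, just as the on-delete trigger would save rows to the archive table. This is a conceptual illustration only; a real implementation would be a PL/SQL trigger, and the table and column names here are invented:

```python
from datetime import date, timedelta

def purge_with_archive(table, archive, older_than_days, today):
    """Purge rows older than the retention window, copying each
    purged row to the archive first (the trigger's job)."""
    cutoff = today - timedelta(days=older_than_days)
    kept = []
    for row in table:
        if row["requested_date"] < cutoff:
            archive.append(dict(row))  # "trigger fires": save a copy
        else:
            kept.append(row)
    return kept  # rows surviving the purge

requests = [
    {"request_id": 1, "requested_date": date(2013, 1, 1)},
    {"request_id": 2, "requested_date": date(2013, 9, 20)},
]
archive = []
requests = purge_with_archive(requests, archive, older_than_days=30,
                              today=date(2013, 9, 22))
```

After the run, the stale January request sits in the archive and only the recent request remains – nothing is lost, but the hot table stays small.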
Some areas to look at include FND_CONCURRENT_REQUESTS and FND_LOBS:
- Most customers purge data older than 7-30 days
- Oracle recommends keeping this table under 25,000 rows
- Consider additional purge programs that delete data about concurrent requests that run frequently
- DBAs should not delete directly from FND_LOBS; the only way to get rid of LOBs is for Oracle to provide a concurrent program for the module that users used to load them
- FND_LOBS can take an enormous amount of space and make exporting and importing your database take a long time
- You can also store FND_LOBS as SecureFiles, but this requires an Advanced Compression license
- Log enhancement requests for more concurrent programs to clean up FND_LOBS
- Look to third party solutions, such as Informatica
Other suggestions included the WORKFLOW tables, but these require more research.
For more information, join the Oracle Application User Group and sign up for the Archive and Purge Special Interest Group.
Testing in any domain involves dealing with an incredible amount of test data. As application environments get bigger and more complex, testing teams become susceptible to significant inefficiencies and project delays. Incomplete, missing, or irrelevant test data sets can wreak havoc on any QA cycle. If production data is copied for testing purposes, testers need to ensure sensitive data is protected – adding time to test data creation tasks.
Informatica recently polled its Test Data Management customers through TechValidate, a third-party validation service. Here are three case studies and testimonials about how Informatica customers are addressing these challenges.