Tag Archives: Data Archive
This Magic Quadrant focuses on what Gartner calls Structured Data Archiving. Data archiving is used to index, migrate, preserve, and protect application data in secondary databases or flat files, which are typically located on lower-cost storage for policy-based retention. Data archiving keeps that data available in the context of the originating business process or application, which is especially useful in the event of litigation or an audit.
The Magic Quadrant calls out two use cases: “live archiving of production applications” and “application retirement of legacy systems.” Informatica refers to both use cases, together, as “Enterprise Data Archiving,” and we consider it a foundational component of a comprehensive Information Lifecycle Management strategy.
The application landscape is constantly evolving. For this reason, data archiving is a strategic component of a data growth management strategy. Application owners need a plan to manage data as applications are upgraded, replaced, consolidated, moved to the cloud and/or retired.
When you don’t have a plan for production systems, data accumulates in the business application: performance degrades, frustrating the business, and data bloat burdens IT operations. When you don’t have a plan for legacy systems, applications accumulate in the data center, and the growing budgets required to maintain them become a concern for the CFO.
A data growth management plan must include the following:
- How to cycle through applications and retire them
- How to smartly store the application data
- How to ultimately dispose of data while staying compliant
Structured data archiving and application retirement technologies help automate and streamline these tasks.
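The retention and disposal steps in the plan above can be sketched as a simple policy check. The following is a minimal illustration in Python; the record classes, retention periods, and record layout are hypothetical, chosen only to show the idea of age-driven, policy-based lifecycle decisions:

```python
from datetime import date

# Hypothetical retention periods per record class (illustrative values only).
RETENTION_DAYS = {
    "transaction": 7 * 365,   # e.g. keep seven years for audit purposes
    "session_log": 90,        # short-lived operational data
}

def lifecycle_action(record, today):
    """Return the lifecycle action for a record: 'keep', 'archive', or 'dispose'."""
    age = (today - record["created"]).days
    limit = RETENTION_DAYS[record["class"]]
    if age > limit:
        return "dispose"   # past retention: eligible for compliant disposal
    if age > limit // 2:
        return "archive"   # aging: candidate for lower-cost archive storage
    return "keep"          # active: stays in the production database

records = [
    {"class": "session_log", "created": date(2014, 1, 1)},
    {"class": "transaction", "created": date(2013, 6, 1)},
]
for r in records:
    print(lifecycle_action(r, today=date(2014, 6, 1)))
```

In practice the thresholds and the archive/dispose actions would come from the organization’s retention schedule, not from code; the point is that each step of the plan maps to an explicit, automatable rule.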
Informatica Data Archive delivers unparalleled connectivity, scalability, and a broad range of innovative options, including Smart Partitioning, Live Archiving, and the retirement of aging and legacy data to the Informatica Data Vault, along with comprehensive retention management, data reporting, and visualization. We believe our strengths in this space are the key ingredients for deploying a successful enterprise data archive.
For more information, read the Gartner Magic Quadrant for Structured Data Archiving and Application Retirement.
Oracle DBAs are challenged with keeping mission-critical databases up and running with predictable performance as data volumes grow. Our customers are changing how they proactively manage Oracle performance, while simplifying IT, by leveraging our innovative Data Archive Smart Partitioning features. Smart Partitioning builds on Oracle Database Partitioning and simplifies the deployment and management of partitioning strategies. With it, DBAs have been able to respond to requests to improve business process performance without writing any custom code or SQL scripts.
With Smart Partitioning, DBAs have a new dialogue with business analysts. Rather than wading into the technology weeds, they ask how many months, quarters, or years of data are required to get the job done. Then, within a few clicks, they can show users how to self-select how much data gets processed when they run queries, reports, or programs — in other words, how users can control their own performance by controlling the volume of data they pull from the database.
Smart Partitioning is configured using easily understood business dimensions such as time, company, and business unit. These dimensions make it easy to ‘slice’ data to fit the job at hand, so performance becomes manageable and stays under business control. Another benefit shows up in non-production environments: creating smaller, fully functional subset databases now fits easily into your cloning operations.
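Oracle’s actual partitioning DDL and Smart Partitioning’s configuration are out of scope here, but the effect of slicing by a time dimension can be illustrated in a few lines of Python with SQLite as a stand-in. The table and column names are hypothetical; the point is that restricting a query to one slice of a business dimension reduces the volume of data processed, which is the same effect partition pruning gives on a partitioned Oracle table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, quarter TEXT, amount REAL)")

# 1000 rows spread evenly across four quarters of a year.
rows = [(i, f"2014-Q{(i % 4) + 1}", 100.0) for i in range(1000)]
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

# Unrestricted query: every row in the table is in scope.
full = conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0]

# 'Sliced' query: restricting to one quarter puts only a quarter of the
# data in scope -- the volume reduction partition pruning would apply
# automatically on a table partitioned by this time dimension.
sliced = conn.execute(
    "SELECT COUNT(*) FROM orders WHERE quarter = ?", ("2014-Q1",)
).fetchone()[0]

print(full, sliced)
```

With a real partitioned table, the database skips the irrelevant partitions entirely rather than filtering rows, which is where the performance gain comes from.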
Finally, Informatica has been working closely with the Oracle Enterprise Solutions Group to align Informatica Data Archive Smart Partitioning with the Oracle ZS3 Appliance to maximize performance and savings while minimizing the complexity of implementing an Information Lifecycle Management strategy.
Most application owners know that as data volumes accumulate, application performance can take a major hit if the underlying infrastructure is not aligned to keep up with demand. The problem is that constantly adding hardware to manage data growth can get costly – stealing budgets away from needed innovation and modernization initiatives.
Join Julie Lockner as she reviews the Cox Communications case study: how they solved an application performance problem caused by excessive data volumes, on the hardware they already had, by using Informatica Data Archive with Smart Partitioning. Source: TechValidate. TVID: 3A9-97F-577
As part of their cost cutting program, organizations are consolidating data centers and the applications within them. Federal and state agencies in the public sector are among those where IT consolidation and moving applications to the cloud are top priorities as part of an overall goal to increase efficiencies and eliminate costs. In other industries, many consolidations are also under way due to mergers and acquisitions and other cost cutting initiatives. As you plan or undergo a consolidation project, you also need to plan for the retirement of legacy, redundant applications that are left behind.
It is that time of year for some to reflect on the past or ponder the future. If part of your end of year ritual includes cleaning out a cluttered closet or room in the house, consider the same ritual for the data in your databases.
In December 2005, Sun Microsystems conducted an interview with Bill Inmon, the father of the data warehouse concept. He said, “ILM keeps a data warehouse from costing huge amounts of money and maintains good performance consistently throughout the data warehouse environment.” Four years later, the average size of a data warehouse has increased by 200%, surpassing the multi-terabyte benchmark.
With these mammoth databases comes an increase in cost to manage them and a potential deterioration in performance. It is common practice to leverage techniques like indexing and database partitioning to address query performance issues with very large databases but those techniques do not address challenges associated with the raw volumes of data.
Information Lifecycle Management (ILM) involves classifying data to map its business value to the corresponding features of the infrastructure it resides on. This classification enables IT to build tiered infrastructure based on what the business actually needs, not on the latest technology trends. Benefits of an ILM solution include lower storage costs, better use of existing IT resources, and improved application performance from moving aged data off high-end systems. It also improves operational efficiency through shorter backup, restore, and maintenance windows, while reducing the risk of falling out of compliance.
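The classification step described above can be sketched as a simple mapping from a record’s age to a storage tier. This is a minimal sketch; the tier names, cost figures, and age thresholds are hypothetical, and a real ILM policy would classify on business value and access patterns as well as age:

```python
def storage_tier(age_in_days):
    """Map a record's age to a storage tier under a hypothetical ILM policy."""
    if age_in_days <= 90:
        return ("tier-1", "high-end production storage")   # active data
    if age_in_days <= 3 * 365:
        return ("tier-2", "lower-cost online archive")      # occasionally queried
    return ("tier-3", "compressed immutable archive")       # retained for compliance

# Classify a few sample record ages (in days) against the policy.
for age in (30, 400, 2000):
    tier, description = storage_tier(age)
    print(age, tier, description)
```

Once data is classified this way, moving each class to the matching tier is what shortens backup and maintenance windows: the high-end tier only ever holds the small, active slice of the data.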