Tag Archives: Data Archive
Most application owners know that as data volumes accumulate, application performance can take a major hit if the underlying infrastructure is not scaled to keep up with demand. The problem is that constantly adding hardware to manage data growth gets costly, stealing budget away from needed innovation and modernization initiatives.
Join Julie Lockner as she reviews the Cox Communications case study on how they solved an application performance problem caused by excessive data volumes, using the hardware they already had, with Informatica Data Archive with Smart Partitioning. Source: TechValidate. TVID: 3A9-97F-577
As part of their cost-cutting programs, organizations are consolidating data centers and the applications within them. Federal and state agencies in the public sector are among those where IT consolidation and moving applications to the cloud are top priorities, part of an overall goal to increase efficiency and eliminate costs. In other industries, many consolidations are also underway due to mergers, acquisitions, and other cost-cutting initiatives. As you plan or undergo a consolidation project, you also need to plan for the retirement of the legacy, redundant applications that are left behind.
It is that time of year for some to reflect on the past or ponder the future. If part of your end-of-year ritual includes cleaning out a cluttered closet or room in the house, consider the same ritual for the data in your databases.
In December 2005, Sun Microsystems conducted an interview with Bill Inmon, the father of the data warehouse concept. He said, “ILM keeps a data warehouse from costing huge amounts of money and maintains good performance consistently throughout the data warehouse environment.” Four years later, the average size of a data warehouse has increased by 200%, surpassing the multi-terabyte benchmark.
With these mammoth databases comes an increase in the cost to manage them and a potential deterioration in performance. It is common practice to leverage techniques like indexing and database partitioning to address query performance issues with very large databases, but those techniques do not address the challenges associated with the raw volume of data itself.
Information Lifecycle Management (ILM) involves classifying data to map its business value to the corresponding features of the infrastructure it resides on. This classification enables IT to build tiered infrastructure based on what the business needs, not on the latest technology trends. Benefits of an ILM solution include lower storage costs, better use of existing IT resources, and improved application performance from moving aged data off high-end systems. Shorter backup, restore, and maintenance windows improve operational efficiency, while enforced retention policies reduce the risk of falling out of compliance.
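The classification step at the heart of ILM can be sketched with a minimal example. The tier names, age thresholds, and retention period below are illustrative assumptions, not part of any specific product: the idea is simply to map each record's age to a storage tier.

```python
from datetime import date, timedelta

# Hypothetical age-based tiering policy: thresholds (in days) and tier
# names are illustrative only. Records are checked against each
# threshold in order, smallest first.
TIERS = [
    (90,   "high-performance"),  # accessed within the last 90 days
    (365,  "nearline"),          # accessed within the last year
    (2555, "archive"),           # retained ~7 years for compliance
]

def classify(last_accessed, today):
    """Return the storage tier for a record based on its age in days."""
    age_days = (today - last_accessed).days
    for max_age, tier in TIERS:
        if age_days <= max_age:
            return tier
    return "purge-candidate"     # past retention: eligible for deletion

if __name__ == "__main__":
    today = date(2010, 1, 1)
    print(classify(today - timedelta(days=30), today))    # high-performance
    print(classify(today - timedelta(days=400), today))   # archive
    print(classify(today - timedelta(days=3000), today))  # purge-candidate
```

In a real deployment the classification would draw on business rules (legal holds, application ownership, regulatory retention schedules), not just access age, but the tier-mapping structure stays the same.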