
Seven Essential Best Practices For Data Center Consolidation

Data center consolidation is much more than the physical movement of servers and infrastructure. In fact, the facility and power savings are just the tip of the opportunity. The biggest benefits come from using the consolidation initiative as a catalyst to rationalize the application portfolio, archive inactive data, and establish one version of the truth for the data that remains.

Think of two people who have lived separately for 30 years and decide to move in together. If they simply haul all the accumulated “stuff” to the new location en masse, they end up with a house crammed full of junk and plenty of redundant objects: multiple couches, duplicate appliances, and so on. Consolidation is the perfect time to clean house.

This raises the question: what are the best practices for “cleaning house” when consolidating data centers? Here are the top seven techniques.

  1. Retire legacy applications. One reason application systems hang around for so long, even when they are not needed for day-to-day operations, is that you still need the data. In most cases the data has a much longer life than the application that created it. The solution is to archive the data with tools like Informatica Data Archive, which compresses the data in an immutable (unchangeable) format along with its metadata, keeps it available to online query and reporting tools for authorized users, and systematically disposes of it when the defined retention rules dictate (see sketch 1 after this list).
  2. Remove inactive data from live applications. Many applications that are still active and useful carry years of transactional data that is rarely touched. Live data archiving doesn’t mean packing the data onto magnetic tapes and storing them in a dusty warehouse or underground vault. Modern technology like Informatica’s Live Data Archive stores it online in a highly compressed format (often 95% compression or better) that can be quickly restored to the production system if and when required. As a result the application runs faster, consumes less costly storage, completes end-of-period jobs sooner and is easier to upgrade (see sketch 2 below).
  3. Monitor the quality of active data. Technology like Informatica Data Quality provides a way to quickly profile data to find inconsistencies, develop scorecards to monitor and report on data quality trends, and give business users and data stewards the workflow and analysis tools to deal effectively with exceptions (see sketch 3 below).
  4. Consolidate data using factory techniques. Once steps 1, 2 and 3 are complete, the next step is to consolidate applications (where appropriate, of course) and migrate the live, active data to the target system. Technologies like Informatica’s PowerCenter and PowerExchange are the high-performance workhorses that enable rapid data mapping and development of migration workflows. Combined with Lean Integration[1] factory concepts, also pioneered by Informatica, the result is a low-cost, just-in-time method for moving data to where it needs to be (see sketch 4 below).
  5. Master Data Management. What remains after steps 1-4 are the essential application systems that must continue to operate to serve specific processes and functional areas. Yet even with this cleaned-up and rationalized portfolio of applications, a significant amount of duplicate and redundant information remains. Technology like Informatica Master Data Management helps establish one version of the truth for shared information (see sketch 5 below).
  6. Don’t move data until it’s needed. This seems like an obvious best practice, yet most data centers run daily integration processes that move mountains of information between systems, data warehouses and data marts. Much of that data feeds reports that were needed at one time but are no longer read, or business analytics that “might” be needed. All this data movement consumes both IT infrastructure and people. A better approach is to use tools such as Informatica Data Services, which exposes shared business objects as an interface to the source systems of record and can be deployed as a web service, an ETL process or a SQL query. In other words, whenever someone needs the data, they can get it directly from the source (see sketch 6 below).
  7. Put data in the cloud. Consolidating data centers doesn’t mean you need to move data to another data center, at least not one of your own. Moving data to an external service bureau can offer significant cost and accessibility advantages. Once again Informatica comes to the rescue with Cloud Data Integration.
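
Sketch 1: retention-driven archiving for retired applications. This is a minimal illustration of the idea behind step 1, not Informatica Data Archive’s actual behavior; the file layout, field names and seven-year retention rule are all hypothetical, and a read-only file is only a cheap stand-in for a truly immutable format.

```python
# Minimal sketch: archive records with metadata, dispose when retention expires.
import gzip, json, hashlib, os
from datetime import date, timedelta

ARCHIVE_DIR = "archive"               # hypothetical location
RETENTION = timedelta(days=7 * 365)   # hypothetical 7-year retention rule

def archive(name, records, schema):
    """Write records plus metadata to a compressed, checksummed file."""
    os.makedirs(ARCHIVE_DIR, exist_ok=True)
    payload = json.dumps({"schema": schema, "records": records}).encode()
    meta = {
        "archived_on": date.today().isoformat(),
        "dispose_after": (date.today() + RETENTION).isoformat(),
        "sha256": hashlib.sha256(payload).hexdigest(),  # detect tampering
        "record_count": len(records),
    }
    path = os.path.join(ARCHIVE_DIR, f"{name}.json.gz")
    with gzip.open(path, "wb") as f:
        f.write(json.dumps(meta).encode() + b"\n" + payload)
    os.chmod(path, 0o444)  # read-only: a cheap stand-in for "immutable"
    return path

def dispose_expired(today=None):
    """Delete archives whose retention window has passed."""
    today = today or date.today()
    for fname in os.listdir(ARCHIVE_DIR):
        path = os.path.join(ARCHIVE_DIR, fname)
        with gzip.open(path, "rb") as f:
            meta = json.loads(f.readline())   # metadata is the first line
        if date.fromisoformat(meta["dispose_after"]) <= today:
            os.chmod(path, 0o644)
            os.remove(path)
```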
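
Sketch 2: relocating inactive rows out of a live table. A minimal sketch using SQLite for illustration; the orders table, its columns and the cutoff date are hypothetical, and the copy-then-delete transaction is a generic pattern rather than Live Data Archive’s mechanics.

```python
import sqlite3

CUTOFF = "2008-01-01"   # hypothetical: rows older than this are "inactive"

def archive_inactive_rows(conn: sqlite3.Connection) -> int:
    """Copy old rows to an archive table, then delete them from production."""
    with conn:  # one transaction: either both statements apply or neither
        conn.execute(
            "CREATE TABLE IF NOT EXISTS orders_archive AS "
            "SELECT * FROM orders WHERE 0")           # same shape, no rows
        conn.execute(
            "INSERT INTO orders_archive "
            "SELECT * FROM orders WHERE order_date < ?", (CUTOFF,))
        cur = conn.execute(
            "DELETE FROM orders WHERE order_date < ?", (CUTOFF,))
        return cur.rowcount

def restore_order(conn: sqlite3.Connection, order_id: int) -> None:
    """Bring one archived order back into production if it is needed again."""
    with conn:
        conn.execute(
            "INSERT INTO orders SELECT * FROM orders_archive WHERE id = ?",
            (order_id,))
        conn.execute("DELETE FROM orders_archive WHERE id = ?", (order_id,))
```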
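
Sketch 3: lightweight data profiling with a pass/fail scorecard, in the spirit of step 3. The input file, the 98% completeness threshold and the scoring rule are hypothetical; a product like Informatica Data Quality layers rule libraries, trend reporting and stewardship workflow on top of this basic idea.

```python
import csv
from collections import defaultdict

def profile(path):
    """Return per-column fill rate and distinct-value count for a CSV file."""
    filled, distinct, total = defaultdict(int), defaultdict(set), 0
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        cols = reader.fieldnames or []
        for row in reader:
            total += 1
            for col in cols:
                val = row.get(col)
                if val not in (None, ""):
                    filled[col] += 1
                    distinct[col].add(val)
    return {col: {"fill_rate": filled[col] / max(total, 1),
                  "distinct": len(distinct[col])} for col in cols}

def scorecard(stats, min_fill=0.98):
    """Flag columns that fall below a completeness threshold."""
    return {col: ("PASS" if s["fill_rate"] >= min_fill else "FAIL")
            for col, s in stats.items()}

if __name__ == "__main__":
    stats = profile("customers.csv")   # hypothetical extract
    for col, grade in scorecard(stats).items():
        print(f"{col:20s} {stats[col]['fill_rate']:.1%}  {grade}")
```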
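
Sketch 4: a mapping-driven migration step, echoing the factory idea in step 4. The field mappings live in data rather than code, so one small routine can be reused for every source system; the field names and transforms below are hypothetical.

```python
MAPPING = {                     # target field: (source field, transform)
    "customer_id":  ("CUST_NO",  int),
    "full_name":    ("CUSTNAME", str.strip),
    "country_code": ("CTRY",     str.upper),
}

def migrate(source_rows, mapping=MAPPING):
    """Apply one declarative mapping spec to every source row."""
    for row in source_rows:
        yield {tgt: fn(row[src]) for tgt, (src, fn) in mapping.items()}

legacy = [{"CUST_NO": "42", "CUSTNAME": " Ada Lovelace ", "CTRY": "gb"}]
print(list(migrate(legacy)))
# [{'customer_id': 42, 'full_name': 'Ada Lovelace', 'country_code': 'GB'}]
```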
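
Sketch 5: deterministic match-and-merge, a toy version of the master data management idea in step 5. Real MDM products use probabilistic matching and configurable survivorship rules; the match key and the newest-record-wins rule here are hypothetical simplifications.

```python
def match_key(rec):
    """Deterministic match on normalized name + email."""
    return (rec["name"].lower().strip(), rec["email"].lower().strip())

def merge(records):
    """Collapse matching records into one golden record per match key."""
    golden = {}
    for rec in sorted(records, key=lambda r: r["updated"]):  # oldest first
        golden[match_key(rec)] = rec                         # newer wins
    return list(golden.values())

customers = [
    {"name": "Ada Lovelace",  "email": "ada@example.com", "updated": "2009-01-01"},
    {"name": " ada lovelace", "email": "ADA@example.com", "updated": "2010-06-15"},
]
print(merge(customers))  # one surviving record, the 2010 version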
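
Sketch 6: answering for the system of record on demand, as in step 6, instead of copying data on a nightly schedule. The lookup table and URL scheme are hypothetical; a real data services layer adds caching, security and multiple front ends (web service, ETL, SQL) around the same idea.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer
import json

SYSTEM_OF_RECORD = {"42": {"name": "Ada Lovelace", "country": "GB"}}

class CustomerService(BaseHTTPRequestHandler):
    def do_GET(self):
        # e.g. GET /customers/42 -> look up "42" in the system of record
        cust_id = self.path.rstrip("/").split("/")[-1]
        record = SYSTEM_OF_RECORD.get(cust_id)
        self.send_response(200 if record else 404)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps(record or {"error": "not found"}).encode())

if __name__ == "__main__":
    HTTPServer(("localhost", 8080), CustomerService).serve_forever()
```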

In short, if you’re going to the trouble of consolidating data centers (and it can indeed be a lot of work, requiring significant analysis and planning), why not clean house at the same time?


[1] John G. Schmidt and David Lyle, Lean Integration: An Integration Factory Approach to Business Agility, Addison-Wesley, 2010.
