Maximize the Potential Business Value from New Core Banking/Insurance Application Investments


According to the IDC Financial Insights 2013 Predictions report, financial institutions across most regions are getting serious about updating their legacy systems to reduce operating costs, automate labor-intensive processes, improve customer experiences, and avoid costly disruptions. Transforming a bank’s core systems or an insurance provider’s main business systems is a strategic decision with far-reaching implications for the firm’s future business strategies and success. Done right, the capabilities offered in today’s modern banking and insurance platforms can propel a company ahead of its competition. Done wrong, the project can become the nail in the coffin: data is not migrated correctly, safeguards are not in place to protect against unwanted data breaches, or the old systems cannot be decommissioned as planned.

One of the most critical phases of any legacy modernization project is the process of migrating data from the old systems to the new. Migrating data involves the ability to:

  • Access existing data in the legacy systems
  • Understand the data structures that need to be migrated
  • Transform the data and execute one-to-one mappings to the relevant fields in the new system
  • Identify data quality errors and other gaps in the data
  • Validate what is entered into the new system by catching transformation or mapping errors
  • Seamlessly connect to the target tables and fields in the new system
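To make the transform-and-map step concrete, here is a minimal sketch in Python of a one-to-one field mapping with per-field transformation rules that collects errors instead of failing silently. The legacy field names (CUST_NM, ACCT_NO, OPEN_DT) and the target schema are illustrative assumptions, not a real core banking layout:

```python
# Hypothetical sketch: map a legacy account record to a new core banking
# schema, applying per-field transformation rules and flagging gaps.
from datetime import datetime

# One-to-one mapping: legacy field -> (target field, transform function)
FIELD_MAP = {
    "CUST_NM": ("customer_name", lambda v: v.strip().title()),
    "ACCT_NO": ("account_number", lambda v: v.zfill(10)),
    "OPEN_DT": ("opened_on",
                lambda v: datetime.strptime(v, "%Y%m%d").date().isoformat()),
}

def migrate_record(legacy_row):
    """Transform one legacy row; collect errors rather than aborting."""
    target, errors = {}, []
    for src_field, (dst_field, transform) in FIELD_MAP.items():
        raw = legacy_row.get(src_field)
        if raw is None or raw == "":
            errors.append(f"missing value for {src_field}")
            continue
        try:
            target[dst_field] = transform(raw)
        except ValueError as exc:
            errors.append(f"bad value in {src_field}: {exc}")
    return target, errors

row = {"CUST_NM": "  jane doe ", "ACCT_NO": "12345", "OPEN_DT": "19980315"}
migrated, issues = migrate_record(row)
# migrated -> {'customer_name': 'Jane Doe', 'account_number': '0000012345',
#              'opened_on': '1998-03-15'}; issues -> []
```

In a real migration this mapping would be driven by metadata managed jointly by business and IT rather than hard-coded, but the pattern of transform-plus-error-capture is the same.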

Sounds easy enough, right?  Not so fast!

A survey of UK-based financial services firms found that 72% of organizations deferred moving applications because data migration is ‘too risky’. There are many hidden dangers and pitfalls to be aware of when taking on these investments, including:

  • The systems being replaced are frequently 20 to 30 years old, and the individuals who implemented or managed them are often long gone. At the same time, many firms lack adequate documentation of data definitions, code tables, business rules, and relationships between the available data, making it difficult to determine the scope and cost of these projects with confidence
  • Legacy systems often contain existing data errors because no data quality controls or validation processes governed what went into them, resulting in duplicate records, invalid formats, non-conforming values, and other data quality errors in the source tables. These errors can drive up development costs but, more importantly, lower the business value and potential of your new investments (rubbish in, rubbish out)
  • Data migrations are often treated as a one-time activity, leading firms to hand-code the complex data work involved or to rely on the system vendor to do it for them. Hand coding these processes can lead to project delays, higher error rates, and higher development costs
  • Unregulated use of production data for testing purposes, containing both personally identifiable and non-personally identifiable information, can increase the risk of unwanted and costly data breaches
  • Costly maintenance of legacy systems kept alive only to make their data available to the business for regulatory needs. This minimizes the expected business value from new application investments and drives up unwanted IT costs
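The rubbish-in/rubbish-out problem described above can be caught early with a simple profiling pass over the legacy extract before any data is loaded. The sketch below uses made-up field names and validity rules to count duplicates, invalid formats, and non-conforming values:

```python
# Hypothetical sketch: profile a legacy extract for common quality errors
# (duplicates, invalid formats, non-conforming values) before migration.
import re
from collections import Counter

VALID_STATUS = {"ACTIVE", "CLOSED", "DORMANT"}   # assumed reference values
ACCT_PATTERN = re.compile(r"^\d{10}$")           # assumed account format

def profile(rows):
    report = {"duplicates": 0, "bad_format": 0, "bad_status": 0}
    counts = Counter(r["ACCT_NO"] for r in rows)
    # Count every extra occurrence of a repeated account number
    report["duplicates"] = sum(n - 1 for n in counts.values() if n > 1)
    for r in rows:
        if not ACCT_PATTERN.match(r["ACCT_NO"]):
            report["bad_format"] += 1
        if r["STATUS"] not in VALID_STATUS:
            report["bad_status"] += 1
    return report

rows = [
    {"ACCT_NO": "0000012345", "STATUS": "ACTIVE"},
    {"ACCT_NO": "0000012345", "STATUS": "ACTIVE"},  # duplicate record
    {"ACCT_NO": "12-345",     "STATUS": "Open"},    # bad format, bad status
]
print(profile(rows))  # {'duplicates': 1, 'bad_format': 1, 'bad_status': 1}
```

Commercial profiling tools do far more than this, but even a crude report like this one turns an unknowable risk ("how bad is the legacy data?") into a measurable backlog before development starts.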

Ensuring success in any legacy modernization investment requires capable technology to support the data migration, data governance, testing, and validation activities of these multi-year projects. Key capabilities include:

  • Proven enterprise data integration solutions that can access the data you need, at any volume, from any platform, including legacy mainframe sources, relational databases, and unstructured data. It is also important to have solutions that include a library of data transformations and rules that can be managed by business and IT to fast-track development efforts
  • Data quality management and governance solutions to profile, discover, cleanse, report, monitor and correct data quality issues, ensuring that data is fit for use in your new systems while allowing data stewards to define data quality rules for development resources.
  • Data validation capabilities to ensure that data migrated from legacy systems into new core banking tables is valid and conforms to business requirements
  • Scalable and secure test data management to subset test data from production systems and mask sensitive information, avoiding the risk of a data breach while maintaining the structure and integrity of your test data
  • A scalable and secure data archiving solution to support the decommissioning of legacy systems while providing the business with access to any data from those systems for regulatory purposes
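To illustrate the masking idea from the test data management point above, the following sketch deterministically masks the digits of a sensitive identifier using a keyed hash, so the same source value always masks to the same output (preserving joins and referential integrity across test tables) while keeping the original length and separators. The salt handling and the card-number-style field are assumptions for the example, not a description of any vendor's product:

```python
# Hypothetical sketch: deterministic, format-preserving masking of the
# digits in a sensitive identifier for use in test data.
import hashlib

SECRET_SALT = b"replace-with-a-vaulted-secret"  # assumption: stored securely

def mask_digits(value: str) -> str:
    """Replace each digit with one derived from a keyed hash of the value."""
    digest = hashlib.sha256(SECRET_SALT + value.encode()).hexdigest()
    out, i = [], 0
    for ch in value:
        if ch.isdigit():
            out.append(str(int(digest[i % len(digest)], 16) % 10))
            i += 1
        else:
            out.append(ch)  # keep separators so the format survives
    return "".join(out)

masked = mask_digits("1234-5678-9012")
assert len(masked) == len("1234-5678-9012")      # length preserved
assert masked == mask_digits("1234-5678-9012")   # deterministic
```

Because the masking is a pure function of the source value and a secret salt, the same account number masks identically wherever it appears, so foreign-key relationships in the subset still line up. Note that simple schemes like this are for illustration only; production masking should use vetted tooling, as naive approaches can be vulnerable to re-identification.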

Don’t let your data issues get in the way of maximizing the potential value your business expects from your legacy modernization investments.


