Tag Archives: Data Migration
IT application managers are constantly integrating, modernizing, and consolidating enterprise applications to keep them efficient and delivering maximum business value to the corporation for their cost.
But it is important to remember that these projects carry significant risk. An article in the Harvard Business Review states that 17% of enterprise application projects go seriously wrong, running over budget by 200% and over schedule by 70%. The HBR article refers to these projects as “black swans.”
How can you reduce this risk of project failure? Typically, 30% to 40% of an enterprise application project is data migration. A recent study by Bloor Research shows that while success rates for data migration projects are improving, 38% of them still miss their schedule and budget targets.
How can you improve the odds of success in data migration projects?
- Use data profiling tools to understand your data before you move it.
- Use data quality tools to correct data quality problems. There is absolutely no point in moving bad data around the organization – but it happens.
- Use a proven external methodology. In plain English, work with people who have “done it before.”
- Develop your own internal competence. Nobody knows your data, or more importantly the business context of your data, better than your own staff. Develop the skills and engage your business subject matter experts.
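The first two bullets above can be approximated even without specialized tooling. Below is a minimal, hypothetical sketch in Python (the field names, sample rows, and email rule are invented for illustration) of the kind of profiling a dedicated tool performs at scale:

```python
# Hypothetical legacy extract; field names and values are illustrative only.
rows = [
    {"customer_id": 1,    "email": "a@x.com",      "country": "US"},
    {"customer_id": 2,    "email": "b@x.com",      "country": "us"},
    {"customer_id": 2,    "email": "b@x.com",      "country": "DE"},
    {"customer_id": None, "email": "not-an-email", "country": "DE"},
    {"customer_id": 5,    "email": None,           "country": "US"},
]

# Basic profile per field: null count and distinct non-null values.
profile = {}
for field in rows[0]:
    values = [r[field] for r in rows]
    non_null = [v for v in values if v is not None]
    profile[field] = {
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
    }
print(profile)

# Spot-check one simple quality rule: emails must contain "@".
bad_emails = [r["email"] for r in rows
              if r["email"] is not None and "@" not in r["email"]]
print(f"{len(bad_emails)} rows fail the email rule")
```

Even this toy profile surfaces the kinds of surprises (nulls in a key field, inconsistent country codes, malformed emails) that are far cheaper to find before the migration is coded than after.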
Informatica has industry-leading tools, a proven methodology, and a service delivery team with hundreds of successful data migration implementations.
To find out more about successful data migration:
- Informatica World: Visit us at the Hands On Lab – Data Migration.
- Informatica World: Informatica Presentation on Application Data Migration.
Application Data Migrations with Informatica Velocity Migration Methodology
Wednesday June 5, 2013 9:00 to 10:00
- Informatica World: Data Migration Factory Presentation by Accenture
Accelerating the Power of Data Migration
Tuesday June 4, 2013 2:00 to 3:00
- Bloor White Paper: Lower Your Risk with Application Data Migration: Next Steps With Informatica
- Informatica White Paper: De-Risk Your Application Go Lives
Adopting Agile may require a cultural shift and can be disruptive to an organization in the beginning. However, as I mentioned in Part 1 of this blog series, Agile Data Integration holds the promise of increasing the chances of success, delivering projects faster, and reducing defects. Applying Lean principles within your organization can help ease the transition to Agile Data Integration. Lean is a set of principles first explored in the context of data integration by John Schmidt and David Lyle in their book on Lean Integration. First and foremost, Lean recommends that an organization focus on eliminating waste and optimizing the data integration process from the customer’s perspective. Agile Data Integration maximizes the business value of projects (e.g. Agile BI, Data Warehousing, Big Data Analytics, Data Migration) because you can get it right the first time by delivering exactly what the business needs, when it needs it. Break big projects into smaller, more manageable deliverables so that you can incrementally deliver value to the business. Agile Data Integration also recommends the following: (more…)
Richard Cramer, Chief Healthcare Strategist for Informatica shares some views on Electronic Health Record (EHR) adoption, including HITECH and Meaningful Use pressures. He also talks about the challenges that the future holds for EHRs.
Visit Informatica’s Healthcare pages for more on EHRs.
I was happy to do an architect-to-architect Webinar with David Lyle, which was more of an interactive conversation than a Webinar. The focus was on the ability to provide integration using data virtualization, but the message was perhaps more profound than that.
The core issue that many enterprises face is that their information is largely an asset they cannot access. The data is locked up within years and years of ill-planned databases and applications, where core data, such as customer and sales information, is scattered throughout the enterprise. Most staffers and executives consider this to be “just the way it is.” (more…)
Back in June 2009, the Financial Accounting Standards Board (FASB) published Financial Accounting Statements No. 166, Accounting for Transfers of Financial Assets, and No. 167, Amendments to FASB Interpretation No. 46(R), which change the way entities account for securitizations and special-purpose entities. The new standards will impact financial institution balance sheets beginning in 2010 and will require substantive changes to how banks account for many items, including securitized assets that had previously been excluded from these organizations’ balance sheets. Banks affected by the new accounting standards will be subject to higher risk-based regulatory capital requirements. So what does it all mean, and how much will it cost banks to comply? (more…)
- The cost of losing one customer is four times higher than the cost of obtaining that same customer (Return on Behavior Magazine)
- Satisfying and retaining current customers is 3 to 10 times cheaper than acquiring new customers, and a typical company receives around 65 percent of its business from existing customers (McKinsey, 2001)
- A 5% reduction in the customer defection rate can increase profits by 25% to 80% (Return on Behavior Magazine)
- 7 out of 10 customers who switch to a competitor do so because of poor service (McKinsey, 2001) (more…)
So you’ve managed to reduce the keep-the-lights-on (KTLO) portion of your IT budget by automating many of the manually intensive integration processes using the latest data integration platform.
In doing this, you lowered your upfront TCO through ease of use, prebuilt connectivity, and reusable logic and rules, and you are benefiting from lower ongoing costs through scalability and performance, ease of administration, and seamless upgrades.
You found the staff required to manage the myriad integration challenges across your IT shop by tapping into the large community of available Informatica specialists, and you implemented an Integration Competency Center (ICC) to mirror companies like T. Rowe Price, HP, and Avaya, who achieved significant savings through their ICCs. (This might have come about from your conversation with Gartner, whose research on ICCs convinced you that you could achieve 25% reuse of integration components, 30% savings in integration development time and costs, and 20% savings in maintenance costs.)
By the way, have you tried the ICC calculator using this data to show how much you could save?
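As a back-of-the-envelope illustration of what such a calculator does, here is a hypothetical sketch applying the Gartner percentages quoted above to made-up baseline costs (the dollar figures are assumptions for illustration, not the logic of Informatica’s actual calculator):

```python
# Hypothetical annual baseline costs (made-up figures).
dev_cost = 1_000_000        # integration development spend per year
maintenance_cost = 400_000  # integration maintenance spend per year

# Gartner figures cited above: 30% savings in development time and
# costs (driven partly by the 25% component reuse), 20% in maintenance.
dev_savings = dev_cost * 0.30
maintenance_savings = maintenance_cost * 0.20
total_savings = dev_savings + maintenance_savings

print(f"Estimated annual ICC savings: ${total_savings:,.0f}")
```

With these assumed baselines, the quoted percentages translate into $380,000 of estimated annual savings; the real calculator would, of course, use your organization’s own cost inputs.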
OK, so you did all that, and you have a more efficient IT shop with savings enough to roll out that single critical application you’ve been asked to complete for the last six months. Now what? (more…)
Last week I wrote about the need to adopt and develop a risk-based management framework within which to execute data migration initiatives. I also looked at how a tools-based approach provided the essential building blocks with which to build such a framework.
This week let’s talk about how such an approach can substantially address the first of our common pain points: “data discovery is skipped due to time or resource constraints.”
The net effect of this omission is neatly summarized in the “code, load & explode” truism: the migration process is coded against invalid assumptions about the data’s structure, content, or compliance with known business rules, rather than against the reality. That reality is often only encountered at some significant project milestone, where remediation can be very costly in terms of resources and budget, both of which can significantly impact the ability to meet anticipated timelines. (more…)
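One lightweight way to avoid “code, load & explode” is to test the project’s assumptions against actual source records before any migration code is written. A minimal sketch, with hypothetical rules and sample data (none of this reflects a specific tool or schema):

```python
# Hypothetical business rules a migration team might assume hold
# in the source system; profiling tests them before code is built.
rules = {
    "order_id is never null": lambda r: r["order_id"] is not None,
    "amount is non-negative": lambda r: r["amount"] >= 0,
    "status is a known code": lambda r: r["status"] in {"OPEN", "CLOSED"},
}

# Sample of source records (made up for illustration).
records = [
    {"order_id": 1,    "amount": 25.0, "status": "OPEN"},
    {"order_id": 2,    "amount": -5.0, "status": "CLOSED"},  # breaks amount rule
    {"order_id": None, "amount": 10.0, "status": "LEGACY"},  # breaks two rules
]

# Count violations per rule, so invalid assumptions are corrected up
# front rather than discovered at a costly project milestone.
violations = {name: sum(not check(r) for r in records)
              for name, check in rules.items()}
for name, count in violations.items():
    print(f"{name}: {count} violation(s)")
```

Any rule with a nonzero violation count is an assumption the migration design must be corrected for before coding begins.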
Over the last few months I have had a large number of discussions regarding the best approach to achieving successful data migrations. Three concepts come up most frequently:
- business-oriented versus legacy system-oriented
- big bang versus incremental
- staged versus non-staged
These are valid considerations and deserve serious deliberation; however, by focusing directly on the mechanics, many projects, in my experience, completely miss what I believe to be the fundamental goal of any data migration: risk management. (more…)