Tag Archives: Data Migration

The Billion Dollar (Data Integration) Mistake

How would you like to wake up to an extra billion dollars, or maybe nine, in the bank? It happened to a teacher in India, who discovered to his astonishment a balance of $9.8 billion in his bank account!

How would you like to be the bank that gave a client an extra nine billion dollars? Oh, to be a fly on the wall when the IT department got that call. How do you even begin to explain? Imagine the scrambling to track down the source of the data error.

This was a glaringly obvious error, and easily caught. But there is potential for many smaller data errors, which may go undetected and add up, hurting your bottom line. How could this type of data glitch happen? More importantly, how can you protect your organization from these kinds of errors in your data?

A primary source of data mistakes is insufficient testing during data integration. Any change or movement of data puts its integrity at risk. Unfortunately, there are often insufficient IT resources to adequately validate the data. Some organizations validate the data manually, a lengthy, unreliable process that is itself fraught with data errors. Furthermore, manual testing does not scale well to large data volumes or complex data changes, so the validation is often incomplete. Finally, some organizations simply lack the resources to conduct any data validation at all.
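To make the idea concrete, here is a minimal sketch of the kind of check an automated validation tool performs: comparing row counts and a column checksum between a source and a target system after data has been moved. This is an illustrative toy, not Informatica's implementation; the table, column, and function names are all invented for the example, and SQLite stands in for the real databases.

```python
import sqlite3

def validate_table(src_conn, tgt_conn, table, numeric_col):
    """Compare row counts and a numeric-column checksum between source and target."""
    checks = {}
    for name, conn in (("source", src_conn), ("target", tgt_conn)):
        cur = conn.execute(
            f"SELECT COUNT(*), COALESCE(SUM({numeric_col}), 0) FROM {table}"
        )
        checks[name] = cur.fetchone()
    rows_match = checks["source"][0] == checks["target"][0]
    sums_match = checks["source"][1] == checks["target"][1]
    return rows_match and sums_match, checks

# Demo with in-memory databases standing in for the source and target systems.
src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for db in (src, tgt):
    db.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
src.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100.0), (2, 250.5)])
tgt.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100.0), (2, 250.5)])

ok, details = validate_table(src, tgt, "accounts", "balance")
print(ok)  # True when counts and sums agree
```

A real tool runs thousands of such preconfigured rules across tables automatically; the point of the sketch is simply that a $9.8 billion surprise in one account would fail the checksum comparison immediately.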

Data Validation – Customer Benefits

Many of our customers have successfully addressed this issue with automated data validation testing (also known as DVO). In a recent TechValidate survey, Informatica customers told us that they:

  • Reduce costs associated with data testing.
  • Reduce time associated with data testing.
  • Increase IT productivity.
  • Increase the business trust in the data.

Customers tell us that some of the biggest potential costs relate to the damage control required when something goes wrong with their data. The tale above, of our fortunate man and not-so-fortunate bank, is one example. Bad data can hurt a company’s reputation and lead to untold losses in market share and customer goodwill. In today’s highly regulated industries, such as healthcare and financial services, the consequences of incorrect data can be severe, including heavy fines or worse.

Automated data validation testing allows customers to save on ongoing testing costs and deliver reliable data. Just as important, it prevents pricey data errors that would otherwise require costly and time-consuming damage control. It is no wonder many of our customers tell us they recoup their investment in less than 12 months!

Data Validation – Use Cases

The TechValidate survey shows that customers are using data validation testing in a number of common use cases, including:

  • Regression (Unit) testing
  • Application migration or consolidation
  • Software upgrades (Applications, databases, PowerCenter)
  • Production reconciliation

One of the most beneficial use cases for data validation testing has been application migration and consolidation. Many SAP migration projects undertaken by our customers have benefited greatly from automated data validation testing. Application migration or consolidation projects are typically large and risky: a Bloor Research study found that 38% of data migration projects fail, incurring overages or being aborted altogether, and according to a Harvard Business Review article, 1 in 6 large IT projects runs 200% over budget. Poor data management is one of the leading pitfalls in these types of projects. Yet, according to Bloor Research, Informatica’s data validation testing is a capability they have not seen elsewhere in the industry.

A particularly interesting example of this use case arises in M&A situations. The merged company is required to deliver ‘day-1 reporting’, yet FTC regulations forbid the separate entities from seeing each other’s data prior to the merger. What a predicament! The automated nature of data validation testing (automatically deploying preconfigured rules on large data sets) enables our customers to prepare for successful day-1 reporting under these harsh conditions.

And what about you? What would it cost your business to deliver incorrect, incomplete or missing data? To learn more about how you can provide the right data on time, every time, please visit www.datavalidation.me

Posted in Data Integration

Comparative Costs and Uses for Data Integration Platforms – A Study from Bloor Research

For years, companies have wrestled with the promise of data integration platforms. Along the way, businesses have asked many questions, including:

  • Does Data Integration technology truly provide a clear path toward unified data?
  • Can businesses truly harness the potential of their information?
  • Can companies take powerful action as a result?

Recently, Bloor Research set out to evaluate how things were actually playing out on the ground. In particular, they wanted to determine which data integration projects were actually taking place, at what scale, and with what results. The study, “Comparative Costs and Uses for Data Integration Platforms,” was authored by Philip Howard, research director at Bloor. The study examined data integration tool suitability across a range of scenarios, including:

  1. Data migration and consolidation projects
  2. Master data management (MDM) and associated solutions
  3. Application-to-application integration
  4. Data warehousing and business intelligence implementations
  5. Synchronizing data with SaaS applications
  6. B2B data exchange

To draw conclusions, Bloor examined 292 responses from a range of companies. The respondents used a variety of data integration approaches, from commercial data integration tools to “hand-coding.”

Informatica is pleased to be able to offer you a copy of this research for your review. The research covers areas like:

  • Suitability
  • Productivity
  • Reusability
  • Total Cost of Ownership (TCO)

We welcome you to download a copy of “Comparative Costs and Uses for Data Integration Platforms” today. We hope these findings offer you insights as you implement and evaluate your data integration projects and options.

Posted in Data Integration, Data Integration Platform, Data Migration, Master Data Management

Oracle Data Migration Best Practices: Join Us And Learn

Are you interested in Oracle data migration best practices? Are you upgrading, consolidating or migrating to or from an Oracle application? Moving to the cloud or a hosted service? Research and experience confirm that the tasks associated with migrating application data during these initiatives have the biggest impact on whether the project is considered a failure or a success. So how do your peers ensure data migration success?

Informatica will be offering a full-day Oracle Migrations Best Practices workshop at the Oracle Applications Users Group’s annual conference, Collaborate 14, on April 7th in Las Vegas, NV. During this workshop, peers and experts will share best practices for avoiding the pitfalls and ensuring successful projects, lowering migration cost and risk. Our packed agenda includes:

  1. Free use and trials of data migration tools and software
  2. Full training sessions on how to integrate cloud-based applications
  3. How to provision test data using different data masking techniques
  4. How to ensure consistent application performance during and after a migration
  5. A review of Oracle Migration Best Practices and case studies

Case Study: EMC

One of the key case studies to be highlighted is EMC’s Oracle migration journey. EMC Corporation migrated to Oracle E-Business Suite, acquired more than 40 companies in 4 years, consolidated and retired environments, and is now on the path to migrating to SAP. Not only did they migrate applications, they also migrated their entire technology platform from physical to virtual on their journey to the cloud. Along the way they needed to control the impact of data growth and manage the size of their test environments, while reducing the risk of exposing sensitive data to unauthorized users during development cycles. With best practices, and help from Informatica, they estimate they have saved approximately $45M in IT costs throughout their migrations. Now, as they deploy a new analytics platform based on Hadoop, they are leveraging existing skill sets and Informatica tools to ensure data is loaded into Hadoop without missing a beat.

Case Study: Verizon

Verizon is the second case study we will discuss. They recently migrated to Salesforce.com and needed to ensure that more than 100 data objects were integrated with on-premises back-end applications. In addition, they needed to ensure that data was synchronized and kept secure in non-production environments in the cloud. They were able to leverage a cloud-based integration solution from Informatica to simplify their complex IT application architecture and maintain data availability and security, all while migrating a major business application to the cloud.

Case Study: OEM Heavy Equipment Manufacturer

The third case study we will review involves a well-known heavy equipment manufacturer facing two challenges: first, the need to separate data in an Oracle E-Business Suite application as the result of a divestiture; second, the need to control the impact of data growth on production application environments undergoing various upgrades. Using an innovative approach based on Smart Partitioning, this enterprise estimates it will save $23M over a five-year period while achieving 40% performance improvements across the board.

To learn more about what Informatica will be sharing at Collaborate 14, watch this video. If you are planning to attend Collaborate 14 this year and you are interested in joining us, you can register for the Oracle Migrations Best Practices Workshop here.

Posted in Application ILM, Data masking, Data Migration

Streamlining and Securing Application Test and Development Processes

Informatica recently hosted a webinar with Cognizant, who shared how they streamline test data management processes internally with Informatica Test Data Management and pass the benefits on to their customers. Proclaimed the world’s largest Quality Engineering and Assurance (QE&A) service provider, Cognizant has over 400 customers and thousands of testers, and is considered a thought leader in the testing practice.

We polled over 100 attendees on their top challenges with test data management, given the data and system complexities and the need to protect their clients’ sensitive data. Here are the results from that poll:

It was not surprising to see that generating test data sets and securing sensitive data in non-production environments were tied as the two biggest challenges. Data integrity/synchronization was a very close third.

Cognizant, together with Informatica, has been evolving its test data management offering to focus not only on securing sensitive data but also on improving testing efficiency by identifying, provisioning and resetting test data – tasks that consume as much as 40% of testing cycle times. Key components of this next-generation test data management platform include:

Sensitive Data Discovery – an integrated and automated process that searches data sets for exposed sensitive data. Many times, sensitive data resides in test copies unbeknownst to auditors. Once the data has been located, it can be masked in non-production copies.

Persistent Data Masking – masks sensitive data in flight while cloning data from production, or in place on a gold copy. Data formats are preserved while the original values are completely protected.

Data Privacy Compliance Validation – auditors want to know that data has in fact been protected, so the ability to validate and report on data privacy compliance becomes critical.

Test Data Management – in addition to creating test data subsets, clients require the ability to synthetically generate test data sets, with data aligned to each test case to help eliminate defects. Also, in many cases multiple testers work in the same environment and may clobber each other’s test data sets, so the ability to reset test data becomes a key requirement for improving efficiency.
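The discovery-then-mask flow described above can be sketched in a few lines of Python. This is an illustrative toy, not Informatica's or Cognizant's implementation: a single regular expression stands in for a sensitive-data discovery rule (a US-style SSN), and a digit-for-digit substitution stands in for format-preserving persistent masking. All names in the snippet are invented for the example.

```python
import random
import re

# A pattern standing in for one sensitive-data discovery rule (US-style SSN).
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def mask_digits(value, rng):
    """Replace every digit with a random digit, keeping separators intact
    so the masked value still passes format validation in test systems."""
    return "".join(
        str(rng.randrange(10)) if ch.isdigit() else ch for ch in value
    )

def discover_and_mask(rows, seed=0):
    """Scan every field of every row for SSN-like values and mask matches."""
    rng = random.Random(seed)
    return [
        {
            col: SSN_PATTERN.sub(lambda m: mask_digits(m.group(), rng), val)
            for col, val in row.items()
        }
        for row in rows
    ]

# A tiny "test copy" of production data, with one exposed SSN.
test_copy = [
    {"name": "Pat", "notes": "SSN on file: 123-45-6789"},
    {"name": "Sam", "notes": "no sensitive data here"},
]
safe_copy = discover_and_mask(test_copy)
print(safe_copy[0]["notes"])  # SSN digits replaced, format preserved
```

Because only matching substrings are rewritten and separators are untouched, non-sensitive fields pass through unchanged and masked values keep the original shape, which is exactly why masked test data still works in downstream validation logic.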

Figure 2: Next Generation Test Data Management

When asked what tools or services they had deployed, 78% said in-house developed scripts and utilities, an incredibly time-consuming approach with limited repeatability. Almost half of the respondents had deployed data masking.


Informatica and Cognizant are leading the way in establishing a new standard for test data management by incorporating test data generation, data masking, and the ability to refresh or reset test data sets. For more information, check out Cognizant’s offering based on Informatica, TDMaxim, and the white paper “Transforming Test Data Management for Increased Business Value.”

 

Posted in Data masking, Data Migration, Data Privacy

Enterprise Application Projects Are Much Riskier Than You Think

IT application managers are constantly integrating, modernizing and consolidating enterprise applications to keep them efficient and to maximize the business value they provide to the corporation for their cost.

But it is important to remember that these projects carry significant risk. An article in the Harvard Business Review states that 17% of enterprise application projects go seriously wrong, running 200% over budget and 70% over schedule. The HBR article refers to these projects as “black swans.”

How can you reduce the risk of project failure? Typically, 30% to 40% of an enterprise application project is data migration. A recent study by Bloor Research shows that while success rates for data migration projects are improving, 38% of them still miss their schedule and budget targets.

How can you improve the odds of success in data migration projects?

  1. Use data profiling tools to understand your data before you move it.
  2. Use data quality tools to correct data quality problems. There is absolutely no point in moving bad data around the organization – but it happens.
  3. Use a proven external methodology. In plain English: work with people who have “done it before.”
  4. Develop your own internal competence. Nobody knows your data, and more importantly the business context of your data, better than your own staff. Develop the skills and engage your business subject-matter experts.
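The first step in the list, profiling your data before you move it, can be sketched very simply. The snippet below is an illustrative toy rather than any vendor's profiler: it computes per-column row counts, null counts, and distinct-value counts over a small CSV extract, the basic statistics that reveal problems (missing countries, empty balances) before a migration begins. The column names and data are invented for the example.

```python
import csv
import io

def profile(rows):
    """Compute simple per-column stats: total rows, nulls, distinct values."""
    stats = {}
    if not rows:
        return stats
    for col in rows[0].keys():
        values = [r[col] for r in rows]
        non_null = [v for v in values if v not in ("", None)]
        stats[col] = {
            "rows": len(values),
            "nulls": len(values) - len(non_null),
            "distinct": len(set(non_null)),
        }
    return stats

# A small CSV standing in for an extract of the source system.
data = io.StringIO(
    "customer_id,country,balance\n"
    "1,US,100.50\n"
    "2,,250.00\n"
    "3,US,\n"
)
rows = list(csv.DictReader(data))
report = profile(rows)
print(report["country"])  # {'rows': 3, 'nulls': 1, 'distinct': 1}
```

Even this crude pass immediately surfaces the missing country and missing balance, which is precisely the kind of data quality problem step 2 says you should fix before moving anything.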

Informatica has industry-leading tools, a proven methodology, and a service delivery team with hundreds of successful data migration implementations.

To find out more about successful data migration:

  • Informatica World: Visit us at the Hands-On Lab – Data Migration.
  • Informatica World: Informatica presentation, “Application Data Migrations with Informatica Velocity Migration Methodology” – Friday, June 5, 2013, 9:00 to 10:00.
  • Informatica World: Data Migration Factory presentation by Accenture, “Accelerating the Power of Data Migration” – Tuesday, June 4, 2013, 2:00 to 3:00.

 

Posted in Application Retirement, Data Governance, Data Migration, Data Quality, Informatica Events

Agile Data Integration Maximizes Business Value In The Era Of Big Data

Adopting Agile may require a cultural shift and can be disruptive to an organization at first. However, as I mentioned in Part 1 of this blog series, Agile Data Integration holds the promise of increasing the chances of success, delivering projects faster, and reducing defects. Applying Lean principles within your organization can help ease the transition to Agile Data Integration. Lean is a set of principles first explored in the context of data integration by John Schmidt and David Lyle in their book on Lean Integration. First and foremost, Lean recommends that an organization focus on eliminating waste and optimizing the data integration process from the customer’s perspective. Agile Data Integration maximizes the business value of projects (e.g. Agile BI, data warehousing, big data analytics, data migration) because you can get it right the first time by delivering exactly what the business needs, when it needs it. Break big projects into smaller, more manageable deliverables so that you can incrementally deliver value to the business. Agile Data Integration also recommends the following: (more…)

Posted in Data Services

Video: Electronic Health Records Update

Richard Cramer, Chief Healthcare Strategist for Informatica shares some views on Electronic Health Record (EHR) adoption, including HITECH and Meaningful Use pressures. He also talks about the challenges that the future holds for EHRs.


 

Visit Informatica’s Healthcare pages for more on EMRs.

Posted in Big Data, Healthcare, VLog

The Power Of Data

I was happy to do an architect-to-architect webinar with David Lyle, which was more of an interactive conversation than a webinar. The focus was on providing integration using data virtualization, but the message was perhaps more profound than that.

The core issue that many enterprises face is that their information is largely an asset they cannot access. The data is locked up within years and years of ill-planned databases and applications, where core data, such as customer and sales information, is scattered throughout the enterprise. Most staffers and executives consider this to be “just the way it is.” (more…)

Posted in Data Integration

Solving FAS 166 And 167 Compliance With Data Integration And Data Quality

Back in June 2009, the Financial Accounting Standards Board (FASB) published Statement of Financial Accounting Standards No. 166, Accounting for Transfers of Financial Assets, and Statement No. 167, Amendments to FASB Interpretation No. 46(R), which change the way entities account for securitizations and special-purpose entities. The new standards impact financial institution balance sheets beginning in 2010 and require substantive changes to how banks account for many items, including securitized assets that had previously been excluded from these organizations’ balance sheets. Banks affected by the new accounting standards will be subject to higher risk-based regulatory capital requirements. So what does it all mean, and how much will it cost banks to comply? (more…)

Posted in Data Integration, Data Quality, Financial Services, Governance, Risk and Compliance, Vertical