Tag Archives: Data Validation Option

The Billion Dollar (Data Integration) Mistake

How would you like to wake up to an extra billion dollars, or maybe nine, in the bank? This happened to a teacher in India, who discovered to his astonishment a balance of $9.8 billion in his bank account!

How would you like to be the bank that gave the client an extra nine billion dollars? Oh, to be a fly on the wall when the IT department got that call. How do you even begin to explain? Imagine the scrambling to track down the source of the data error.

This was a glaringly obvious error, easily caught. But there is potential for many smaller data errors, which may go undetected and add up, hurting your bottom line. How could this type of data glitch happen? More importantly, how can you protect your organization from these types of errors in your data?

A primary source of data mistakes is insufficient testing during data integration. Any change or movement of data puts its integrity at risk. Unfortunately, there are often insufficient IT resources to adequately validate the data. Some organizations validate the data manually, a lengthy, unreliable process fraught with errors. Furthermore, manual testing does not scale well to large data volumes or complex data changes, so the validation is often incomplete. Finally, some organizations simply lack the resources to conduct any data validation at all.
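
To make the idea concrete, here is a minimal sketch of what an automated source-to-target reconciliation check might look like. This is a generic illustration, not Informatica's DVO; the table, columns and databases are hypothetical, using SQLite purely for demonstration.

```python
import sqlite3

def reconcile(src, tgt, table, numeric_cols):
    """Flag row-count and column-checksum mismatches between source and target."""
    issues = []
    n_src = src.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    n_tgt = tgt.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    if n_src != n_tgt:
        issues.append(f"row count: {n_src} vs {n_tgt}")
    for col in numeric_cols:
        # TOTAL() sums a numeric column; equal sums is a cheap integrity signal.
        s = src.execute(f"SELECT TOTAL({col}) FROM {table}").fetchone()[0]
        t = tgt.execute(f"SELECT TOTAL({col}) FROM {table}").fetchone()[0]
        if s != t:
            issues.append(f"checksum {col}: {s} vs {t}")
    return issues

src = sqlite3.connect(":memory:")
tgt = sqlite3.connect(":memory:")
for db in (src, tgt):
    db.execute("CREATE TABLE accounts (id INTEGER, balance REAL)")
src.execute("INSERT INTO accounts VALUES (1, 200.0)")
tgt.execute("INSERT INTO accounts VALUES (1, 9.8e9)")  # the billion-dollar glitch
print(reconcile(src, tgt, "accounts", ["balance"]))  # reports the balance mismatch
```

Real tools run far richer comparisons (value-level diffs, rules, thresholds), but even count-and-checksum tests like this would have caught a $9.8 billion discrepancy instantly.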

Data Validation: Customer Benefits

Many of our customers have successfully addressed this issue with automated data validation testing (also known as the Data Validation Option, or DVO). In a recent TechValidate survey, Informatica customers told us that they:

  • Reduce costs associated with data testing.
  • Reduce time associated with data testing.
  • Increase IT productivity.
  • Increase the business trust in the data.

Customers tell us some of the biggest potential costs relate to the damage control required when something goes wrong with their data. The tale above, of our fortunate man and not-so-fortunate bank, is one example. Bad data can hurt a company’s reputation and lead to untold losses in market share and customer goodwill. In today’s highly regulated industries, such as healthcare and financial services, the consequences of incorrect data can be severe, including heavy fines or worse.

Using automated data validation testing allows customers to reduce ongoing testing costs and deliver reliable data. Just as important, it prevents expensive data errors that require costly and time-consuming damage control. It is no wonder many of our customers tell us they recoup their investment in less than 12 months!

Data Validation: Use Cases

The TechValidate survey shows that customers are using data validation testing in a number of common use cases, including:

  • Regression (Unit) testing
  • Application migration or consolidation
  • Software upgrades (Applications, databases, PowerCenter)
  • Production reconciliation
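
As a sketch of the regression-testing use case above: one simple approach is to fingerprint a known-good result set and compare it to the output of each new run, so that an upgrade or mapping change that silently alters the data is caught immediately. This is an illustrative technique, not a description of any specific product; the rows and column names are invented.

```python
import hashlib
import json

def snapshot(rows):
    """Order-independent fingerprint of a result set, for regression comparison."""
    canonical = json.dumps(
        sorted(rows, key=lambda r: json.dumps(r, sort_keys=True)),
        sort_keys=True,
    )
    return hashlib.sha256(canonical.encode()).hexdigest()

# Baseline captured from a known-good run of the data integration job.
baseline = snapshot([{"id": 1, "amt": 200.0}, {"id": 2, "amt": 150.0}])

# After an upgrade, re-run the job and compare; row order is irrelevant.
after_upgrade = snapshot([{"id": 2, "amt": 150.0}, {"id": 1, "amt": 200.0}])
print(after_upgrade == baseline)  # identical content passes the regression check
```

Any change to a value, a dropped row, or an extra row produces a different fingerprint, turning "did the upgrade change my data?" into a single equality check.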

One of the most beneficial use cases for data validation testing has been application migration and consolidation. Many SAP migration projects undertaken by our customers have benefited greatly from automated data validation testing. Application migration and consolidation projects are typically large and risky: a Bloor Research study found that 38% of data migration projects fail, either running over budget or being aborted altogether, and according to a Harvard Business Review article, 1 in 6 large IT projects runs 200% over budget. Poor data management is one of the leading pitfalls in these types of projects. Notably, according to Bloor Research, Informatica’s data validation testing is a capability they have not seen elsewhere in the industry.

A particularly interesting example of this use case arises in M&A situations. The merged company is required to deliver ‘day-1 reporting’, yet FTC regulations forbid the separate entities from seeing each other’s data prior to the merger. What a predicament! The automated nature of data validation testing (automatically deploying preconfigured rules on large data sets) enables our customers to prepare for successful day-1 reporting under these constraints.

And what about you? What are the costs to your business of potentially delivering incorrect, incomplete or missing data? To learn more about how you can provide the right data on time, every time, please visit www.datavalidation.me


Agile Data Integration in Action: PowerCenter 9.6 Demo

A Data Integration Developer, a Data Analyst and a Business Analyst walk into a bar… Heard that one? You probably didn’t. They never made it to the bar. They are still back at the office, going back and forth for the umpteenth time on the data requirements for the latest report…

Sound familiar? If so, you are not alone. Many IT departments struggle to meet the data needs of their business counterparts. Spreadsheets, emails and cocktail napkins have not proven effective tools for relaying the business’s data requirements. The process takes too long and leaves both sides frustrated and dissatisfied with the outcome. IT does not have the bandwidth to meet the ever-increasing and rapidly changing data needs of the business.

The old-fashioned “waterfall” approach to data integration simply won’t cut it anymore in today’s fast-paced, data-driven world. There has to be a better way. Here at Informatica, we believe that an end-to-end Agile Data Integration process can greatly increase business agility.

We start with a highly collaborative process, in which IT and the analyst work closely together through iterations to define data integration requirements. IT empowers the analyst with self-service tools that enable rapid prototyping and data profiling. Once the analyst is happy with the data they have accessed and combined, they can seamlessly share the output with IT for final deployment. This approach greatly reduces time-to-data, and not just any data: the right data!

The ability to rapidly generate reports and deliver new critical data for decision-making is foundational to business agility. Another important aspect of business agility is the ability to scale your system as your needs grow to support more data, data types, users and projects. We accomplish this through advanced scaling capabilities, such as grid support and high availability, delivering zero downtime. We also improve data insights through metadata management, lineage, impact analysis and a business glossary.

Finally, we need to continue to ensure agility once our system is in production. Data validation should be performed to eliminate data defects. Trying to validate data manually is like looking for a needle in a haystack, very slowly… Automating your data validation process is fast and reliable, ensuring that the business gets accurate data all the time.
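
As a small illustration of the automated approach, production checks can be expressed as a library of reusable rules applied to every batch of records. This is a generic sketch, not Informatica's product; the rule names, field names and thresholds are all hypothetical.

```python
# Hypothetical validation rules; names and thresholds are illustrative only.
RULES = {
    "balance_is_non_negative": lambda r: r["balance"] >= 0,
    "balance_below_sanity_cap": lambda r: r["balance"] < 1_000_000,
    "account_id_present": lambda r: bool(r.get("account_id")),
}

def validate(records):
    """Return (record_index, rule_name) for every failed check."""
    return [
        (i, name)
        for i, rec in enumerate(records)
        for name, check in RULES.items()
        if not check(rec)
    ]

accounts = [
    {"account_id": "A-1", "balance": 200.0},
    {"account_id": "A-2", "balance": 9.8e9},  # an implausible balance
]
print(validate(accounts))  # flags A-2's out-of-range balance
```

Because the rules run automatically on every load, defects are reported in minutes rather than discovered by an unlucky customer.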

It is just as important to become more proactive and less reactive when it comes to your data in production. Early detection of data process and workflow problems through proactive monitoring is key to prevention.

Would you like to see a 5X increase in the speed of delivering data integration projects?

Would you like to provide the system reliability you need as your business grows, and ensure that your business continues to get the critical data it requires without defects and without interruption?

To learn more about how Agile Data Integration can enable business agility, please check out the demonstration of the newly-released PowerCenter 9.6, featuring David Lyle, VP Product Strategy at Informatica and the Informatica Product Desk experts. This demo webinar is available on demand.

Deep Dive Demo: Informatica PowerCenter 9.6.


Validating Data for Production Environments

The BBC published a news story about a teacher in India who looked at his bank account expecting to see a balance of $200, only to find a balance of $9.8 billion (480 billion rupees). Imagine that surprise!

These types of stories appear in the news on a regular basis, and they raise the question: how could an error of this magnitude happen, and what could have been done to prevent it?
