Everybody’s doing it. And if not, they say they are doing it anyway! Are you doing it? We all hear the mantra: ‘Data is an asset’. Everybody wants to get in on the action. Hear at Informatica World 2015 how Informatica customers are using data integration agility to drive business agility. These organizations are relying on the Informatica platform as their data foundation. Does that sound a bit vague? Let’s get more specific here…
Who hasn’t used PayPal to send a secure payment? Would you like to know how PayPal is managing the data integration architecture to support and analyze 11.6 million payments per day? Hear PayPal’s Architect chat with Informatica PowerCenter Product Managers. They will discuss Advanced Scaling, Metadata Management and Business Glossary. Would you like to learn how these PowerCenter capabilities can benefit your business? Add this session to your IW15 registration.
Verizon’s Architect will talk about consolidating 50+ legacy applications into a new application architecture. Wow, that’s a massive data management effort! Are you curious to learn more about how to successfully manage an application modernization effort of this scope using Informatica tools? Add this session to your IW15 registration.
Did you know that HP boasts one of the most complex and largest Informatica installations on the face of the earth? HP’s Informatica Shared Services architecture allows hundreds of projects throughout HP worldwide to use PowerCenter data integration capabilities. And they do so easily and cost effectively. Join the session to gain insight on the design, architecture, creation, support, and overall governance of this solution. Would you like to learn more from HP on how to maximize the benefits of your Informatica investment? Add this session to your IW15 registration.
You have probably had your tires replaced at Discount Tire. Would you like to learn more about how Discount Tire leverages the Informatica Platform to gain business advantage? Discount Tire’s architect will share how they test and monitor their Data Integration environment using PowerCenter capabilities, such as Data Validation Testing and Proactive Monitoring. They will discuss the benefit of doing so across multiple use cases. They will highlight Discount Tire’s migration to a new SAP application and how the Informatica platform supports this migration. Would you like to improve how you test and monitor your critical business processes? Add this session to your IW15 registration.
Finally, if your organization is planning any type of application modernization or rationalization, you will not want to miss this Informatica customer panel. This session will feature experts from Cisco, Verizon and Discount Tire. Speakers will share their experience, insights and best practices for Application Consolidation and Migration. As we discussed in a previous blog post, failure rates for these projects are staggeringly high. Arm yourself with proven best practices for data management; they have been shown to increase the success of application modernization projects! To learn more, add this session to your IW15 registration.
We hope you can join us at Informatica World 2015! Come and enjoy the wealth of experience and insights shared by these industry experts and many others. See you there!
First off, let me get one thing off my chest. If you don’t pay close attention to your data throughout the application consolidation or migration process, you are almost guaranteed delays and budget overruns. Data consolidation and migration is at least 30%-40% of the application go-live effort. We have learned this by helping customers deliver over 1,500 projects of this type. What’s worse, if you are not meticulous about your data, you can be assured of unhappy business stakeholders at the end of this treacherous journey. The users of your new application expect all their business-critical data to be there at the end of the road. All the bells and whistles in your new application will count for naught if the data falls apart. Imagine, if you will, students’ transcripts gone missing, or your frequent-flyer balance 100,000 miles short! Need I say more? Now, you may already be guessing where I am going with this. That’s right, we are talking about the myths and realities related to your data! Let’s explore a few of these.
Myth #1: All my data is there.
Reality #1: It may be there… but can you get to it? To find, access and move all the data out of your legacy systems, you need a good set of connectivity tools that can easily and automatically locate, access and extract data from each source system. You don’t want to hand-code this for every source. Ouch!
Myth #2: I can just move my data from point A to point B.
Reality #2: You can try that approach if you want. However, you might not be happy with the results. The reality is that there can be significant gaps and format mismatches between the data in your legacy system and the data required by your new application. Additionally, you will likely need to assemble data from disparate systems. You need sophisticated tools to profile, assemble and transform your legacy data so that it is purpose-fit for your new application.
Myth #3: All my data is clean.
Reality #3: It’s not. And here is a tip: better to profile, scrub and cleanse your data before you migrate it. You don’t want to put a shiny new application on top of questionable data. In other words, let’s give the data in your new application a fresh start!
Myth #4: All my data will move over as expected.
Reality #4: It will not. Any time you move and transform large sets of data, there is room for logical or operational errors and surprises. The best way to avoid this is to automatically validate that your data has moved over as intended.
Myth #5: It’s a one-time effort.
Reality #5: ‘Load and explode’ is a formula for disaster. Our proven methodology recommends that you first prototype your migration path and identify a small subset of the data to move over. Then test it, tweak your model, try it again and gradually expand. More importantly, your application architecture should not be a one-time effort. It is a work in progress and, really, an ongoing journey. Regardless of where you are on this journey, we recommend paying close attention to managing your application’s data foundation.
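To make Reality #4’s advice concrete, automated validation can be as simple as comparing fingerprints of the source and target data after each trial load. The sketch below is a minimal illustration in plain Python; the table rows, column names and helper functions are invented for the example, and a real migration would run such checks through dedicated tooling rather than hand-rolled scripts:

```python
# Hypothetical sketch: verify that data moved over intact by comparing
# row counts and an order-independent checksum of source vs. target.
import hashlib

def table_fingerprint(rows, columns):
    """Row count plus an order-independent checksum over selected columns."""
    digest = 0
    for row in rows:
        record = "|".join(str(row[c]) for c in columns)
        # XOR of per-row hashes makes the result insensitive to row order.
        digest ^= int(hashlib.sha256(record.encode()).hexdigest(), 16)
    return len(rows), digest

def validate_migration(source_rows, target_rows, columns):
    """Return a list of discrepancies; an empty list means the data matches."""
    issues = []
    src_count, src_sum = table_fingerprint(source_rows, columns)
    tgt_count, tgt_sum = table_fingerprint(target_rows, columns)
    if src_count != tgt_count:
        issues.append(f"row count mismatch: {src_count} vs {tgt_count}")
    elif src_sum != tgt_sum:
        issues.append("checksum mismatch: same row count but different content")
    return issues

source = [{"id": 1, "balance": 100.0}, {"id": 2, "balance": 250.5}]
target = [{"id": 2, "balance": 250.5}, {"id": 1, "balance": 100.0}]  # reordered, same data
print(validate_migration(source, target, ["id", "balance"]))  # → []
```

Even a toy check like this catches the two failure modes that matter most: rows that never arrived, and rows that arrived changed.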
As you can see, a multitude of data issues can plague an application consolidation or migration project and lead to its doom. These potential challenges are not always recognized and understood early on, and this perception gap is a root cause of project failure. That is why we are excited to host Philip Russom of TDWI in our upcoming webinar to discuss data management best practices and methodologies for application consolidation and migration. If you are undertaking any IT modernization or rationalization project, such as consolidating applications or migrating legacy applications to the cloud or to an on-premise application such as SAP, this webinar is a must-see.
So what’s your reality going to be like? Will your project run like a dream or will it escalate into a scary nightmare? Here’s hoping for the former. And also hoping you can join us for this upcoming webinar to learn more:
Webinar with TDWI:
Successful Application Consolidation & Migration: Data Management Best Practices.
Date: Tuesday March 10, 10 am PT / 1 pm ET
Don’t miss out, Register Today!
1) Gartner report titled “Best Practices Mitigate Data Migration Risks and Challenges” published on December 9, 2014
2) Harvard Business Review: ‘Why your IT project may be riskier than you think’.
The verdict is in. Data is now broadly perceived as a source of competitive advantage. We all feel the heat to deliver good data. It is no wonder organizations view Analytics initiatives as highly strategic. But the big question is, can you really trust your data? Or are you just creating pretty visualizations on top of bad data?
We also know there is a shift towards self-service Analytics. But did you know that according to Gartner, “through 2016, less than 10% of self-service BI initiatives will be governed sufficiently to prevent inconsistencies that adversely affect the business”?1 This means that you may actually show up at your next big meeting and have data that contradicts your colleague’s data. Perhaps you are not working off of the same version of the truth. Maybe you have siloed data on different systems and they are not working in concert? Or is your definition of ‘revenue’ or ‘leads’ different from that of your colleague’s?
So are we taking our data for granted? Are we just assuming that it’s all available, clean, complete, integrated and consistent? As we work with organizations to support their Analytics journey, we often find that the harsh realities of data are quite different from perceptions. Let’s further investigate this perception gap.
For one, people may assume they can easily access all data. In reality, if data connectivity is not managed effectively, we often need to beg, borrow and steal to get the right data from the right person. If we are lucky. In less fortunate scenarios, we may need to settle for partial data or a cheap substitute for the data we really wanted. And you know what they say, the only thing worse than no data is bad data. Right?
Another common misperception is: “Our data is clean. We have no data quality issues”. Wrong again. When we work with organizations to profile their data, they are often quite surprised to learn that their data is full of errors and gaps. One company recently discovered within one minute of starting their data profiling exercise, that millions of their customer records contained the company’s own address instead of the customers’ addresses… Oops.
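A basic version of that kind of profiling check, spotting a value that dominates a column the way the company’s own address did, can be sketched in a few lines. The function name, threshold and sample records below are hypothetical; real profiling tools run many such checks, automatically, at scale:

```python
# Hypothetical sketch: flag a column where one value is suspiciously dominant,
# e.g. millions of customer records all carrying the company's own address.
from collections import Counter

def profile_column(records, column, dominance_threshold=0.5):
    """Report the most common value in a column and whether its share
    of the records exceeds the threshold (a likely data quality issue)."""
    counts = Counter(r[column] for r in records)
    value, count = counts.most_common(1)[0]
    share = count / len(records)
    return {"value": value, "share": share, "suspicious": share > dominance_threshold}

customers = [
    {"id": 1, "address": "2100 Seaport Blvd"},  # the company's own address
    {"id": 2, "address": "2100 Seaport Blvd"},
    {"id": 3, "address": "2100 Seaport Blvd"},
    {"id": 4, "address": "17 Elm St"},
]
report = profile_column(customers, "address")
print(report["suspicious"])  # → True
```

The point is that a one-minute profiling pass surfaces issues no amount of eyeballing a spreadsheet will catch.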
Another myth is that all data is integrated. In reality, your data may reside in multiple locations: in the cloud, on premise, in Hadoop and on mainframe and anything in between. Integrating data from all these disparate and heterogeneous data sources is not a trivial task, unless you have the right tools.
And here is one more consideration to mull over. Do you find yourself manually hunting down and combining data to reproduce the same ad hoc report over and over again? Perhaps you often find yourself doing this in the wee hours of the night? Why reinvent the wheel? It would be more productive to automate the process of data ingestion and integration for reusable and shareable reports and Analytics.
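The idea of replacing that recurring manual effort with a reusable, automated step can be illustrated with a small sketch. The data shapes and field names here are made up for the example; the principle is simply that the join-and-aggregate logic is written once and rerun on demand:

```python
# Hypothetical sketch: a recurring ad hoc report turned into a reusable function
# that joins orders to customers and aggregates revenue per region.
def build_revenue_report(orders, customers_by_id):
    """Return total order revenue keyed by customer region."""
    revenue = {}
    for order in orders:
        region = customers_by_id[order["customer_id"]]["region"]
        revenue[region] = revenue.get(region, 0.0) + order["amount"]
    return revenue

customers_by_id = {1: {"region": "EMEA"}, 2: {"region": "AMER"}}
orders = [
    {"customer_id": 1, "amount": 120.0},
    {"customer_id": 2, "amount": 80.0},
    {"customer_id": 1, "amount": 30.0},
]
print(build_revenue_report(orders, customers_by_id))  # → {'EMEA': 150.0, 'AMER': 80.0}
```

Once the ingestion and integration logic lives in one shareable place, the midnight copy-and-paste sessions can stop.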
Simply put, you need great data for great Analytics. We are excited to host Philip Russom of TDWI in a webinar to discuss how data management best practices can enable successful Analytics initiatives.
And how about you? Can you trust your data? Please join us for this webinar to learn more about building a trust-relationship with your data!
1) Gartner Report, ‘Predicts 2015: Power Shift in Business Intelligence and Analytics Will Fuel Disruption’; Josh Parenteau, Neil Chandler, Rita L. Sallam, Douglas Laney, Alan D. Duncan; November 21, 2014.
How would you like to wake up to an extra billion dollars, or maybe nine, in the bank? This has happened to a teacher in India. He discovered to his astonishment a balance of $9.8 billion in his bank account!
How would you like to be the bank that gave the client an extra nine billion dollars? Oh, to be a fly on the wall when the IT department got that call. How do you even begin to explain? Imagine the scrambling to track down the source of the data error.
This was a glaringly obvious error, and easily caught. But there is potential for many smaller data errors. These errors may go undetected and add up, hurting your bottom line. How could this type of data glitch happen? More importantly, how can you protect your organization from these types of errors in your data?
A primary source of data mistakes is insufficient testing during Data Integration. Any change or movement of data harbors risk to its integrity. Unfortunately, there are often insufficient IT resources to adequately validate the data. Some organizations validate the data manually. This is a lengthy, unreliable process, fraught with errors. Furthermore, manual testing does not scale well to large data volumes or complex data changes, so the validation is often incomplete. Finally, some organizations simply lack the resources to conduct any data validation at all.
Many of our customers have successfully addressed this issue via automated data validation testing (also known as DVO). In a recent TechValidate survey, Informatica customers told us that they:
- Reduce costs associated with data testing.
- Reduce time associated with data testing.
- Increase IT productivity.
- Increase the business trust in the data.
Customers tell us that some of the biggest potential costs relate to the damage control that occurs when something goes wrong with their data. The tale above, of our fortunate man and not-so-fortunate bank, is one example. Bad data can hurt a company’s reputation and lead to untold losses in market share and customer goodwill. In today’s highly regulated industries, such as healthcare and financial services, the consequences of incorrect data can be severe. They can include heavy fines or worse.
Using automated data validation testing allows customers to save on ongoing testing costs and deliver reliable data. Just as important, it prevents pricey data errors, which require costly and time-consuming damage control. It is no wonder many of our customers tell us they are able to recoup their investment in less than 12 months!
The TechValidate survey shows that customers are using data validation testing in a number of common use cases, including:
- Regression (Unit) testing
- Application migration or consolidation
- Software upgrades (Applications, databases, PowerCenter)
- Production reconciliation
One of the most beneficial use cases for data validation testing has been application migration and consolidation. Many SAP migration projects undertaken by our customers have greatly benefited from automated data validation testing. Application migration or consolidation projects are typically large and risky. A Bloor Research study has shown that 38% of data migration projects fail, incur overages, or are aborted altogether. According to a Harvard Business Review article, 1 in 6 large IT projects runs 200% over budget. Poor data management is one of the leading pitfalls in these types of projects. Notably, according to Bloor Research, Informatica’s data validation testing is a capability they have not seen elsewhere in the industry.
A particularly interesting example of this use case arises in M&A situations. The merged company is required to deliver ‘day-1 reporting’. However, FTC regulations forbid the separate entities from seeing each other’s data prior to the merger. What a predicament! The automated nature of data validation testing (automatically deploying preconfigured rules on large data sets) enables our customers to prepare for successful day-1 reporting under these harsh conditions.
And what about you? What are the costs to your business for potentially delivering incorrect, incomplete or missing data? To learn more about how you can provide the right data on time, every time, please visit www.datavalidation.me
Question: What do American Airlines, Liberty Mutual, Discount Tire and MD Anderson all have in common?
a) They are all top in their field.
b) They all view data as critical to their business success.
c) They are all using Agile Data Integration to drive business agility.
d) They have spoken about their Data Integration strategy at Informatica World in Vegas.
Did you reply ‘all of the above’? If so, give yourself a Ding Ding Ding. Or shall we say Ka-Ching, in honor of our host city?
Indeed, data experts from these companies and many more flocked to Las Vegas for Informatica World. They shared their enthusiasm for the important role of data in their business. These industry leaders discussed best practices that facilitate an Agile Data Integration process.
American Airlines recently completed a merger with US Airways, making them the largest airline in the world. In order to service critical reporting requirements for the merged airlines, the enterprise data team undertook a huge Data Integration task. This effort involved large-scale data migration and included many legacy data sources. The project required transferring over 4TB of current history data for Day 1 reporting. There is still a major task of integrating multiple combined subject areas in order to give a full picture of combined reporting.
American Airlines architects recommend the use of Data Integration design patterns in order to improve agility. The architects shared success factors for merger Data Integration. They discussed the importance of ownership by leadership from IT and the business. They emphasized the benefit of open and honest communication between teams. The architects also highlighted the need to identify integration teams and priorities. Finally, the architects discussed the significance of understanding cultural differences and celebrating success. The team summarized with merger Data Integration lessons learned: metadata is key, IT and business collaboration is critical, and profiling and access to the data are helpful.
Liberty Mutual, the third largest property and casualty insurer in the US, has grown through acquisitions. The Data Integration team needs to support this business process. They have been busy integrating five claim systems into one. They are faced with a large-scale Data Integration challenge. To add to the complexity, their business requires that each phase is completed in one weekend, that no data is lost in the process and that all finances balance out at the end of each merge. Integrating all claims in a single location was critical for smooth processing of insurance claims. A single system also leads to reduced costs and complexity for support and maintenance.
Liberty Mutual experts recommend a methodology of work preparation, profiling, delivery and validation. Rinse and repeat. Additionally, the company chose to utilize a visual Data Integration tool. This tool was quick and easy for the team to learn and greatly enhanced development agility.
Discount Tire, the largest independent tire dealer in the USA, shared tips and tricks from migrating legacy data into a new SAP system. This complex project included data conversion from 50 legacy systems. The company needs to combine and aggregate data from many systems, including customer, sales, financial and supply chain. This integrated system helps Discount Tire make key business decisions and stay ahead in a highly competitive space.
Discount Tire has automated their data validation process in development and in production. This reduces testing time, minimizes data defects and increases agility of development and operations. They have also implemented proactive monitoring in order to accomplish early detection and correction of data problems in production.
MD Anderson Cancer Center is the No. 1 hospital for cancer care in the US according to U.S. News and World Report. They are pursuing the lofty goal of erasing cancer from existence. Data Integration is playing an important role in this fight against cancer. In order to accomplish their goal, MD Anderson researchers rely on integration of vast amounts of genomic, clinical and pharmaceutical data to facilitate leading-edge cancer research.
MD Anderson experts pursue Agile Data Integration through close collaboration between IT and business stakeholders. This enables them to meet the data requirements of the business faster and better. They shared that data insights, through metadata management, offer a significant value to the organization. Finally the experts at MD Anderson believe in ‘Map Once, Deploy Anywhere’ in order to accomplish Agile Data Integration.
So let’s recap, Data Integration is helping:
– An airline to continue serving its customers and run its business smoothly post-merger,
– A tire retailer to procure and provide tires to its customers and maintain market leadership,
– An insurance company to process claims accurately and in a timely manner while minimizing costs, and
– A cancer research center to cure cancer.
Not too shabby, right? Data Integration is clearly essential to business success!
So OK, I know, I know… what happens in Vegas, stays in Vegas. Still, this was one love-fest I was compelled to share! Wish you were there. Hopefully you will next year!
To learn more about Agile Data Integration, check out this webinar: Great Data by Design II: How to Get Started with Next-Gen Data Integration
A Data Integration Developer, a Data Analyst and a Business Analyst go into a bar… Heard that one? You probably didn’t. They never made it to the bar. They are still back at the office, going back and forth for the umpteenth time on the data requirements for the latest report…
Sound familiar? If so, you are not alone. Many IT departments are struggling to meet the data needs of their business counterparts. Spreadsheets, emails and cocktail napkins have not proven to be effective tools for relaying the business’s data requirements. The process takes too long and leaves both sides frustrated and dissatisfied with the outcome. IT does not have the bandwidth to meet the ever-increasing and rapidly changing data needs of the business.
The old-fashioned “waterfall” approach to data integration simply won’t cut it anymore in the fast-paced data-driven world. There has to be a better way. Here at Informatica, we believe that an end-to-end Agile Data Integration process can greatly increase business agility.
We start with a highly collaborative process, whereby IT and the Analyst work closely together through an iterative process to define data integration requirements. IT empowers the analyst with self-service tools that enable rapid prototyping and data profiling. Once the analyst is happy with the data they access and combine, they can use their tool to seamlessly share the output with IT for final deployment. This approach greatly reduces the time-to-data, and not just any data, the right data!
The ability to rapidly generate reports and deliver new critical data for decision-making is foundational to business agility. Another important aspect of business agility is the ability to scale your system as your needs grow to support more data, data types, users and projects. We accomplish this through advanced scaling capabilities, such as grid support and high availability, which lead to zero downtime. We also deliver improved data insights through metadata management, lineage, impact analysis and business glossary.
Finally, we need to continue to ensure agility when our system is in production. Data validation should be performed to eliminate data defects. Trying to manually validate data is like looking for a needle in a haystack, very slowly… Automating your data validation process is fast and reliable, ensuring that the business gets accurate data all the time.
It is just as important to become more proactive and less reactive when it comes to your data in production. Early detection of data process and workflow problems through proactive monitoring is key to prevention.
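As a rough illustration of what proactive monitoring means in practice, a check might compare the latest workflow run against a historical baseline and raise an alert before bad data reaches users. The run records, field names and threshold below are illustrative assumptions, not any particular product’s API:

```python
# Hypothetical sketch: detect a workflow run whose loaded row count deviates
# sharply from the historical average, or that failed outright.
from statistics import mean

def check_run(history, latest, tolerance=0.5):
    """Return alerts if the latest run failed, or if its row count deviates
    from the historical average by more than `tolerance` (a fraction)."""
    alerts = []
    if latest["status"] != "success":
        alerts.append(f"run failed with status {latest['status']!r}")
    baseline = mean(run["rows_loaded"] for run in history)
    if abs(latest["rows_loaded"] - baseline) > tolerance * baseline:
        alerts.append(
            f"rows_loaded {latest['rows_loaded']} deviates from baseline {baseline:.0f}"
        )
    return alerts

history = [{"rows_loaded": 1000}, {"rows_loaded": 1050}, {"rows_loaded": 980}]
latest = {"status": "success", "rows_loaded": 90}  # sudden drop: likely a problem
print(check_run(history, latest))
```

A run that "succeeds" while loading a tenth of the usual rows is exactly the kind of silent failure that reactive troubleshooting misses until the business calls.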
Would you like to see a 5X increase in the speed of delivering data integration projects?
Would you like to provide the system reliability you need as your business grows, and ensure that your business continues to get the critical data it requires without defects and without interruption?
To learn more about how Agile Data Integration can enable business agility, please check out the demonstration of the newly-released PowerCenter 9.6, featuring David Lyle, VP Product Strategy at Informatica and the Informatica Product Desk experts. This demo webinar is available on demand.
Now you can experience the next best thing by attending InformaticaWorld 2014 and hearing the American Airlines US Airways Data Architects talk about the data challenges they faced. They will discuss the role of architecture in M&A, integrating legacy data, lessons learned, and best practices in Data Integration.
While you are at the show, you will have the opportunity to hear many industry experts discuss current trends in Agile end-to-end Data Integration.
Agile Data Integration Development
To deliver the agility that your business requires, IT and the business must pursue a collaborative Data Integration process, with the appropriate analyst self-service Data Integration tools. At InformaticaWorld, you can learn about Agile Data Integration development from the experts at GE Aviation, who will discuss Agile Data Integration for Big Data Analytics. Experts from Roche will discuss how Agile Data Integration has led to a 5x reduction in development time, improved business self-service capabilities and increased data credibility.
Another aspect of agility is your ability to scale your Data Warehouse to rapidly support more data, data sources, users and projects. Come hear the experts from Liberty Mutual share challenges, pitfalls, best practices and recommendations for those considering large-scale Data Integration projects, including successful implementation of complex data migrations, data quality and data distribution processes.
The management of an enterprise-scale Data Warehouse involves operating a mature and complex mission-critical environment, commonly driven through an Integration Competency Center (ICC) initiative. You then need to inspect and adapt your production system and expedite data validation and monitoring through automation, so that data issues can be caught and corrected quickly and resources can be freed up to focus on development.
The experts from University of Pittsburgh Medical Center, along with Informatica Professional Services experts, will discuss best practices, lessons learned and the process of transitioning from ‘analytics as project’ to an enterprise initiative through the use of an Integration Competency Center.
Hear from the Informatica Product Experts
You will have many opportunities to hear directly from the Informatica product experts about end-to-end Data Integration Agility delivered in the recent 9.6 release of PowerCenter.
See PowerCenter 9.6 in Action
Don’t miss the opportunity to see live demos of the cool new features of the PowerCenter 9.6 release at the multitude of hands-on labs being offered at InformaticaWorld this year.
For example you can learn how to empower business users through self-service Data Integration with PowerCenter Analyst tool; how to reduce testing time of Data Integration projects through automated validation tests; and how to scale your Data Integration with High Availability and Grid.
The sessions we described here are a sampling of the rich variety of sessions that will be offered on Data Integration at the show. We hope that you will join us at InformaticaWorld this year in Las Vegas on May 13-15 and as you plan your visit, please check out the complete listing of sessions and labs that are focused on Data Integration.
Please feel free to leave a comment and let us know which InformaticaWorld session/s you are most looking forward to! See you there!