Tag Archives: Agile
Question: What do American Airlines, Liberty Mutual, Discount Tire and MD Anderson all have in common?
a) They are all top in their field.
b) They all view data as critical to their business success.
c) They are all using Agile Data Integration to drive business agility.
d) They have spoken about their Data Integration strategy at Informatica World in Vegas.
Did you answer all of the above? If so, give yourself a Ding Ding Ding. Or shall we say Ka-Ching, in honor of our host city?
Indeed Data experts from these companies and many more flocked to Las Vegas for Informatica World. They shared their enthusiasm for the important role of data in their business. These industry leaders discussed best practices that facilitate an Agile Data Integration process.
American Airlines recently completed a merger with US Airways, making them the largest airline in the world. In order to service critical reporting requirements for the merged airlines, the enterprise data team undertook a huge Data Integration task. This effort involved large-scale data migration and included many legacy data sources. The project required transferring over 4TB of current history data for Day 1 reporting. There is still a major task of integrating multiple combined subject areas in order to give a full picture of combined reporting.
American Airlines architects recommend the use of Data Integration design patterns to improve agility. The architects shared success factors for merger Data Integration. They discussed the importance of ownership by leadership from both IT and the business, and emphasized the benefit of open and honest communication between teams. The architects also highlighted the need to identify integration teams and priorities. Finally, they discussed the significance of understanding cultural differences and celebrating success. The team summarized with merger Data Integration lessons learned: metadata is key, IT and business collaboration is critical, and profiling of and access to the data is helpful.
Liberty Mutual, the third largest property and casualty insurer in the US, has grown through acquisitions, and the Data Integration team needs to support this business process. They have been busy integrating five claim systems into one, a large-scale Data Integration challenge. To add to the complexity, the business requires that each phase be completed in one weekend, that no data be lost in the process, and that all finances balance at the end of each merge. Integrating all claims in a single location was critical for smooth processing of insurance claims; a single system also reduces the cost and complexity of support and maintenance.
Liberty Mutual experts recommend a methodology of work preparation, profiling, delivery and validation. Rinse and repeat. Additionally, the company chose to utilize a visual Data Integration tool. This tool was quick and easy for the team to learn and greatly enhanced development agility.
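The profiling step in that methodology can be sketched in a few lines. This is an illustrative example only, assuming claim records arrive as a pandas DataFrame; the column names here are hypothetical and not drawn from Liberty Mutual's actual systems.

```python
# A minimal profiling sketch: summarize each column's type, null count,
# and distinct-value count before attempting a merge. Sample data is made up.
import pandas as pd

claims = pd.DataFrame({
    "claim_id": [101, 102, 103, 104],
    "amount":   [1200.0, None, 450.0, 980.0],
    "state":    ["MA", "MA", None, "NH"],
})

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """One row per source column: dtype, nulls, and distinct values."""
    return pd.DataFrame({
        "dtype":    df.dtypes.astype(str),
        "nulls":    df.isna().sum(),
        "distinct": df.nunique(),
    })

print(profile(claims))
```

A report like this surfaces missing amounts or inconsistent state codes before the weekend cutover, rather than during it.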
Discount Tire, the largest independent tire dealer in the USA, shared tips and tricks from migrating legacy data into a new SAP system. This complex project included data conversion from 50 legacy systems. The company needs to combine and aggregate data from many systems, including customer, sales, financial and supply chain. This integrated system helps Discount Tire make key business decisions and stay ahead in a highly competitive space.
Discount Tire has automated their data validation process in development and in production. This reduces testing time, minimizes data defects and increases agility of development and operations. They have also implemented proactive monitoring in order to accomplish early detection and correction of data problems in production.
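One common way to automate that kind of validation is to compare row counts and an order-independent checksum between source and target after each load. The sketch below is a hedged illustration of the general technique, not Discount Tire's actual implementation; the table contents are invented.

```python
# Fingerprint a table extract as (row count, order-independent checksum),
# so a source/target comparison catches dropped or altered rows.
import hashlib
import json

def table_fingerprint(rows):
    """Hash each row deterministically, then hash the sorted digests."""
    digests = sorted(
        hashlib.sha256(json.dumps(r, sort_keys=True).encode()).hexdigest()
        for r in rows
    )
    combined = hashlib.sha256("".join(digests).encode()).hexdigest()
    return len(rows), combined

source = [{"sku": "TIRE-1", "qty": 4}, {"sku": "TIRE-2", "qty": 2}]
target = [{"sku": "TIRE-2", "qty": 2}, {"sku": "TIRE-1", "qty": 4}]  # reordered

# Validation passes: same rows, regardless of load order.
assert table_fingerprint(source) == table_fingerprint(target)
```

Running a check like this on every load, in development and production, is what turns validation from a manual testing task into the kind of proactive monitoring described above.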
MD Anderson Cancer Center is the No. 1 hospital for cancer care in the US according to U.S. News and World Report. They are pursuing the lofty goal of erasing cancer from existence. Data Integration is playing an important role in this fight against cancer. In order to accomplish their goal, MD Anderson researchers rely on integration of vast amounts of genomic, clinical and pharmaceutical data to facilitate leading-edge cancer research.
MD Anderson experts pursue Agile Data Integration through close collaboration between IT and business stakeholders. This enables them to meet the data requirements of the business faster and better. They shared that data insights, through metadata management, offer a significant value to the organization. Finally the experts at MD Anderson believe in ‘Map Once, Deploy Anywhere’ in order to accomplish Agile Data Integration.
So let’s recap, Data Integration is helping:
– An airline to continue serving its customers and running its business smoothly post-merger,
– A tire retailer to procure and provide tires to its customers and maintain its market leadership,
– An insurance company to process claims accurately and in a timely manner, while minimizing costs, and
– A cancer research center to cure cancer.
Not too shabby, right? Data Integration is clearly essential to business success!
So OK, I know, I know… what happens in Vegas, stays in Vegas. Still, this was one love-fest I was compelled to share! Wish you were there. Hopefully you will be next year!
To learn more about Agile Data Integration, check out this webinar: Great Data by Design II: How to Get Started with Next-Gen Data Integration
If you have been following publications in the Potential at Work Community or any number of LinkedIn discussions, such as this one on the DrJJ group (a think-tank for information management best practices), you will have noticed the Agile methodology topic come up time and time again. For instance, check out the article Architect Your Way From Sluggish to Speed or the video Focus on Agility Adaptability. It hasn’t always been this way. For many years the architectural focus was on RASP.
So might read the subject line in a memo to business users from IT staff responsible for implementing a new application system. Changing requirements in a project is one of the most frustrating (for both business and IT staff) and time-consuming activities in a large project; so much so that sometimes it is the cause of massive project delays or even cancellation. But there is something wrong with the subject line; it presumes that the business users are to blame. They are not. Let’s explore the real root causes and the solutions to them. (more…)
Several Informatica bloggers have been writing about Lean Data Management for about a year, covering Agile Data Integration, Open Data, and the Lean Data Warehouse, to name a few topics. And now the book is finally available in bookstores. OK, to be fair, the book is about the broader topic of Lean IT, and I was one of eight authors who contributed to it; I wrote the chapter on Lean Data Management: The Invisible Dimension. Here is the introduction to the chapter: (more…)
The cover of the September 10 issue of ComputerWorld caught my attention; the headline was Rebirth of Re-Engineering. I was intrigued how the analysts and pundits would spin Business Process Reengineering since I hadn’t seen the BPR acronym much since it fell out of favor around the turn of the century. As it turns out, the NEW BPR is all about Lean and Agile and is being led by IT. Wow! (more…)
This article explores Agile Data Integration and Business Intelligence practices and contrasts leading practices and technologies. First some definitions.
Agile DI is the application of agile techniques (iterative/incremental development, cross-functional self-organizing teams, rapid/flexible response to change, etc.) to address data integration challenges such as migrating data between systems or consolidating data from multiple systems. Agile BI is the application of agile techniques to address business intelligence challenges such as identifying and analyzing data to support better business decision-making. These two disciplines sometimes overlap or support each other. For example, you might use Agile DI to move data into a data warehouse and Agile BI to get it out of the warehouse in a useful form. (more…)
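To make the DI half of that definition concrete, here is a toy sketch of one consolidation step: merging customer records from two source systems into a single deduplicated target. The system names, fields and values are all hypothetical, chosen only to illustrate the pattern.

```python
# Consolidate customer records from two source systems into one target,
# deduplicating on the business key. All data here is invented.
import pandas as pd

system_a = pd.DataFrame({"cust_id": [1, 2], "email": ["a@x.com", "b@x.com"]})
system_b = pd.DataFrame({"cust_id": [2, 3], "email": ["b@x.com", "c@x.com"]})

# Union the sources, keep the first record seen for each key.
consolidated = (
    pd.concat([system_a, system_b])
      .drop_duplicates(subset="cust_id")
      .sort_values("cust_id")
      .reset_index(drop=True)
)
print(consolidated)
```

In an agile setting, a small, testable step like this is exactly the unit that gets built, demonstrated and revised in a single iteration.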
Collaborative learning is essential for transforming work activities that involve a high degree of uncertainty and creativity into a lean value stream. These characteristics are common in enterprise integration initiatives due to unclear and inconsistent data definitions across multiple silos, rapidly changing requirements and lack of perfect knowledge around end-to-end processes. Traditional approaches generally end up propagating the integration hairball which is inefficient and wasteful – and certainly not Lean. You could say that these value streams are simply immature processes that lack standards and metrics, which is true, but the practitioners that are involved in the process don’t see it that way. They see themselves as highly skilled professionals solving complex unique problems and delivering customized solutions that fit like a glove. But yet, the outside observer who looks at the end-to-end process at the macro level sees patterns that are repeated over and over again and what appears to be a great deal of “reinventing the wheel.” (more…)
If you’re reading this article, you’re probably interested in big data, but don’t really know what you’re looking for with big data or how you’ll find it. Don’t feel confused. It’s not like traditional analytics, where you know the structure of the data – the relations across sources, the dimensions to build, the calculations to perform – and the reports you need. Big data can be completely unstructured, with no clear relationships. And you don’t know what you’re looking for until you find patterns. A complaint might come in about an online shopping cart being wiped out, which is what happened to my wife with a big toy retailer during some online Christmas shopping. I’ll tell you right now they ended up losing a lot of money. If they’re using big data, they might find a pattern of the steps that made her hit that bug. Then they might search for all customers that had the same problem, get their e-mail addresses or names, and do a recovery campaign. I hope the retailer is using big data properly. My wife would receive a call, and get that order. They’d be happy, and I’d be happy.
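The recovery campaign described above boils down to a simple query once the pattern is known: find every customer whose session hit the same failure, then collect their contact details. The sketch below is purely hypothetical; the event names and addresses are made up for illustration.

```python
# Given a stream of (customer, event) records, find everyone who hit the
# cart-wipe bug so they can be targeted with a recovery campaign.
events = [
    {"email": "pat@example.com", "event": "cart_wiped"},
    {"email": "sam@example.com", "event": "checkout_ok"},
    {"email": "lee@example.com", "event": "cart_wiped"},
]

affected = sorted({e["email"] for e in events if e["event"] == "cart_wiped"})
print(affected)  # the list of customers to contact
```

The hard part in a real big data setting is the step before this one: discovering, from unstructured clickstream logs, that "cart_wiped" is a pattern worth querying for at all.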
I had the opportunity to review and comment on the draft of a new Hadoop technical guide. It’s great to see the published paper: Technical Guide: Unleashing the Power of Hadoop with Informatica. This guide outlines the following five steps to get started with Hadoop from a data integration perspective.
(1) Select the Right Projects for Hadoop Implementation
Choose projects that fit Hadoop’s strengths and minimize its disadvantages. Enterprises use Hadoop in data-science applications for log analysis, data mining, machine learning and image processing involving unstructured or raw data. Hadoop’s lack of a fixed schema works particularly well for answering ad-hoc queries and exploratory “what if” scenarios. The Hadoop Distributed File System (HDFS) and MapReduce address the growth in enterprise data volumes from terabytes to petabytes and beyond, as well as the increasing variety of complex multi-dimensional data from disparate sources. (more…)
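The MapReduce model mentioned above can be illustrated in miniature without a cluster: a map phase emits key/value pairs from raw records, and a reduce phase aggregates them by key. This toy example counts HTTP status codes in log lines; the sample data is invented, and a real Hadoop job would distribute both phases across many nodes.

```python
# A single-process toy illustration of the MapReduce model: count status
# codes in web-server log lines. Sample log data is made up.
from collections import defaultdict

logs = [
    "GET /cart 200", "GET /cart 500", "POST /checkout 200", "GET /cart 500",
]

def map_phase(line):
    # Emit (status_code, 1) for each log line.
    yield line.rsplit(" ", 1)[1], 1

def reduce_phase(pairs):
    # Aggregate counts by key.
    counts = defaultdict(int)
    for key, value in pairs:
        counts[key] += value
    return dict(counts)

pairs = [kv for line in logs for kv in map_phase(line)]
print(reduce_phase(pairs))  # {'200': 2, '500': 2}
```

This log-analysis shape, simple per-record mapping followed by aggregation over huge volumes, is exactly the kind of project the guide flags as a good first fit for Hadoop.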
In the February 26, 2011 edition of InformationWeek, Chris Murphy wrote a compelling article reinforcing the need for IT to move faster. He highlighted several CIOs and what they are doing to move more quickly. Examples included:
- Dropping project cycle time from six to three months based on analyzing project data and baselining techniques
- Developing data centers and software development models that work off of a common set of standards for everything (factory approach)
- Deploying a minimal set of functionality, e.g. iPod touch for POS, and then seeing what suggestions for new innovation are unveiled
- Adopting new delivery models such as Agile (more…)