Tag Archives: Agile
Several Informatica bloggers have been writing about Lean Data Management for about a year, covering topics such as Agile Data Integration, Open Data, and the Lean Data Warehouse, to name a few. And now the book is finally available in bookstores. OK, to be fair, the book is about the broader topic of Lean IT, and I was one of eight authors who contributed to it; I wrote the chapter on Lean Data Management: The Invisible Dimension. Here is the introduction to the chapter: (more…)
The cover of the September 10 issue of ComputerWorld caught my attention; the headline was Rebirth of Re-Engineering. I was intrigued to see how the analysts and pundits would spin Business Process Reengineering, since I hadn’t seen the BPR acronym much since it fell out of favor around the turn of the century. As it turns out, the NEW BPR is all about Lean and Agile and is being led by IT. Wow! (more…)
This article explores Agile Data Integration and Agile Business Intelligence, contrasting the leading practices and technologies in each. First, some definitions.
Agile DI is the application of agile techniques (iterative/incremental development, cross-functional self-organizing teams, rapid/flexible response to change, etc.) to address data integration challenges such as migrating data between systems or consolidating data from multiple systems. Agile BI is the application of agile techniques to address business intelligence challenges such as identifying and analyzing data to support better business decision-making. These two disciplines sometimes overlap or support each other. For example, you might use Agile DI to move data into a data warehouse and Agile BI to get it out of the warehouse in a useful form. (more…)
Collaborative learning is essential for transforming work activities that involve a high degree of uncertainty and creativity into a lean value stream. These characteristics are common in enterprise integration initiatives due to unclear and inconsistent data definitions across multiple silos, rapidly changing requirements, and a lack of perfect knowledge of end-to-end processes. Traditional approaches generally end up propagating the integration hairball, which is inefficient and wasteful – and certainly not Lean. You could say that these value streams are simply immature processes that lack standards and metrics, which is true, but the practitioners involved in the process don’t see it that way. They see themselves as highly skilled professionals solving complex, unique problems and delivering customized solutions that fit like a glove. And yet, the outside observer who looks at the end-to-end process at the macro level sees patterns repeated over and over again, and what appears to be a great deal of “reinventing the wheel.” (more…)
If you’re reading this article, you’re probably interested in big data but don’t really know what you’re looking for or how you’ll find it. Don’t worry; that’s normal. Big data is not like traditional analytics, where you know the structure of the data – the relations across sources, the dimensions to build, the calculations to perform – and the reports you need. Big data can be completely unstructured, with no clear relationships, and you don’t know what you’re looking for until you find patterns. For example, a complaint might come in about an online shopping cart being wiped out, which is what happened to my wife with a big toy retailer during some online Christmas shopping. I’ll tell you right now: they ended up losing a lot of money. If they were using big data, they might find a pattern in the steps that made her hit that bug. Then they could search for all customers who had the same problem, get their e-mail addresses or names, and run a recovery campaign. I hope the retailer is using big data properly. My wife would receive a call and get that order. They’d be happy, and I’d be happy.
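The find-the-pattern-then-build-the-contact-list idea above can be sketched in a few lines of Python. Everything here is hypothetical for illustration – the event records, field names, the "cart_wiped" outcome, and the rule of grouping by the last step before the failure are my assumptions, not a real retailer's data model.

```python
from collections import defaultdict

# Hypothetical clickstream events: (customer_email, steps_taken, outcome).
events = [
    ("ann@example.com",   ["search", "add_to_cart", "apply_coupon"], "cart_wiped"),
    ("bob@example.com",   ["search", "add_to_cart", "checkout"],     "ok"),
    ("carol@example.com", ["browse", "add_to_cart", "apply_coupon"], "cart_wiped"),
]

# Step 1: find the sessions that hit the bug.
failed = [(email, steps) for email, steps, outcome in events
          if outcome == "cart_wiped"]

# Step 2: look for a common pattern – here, the last step before the wipe.
patterns = defaultdict(list)
for email, steps in failed:
    patterns[steps[-1]].append(email)

# Step 3: take the dominant pattern and build the recovery-campaign list.
trigger, contacts = max(patterns.items(), key=lambda kv: len(kv[1]))
```

At real scale the events would live in a distributed store rather than a Python list, but the shape of the analysis – filter failures, group by a candidate pattern, extract the affected customers – is the same.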
I had the opportunity to review and comment on the draft of a new Hadoop technical guide. It’s great to see the published paper: Technical Guide: Unleashing the Power of Hadoop with Informatica. This guide outlines the following five steps to get started with Hadoop from a data integration perspective.
(1) Select the Right Projects for Hadoop Implementation
Choose projects that play to Hadoop’s strengths and minimize its disadvantages. Enterprises use Hadoop in data-science applications such as log analysis, data mining, machine learning, and image processing involving unstructured or raw data. Hadoop’s lack of a fixed schema works particularly well for answering ad-hoc queries and exploratory “what if” scenarios. The Hadoop Distributed File System (HDFS) and MapReduce address the growth of enterprise data volumes from terabytes to petabytes and beyond, as well as the increasing variety of complex, multi-dimensional data from disparate sources. (more…)
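To make the MapReduce model mentioned above concrete, here is a minimal log-analysis sketch in plain Python: a mapper emits (key, 1) pairs and a reducer sums them, mirroring what a Hadoop job does across a cluster. The log lines and the count-by-log-level job are illustrative assumptions, not taken from the guide.

```python
from collections import defaultdict

# Illustrative raw log lines of the kind a Hadoop log-analysis job might scan.
logs = [
    "2012-09-10 12:00:01 ERROR checkout timeout",
    "2012-09-10 12:00:02 INFO search ok",
    "2012-09-10 12:00:03 ERROR checkout timeout",
]

def mapper(line):
    # Emit (key, 1) for each record's log level, as a map task would.
    level = line.split()[2]
    yield (level, 1)

def reducer(pairs):
    # Sum the counts per key, as the reduce phase does after shuffle/sort.
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

counts = reducer(kv for line in logs for kv in mapper(line))
```

The point of Hadoop is that the same mapper and reducer logic runs unchanged whether the input is three lines or three petabytes spread across HDFS.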
In the February 26, 2011 edition of InformationWeek, Chris Murphy wrote a compelling article reinforcing the need for IT to move faster. He highlighted several CIOs and what they are doing to move more quickly. Examples included:
- Cutting project cycle time from six months to three by analyzing project data and using baselining techniques
- Developing data centers and software development models that work off of a common set of standards for everything (factory approach)
- Deploying a minimal set of functionality, e.g. the iPod touch for point of sale (POS), and then seeing what suggestions for new innovation emerge
- Adopting new delivery models such as Agile (more…)
For those of you who have been following my blog for some time, you no doubt remember the May article on THE NEXT BIG THING and the 10 Weeks to Lean Integration series. You also hopefully recall the discussions about the Integration Factory being the dominant integration platform technology for the next decade. While elements of the factory capability have been around for years, the Informatica 9 platform now offers the ability to build high-speed, efficient, “green” integrations with seamless collaboration between Business and IT. (more…)
There are many agile development methods and practices including Extreme Programming (XP), Scrum, Test Driven Development, Continuous Integration, Lean Software Development and Pair Programming to name a few. Core principles that most agile methods prescribe include incremental iterations with short time frames, minimal planning, teamwork, collaboration, and process adaptability throughout the life-cycle of the project.
While the concepts are easy enough to grasp and make a lot of sense, putting the principles into practice is hard, as evidenced by the many agile projects that have failed. So what are some of the best practices that enable success? To narrow down the list, this article focuses on data integration scenarios. And to move beyond concepts and make things more practical, I will focus even more specifically on the Informatica platform.
The question this article will try to answer is: “What are the top recommendations for agile methods to accelerate time-to-implementation for integration solutions based on the Informatica platform?” In no particular order of priority, seven techniques come to mind. (more…)