Tag Archives: Agile
If you have been following publications in the Potential at Work Community or any number of LinkedIn discussions, such as this one on the DrJJ group (a think-tank for information management best practices), you will have noticed the Agile methodology topic come up time and time again. For instance, check out the article Architect Your Way From Sluggish to Speed or the video Focus on Agility Adaptability. It hasn’t always been this way. For many years the architectural focus was on RASP.
So might read the subject line of a memo to business users from the IT staff responsible for implementing a new application system. Changing requirements are among the most frustrating (for both business and IT staff) and time-consuming aspects of a large project; so much so that they sometimes cause massive delays or even cancellation. But there is something wrong with that subject line: it presumes that the business users are to blame. They are not. Let’s explore the real root causes and their solutions. (more…)
Several Informatica bloggers have been writing about Lean Data Management for about a year, including Agile Data Integration, Open Data, and Lean Data Warehouse, to name a few. And now the book is finally available from bookstores. OK, to be fair, the book is about the broader topic of Lean IT, and I was one of eight authors who contributed to it; I wrote the chapter on Lean Data Management: The Invisible Dimension. Here is the introduction to the chapter: (more…)
The cover of the September 10 issue of ComputerWorld caught my attention; the headline was Rebirth of Re-Engineering. I was intrigued by how the analysts and pundits would spin Business Process Reengineering, as I hadn’t seen the BPR acronym much since it fell out of favor around the turn of the century. As it turns out, the NEW BPR is all about Lean and Agile and is being led by IT. Wow! (more…)
This article explores Agile Data Integration and Business Intelligence practices and contrasts leading practices and technologies. First, some definitions.
Agile DI is the application of agile techniques (iterative/incremental development, cross-functional self-organizing teams, rapid/flexible response to change, etc.) to address data integration challenges such as migrating data between systems or consolidating data from multiple systems. Agile BI is the application of agile techniques to address business intelligence challenges such as identifying and analyzing data to support better business decision-making. These two disciplines sometimes overlap or support each other. For example, you might use Agile DI to move data into a data warehouse and Agile BI to get it out of the warehouse in a useful form. (more…)
Collaborative learning is essential for transforming work activities that involve a high degree of uncertainty and creativity into a lean value stream. These characteristics are common in enterprise integration initiatives due to unclear and inconsistent data definitions across multiple silos, rapidly changing requirements, and a lack of perfect knowledge of end-to-end processes. Traditional approaches generally end up propagating the integration hairball, which is inefficient, wasteful, and certainly not Lean. You could say that these value streams are simply immature processes that lack standards and metrics, which is true, but the practitioners involved in the process don’t see it that way. They see themselves as highly skilled professionals solving complex, unique problems and delivering customized solutions that fit like a glove. Yet the outside observer who looks at the end-to-end process at the macro level sees patterns repeated over and over again, and what appears to be a great deal of “reinventing the wheel.” (more…)
If you’re reading this article, you’re probably interested in big data but don’t really know what you’re looking for or how you’ll find it. That confusion is understandable. Big data isn’t like traditional analytics, where you know the structure of the data – the relations across sources, the dimensions to build, the calculations to perform – and the reports you need. Big data can be completely unstructured, with no clear relationships, and you don’t know what you’re looking for until you find patterns. A complaint might come in about an online shopping cart being wiped out, which is what happened to my wife with a big toy retailer during some online Christmas shopping. I’ll tell you right now: they ended up losing a lot of money. If they were using big data, they might find a pattern in the steps that made her hit that bug. Then they might search for all customers who had the same problem, get their e-mail addresses or names, and run a recovery campaign. I hope the retailer is using big data properly. My wife would receive a call and get her order. They’d be happy, and I’d be happy.
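To make the recovery-campaign idea concrete, here is a minimal sketch of the pattern-matching step. Everything here is hypothetical: the event types (add_to_cart, a CART_WIPED error), field names, and session layout are illustrative, not taken from any real retailer’s system.

```python
# Hypothetical sketch: scan raw clickstream events for sessions that
# match the failure pattern (items added to cart, then a cart-wipe
# error), and collect the affected customers' e-mail addresses for a
# recovery campaign. All event and field names are made up for
# illustration.

def find_affected_customers(events):
    """Return sorted e-mails of customers whose session hit the cart-wipe bug."""
    # Group interleaved raw events by session.
    sessions = {}
    for e in events:
        sessions.setdefault(e["session_id"], []).append(e)

    affected = set()
    for evs in sessions.values():
        added_to_cart = any(e["type"] == "add_to_cart" for e in evs)
        cart_wiped = any(e["type"] == "error" and e.get("code") == "CART_WIPED"
                         for e in evs)
        if added_to_cart and cart_wiped:
            affected.update(e["email"] for e in evs if e.get("email"))
    return sorted(affected)

# Toy event stream: one shopper hits the bug, one checks out normally.
events = [
    {"session_id": "s1", "type": "add_to_cart", "email": "ann@example.com"},
    {"session_id": "s1", "type": "error", "code": "CART_WIPED",
     "email": "ann@example.com"},
    {"session_id": "s2", "type": "add_to_cart", "email": "bob@example.com"},
    {"session_id": "s2", "type": "checkout", "email": "bob@example.com"},
]
print(find_affected_customers(events))  # ['ann@example.com']
```

At big-data scale the same logic would run distributed over raw logs rather than an in-memory list, but the shape of the analysis – detect a pattern, then find everyone who matched it – is the same.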
I had the opportunity to review and comment on the draft of a new Hadoop technical guide. It’s great to see the published paper: Technical Guide: Unleashing the Power of Hadoop with Informatica. This guide outlines the following five steps to get started with Hadoop from a data integration perspective.
(1) Select the Right Projects for Hadoop Implementation
Choose projects that fit Hadoop’s strengths and minimize its disadvantages. Enterprises use Hadoop in data-science applications for log analysis, data mining, machine learning, and image processing involving unstructured or raw data. Hadoop’s lack of a fixed schema works particularly well for answering ad-hoc queries and exploratory “what if” scenarios. The Hadoop Distributed File System (HDFS) and MapReduce address the growth in enterprise data volumes from terabytes to petabytes and beyond, as well as the increasing variety of complex multi-dimensional data from disparate sources. (more…)
In the February 26, 2011 edition of InformationWeek, Chris Murphy wrote a compelling article reinforcing the need for IT to move faster. He highlighted several CIOs and what they are doing to move more quickly. Examples included:
- Dropping project cycle time from six months to three by analyzing project data and using baselining techniques
- Developing data centers and software development models that work off of a common set of standards for everything (factory approach)
- Deploying a minimal set of functionality, e.g., an iPod touch for POS, and then seeing what suggestions for new innovation emerge
- Adopting new delivery models such as Agile (more…)