Collaborative learning is essential for transforming work activities that involve a high degree of uncertainty and creativity into a lean value stream. These characteristics are common in enterprise integration initiatives, which face unclear and inconsistent data definitions across multiple silos, rapidly changing requirements, and imperfect knowledge of end-to-end processes. Traditional approaches generally end up propagating the integration hairball, which is inefficient and wasteful – and certainly not Lean. You could say that these value streams are simply immature processes that lack standards and metrics, which is true, but the practitioners involved in the process don’t see it that way. They see themselves as highly skilled professionals solving complex, unique problems and delivering customized solutions that fit like a glove. Yet the outside observer who looks at the end-to-end process at the macro level sees patterns repeated over and over again, and what appears to be a great deal of “reinventing the wheel.”
So how can we solve this problem? How can a business analyst in New York who is defining a new BI report work with a handful of analysts from business units around the country, each of whom has pieces of the required data, who in turn rely on software developers in Bangalore to build the infrastructure elements, all working for a project architect in Chicago? How can they all work together to eliminate waste in the value stream and deliver a high-quality solution quickly?
Collaborative Learning is the term used to describe situations where multiple people attempt to learn something together. As Wikipedia puts it, “People engaged in collaborative learning capitalize on one another’s resources and skills….collaborative learning is based on the model that knowledge can be created within a population where members actively interact by sharing experiences and take on asymmetric roles.”
One example of collaborative learning is Agile Data Integration, as John Haddad wrote about recently. This is where analysts, developers and designers work together in a high-performance team to collaborate on source-to-target ETL mapping specifications using role-based tools and a central metadata repository. The tools and central repository enable the team to share information as they refine BI reporting needs, as they discover source data quality issues, as they learn about data inconsistencies across different databases, and as they build and test the mapping software. Each person on the team has specific skills and part of the knowledge needed to build a total solution.
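To make the idea of a shared source-to-target mapping spec concrete, here is a minimal sketch in Python. This is not Informatica’s actual format or API – the field names, transformation rules and structure are all hypothetical – but it illustrates the key point: when the mapping lives as structured, machine-readable metadata rather than prose in a Word document, every role on the team (analyst, developer, tester) can read, update and execute the same single artifact.

```python
# Hypothetical sketch: a source-to-target mapping expressed as shared metadata.
# All names below are illustrative, not a real repository schema.

MAPPING_SPEC = {
    "target": "dim_customer",
    "fields": [
        {"target": "customer_id", "source": "crm.cust_no",   "transform": "int"},
        {"target": "full_name",   "source": "crm.cust_name", "transform": "strip"},
        {"target": "region_code", "source": "erp.region",    "transform": "upper"},
    ],
}

# Simple library of named transformations the spec can reference.
TRANSFORMS = {
    "int": int,          # cast text to integer
    "strip": str.strip,  # trim surrounding whitespace
    "upper": str.upper,  # normalize to upper case
}

def apply_mapping(spec, source_row):
    """Map one source row to a target row according to the shared spec."""
    return {
        field["target"]: TRANSFORMS[field["transform"]](source_row[field["source"]])
        for field in spec["fields"]
    }

row = {"crm.cust_no": "1042", "crm.cust_name": "  Ada Lovelace ", "erp.region": "emea"}
print(apply_mapping(MAPPING_SPEC, row))
# → {'customer_id': 1042, 'full_name': 'Ada Lovelace', 'region_code': 'EMEA'}
```

Because the spec is data, an analyst who discovers a data quality issue can change one transform rule, and the developers and testers immediately work from the updated mapping – no document versioning and re-circulation required.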
In traditional waterfall methodologies, the team members communicate and learn through documents (such as MS-Word or Excel files) that are passed around in a formal review/signoff process. This is NOT lean or collaborative – it is slow and wasteful. Each person must read through pages and pages of documentation written by others, and if they learn something new that impacts their work, they must modify their document, create a new version and send it out again to the rest of the team. The process of learning across a diverse team is agonizingly slow!
The more rapidly the team can collaborate and learn the best way to connect source and target data, the greater their ability to eliminate waste and accelerate solution delivery – a 50%-90% reduction in lead time is common.
For more information, visit Informatica Best Practices. Or if you are skeptical about the claim that it’s possible to achieve up to a 90% reduction in lead time, visit Informatica World in May 2012 (www.informaticaworld.com) and talk to people who have actually done it.
Tune in next week for my follow-up article on how to apply collaborative learning not just to one project, but how to make it sustainable across the enterprise.