Tag Archives: Stewardship
OK, I know it’s a little late to post 2013 technology predictions, but with so many good ones published already, I figured I’d sandbag a little and not only post a few of my own but also share a few of my favorites so far. For me, it starts with Mary Meeker’s Internet Trends presentation. 2013 is going to be a year of “re-imagining” enterprise software, from social to mobile to cloud to Big Data and analytics.
Are BI managers and professionals sometimes too eager to please the business? Are centralized BI efforts slowing down progress? Should BI teams address requirements before the business even asks for them? These questions may seem counter-intuitive, but Wayne Eckerson, director of research for TDWI, says that the best intentions for BI efforts in many organizations may actually result in sluggish projects, duplication of effort, and misaligned priorities between BI teams and the business.
At a conference last fall, I heard Martin Brodbeck, executive director for strategic architecture at Pfizer, describe how his company, a $48-billion pharmaceutical giant, was able to employ master data management (MDM) to bring together data assets from across its global enterprise into a single, centralized data definition.
The key ingredient to Pfizer’s success in this area, Brodbeck said, was not technology by itself, but enterprise governance. Pfizer’s MDM effort was led by an internal business sponsor, who helped promote the concept to the rest of the global enterprise. “Master data management is much more about governance than it is about technology,” he pointed out.
Data governance has a big job: establishing the processes, policies, standards, organization, and technologies required to manage and ensure the availability, accessibility, quality, consistency, auditability and security of data in an organization.
In many instances it is seen as too overwhelming or daunting to undertake, like Don Quixote tilting at windmills. Too often people are immobilized by analysis paralysis and fail to move forward after initial meetings and excitement. But don’t stop before you get started. Data governance is like the force-field around your data, protecting that “single version of the truth.”
OK, so maybe “force-field” isn’t the best analogy. Perhaps data governance would be better explained with a story. After all, there’s nothing “sci-fi” about reality or customer success.
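One way to make the “force-field” idea concrete: governance policies only protect data when they are turned into enforceable checks. Here is a minimal sketch in Python of expressing policies as data-quality rules and auditing records against them. All names, fields, and rules below are illustrative assumptions, not any vendor’s actual API.

```python
# Minimal sketch: a governance policy expressed as enforceable
# data-quality rules. (Illustrative only -- rule and field names
# are hypothetical, not a real product's API.)

def not_null(field):
    """Rule: the given field must be present and non-null."""
    return (f"{field} must not be null",
            lambda rec: rec.get(field) is not None)

def one_of(field, allowed):
    """Rule: the given field must take one of the allowed values."""
    return (f"{field} must be one of {sorted(allowed)}",
            lambda rec: rec.get(field) in allowed)

RULES = [
    not_null("customer_id"),
    not_null("country"),
    one_of("country", {"US", "DE", "FR"}),
]

def audit(records):
    """Apply every rule to every record; return (index, rule) violations."""
    violations = []
    for i, rec in enumerate(records):
        for description, check in RULES:
            if not check(rec):
                violations.append((i, description))
    return violations

records = [
    {"customer_id": 1, "country": "US"},       # clean
    {"customer_id": None, "country": "XX"},    # two violations
]
print(audit(records))
```

The point of the sketch is that each policy is a small, named, testable predicate; a “dashboard” is then just an aggregation of `audit` results over time.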
In my previous posts, I have discussed building the business case for data quality as well as the role that a data quality dashboard plays in supporting this case. As previously noted, these efforts will directly impact your ability to articulate the need to pursue a data quality initiative. The reason for returning to this topic is that I have recently participated in multiple discussions with a variety of companies that were either in the process of forming a data governance council or in the process of building the internal business case to support exploring a data governance initiative. In these discussions two common threads were present – the role of data quality in the data governance initiative and the need to change the culture within the organization if data governance is going to succeed. Although these are only two aspects to consider when pursuing a data governance initiative, they are directly tied to the underlying success or failure of the program.
Happy New Year! I look forward to discussing a myriad of Enterprise Data Management topics with you this year. My work with customers never stops and I’ve made a 2008 resolution to share as much of their success as possible. I’ll start with one of the oldest but least addressed problems in Data Integration.
Have you ever asked yourself or been asked, “Where did that number come from?” or, if you’re in IT, have you been confronted by your business colleagues with “Those numbers don’t make sense!”? I find these to be very common questions that consume hours and days of business and IT analyst time. Think about it: at the grassroots level of every company or organization, the amount of time spent deciphering numbers from reports is staggering.
This challenge starts from the very beginning of intelligence gathering: the underlying data from operational systems. It’s why the first step in any data integration project (DW, migration, MDM, consolidation, etc.) is to understand and map out the nature and location of the data appropriate for the business problem at hand. An estimated 70 percent of the time spent on any corporate application development is dedicated to finding, identifying, reconciling, and verifying data, and then determining the consequences of modifying the data. This is what makes traditional integration projects so time- and resource-intensive—and what makes metadata so useful in exercising internal control or streamlining a myriad of related activities. The recent Informatica Release 8.5 launch highlighted “data lineage” for helping IT resolve questions for the business as well as providing “self-service” for answering data-related questions for analysts and developers.
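The idea behind data lineage can be sketched with a toy example: every derived number records its upstream sources and the transformation that produced it, so “Where did that number come from?” becomes a question the system can answer by walking the chain backward. The class and data below are purely illustrative assumptions, not Informatica’s actual lineage model.

```python
# Toy sketch of data lineage. Each node records its sources and the
# transform that produced its value, so a derived figure can explain
# itself. (Hypothetical example -- not Informatica's lineage API.)

class LineageNode:
    def __init__(self, name, value, sources=(), transform=None):
        self.name = name              # e.g. "Total orders"
        self.value = value
        self.sources = list(sources)  # upstream LineageNode objects
        self.transform = transform    # how the value was derived

    def explain(self, depth=0):
        """Return a human-readable trace of where this value came from."""
        line = "  " * depth + f"{self.name} = {self.value}"
        if self.transform:
            line += f"  (via {self.transform})"
        lines = [line]
        for src in self.sources:
            lines.extend(src.explain(depth + 1))
        return lines

# Two operational-system figures feed one report number:
us = LineageNode("US orders (ERP)", 120_000)
eu = LineageNode("EU orders (CRM)", 80_000)
total = LineageNode("Total orders", us.value + eu.value,
                    sources=[us, eu], transform="sum")

print("\n".join(total.explain()))
```

When the business asks why a report shows 200,000, the trace names both source systems instead of leaving an analyst to reverse-engineer the answer.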
It’s time to stop getting bossed around by technology. IT groups too often create their list of projects and priorities based on a logical and mainly technical viewpoint. Their agenda is driven by upgrading hardware and software to take advantage of new capabilities that vendors have built into their products, along with support and maintenance.
A couple of significant trends have occurred since Y2K and the internet boom:
The first wave was a drive to reduce costs by combining hardware and software platforms and licenses, along with labor reductions. Mission accomplished.
The next wave was recognition by business executives and groups that this BI, CPM and DW stuff could be leveraged to increase revenue, improve responsiveness to customers, develop products faster, etc. A funny thing happened; IT did not get the memo! IT too often still bases its budget and plan on technology priorities, and sometimes vendor priorities, rather than on what the business needs.
The madness has got to stop.
Business is crying out for new or expanded BI, CPM and DW programs. Many, maybe most, of your business’s top initiatives need data. Every one of your data projects should be directly tied to a priority business initiative.
One of the frustrations many IT people encounter with their data governance or integration competency center (ICC) programs is that the business is not really participating at the level needed and easily gets distracted.
Wake up: they are getting “distracted” by priority business initiatives. If your data governance and ICC programs were an integral part of those business initiatives, then the sponsoring business executives would not be distracted and would be committed.
EDM is a long-term program that requires participation and commitment from both IT and business. To get that, both need to be on the same page. And what better way for the CIO and IT group to understand their business, and shape their data projects to what the business needs, than by having those data projects be part of the business initiatives?
Most importantly, that’s how you get true business ROI.
As we discussed in my last post, the four organizational models most often used to implement ICCs are: best practices, technology standards, shared services and centralized services. The best approach is what fits your enterprise best in terms of people, politics and integration maturity.
Below are some pros and cons of each approach.
A word of caution about this approach is that it is often implemented on a virtual-team basis. Although it is a very appealing prospect to gain this knowledge without adding more resources, too often enterprises fail to recognize the level of effort that each contributor needs to make for the common good. If the investment in people is recognized and encouraged, by incorporating the work as an objective in employee reviews and substituting it for lower-priority work, then this approach can be quite effective. If, however, this activity is just thrown on top of everyone’s already overloaded workload, then this model will not be effective.
There are some concerns with this approach. First, how will the ICC team make its selection without using various projects for requirements input?
Second, the ICC team often does not have the budget or resources to adequately pick a technology platform. This approach means that an additional budget item must be allocated to a pure IT technology project – often a difficult sell in today’s environment.
Third, while the ICC team is examining the technology options, other projects are moving forward without the guidance that the ICC can provide. In other words, more data silos may be built just as the ICC looks at ways to prevent them.
Finally, what are the real incentives for each project to adopt the common technology platform? Many corporate cultures will allow new projects to pick another platform anyway, because these projects must still deliver their objectives in addition to picking up the requirements of the technology platform. Many projects will justify moving forward on their own because of their limited budget, resources and expertise.
This organization enables the enterprise to develop deep data integration skills because the ICC specializes in that area. This approach works extremely well, especially when reinforced by strong business and IT sponsorship along with supporting budgetary investments. The main caution (which Don also points out in his post) is that decentralized companies may be reluctant to use the shared services approach. However, with strong financial incentives and recognition that many IT services already operate in this manner, initial resistance can be overcome.
This is an extremely effective model if the corporate culture supports centralized development and control. However, in a decentralized culture, the shared services model will have a better chance of success.
Data integration needs to become an enterprise-wide infrastructure endeavor to stop the proliferation of data silos and provide your enterprise with the timely, accurate and appropriate business information it needs. The right vision, strategy and architecture will show you where you need to go; sponsorship and organization will help you make it happen.
You have the sponsorship and budget, now how do you set up your Integration Competency Center (ICC)?
There are several approaches to organizing the role and responsibilities of an ICC. The approach you take depends on your enterprise’s people, politics and processes. As mentioned in the last post, one approach is a centralized organization or centralized services model. However, that approach does not work in every situation or corporate culture, so don’t think you need to make it an ultimate goal.
The four organizational models most often used to implement ICCs are:
• Best practices
• Technology standards
• Shared services
• Centralized services
Although the models are organized in an increasing level of control and scope/size of an ICC, it is neither necessary nor recommended for every enterprise to “progress” up the ICC organization ladder. The best fit model is dependent on your situation, not some esoteric ideal model.
The simplest ICC model is one that documents and shares best practices across integration projects. This enables each project to leverage what was learned from previous integration development efforts without reinventing the wheel. It saves time and allows each project team to concentrate on expanding the overall enterprise’s integration expertise, thereby contributing new or improved best practices based on what they’ve learned. In this manner, enterprise integration expertise continues to grow project-by-project rather than being lost as each project is completed and resources inevitably scatter.
The second ICC model involves not only publishing best practices that can be used by all integration projects, but also recommending standardizing on a common integration platform. This approach goes beyond sharing ideas by enabling new project teams to leverage existing technology platform recommendations and avoid the costly and time-consuming selection process. And, if the integration standard is followed, there may be a pool of expertise in the enterprise that can be enlisted to work on the project. Standardizing on software platforms brings the potential for sharing infrastructure across projects (and avoiding the cost of independent infrastructure).
The third organizational ICC model involves creating a group of dedicated people to develop the integration components of individual projects. In this services model, the ICC operates as a subcontractor that can be tapped for individual integration projects. The ICC determines the strategy, designs the architecture and selects the technology platform as prerequisites to its development efforts. The individual project teams remain responsible for the overall development effort; the ICC assigns resources to each project to complete its integration portion.
The final organizational model is a completely centralized ICC operation. Integration work is elevated to the status of its own program. In the centralized services model, the ICC operates as the systems integrator responsible for all integration work throughout the enterprise. All data integration components are pulled out of projects and placed into the integration program. Just as with the shared services model, the ICC determines the strategy, designs the architecture and selects the technology platform as prerequisites to its development efforts. However, unlike the previous model, the ICC operates the entire integration project.
Next post we will examine some of the pros and cons of each ICC model so that you can determine what may work best for your enterprise.