Tag Archives: Methodology
So might read the subject line of a memo to business users from the IT staff responsible for implementing a new application system. Changing requirements is one of the most frustrating and time-consuming aspects of a large project, for business and IT staff alike; so much so that it sometimes causes massive delays or even outright cancellation. But there is something wrong with that subject line: it presumes that the business users are to blame. They are not. Let's explore the real root causes and their solutions.
I was talking to a customer the other day who is about to embark on a data quality quest. He asked me to explain my view of a data quality initiative. I explained to him that data quality is a never-ending process. I said it was like dropping a pebble in the water: the initial ring is small but slowly expands outward. The data quality process is similar. You start small, with one application or one subject area, show some success, and then look to expand the process to additional applications or subject areas. Then expand further to additional business units or business functions until you have encompassed the entire enterprise. And just when you think you're done, you need to keep monitoring and repairing the data, because it will degrade over time.
While what I said was true, he replied that the business would reject that description out of hand. The business believes that all IT projects are never-ending projects, and it is this mindset that constantly pits Business against IT: Business believes that IT never delivers a finished project. His analogy is that data quality is more like constructing a building. It takes a lot of time and effort to get the foundation right; then you need to add the structure, electrical, water, communications, and finishing work. Then you move in and begin to use the building, but you're not done.
I should start out by saying that this blog article is NOT about the High Performance Computing and Communication Act of 1991 which resulted in the National Information Infrastructure (NII), or the Enterprise Integration Act of 2002(1), or the U.S. National Intelligence Strategy for sharing information which is derived from the Intelligence Reform and Terrorism Prevention Act of 2004(2). I am not talking about man-made laws, but rather about laws of nature like gravity and electromagnetism.
In the current economic environment, where IT organizations are on shoestring budgets and every project requires a strong financial justification, Total Cost of Ownership (TCO) has yet again become a key concern. Data integration is no exception.
Most folks understand that TCO should cover both upfront and ongoing costs. And the technology/productivity angles are fairly obvious, even if they are not always easy to quantify: how much time and how many resources a technology can save, both in upfront development and in downstream maintenance and administration. Technology-based factors such as ease of use, functional capabilities, and scalability/performance all enter into this equation.