Tag Archives: Governance
Last year, while I was still an analyst with Forrester Research, OCDQ “Blogger in Chief” Jim Harris and I coordinated dueling blog posts taking polar-opposite stances in the debate over whether data governance initiatives should optimize for agility or formalize the necessary bureaucracy. (You can read Jim’s blog here and my blog here.)
That was a fun exercise, but our clear conclusion was that data governance needs elements of both agility and bureaucracy to deliver real business value. (more…)
In the world of big data, getting access to data and making sense of it is often a more important consideration than managing sheer volume. Companies that succeed in unlocking true value from big data open themselves up to a world of insight for better understanding of things like customer preferences, satisfaction and regional purchasing differences. Doing this is often harder than it seems because of the variety of the information itself, which leads to standardization and duplication issues. Ownership is often an issue as well, with departmental lines being the most common constraint to sharing important data across the enterprise. (more…)
Looking back at some of my Informatica Perspectives posts over the past year or so, I reflected on some common themes about data management and data governance, especially in the context of master data management and particularly master data models. As both the tools and the practices around MDM mature, we have seen some disillusionment in attempts to deploy an MDM solution, with customers noting that they continue to hit bumps in the road in the technical implementation, both in consolidating master data and then in publishing shared master data.
Almost every issue we see can be characterized into one of three buckets: (more…)
In my last post I started to talk about ideas for classifying data management issues, reasoning that classification helps determine how likely it is that acquiring a particular solution will actually address the core issues. I have used this categorization with some of our customers, and the process of classification does lend clarity when considering solutions. There are five categories: (more…)
The cost for 1GB of magnetic disk storage 20 years ago was $1,000; now it’s eight cents. 1GB is enough to store about 20 thousand letter-size scanned documents. To store the same number of paper documents would require two four-drawer filing cabinets, which would cost about $400. The cost of electronic data storage is five thousand times less than paper storage.
Costs have dropped by a consistent 40% per year, which accounts for the more-than-12,000-fold reduction in cost since 1992. The cost of RAID or mainframe disk storage is somewhat greater, but the historical trend for other storage devices has been similar, and the forecast for the foreseeable future is that costs will continue to decrease at the same rate. Twenty years from now we will be able to buy one terabyte of storage for a penny. (more…)
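The compound-decline arithmetic above is easy to check. Here is a minimal sketch (not from the original post; the 40% annual drop and $1,000 starting price are the figures quoted above) showing how a steady percentage decline compounds over 20 years:

```python
def cost_per_gb(start_cost, annual_drop, years):
    """Cost per GB after `years` of compound decline at rate `annual_drop` (e.g. 0.40)."""
    return start_cost * (1 - annual_drop) ** years

start = 1000.0  # $/GB in 1992, per the post
after_20 = cost_per_gb(start, 0.40, 20)
reduction = start / after_20

# A steady 40%/year compounds to well over a 12,000x reduction in 20 years.
print(f"Cost after 20 years: ${after_20:.4f}/GB ({reduction:,.0f}x cheaper)")
```

Note that at exactly 40% per year the computed price falls below eight cents; the quoted figures are round numbers, and the point stands that the reduction is "more than 12,000 times."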
The “Dodd-Frank Wall Street Reform and Consumer Protection Act” has recently been passed by the US federal government to regulate financial institutions. Per this legislation, there will be more “watchdog” agencies that will be auditing banks, lending and investment institutions to ensure compliance. As an example, there will be an Office of Financial Research within the Federal Treasury responsible for collecting and analyzing data. This legislation brings with it a higher risk of fines for non-compliance. (more…)
The CIO of GT Inc. (the fictitious name of a real company) met with his middleware vendor rep to deliver some depressing news.
“We established an outsourced factory delivery model two years ago using the productivity tools that you sold us, and we made it our enterprise standard. The factory results, however, are discouraging use of your integration platform. Projects are not getting approved by the business because of high costs, or else project teams are working around the standard and building hand-coded solutions. Did I make a mistake in buying your software?” (more…)
Gartner recently released their 2011 Magic Quadrant for Data Quality Tools and I’m happy to announce that Informatica is positioned in the Leaders’ quadrant. We believe our position is a testament to the fact that customers like Station Casinos and U.S. Xpress continue to turn to Informatica to solve their most critical data quality challenges.
The publication of the Magic Quadrant is often a great opportunity to reflect on the state of the data quality market. It should come as no surprise that data quality as a business imperative isn’t going away any time soon. We continue to see customers looking for help and expertise in solving a wide range of data quality problems, largely associated with data governance initiatives, master data management (MDM), business intelligence and application modernization. And the association of data quality with these initiatives is only getting stronger. (more…)
Why does one software project cost twice as much as another? Is it because it is developing twice as much functionality as the other? If you contract with two system integrators, how can you tell which one is more productive? In a multi-year outsourcing arrangement, is your supplier getting more or less efficient year over year?
An enduring challenge in the software industry is establishing a standard unit of measurement that expresses the amount of business functionality in a given information system so that questions like these can be addressed. Most organizations have not adopted a formal measure, but of those that have, the most widely accepted measure is function points, which were defined by Allan Albrecht in 1979. But are function points an effective metric for integration projects? (more…)
Applications are retired (sunset or decommissioned) when they become dormant or read-only. This occurs as a result of mergers and acquisitions or through modernization efforts, and it is a natural part of application information lifecycle management. While the applications may no longer be needed, the data they contain cannot be discarded. As a result, many organizations have hundreds, even thousands, of defunct applications that are consuming budget dollars, taking up data center space, complicating IT management, and generally just getting in the way. The challenge is getting rid of applications without getting rid of the data that is tightly coupled to them. (more…)